WorldWideScience

Sample records for automatic VOI analysis

  1. The change of cerebral blood flow after heart transplantation in congestive heart failure: a voxel-based and automatic VOI analysis of Tc-99m ECD SPECT

    International Nuclear Information System (INIS)

    To investigate the change of global and regional cerebral blood flow after heart transplantation (HT) in congestive heart failure (CHF) patients. Twenty-one patients with CHF who underwent HT (45±12 yrs, M/F=19/2) and 10 healthy volunteers (39±13 yrs, M/F=7/3) were prospectively included. All patients underwent echocardiography and radionuclide angiography including brain and aorta, with brain SPECT performed after IV bolus injection of Tc-99m ECD (740 MBq) before (175±253 days) and after (129±82 days) HT. Patients were divided into two groups according to the interval between HT and postoperative SPECT [early follow-up (f/u): <6 mo, n=14; late f/u: >6 mo, n=7]. Global CBF (gCBF) of the bilateral hemispheres was calculated by Patlak graphical analysis. An absolute rCBF map was obtained from brain SPECT by Lassen's correction algorithm. Age-corrected voxel-based analysis using SPM2 and automatic VOI analysis were performed to assess the rCBF change. Cardiac ejection fraction of all patients improved after HT (20.8%→64.0%). gCBF was reduced compared to normal before HT (35.7±3.9 vs. 49.1±3.0 ml/100g/min; p<0.001) and improved postoperatively (46.6±5.4, p<0.001). The preoperative gCBFs of the early and late f/u groups were not different (34.6±3.2 vs. 38.0±4.4, p=0.149), but the postoperative gCBF of the late f/u group (52.0±4.0) was higher than that of the early f/u group (43.9±3.7) (p<0.001). On voxel-based analysis, preoperative rCBF was reduced in the entire brain but most severely in the bilateral superior and inferior frontal cortex, supplementary motor area, precuneus and anterior cingulum, compared to normals (uncorrected p<0.001). After HT, rCBF of these areas improved more significantly in the late f/u group than in the early f/u group but remained lower than in normals. Global CBF was significantly reduced in CHF patients and improved after HT. rCBFs of the frontal cortex, precuneus and cingulum were most severely reduced and improved slowly after HT compared to other brain regions.
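
    The gCBF step above rests on Patlak graphical analysis, which reduces to a linear regression on transformed time-activity data. A minimal sketch of that step, assuming tissue and arterial time-activity curves are already available as NumPy arrays (the full clinical pipeline, Lassen correction included, is considerably more involved):

        import numpy as np

        def patlak_ki(t, c_tissue, c_plasma):
            """Estimate the Patlak influx constant Ki as the slope of the
            Patlak plot: C_tissue/C_plasma vs. integral(C_plasma)dt / C_plasma."""
            t, c_tissue, c_plasma = map(np.asarray, (t, c_tissue, c_plasma))
            integral = np.array([np.trapz(c_plasma[:i + 1], t[:i + 1])
                                 for i in range(len(t))])
            x = integral / c_plasma          # "Patlak time"
            y = c_tissue / c_plasma
            half = len(t) // 2               # fit only the late, linear portion
            ki, intercept = np.polyfit(x[half:], y[half:], 1)
            return ki, intercept

    Converting Ki into gCBF in ml/100g/min requires additional model constants that are not given in the abstract, so they are deliberately left out of the sketch.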

  2. The change of cerebral blood flow after heart transplantation in congestive heart failure: a voxel-based and automatic VOI analysis of Tc-99m ECD SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Hong, I. K.; Kim, J. J.; Lee, C. H.; Lim, K. C.; Moon, D. H.; Rhu, J. S.; Kim, J. S. [Asan Medical Center, Seoul (Korea, Republic of)

    2007-07-01

    To investigate the change of global and regional cerebral blood flow after heart transplantation (HT) in congestive heart failure (CHF) patients. Twenty-one patients with CHF who underwent HT (45±12 yrs, M/F=19/2) and 10 healthy volunteers (39±13 yrs, M/F=7/3) were prospectively included. All patients underwent echocardiography and radionuclide angiography including brain and aorta, with brain SPECT performed after IV bolus injection of Tc-99m ECD (740 MBq) before (175±253 days) and after (129±82 days) HT. Patients were divided into two groups according to the interval between HT and postoperative SPECT [early follow-up (f/u): <6 mo, n=14; late f/u: >6 mo, n=7]. Global CBF (gCBF) of the bilateral hemispheres was calculated by Patlak graphical analysis. An absolute rCBF map was obtained from brain SPECT by Lassen's correction algorithm. Age-corrected voxel-based analysis using SPM2 and automatic VOI analysis were performed to assess the rCBF change. Cardiac ejection fraction of all patients improved after HT (20.8%→64.0%). gCBF was reduced compared to normal before HT (35.7±3.9 vs. 49.1±3.0 ml/100g/min; p<0.001) and improved postoperatively (46.6±5.4, p<0.001). The preoperative gCBFs of the early and late f/u groups were not different (34.6±3.2 vs. 38.0±4.4, p=0.149), but the postoperative gCBF of the late f/u group (52.0±4.0) was higher than that of the early f/u group (43.9±3.7) (p<0.001). On voxel-based analysis, preoperative rCBF was reduced in the entire brain but most severely in the bilateral superior and inferior frontal cortex, supplementary motor area, precuneus and anterior cingulum, compared to normals (uncorrected p<0.001). After HT, rCBF of these areas improved more significantly in the late f/u group than in the early f/u group but remained lower than in normals. Global CBF was significantly reduced in CHF patients and improved after HT. rCBFs of the frontal cortex, precuneus and cingulum were most severely reduced and improved slowly after HT compared to other brain regions.

  3. A background to risk analysis. Vol. 3

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 3 contains chapters on quantification of risk, failure and accident probability, risk analysis and design, and examples of risk analysis for process plant. (BP)

  4. A background to risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  5. A background to risk analysis. Vol. 2

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 2 treats generic methods of qualitative failure analysis. (BP)

  6. A background to risk analysis. Vol. 4

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work, and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 4 treats human error in plant operation. (BP)

  7. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...
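
    As an illustration of what a derived time bound means (not the paper's abstract-interpretation machinery), the sketch below instruments insertion sort with a comparison counter and checks it against the closed-form worst-case bound n(n-1)/2:

        def insertion_sort(xs):
            """Insertion sort instrumented with a comparison counter."""
            steps = 0
            xs = list(xs)
            for i in range(1, len(xs)):
                j = i
                while j > 0 and xs[j - 1] > xs[j]:
                    steps += 1              # a comparison that triggered a swap
                    xs[j - 1], xs[j] = xs[j], xs[j - 1]
                    j -= 1
                if j > 0:
                    steps += 1              # the failed comparison ending the scan
            return xs, steps

        def time_bound(n):
            # Worst-case comparison count for insertion sort: n*(n-1)/2
            return n * (n - 1) // 2

        for n in range(2, 8):
            _, steps = insertion_sort(range(n, 0, -1))  # reversed input = worst case
            assert steps <= time_bound(n)

    A system like the one the record describes derives such a time_bound function from the program text itself rather than from manual reasoning.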

  8. 05501 Summary -- Automatic Performance Analysis

    OpenAIRE

    Gerndt, Hans Michael; Malony, Allen; Miller, Barton P.; Nagel, Wolfgang

    2006-01-01

    The Workshop on Automatic Performance Analysis (WAPA 2005, Dagstuhl Seminar 05501), held December 13-16, 2005, brought together performance researchers, developers, and practitioners with the goal of better understanding the methods, techniques, and tools that are needed for the automation of performance analysis for high performance computing.

  9. An automatic visual analysis system for tennis

    OpenAIRE

    Connaghan, Damien; Moran, Kieran; O'Connor, Noel E.

    2013-01-01

    This article presents a novel video analysis system for coaching tennis players of all levels, which uses computer vision algorithms to automatically edit and index tennis videos into meaningful annotations. Existing tennis coaching software lacks the ability to automatically index a tennis match into key events, and therefore, a coach who uses existing software is burdened with time-consuming manual video editing. This work aims to explore the effectiveness of a system to automatically de...

  10. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and causes billions of dollars in damages each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive in order to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. Building on this new analysis framework, a series of analysis techniques for automatic malware analy...

  11. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    Steve Renals

    2011-10-01

    This paper is about the recognition and interpretation of multiparty meetings captured as audio, video and other signals. This is a challenging task since the meetings consist of spontaneous and conversational interactions between a number of participants: it is a multimodal, multiparty, multistream problem. We discuss the capture and annotation of the Augmented Multiparty Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  12. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. The circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze the delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  13. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, six universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
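
    A sketch of the kind of wavelet-based feature extraction the record describes, using the PyWavelets package; the specific sub-band statistics (mean, standard deviation, energy) are illustrative assumptions, not the authors' exact parameters:

        import numpy as np
        import pywt  # PyWavelets

        def eye_region_features(eye_img, wavelet="db2", level=2):
            """Extract simple statistics from a 2-D discrete wavelet decomposition.

            eye_img: 2-D numpy array holding the (grayscale) eye region.
            Returns mean/std/energy of each sub-band as a flat feature vector.
            """
            coeffs = pywt.wavedec2(eye_img, wavelet, level=level)
            # coeffs = [approximation, (H, V, D) per level]; flatten the sub-bands
            bands = [coeffs[0]] + [b for detail in coeffs[1:] for b in detail]
            feats = []
            for band in bands:
                band = np.asarray(band)
                feats.extend([band.mean(), band.std(), np.sum(band ** 2)])
            return np.array(feats)

    Such a vector would then be fed to a classifier (the study used artificial neural networks) to predict one of the six emotion classes.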

  14. Automatic Syntactic Analysis of Free Text.

    Science.gov (United States)

    Schwarz, Christoph

    1990-01-01

    Discusses problems encountered with the syntactic analysis of free text documents in indexing. Postcoordination and precoordination of terms is discussed, an automatic indexing system called COPSY (context operator syntax) that uses natural language processing techniques is described, and future developments are explained. (60 references) (LRW)

  15. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    The application of microprocessors to the programming and computation of sequential chemical analysis of solutions is examined. Safety, performance, and reliability are compared with those of other methods. An example is given: uranium titration by spectrophotometry.

  16. Handbook of nuclear engineering. Vol. 1: Nuclear engineering fundamentals; Vol. 2: Reactor design; Vol. 3: Reactor analysis; Vol. 4: Reactors of generations III and IV; Vol. 5: Fuel cycles, decommissioning, waste disposal and safeguards

    CERN Document Server

    2013-01-01

    The Handbook of Nuclear Engineering is an authoritative compilation of information regarding methods and data used in all phases of nuclear engineering. Addressing nuclear engineers and scientists at all academic levels, this five-volume set provides the latest findings in nuclear data and experimental techniques, reactor physics, kinetics, dynamics and control. Readers will also find a detailed description of data assimilation, model validation and calibration, sensitivity and uncertainty analysis, fuel management and cycles, nuclear reactor types and radiation shielding. A discussion of radioactive waste disposal, safeguards and non-proliferation, and fuel processing with partitioning and transmutation is also included. As nuclear technology becomes an important source of non-polluting, sustainable energy in the future, The Handbook of Nuclear Engineering is an excellent reference for practicing engineers, researchers and professionals.

  17. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    OpenAIRE

    Józef DREWNIAK; Stanisław ZAWIŚLAK; Andrzej WIECZOREK

    2014-01-01

    In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. Exemplary gear runs are analyzed; several drives/gears are consecutively taken into account, discussing functional schemes, assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are: an algorithmic approach, ...

  18. Automatic analysis and classification of surface electromyography.

    Science.gov (United States)

    Abou-Chadi, F E; Nashar, A; Saad, M

    2001-01-01

    In this paper, parametric modeling of surface electromyography (SEMG) signals, which facilitates automatic SEMG feature extraction, is combined with artificial neural networks (ANN) to provide an integrated system for the automatic analysis and diagnosis of myopathic disorders. Three ANN paradigms were investigated: the multilayer backpropagation algorithm, the self-organizing feature map algorithm, and a probabilistic neural network model. The performance of the three classifiers was compared with that of the classical Fisher linear discriminant (FLD) classifier. The results show that the three ANN models give higher performance, with the percentage of correct classification reaching 90%, whereas poorer diagnostic performance was obtained from the FLD classifier. The system presented here indicates that surface EMG, when properly processed, can be used to provide the physician with a diagnostic assist device. PMID:11556501

  19. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK

    2014-06-01

    In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. Exemplary gear runs are analyzed; several drives/gears are consecutively taken into account, discussing functional schemes, assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are: an algorithmic approach, and a general approach where particular drives are cases of the generally created model. Moreover, the method allows for further analysis and synthesis tasks, e.g. checking the isomorphism of design solutions.

  20. Automatic abundance analysis of high resolution spectra

    CERN Document Server

    Bonifacio, Piercarlo; Caffau, Elisabetta

    2003-01-01

    We describe an automatic procedure for determining abundances from high resolution spectra. Such procedures are becoming increasingly important as large amounts of data are delivered from 8m telescopes and their high-multiplexing fiber facilities, such as FLAMES on ESO-VLT. The present procedure is specifically targeted at the analysis of spectra of giants in the Sgr dSph; however, the procedure may, in principle, be tailored to analyse stars of any type. Emphasis is placed on the algorithms and on the stability of the method; the external accuracy rests, ultimately, on the reliability of the theoretical models (model-atmospheres, synthetic spectra) used to interpret the data. Comparison of the results of the procedure with the results of a traditional analysis for 12 Sgr giants shows that abundances accurate at the level of 0.2 dex, comparable with those of a traditional analysis of the same spectra, may be derived in a fast and efficient way. Such automatic procedures are not meant to replace the traditional ...

  1. Semi-automatic analysis of fire debris

    Science.gov (United States)

    Touron; Malaquin; Gardebas; Nicolai

    2000-05-01

    Automated analysis of fire residues involves a strategy which deals with the wide variety of criminalistic samples received. Because the concentration of accelerant in a sample is unknown and the range of flammable products is wide, full attention from the analyst is required. Primary detection with a photoionisation detector resolves the first problem by determining the right method to use: either the less sensitive classical head-space determination, or absorption on an active charcoal tube, a method better suited to low concentrations. The latter method is suitable for automatic thermal desorption (ATD400), avoiding any risk of cross-contamination. A PONA column (50 m x 0.2 mm i.d.) allows the separation of volatile hydrocarbons from C(1) to C(15) and the updating of a database. A specific second column is used for heavy hydrocarbons. Heavy products (C(13) to C(40)) were extracted from residues using a very small amount of pentane, concentrated to 1 ml at 50 degrees C and then placed on an automatic carousel. Comparison of flammables with reference chromatograms provided the expected identification, possibly using mass spectrometry. This analytical strategy belongs to the IRCGN quality program, resulting in the analysis of 1500 samples per year by two technicians. PMID:10802196

  2. Automatic analysis of distance bounding protocols

    CERN Document Server

    Malladi, Sreekanth; Kothapalli, Kishore

    2010-01-01

    Distance bounding protocols are used by nodes in wireless networks to calculate upper bounds on their distances to other nodes. However, dishonest nodes in the network can render the calculations both illegitimate and inaccurate when they participate in protocol executions. It is important to analyze protocols for the possibility of such violations. Past efforts to analyze distance bounding protocols have only been manual. However, automated approaches are important since they are quite likely to find flaws that manual approaches cannot, as witnessed in the literature for analysis pertaining to key establishment protocols. In this paper, we use the constraint solver tool to automatically analyze distance bounding protocols. We first formulate a new trace property called Secure Distance Bounding (SDB) that protocol executions must satisfy. We then classify the scenarios in which these protocols can operate considering the (dis)honesty of nodes and the location of the attacker in the network. Finally, we extend the const...

  3. Sensitivity analysis and design optimization through automatic differentiation

    International Nuclear Information System (INIS)

    Automatic differentiation is a technique for transforming a program or subprogram that computes a function, including arbitrarily complex simulation codes, into one that computes the derivatives of that function. We describe the implementation and application of automatic differentiation tools. We highlight recent advances in the combinatorial algorithms and compiler technology that underlie successful implementation of automatic differentiation tools. We discuss applications of automatic differentiation in design optimization and sensitivity analysis. We also describe ongoing research in the design of language-independent source transformation infrastructures for automatic differentiation algorithms
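
    For readers unfamiliar with the technique, forward-mode automatic differentiation can be demonstrated in a few lines with dual numbers; this toy sketch is illustrative only and is unrelated to the production source-transformation tools the record describes:

        import math

        class Dual:
            """Dual number a + b*eps with eps**2 == 0; b carries the derivative."""
            def __init__(self, val, der=0.0):
                self.val, self.der = val, der
            def __add__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__
            def __mul__(self, o):
                o = o if isinstance(o, Dual) else Dual(o)
                # Product rule propagates derivatives alongside values
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def sin(x):
            return Dual(math.sin(x.val), math.cos(x.val) * x.der)

        def f(x):
            return x * x + 3 * x + sin(x)   # arbitrary simulation-style code path

        x = Dual(2.0, 1.0)                  # seed derivative dx/dx = 1
        print(f(x).val, f(x).der)           # f(2) and f'(2) = 2*2 + 3 + cos(2)

    The tools discussed in the record apply the same chain-rule propagation, but by transforming the source of entire simulation codes rather than overloading arithmetic.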

  4. Transboundary diagnostic analysis. Vol. 2. Background and environmental assessment

    OpenAIRE

    2012-01-01

    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; and pollution and water quality. Volume 2 contains background material that sets out the bio-physical and socio-economic characteristics of the BOBLME; an analysi...

  5. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  6. Experiment data acquisition and analysis system. Vol. 1

    International Nuclear Information System (INIS)

    The Experiment Data Acquisition and Analysis System EDAS was created to acquire and analyze data collected in experiments carried out at the heavy ion accelerator UNILAC. It has been available since 1975 and has become the most frequently used system for evaluating experiments at GSI. EDAS has undergone constant development, and the many enhancements make this completely revised third edition of the EDAS manual necessary. EDAS consists of two sub-systems: GOLDA for experimental data acquisition on PDP-11's and SATAN mainly for off-line analysis in replay mode on large IBM mainframes. The capacity of one IBM 3081 CPU is mainly dedicated to EDAS processing and is almost fully utilized by this application. More than 200 users from GSI as well as from collaborating laboratories and universities use SATAN in more than 100 sessions daily needing 10 to 20 hours of user CPU time. EDAS is designed as an open system. (orig./HSI)

  7. Transboundary diagnostic analysis. Vol. 1. Issues, proximate and root causes

    OpenAIRE

    2012-01-01

    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; and pollution and water quality. Volume 1 describes the transboundary issues in BOBLME and their proximate and underlying root causes. These will be used to deve...

  8. Towards automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, M.; Quist, M.; Spreeuwers, L.J.; Paetsch, I.; Al-Saadi, N.; Nagel, E.

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and reliable automatic image analysis methods. This paper focuses on the automatic evaluation of th

  9. Handbook of nuclear data for neutron activation analysis. Vol. I

    International Nuclear Information System (INIS)

    The first part of a two-volume book which is meant for experimentalists working in instrumental activation analysis and related fields, such as nuclear metrology, materials testing and environmental studies. The volume describes the basic processes of gamma-ray interaction with matter as well as the important phenomena affecting gamma-ray spectra formation in semiconductor spectrometers. A brief account is also given of computation methods commonly employed for spectra evaluation. The results rather than detailed derivations are stressed. The material is divided into five chapters and nine appendices. The inclusion of many tables of significant spectroscopic data should make the text a useful handbook for those dealing with multi-channel gamma-ray spectra. (author) 26 figs., 82 tabs., 334 refs

  10. Dynamic Analysis of a Pendulum Dynamic Automatic Balancer

    Directory of Open Access Journals (Sweden)

    Jin-Seung Sohn

    2007-01-01

    The automatic dynamic balancer is a device to reduce the vibration from the unbalanced mass of rotors. Instead of the prevailing ball-type automatic dynamic balancer, a pendulum automatic dynamic balancer is analyzed. For the analysis of dynamic stability and behavior, the nonlinear equations of motion for the system are derived in polar coordinates by Lagrange's equations. The perturbation method is applied to investigate the dynamic behavior of the system around the equilibrium position. Based on the linearized equations, the dynamic stability of the system around the equilibrium positions is investigated by eigenvalue analysis.

  11. Automatic Gait Recognition by Symmetry Analysis

    OpenAIRE

    Hayfron-Acquah, James B.; Nixon, Mark S.; Carter, John N.

    2001-01-01

    We describe a new method for automatic gait recognition based on analysing the symmetry of human motion, by using the Generalised Symmetry Operator. This operator, rather than relying on the borders of a shape or on general appearance, locates features by their symmetrical properties. This approach is reinforced by the psychologists' view that gait is a symmetrical pattern of motion and by other works. We applied our new method to two different databases and derived gait signatures for silhou...

  12. Automatic quantitative analysis of morphology of apoptotic HL-60 cells

    OpenAIRE

    Liu, Yahui; Lin, Wang; Yang, Xu; Liang, Weizi; Zhang, Jun; Meng, Maobin; Rice, John R.; Sa, Yu; Feng, Yuanming

    2014-01-01

    Morphological identification is a widespread procedure to assess the presence of apoptosis by visual inspection of morphological characteristics or fluorescence images. The procedure is lengthy and the results are observer-dependent. A quantitative automatic analysis is objective and would greatly help the routine work. We developed an image processing and segmentation method which combined Otsu thresholding and morphological operators for apoptosis study. An automatic determina...
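
    A minimal sketch of the segmentation recipe named in the abstract (Otsu thresholding followed by morphological cleanup), here built on scikit-image; the structuring-element sizes and the choice of area/eccentricity descriptors are assumptions for illustration, not the authors' exact pipeline:

        from skimage.filters import threshold_otsu
        from skimage.morphology import binary_opening, binary_closing, disk
        from skimage.measure import label, regionprops

        def segment_cells(gray_img):
            """Threshold with Otsu, clean the mask morphologically, label cells."""
            mask = gray_img > threshold_otsu(gray_img)
            mask = binary_opening(mask, disk(2))   # drop small speckle
            mask = binary_closing(mask, disk(2))   # fill small holes in cells
            labels = label(mask)
            # Per-cell morphology: area and eccentricity as example descriptors
            return [(r.area, r.eccentricity) for r in regionprops(labels)]

    Quantitative apoptosis scoring would then be built on top of per-cell descriptors like these.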

  13. Automatic learning strategies and their application to electrophoresis analysis

    OpenAIRE

    Roch, Christian Maurice; Pun, Thierry; Hochstrasser, Denis; Pellegrini, Christian

    1989-01-01

    Automatic learning plays an important role in image analysis and pattern recognition. A taxonomy of automatic learning strategies is presented; this categorization is based on the amount of inference the learning element must perform to bridge the gap between the environmental and system knowledge representation levels. Four main categories are identified and described: rote learning, learning by deduction, learning by induction, and learning by analogy. An application of learning by induction to...

  14. Accuracy analysis of automatic distortion correction

    Directory of Open Access Journals (Sweden)

    Kolecki Jakub

    2015-06-01

    The paper addresses the problem of automatic distortion removal from images acquired with a non-metric SLR camera equipped with prime lenses. From the photogrammetric point of view, the following question arises: is the accuracy of the distortion control data provided by the manufacturer for a certain lens model (not an individual lens item) sufficient to achieve the demanded accuracy? In order to obtain a reliable answer to this problem, two kinds of tests were carried out for three lens models.

  15. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 3, Practical Use

    DEFF Research Database (Denmark)

    Jensen, Kurt

    ...and experiences from the projects, in a way which is useful also for readers who do not yet have personal experience with the construction and analysis of large CPN models. The volume demonstrates the feasibility of using CP-nets and the CPN tools for industrial projects. The presentation of the projects is based ... of the CPN models and some of the analysis results. This has been possible since Vols. 1 and 2 have given the readers a much more thorough knowledge of CP-nets than readers of ordinary research papers. Finally, it is discussed how some of the problems from the projects can be overcome or circumvented. Many ... of these problems have already been removed, e.g., by improvements of the CPN tools. Other problems can be avoided by a careful choice of modelling and analysis techniques. The material has been modified in cooperation with the original authors and the final result has been approved by them. The conclusions...

  16. Automatic analysis of signals during Eddy currents controls

    International Nuclear Information System (INIS)

    A method and the corresponding instrument have been developed for the automatic analysis of eddy current testing signals. The apparatus simultaneously analyses, every 2 milliseconds, two signals at two different frequencies. It can be used either on-line with an eddy current testing instrument or with a magnetic tape recorder.

  17. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    Science.gov (United States)

    Denecke, Kerstin

    2016-01-01

    Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unexploited, since retrieval and analysis are difficult and time-consuming and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for an automatic analysis of incident reports, but there are still challenges to be solved. PMID:27139389

  18. Automatic analysis of microscopic images of red blood cell aggregates

    Science.gov (United States)

    Menichini, Pablo A.; Larese, Mónica G.; Riquelme, Bibiana D.

    2015-06-01

    Red blood cell (RBC) aggregation is one of the most important factors in blood viscosity at stasis or at very low rates of flow. The basic structure of aggregates is a linear array of cells, commonly termed rouleaux. Enhanced or abnormal aggregation is seen in clinical conditions such as diabetes and hypertension, producing alterations in the microcirculation, some of which can be analyzed through the characterization of aggregated cells. Frequently, image processing and analysis for the characterization of RBC aggregation have been done manually or semi-automatically using interactive tools. We propose a system that processes images of RBC aggregation and automatically obtains the characterization and quantification of the different types of RBC aggregates. The present technique could be adapted for routine use in hemorheological and clinical biochemistry laboratories, because this automatic method is rapid, efficient and economical, and at the same time independent of the user performing the analysis (ensuring repeatability of the analysis).

  19. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system that is comprised of scripts written in perl, c-shell and awk. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise the system converts the commands to binary, and then sends the resultant information to be radiated to the spacecraft.

  20. Requirements for Automatic Performance Analysis - APART Technical Report

    OpenAIRE

    Riley, Graham D.; Gurd, John R.

    1999-01-01

    This report discusses the requirements for automatic performance analysis tools. The discussion proceeds by first examining the nature and purpose of performance analysis. This results in an identification of the sources of performance data available to the analysis process and some properties of the process itself. Consideration is then given to the automation of the process. Many environmental factors affecting the performance analysis process are identified leading to the definition of a s...

  1. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair Neuman

    2015-06-01

    School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  2. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin;

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic, un...

  3. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  4. METHOD FOR AUTOMATIC ANALYSIS OF WHEAT STRAW PULP CELL TYPES

    Directory of Open Access Journals (Sweden)

    Mikko Karjalainen

    2012-01-01

    Agricultural residues are receiving increasing interest in the study of renewable raw materials for industrial use. Residues, generally referred to as nonwood materials, are usually complex materials. Wheat straw is one of the most abundant agricultural residues around the world and is therefore available for extensive industrial use. However, more information on its cell types is needed to utilize wheat straw efficiently in pulp and papermaking. The pulp cell types and particle dimensions of wheat straw were studied using an optical microscope and an automatic optical fibre analyzer. The role of various cell types in wheat straw pulp and papermaking is discussed. Wheat straw pulp components were categorized according to particle morphology, and categorization with an automatic optical analyzer was used to determine wheat straw pulp cell types. The results from automatic optical analysis were compared to those from microscopic analysis and a good correlation was found. Automatic optical analysis was found to be a promising tool for the in-depth analysis of wheat straw pulp cell types.

  5. Application of software technology to automatic test data analysis

    Science.gov (United States)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  6. Automatic analysis of double coronal mass ejections from coronagraph images

    Science.gov (United States)

    Jacobs, Matthew; Chang, Lin-Ching; Pulkkinen, Antti; Romano, Michelangelo

    2015-11-01

    Coronal mass ejections (CMEs) can have major impacts on man-made technology and humans, both in space and on Earth. These impacts have created a high interest in the study of CMEs in an effort to detect and track events and forecast the CME arrival time to provide time for proper mitigation. A robust automatic real-time CME processing pipeline is greatly desired to avoid laborious and subjective manual processing. Automatic methods have been proposed to segment CMEs from coronagraph images and estimate CME parameters such as their heliocentric location and velocity. However, existing methods suffered from several shortcomings such as the use of hard thresholding and an inability to handle two or more CMEs occurring within the same coronagraph image. Double-CME analysis is a necessity for forecasting the many CME events that occur within short time frames. Robust forecasts for all CME events are required to fully understand space weather impacts. This paper presents a new method to segment CME masses and pattern recognition approaches to differentiate two CMEs in a single coronagraph image. The proposed method is validated on a data set of 30 halo CMEs, with results showing comparable ability in transient arrival time prediction accuracy and the new ability to automatically predict the arrival time of a double-CME event. The proposed method is the first automatic method to successfully calculate CME parameters from double-CME events, making this automatic method applicable to a wider range of CME events.

  7. Automatic Facial Measurements for Quantitative Analysis of Rhinoplasty

    Directory of Open Access Journals (Sweden)

    Mousa Shamsi

    2007-08-01

    Proposing automated algorithms for quantitative analysis of facial images based on facial features may assist surgeons to validate the success of nose surgery in an objective and reproducible manner. In this paper, we attempt to develop automatic procedures for quantitative analysis of rhinoplasty operations based on several standard linear and spatial features. The main processing steps include image enhancement, correction of varying illumination effects, automatic facial skin detection, automatic feature extraction, facial measurements and surgery analysis. For quantitative analysis of nose surgery, we randomly selected 100 patients from the database provided by the ENT division of Imam Hospital, Tehran, Iran. The frontal and profile images of these patients before and after rhinoplasty were available for experiments. For statistical analysis, two clinical nasal parameters, i.e., the nasolabial angle and the nasal projection ratio, were computed. The mean and standard deviation of the nasolabial angle by manual measurement of a specialist were 95.98° (±9.58°) and 111.02° (±10.07°) before and after nose surgery, respectively. The proposed algorithm automatically computed this parameter as 94.12° (±8.86°) and 109.65° (±8.86°) before and after nose surgery. In addition, the proposed algorithm automatically computed the nasal projection by Goode's method as 0.584 (±0.0491) and 0.537 (±0.066) before and after nose surgery, respectively. Meanwhile, this parameter was manually measured by a specialist as 0.576 (±0.052) and 0.537 (±0.077) before and after nose surgery, respectively. The results of the proposed facial skin segmentation and feature detection algorithms, and the estimated values for the above two clinical parameters on the mentioned datasets, indicate that the techniques are applicable in the common clinical practice of nose surgery.

  8. Formalising responsibility modelling for automatic analysis

    OpenAIRE

    Simpson, Robbie; Storer, Tim

    2015-01-01

    Modelling the structure of socio-technical systems as a basis for informing software system design is a difficult compromise. Formal methods struggle to capture the scale and complexity of the heterogeneous organisations that use technical systems. Conversely, informal approaches lack the rigour needed to inform the software design and construction process or to enable automated analysis. We revisit the concept of responsibility modelling, which models socio-technical systems as a collec...

  9. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...

  10. Effectiveness of an Automatic Tracking Software in Underwater Motion Analysis

    Directory of Open Access Journals (Sweden)

    Fabrício A. Magalhaes

    2013-12-01

    Tracking of markers placed on anatomical landmarks is a common practice in sports science to perform the kinematic analysis that interests both athletes and coaches. Although different software programs have been developed to automatically track markers and/or features, none of them was specifically designed to analyze underwater motion. Hence, this study aimed to evaluate the effectiveness of a software program developed for automatic tracking of underwater movements (DVP), based on the Kanade-Lucas-Tomasi feature tracker. Twenty-one video recordings of different aquatic exercises (n = 2940 marker positions) were manually tracked to determine the markers' center coordinates. Then, the videos were automatically tracked using DVP and a commercially available software program (COM). Since tracking techniques may produce false targets, an operator was instructed to stop the automatic procedure and correct the position of the cursor whenever the distance between the calculated marker coordinate and the reference one was greater than 4 pixels. The proportion of manual interventions required by the software was used as a measure of the degree of automation. Overall, manual interventions were 10.4% lower for DVP (7.4%) than for COM (17.8%). Moreover, when examining the different exercise modes separately, the percentage of manual interventions was 5.6% to 29.3% lower for DVP than for COM. Similar results were observed when analyzing the type of marker rather than the type of exercise, with 9.9% fewer manual interventions for DVP than for COM. In conclusion, based on these results, the automatic tracking software presented here can be used as a valid and useful tool for underwater motion analysis.
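
    A sketch of the validation loop the record describes: pyramidal Kanade-Lucas-Tomasi tracking with a simulated operator intervention whenever the track drifts more than 4 pixels from reference coordinates. Built on OpenCV; the function and its inputs are hypothetical illustrations, not part of DVP:

        import cv2
        import numpy as np

        def intervention_rate(video_path, reference, tol=4.0):
            """Track one marker with pyramidal Lucas-Kanade and return the
            fraction of frames needing manual correction. `reference` holds
            one manually tracked (x, y) coordinate per frame."""
            cap = cv2.VideoCapture(video_path)
            ok, frame = cap.read()
            prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            pts = np.array([[reference[0]]], dtype=np.float32)  # shape (1, 1, 2)
            interventions = 0
            for ref in reference[1:]:
                ok, frame = cap.read()
                if not ok:
                    break
                gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
                pts, status, _ = cv2.calcOpticalFlowPyrLK(prev, gray, pts, None)
                if np.linalg.norm(pts[0, 0] - np.asarray(ref)) > tol:
                    pts[0, 0] = ref            # simulate the operator's correction
                    interventions += 1
                prev = gray
            cap.release()
            return interventions / max(len(reference) - 1, 1)

    The study's degree-of-automation measure corresponds to this intervention rate, averaged over markers and exercises.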

  11. Facilitator control as automatic behavior: A verbal behavior analysis

    OpenAIRE

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is autom...

  12. ATLASWatchMan, a tool for automatized data analysis

    International Nuclear Information System (INIS)

    The ATLAS detector will soon start taking data, and many New Physics phenomena are expected. The ATLASWatchMan package has been developed following the principles of CASE (Computer Aided Software Engineering): it helps the user set up an analysis by automatically generating the actual analysis code and data files from user settings. ATLASWatchMan provides a light and transparent framework to plug in user-defined cuts and algorithms and to look at as many channels as the user wants, running the analysis both locally and on the Grid. Examples of analyses run with the package using the latest release of the ATLAS software are shown.

  13. Development of an automatic identification algorithm for antibiogram analysis.

    Science.gov (United States)

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analyses, which can present difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) has been proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested in susceptibility tests performed with 12 different antibiotics, for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA and the results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based readings. Agreement was observed in 88% of cases, and 89% of the tests showed no difference; the remainder involved reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, AIA may be a practical tool for the automated reading of antibiograms in diagnostic and microbiology laboratories. PMID:26513468
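
    The inter-reader agreement step can be reproduced with the weighted kappa as implemented in scikit-learn; the zone diameters below are made-up values for illustration, not data from the study:

        from sklearn.metrics import cohen_kappa_score

        # Inhibition-zone diameters (mm), rounded to the nearest millimetre,
        # as read by the algorithm (AIA) and by a human reference reader.
        aia_mm   = [18, 22, 25, 11, 30, 17, 24, 19]
        human_mm = [18, 21, 25, 12, 30, 17, 25, 19]

        # Linear weights penalise disagreements in proportion to their size,
        # which suits ordinal measurements such as zone diameters.
        kappa = cohen_kappa_score(aia_mm, human_mm, weights="linear")
        print(f"weighted kappa: {kappa:.3f}")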

  14. Enhanced Automatic Wavelet Independent Component Analysis for Electroencephalographic Artifact Removal

    Directory of Open Access Journals (Sweden)

    Nadia Mammone

    2014-12-01

    Electroencephalography (EEG) is a fundamental diagnostic instrument for many neurological disorders, and it is the main tool for the investigation of the cognitive or pathological activity of the brain through the bioelectromagnetic fields that it generates. Correct interpretation of the EEG, both in clinicians' visual evaluation and in automated procedures, is hampered by artifacts. As a consequence, artifact rejection in EEG is a key preprocessing step, and the quest for reliable automatic processors has been growing quickly in the last few years. Recently, a promising automatic methodology, known as automatic wavelet-independent component analysis (AWICA), has been proposed. In this paper, a more efficient and sensitive version, called enhanced-AWICA (EAWICA), is proposed, and an extensive performance comparison is carried out by tuning the different parameters involved in artifact detection. EAWICA is shown to minimize information loss and to outperform AWICA in artifact removal, both on simulated and real experimental EEG recordings.

  15. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Background: Clinical bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship between clinical states and biological machineries may help the development of this field. For instance, the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in the patient genome and drug metabolism, detecting SNPs (single nucleotide polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g., the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results: We developed DMET-Analyzer, a tool for the automatic association analysis between variation in the patient genome and the clinical condition of patients, i.e., the different response to drugs. The proposed system allows: (i) automation of the analysis workflow for DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing SNP databases (e.g., dbSNP); (iii) association of SNPs with pathways through search in PharmGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer is demonstrated through different...
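
    The core association test, SNP presence versus drug response, can be sketched as a 2x2 contingency-table test; the table values below are hypothetical, and Fisher's exact test is one reasonable choice for such small-count tables, not necessarily the only statistic DMET-Analyzer applies:

        from scipy.stats import fisher_exact

        # Hypothetical 2x2 contingency table for one DMET probe:
        # rows: SNP variant present / absent; columns: responder / non-responder
        table = [[14, 6],
                 [5, 15]]
        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")

    In practice such a test is repeated per SNP, so a multiple-testing correction over all probes would also be needed.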

  16. Automatic Video-based Analysis of Human Motion

    DEFF Research Database (Denmark)

    Fihl, Preben

    The human motion contains valuable information in many situations, and people frequently perform an unconscious analysis of the motion of other people to understand their actions, intentions, and state of mind. An automatic analysis of human motion will facilitate many applications and thus has received great interest from both industry and research communities. The focus of this thesis is on video-based analysis of human motion, and the thesis presents work within three overall topics, namely foreground segmentation, action recognition, and human pose estimation. Foreground segmentation is often ... A multi-view approach to pose estimation is also presented that integrates low-level information from different cameras to generate better pose estimates during heavy occlusions. The works presented in this thesis contribute in these different areas of video-based analysis of human motion and altogether ...

  17. Automatic type-curve matching for well test analysis

    Energy Technology Data Exchange (ETDEWEB)

    Abbaszadeh, M.; Kamal, M.M.

    1988-09-01

    This paper presents a general method for automatic well test interpretation. The method matches the field data with theoretical reservoir models using a constrained, nonlinear, least-squares regression technique coupled with numerical Laplace inversion of pressure-drawdown equations. Pressure gradients are computed by forward finite-difference approximations. Hence, reservoir models whose pressure gradients are difficult to obtain analytically can be readily included. Only equations for drawdown type curves of reservoir models are needed. Simulated pressure tests and actual field data are analyzed to illustrate the application of the method. The method reduces the time required to perform well test analysis and minimizes the subjectivity of interpretation.
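
    A compact sketch of the two ingredients the abstract combines: numerical Laplace inversion (here Gaver-Stehfest, one common choice) inside a nonlinear least-squares match. The two-parameter Laplace-space model is hypothetical, standing in for a real drawdown type curve:

        import numpy as np
        from math import factorial
        from scipy.optimize import least_squares

        def stehfest(F, t, n=12):
            """Invert a Laplace-domain function F(s) at time t (Gaver-Stehfest, n even)."""
            total = 0.0
            ln2_t = np.log(2.0) / t
            for k in range(1, n + 1):
                v = 0.0
                for j in range((k + 1) // 2, min(k, n // 2) + 1):
                    v += (j ** (n // 2) * factorial(2 * j)
                          / (factorial(n // 2 - j) * factorial(j) * factorial(j - 1)
                             * factorial(k - j) * factorial(2 * j - k)))
                total += (-1) ** (k + n // 2) * v * F(k * ln2_t)
            return ln2_t * total

        def drawdown_model(params, t):
            a, b = params   # hypothetical two-parameter Laplace-space type curve
            return np.array([stehfest(lambda s: a / (s * (s + b)), ti) for ti in t])

        def residuals(params, t, observed):
            return drawdown_model(params, t) - observed

        t = np.linspace(0.1, 10, 20)
        observed = drawdown_model([2.0, 0.5], t)          # synthetic "field" data
        fit = least_squares(residuals, x0=[1.0, 1.0], args=(t, observed),
                            bounds=([0, 0], [np.inf, np.inf]))
        print(fit.x)   # should recover ~[2.0, 0.5]

    The bounds argument plays the role of the paper's constraints, and least_squares computes gradients by forward finite differences by default, mirroring the approach the abstract describes.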

  18. Spectral saliency via automatic adaptive amplitude spectrum analysis

    Science.gov (United States)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing the amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and on bottom-up saliency detection characterized by spectrum scale-space analysis of natural images, we propose to detect visual saliency, especially for salient objects of different sizes and locations, via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also retain the saliency maps corresponding to different salient objects, with meaningful saliency information, by adaptive weighted combination. Quantitative and qualitative comparisons are evaluated with three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
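
    The core frequency-domain operation can be sketched compactly. The code below smooths the amplitude spectrum at a fixed, hand-chosen scale and reconstructs a saliency map from the smoothed amplitude and the original phase; the automatic scale selection and adaptive weighted combination that constitute the paper's contribution are not reproduced.

```python
# Sketch of frequency-domain saliency by smoothing the amplitude spectrum
# at one fixed scale (scale selection here is manual, unlike the paper).
import numpy as np
from scipy.ndimage import gaussian_filter

def spectral_saliency(image, scale=3.0):
    f = np.fft.fft2(image.astype(float))
    amplitude, phase = np.abs(f), np.angle(f)
    # smoothing the amplitude spectrum suppresses the spectral spikes that
    # correspond to repeated, non-salient image patterns
    smoothed = gaussian_filter(amplitude, sigma=scale)
    # rebuild the image from the smoothed amplitude and the original phase
    sal = np.abs(np.fft.ifft2(smoothed * np.exp(1j * phase))) ** 2
    return gaussian_filter(sal, sigma=2.0)  # light post-smoothing of the map
```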

  19. Automatic visual tracking and social behaviour analysis with multiple mice.

    Directory of Open Access Journals (Sweden)

    Luca Giancardo

    Full Text Available Social interactions are made of complex behavioural actions that might be found in all mammals, including humans and rodents. Recently, mouse models have increasingly been used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify the social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts, for each frame and for each mouse in the cage, one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple-mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to ours, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57BL/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different...

  20. Automatic Monitoring Electronic Tongue with MEAs for Environmental Analysis

    Institute of Scientific and Technical Information of China (English)

    Shaofang Zou; Hong Men; Yi Li; Yinping Wang; Ping Wang

    2006-01-01

    An automatic monitoring electronic tongue based on differential pulse stripping voltammetry (DPSV) was developed for heavy metals analysis. Simultaneous detections of trace Zn(Ⅱ), Cd(Ⅱ), Pb(Ⅱ), Cu(Ⅱ), Fe(Ⅲ) and Cr(Ⅲ) in water samples were performed with three electrochemical sensors. The sensor chip combined a silicon-based Hg-coated Au microelectrode array (MEA) as the working electrode on one side with an Ag/AgCl reference electrode and a Pt counter electrode on the other side. With a computer controlled multipotentiostat, pumps and valves, the electronic tongue realized in-situ real-time detection of the six metals mentioned above at parts-per-billion level without manual operation.

  1. QCS: Driving automatic data analysis programs for TFTR

    International Nuclear Information System (INIS)

    QCS (Queue Control System) executes on the VAX Cluster, driving programs which provide automatic analysis of per-shot data for the TFTR experiment at PPPL. QCS works in conjunction with site-specific programs to provide these noninteractive user programs with shot numbers for which all necessary conditions have been satisfied to permit processing. A typical condition is the existence of a particular data file or set of files for a shot. The user provides a Boolean expression of the conditions upon which a shot number should be entered into a private queue. The user program requests a "ready-to-process" shot number through a call to a specially provided function. If the specified queue is empty, the program hibernates until another shot number is available
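
    A minimal sketch of the idea, with hypothetical file names and polling logic standing in for QCS's VAX implementation: a user program asks for the next shot whose conditions are all satisfied and sleeps when none is ready.

```python
# Illustrative sketch (not QCS itself): return the next shot number whose
# file-existence conditions are all met, "hibernating" when none is ready.
import os
import time

def conditions_met(shot):
    # stand-in for the user's Boolean expression over data-file conditions
    required = [f"shot_{shot}.magnetics", f"shot_{shot}.thomson"]
    return all(os.path.exists(f) for f in required)

def next_ready_shot(pending, poll_seconds=30):
    while True:
        for shot in sorted(pending):
            if conditions_met(shot):
                pending.remove(shot)
                return shot
        time.sleep(poll_seconds)  # wait until new data files may exist

# usage: shot = next_ready_shot({10234, 10235, 10236})
```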

  2. AUTOMATIC LICENSE PLATE LOCALISATION AND IDENTIFICATION VIA SIGNATURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Lorita Angeline

    2014-02-01

    Full Text Available A new algorithm for license plate localisation and identification is proposed on the basis of signature analysis. Signature analysis has been used to locate license plate candidates, and its properties can be further utilised to support and affirm license plate character recognition. This paper presents signature analysis and an improved version of conventional Connected Component Analysis (CCA) to design automatic license plate localisation and identification. A procedure called the Euclidean distance transform is added to the conventional CCA in order to tackle the multiple bounding boxes that occur. The developed algorithm, SAICCA, achieved a 92% success rate, with an 8% localisation failure rate due to restrictions such as insufficient light level, clarity and license plate perceptual information. The processing time for license plate localisation and recognition is a crucial criterion that needs to be considered. Therefore, this paper has utilised several approaches to decrease the processing time to an optimal value. The results obtained show that the proposed system is capable of being implemented in both ideal and non-ideal environments.
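
    The conventional CCA stage is easy to sketch with OpenCV; the thresholds, aspect-ratio test and input file below are illustrative assumptions, and the SAICCA-specific signature analysis is only hinted at.

```python
# Hedged sketch of the connected-component stage of plate localisation
# (all parameter values and the input image are hypothetical).
import cv2

img = cv2.imread("car.jpg", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)

# connected component analysis with per-component statistics
n, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
candidates = []
for i in range(1, n):                      # label 0 is the background
    x, y, w, h, area = stats[i]
    if 2.0 < w / max(h, 1) < 6.0 and area > 400:   # plate-like aspect ratio
        candidates.append((x, y, w, h))

# Euclidean distance transform, shown only to indicate the extra step the
# paper adds for resolving multiple overlapping bounding boxes
dist = cv2.distanceTransform(binary, cv2.DIST_L2, 5)
```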

  3. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps, and, based on the information derived, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams, enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  4. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets, which allow the modeller to investigate dynamic properties of CP-nets. The main ideas behind the analysis methods are described, as well as the mathematics on which they are based and how the methods are supported by computer tools. Some parts of the volume are theoretical while others are application oriented. The purpose of the volume is to teach the reader how to use the formal analysis methods, which does not require a deep understanding of the underlying mathematical theory.

  5. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is, however, hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected, and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected, and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later be compared across different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
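
    A minimal sketch of the parameter-measurement step on a synthetic time-intensity profile; the registration and boundary-detection stages, and all numbers, are placeholders.

```python
# Measuring perfusion parameters from a time-intensity profile during the
# first pass of the contrast agent (synthetic data, made-up interval).
import numpy as np

t = np.arange(0, 60, 1.0)                          # seconds
signal = 100 * np.exp(-0.5 * ((t - 20) / 5) ** 2)  # synthetic first pass

first_pass = slice(10, 35)                         # detected passage interval
ti, si = t[first_pass], signal[first_pass]

peak = si.max()                                    # peak enhancement
time_to_peak = ti[si.argmax()]
max_upslope = np.max(np.gradient(si, ti))          # steepest wash-in slope
print(peak, time_to_peak, max_upslope)
```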

  6. Automatic Feature Interaction Analysis in PacoSuite

    Directory of Open Access Journals (Sweden)

    Wim Vanderperren

    2004-10-01

    Full Text Available In this paper, we build upon previous work that aims at incorporating aspect-oriented ideas into component-based software development. In that research, a composition adapter was proposed in order to capture crosscutting concerns in the PacoSuite component-based methodology. A composition adapter is visually applied onto a given component composition and the changes it describes are automatically applied. Stacking multiple composition adapters onto the same component composition can, however, lead to unpredictable and undesired side-effects. In this paper, we propose a solution for this issue, widely known as the feature interaction problem. We present a classification of different interaction levels among composition adapters and the algorithms required to verify them. The proposed algorithms are, however, of exponential complexity and depend on both the composition adapters and the component composition as a whole. In order to enhance the performance of our feature interaction analysis, we present a set of theorems that define the interaction levels solely in terms of the properties of the composition adapters themselves.

  7. Automatic analysis of ciliary beat frequency using optical flow

    Science.gov (United States)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g. primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent, and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, applying the corner detection of Shi and Tomasi. Discrimination between background/noise and cilia by a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes found again) by the method; an easy way to identify the correct sub-path of a point's path has yet to be found for the cases where the slope method does not work.
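
    The tracking-plus-FFT pipeline can be sketched with OpenCV and NumPy; the video file and parameter values are assumptions, and the NumPy FFT stands in for the paper's MATLAB frequency analysis.

```python
# Hedged sketch: track high-contrast points with pyramidal Lucas-Kanade
# optical flow, then take the dominant FFT frequency of one trajectory.
import cv2
import numpy as np

cap = cv2.VideoCapture("cilia_highspeed.avi")   # hypothetical recording
fps = cap.get(cv2.CAP_PROP_FPS)
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

# Shi-Tomasi corner detection picks trackable high-contrast objects
pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                              qualityLevel=0.01, minDistance=5)
tracks = [pts.reshape(-1, 2)]
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, gray, pts, None)
    tracks.append(pts.reshape(-1, 2))
    prev_gray = gray

# beat frequency of one point: dominant FFT component of its y-trajectory
traj = np.array([frame_pts[0, 1] for frame_pts in tracks])
spectrum = np.abs(np.fft.rfft(traj - traj.mean()))
freqs = np.fft.rfftfreq(traj.size, d=1.0 / fps)
print("dominant frequency (Hz):", freqs[spectrum.argmax()])
```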

  8. Principal Component Analysis and Automatic Relevance Determination in Damage Identification

    CERN Document Server

    Mdlazi, L; Stander, C J; Scheffer, C; Heyns, P S

    2007-01-01

    This paper compares two neural network input selection schemes, Principal Component Analysis (PCA) and Automatic Relevance Determination (ARD) based on MacKay's evidence framework. PCA takes all the input data and projects it onto a lower-dimensional space, thereby reducing the dimension of the input space. This input reduction method often results in parameters that have a significant influence on the dynamics of the data being diluted by those that do not. ARD selects the most relevant input parameters and discards those that do not contribute significantly to the dynamics of the data being modelled. ARD sometimes results in important input parameters being discarded, thereby compromising the dynamics of the data. The PCA and ARD methods are implemented together with a Multi-Layer Perceptron (MLP) network for fault identification in structures, and the performance of the two methods is assessed. It is observed that ARD and PCA give similar accuracy le...
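
    A minimal sketch of the PCA input-reduction scheme with scikit-learn; the dimensions and data are made up, and the ARD alternative, which ranks and discards individual inputs, is not shown.

```python
# PCA input reduction before training an MLP (illustrative data only).
import numpy as np
from sklearn.decomposition import PCA

X = np.random.randn(500, 20)        # 500 samples of 20 input parameters
pca = PCA(n_components=5)           # keep 5 principal components
X_reduced = pca.fit_transform(X)

print("explained variance:", pca.explained_variance_ratio_.sum())
# X_reduced (500 x 5) replaces X as the MLP input; note the caveat in the
# abstract: influential parameters can be diluted by uninfluential ones.
```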

  9. Analysis of automatic rigid image-registration on tomotherapy

    International Nuclear Information System (INIS)

    The purpose of this study was to analyze translational and rotational adjustments during automatic rigid image-registration using different control parameters for a total of five groups on TomoTherapy (Accuray Inc, Sunnyvale, CA, USA). We selected a total of 50 patients, classified them into five groups (brain, head-and-neck, lung, abdomen and pelvic), and used a total of 500 megavoltage computed tomography (MVCT) image sets for the analysis. From this we calculated the overall mean value (M) for systematic and random errors after applying the different control parameters. After randomization of the patients into the five groups, we found that the overall mean value varied according to the three techniques and resolutions. The deviations for the lung, abdomen and pelvic groups were generally greater than those for the brain and head-and-neck groups in all adjustments. Overall, using the “full-image” technique produces smaller deviations in the rotational adjustments. We found that rotational adjustment shows distinctly different deviations for different control parameters. We concluded that using a combination of the “full-image” technique and “standard” resolution will be helpful in assisting with patients’ repositioning and in correcting for set-up errors prior to radiotherapy on TomoTherapy

  10. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research, based on articles published in four journals (International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education) from 1990 to 2007. A multi-stage clustering technique was employed to investigate with what topics, along what development trends, and through whose contributions the journal publications constructed science education as a research field. This study found that Conceptual Change & Concept Mapping was the most studied topic, although the number of publications on it declined slightly in the 2000s. Studies on the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  11. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: the data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the data necessary to fit, by least squares, a third-degree equation relating energy to peak position. Criteria are given for determining whether certain groups of values constitute a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape, ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, (c) the area of the peak with its statistical error determined by the method of Wasson. As an option, the complete spectrum with the determined background can be plotted. (author)
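
    The peak-fitting step translates naturally into a short sketch: a Gaussian on a linear background fitted to synthetic channel counts. The original ran on an HP 9810A desk computer, so everything below is a modern stand-in.

```python
# Gaussian-plus-linear-background fit to a group of channels flagged as a
# peak (synthetic data; the channel range and parameters are made up).
import numpy as np
from scipy.optimize import curve_fit

def peak(x, area, centre, sigma, b0, b1):
    g = area / (sigma * np.sqrt(2 * np.pi)) \
        * np.exp(-0.5 * ((x - centre) / sigma) ** 2)
    return g + b0 + b1 * x          # Gaussian on a linear background

ch = np.arange(100, 140)
counts = np.random.poisson(peak(ch, 5000, 120, 2.5, 40, 0.1))

p, cov = curve_fit(peak, ch, counts, p0=[3000, 120, 2, 50, 0])
area_err = np.sqrt(np.diag(cov))[0]
print(f"peak at channel {p[1]:.2f}, area {p[0]:.0f} +/- {area_err:.0f}")
```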

  12. Indexing of Arabic documents automatically based on lexical analysis

    OpenAIRE

    Molijy, Abdulrahman Al; Hmeidi, Ismail; Alsmadi, Izzat

    2012-01-01

    The continuous information explosion through the Internet and all information sources makes it necessary to perform all information processing activities automatically, in quick and reliable manners. In this paper, we proposed and implemented a method to automatically create an index for books written in the Arabic language. The process depends largely on text summarization and abstraction processes to collect the main topics and statements in the book. The process is developed in terms of accuracy a...

  13. Automatic analysis of BEBC pictures using the flying spot digitizer

    CERN Document Server

    Bacilieri, P; Luvisetto, M L; Masetti, M; Matteuzzi, P; Simoni, L

    1977-01-01

    In future experiments at CERN using the SPS (Super Proton Synchrotron), pictures will be obtained from the big bubble chambers, BEBC and Gargamelle. Until now only a few thousand BEBC pictures have been taken in experiments with the existing PS accelerator. BEBC pictures are much more difficult to analyse by means of automatic devices than those from the 2 m bubble chamber. It is therefore necessary either to upgrade the automatic measuring system based on an HPD (a mechanical Flying Spot Digitizer) or to develop new systems; for instance, Erasme, recently built at CERN. Through an upgrade of the HPD device, the use of a home-made processor to extract track segments, and the development of a new program chain that allows spiralizing tracks to be followed, a new system has been developed that allows the automatic processing of BEBC pictures. (1 refs).

  14. Automatic analysis of BEBC pictures using the flying spot digitizer

    International Nuclear Information System (INIS)

    In future experiments at CERN using the SPS (Super Proton Synchrotron), pictures will be obtained from the big bubble chambers, BEBC and Gargamelle. Until now only a few thousand BEBC pictures have been taken in experiments with the existing PS accelerator. BEBC pictures are much more difficult to analyse by means of automatic devices than those from the 2 m bubble chamber. It is therefore necessary either to upgrade the automatic measuring system based on an HPD (a mechanical Flying Spot Digitizer) or to develop new systems like, for instance, Erasme, recently built at CERN. Through an upgrade of the HPD device, the use of a home-made processor to extract track segments, the development of a new program chain that allows spiralizing tracks to be followed, and the use of an interactive refreshing display for assistance, a new system has been developed that allows the automatic processing of BEBC pictures. (Auth.)

  15. volBrain: An Online MRI Brain Volumetry System.

    Science.gov (United States)

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  16. volBrain: An Online MRI Brain Volumetry System

    Science.gov (United States)

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in a vast amount of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  17. Indexing of Arabic documents automatically based on lexical analysis

    CERN Document Server

    Molijy, Abdulrahman Al; Alsmadi, Izzat

    2012-01-01

    The continuous information explosion through the Internet and all information sources makes it necessary to perform all information processing activities automatically, in quick and reliable manners. In this paper, we proposed and implemented a method to automatically create an index for books written in the Arabic language. The process depends largely on text summarization and abstraction processes to collect the main topics and statements in the book. The process was assessed in terms of accuracy and performance, and the results showed that it can effectively replace the effort of manually indexing books and documents, a process that can be very useful in all information processing and retrieval applications.

  18. Identification and quantification of coffee volatile components by high-resolution gas chromatography/mass spectrometry using an automatic headspace sampler

    Directory of Open Access Journals (Sweden)

    Leonardo César AMSTALDEN

    2001-01-01

    Full Text Available Employing an automatic headspace sampler, the headspaces of three commercial brands of ground roasted coffee were qualitatively and quantitatively analyzed by gas chromatography/mass spectrometry for the volatile components responsible for aroma. Since the methodology did not involve aroma isolation or concentration, the natural proportions of the volatiles were maintained, providing a more accurate picture of the flavor composition and simplifying sample preparation. The automatic sampler also allowed good resolution of the chromatographic peaks without cryofocusing the samples at the head of the column during injection, reducing analysis time. Ninety-one compounds were identified; some known coffee volatiles, such as dimethyl sulphide, methional and furfuryl mercaptan, were not detected. The more concentrated volatiles could be quantified with the aid of two internal standards. The technique proved viable for both the characterization and the quantification of coffee volatiles.

  19. Spectral Curve Fitting for Automatic Hyperspectral Data Analysis

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    Automatic discovery and curve fitting of absorption bands in hyperspectral data can enable the analyst to identify materials present in a scene by comparison with library spectra. This procedure is common for laboratory spectra, but is challenging for sparse hyperspectral data. A procedure for robust discovery of overlapping bands in hyperspectral data is described in this paper. The method is capable of automatically discovering and fitting symmetric absorption bands, can separate overlapping absorption bands in a stable manner, and has relatively low sensitivity to noise. A comparison with techniques already available in the literature is presented using simulated spectra. An application is demonstrated utilizing the shortwave infrared (2.0–2.5 μm or 5000–4000 cm⁻¹) region. A small hyperspectral scene is processed to demonstrate the ability of the method to detect small shifts in absorption wavelength caused by varying white mica chemistry in a natural setting.

  20. Automatic analysis of ventilation and perfusion pulmonary scintigrams

    International Nuclear Information System (INIS)

    A fully automatic program is used to analyse pulmonary ventilation and perfusion scintigrams. The ventilation study is performed using a standard washin-washout 133Xe method; multiple-view late xenon washout images are also recorded. The perfusion study is performed using Tc-99m serum albumin. The FORTRAN program recognizes the different steps of the test, whatever their durations. It performs background subtraction, draws pulmonary regions of interest (ROIs) and calculates ventilation and perfusion parameters for each ROI and each lung. It also processes the multiple-view late xenon washout images in such a way that they give not only topographic information about hypoventilated regions, but also semi-quantitative information about the strength of xenon retention. During processing, the operator has only to check two intermediate results (e.g. the automatically determined pulmonary ROIs). All the numerical and processed iconographic results are obtained within 10 minutes after the end of the test. This program has already been used to analyse 1,000 pulmonary studies, during which correction of intermediate results has very rarely been necessary. This efficient and reliable automatic program is very useful for the daily practice of a nuclear medicine department

  1. Economic analysis of automatic flood irrigation for dairy farms in northern Victoria

    OpenAIRE

    Armstrong, Dan P.; Ho, Christie K.M.

    2011-01-01

    Interest in automatic flood irrigation is strong, given the labour and lifestyle benefits it can provide. An economic analysis of three automated flood irrigation systems for a dairy farm in northern Victoria indicated that automatic irrigation can be a profitable labour-saving investment in many cases. However, profitability was sensitive to the amount and value of the labour saved. Pneumatic and timer systems were good investments regardless of the area they were installed to service. The ...

  2. Explodet Project:. Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosive

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosive. The main parts of the software performing the automatic data elaboration are innovative algorithms for calculating the likely background, a neural-network-based system for energy calibration, and simple statistical methods for the qualitative consistency check of the signals.

  3. Statistical analysis of quality control of automatic processor

    International Nuclear Information System (INIS)

    Objective: To strengthen the scientific management of automatic processors and promote QC by analyzing the QC management chart of an automatic processor with statistical methods, and by evaluating and interpreting the data and trends of the chart. Method: The speed, contrast and minimum density of a step-wedge film strip were measured every day and recorded on the QC chart. The mean (x-bar), standard deviation (s) and range (R) were calculated, and the data and working trend were evaluated and interpreted for management decisions. Results: Using the relative frequency distribution curve constructed from the measured data, one can judge whether it is a symmetric bell-shaped curve. If not, it indicates that a few extremes overstepping the control limits are possibly pulling the curve to the left or right. If it is a normal distribution, the standard deviation (s) is observed. When x-bar ± 2s lies within the upper and lower control limits of the relative performance indexes, the processor worked in a stable state during that period. Conclusion: Guided by statistical methods, QC work becomes more scientific and quantified. This deepens the understanding and application of the trend chart and raises quality management to a new level
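
    A minimal sketch of the chart statistics described above, with made-up daily readings.

```python
# Daily QC readings reduced to mean, standard deviation and range, and
# flagged against x-bar +/- 2s control limits (illustrative data only).
import numpy as np

readings = np.array([1.20, 1.22, 1.19, 1.25, 1.21, 1.18, 1.30, 1.23])
xbar, s = readings.mean(), readings.std(ddof=1)
R = readings.max() - readings.min()

ucl, lcl = xbar + 2 * s, xbar - 2 * s          # control limits
out_of_control = (readings > ucl) | (readings < lcl)
print(f"x-bar={xbar:.3f}, s={s:.3f}, R={R:.3f}, "
      f"flagged days: {np.where(out_of_control)[0]}")
```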

  4. Structuring Lecture Videos by Automatic Projection Screen Localization and Analysis.

    Science.gov (United States)

    Li, Kai; Wang, Jue; Wang, Haoqian; Dai, Qionghai

    2015-06-01

    We present a fully automatic system for extracting the semantic structure of a typical academic presentation video, which captures the whole presentation stage with abundant camera motions such as panning, tilting, and zooming. Our system automatically detects and tracks both the projection screen and the presenter whenever they are visible in the video. By analyzing the image content of the tracked screen region, our system is able to detect slide progressions and extract a high-quality, non-occluded, geometrically-compensated image for each slide, resulting in a list of representative images that reconstruct the main presentation structure. Afterwards, our system recognizes text content and extracts keywords from the slides, which can be used for keyword-based video retrieval and browsing. Experimental results show that our system is able to generate more stable and accurate screen localization results than commonly-used object tracking methods. Our system also extracts more accurate presentation structures than general video summarization methods, for this specific type of video. PMID:26357345

  5. Balance of the uranium market. Contribution to an economic analysis. Vol. 2

    International Nuclear Information System (INIS)

    The second volume of this thesis on the economic analysis of the uranium market contains all the technical descriptions concerning reactors and the fuel cycle, as well as detailed results of the two models of uranium supply and demand

  6. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    Science.gov (United States)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the operation of the adjuster. We establish a structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. In addition, we load this structural mechanics model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10⁵ cycles under braking loads. In the meantime, fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify the conclusions of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10⁵ cycles, which agrees with the results of the finite element analysis using the ANSYS Workbench FEA system.

  7. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to automatically perform maceral and reflectance analyses of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator for the preparation of coal samples and system startup; sample scanning, microscope focusing and field-centre analysis are then fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  8. Nuclear Reactor RA Safety Report, Vol. 15, Analysis of significant accidents

    International Nuclear Information System (INIS)

    To analyze power excursions of the RA reactor, a mathematical model of reactor kinetic behaviour was formulated, describing power and temperature coefficients for both the reactor fuel and the moderator. The computer code TM-1 was written for the analysis of possible reactor accidents. Power excursions caused by uncontrolled control rod removal and by heavy water flow into the central vertical experimental channel were analyzed. Accidents caused by fuel element handling were discussed, including possible fuel element damage. Although the probability of uncontrolled release of radioactive materials into the environment is very low, this type of accident is analyzed as well, including the impact on the personnel and the environment. A separate chapter describes the analysis of the loss-of-flow accident. The safety analysis covers possible damage to the outer steel RA reactor vessel and to the water screens which are part of the water biological shield

  9. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 1, Basic Concepts

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This volume contains the formal definition of CP-nets and the mathematical theory behind their analysis methods. It gives a detailed presentation of many small examples and a brief overview of some industrial applications. The purpose of the book is to teach the reader how to construct CP-net models...

  10. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Full Text Available Purpose. To identify the characteristic features of existing engineering control systems for numeric-code automatic blocking, to identify their advantages and disadvantages, to analyze the possibility of their use for diagnosing the status of automatic blocking devices, and to set targets for the development of new diagnostic systems. Methodology. The theoretical-analytical method and the method of functional analysis were used to achieve these targets. Findings. The analysis of existing and future facilities for the remote control and diagnostics of automatic blocking devices showed that the existing diagnostic systems are not sufficiently informative and are designed primarily to monitor discrete parameters, which in turn does not allow a decision-support subsystem to be built on them. For the development of new technical diagnostic systems, it was proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of work needed to maintain the blocking devices and reduce the recovery time after a failure occurs. Originality. The currently existing engineering control facilities for automatic blocking cannot provide a full assessment of the state of the signalling and blocking devices. Criteria for the development of new technical diagnostic systems, with increased amounts of diagnostic information and its automatic analysis, were proposed. Practical value. The results of the analysis can be used in practice to select the technical controls for automatic blocking devices, as well as for the further development of automatic blocking diagnostic systems that allow a gradual transition from a planned preventive maintenance model to service based on the actual state of the monitored devices.

  11. Qualitative Psychology Nexus, Vol. III: Research Questions and Matching Methods of Analysis

    OpenAIRE

    2003-01-01

    This third volume of the Qualitative Psychology Nexus documents the contributions and discussions at the third international workshop on qualitative psychology, organized by the Center for Qualitative Psychology under the title "Research Questions and Matching Methods of Analysis." The meeting took place in October 2002 in Perlora, Spain. The ongoing efforts of our Spanish colleagues, especially Ramón Pérez Pérez and Olga Pérez, made it possible for this meeting to take place at a wonderful ...

  12. Original compositions using contemporary classical and jazz techniques accompanied by technical analysis Vol. I

    OpenAIRE

    Wijers, Dennis

    2011-01-01

    This portfolio of compositions consists of six major works and is accompanied by a technical analysis of each. Nature, the sky, astronomy, clouds and many fields of science have always fascinated me, and four of the works presented here are heavily influenced by this fascination. The two remaining works are inspired by my interest in computing, computer programming and the possibilities opened up by the use of computers in composition. The first work, titled Jupiter Moons Suite, is a five...

  13. Formal Specification and Automatic Analysis of Business Processes under Authorization Constraints: An Action-Based Approach

    Science.gov (United States)

    Armando, Alessandro; Giunchiglia, Enrico; Ponta, Serena Elisa

    We present an approach to the formal specification and automatic analysis of business processes under authorization constraints based on the action language C. The use of C allows for a natural and concise modeling of the business process and the associated security policy, and for the automatic analysis of the resulting specification by using the Causal Calculator (CCALC). Our approach improves upon previous work by greatly simplifying the specification step while retaining the ability to perform a fully automatic analysis. To illustrate the effectiveness of the approach we describe its application to a version of a business process taken from the banking domain and use CCALC to determine resource allocation plans complying with the security policy.

  14. Balance of the uranium market. Contribution to an economic analysis. Vol. 1

    International Nuclear Information System (INIS)

    The evolution of the world economic and energy situation over the last 40 years is first described, stressing electricity production. The place of nuclear energy in the energy market is analyzed, taking into account socio-political factors. In the second part, the costs of electricity production from different sources (coal, nuclear power, fuel oil) are compared, and two models for world uranium supply and demand are developed. In the third part, the balance of the uranium market is examined. The analysis includes three steps: determination of total uranium demand from 1983 to 2000, determination of total uranium supply by production-cost bracket, and comparison of supply and demand under different hypotheses about future evolution

  15. Statistical language analysis for automatic exfiltration event detection.

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. The approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for the initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
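
    A minimal sketch of the LDA step using scikit-learn, with toy log lines standing in for real network records; the surrounding risk scoring is not reproduced.

```python
# Latent Dirichlet allocation over tokenized "log documents" (toy data).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

logs = [
    "dns query external host payload large",      # made-up log lines
    "http get internal wiki page",
    "dns query external host payload large nightly",
    "ssh login internal admin",
]
X = CountVectorizer().fit_transform(logs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# documents whose topic mixture is far from the norm would be candidate
# exfiltration events under this kind of approach
print(lda.transform(X))
```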

  16. Automatic chemical analysis of traces of boron in steel

    International Nuclear Information System (INIS)

    The analyzer is composed of a sample changer, reagent addition devices, a distillation vessel, a color reaction vessel, a spectrophotometer, a controller, etc. The automatic procedure is performed according to the predetermined distillation and color reaction programs after dissolving 0.5 g of steel sample in aqua regia and fuming with sulfuric acid-phosphoric acid. The sample solution on the sample changer is transferred into the distillation vessel, where boron is distilled with methyl alcohol by heating and aeration. The distillate is collected in the distillate vessel, and a 1/2 aliquot is transferred into the color reaction vessel with small amounts of water. After the addition of glacial acetic acid and propionic anhydride, the distillate is circulated through the circulating pipe which is composed of an air blowing tube, a bubble remover, a flow cell and a drain valve. Oxalyl chloride (to eliminate water), sulfuric acid, the curcumin reagent (to form the boron complex) and an acetate buffer are added, and the absorbance of the solution is measured at 545 nm. The analytical results of steel samples were in good agreement with those obtained by the conventional method and with certified values. (auth.)

  17. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated database, which consists of a plant-data part and a PSA-related data part. Using SAIS, analyses can be performed by special tools which are connected directly to the database. Two main editors, RISA+ and DEDIT, are used for database management. Access to the database is via different types of pages, which are displayed on a computer screen; the pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees or event trees. All input information, models and results needed for updated PSA results (Living PSA) can be stored in SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the database is called up by a standardized user-guide programme, called the Page Editor. (Reference NPP: Brunsbuettel.) (orig./HP)

  18. A locally designed mobile laboratory for radiation analysis and monitoring in qatar. Vol. 4

    International Nuclear Information System (INIS)

    A description is given of a mobile laboratory for radiation analysis and monitoring, completely designed in Qatar and equipped at Qatar University. It consists of a van equipped with three scintillation detectors mounted on the front bumper. The detectors can monitor gamma radiation along the path of the laboratory over an angular range of 120 degrees. One Eberline radiation monitoring station is mounted on the roof. The laboratory is also equipped with several gamma and neutron survey meters in addition to some sampling equipment. All equipment is powered by solar panels; the characteristics and performance of the solar-power/stabilized-AC conversion are given. Data acquisition from the three scintillation detectors is performed by summing the outputs of the three detectors and storing the total as a function of time in a computer-based multi-channel analyzer (MCA) operated in the MCS mode. The acquisition can easily be switched to the PHA mode to analyze gamma spectra from any possible contamination source. The laboratory has been used in several environmental and possible-contamination missions; some results obtained during these missions are given. 4 figs

  19. The application of automatic image analysis in studies of bubble populations produced by radiation damage

    International Nuclear Information System (INIS)

    This paper describes a method by which automatic image analysis can be applied to the study of size distributions of small (< 5 nm) radiation induced bubbles. Statistically valid results are obtained directly from transmission electron microscope negatives and the optimum imaging conditions for obtaining reliable data are discussed. (author)

  20. From motion to faces: 3D-assisted automatic analysis of people

    OpenAIRE

    Iacopo Masi

    2014-01-01

    From motion to faces: 3D-assisted automatic analysis of people. This work proposes new computer vision algorithms for recognizing people by exploiting the face and the imaged appearance of the body. Several computer vision problems are covered: tracking, face recognition and person re-identification.

  1. Full automatic clean-up robot for dioxin/PCB analysis

    Energy Technology Data Exchange (ETDEWEB)

    Matsumura, T.; Masuzaki, Y.; Takahashi, A.; Koizumi, A. [METOCEAN Environment Inc., Shizuoka (Japan). Environmental Risk Research Center, Inst. of General Science for Environment; Okuyama, H.; Kawada, Y.; Higashiguchi, T. [Moritex Corporation, Yokohama (Japan)

    2004-09-15

    Dioxin analysis requires several clean-up steps combining different types of column chromatography (e.g. silica gel column chromatography, carbon column chromatography) with sulfuric acid treatment. A fully automatic clean-up robot for dioxin and PCB analysis was developed.

  2. Automatic activation analysis of fertilizers and plant samples by fast neutrons

    International Nuclear Information System (INIS)

    Instrumental neutron activation methods are suggested for carrying out automatic analysis of fertilizers and plant samples for N, P, K and Si, allowing the investigation of more than 40 samples per 8 hours. The experimental errors do not exceed ±3% for N, ±6% for P, ±5% for K and ±15% for Si. (author)

  3. ATC Operations Analysis via Automatic Recognition of Clearances Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools and data sources for analysis of Air Traffic Control (ATC) operations. The...

  4. Porosity determination on pyrocarbon using automatic quantitative image analysis

    International Nuclear Information System (INIS)

    Methods of porosity determination are reviewed and applied to the measurement of the porosity of pyrocarbon. Specifically, the mathematical basis of stereology and the procedures involved in quantitative image analysis are detailed

  5. ATC Operations Analysis via Automatic Recognition of Clearances Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools for analysis of Air Traffic Control (ATC) operations, such as the Surface...

  6. A new automatic microscope for high-speed nuclear emulsion analysis of the OPERA experiment

    International Nuclear Information System (INIS)

    The large number of emulsion plates to be analysed in the OPERA experiment requires the development of a new generation of automatic microscopes with an order-of-magnitude improvement in speed with respect to analogous past systems. We report on the large R and D effort to realize automatic microscopes with a scanning speed of 20 cm² per hour, describing the progress in the mechanics, the optics and the technology of image acquisition and analysis. We also report on the features and performance of the scanning system (precision, angular and position resolutions, and efficiencies), evaluated by exposing stacks of emulsions to a high-momentum pion beam at CERN

  7. System for automatic x-ray-image analysis, measurement, and sorting of laser fusion targets

    International Nuclear Information System (INIS)

    This paper describes the Automatic X-Ray Image Analysis and Sorting (AXIAS) system which is designed to analyze and measure x-ray images of opaque hollow microspheres used as laser fusion targets. The x-ray images are first recorded on a high resolution film plate. The AXIAS system then digitizes and processes the images to accurately measure the target parameters and defects. The primary goals of the AXIAS system are: to provide extremely accurate and rapid measurements, to engineer a practical system for a routine production environment and to furnish the capability of automatically measuring an array of images for sorting and selection

  8. Automatic target recognition in SAR images using multilinear analysis

    OpenAIRE

    Porgès, Tristan; Favier, Gérard

    2011-01-01

    Multilinear analysis provides a powerful mathematical framework for analyzing synthetic aperture radar (SAR) images resulting from the interaction of multiple factors, such as sky luminosity and viewing angles, while preserving their original shape. In this paper, we propose a multilinear principal component analysis (MPCA) algorithm for target recognition in SAR images. First, we form a high-order tensor with the training image set and we apply the higher-order singular...

  9. Using automatic differentiation in sensitivity analysis of nuclear simulation models.

    Energy Technology Data Exchange (ETDEWEB)

    Alexe, M.; Roderick, O.; Anitescu, M.; Utke, J.; Fanning, T.; Hovland, P.; Virginia Tech.

    2010-01-01

    Sensitivity analysis is an important tool in the study of nuclear systems. In our recent work, we introduced a hybrid method that combines sampling techniques with first-order sensitivity analysis to approximate the effects of uncertainty in parameters of a nuclear reactor simulation model. For elementary examples, the approach offers a substantial advantage (in precision, computational efficiency, or both) over classical methods of uncertainty quantification.
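
    A minimal sketch of first-order sensitivities obtained by automatic differentiation, here with JAX and a made-up scalar surrogate in place of the authors' reactor model.

```python
# First-order sensitivities d(response)/d(params) via autodiff (JAX stands
# in for the AD tooling; the response function is an illustrative surrogate).
import jax
import jax.numpy as jnp

def response(params):
    # hypothetical scalar model output, e.g. a peak-temperature surrogate
    return jnp.sum(jnp.sin(params) * params ** 2)

nominal = jnp.array([1.0, 0.5, 2.0])
sensitivities = jax.grad(response)(nominal)
print(sensitivities)
# Combined with sampled parameter perturbations, such derivatives give the
# hybrid first-order uncertainty estimate described in the abstract.
```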

  10. Automatic "pipeline" analysis of 3-D MRI data for clinical trials: application to multiple sclerosis.

    Science.gov (United States)

    Zijdenbos, Alex P; Forghani, Reza; Evans, Alan C

    2002-10-01

    The quantitative analysis of magnetic resonance imaging (MRI) data has become increasingly important in both research and clinical studies of human brain development, function, and pathology. Inevitably, the role of quantitative image analysis in the evaluation of drug therapy will increase, driven in part by requirements imposed by regulatory agencies. However, the prohibitive length of time involved and the significant intra- and inter-rater variability of the measurements obtained from manual analysis of large MRI databases represent major obstacles to the wider application of quantitative MRI analysis. We have developed a fully automatic "pipeline" image analysis framework and have successfully applied it to a number of large-scale, multicenter studies (more than 1,000 MRI scans). This pipeline system is based on robust image processing algorithms, executed in a parallel, distributed fashion. This paper describes the application of this system to the automatic quantification of multiple sclerosis lesion load in MRI, in the context of a phase III clinical trial. The pipeline results were evaluated through an extensive validation study, revealing that the obtained lesion measurements are statistically indistinguishable from those obtained by trained human observers. Given that intra- and inter-rater measurement variability is eliminated by automatic analysis, this system enhances the ability to detect small treatment effects not readily detectable through conventional analysis techniques. While useful for clinical trial analysis in multiple sclerosis, this system holds widespread potential for applications in other neurological disorders, as well as for the study of neurobiology in general. PMID:12585710

  11. Automatic analysis of JET charge exchange spectra using neural networks

    International Nuclear Information System (INIS)

    The analysis of charge exchange recombination spectra represents a very challenging problem due to the presence of many overlapping spectral lines. Conventional approaches are based on iterative least-squares optimization and suffer from the two difficulties of low speed and the need for a good initial approximation to the solution. This latter problem necessitates considerable human supervision of the analysis procedure. It is shown how neural network techniques allow charge exchange data to be analysed very rapidly, to give an approximate solution without the need for supervision. The network approach is well suited to the fast intershot analysis of large volumes of data, and can readily be implemented in dedicated hardware for real-time applications. The neural network can also be used to provide the initial guess for the standard least-squares algorithm when high accuracy is required. (Author)

  12. Analysis of Phonetic Transcriptions for Danish Automatic Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    recognition system depends heavily on the dictionary and the transcriptions therein. This paper presents an analysis of phonetic/phonemic features that are salient for current Danish ASR systems. This preliminary study consists of a series of experiments using an ASR system trained on the DK-PAROLE corpus....... The analysis indicates that transcribing e.g. stress or vowel duration has a negative impact on performance. The best performance is obtained with coarse phonetic annotation, which improves performance by 1% word error rate and 3.8% sentence error rate....

  13. Vibration Theory, Vol. 1B

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Nielsen, Søren R. K.

    The present collection of MATLAB exercises has been published as a supplement to the textbook, Svingningsteori, Bind 1 and the collection of exercises in Vibration Theory, Vol. 1A, Solved Problems. Throughout the exercises, references are made to these books. The purpose of the MATLAB exercises is to...... give a better understanding of the physical problems in linear vibration theory and to suppress the mathematical analysis used to solve the problems. For this purpose the MATLAB environment is excellent....

  14. Automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, Marcel; Spreeuwers, Luuk; Quist, Marcel

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the

  15. The symbolic computation and automatic analysis of trajectories

    Science.gov (United States)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  16. Automatic Binding Time Analysis for a Typed Lambda-Calculus

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    1988-01-01

    A binding time analysis imposes a distinction between the computations to be performed early (e.g. at compile-time) and those to be performed late (e.g. at run-time). For the lambda-calculus this distinction is formalized by a two-level lambda-calculus. The authors present an algorithm for static...... analysis of the binding times of a typed lambda-calculus with products, sums, lists and general recursive types. Given partial information about the binding times of some of the subexpressions it will complete that information such that (i) early bindings may be turned into late bindings but not vice versa......, (ii) the resulting two-level lambda-expression reflects our intuition about binding times, e.g. that early bindings are performed before late bindings, and (iii) as few changes as possible have been made compared with the initial binding information. The results can be applied in the implementation...

  17. Image analysis in automatic system of pollen recognition

    OpenAIRE

    Piotr Rapiejko; Zbigniew M. Wawrzyniak; Ryszard S. Jachowicz; Dariusz Jurkiewicz

    2012-01-01

    In allergology practice and research, it would be convenient to receive pollen identification and monitoring results in much shorter time than it comes from human identification. Image based analysis is one of the approaches to an automated identification scheme for pollen grain and pattern recognition on such images is widely used as a powerful tool. The goal of such attempt is to provide accurate, fast recognition and classification and counting of pollen grains by computer system for monit...

  18. A framework for automatic heart sound analysis without segmentation

    Directory of Open Access Journals (Sweden)

    Tungpimolrut Kanokvate

    2011-02-01

    Full Text Available Abstract. Background: A new framework for heart sound analysis is proposed. One of the most difficult processes in heart sound analysis is segmentation, due to interference from murmurs. Method: Equal numbers of cardiac cycles were extracted from heart sounds with different heart rates using information from envelopes of autocorrelation functions, without the need to label individual fundamental heart sounds (FHS). The complete method consists of envelope detection, calculation of cardiac cycle lengths using autocorrelation of envelope signals, feature extraction using the discrete wavelet transform, principal component analysis, and classification using neural network bagging predictors. Result: The proposed method was tested on a set of heart sounds obtained from several online databases and recorded with an electronic stethoscope. The geometric mean was used as the performance index. Average classification performance using ten-fold cross-validation was 0.92 for the noise-free case, 0.90 under white noise with 10 dB signal-to-noise ratio (SNR), and 0.90 under impulse noise of up to 0.3 s duration. Conclusion: The proposed method showed promising results and high noise robustness for a wide range of heart sounds. However, more tests are needed to address any bias that may have been introduced by the different sources of heart sounds in the current training set, and to concretely validate the method. Further work includes building a new training set recorded from actual patients, then further evaluating the method on this new set.
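
    The cycle-length step - the part that avoids labelling individual S1/S2 sounds - can be sketched with an amplitude envelope and its autocorrelation. The lag bounds and the Hilbert-based envelope are assumptions for illustration; the paper's exact envelope detector is not reproduced here.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def cardiac_cycle_length(pcg, fs):
        """Estimate the cardiac cycle length (s) of a heart sound signal `pcg`
        sampled at `fs` Hz, without labelling individual heart sounds."""
        envelope = np.abs(hilbert(pcg))                  # amplitude envelope
        envelope = envelope - envelope.mean()
        acf = np.correlate(envelope, envelope, mode="full")[envelope.size - 1:]
        # The first prominent autocorrelation peak after lag 0 approximates one
        # cycle; lags below 0.3 s (HR > 200 bpm) and above 2 s are ignored.
        lo, hi = int(0.3 * fs), int(2.0 * fs)
        return (lo + int(np.argmax(acf[lo:hi]))) / fs
    ```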

  19. Comparison of Two Methods for Automatic Brain Morphometry Analysis

    Directory of Open Access Journals (Sweden)

    D. Schwarz

    2011-12-01

    Full Text Available The methods of computational neuroanatomy are widely used; data on their individual strengths and limitations from direct comparisons are, however, scarce. The aim of the present study was a direct comparison of DBM, based on high-resolution spatial transforms, with the widely used VBM analysis, based on segmented high-resolution images. We performed DBM and VBM analyses on simulated volume changes in a set of 20 3-D MR images, compared to 30 MR images in which only random spatial transforms were introduced. The ability of the two methods to detect regions with the simulated volume changes was determined using an overlay index together with the ground truth regions of the simulations; the precision of the detection in space was determined using distance measures between the centers of detected and simulated regions. DBM was able to detect all the regions with simulated local volume changes with high spatial precision. On the other hand, VBM detected only changes in the vicinity of the largest simulated change, with a poor overlap between the detected changes and the ground truth. Taken together, we suggest that the analysis of high-resolution deformation fields is more convenient, sensitive, and precise than voxel-wise analysis of tissue-segmented images.
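
    The two kinds of evaluation used above - region overlap against ground truth and the distance between region centres - correspond to standard measures that can be written down directly. The Dice-style overlap below is a generic stand-in for the paper's overlay index, whose exact definition is not given here.

    ```python
    import numpy as np

    def overlap_index(detected, truth):
        """Dice overlap between two binary 3-D masks; 1.0 means perfect agreement."""
        detected, truth = detected.astype(bool), truth.astype(bool)
        return 2.0 * np.logical_and(detected, truth).sum() / (
            detected.sum() + truth.sum())

    def centre_distance(detected, truth):
        """Euclidean distance (in voxels) between the centres of mass of two masks."""
        c1 = np.array(np.nonzero(detected)).mean(axis=1)
        c2 = np.array(np.nonzero(truth)).mean(axis=1)
        return float(np.linalg.norm(c1 - c2))
    ```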

  20. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    OpenAIRE

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by the manual data acquisition and analysis process, which are labor intensive and time consuming. Based on our original design in the biochemical reactions, we proposed here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patt...

  1. Automatic ECG Analysis for Preliminary and Detailed Diagnostics Based on Scale-space Representation

    OpenAIRE

    Belous, Natalie; Kobzar, Gleb

    2008-01-01

    A novel approach to automatic ECG analysis based on scale-space signal representation is proposed. The approach uses the curvature scale-space representation to locate the limits and peaks of the main ECG waveforms, and may be used to correct the results of other ECG analysis techniques or independently. Moreover, dynamic matching of ECG CSS representations provides robust preliminary recognition of ECG abnormalities, which has been proven by experimental results.

  2. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    OpenAIRE

    Rhodri Cusack; Alejandro Vicente-Grabovetsky; Daniel J Mitchell; Peelle, Jonathan E.

    2015-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by m...

  3. Image analysis in automatic system of pollen recognition

    Directory of Open Access Journals (Sweden)

    Piotr Rapiejko

    2012-12-01

    Full Text Available In allergology practice and research, it would be convenient to receive pollen identification and monitoring results in much shorter time than human identification provides. Image-based analysis is one of the approaches to an automated identification scheme for pollen grains, and pattern recognition on such images is widely used as a powerful tool. The goal of such an attempt is to provide accurate, fast recognition, classification, and counting of pollen grains by a computer monitoring system. The isolated pollen grains are objects extracted from the microscopic image by a CCD camera and a PC under proper conditions for further analysis. The algorithms are based on knowledge from feature-vector analysis of estimated parameters calculated from grain characteristics, including morphological features, surface features and other applicable estimated characteristics. Segmentation algorithms specially tailored to pollen object characteristics provide exact descriptions of the pollen characteristics (border and internal features) already used by human experts. The specific characteristics and their measures are statistically estimated for each object. Low-level statistics for the estimated local and global measures of the features establish the feature space. Special care should be paid to choosing these features and to constructing the feature space, in order to optimize the number of subspaces for higher recognition rates in low-level classification for type differentiation of pollen grains. The results of estimating the feature-vector parameters in a low-dimensional space for some typical pollen types are presented, as well as some effective and fast recognition results from experiments performed on different pollens. The findings show the benefit of properly chosen estimators of central and invariant moments (M21, NM2, NM3, NM8, NM9) of tailored characteristics for good classification measures (efficiency > 95%), even for low-dimensional classifiers.
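
    The moment descriptors mentioned above (M21, NM2, ... in the paper's own notation) belong to the standard family of central, normalized, and invariant image moments. A generic sketch using scikit-image, with Hu invariants as the rotation/scale/translation-invariant analogue:

    ```python
    from skimage.measure import moments_central, moments_hu, moments_normalized

    def moment_features(grain_mask):
        """Moment descriptors of a segmented pollen grain (binary 2-D mask),
        suitable as one part of a classifier's feature vector."""
        mu = moments_central(grain_mask.astype(float))   # central moments
        nu = moments_normalized(mu)                      # scale-normalized moments
        return moments_hu(nu)                            # 7 invariant (Hu) moments
    ```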

  4. Towards automatic analysis of dynamic radionuclide studies using principal-components factor analysis

    International Nuclear Information System (INIS)

    A method is proposed for automatic analysis of dynamic radionuclide studies using the mathematical technique of principal-components factor analysis. This method is considered as a possible alternative to the conventional manual regions-of-interest method widely used. The method emphasises the importance of introducing a priori information into the analysis about the physiology of at least one of the functional structures in a study. Information is added by using suitable mathematical models to describe the underlying physiological processes. A single physiological factor is extracted, representing the particular dynamic structure of interest. Two spaces, 'study space, S' and 'theory space, T', are defined in forming the concept of an intersection of spaces. A one-dimensional intersection space is computed. An example from a dynamic Tc-99m DTPA kidney study is used to demonstrate the principle inherent in the proposed method. The method requires no correction for blood background activity, which is necessary when processing by the manual method. Careful isolation of the kidney by means of a region of interest is not required. The method is therefore less prone to operator influence and can be automated. (author)
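
    The principal-components step can be sketched as below; the paper's distinctive contribution - intersecting this empirical 'study space' with a model-derived 'theory space' to isolate a single physiological factor - is not reproduced, and the frame matrix here is random stand-in data.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Dynamic study reshaped to a (time frames x pixels) matrix.
    frames = np.random.rand(60, 64 * 64)

    pca = PCA(n_components=3).fit(frames)
    factor_images = pca.components_.reshape(3, 64, 64)   # spatial factor images
    time_curves = pca.transform(frames)                  # associated dynamics

    # The study space S is spanned by the leading components; a theory space T
    # would be spanned by model-generated time-activity curves.
    ```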

  5. Automatic Fatigue Detection of Drivers through Yawning Analysis

    Science.gov (United States)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on video analysis of drivers. The focus of the paper is on how to detect yawning, which is an important cue for determining a driver's fatigue. Initially, the face is located through the Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which lips are searched for through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features, to determine the driver's yawning state. If the yawning state of the driver persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and is thus warned through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, with users of different races and genders.
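
    The first two stages (Viola-Jones face detection, then a crude mouth window from the lower half of the face) can be sketched with OpenCV's stock Haar cascade; the s-FCM lip clustering and yawning logic are not reproduced, and the video filename is hypothetical.

    ```python
    import cv2

    # Stock Viola-Jones frontal-face cascade shipped with OpenCV.
    face_cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

    cap = cv2.VideoCapture("driver.avi")        # hypothetical recording
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
            # The lower half of the detected face serves as the mouth window,
            # which would then be passed to the lip-clustering stage.
            mouth_window = gray[y + h // 2:y + h, x:x + w]
    cap.release()
    ```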

  6. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    In this paper a system is proposed which dynamically performs the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, instead of on a real one, thus allowing a wide range of tests. For each variable under control, a simulation module was identified and implemented on the reactor's on-line computer. In the malfunction identification phase, all modules run separately, processing plant input variables and producing their output variables in real time; continuous comparison of the computed variables with plant variables allows malfunction detection. At this moment the second phase can occur: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)

  7. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    Science.gov (United States)

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks. PMID:20924860

  8. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    Science.gov (United States)

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by the manual data acquisition and analysis process, which are labor intensive and time consuming. Based on our original design in the biochemical reactions, we proposed here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patterns into a microfluidic array. These function patterns provide quantitative information on the characteristic dimensions of the microfluidic array, as well as mark its orientation and origin of coordinates. We used a computer program to perform automatic analysis for a high-throughput antigen/antibody interaction experiment in 10 s, which was more than 500 times faster than conventional manual processing. Our method is broadly applicable to many other microchannel-based immunoassays. PMID:24404030

  9. Automatic localization of cerebral cortical malformations using fractal analysis

    Science.gov (United States)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel-level analysis based on fractal geometry, then define two similarity measures to detect the lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffused malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.

  10. Generalization versus contextualization in automatic evaluation revisited: A meta-analysis of successful and failed replications.

    Science.gov (United States)

    Gawronski, Bertram; Hu, Xiaoqing; Rydell, Robert J; Vervliet, Bram; De Houwer, Jan

    2015-08-01

    To account for disparate findings in the literature on automatic evaluation, Gawronski, Rydell, Vervliet, and De Houwer (2010) proposed a representational theory that specifies the contextual conditions under which automatic evaluations reflect initially acquired attitudinal information or subsequently acquired counterattitudinal information. The theory predicts that automatic evaluations should reflect the valence of expectancy-violating counterattitudinal information only in the context in which this information had been learned. In contrast, automatic evaluations should reflect the valence of initial attitudinal information in any other context, be it the context in which the initial attitudinal information had been acquired (ABA renewal) or a novel context in which the target object had not been encountered before (ABC renewal). The current article presents a meta-analysis of all published and unpublished studies from the authors' research groups regardless of whether they produced the predicted pattern of results. Results revealed average effect sizes of d = 0.249 for ABA renewal (30 studies, N = 3,142) and d = 0.174 for ABC renewal (27 studies, N = 2,930), both of which were significantly different from zero. Effect sizes were moderated by attention to context during learning, order of positive and negative information, context-valence contingencies during learning, and sample country. Although some of the obtained moderator effects are consistent with the representational theory, others require theoretical refinements and future research to gain deeper insights into the mechanisms underlying contextual renewal. PMID:26010481

  11. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique for understanding the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator, which automatically converts a biochemical map into its associated mathematical model, simulates its dynamic behaviors, and analyzes its robustness. To enhance the usability of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with an instruction manual. PMID:24623466

  12. Analysis of outdoor radon progeny concentration measured at the Spanish radioactive aerosol automatic monitoring network

    International Nuclear Information System (INIS)

    An analysis of 10-year radon progeny data, provided by the Spanish automatic radiological surveillance network, in relation to meteorology is presented. Results show great spatial variability depending mainly on the station location and thus, the surrounding radon exhalation rate. Hourly averages show the typical diurnal cycle with an early morning maximum and a minimum at noon, except for one mountain station, which shows an inverse behaviour. Monthly averaged values show lower concentrations during months with higher atmospheric instability.

  13. Automatic diagnosis of pneumoconiosis by texture analysis of chest X-ray images

    International Nuclear Information System (INIS)

    This paper presents a study on the automatic diagnosis of pneumoconiosis by texture analysis of chest X-ray images. A pre-processing method for normalizing the distribution of film density is proposed and its effectiveness is studied experimentally. Dominant causes of deviations in the texture features used for evaluating the profusion of small opacities are eliminated by the pre-processing. New texture features based on the distribution of gradient vectors are also proposed. Experiments evaluating the proposed system are presented.

  14. Technical characterization by image analysis: an automatic method of mineralogical studies

    International Nuclear Information System (INIS)

    The application of a modern, fully automated method of image analysis to the study of grain size distribution, modal assays, degree of liberation and mineralogical associations is discussed. The image analyser is interfaced with a scanning electron microscope and an energy-dispersive X-ray analyser. The image generated by backscattered electrons is analysed automatically, and the system has been used in assessment studies of applied mineralogy as well as in process control in the mining industry. (author)

  15. Probabilistic Analysis of an Automatic Power Factor Controller with variation in Power Factor

    OpenAIRE

    P K Bhatia; ROOSEL JAIN; GULSHAN TANEJA,

    2012-01-01

    In the present study, the probabilistic analysis of an automatic power factor controller (APFC) system working in industry is investigated. The power factor correction of electrical loads and energy losses due to poor power factor are problems common to all industrial companies. Therefore, the study of the APFC unit is of great importance. Initially, the system is operative with controlled power factor. Then it may transit to a state with the power factor not controlled. On the failure of th...

  16. Sleep-monitoring, experiment M133. [electronic recording system for automatic analysis of human sleep patterns

    Science.gov (United States)

    Frost, J. D., Jr.; Salamy, J. G.

    1973-01-01

    The Skylab sleep-monitoring experiment simulated the timelines and environment expected during a 56-day Skylab mission. Two crewmembers utilized the data acquisition and analysis hardware, and their sleep characteristics were studied in an online fashion during a number of all-night recording sessions. Comparison of the results of online automatic analysis with those of postmission visual data analysis was favorable, confirming the feasibility of obtaining reliable objective information concerning sleep characteristics during the Skylab missions. One crewmember exhibited definite changes in certain sleep characteristics (e.g., increased sleep latency, increased time awake during the first third of the night, and decreased total sleep time) during the mission.

  17. Elemental analysis of soil and plant samples at El-Manzala lake neutron activation analysis technique. Vol. 4

    International Nuclear Information System (INIS)

    Soil and plant samples were collected from two locations, Bahr El-Baker and Bahr Kados, at Manzala lake, where high pollution is expected. The samples were specially treated and prepared for investigation by thermal neutron activation analysis (NAA). The irradiation facilities of the first Egyptian research reactor (ET-RR-1) and the hyper-pure germanium (HPGe) detection system were used for the analysis. Among the 34 elements identified, Fe, Co, As, Cd, Te, La, Sm, Rb, Hg, Th, and U are of special significance because of their toxic, deleterious impact on living organisms. This work is part of a research project concerning pollution studies on the river Nile and some lakes of Egypt. The data obtained in the present work stand as a basic reference record for any future follow-up of contamination levels. 1 tab

  18. Automatic Evaluation for E-Learning Using Latent Semantic Analysis: A Use Case

    Directory of Open Access Journals (Sweden)

    Mireia Farrús

    2013-03-01

    Full Text Available Assessment in education allows for obtaining, organizing, and presenting information about how much and how well the student is learning. The current paper analyses and discusses some state-of-the-art assessment systems in education. This work then presents a specific use case developed for the Universitat Oberta de Catalunya, which is an online university. An automatic evaluation tool is proposed that allows students to evaluate themselves at any time and receive instant feedback. This tool is a web-based platform, and it has been designed for engineering subjects (i.e., with math symbols and formulas) in Catalan and Spanish. In particular, the technique used for automatic assessment is latent semantic analysis. Although the experimental framework of the use case is quite challenging, the results are promising.
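
    The core of such a latent semantic analysis grader - a TF-IDF term-document matrix reduced by truncated SVD, with answers scored by cosine similarity to a model answer in the latent space - can be sketched as follows. The texts and component count are illustrative; this is not the UOC system itself.

    ```python
    from sklearn.decomposition import TruncatedSVD
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    model_answer = ["current through a conductor is proportional to the applied voltage"]
    student_answers = ["current grows linearly with the voltage applied",
                       "the capital of France is Paris"]

    docs = model_answer + student_answers
    X = TfidfVectorizer().fit_transform(docs)          # term-document matrix
    Z = TruncatedSVD(n_components=2).fit_transform(X)  # latent semantic space

    # Similarity of each student answer to the model answer in the latent space.
    print(cosine_similarity(Z[:1], Z[1:]))
    ```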

  19. Instruments and scope of possible action of the Federal States for global climate protection. Vol. 1: Analysis. Vol. 2: Baseline data. Vol. 3: Legal opinion; Instrumente und Handlungsmoeglichkeiten der Bundeslaender zum Klimaschutz. Bd. 1: Analyseband. Bd. 2: Materialband. Bd. 3: Rechtsgutachten

    Energy Technology Data Exchange (ETDEWEB)

    Blechschmidt, K.; Herbert, W.; Barzantny, K.; Dittmann, W.; Gehm, C.; Menges, R.; Moehring-Hueser, W.; Wortmann, K.; Herbert, W. [Energiestiftung Schleswig-Holstein, Kiel (Germany); Penschke, A. [Zentrum fuer Rationelle Energieanwendung und Umwelt GmbH, Regensburg (Germany); Groth, A.; Kusche, C.

    2001-07-01

    In the analysis volume (vol. 1), the legal and political frameworks governing the greenhouse gas reduction policy adopted by the Federal states of Germany are explained. The evolution of the policies, goals pursued and points of major emphasis, the orientation for the future and existing impediments to implementation are analysed. Volume 2 contains a comprehensive baseline data library characterising the present situation, as well as examples selected by the Federal states, illustrating policy schemes which proved to be successful. The legal opinion presented in volume 3 explains the distribution of competences for climate policy between the Federal Government and the state governments in accordance with constitutional law, as well as channels of implementing policy schemes via legislation issued by the Federal states, or administrative action in execution of the law, or other administrative action. (orig./CB)

  20. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS), an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests: a comparison test of the mean and a paired t-test.
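
    The data-generation-plus-grading idea can be sketched outside of Matlab and Goodle GMS; the exercise below (a randomized linear calibration line with tolerance-based grading) is a hypothetical stand-in for the system's fusion-code exercises.

    ```python
    import numpy as np

    rng = np.random.default_rng()               # a different data set per student

    def make_calibration_exercise(n=8):
        """Random, constraint-satisfying calibration data: signal = b + m * conc."""
        m = rng.uniform(0.8, 1.5)               # slope constrained to a realistic range
        b = rng.uniform(0.0, 0.2)               # intercept likewise constrained
        conc = np.linspace(1.0, 10.0, n)
        signal = b + m * conc + rng.normal(0.0, 0.02, n)
        return conc, signal, (m, b)

    def grade(answer_m, answer_b, truth, rel_tol=0.05):
        """Automatic evaluation: accept regression answers within a tolerance."""
        m, b = truth
        return abs(answer_m - m) / m < rel_tol and abs(answer_b - b) < 0.05

    conc, signal, truth = make_calibration_exercise()
    student_m, student_b = np.polyfit(conc, signal, 1)   # the student's regression
    print("pass" if grade(student_m, student_b, truth) else "fail")
    ```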

  1. Automatic generation of stop word lists for information retrieval and analysis

    Science.gov (United States)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
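
    The described ratio test can be sketched as follows; the tokenization, the one-token adjacency window, and the threshold are all simplifying assumptions about the patented method.

    ```python
    from collections import Counter

    def stop_word_list(documents, keywords, min_ratio=1.0, max_words=100):
        """Terms that occur adjacent to keywords more often than inside them
        behave like stop words (sketch of the ratio test described above)."""
        keyword_tokens = {t for k in keywords for t in k.lower().split()}
        adjacency, frequency = Counter(), Counter()
        for doc in documents:
            tokens = doc.lower().split()
            for i, tok in enumerate(tokens):
                if tok in keyword_tokens:
                    frequency[tok] += 1
                elif any(n in keyword_tokens
                         for n in tokens[max(0, i - 1):i] + tokens[i + 1:i + 2]):
                    adjacency[tok] += 1
        ratios = {t: adjacency[t] / max(frequency[t], 1) for t in adjacency}
        ranked = sorted(ratios, key=ratios.get, reverse=True)
        # Terms whose ratio falls below the threshold are excluded; the remainder,
        # truncated to max_words, forms the stop word list.
        return [t for t in ranked if ratios[t] >= min_ratio][:max_words]
    ```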

  2. Determination of porosity of pyrocarbon by means of the automatic quantitative image analysis

    International Nuclear Information System (INIS)

    The quantitative image analysis has long been known as a method for quantifying the results of material investigations based on ceramography. The development of automatic image analysers has made it a fast and elegant evaluation procedure. It is used to determine, easily and routinely, the macroporosity and thereby the density of the pyrocarbon coatings of nuclear fuel particles. This report describes the definition of the measuring parameters, the measuring procedure, the mathematical calculations, and first experimental and mathematical results. (orig.)

  3. Application of Neural Network Boolean Factor Analysis Procedure to Automatic Conference Papers Categorization

    Czech Academy of Sciences Publication Activity Database

    Húsek, Dušan; Frolov, A. A.; Polyakov, P.Y.; Řezanková, H.; Snášel, V.

    Lisabon : Instituto Nacional de Estatística, 2008 - (Gomes, M.; Pinto Martins, J.; Silva, J.), s. 3739-3742 ISBN 978-972-673-992-0. [ISI 2007. Session of the International Statistical Institute /56./. Lisboa (PT), 22.08.2007-29.08.2007] R&D Projects: GA AV ČR 1ET100300414 Grant ostatní: RFBR(RU) 05-07-90049 Institutional research plan: CEZ:AV0Z10300504 Keywords : Boolean factor analysis * document classification * automatic concepts search * unsupervised learning * neural network Subject RIV: BB - Applied Statistics, Operational Research

  4. Automatic Lameness Detection in a Milking Robot : Instrumentation, measurement software, algorithms for data analysis and a neural network model

    OpenAIRE

    Pastell, Matti

    2007-01-01

    The aim of this thesis is to develop a fully automatic lameness detection system that operates in a milking robot. The instrumentation, measurement software, algorithms for data analysis and a neural network model for lameness detection were developed. Automatic milking has become a common practice in dairy husbandry, and in the year 2006 about 4000 farms worldwide used over 6000 milking robots. There is a worldwide movement with the objective of fully automating every process from feedi...

  5. Modeling Earthen Dike Stability: Sensitivity Analysis and Automatic Calibration of Diffusivities Based on Live Sensor Data

    CERN Document Server

    Melnikova, N B; Sloot, P M A

    2012-01-01

    The paper describes the concept and implementation details of integrating a finite element module for dike stability analysis, Virtual Dike, into an early warning system for flood protection. The module operates in real-time mode and includes fluid and structural sub-models for simulating porous flow through the dike and for dike stability analysis. Real-time measurements obtained from pore pressure sensors are fed into the simulation module, to be compared with simulated pore pressure dynamics. The module has been implemented for a real-world test case - an earthen levee protecting a sea-port in Groningen, the Netherlands. Sensitivity analysis and calibration of diffusivities have been performed for tidal fluctuations. An algorithm for automatic diffusivity calibration for a heterogeneous dike is proposed and studied. Analytical solutions describing tidal propagation in a one-dimensional saturated aquifer are employed in the algorithm to generate initial estimates of the diffusivities.
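
    For a homogeneous 1-D saturated aquifer governed by dh/dt = D d2h/dx2 with a sinusoidal tidal boundary, the analytic solution is a damped travelling wave, which makes a convenient forward model for diffusivity calibration. The sketch below uses this textbook solution; the paper's heterogeneous, multi-zone calibration algorithm is more involved.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def tidal_head(x, t, amplitude, period, diffusivity):
        """h(x, t) = A * exp(-k x) * sin(2 pi t / T - k x), k = sqrt(pi / (T D)):
        damped tidal wave in a homogeneous 1-D saturated aquifer."""
        k = np.sqrt(np.pi / (period * diffusivity))
        return amplitude * np.exp(-k * x) * np.sin(2.0 * np.pi * t / period - k * x)

    def calibrate_diffusivity(x_sensor, t, measured, amplitude, period):
        """Least-squares fit of a single diffusivity to one pore pressure sensor."""
        loss = lambda D: np.sum(
            (tidal_head(x_sensor, t, amplitude, period, D) - measured) ** 2)
        return minimize_scalar(loss, bounds=(1e-6, 1e2), method="bounded").x
    ```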

  6. Multidirectional channeling analysis of epitaxial CdTe layers using an automatic RBS/channeling system

    Energy Technology Data Exchange (ETDEWEB)

    Wielunski, L.S.; Kenny, M.J. [CSIRO, Lindfield, NSW (Australia). Applied Physics Div.

    1993-12-31

    Rutherford Backscattering Spectrometry (RBS) is an ion beam analysis technique used in many fields. The high depth and mass resolution of RBS make this technique very useful in semiconductor material analysis [1]. The use of ion channeling in combination with RBS creates a powerful technique which can provide information about crystal quality and structure in addition to mass and depth resolution [2]. The presence of crystal defects such as interstitial atoms, dislocations or dislocation loops can be detected and profiled [3,4]. Semiconductor materials such as CdTe, HgTe and Hg{sub x}Cd{sub 1-x}Te generate considerable interest due to applications as infrared detectors in many technological areas. The present paper demonstrates how automatic RBS and multidirectional channeling analysis can be used to evaluate crystal quality and near-surface defects. 6 refs., 1 fig.

  7. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    Science.gov (United States)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

    The aim of this work was to create a computer program, written in LabVIEW, which enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory monitoring impedance cardiography (AICG) device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), sampled at 200 Hz, and the base impedance signal (Z0), sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and the automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option to make manual corrections, which may be necessary to avoid "false positive" detections. This application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.
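
    Once the characteristic points are located, the hemodynamic variables follow from standard impedance cardiography formulas. The paper does not state which stroke volume formula it uses; Kubicek's, shown below with illustrative constants, is a common choice.

    ```python
    def stroke_volume_kubicek(rho, length, z0, dzdt_max, lvet):
        """Kubicek estimate: SV = rho * (L / Z0)^2 * LVET * (dZ/dt)_max, with
        rho in ohm*cm, electrode distance L in cm, Z0 in ohm, dzdt_max in ohm/s,
        and LVET in s; the result is in mL."""
        return rho * (length / z0) ** 2 * lvet * dzdt_max

    sv = stroke_volume_kubicek(rho=135.0, length=30.0, z0=25.0,
                               dzdt_max=1.2, lvet=0.30)      # roughly 70 mL
    co = sv * 72 / 1000.0                                    # L/min at HR = 72 bpm
    ```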

  8. In-air PIXE set-up for automatic analysis of historical document inks

    International Nuclear Information System (INIS)

    Iron gall inks were among the writing materials most widely applied in historical documents of western civilization. Due to the corrosive character of the ink, the documents face the danger of being seriously, and in some cases irreversibly, changed. The elemental composition of the inks is important information for taking adequate conservation action [Project InkCor, http://www.infosrvr.nuk.uni-lj.si/jana/Inkcor/index.htm, and references within]. Here, in-air PIXE analysis offers an indispensable tool due to its sensitivity and almost non-destructive character. An experimental approach developed for the precise and automatic analysis of documents at the Jozef Stefan Institute Tandetron accelerator is presented. The selected documents were mounted, one at a time, on the positioning board, and the chosen ink spots on the sample were irradiated by 1.7 MeV protons. Data acquisition on the selected ink spots is done automatically according to the measuring pattern determined prior to the measurement. The chemical elements identified in the documents ranged from Si to Pb, and among them the significant iron gall ink components, such as Fe, S, K, Cu, Zn, Co, Mn and Ni, were determined with a precision of ±10%. The measurements were done non-destructively and no visible damage was observed on the irradiated documents.

  9. Design of phantoms and software for automatic image analysis applied to digital radiographic equipments

    International Nuclear Information System (INIS)

    In the quality control of radiographic equipment, the quality of the obtained image is very useful for characterizing the physical properties of the radiographic imaging chain. In radiographic technique, the evaluation of the image must guarantee the constancy of its quality so that a suitable diagnosis can be made. The use of digital systems allows automatic analysis of the obtained radiographic images, increasing the objectivity of the image evaluation. In this work we have designed radiographic phantoms for different digital radiographic devices: dental and conventional equipment with computed radiography (phosphor plate) or direct radiography (sensor) technology. Additionally, we have developed software to analyse the image obtained by the radiographic equipment with digital processing techniques such as edge detectors, morphological operators, and statistical tests for the detected combinations. The images have been acquired in DICOM, TIFF and other formats, and they can be analysed with objective parameters such as an image quality index and the contrast-detail curve. The design of these phantoms allows the evaluation of a wide range of operating conditions of voltage, current and time of the digital equipment. Moreover, the automatic software lets the image quality and the functioning of the digital system's imaging chain be studied with objective parameters. (author)

  10. [Design and analysis of automatic measurement instrument for diffraction efficiency of plane reflection grating].

    Science.gov (United States)

    Wang, Fang; Qi, Xiang-Dong; Yu, Hong-Zhu; Yu, Hai-Li

    2009-02-01

    A new-style system that automatically measures the diffraction efficiency of plane reflection gratings was designed. A continuous light source was adopted for illumination, a duplex grating spectrograph structure was applied, and a linear NMOS array was the receiving component. Using the relevant principles of the grating spectrograph, a theoretical analysis of the testing system was carried out. Integrating the aberration theory of geometrical optics, the image quality of the optical system was analyzed. The analysis indicated that the device structure is compact and the electronics system is simplified. The system does not have the problem of wavelength sweep synchronization between the two grating spectrographs, and its wavelength repeatability is very good, so the precision is easy to guarantee. Compared with the former automated scheme, the production cost is reduced; moreover, it is easy to operate and the working efficiency is enhanced. The study showed that this automatic measurement instrument covers a spectral range of 190-1100 nm with a resolution of less than 3 nm, which entirely satisfies the design requirements. It is an economical and feasible design. PMID:19445251

  11. A new automatic image analysis method for assessing estrogen receptors' status in breast tissue specimens.

    Science.gov (United States)

    Mouelhi, Aymen; Sayadi, Mounir; Fnaiech, Farhat; Mrad, Karima; Ben Romdhane, Khaled

    2013-12-01

    Manual assessment of estrogen receptor (ER) status from breast tissue microscopy images is a subjective, time-consuming and error-prone process. Automatic image analysis methods offer the possibility of obtaining consistent, objective and rapid diagnoses of histopathology specimens. In breast cancer biopsies immunohistochemically (IHC) stained for ER, cancer cell nuclei present a large variety in their characteristics that brings various difficulties for traditional image analysis methods. In this paper, we propose a new automatic method to perform both segmentation and classification of breast cell nuclei in order to give a quantitative assessment and uniform indicators of IHC staining that will help pathologists in their diagnosis. Firstly, a color geometric active contour model incorporating a spatial fuzzy clustering algorithm is proposed to detect the contours of all cell nuclei in the image. Secondly, overlapping and touching nuclei are separated using an improved watershed algorithm based on a concave vertex graph. Finally, to identify positively and negatively stained nuclei, all the segmented nuclei are classified into five categories according to their staining intensity and morphological features using a trained multilayer neural network combined with Fisher's linear discriminant preprocessing. The proposed method is tested on a large dataset containing several breast tissue images with different levels of malignancy. The experimental results show high agreement between the results of the method and the ground truth from the pathologist panel. Furthermore, a comparative study versus existing techniques is presented in order to demonstrate the efficiency and superiority of the proposed method. PMID:24290943

  12. Selection of Entropy Based Features for Automatic Analysis of Essential Tremor

    Directory of Open Access Journals (Sweden)

    Karmele López-de-Ipiña

    2016-05-01

    Full Text Available Biomedical systems produce biosignals that arise from interaction mechanisms. In general, those mechanisms occur across multiple scales, both spatial and temporal, and contain linear and non-linear information. In this framework, entropy measures are good candidates for providing useful evidence about disorder in the system, lack of information in time series, and/or irregularity of the signals. The most common movement disorder is essential tremor (ET), which occurs 20 times more often than Parkinson's disease. Interestingly, about 50%-70% of ET cases have a genetic origin. One of the most widely used standard tests for the clinical diagnosis of ET is Archimedes' spiral drawing. This work focuses on the selection of non-linear biomarkers from such drawings and handwriting; it is part of a wider cross study on the diagnosis of essential tremor, in which our piece of research presents the selection of entropy features for early ET diagnosis. Classic entropy features are compared with features based on permutation entropy. An automatic analysis system built on several machine learning paradigms is applied, while automatic feature selection is implemented by means of the ANOVA (analysis of variance) test. The results obtained for early detection are promising and appear applicable to real environments.
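
    Permutation entropy, the key non-linear feature above, has a compact standard definition (Bandt-Pompe): count ordinal patterns of length `order` in the signal and normalize the Shannon entropy of their distribution. A minimal sketch:

    ```python
    import numpy as np
    from math import factorial, log

    def permutation_entropy(x, order=3, delay=1):
        """Normalized permutation entropy of a 1-D signal (0 = regular, 1 = random)."""
        x = np.asarray(x)
        n = len(x) - (order - 1) * delay
        counts = {}
        for i in range(n):
            pattern = tuple(np.argsort(x[i:i + order * delay:delay]))
            counts[pattern] = counts.get(pattern, 0) + 1
        p = np.array(list(counts.values()), dtype=float) / n
        return float(-np.sum(p * np.log(p)) / log(factorial(order)))
    ```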

  13. Analysis and Development of FACE Automatic Apparatus for Rapid Identification of Transuranium Isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Sebesta, E.H.

    1978-09-01

    A description of and operating manual for the FACE Automatic Apparatus have been written, along with documentation of the FACE machine operating program, to provide a user manual for the FACE Automatic Apparatus. In addition, FACE machine performance was investigated to improve transuranium throughput. The causes of transuranium isotope loss, both chemical and radioactive, were analyzed. To lower radioactive loss, the dynamics of the most time-consuming step of the FACE machine - drying and flaming the chromatographic column output droplets in preparation of the sample for alpha spectroscopy and counting - was investigated. A series of droplets was dried in an experimental apparatus, demonstrating that droplets could be dried significantly faster through more intensive heating, enabling the FACE machine cycle to be shortened by 30-60 seconds. Proposals incorporating these ideas were provided for FACE machine development. The 66% chemical loss of product was analyzed and changes were proposed to reduce the radioisotope product loss. An analysis of the chromatographic column was also provided. All operating steps in the FACE machine are described and analyzed to provide a complete guide, along with the proposals for machine improvement.

  14. The FAST-DATA System: Fully Automatic Stochastic Technology for Data Acquisition, Transmission, and Analysis

    International Nuclear Information System (INIS)

    The potential to automatically collect, classify, and report on stochastic data (signals with random, time-varying components) from power plants has long been discussed by utilities, government, industry, national laboratories and universities. It has become clear to all concerned that such signals often contain information about plant conditions which may provide the basis for increased plant availability through early detection and warning of developing malfunctions. Maintenance can then be scheduled at opportune times. Inopportune failures of major and minor power plant components are a major cause of down-time and detract significantly from plant availability. A complete system to realize automatic stochastic data processing has been conceptually designed. Development of the FAST-DATA system has been initiated through a program of periodic measurements performed on the vibration and loose parts monitoring system of the Trojan reactor (1130-MW(e) PWR) operated by Portland General Electric Company. The development plan for the system consists of a six-step procedure. The initial steps depend on a significant level of human involvement. In the course of development of the system, the routine duties of operators and analysts are gradually replaced by computerized automatic data handling procedures. In the final configuration, the operators and analysts are completely freed of routine chores by logical machinery. The results achieved to date from actual application of the proof-of-principle system are discussed. The early developmental phases have concentrated on system organization and examination of a representative data base. Preliminary results from the signature analysis program using Trojan data indicate that the performance specifications predicted for the FAST-DATA system are achievable in practice. (author)

  15. Acoustic Analysis of Inhaler Sounds From Community-Dwelling Asthmatic Patients for Automatic Assessment of Adherence

    Science.gov (United States)

    D'arcy, Shona; Costello, Richard W.

    2014-01-01

    Inhalers are devices which deliver medication to the airways in the treatment of chronic respiratory diseases. When used correctly, inhalers relieve and improve patients' symptoms. However, adherence to inhaler medication has been demonstrated to be poor, leading to reduced clinical outcomes, wasted medication, and higher healthcare costs. There is a clinical need for a system that can accurately monitor inhaler adherence, as currently no method exists to evaluate how patients use their inhalers between clinic visits. This paper presents a method of automatically evaluating inhaler adherence through acoustic analysis of inhaler sounds. An acoustic monitoring device was employed to record the sounds patients produce while using a Diskus dry powder inhaler, in addition to the time and date patients use the inhaler. An algorithm was designed and developed to automatically detect inhaler events from the audio signals and provide feedback regarding patient adherence. The algorithm was evaluated on 407 audio files obtained from 12 community-dwelling asthmatic patients. Results of the automatic classification were compared against two expert human raters. For patient data on which the raters' Cohen's kappa agreement score was >0.81, results indicated that the algorithm's accuracy was 83% in determining the correct inhaler technique score compared with the raters. This paper has several clinical implications, as it demonstrates the feasibility of using acoustics to objectively monitor patient inhaler adherence and provide real-time personalized medical care for a chronic respiratory illness.

  16. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT

    DEFF Research Database (Denmark)

    Murphy, Keelin; Pluim, Josien P. W.; Rikxoort, Eva M. van;

    2012-01-01

    and its results; (b) verify that the quantitative, regional ventilation measurements acquired through CT are meaningful for pulmonary function analysis; (c) identify the most effective of the calculated measurements in predicting pulmonary function; and (d) demonstrate the potential of the system to...... the accuracy of the automatic methods. Quantitative measures representing ventilation are computed at every image voxel and analyzed to provide information about pulmonary function, both globally and on a regional basis. These CT derived measurements are correlated with results from spirometry tests...... alone is not optimal for predicting pulmonary function. It also permits measurement of ventilation on a per lobe basis which reveals, for example, that the condition of the lower lobes contributes most to the pulmonary function of the subject. It is expected that this type of regional analysis will be...

  17. Automatic fuzzy object-based analysis of VHSR images for urban objects extraction

    Science.gov (United States)

    Sebari, Imane; He, Dong-Chen

    2013-05-01

    We present an automatic approach for object extraction from very high spatial resolution (VHSR) satellite images based on Object-Based Image Analysis (OBIA). The proposed solution requires no input data other than the studied image; no input parameters are required. First, an automatic non-parametric cooperative segmentation technique is applied to create object primitives. A fuzzy rule base is then developed based on the human knowledge used for image interpretation. The rules integrate spectral, textural, geometric and contextual object properties. The classes of interest are tree, lawn, bare soil and water for natural classes, and building, road and parking lot for man-made classes. Fuzzy logic is integrated into our approach in order to manage the complexity of the studied subject, to reason with imprecise knowledge, and to give information on the precision and certainty of the extracted objects. The proposed approach was applied to extracts of Ikonos images of the city of Sherbrooke (Canada). An overall extraction accuracy of 80% was observed. The correctness rates obtained for the building, road and parking lot classes are 81%, 75% and 60%, respectively.

  18. Measuring head for the semi-automatic analysis of ore samples and industrial solutions

    International Nuclear Information System (INIS)

    At the request of the French National Centre for the Exploitation of the Seas, the Commissariat a l'energie atomique devised a semi-automatic ship-borne instrument for non-dispersive X-ray fluorescence analysis of submarine nodule samples. The prototype became operational in 1972 on ships of the National Centre, and was later adapted for the more general requirements of industry. The measuring head can take 25 sample-holders at a time. These are fed through automatically, and successive counts are carried out with a preselected number of filters - 18 at most. After the device has been set going, the operator need intervene only to process the aggregate counting results. This ship-borne equipment has made it possible to determine nickel and copper in nodule powders with an absolute error better than 0.1%, cobalt to about 0.15%, and manganese and iron to about 0.4% absolute. Another application was on a semi-industrial pilot installation, for determining cobalt, nickel and copper in dilute ammoniacal solutions. (author)

  19. Automatic computer-aided detection of prostate cancer based on multiparametric magnetic resonance image analysis

    Science.gov (United States)

    Vos, P. C.; Barentsz, J. O.; Karssemeijer, N.; Huisman, H. J.

    2012-03-01

    In this paper, a fully automatic computer-aided detection (CAD) method is proposed for the detection of prostate cancer. The CAD method consists of multiple sequential steps in order to detect locations that are suspicious for prostate cancer. In the initial stage, a voxel classification is performed using a Hessian-based blob detection algorithm at multiple scales on an apparent diffusion coefficient map. Next, a parametric multi-object segmentation method is applied and the resulting segmentation is used as a mask to restrict the candidate detection to the prostate. The remaining candidates are characterized by performing histogram analysis on multiparametric MR images. The resulting feature set is summarized into a malignancy likelihood by a supervised classifier in a two-stage classification approach. The detection performance for prostate cancer was tested on a screening population of 200 consecutive patients and evaluated using the free-response operating characteristic methodology. The results show that the CAD method obtained sensitivities of 0.41, 0.65 and 0.74 at false positive (FP) levels of 1, 3 and 5 per patient, respectively. In conclusion, this study showed that it is feasible to automatically detect prostate cancer at an FP rate lower than systematic biopsy. The CAD method may assist the radiologist in detecting prostate cancer locations and could potentially guide biopsy towards the most aggressive part of the tumour.
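
    The initial voxel-classification step, multiscale blob detection on an apparent diffusion coefficient map, can be sketched with scikit-image's Laplacian-of-Gaussian detector standing in for the Hessian-based detector; the synthetic ADC map and the threshold value are illustrative assumptions.

```python
import numpy as np
from skimage.feature import blob_log

# Synthetic 2D "ADC map" with one dark, blob-like focus (tumours show low ADC).
rng = np.random.default_rng(0)
adc = rng.normal(1.2e-3, 1e-4, size=(128, 128))
yy, xx = np.mgrid[:128, :128]
adc -= 6e-4 * np.exp(-((yy - 64) ** 2 + (xx - 40) ** 2) / (2 * 4.0 ** 2))

# Blob detection expects bright blobs, so invert the map first.
inverted = adc.max() - adc
blobs = blob_log(inverted, min_sigma=2, max_sigma=8, num_sigma=7, threshold=2e-4)

# Each row is (row, col, sigma); sigma * sqrt(2) approximates the blob radius.
for r, c, s in blobs:
    print(f"candidate at ({r:.0f}, {c:.0f}), radius ~ {s * np.sqrt(2):.1f} px")
```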

  20. Semi-automatic system for UV images analysis of historical musical instruments

    Science.gov (United States)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

    The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features that cannot be observed in visible light (e.g. parts that were restored or treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is performed by analyzing the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because the method can also detect points that users fail to recognize as similar due to perceptual illusions. The application has been developed following usability guidelines, and its human-computer interface has been improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
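
    A minimal sketch of the two-step selection using OpenCV, assuming a BGR input image as loaded by cv2.imread; the HSV tolerance values are hypothetical, not the tool's calibrated settings, and hue wrap-around is ignored for brevity.

```python
import cv2
import numpy as np

def select_similar(img_bgr, rect, tol_h=8, tol_s=40, tol_v=40):
    """Highlight all pixels whose HSV color matches the user-selected patch."""
    x, y, w, h = rect                      # user-drawn rectangle (step i)
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    patch = hsv[y:y + h, x:x + w].reshape(-1, 3).astype(np.float32)
    mean = patch.mean(axis=0)              # reference perceived color

    lower = np.clip(mean - [tol_h, tol_s, tol_v], 0, 255).astype(np.uint8)
    upper = np.clip(mean + [tol_h, tol_s, tol_v], 0, 255).astype(np.uint8)
    return cv2.inRange(hsv, lower, upper)  # binary mask of similar areas (step ii)

# Usage: mask = select_similar(cv2.imread("uv_photo.png"), rect=(120, 80, 15, 15))
```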

  1. Hardware and software system for automatic microemulsion assay evaluation by analysis of optical properties

    Science.gov (United States)

    Maeder, Ulf; Schmidts, Thomas; Burg, Jan-Michael; Heverhagen, Johannes T.; Runkel, Frank; Fiebich, Martin

    2010-03-01

    A new hardware device called the Microemulsion Analyzer (MEA), which facilitates the preparation and evaluation of microemulsions, was developed. Microemulsions, consisting of three phases (oil, surfactant and water) and prepared on deep-well plates according to the PDMPD method, can be evaluated automatically by means of their optical properties. The ratio of ingredients needed to form a microemulsion strongly depends on the properties and amounts of the ingredients used. A microemulsion assay is set up on deep-well plates to determine these ratios. The optical properties of the ingredients change from turbid to transparent as soon as a microemulsion is formed. The MEA consists of a frame and an image-processing and analysis algorithm. The frame itself comprises aluminum, an electroluminescent foil (ELF) and a camera. While the frame keeps the well plate at the correct position and angle, the ELF provides constant illumination of the plate from below. The camera provides an image that is processed by the algorithm to automatically evaluate the turbidity in the wells. Using the determined parameters, a phase diagram is created that visualizes the information. This setup can be used to analyze microemulsion assays and to obtain results in a standardized way. In addition, it is possible to perform stability tests of the assay by creating differential stability diagrams after a period of time.
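
    The per-well turbidity evaluation can be sketched as mean back-lit intensity over a regular grid of wells; the 8x12 layout and the transparency threshold are hypothetical, not the MEA's calibration.

```python
import numpy as np

def well_turbidity(gray, rows=8, cols=12):
    """Mean intensity per well on a back-lit plate image.

    With constant illumination from below, turbid wells transmit less light,
    so a low mean intensity indicates turbidity (no microemulsion yet).
    """
    h, w = gray.shape
    cell_h, cell_w = h // rows, w // cols
    scores = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            cell = gray[r * cell_h:(r + 1) * cell_h,
                        c * cell_w:(c + 1) * cell_w]
            scores[r, c] = cell.mean()
    return scores

# Usage: transparent (microemulsion) wells exceed an illustrative threshold:
# phase_map = well_turbidity(image) > 150
```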

  2. Group-wise automatic mesh-based analysis of cortical thickness

    Science.gov (United States)

    Vachet, Clement; Cody Hazlett, Heather; Niethammer, Marc; Oguz, Ipek; Cates, Joshua; Whitaker, Ross; Piven, Joseph; Styner, Martin

    2011-03-01

    The analysis of neuroimaging data from pediatric populations presents several challenges. There are normal variations in brain shape from infancy to adulthood and normal developmental changes related to tissue maturation. Measurement of cortical thickness is one important way to analyze such developmental tissue changes. We developed a novel framework that allows group-wise automatic mesh-based analysis of cortical thickness. Our approach is divided into four main parts. First, an individual pre-processing pipeline is applied to each subject to create genus-zero inflated white matter cortical surfaces with cortical thickness measurements. The second part performs an entropy-based group-wise shape correspondence on these meshes using a particle system, which establishes a trade-off between an even sampling of the cortical surfaces and the similarity of corresponding points across the population, using sulcal depth information and spatial proximity. A novel automatic initial particle sampling is performed using a matched 98-lobe parcellation map prior to a particle-splitting phase. Third, corresponding re-sampled surfaces are computed with interpolated cortical thickness measurements, which are finally analyzed via a statistical vertex-wise analysis module. This framework consists of a pipeline of automated 3D Slicer-compatible modules. It has been tested on a small pediatric dataset and incorporated in an open-source, C++-based high-level module called GAMBIT. GAMBIT's setup allows efficient batch processing, grid computing and quality control. The current research focuses on the use of an average template for correspondence and surface re-sampling, as well as thorough validation of the framework and its application to clinical pediatric studies.

  3. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

    Analyzing data sets collected in experiments or by observations is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. AutoBayes' schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach by symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems

  4. [A new method of automatic analysis of tongue deviation using self-correction].

    Science.gov (United States)

    Zhu, Mingfeng; Du, Jianqiang; Meng, Fan; Zhang, Kang; Ding, Chenghua

    2012-02-01

    This article analyzes an older method for automatic analysis of tongue deviation and introduces a new, self-correcting method that avoids its shortcomings. Current central-axis extraction methods are compared and analyzed, showing that they are not suitable for extracting the central axis of tongue images. The older method uses area symmetry to extract the central axis, which can fail to find the axis at all; to overcome this, we introduce a shape-symmetry analysis method for central-axis extraction. The method corrects the edge of the tongue root automatically, improving the accuracy of central-axis extraction. In addition, a mouth-corner detection method based on analyzing hue variation in tongue images is introduced. In a comparative experiment, the new method was both more accurate and more efficient than the older one. PMID:22404028
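
    The shape-symmetry idea, scoring candidate axes by how well the tongue mask maps onto itself under reflection, can be sketched as a rotation search; this is an illustrative implementation, not the paper's algorithm, and it assumes the mask is roughly centered in the image.

```python
import numpy as np
from scipy.ndimage import rotate

def best_symmetry_axis(mask, angles=np.arange(-30, 31, 1)):
    """Find the rotation that makes a binary mask most left-right symmetric.

    For each candidate angle, rotate the mask, mirror it about the vertical
    midline, and score the overlap (IoU); the best angle defines the axis.
    """
    best = (None, -1.0)
    for a in angles:
        rot = rotate(mask.astype(float), a, reshape=False, order=0) > 0.5
        mirrored = rot[:, ::-1]
        inter = np.logical_and(rot, mirrored).sum()
        union = np.logical_or(rot, mirrored).sum()
        iou = inter / union if union else 0.0
        if iou > best[1]:
            best = (a, iou)
    return best  # (angle in degrees, symmetry score)
```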

  5. Cold Flow Properties of Biodiesel by Automatic and Manual Analysis Methods

    Science.gov (United States)

    Biodiesel from most common feedstocks has inferior cold flow properties compared to conventional diesel fuel. Blends with as little as 10 vol% biodiesel content typically have significantly higher cloud point (CP), pour point (PP) and cold filter plugging point (CFPP) than No. 2 grade diesel fuel (...

  6. Simulation analysis of inadvertent opening of automatic depressurization system for AP1000

    International Nuclear Information System (INIS)

    Based on the structural and operational characteristics of AP1000, the thermal-hydraulic computer code RETAC was developed in FORTRAN. The code was used to analyze an inadvertent opening of the automatic depressurization system (ADS), and the transient characteristics of the main system parameters were obtained, including pressurizer pressure, normalized core thermal power, normalized core flow rate, core average temperature, maximum fuel temperature and MDNBR. The results show that the maximum fuel temperature and MDNBR do not exceed the specified limits and meet the safety criteria under the protection of the pressurizer low-pressure shutdown signal. The computed results were compared with those of LOFTRAN, the thermal-hydraulic analysis code for AP1000 developed by Westinghouse Electric Corporation. The trends show good agreement, demonstrating the applicability and accuracy of the RETAC models and of the ADS critical flow rate calculation. (authors)

  7. Dynamic Analysis of AN Automatic Washing Machine with a Hydraulic Balancer

    Science.gov (United States)

    BAE, S.; LEE, J. M.; KANG, Y. J.; KANG, J. S.; YUN, J. R.

    2002-10-01

    A mathematical model of a hydraulic balancer in steady-state condition was derived from a whirling model of a vertical-axis washing machine, with the aim of implementing a dynamic analysis of an automatic washing machine during spin-drying mode. The centrifugal force acting on the hydraulic balancer depends on the centroidal distance of the fluid in the hydraulic balancer, and the centroidal distance is a function of the eccentricity of the geometric center of the hydraulic balancer from the axis of rotation. The steady-state mathematical model of the hydraulic balancer was validated against experimental measurements of the centrifugal force. Experiments were performed on a washing machine during spin-drying mode, and the results were compared with simulation results. The parameters affecting the vibration of the washing machine were investigated through a parameter study.

  8. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  9. Development of automatic image analysis algorithms for protein localization studies in budding yeast

    Science.gov (United States)

    Logg, Katarina; Kvarnström, Mats; Diez, Alfredo; Bodvard, Kristofer; Käll, Mikael

    2007-02-01

    Microscopy of fluorescently labeled proteins has become a standard technique for live cell imaging. However, it is still a challenge to systematically extract quantitative data from large sets of images in an unbiased fashion, which is particularly important in high-throughput or time-lapse studies. Here we describe the development of a software package aimed at automatic quantification of the abundance and spatio-temporal dynamics of fluorescently tagged proteins in vivo in the budding yeast Saccharomyces cerevisiae, one of the most important model organisms in proteomics. The image analysis methodology is based on first identifying cell contours from bright-field images, and then using this information to measure and statistically analyse protein abundance in specific cellular domains from the corresponding fluorescence images. The applicability of the procedure is exemplified for two nuclear-localized GFP-tagged proteins, Mcm4p and Nrm1p.
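
    The two-stage measurement, segmenting cells in the bright-field channel and then quantifying fluorescence inside each contour, can be sketched with scikit-image; global Otsu thresholding and the dark-cells-on-bright-background convention are simplifying assumptions, not the paper's contour detector.

```python
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label, regionprops

def quantify_fluorescence(bright_field, fluorescence, min_area=50):
    """Per-cell mean fluorescence, using the bright-field image for segmentation."""
    # Cells assumed darker than background in bright field.
    mask = bright_field < threshold_otsu(bright_field)
    labels = label(mask)

    results = []
    for region in regionprops(labels, intensity_image=fluorescence):
        if region.area >= min_area:            # discard debris
            results.append((region.label, region.area, region.mean_intensity))
    return results  # (cell id, area in px, mean GFP intensity)
```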

  10. Automatic Sleep Stages Classification Using EEG Entropy Features and Unsupervised Pattern Analysis Techniques

    Directory of Open Access Journals (Sweden)

    Jose Luis Rodríguez-Sotelo

    2014-12-01

    Sleep is a growing area of research interest in medicine and neuroscience. One major concern is to find correlations between several physiological variables and sleep stages. There is scientific agreement on the characteristics of the five stages of human sleep, based on EEG analysis. Nevertheless, manual stage classification is still the most widely used approach. This work proposes a new automatic sleep classification method based on recently developed unsupervised classification algorithms and on EEG entropy measures. The scheme extracts entropy metrics from EEG records to obtain a feature vector. These features are then optimized in terms of relevance using the Q-α algorithm. Finally, the resulting set of features is entered into a clustering procedure to obtain a final segmentation of the sleep stages. The proposed method reached up to an average of 80% correctly classified stages for each patient separately, while keeping the computational cost low.
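
    A minimal sketch of the pipeline's shape: one entropy feature per 30 s epoch followed by unsupervised grouping. A single Shannon entropy and k-means stand in for the paper's richer entropy set, Q-α feature selection and clustering procedure.

```python
import numpy as np
from scipy.stats import entropy
from sklearn.cluster import KMeans

def epoch_entropy_features(eeg, fs=100, epoch_s=30, bins=32):
    """Shannon entropy of the amplitude distribution for each 30 s epoch."""
    n = fs * epoch_s
    epochs = eeg[:len(eeg) // n * n].reshape(-1, n)
    feats = []
    for ep in epochs:
        hist, _ = np.histogram(ep, bins=bins, density=True)
        feats.append(entropy(hist + 1e-12))   # avoid log(0)
    return np.array(feats).reshape(-1, 1)

# Unsupervised grouping of epochs into 5 candidate sleep stages.
rng = np.random.default_rng(1)
eeg = rng.normal(size=100 * 30 * 120)         # 120 synthetic epochs
stages = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(
    epoch_entropy_features(eeg))
```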

  11. Automatic Ferrite Content Measurement based on Image Analysis and Pattern Classification

    Directory of Open Access Journals (Sweden)

    Hafiz Muhammad Tanveer

    2015-05-01

    The existing manual point-counting technique for ferrite content measurement is a difficult, time-consuming method with limited accuracy, owing to the limits of human perception and to errors induced by points falling on the boundaries of the grid spacing. In this paper, we present a novel algorithm, based on image analysis and pattern classification, to evaluate the volume fraction of ferrite in a microstructure containing ferrite and austenite. The prime focus of the proposed algorithm is to solve the ferrite content measurement problem with an automatic binary classification approach. The key idea behind the new algorithm is the classification of image data into two distinct classes using an optimal threshold-finding method. The main feature of the newly developed algorithm is the automation of ferrite content measurement, which speeds up specimen testing. The results show an improved performance index due to reduced error sources, validated through comparison with the well-known method of Otsu.
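
    A minimal sketch of the optimal-threshold idea using Otsu's method from scikit-image; the convention that ferrite etches darker than austenite is an assumption for illustration.

```python
import numpy as np
from skimage.filters import threshold_otsu

def ferrite_fraction(gray):
    """Area fraction of the darker phase via automatic binary thresholding.

    Otsu's method picks the threshold that best separates the two grey-level
    classes; assuming ferrite appears darker than austenite (an illustrative
    convention), the ferrite fraction is the proportion of below-threshold
    pixels, which estimates the volume fraction by stereology.
    """
    t = threshold_otsu(gray)
    return float((gray < t).mean())

# Usage: fraction = ferrite_fraction(micrograph)  # e.g. 0.43 -> 43% ferrite
```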

  12. Automatic recognition of polychlorinated biphenyls in gas-chromatographic/mass spectrometric analysis

    International Nuclear Information System (INIS)

    A computer code for the automatic recognition of mass spectra of polychlorinated biphenyls (PCBs) has been developed and used as a specific PCB detector in gas-chromatographic/mass spectrometric analysis. The recognition is based on numerical features extracted from the mass spectrum. The code is written in Fortran. The results of a classification are so-called classification chromatograms for the particular groups of PCBs of equal chlorine number. The practical application has been tested on water and waste-oil samples with PCBs added. The sensitivity is 0.5-1 ng for separate PCB components and 5-20 ng for technical PCB mixtures. 59 refs., 50 figs., 5 tabs. (qui)

  13. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers.

    Science.gov (United States)

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F; Glasser, Matthew F; Griffanti, Ludovica; Smith, Stephen M

    2014-04-15

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has proven to be a powerful technique for identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject "at rest"). ICA decomposes fMRI data into patterns of activity (a set of spatial maps and their corresponding time series) that are statistically independent and add linearly to explain voxel-wise time series. Given the set of ICA components, if the components representing "signal" (brain activity) can be distinguished from the "noise" components (effects of motion, non-neuronal physiology, scanner artefacts and other nuisance sources), the latter can then be removed from the data, providing an effective cleanup of structured noise. Manual classification of components is labour-intensive and requires expertise; hence, a fully automatic noise detection algorithm that can reliably detect various types of noise sources (in both task and resting fMRI) is desirable. In this paper, we introduce FIX ("FMRIB's ICA-based X-noiseifier"), which provides an automatic solution for denoising fMRI data via accurate classification of ICA components. For each ICA component, FIX generates a large number of distinct spatial and temporal features, each describing a different aspect of the data (e.g., what proportion of temporal fluctuations are at high frequencies). The set of features is then fed into a multi-level classifier (built around several different classifiers). Once trained through the hand-classification of a sufficient number of training datasets, the classifier can then automatically classify new datasets. The noise components can then be subtracted from (or regressed out of) the original
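
    The final cleanup step, removing the noise components' time courses from the data, can be sketched as a voxel-wise regression; this is a generic "regress out" sketch, not FIX's exact soft/aggressive variants.

```python
import numpy as np

def regress_out(data, mixing, noise_idx):
    """Remove ICA noise components from fMRI data.

    data:      (timepoints, voxels) matrix
    mixing:    (timepoints, components) ICA component time courses
    noise_idx: indices of components classified as noise
    """
    noise_ts = mixing[:, noise_idx]
    # Least-squares fit of each voxel's time series on the noise time courses,
    beta, *_ = np.linalg.lstsq(noise_ts, data, rcond=None)
    # then subtraction of the fitted noise contribution.
    return data - noise_ts @ beta
```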

  14. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    Science.gov (United States)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

    Corneal ulcers are a common eye disease that requires prompt treatment. Recently, a number of treatment approaches have been introduced and proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual, and hence time consuming and prone to human error. In this research we propose an automatic image-analysis-based approach to measure the size of an ulcer, and its subsequent investigation to determine the effectiveness of the treatment process followed. In ophthalmology, an ulcer area is detected for further inspection via luminous excitation of a dye. In the imaging systems typically utilised for this purpose (i.e. a slit lamp with an appropriate dye), the ulcer area is excited to appear luminous green, as compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially, a pre-processing stage that carries out local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly, we deal with the removal of potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly, the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection, and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise, and image quality degradation. The ratio of the ulcer area confined within the corneal area to the corneal area itself is used as a measure for comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of this measure over time.
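
    The comparison measure, fluorescent ulcer area over corneal area, can be sketched as an HSV mask ratio; the hue band for "luminous green" and the pre-computed corneal mask are assumptions.

```python
import cv2
import numpy as np

def ulcer_ratio(img_bgr, cornea_mask):
    """Fraction of the cornea covered by the dye-stained (green) ulcer.

    cornea_mask: uint8 binary mask of the corneal area (e.g. from the
    ellipse-fitting step described above), same size as the image.
    """
    hsv = cv2.cvtColor(img_bgr, cv2.COLOR_BGR2HSV)
    # OpenCV hue runs 0-179; roughly 40-85 covers green (assumed band).
    green = cv2.inRange(hsv, (40, 60, 60), (85, 255, 255))
    ulcer = cv2.bitwise_and(green, cornea_mask)
    return ulcer.sum() / max(cornea_mask.sum(), 1)
```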

  15. Control-oriented Automatic System for Transport Analysis (ASTRA)-Matlab integration for Tokamaks

    International Nuclear Information System (INIS)

    The exponential growth in energy consumption has led to a renewed interest in the development of alternatives to fossil fuels. Among the unconventional resources that may help to meet this energy demand, nuclear fusion has arisen as a promising source, giving way to unprecedented interest in solving the control problems of nuclear fusion reactors such as Tokamaks. The aim of this manuscript is to show how one of the most popular codes used to simulate the performance of Tokamaks, the Automatic System For Transport Analysis (ASTRA) code, can be integrated into the Matlab-Simulink tool in order to ease the development of suitable controllers for Tokamaks. As a demonstrative case study to show the feasibility and the merits of the proposed ASTRA-Matlab integration, a modified anti-windup Proportional Integral Derivative (PID)-based controller for the loop voltage of a Tokamak has been implemented. The integration achieved represents an original and innovative work in the Tokamak control area, and it provides new possibilities for the development and application of advanced control schemes with the standardized and widely used ASTRA transport code for Tokamaks. -- Highlights: → The paper presents a useful tool for rapid prototyping of different solutions to deal with the control problems arising in Tokamaks. → The proposed tool embeds the standardized Automatic System For Transport Analysis (ASTRA) code for Tokamaks within the well-known Matlab-Simulink software. → This allows testing and combining diverse control schemes in a unified way, considering ASTRA as the plant of the system. → A demonstrative Proportional Integral Derivative (PID)-based case study is provided to show the feasibility and capabilities of the proposed integration.
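
    A minimal sketch of an anti-windup PID loop of the kind the case study describes, in Python rather than Simulink; the toy first-order plant, the gains and the back-calculation coefficient are illustrative, not the paper's Tokamak model.

```python
import numpy as np

def antiwindup_pid(setpoint, y0=0.0, kp=2.0, ki=1.0, kd=0.05,
                   kb=0.5, u_min=-1.0, u_max=1.0, dt=0.01, steps=1000):
    """PID with back-calculation anti-windup driving a toy first-order plant."""
    y, integ, prev_e = y0, 0.0, 0.0
    for _ in range(steps):
        e = setpoint - y
        deriv = (e - prev_e) / dt
        u_raw = kp * e + ki * integ + kd * deriv
        u = np.clip(u_raw, u_min, u_max)            # actuator saturation
        # Back-calculation: bleed the integrator while the actuator saturates,
        # preventing integral windup during clipping.
        integ += (e + kb * (u - u_raw)) * dt
        prev_e = e
        y += (-y + u) * dt                          # toy plant: dy/dt = -y + u
    return y

print(f"steady-state output ~ {antiwindup_pid(0.8):.3f}")
```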

  16. A system for automatic recording and analysis of motor activity in rats.

    Science.gov (United States)

    Heredia-López, Francisco J; May-Tuyub, Rossana M; Bata-García, José L; Góngora-Alfaro, José L; Alvarez-Cervera, Fernando J

    2013-03-01

    We describe the design and evaluation of an electronic system for the automatic recording of motor activity in rats. The device continually locates the position of a rat inside a transparent acrylic cube (50 cm/side) with infrared sensors arranged on its walls so as to correspond to the x-, y-, and z-axes. The system is governed by two microcontrollers. The raw data are saved in a text file within a secure digital memory card, and offline analyses are performed with a library of programs that automatically compute several parameters based on the sequence of coordinates and the time of occurrence of each movement. Four analyses can be made at specified time intervals: traveled distance (cm), movement speed (cm/s), time spent in vertical exploration (s), and thigmotaxis (%). In addition, three analyses are made for the total duration of the experiment: time spent at each x-y coordinate pair (min), time spent on vertical exploration at each x-y coordinate pair (s), and frequency distribution of vertical exploration episodes of distinct durations. User profiles of frequently analyzed parameters may be created and saved for future experimental analyses, thus obtaining a full set of analyses for a group of rats in a short time. The performance of the developed system was assessed by recording the spontaneous motor activity of six rats, while their behaviors were simultaneously videotaped for manual analysis by two trained observers. A high and significant correlation was found between the values measured by the electronic system and by the observers. PMID:22707401
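
    The offline metrics can be sketched directly from the recorded coordinate sequence; the thigmotaxis definition used here (fraction of samples within a margin of a wall) is an assumed convention, not necessarily the paper's.

```python
import numpy as np

def trajectory_metrics(xy, t, side=50.0, wall_margin=5.0):
    """Distance, mean speed and thigmotaxis from a sequence of positions.

    xy: (n, 2) array of x-y coordinates in cm; t: (n,) timestamps in s.
    """
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    distance = steps.sum()                          # traveled distance (cm)
    speed = distance / (t[-1] - t[0])               # mean speed (cm/s)
    near_wall = ((xy < wall_margin) | (xy > side - wall_margin)).any(axis=1)
    thigmotaxis = 100.0 * near_wall.mean()          # % of samples near a wall
    return distance, speed, thigmotaxis
```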

  17. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    Science.gov (United States)

    Sharifi, Hamid; Larouche, Daniel

    2015-09-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during casting. Predicting the evolution of these stresses with accuracy in the solidification interval should be highly helpful for avoiding the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh generation technique for the heterogeneous semi-solid material, suitable for finite element analysis at the microscopic level. This is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected onto the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium-copper alloy (Al-5.8 wt% Cu) at a solid fraction of 0.92. Using the finite element method and the Mie-Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges that form during tensile loading have been detected.

  18. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance while watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex: scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners, and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire up study participants. In addition, the results of such analyses are intuitive and easy to interpret, even for laypeople. These advantages led GfK to adopt facial analysis and to develop a tool for measuring emotional responses to marketing stimuli that is easily applicable in marketing research practice.

  19. Fractal analysis of elastographic images for automatic detection of diffuse diseases of salivary glands: preliminary results.

    Science.gov (United States)

    Badea, Alexandru Florin; Lupsor Platon, Monica; Crisan, Maria; Cattani, Carlo; Badea, Iulia; Pierro, Gaetano; Sannino, Gianpaolo; Baciut, Grigore

    2013-01-01

    The geometry of medical images of tissues obtained by elastography and ultrasonography can be characterized in terms of complexity parameters such as the fractal dimension (FD). It is well known that any image contains very subtle details that are not easily detectable by the human eye. In many cases, however, such as medical imaging diagnosis, these details are very important, since they might carry hidden information about the possible existence of pathological lesions like tissue degeneration, inflammation, or tumors. An automatic method of analysis could therefore be an expedient tool helping physicians reach a more reliable diagnosis. Fractal analysis is of great importance for the quantitative evaluation of "real-time" elastography, a procedure considered operator-dependent in current clinical practice. Mathematical analysis reveals significant discrepancies between normal and pathological image patterns. The main objective of our work is to demonstrate the clinical utility of this procedure on an ultrasound image corresponding to a submandibular diffuse pathology. PMID:23762183
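
    A minimal sketch of one standard fractal dimension estimator, box counting on a binarized image; the paper does not necessarily use this exact estimator, and a grayscale elastography image would first need to be binarized.

```python
import numpy as np

def box_counting_dimension(mask):
    """Estimate the fractal dimension of a binary image by box counting.

    Counts occupied boxes N(s) at dyadic box sizes s and fits
    log N(s) = -FD * log s + c by least squares.
    """
    n = 2 ** int(np.log2(min(mask.shape)))   # crop to a power-of-two square
    mask = mask[:n, :n]
    sizes = 2 ** np.arange(1, int(np.log2(n)))
    counts = []
    for s in sizes:
        boxes = mask.reshape(n // s, s, n // s, s).any(axis=(1, 3))
        counts.append(boxes.sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

# Sanity check: a filled square should give FD close to 2.
print(box_counting_dimension(np.ones((256, 256), dtype=bool)))
```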

  20. Automatic Behavior Analysis During a Clinical Interview with a Virtual Human.

    Science.gov (United States)

    Rizzo, Albert; Lucas, Gale; Gratch, Jonathan; Stratou, Giota; Morency, Louis-Philippe; Chavez, Kenneth; Shilling, Russ; Scherer, Stefan

    2016-01-01

    SimSensei is a Virtual Human (VH) interviewing platform that uses off-the-shelf sensors (i.e., webcams, Microsoft Kinect and a microphone) to capture and interpret real-time audiovisual behavioral signals from users interacting with the VH system. The system was specifically designed for clinical interviewing and health care support, providing a face-to-face interaction between a user and a VH that can automatically react to the inferred state of the user through analysis of behavioral signals gleaned from the user's facial expressions, body gestures and vocal parameters. Akin to how non-verbal behavioral signals have an impact on human-to-human interaction and communication, SimSensei aims to capture and infer user state from signals generated by user non-verbal communication, to improve engagement between a VH and a user, and to quantify user state from the data captured across a 20-minute interview. Results from a sample of service members (SMs) who were interviewed before and after a deployment to Afghanistan indicate that SMs reveal more PTSD symptoms to the VH than they report on the Post Deployment Health Assessment. Pre/post-deployment facial expression analysis indicated more sad expressions and fewer happy expressions at post-deployment. PMID:27046598

  1. Automatic registration of multi-modal microscopy images for integrative analysis of prostate tissue sections

    International Nuclear Information System (INIS)

    Prostate cancer is one of the leading causes of cancer-related deaths. For diagnosis, predicting the outcome of the disease, and for assessing potential new biomarkers, pathologists and researchers routinely analyze histological samples. Morphological and molecular information may be integrated by aligning microscopic histological images in a multiplex fashion. This process is usually time-consuming and results in intra- and inter-user variability. The aim of this study is to investigate the feasibility of using modern image analysis methods for automated alignment of microscopic images from differently stained adjacent paraffin sections of prostatic tissue specimens. Tissue samples, obtained from biopsy or radical prostatectomy, were sectioned and stained with either hematoxylin & eosin (H&E), immunohistochemistry for p63 and AMACR, or Time-Resolved Fluorescence (TRF) for the androgen receptor (AR). Image pairs were aligned allowing for translation, rotation and scaling. The registration was performed automatically by first detecting landmarks in both images using the scale-invariant feature transform (SIFT), followed by the well-known RANSAC protocol for finding point correspondences, and finally aligned by a Procrustes fit. The registration results were evaluated using both visual and quantitative criteria as defined in the text. Three experiments were carried out. First, images of consecutive tissue sections stained with H&E and p63/AMACR were successfully aligned in 85 of 88 cases (96.6%). The failures occurred in 3 out of 13 cores with highly aggressive cancer (Gleason score ≥ 8). Second, TRF and H&E image pairs were aligned correctly in 103 out of 106 cases (97%). The third experiment considered the alignment of image pairs with the same staining (H&E) coming from a stack of 4 sections. The success rate for alignment dropped from 93.8% in adjacent sections to 22% for the sections furthest away. The proposed method is both reliable and fast and therefore well suited
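
    The landmark/RANSAC/Procrustes pipeline can be sketched with OpenCV, using a RANSAC-estimated similarity transform (translation, rotation, scale) in place of a separate Procrustes step; grayscale inputs and the 0.75 ratio-test threshold are assumptions.

```python
import cv2
import numpy as np

def register_pair(img_a, img_b):
    """Align img_b to img_a: SIFT landmarks, RANSAC correspondences, and a
    similarity (translation + rotation + scale) fit."""
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(img_a, None)
    kp_b, des_b = sift.detectAndCompute(img_b, None)

    # Ratio-test matching of SIFT descriptors.
    matcher = cv2.BFMatcher()
    good = [m for m, n in matcher.knnMatch(des_b, des_a, k=2)
            if m.distance < 0.75 * n.distance]

    src = np.float32([kp_b[m.queryIdx].pt for m in good])
    dst = np.float32([kp_a[m.trainIdx].pt for m in good])
    # RANSAC discards outlier correspondences while estimating the transform.
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    return cv2.warpAffine(img_b, M, img_a.shape[:2][::-1])
```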

  2. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  3. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address. PMID:25642185
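
    A toy version of the processing engine's behavior, tracking dependencies and re-running only what has not been completed; the module names and the in-memory done-set are illustrative, not aa's actual bookkeeping mechanism.

```python
from collections import defaultdict

class Engine:
    """Minimal pipeline engine: run a module only if it has not been
    completed, after recursively satisfying its upstream dependencies."""

    def __init__(self):
        self.deps = defaultdict(list)   # module -> upstream modules
        self.done = set()
        self.funcs = {}

    def add(self, name, func, after=()):
        self.funcs[name] = func
        self.deps[name] = list(after)

    def run(self, name):
        for up in self.deps[name]:      # satisfy upstream first
            self.run(up)
        if name not in self.done:       # skip already-completed work
            self.funcs[name]()
            self.done.add(name)

eng = Engine()
eng.add("preprocess", lambda: print("preprocess"))
eng.add("first_level", lambda: print("first-level stats"), after=["preprocess"])
eng.add("group_level", lambda: print("group stats"), after=["first_level"])
eng.run("group_level")   # runs all three modules, in order, exactly once
```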

  4. Numerical analysis of resonances induced by s wave neutrons in transmission time-of-flight experiments with an IBM 7094 II computer

    Energy Technology Data Exchange (ETDEWEB)

    Corge, Ch. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-01-01

    Numerical analysis of transmission resonances induced by s wave neutrons in time-of-flight experiments can be achieved in a fairly automatic way on an IBM 7094/II computer. The computations involved are carried out following a four-step scheme: (1) experimental raw data are processed to obtain the resonant transmissions; (2) values of the experimental quantities for each resonance are derived from the above transmissions; (3) resonance parameters are determined using a least-squares method to solve the overdetermined system obtained by equating theoretical functions to the corresponding experimental values (four analysis methods are gathered in the same code); (4) graphical control of the results is performed. (author)
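
    Step (3), the least-squares determination of resonance parameters from an overdetermined system, can be sketched with a single-resonance Lorentzian transmission dip; the model, noise level and starting values are illustrative, not the report's formalism.

```python
import numpy as np
from scipy.optimize import curve_fit

def transmission(E, E0, gamma, depth):
    """Toy single-resonance transmission: a Lorentzian dip at energy E0."""
    return 1.0 - depth * (gamma / 2) ** 2 / ((E - E0) ** 2 + (gamma / 2) ** 2)

# Synthetic "experimental" transmission data around a resonance.
rng = np.random.default_rng(2)
E = np.linspace(0.0, 10.0, 200)
data = transmission(E, 4.8, 0.6, 0.7) + rng.normal(0, 0.01, E.size)

# Least squares solves the overdetermined system (200 points, 3 parameters).
popt, pcov = curve_fit(transmission, E, data, p0=[5.0, 1.0, 0.5])
print("E0, Gamma, depth =", np.round(popt, 3))
```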

  5. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    Science.gov (United States)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% was suitable grazing land.

  6. The development of an automatic sample-changer and control instrumentation for isotope-source neutron-activation analysis

    International Nuclear Information System (INIS)

    An automatic sample-changer was developed at the Council for Mineral Technology for use in isotope-source neutron-activation analysis. Tests show that the sample-changer can transfer a sample of up to 3 kg in mass over a distance of 3 m within 5 s. In addition, instrumentation in the form of a three-stage sequential timer was developed to control the sequence of irradiation transfer and analysis

  7. The IDA-80 measurement evaluation programme on mass spectrometric isotope dilution analysis of uranium and plutonium. Vol. 1

    International Nuclear Information System (INIS)

    The main objective was the acquisition of basic data on the uncertainties involved in mass spectrometric isotope dilution analysis as applied to the determination of uranium and plutonium in active feed solutions of reprocessing plants. The element concentrations and isotopic compositions of all test materials used were determined by CBNM and NBS with high accuracy. The more than 60000 analytical data reported by the participating laboratories were evaluated by statistical methods, applied mainly to estimating the variances of the different uncertainty components contributing to the total uncertainty of this analytical technique. Attention was given to such topics as sample ageing, influence of fission products, spike calibration, ion fractionation, Pu-241 decay correction, minor isotope measurement and errors in data transfer. Furthermore, the performance of the 'dried sample' technique and of the 'in-situ' method of spiking undiluted samples of reprocessing fuel solution with U-235/Pu-242 metal alloy spikes were tested successfully. Considerable improvement of isotope dilution analysis in this safeguards-relevant application during the last decade is shown, as compared to the results obtained in the IDA-72 interlaboratory experiment organized by KfK in 1972 on the same subject. (orig./HP)

  8. Automatic identification of mobile and rigid substructures in molecular dynamics simulations and fractional structural fluctuation analysis.

    Directory of Open Access Journals (Sweden)

    Leandro Martínez

    The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, possibly resulting in poor quantification of the structural fluctuations and, often, in overlooking important fluctuations associated with biological function. The motivation of this work is to provide a robust measure of structural mobility that is practical and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of the structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments is named MDLovoFit and is available as free software at: http://leandro.iqm.unicamp.br/mdlovofit.
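
    A minimal sketch of the LOVO-style iteration: repeatedly superpose (via a Kabsch fit) on the fraction of atoms with the smallest displacements; the 70% fraction and the fixed iteration count are illustrative choices.

```python
import numpy as np

def kabsch(P, Q):
    """Optimal rotation R aligning centered P onto centered Q (rows are atoms)."""
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(U @ Vt))          # avoid an improper reflection
    return U @ np.diag([1.0, 1.0, d]) @ Vt

def lovo_align(mobile, ref, frac=0.7, iters=10):
    """Iteratively superpose the least-displaced fraction of atoms.

    Each round aligns on the `frac` of atoms with the smallest deviations,
    so highly mobile regions stop dominating the fit.
    """
    core = np.arange(len(ref))                  # start with all atoms
    for _ in range(iters):
        P = mobile[core] - mobile[core].mean(axis=0)
        Q = ref[core] - ref[core].mean(axis=0)
        R = kabsch(P, Q)
        mobile = (mobile - mobile[core].mean(axis=0)) @ R + ref[core].mean(axis=0)
        dev = np.linalg.norm(mobile - ref, axis=1)
        core = np.argsort(dev)[: int(frac * len(ref))]   # keep least mobile
    return mobile, core   # aligned coordinates and least-mobile atom indices
```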

  9. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    Science.gov (United States)

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system were incorporated in the analysis along with road geometric information: processed data, capped at the speed limit, and unprocessed data retaining the original speeds. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random-parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data were superior. Both the multi-level models and the random-parameters models outperformed the Negative Binomial model, and the models with random parameters achieved the best model fit. The contributing factors identified imply that, on the urban expressway, lower speed and higher speed variation could significantly increase the crash likelihood. Other significant geometric factors included auxiliary lanes and horizontal curvature. PMID:26722989
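
    The Negative Binomial baseline that the multi-level models are compared against can be sketched with statsmodels; the synthetic segment data and the dispersion parameter are illustrative.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic segment-level data: speed (mph), speed variation, crash counts.
rng = np.random.default_rng(3)
n = 200
speed = rng.normal(55, 8, n)
speed_sd = rng.gamma(2.0, 2.0, n)
mu = np.exp(1.0 - 0.03 * speed + 0.12 * speed_sd)   # lower speed, higher
crashes = rng.poisson(mu * rng.gamma(2.0, 0.5, n))  # variation -> more crashes

# Negative Binomial GLM: the baseline model in the comparison above.
X = sm.add_constant(np.column_stack([speed, speed_sd]))
nb = sm.GLM(crashes, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(nb.params)   # intercept, speed, speed-variation coefficients
```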

  10. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies of the visual behavior of these patients may be an important tool for determining the best type of IOL implantation. This study proposes an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient's visual routine in order to obtain additional information about the frequency of distant, intermediate, and near viewing. Results: The results provided estimated frequency percentages, suggesting that visual analysis of the routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing a visual management strategy after cataract surgery, while also stimulating interest in customized IOL manufacturing according to individual needs.

  11. Approaches to automatic parameter fitting in a microscopy image segmentation pipeline: An exploratory parameter space analysis

    Directory of Open Access Journals (Sweden)

    Christian Held

    2013-01-01

    Introduction: Research and diagnosis in medicine and biology often require the assessment of large amounts of microscopy image data. Although digital pathology and new bioimaging technologies are finding their way into clinical practice and pharmaceutical research, some general methodological issues in automated image analysis are still open. Methods: In this study, we address the problem of fitting the parameters of a microscopy image segmentation pipeline. We propose to fit the parameters of the pipeline's modules with optimization algorithms, such as genetic algorithms or coordinate descent, and show how visual exploration of the parameter space can help to identify sub-optimal parameter settings that need to be avoided. Results: This is of significant help in the design of our automatic parameter fitting framework, which enables us to tune the pipeline for large sets of micrographs. Conclusion: The underlying parameter spaces pose a challenge for manual as well as automated parameter optimization, as they can exhibit several local performance maxima. Hence, optimization strategies that cannot escape local performance maxima, like the hill climbing algorithm, often end in a local maximum.

  12. Risk analysis of Leksell Gamma Knife Model C with automatic positioning system

    International Nuclear Information System (INIS)

    Purpose: This study was conducted to evaluate the decrease in risk from misadministration with the new Leksell Gamma Knife Model C with Automatic Positioning System compared with previous models. Methods and Materials: Elekta Instruments, A.B. of Stockholm has introduced a new computer-controlled Leksell Gamma Knife Model C which uses motor-driven trunnions to reposition the patient between isocenters (shots) without human intervention. Previous models required the operators to manually set coordinates from a printed list, creating opportunities for coordinate transposition, incorrect helmet size, incorrect treatment times, missing shots, or repeated shots. Results: A risk analysis was conducted comparing craniotomy, which involves hospital admission, with outpatient Gamma Knife radiosurgery. A report of the Institute of Medicine of the National Academies dated November 29, 1999 estimated that medical errors kill between 44,000 and 98,000 people each year in the United States. Another report, from the National Nosocomial Infections Surveillance System, estimates that 2.1 million nosocomial infections occur annually in the United States in acute care hospitals alone, against 31 million total admissions. Conclusions: All medical procedures have attendant risks of morbidity and possibly mortality. Each patient should be counseled as to the risk of adverse effects as well as the likelihood of good results for alternative treatment strategies. This paper seeks to fill a gap in the existing medical literature, which has a paucity of data on risk estimates for stereotactic radiosurgery

  13. Automatic Sleep Stage Scoring Using Time-Frequency Analysis and Stacked Sparse Autoencoders.

    Science.gov (United States)

    Tsinalis, Orestis; Matthews, Paul M; Guo, Yike

    2016-05-01

    We developed a machine learning methodology for automatic sleep stage scoring. Our time-frequency analysis-based feature extraction is fine-tuned to capture sleep stage-specific signal features, as described in the American Academy of Sleep Medicine manual that human experts follow. We used ensemble learning with an ensemble of stacked sparse autoencoders for classifying the sleep stages. We used class-balanced random sampling across sleep stages for each model in the ensemble to avoid skewed performance in favor of the most represented sleep stages, and addressed the problem of misclassification errors due to class imbalance while significantly improving worst-stage classification. We used an openly available dataset from 20 healthy young adults for evaluation. We used a single channel of EEG from this dataset, which makes our method a suitable candidate for longitudinal monitoring using wearable EEG in real-world settings. Our method has both a high overall accuracy (78%, range 75-80%) and a high mean F1-score (84%, range 82-86%) and mean accuracy across individual sleep stages (86%, range 84-88%) over all subjects. The performance of our method appears to be uncorrelated with the sleep efficiency and percentage of transitional epochs in each recording. PMID:26464268

  14. Automatic detection of diabetic foot complications with infrared thermography by asymmetric analysis

    Science.gov (United States)

    Liu, Chanjuan; van Netten, Jaap J.; van Baal, Jeff G.; Bus, Sicco A.; van der Heijden, Ferdi

    2015-02-01

    Early identification of diabetic foot complications and their precursors is essential to prevent their devastating consequences, such as foot infection and amputation. Frequent, automatic risk assessment by an intelligent telemedicine system might be feasible and cost-effective. Infrared thermography is a promising modality for such a system. The temperature differences between corresponding areas on contralateral feet are the clinically significant parameters. This asymmetric analysis is hindered by (1) foot segmentation errors, especially when the foot temperature and the ambient temperature are comparable, and by (2) different shapes and sizes of contralateral feet due to deformities or minor amputations. To circumvent the first problem, we used a color image and a thermal image acquired synchronously. Foot regions, detected in the color image, were rigidly registered to the thermal image. This resulted in 97.8%±1.1% sensitivity and 98.4%±0.5% specificity over 76 high-risk diabetic patients, with manual annotation as a reference. Non-rigid landmark-based registration with B-splines solved the second problem: corresponding points in the two feet could be found regardless of the shapes and sizes of the feet. With that, the temperature difference between the left and right feet could be obtained.

  15. Gene Ontology density estimation and discourse analysis for automatic GeneRiF extraction

    Directory of Open Access Journals (Sweden)

    Mottaz Anaïs

    2008-04-01

    Abstract Background This paper describes and evaluates a sentence selection engine that extracts a GeneRiF (Gene Reference into Function), as defined in Entrez Gene, based on a MEDLINE record. Inputs for this task include both a gene and a pointer to a MEDLINE reference. In the suggested approach we merge two independent sentence extraction strategies. The first proposed strategy (LASt) uses argumentative features, inspired by discourse-analysis models. The second extraction scheme (GOEx) uses an automatic text categorizer to estimate the density of Gene Ontology categories in every sentence, thus providing a full ranking of all possible candidate GeneRiFs. A combination of the two approaches is proposed, which also aims at reducing the size of the selected segment by filtering out non-content-bearing rhetorical phrases. Results Based on the TREC-2003 Genomics collection for GeneRiF identification, the LASt extraction strategy is already competitive (52.78%). When used in a combined approach, the extraction task clearly shows improvement, achieving a Dice score of over 57% (+10%). Conclusions Argumentative representation levels and conceptual density estimation using Gene Ontology contents appear complementary for functional annotation in proteomics.

  16. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336
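
    For readers who want to reproduce a basic tie-point search, a minimal OpenCV sketch of plain SIFT extraction with Lowe's ratio-test matching (this is the standard operator, not the auto-adaptive A² variant; the file names are placeholders):

```python
import cv2

# Load two overlapping photogrammetric images (placeholder file names)
img1 = cv2.imread("strip1_photo042.tif", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("strip1_photo043.tif", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# Two nearest neighbours per descriptor, kept only when the best match is
# clearly better than the second best (Lowe's ratio test)
matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
tie_points = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt)
              for m, n in matches if m.distance < 0.8 * n.distance]
print(len(tie_points), "candidate tie points")
```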

  17. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A² SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems.

  18. System for the automatic analysis of defects in X-ray imaging

    International Nuclear Information System (INIS)

    A radiological device was developed to obtain directly digitized views. A set of algorithms has been developed and demonstrated for the automatic evaluation of welds. Some results concerning electronuclear fuel pin welds are presented

  19. Analysis and Design of PLC-based Control System for Automatic Beverage Filling Machine

    OpenAIRE

    Yundan Lu; Liangcai Zeng; Feilong Zheng; Gangsheng Kai

    2015-01-01

    Automatic filling systems are the main equipment in the food machinery industry. With the development of the beverage industry, demand for filling systems keeps increasing. The relay control method used in traditional filling machines has a low automation and integration level and cannot satisfy the rapid development of automatic production. The PLC control method, which has the advantages of simple programming, strong anti-interference and high working reliability, has gradually replaced the relay control method. In this ...

  20. Neuronal Spectral Analysis of EEG and Expert Knowledge Integration for Automatic Classification of Sleep Stages

    OpenAIRE

    Kerkeni, Nizar; Alexandre, Frédéric; Bedoui, Mohamed Hédi; Bougrain, Laurent; Dogui, Mohamed

    2005-01-01

    http://www.wseas.org Being able to analyze and interpret signals coming from electroencephalogram (EEG) recordings can be of high interest for many applications, including medical diagnosis and Brain-Computer Interfaces. Indeed, human experts are today able to extract from this signal many hints related to physiological as well as cognitive states of the recorded subject, and it would be very interesting to perform such a task automatically, but today no completely automatic system exists. In pre...

  1. Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis

    OpenAIRE

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, Seyedmohammad; Rosenwald, Dean P.

    2013-01-01

    Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were highly consistent for FACS action units, and showed similar effects for change over time in depression severity. For both systems, when s...

  2. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
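
    The chain-rule propagation that automatic differentiation performs can be demonstrated in a few lines with forward-mode dual numbers (an illustrative toy, not any particular tool from the bibliography):

```python
import math

class Dual:
    """Forward-mode automatic differentiation via dual numbers:
    propagate (value, derivative) pairs through the chain rule."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def sin(x):
    # chain rule: d/dt sin(x(t)) = cos(x) * x'(t)
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

x = Dual(1.5, 1.0)        # seed the input derivative with 1
f = x * sin(x) + 2 * x    # f(x) = x sin(x) + 2x
print(f.val, f.dot)       # value and exact derivative at x = 1.5
```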

  3. Petroleum product refining: plant level analysis of costs and competitiveness. Implications of greenhouse gas emission reductions. Vol 1

    International Nuclear Information System (INIS)

    Implications on the Canadian refining industry of reducing greenhouse gas (GHG) emissions to meet Canada's Kyoto commitment are assessed, based on a plant-level analysis of costs, benefits and economic and competitive impacts. It was determined on the basis of demand estimates prepared by Natural Resources Canada that refining industry carbon dioxide emissions could be as much as 38 per cent higher than 1990 levels in 2010. Achieving a six per cent reduction below 1990 levels from this business-as-usual case is considered a very difficult target to achieve, unless refinery shutdowns occur. This would require higher imports to meet Canada's petroleum products demand, leaving total carbon dioxide emissions virtually unchanged. A range of options, classified as (1) low capital, operating efficiency projects, (2) medium capital, process/utility optimization projects, (3) high capital, refinery specific projects, and (4) high operating cost GHG projects, were evaluated. Of these four alternatives, the low capital or operating efficiency projects were the only ones judged to have the potential to be economically viable. Energy efficiency projects in these four groups were evaluated under several policy initiatives including accelerated depreciation and a $200 per tonne of carbon tax. Results showed that an accelerated depreciation policy would lower the hurdle rate for refinery investments, and could achieve a four per cent reduction in GHG emissions below 1990 levels, assuming no further shutdown of refinery capacity. The carbon tax was judged to be potentially damaging to the Canadian refinery industry since it would penalize cracking refineries (most Canadian refineries are of this type); it would provide further uncertainty and risk, such that industry might not be able to justify investments to reduce emissions. The overall assessment is that the Canadian refinery industry could not meet the pro-rata Kyoto GHG reduction target through implementation of economically

  4. Automatic extraction of soft tissues from 3D MRI head images using model driven analysis

    International Nuclear Information System (INIS)

    This paper presents an automatic extraction system (called TOPS-3D: Top Down Parallel Pattern Recognition System for 3D Images) of soft tissues from 3D MRI head images using a model driven analysis algorithm. As in the construction of the system TOPS we developed previously, two concepts have been considered in the design of system TOPS-3D. One is a system with a hierarchical structure of reasoning using model information at the higher level, and the other is a parallel image processing structure used to extract plural candidate regions for a destination entity. The new points of system TOPS-3D are as follows. (1) TOPS-3D is a three-dimensional image analysis system including 3D model construction and 3D image processing techniques. (2) A technique is proposed to increase connectivity between knowledge processing at the higher level and image processing at the lower level. The technique is realized by applying the opening operation of mathematical morphology, in which a structural model function defined at the higher level by knowledge representation is used directly as the filter function of the opening operation in the lower-level image processing. The system TOPS-3D applied to 3D MRI head images consists of three levels. The first and second levels are the reasoning part, and the third level is the image processing part. In experiments, we applied 5 samples of 3D MRI head images of size 128 x 128 x 128 pixels to the system TOPS-3D to extract the regions of soft tissues such as cerebrum, cerebellum and brain stem. The experimental results show that the system is robust to variation of the input data thanks to the use of model information, and that the position and shape of the soft tissues are extracted in correspondence with the anatomical structure. (author)
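
    The coupling in point (2), in which a knowledge-level structural model is reused directly as the structuring element of a morphological opening, can be sketched as follows, with a 3-D ball standing in for the structural model function and a random volume standing in for a binarized MRI channel (all names hypothetical):

```python
import numpy as np
from scipy import ndimage

def ball(radius):
    """A 3-D ball structuring element; stand-in for the knowledge-level
    'structural model function' handed down to the image processing."""
    z, y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1, -radius:radius + 1]
    return x**2 + y**2 + z**2 <= radius**2

volume = np.random.rand(64, 64, 64) > 0.5   # placeholder binarized 3D volume
# Opening with the model-derived element keeps only regions that can
# contain the model shape, yielding candidate regions for the entity.
candidates = ndimage.binary_opening(volume, structure=ball(2))
```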

  5. Automatization of the neutron activation analysis method in the nuclear analysis laboratory

    International Nuclear Information System (INIS)

    In the present paper, the work done to automate the neutron activation analysis technique with a neutron generator is described. An interface between an IBM compatible microcomputer and the equipment in use for this kind of measurement was developed, including the specialized software for this system

  6. Determination of Bingham Rheological Parameters of SCC using On-line Video Image Analysis of Automatic Slump Flow Testing

    DEFF Research Database (Denmark)

    Thrane, Lars Nyholm; Pade, Claus

    A “touch one button” prototype system for estimation of the Bingham rheological parameters of SCC has been developed. Video image analysis is used to obtain a series of corresponding values of concrete spread versus time during an automatic slump flow test. The spread versus time curve is subsequently used to estimate the Bingham rheological parameters by a least squares search in a database. It takes less than 120 seconds from the start of the slump flow test until the SCC’s Bingham rheological parameters appear on the system’s PC.

  7. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may appear random. They are complicated, in the sense of not being ultimately periodic, and they may look complicated, in the sense that it may not be easy to name the rule by which the sequence is generated; nevertheless, such a rule always exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; the basic (combinatorial) properties of automatic sequences; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
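
    A concrete example of a sequence produced by a finite automaton is the Thue-Morse sequence, which is 2-automatic: the n-th term is obtained by running the binary digits of n through a two-state automaton; a minimal sketch:

```python
def thue_morse(n):
    """n-th Thue-Morse term: feed the binary digits of n to a 2-state
    automaton whose state is the parity of the 1-bits seen so far."""
    state = 0                  # start state
    for bit in bin(n)[2:]:     # read the binary digits of n
        if bit == "1":
            state ^= 1         # toggle on input 1
    return state               # the output map is the identity

print([thue_morse(n) for n in range(16)])
# [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0]
```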

  8. Automatic Condition Monitoring of Industrial Rolling-Element Bearings Using Motor’s Vibration and Current Analysis

    OpenAIRE

    Zhenyu Yang

    2015-01-01

    An automatic condition monitoring for a class of industrial rolling-element bearings is developed based on the vibration as well as stator current analysis. The considered fault scenarios include a single-point defect, multiple-point defects, and a type of distributed defect. Motivated by the potential commercialization, the developed system is promoted mainly using off-the-shelf techniques, that is, the high-frequency resonance technique with envelope detection and the average of short-time ...

  9. Automatic detection of outlines. Application to the quantitative analysis of renal scintiscanning pictures

    International Nuclear Information System (INIS)

    The purpose of the work described is to finalize a method for automatically extracting the significant outlines from a renal scintiscanning picture. The algorithms must be simple and high-performing; their routine execution on a mini-computer must be fast enough to compete effectively with human performance. However, the method that has been developed is general enough to be adapted, with slight modifications, to other types of picture. The first chapter is a brief introduction to the principle of scintiscanning, the equipment used and the type of picture obtained therefrom. In the second chapter the various approaches used for form recognition and scene analysis are very briefly described with the help of examples. The third chapter deals with pretreatment techniques (particularly the machine operators) used for segmenting the pictures. Chapter four presents techniques which segment the picture by parallel processing of all its points. In chapter five a description is given of the sequential search techniques for the outline elements, drawing inspiration from the methods used in artificial intelligence for solving the optimization problem. The sixth chapter shows the difficulties encountered in extracting the renal outlines and the stages of the planning technique adopted to overcome these difficulties. Chapter seven describes in detail the two search methods employed for generating the plan. In chapter eight, the methods used for extending the areas obtained on the plan and for refining the outlines that bound them are dealt with. Chapter nine is a short presentation of the organization of the programmes and of their data structure. Finally, examples of results are given in chapter ten

  10. Development of user interface to support automatic program generation of nuclear power plant analysis by module-based simulation system

    International Nuclear Information System (INIS)

    Module-based Simulation System (MSS) has been developed to realize a new software work environment enabling versatile dynamic simulation of complex nuclear power systems flexibly. MSS makes full use of modern software technology to replace a large fraction of the human software work in complex, large-scale program development with computer automation. The fundamental methods utilized in MSS and a developmental study on the human interface system SESS-1, which helps users generate integrated simulation programs automatically, are summarized as follows: (1) To enhance usability and 'communality' of program resources, the basic mathematical models of common usage in nuclear power plant analysis are programmed as 'modules' and stored in a module library. The information on usage of individual modules is stored in a module database with easy registration, update and retrieval by the interactive management system. (2) Target simulation programs and the input/output files are automatically generated with simple block-wise languages by a precompiler system for module integration purposes. (3) Working time for program development and analysis in an example study of an LMFBR plant thermal-hydraulic transient analysis was demonstrated to be remarkably shortened with the introduction of the interface system SESS-1, developed as an automatic program generation environment. (author)

  11. Analysis and Design of PLC-based Control System for Automatic Beverage Filling Machine

    Directory of Open Access Journals (Sweden)

    Yundan Lu

    2015-01-01

    Full Text Available Automatic filling systems are the main equipment in the food machinery industry. With the development of the beverage industry, demand for filling systems keeps increasing. The relay control method used in traditional filling machines has a low automation and integration level and cannot satisfy the rapid development of automatic production. The PLC control method, which has the advantages of simple programming, strong anti-interference and high working reliability, has gradually replaced the relay control method. In this study, the hardware and software for a PLC-based automatic filling system are designed; in particular, the injection section servo control system, which adopts a servo-motor-driven metering pump, is analyzed in detail, and the filling precision is greatly improved.

  12. Proceedings of the Seventh Conference of Nuclear Sciences and Applications. Vol.1,2,3

    International Nuclear Information System (INIS)

    The publication has been set up as a textbook for nuclear sciences and applications. Vol.1: (1) radiochemistry; (2) radiation chemistry; (3) isotope production; (4) waste management. Vol.2: (1) nuclear and reactor physics; (2) plasma physics; (3) instrumentation and devices; (4) trace and ultra trace analysis; (5) environmental studies. Vol.3: (1) radiation protection; (2) radiation health hazards; (3) nuclear safety; (4) biology; (5) agriculture

  13. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    Science.gov (United States)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the location announced by environmental authorities and companies and the automatic location, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven could be automatically detected. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.

  14. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Full text: In this work we develop an automatic scheme for studying the effect of antenatal Dexamethasone on EEG activity. To do so, an FFT (Fast Fourier Transform) based detector was designed and applied to EEG recordings obtained from two groups of fetal sheep. Both groups received two injections with a time delay of 24 h between them; however, the applied medicine was different for each group (Dex and saline). The detector developed was used to automatically identify and classify micro-seizures that occurred in the frequency bands corresponding to the EEG transients known as slow waves (2.5-14 Hz). For each second of the data recordings the spectrum was computed, and a rise of the energy in each predefined frequency band was counted when the energy level exceeded a predefined corresponding threshold level (where the threshold level was obtained from the long-term average of the spectral points in each band). Our results demonstrate that it was possible to automatically count the micro-seizures in the three different bands in a time-effective manner. It was found that the number of transients did not strongly depend on the nature of the injected medicine, which was consistent with the results obtained manually by an EEG expert. In conclusion, the automatic detection scheme presented here would allow rapid micro-seizure event identification in hours of highly sampled EEG data, thus providing a valuable time-saving device.
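
    A schematic version of the detector described above, assuming 1-second epochs and a threshold set to a multiple k of the long-term average band energy (k and the interface are our assumptions):

```python
import numpy as np

def count_microseizures(eeg, fs, band, k=3.0):
    """Count 1-s epochs whose spectral energy inside `band` (Hz) exceeds
    k times the long-term average energy of that band."""
    n = int(fs)                                    # samples per 1-s epoch
    epochs = eeg[:len(eeg) // n * n].reshape(-1, n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    sel = (freqs >= band[0]) & (freqs <= band[1])
    energy = (np.abs(np.fft.rfft(epochs, axis=1))[:, sel] ** 2).sum(axis=1)
    return int((energy > k * energy.mean()).sum())

# e.g. count events in the slow-wave band of a 1-hour, 256 Hz recording
eeg = np.random.randn(3600 * 256)
print(count_microseizures(eeg, fs=256, band=(2.5, 14.0)))
```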

  15. Benefits of automatic multielemental analysis of zinc-lead ore slurries by radioisotope X-ray fluorescence

    International Nuclear Information System (INIS)

    The radioisotope X-ray fluorescence measuring system has been developed for automatic multielement analysis of zinc-lead ore slurries. The system consists of several XRF measuring probes, electronic unit and minicomputer with its peripherals. The system has been used for simultaneous determination of Fe, Zn and Pb in flotation streams with accuracy within 3-15%, depending on metal concentration. Improved control of the flotation process resulting from on-stream analysis has led to increases of up to 3.4% in metal recovery. (author). 3 refs, 4 figs, 1 tab

  16. Evaluating PcGets and RETINA as Automatic Model Selection Algorithms.

    OpenAIRE

    Jennifer L. Castle

    2005-01-01

    The paper describes two automatic model selection algorithms, RETINA and PcGets, briefly discussing how the algorithms work and what their performance claims are. RETINA's Matlab implementation of the code is explained, then the program is compared with PcGets on the data in Perez-Amaral, Gallo and White (2005, Econometric Theory, Vol. 21, pp. 262-277), "A Comparison of Complementary Automatic Modelling Methods: RETINA and PcGets", and Hoover and Perez (1999, Econometrics Journal, Vol. 2, pp....

  17. Sleep stage scoring using the neural network model: comparison between visual and automatic analysis in normal subjects and patients.

    Science.gov (United States)

    Schaltenbrand, N; Lengelle, R; Toussaint, M; Luthringer, R; Carelli, G; Jacqmin, A; Lainey, E; Muzet, A; Macher, J P

    1996-01-01

    In this paper, we compare and analyze the results from automatic analysis and visual scoring of nocturnal sleep recordings. The validation is based on a sleep recording set of 60 subjects (33 males and 27 females), consisting of three groups: 20 normal controls subjects, 20 depressed patients and 20 insomniac patients treated with a benzodiazepine. The inter-expert variability estimated from these 60 recordings (61,949 epochs) indicated an average agreement rate of 87.5% between two experts on the basis of 30-second epochs. The automatic scoring system, compared in the same way with one expert, achieved an average agreement rate of 82.3%, without expert supervision. By adding expert supervision for ambiguous and unknown epochs, detected by computation of an uncertainty index and unknown rejection, the automatic/expert agreement grew from 82.3% to 90%, with supervision over only 20% of the night. Bearing in mind the composition and the size of the test sample, the automated sleep staging system achieved a satisfactory performance level and may be considered a useful alternative to visual sleep stage scoring for large-scale investigations of human sleep. PMID:8650459

  18. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    Science.gov (United States)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.
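
    A minimal sketch of the GMM classification scheme, with random arrays standing in for the spectral (e.g. MFCC) feature frames of the two speaker groups; the component count and feature dimensionality are illustrative assumptions:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Random arrays stand in for per-frame spectral features (e.g. MFCCs)
# pooled over each training population: shape (n_frames, n_coefficients).
features_healthy = np.random.randn(2000, 13)
features_apnoea = np.random.randn(2000, 13) + 0.5

# One GMM per class, fitted on that class's frames
gmm_healthy = GaussianMixture(n_components=16,
                              covariance_type="diag").fit(features_healthy)
gmm_apnoea = GaussianMixture(n_components=16,
                             covariance_type="diag").fit(features_apnoea)

def classify(frames):
    """Assign the class whose GMM yields the higher mean log-likelihood."""
    if gmm_apnoea.score(frames) > gmm_healthy.score(frames):
        return "apnoea"
    return "healthy"

print(classify(np.random.randn(300, 13)))
```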

  19. Application for the technology overview of web projects based on automatic analysis of repositories

    OpenAIRE

    POLANC, ALJANA

    2016-01-01

    This thesis comprises the development and presentation of an application which aims to facilitate the overview of the technical and other data of web projects that are being developed or maintained in a company. The application automatically updates the collection of relevant data of the web projects by integrating with the GitHub web service, where it obtains the information regarding programming languages, libraries, other technologies, project contributors and other important data. In the ...

  20. Automatic Prediction of Cardiovascular and Cerebrovascular Events Using Heart Rate Variability Analysis

    OpenAIRE

    Melillo, Paolo; Izzo, Raffaele; Orrico, Ada; Scala, Paolo; Attanasio, Marcella; Mirra, Marco; De Luca, Nicola; Pecchia, Leandro

    2015-01-01

    Background There is consensus that Heart Rate Variability is associated with the risk of vascular events. However, Heart Rate Variability predictive value for vascular events is not completely clear. The aim of this study is to develop novel predictive models based on data-mining algorithms to provide an automatic risk stratification tool for hypertensive patients. Methods A database of 139 Holter recordings with clinical data of hypertensive patients followed up for at least 12 months were c...

  1. A novel automatic film changer for high-speed analysis of nuclear emulsions

    OpenAIRE

    Borer, K.; Damet, J; Hess, M.; I. Kreslo; Moser, U.; Pretzl, K.; Savvinov, N.; Schuetz, H. -U.; Waelchli, T.; Weber, M.

    2006-01-01

    This paper describes the recent development of a novel automatic computer-controlled manipulator for emulsion film placement and removal at the microscope object table (also called stage). The manipulator is designed for mass scanning of emulsion films for the OPERA neutrino oscillation experiment and provides emulsion changing time shorter than 30 seconds with an emulsion film positioning accuracy as good as 20 microns RMS.

  2. Automatic derivation of domain terms and concept location based on the analysis of the identifiers

    OpenAIRE

    Vaclavik, Peter; Poruban, Jaroslav; Mezei, Marek

    2010-01-01

    Developers express the meaning of domain ideas in specifically selected identifiers and comments that form the implemented code. Software maintenance requires knowledge and understanding of the encoded ideas. This paper presents a way to automatically create a domain vocabulary. Knowledge of the domain vocabulary supports the comprehension of a specific domain for later code maintenance or evolution. We present experiments conducted in two selected domains: application servers and we...

  3. Evaluation of Characteristics of Non-Metallic Inclusions in P/M Ni-Base Superalloy by Automatic Image Analysis

    Institute of Scientific and Technical Information of China (English)

    Li Xinggang; Ge Changchun; Shen Weiping

    2007-01-01

    Non-metallic inclusions, especially the large ones, within P/M Ni-base superalloy have a major influence on fatigue characteristics, but are not directly measurable by routine inspection. In this paper, a method, automatic image analysis, is proposed for estimation of the content, size and amount of non-metallic inclusions in superalloy. The methodology for the practical application of this method is described and the factors affecting the precision of the estimation are discussed. In the experiment, the characteristics of the non-metallic inclusions in Ni-base P/M superalloy are analyzed.

  4. AUTOCASK (AUTOmatic Generation of 3-D CASK models). A microcomputer based system for shipping cask design review analysis

    International Nuclear Information System (INIS)

    AUTOCASK (AUTOmatic Generation of 3-D CASK models) is a microcomputer-based system of computer programs and databases developed at the Lawrence Livermore National Laboratory (LLNL) for the structural analysis of shipping casks for radioactive material. Model specification is performed on the microcomputer, and the analyses are performed on an engineering workstation or mainframe computer. AUTOCASK is based on 80386/80486 compatible microcomputers. The system is composed of a series of menus, input programs, display programs, a mesh generation program, and archive programs. All data is entered through fill-in-the-blank input screens that contain descriptive data requests

  5. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Science.gov (United States)

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown to a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represent a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements as well as screws as structural elements. The latter one offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for the automatic generation in the FE-software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling the computational time could be reduced by 85% using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent

  6. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Directory of Open Access Journals (Sweden)

    Jan Wieding

    Full Text Available The use of finite element analysis (FEA) has grown to a more and more important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represent a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements as well as screws as structural elements. The latter one offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for the automatic generation in the FE-software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling the computational time could be reduced by 85% using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with

  7. Dynamic Analysis and Control of an Automatic Transmission for Start-Stop Function and Efficiency Improvement

    OpenAIRE

    Yang Liu; Shuhan Wang; Peng Dong; Xiangyang Xu

    2015-01-01

    An electric oil pump (EOP) was integrated into the hydraulic system and an automatic transmission (AT) mechanical oil pump (MOP) was downsized. These processes were performed to combine a start-stop function with the AT and further improve the transmission efficiency. Furthermore, this study established a dynamics model of power loss and leakage of an 8-speed AT; a flow-based control algorithm of the EOP was then developed to realize the start-stop function and support the MOP to meet the flo...

  8. Computational text analysis and reading comprehension exam complexity towards automatic text classification

    CERN Document Server

    Liontou, Trisevgeni

    2014-01-01

    This book delineates a range of linguistic features that characterise the reading texts used at the B2 (Independent User) and C1 (Proficient User) levels of the Greek State Certificate of English Language Proficiency exams in order to help define text difficulty per level of competence. In addition, it examines whether specific reader variables influence test takers' perceptions of reading comprehension difficulty. The end product is a Text Classification Profile per level of competence and a formula for automatically estimating text difficulty and assigning levels to texts consistently and re

  9. S-onset time automatic picking based on polarization analysis and higher order statistics

    Science.gov (United States)

    Lois, A.; Sokos, E.; Paraskevopoulos, P.; Tselentis, G.-A.

    2012-04-01

    Automatic S-wave onset time identification constitutes a difficult problem for seismologists, due to the increased level of seismic energy from P-coda waves before the S-wave arrival. Most of the algorithms proposed up to now are mainly based on the polarization features of the seismic waves. In this work we propose a new, simple time-domain technique for automatically determining S-arrival onsets and present its implementation on 3-component single-station data. Over small time windows, the eigenproblem of the covariance matrix is solved and the largest eigenvalue is retained for further processing. We form in this way a time series of maximum eigenvalues, which serves as a characteristic function whose statistical properties provide an initial S-arrival time estimate. A multi-window approach combined with an energy-based weighting scheme is also applied, in order to reduce the algorithm's dependence on the moving window's length. Specifically, for each S-arrival time estimate an automatically evaluated uncertainty index is introduced for assessing the probability of a false alarm. This quality measure, similar to SNR, is based on an energy ratio estimated on the two horizontal components within a predefined time section. In this way we arrive at a set of solutions whose weighted mean is the final S-onset time estimate. Automatic picks are compared against manual reference picks, yielding sufficiently good results regarding accuracy as well as noise robustness. In general the proposed method is straightforward to implement, demands low computational resources, and the only parameters that have to be set are the lengths of the moving time windows the algorithm uses. Furthermore, it has to be mentioned that detected seismic signals as well as good-quality P-picks are prerequisites for correct estimates of S-arrival times. Due to its efficiency, the specific technique can be used as a useful tool
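
    The eigenvalue-based characteristic function can be sketched directly in NumPy; the window length and the crude onset rule below are illustrative assumptions, and the multi-window weighting and uncertainty index are omitted:

```python
import numpy as np

def max_eigenvalue_cf(z, n, e, win=50):
    """Characteristic function: largest eigenvalue of the 3-component
    covariance matrix over a sliding window of `win` samples."""
    data = np.vstack([z, n, e])                  # 3 x n_samples
    cf = np.empty(data.shape[1] - win)
    for i in range(cf.size):
        c = np.cov(data[:, i:i + win])           # 3 x 3 covariance
        cf[i] = np.linalg.eigvalsh(c)[-1]        # maximum eigenvalue
    return cf

def pick_onset(cf):
    """Crude onset estimate: sample with the largest jump of the function."""
    return int(np.argmax(np.diff(cf)))

# Synthetic 3-component record with an "S arrival" at sample 600
z, n, e = np.random.randn(3, 1000) * np.r_[np.ones(600), 5 * np.ones(400)]
print(pick_onset(max_eigenvalue_cf(z, n, e)))
```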

  10. A fast automatic plate changer for the analysis of nuclear emulsions

    International Nuclear Information System (INIS)

    This paper describes the design and performance of a computer controlled emulsion Plate Changer for the automatic placement and removal of nuclear emulsion films for the European Scanning System microscopes. The Plate Changer is used for mass scanning and measurement of the emulsions of the OPERA neutrino oscillation experiment at the Gran Sasso lab on the CNGS neutrino beam. Unlike other systems it works with both dry and oil objectives. The film changing takes less than 20 s and the accuracy on the positioning of the emulsion films is about 10μm. The final accuracy in retrieving track coordinates after fiducial marks measurement is better than 1μm

  11. Automatic categorization of Spanish texts into linguistic registers: a contrastive analysis

    Directory of Open Access Journals (Sweden)

    John Roberto Rodríguez

    2013-07-01

    Full Text Available Collaborative software such as Recommender Systems can benefit from the automatic classification of texts into linguistic registers. First, the linguistic register provides information about the users' profiles and the context of the recommendation. Second, considering the characteristics of each type of text can help to improve existing natural language processing methods. In this paper we contrast two approaches to register categorization for Spanish. The first approach is focused on morphosyntactic patterns and the second one on lexical patterns. For the experimental evaluation we tested 38 machine learning algorithms, achieving precision higher than 89%.

  12. Analysis of Relationship between Pioneer Brand Status and Consumer’s Attitude toward a Brand (Case on Yamaha Automatic vs. Honda Automatic Transmission Motorcycle in Indonesia

    Directory of Open Access Journals (Sweden)

    Arga Hananto

    2011-06-01

    Full Text Available Previous research has indicated that brand pioneership provides advantages such as high market share, barriers to entry, and consumer preference, as well as more favorable consumer attitudes. This paper explores the relationship between perceived brand pioneership and consumers' brand attitude. The study focuses on two competing brands from the automatic transmission motorcycle category, namely Yamaha and Honda. Based on results from 90 respondents, this study confirms the perception that Yamaha (although not the true pioneer) is perceived by the majority of respondents as the pioneering brand in the automatic transmission motorcycle market. This study also found that those respondents who perceived Yamaha as the pioneer brand tend to ascribe a higher brand attitude toward Yamaha than toward Honda. Results from this study add to the repository of studies concerning brand pioneership as well as to the repository of knowledge about Indonesian consumer behavior.

  13. Journalism Abstracts. Vol. 15.

    Science.gov (United States)

    Popovich, Mark N., Ed.

    This book, the fifteenth volume of an annual publication, contains 373 abstracts of 52 doctoral and 321 master's theses from 50 colleges and universities. The abstracts are arranged alphabetically by author, with the doctoral dissertations appearing first. These cover such topics as advertising, audience analysis, content analysis of news issues…

  14. Automatic geometric modeling, mesh generation and FE analysis for pipelines with idealized defects and arbitrary location

    Energy Technology Data Exchange (ETDEWEB)

    Motta, R.S.; Afonso, S.M.B.; Willmersdorf, R.B.; Lyra, P.R.M. [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Cabral, H.L.D. [TRANSPETRO, Rio de Janeiro, RJ (Brazil); Andrade, E.Q. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Although the Finite Element Method (FEM) has proved to be a powerful tool to predict the failure pressure of corroded pipes, the generation of good computational models of pipes with corrosion defects can take several days. This makes the computational simulation procedure difficult to apply in practice. The main purpose of this work is to develop a set of computational tools to automatically produce models of pipes with defects, ready to be analyzed with commercial FEM programs, starting from a few parameters that locate and provide the main dimensions of the defect or a series of defects. These defects can be internal or external and can assume general spatial locations along the pipe. Idealized rectangular and elliptic geometries can be generated. The tools were based on the MSC.PATRAN pre- and post-processing programs and were written in PCL (Patran Command Language). The program for the automatic generation of models (PIPEFLAW) has a simplified and customized graphical interface, so that an engineer with basic notions of computational simulation with the FEM can rapidly generate models that result in precise and reliable simulations. Some examples of models of pipes with defects generated by the PIPEFLAW system are shown, and the results of numerical analyses, done with the tools presented in this work, are compared with empirical results. (author)

  15. Automatic prediction of cardiovascular and cerebrovascular events using heart rate variability analysis.

    Directory of Open Access Journals (Sweden)

    Paolo Melillo

    Full Text Available There is consensus that Heart Rate Variability is associated with the risk of vascular events. However, Heart Rate Variability predictive value for vascular events is not completely clear. The aim of this study is to develop novel predictive models based on data-mining algorithms to provide an automatic risk stratification tool for hypertensive patients. A database of 139 Holter recordings with clinical data of hypertensive patients followed up for at least 12 months was collected ad hoc. Subjects who experienced a vascular event (i.e., myocardial infarction, stroke, syncopal event) were considered as high-risk subjects. Several data-mining algorithms (such as support vector machine, tree-based classifier, artificial neural network) were used to develop automatic classifiers, and their accuracy was tested by assessing the receiver-operator characteristics curve. Moreover, we tested the echographic parameters, which have been shown to be powerful predictors of future vascular events. The best predictive model was based on random forest and enabled the identification of high-risk hypertensive patients with sensitivity and specificity rates of 71.4% and 87.8%, respectively. The Heart Rate Variability based classifier showed higher predictive values than the conventional echographic parameters, which are considered significant cardiovascular risk factors. Combination of Heart Rate Variability measures, analyzed with data-mining algorithms, could be a reliable tool for identifying hypertensive patients at high risk of developing future vascular events.
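
    A minimal sketch of this kind of data-mining evaluation pipeline, with random placeholders standing in for the HRV measures and outcomes of the 139 recordings (the feature count and hyper-parameters are assumptions):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

# Random placeholders stand in for the database: one row of HRV measures
# per Holter recording, and a binary vascular-event outcome.
X = np.random.randn(139, 12)
y = np.random.binomial(1, 0.3, 139)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
proba = cross_val_predict(clf, X, y, cv=5, method="predict_proba")[:, 1]
print("cross-validated AUC:", roc_auc_score(y, proba))
```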

  16. Automatic Screening and Grading of Age-Related Macular Degeneration from Texture Analysis of Fundus Images

    Science.gov (United States)

    Phan, Thanh Vân; Seoud, Lama; Chakor, Hadi; Cheriet, Farida

    2016-01-01

    Age-related macular degeneration (AMD) is a disease which causes visual deficiency and irreversible blindness to the elderly. In this paper, an automatic classification method for AMD is proposed to perform robust and reproducible assessments in a telemedicine context. First, a study was carried out to highlight the most relevant features for AMD characterization based on texture, color, and visual context in fundus images. A support vector machine and a random forest were used to classify images according to the different AMD stages following the AREDS protocol and to evaluate the features' relevance. Experiments were conducted on a database of 279 fundus images coming from a telemedicine platform. The results demonstrate that local binary patterns in multiresolution are the most relevant for AMD classification, regardless of the classifier used. Depending on the classification task, our method achieves promising performances with areas under the ROC curve between 0.739 and 0.874 for screening and between 0.469 and 0.685 for grading. Moreover, the proposed automatic AMD classification system is robust with respect to image quality.
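
    A sketch of the multiresolution local-binary-pattern descriptor found most relevant here, using scikit-image; the radii, point counts and placeholder image are illustrative assumptions:

```python
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_histogram(gray, n_points, radius):
    """Normalized histogram of uniform LBP codes at one resolution."""
    lbp = local_binary_pattern(gray, n_points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=n_points + 2,
                           range=(0, n_points + 2), density=True)
    return hist

def multiresolution_lbp(gray, radii=(1, 2, 3)):
    """Concatenate LBP histograms over several radii; the resulting
    vector feeds an SVM or random forest for screening/grading."""
    return np.concatenate([lbp_histogram(gray, 8 * r, r) for r in radii])

features = multiresolution_lbp(np.random.rand(256, 256))  # placeholder image
print(features.shape)
```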

  17. Study on triterpenoic acids distribution in Ganoderma mushrooms by automatic multiple development high performance thin layer chromatographic fingerprint analysis.

    Science.gov (United States)

    Yan, Yu-Zhen; Xie, Pei-Shan; Lam, Wai-Kei; Chui, Eddie; Yu, Qiong-Xi

    2010-01-01

    Ganoderma--"Lingzhi" in Chinese--is one of the superior Chinese tonic materia medicas in China, Japan, and Korea. Two species, Ganoderma lucidum (Red Lingzhi) and G. sinense (Purple Lingzhi), have been included in the Chinese Pharmacopoeia since its 2000 Edition. However, some other species of Ganoderma are also available in the market. For example, there are five species divided by color, called "Penta-colors Lingzhi", that have been advocated as the most invigorating among the Lingzhi species, but there is no scientific evidence for such a claim. Morphological identification can serve as an effective practice for differentiating the various species, but the inherent quality has to be delineated by chemical analysis. Among the diverse constituents in Lingzhi, triterpenoids are commonly recognized as the major active ingredients. An automatic triple-development HPTLC fingerprint analysis was carried out for detecting the distribution consistency of the triterpenoic acids in various Lingzhi samples. The chromatographic conditions were optimized as follows: stationary phase, precoated HPTLC silica gel 60 plate; mobile phase, toluene-ethyl acetate-methanol-formic acid (15 + 15 + 1 + 0.1); and triple development using automatic multiple development equipment. The chromatograms showed good resolution, and the color images provided more specific HPTLC fingerprints than those previously published. It was observed that the abundance of triterpenoic acids and the consistency of the fingerprint pattern in Red Lingzhi (fruiting body of G. lucidum) outweigh those of the other Lingzhi species. PMID:21140647

  18. Automatic image analysis methods for the determination of stereological parameters - application to the analysis of densification during solid state sintering of WC-Co compacts

    Science.gov (United States)

    Missiaen; Roure

    2000-08-01

    Automatic image analysis methods which were used to determine microstructural parameters of sintered materials are presented. Estimation of stereological parameters at interfaces, when the system contains more than two phases, is particularly detailed. It is shown that the specific surface areas and mean curvatures of the various interfaces can be estimated in the numerical space of the images. The methods are applied to the analysis of densification during solid state sintering of WC-Co compacts. The microstructural evolution is commented on. Application of microstructural measurements to the analysis of densification kinetics is also discussed. PMID:10947907

  19. [Digital storage and semi-automatic analysis of esophageal pressure signals. Evaluation of a commercialized system (PC Polygraf, Synectics)].

    Science.gov (United States)

    Bruley des Varannes, S; Pujol, P; Salim, B; Cherbut, C; Cloarec, D; Galmiche, J P

    1989-11-01

    The aim of this work was to evaluate a new commercially available pressure recording system (PC Polygraf, Synectics) and to compare this system with a classical method using perfused catheters. The PC Polygraf uses microtransducers and allows direct digitized storage and semi-automatic analysis of data. In the first part of this study, manometric assessment was conducted using only perfused catheters. The transducers were connected both to an analog recorder and to a PC Polygraf. Using the two methods of analysis, contraction amplitudes were strongly correlated (r = 0.99; p less than 0.0001), whereas durations were significantly but loosely correlated (r = 0.51; p less than 0.001). Resting LES pressure was significantly correlated (r = 0.87; p less than 0.05). In the second part of this study, simultaneous recordings of esophageal pressure were conducted in 7 patients by placing the two tubes (microtransducers and perfused catheters) side by side with the sideholes at the same level. The characteristics of the waves were determined both by visual analysis of the analog tracing and by semi-automatic analysis of the digitized recording with an adequate program. Mean amplitude was lower with the microtransducers than with the perfused catheters (60 vs 68 cm H2O; p less than 0.05), but the duration of waves was not significantly different between the two systems. Values obtained for each of these parameters using both methods were significantly correlated (amplitude: r = 0.74; duration: r = 0.51). The localization of the sphincter and the measurement of its basal tone were found to be difficult when using microtransducers. These results show that the PC Polygraf allows a satisfactory analysis of esophageal pressure signals. However, only perfused catheters offer excellent reliability for complete studies of both the sphincter and peristalsis. PMID:2612832

  20. γH2AX foci as a measure of DNA damage: A computational approach to automatic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ivashkevich, Alesia N. [Trescowthick Research Laboratories, Peter MacCallum Cancer Centre, St. Andrew's Place, East Melbourne, Victoria 3002 (Australia); Martin, Olga A. [Laboratory of Molecular Pharmacology, Center for Cancer Research, National Cancer Institute, National Institute of Health, D.H.H.S., Bethesda, MD 20892 (United States); Smith, Andrea J. [Trescowthick Research Laboratories, Peter MacCallum Cancer Centre, St. Andrew's Place, East Melbourne, Victoria 3002 (Australia); Redon, Christophe E.; Bonner, William M. [Laboratory of Molecular Pharmacology, Center for Cancer Research, National Cancer Institute, National Institute of Health, D.H.H.S., Bethesda, MD 20892 (United States); Martin, Roger F. [Trescowthick Research Laboratories, Peter MacCallum Cancer Centre, St. Andrew's Place, East Melbourne, Victoria 3002 (Australia); Lobachevsky, Pavel N., E-mail: pavel.lobachevsky@petermac.org [Trescowthick Research Laboratories, Peter MacCallum Cancer Centre, St. Andrew's Place, East Melbourne, Victoria 3002 (Australia)

    2011-06-03

    The γH2AX focus assay represents a fast and sensitive approach for the detection of one of the critical types of DNA damage - double-strand breaks (DSB) induced by various cytotoxic agents including ionising radiation. Apart from research applications, the assay has a potential in clinical medicine/pathology, such as assessment of individual radiosensitivity, response to cancer therapies, as well as in biodosimetry. Given that generally there is a direct relationship between numbers of microscopically visualised γH2AX foci and DNA DSB in a cell, the number of foci per nucleus represents the most efficient and informative parameter of the assay. Although computational approaches have been developed for automatic focus counting, the tedious and time consuming manual focus counting still remains the most reliable way due to limitations of computational approaches. We suggest a computational approach and associated software for automatic focus counting that minimises these limitations. Our approach, while using standard image processing algorithms, maximises the automation of identification of nuclei/cells in complex images, offers an efficient way to optimise parameters used in the image analysis and counting procedures, optionally invokes additional procedures to deal with variations in intensity of the signal and background in individual images, and provides automatic batch processing of a series of images. We report results of validation studies that demonstrated correlation of manual focus counting with results obtained using our computational algorithm for mouse jejunum touch prints, mouse tongue sections and human blood lymphocytes as well as radiation dose response of γH2AX focus induction for these biological specimens.
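
    A minimal sketch of the kind of computational focus counting discussed above: thresholding relative to the intra-nuclear background, then connected-component labelling. The threshold rule, size filter and placeholder data are our assumptions, not the authors' algorithm:

```python
import numpy as np
from scipy import ndimage

def count_foci(focus_img, nucleus_mask, k=2.0, min_size=4):
    """Count foci inside one nucleus: threshold the focus channel relative
    to the intra-nuclear background, then label and size-filter the
    connected bright regions."""
    inside = focus_img[nucleus_mask]
    bright = (focus_img > inside.mean() + k * inside.std()) & nucleus_mask
    labels, n = ndimage.label(bright)
    sizes = ndimage.sum(bright, labels, range(1, n + 1))
    return int((sizes >= min_size).sum())

# Placeholder image and nucleus mask for illustration
img = np.random.rand(128, 128)
mask = np.zeros((128, 128), bool)
mask[32:96, 32:96] = True
print(count_foci(img, mask))
```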

  1. Automatic system for quantification and visualization of lung aeration on chest computed tomography images: the Lung Image System Analysis - LISA

    International Nuclear Information System (INIS)

    High Resolution Computed Tomography (HRCT) is the exam of choice for the diagnostic evaluation of lung parenchyma diseases. There is an increasing interest in computational systems able to automatically analyze the radiological densities of the lungs in CT images. The main objective of this study is to present a system for the automatic quantification and visualization of lung aeration in HRCT images with different degrees of aeration, called Lung Image System Analysis (LISA). The secondary objective is to compare LISA to the Osiris system and to a specific lung segmentation algorithm (ALS) with respect to the accuracy of lung segmentation. The LISA system automatically extracts the following image attributes: lung perimeter, cross-sectional area, volume, the radiological densities histograms, the mean lung density (MLD) in Hounsfield units (HU), the relative area of the lungs with voxels with density values lower than -950 HU (RA950) and the 15th percentile of the least dense voxels (PERC15). Furthermore, LISA has a colored mask algorithm that applies pseudo-colors to the lung parenchyma according to the pre-defined radiological density chosen by the system user. The lung segmentations of 102 images from 8 healthy volunteers and 141 images from 11 patients with Chronic Obstructive Pulmonary Disease (COPD) were compared for accuracy and concordance among the three methods. LISA was more effective at lung segmentation than the other two methods. LISA's color mask tool improves the spatial visualization of the degrees of lung aeration, and the various attributes of the image that can be extracted may help physicians and researchers to better assess lung aeration both quantitatively and qualitatively. LISA may have important clinical and research applications for the assessment of global and regional lung aeration and therefore deserves further developments and validation studies. (author)
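
    The global aeration attributes LISA extracts from the segmented lung voxels can be illustrated with a few lines of NumPy (a sketch of the standard definitions, not LISA's code; the placeholder voxels are random):

```python
import numpy as np

def aeration_indices(hu):
    """Global aeration attributes from the segmented lung voxels' HU values."""
    hu = np.asarray(hu, dtype=float)
    mld = hu.mean()                      # mean lung density, HU
    ra950 = (hu < -950).mean() * 100     # % of voxels below -950 HU
    perc15 = np.percentile(hu, 15)       # 15th percentile of densities
    return mld, ra950, perc15

# Placeholder lung voxels (uniformly spread HU values for illustration)
print(aeration_indices(np.random.uniform(-1000, -200, 100000)))
```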

  2. Automatic system for quantification and visualization of lung aeration on chest computed tomography images: the Lung Image System Analysis - LISA

    Energy Technology Data Exchange (ETDEWEB)

    Felix, John Hebert da Silva; Cortez, Paulo Cesar, E-mail: jhsfelix@gmail.co [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Dept. de Engenharia de Teleinformatica; Holanda, Marcelo Alcantara [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Hospital Universitario Walter Cantidio. Dept. de Medicina Clinica

    2010-12-15

    High Resolution Computed Tomography (HRCT) is the exam of choice for the diagnostic evaluation of lung parenchyma diseases. There is increasing interest in computational systems able to automatically analyze the radiological densities of the lungs in CT images. The main objective of this study is to present a system for the automatic quantification and visualization of lung aeration in HRCT images with different degrees of aeration, called the Lung Image System Analysis (LISA). The secondary objective is to compare LISA to the Osiris system and to a specific lung segmentation algorithm (ALS) with respect to the accuracy of lung segmentation. The LISA system automatically extracts the following image attributes: lung perimeter, cross-sectional area, volume, radiological density histograms, the mean lung density (MLD) in Hounsfield units (HU), the relative area of the lungs with voxels with density values lower than -950 HU (RA950), and the 15th percentile of the least dense voxels (PERC15). Furthermore, LISA has a colored mask algorithm that applies pseudo-colors to the lung parenchyma according to the pre-defined radiological density chosen by the system user. The lung segmentations of 102 images of 8 healthy volunteers and 141 images of 11 patients with Chronic Obstructive Pulmonary Disease (COPD) were compared for accuracy and concordance among the three methods. LISA was more effective at lung segmentation than the other two methods. LISA's color mask tool improves the spatial visualization of the degrees of lung aeration, and the various attributes of the image that can be extracted may help physicians and researchers to better assess lung aeration both quantitatively and qualitatively. LISA may have important clinical and research applications in the assessment of global and regional lung aeration and therefore deserves further development and validation studies. (author)

  3. Quantitative right and left ventricular functional analysis during gated whole-chest MDCT: A feasibility study comparing automatic segmentation to semi-manual contouring

    International Nuclear Information System (INIS)

    Purpose: To evaluate the feasibility of an automatic, whole-heart segmentation algorithm for measuring global heart function from gated, whole-chest MDCT images. Material and methods: 15 patients with suspicion of PE underwent whole-chest contrast-enhanced MDCT with retrospective ECG synchronization. Two observers computed right and left ventricular functional indices using a semi-manual and an automatic whole-heart segmentation algorithm. The two techniques were compared using Bland-Altman analysis and paired Student's t-test. Measurement reproducibility was calculated using the intraclass correlation coefficient. Results: Ventricular analysis with automatic segmentation was successful in 13/15 (86%) and 15/15 (100%) of patients for the right and left ventricle, respectively. Reproducibility of measurements for both ventricles was perfect for the automatic method (ICC: 1.00) and very good for the semi-manual method. Ventricular volumes and functional indices, except right ventricular ejection fraction, obtained with the automatic method were significantly higher for the RV than those from the semi-manual method. Conclusions: The automatic, whole-heart segmentation algorithm enabled highly reproducible global heart function to be rapidly obtained in patients undergoing gated whole-chest MDCT for assessment of acute chest pain with suspicion of pulmonary embolism.
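
    A generic sketch of the Bland-Altman comparison used above (not the authors' code): the bias is the mean difference between the two techniques and the 95% limits of agreement lie 1.96 standard deviations either side of it:

        import numpy as np

        def bland_altman(automatic, semi_manual):
            diff = np.asarray(automatic, float) - np.asarray(semi_manual, float)
            bias = diff.mean()                                 # systematic difference
            sd = diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)  # limits of agreement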

  4. Automatic Vehicle Extraction from Airborne LiDAR Data Using an Object-Based Point Cloud Analysis Method

    Directory of Open Access Journals (Sweden)

    Jixian Zhang

    2014-09-01

    Full Text Available Automatic vehicle extraction from an airborne laser scanning (ALS) point cloud is very useful for many applications, such as digital elevation model generation and 3D building reconstruction. In this article, an object-based point cloud analysis (OBPCA) method is proposed for vehicle extraction from an ALS point cloud. First, segmentation-based progressive TIN (triangular irregular network) densification is employed to detect the ground points, and the potential vehicle points are detected based on the normalized heights of the non-ground points. Second, 3D connected component analysis is performed to group the potential vehicle points into segments. Finally, vehicle segments are detected based on three features: area, rectangularity and elongatedness. Experiments suggest that the proposed method achieves higher accuracy than the existing mean-shift-based method for vehicle extraction from an ALS point cloud. Moreover, the larger the point density, the higher the achieved accuracy.
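
    A minimal sketch of the final filtering step, assuming each 3D-connected segment has been flattened to its ground-plane points; the axis-aligned bounding box and all thresholds are simplifications, not the paper's exact definitions:

        import numpy as np

        def is_vehicle(points_xy, cell=0.25, min_area=4.0, max_area=15.0,
                       min_rect=0.6, max_elong=4.0):
            ij = np.floor(points_xy / cell).astype(int)
            area = len(set(map(tuple, ij))) * cell ** 2      # rasterized footprint
            w, h = points_xy.max(axis=0) - points_xy.min(axis=0)
            w, h = max(w, cell), max(h, cell)
            rectangularity = area / (w * h)                  # fill of bounding box
            elongatedness = max(w, h) / min(w, h)            # aspect ratio
            return (min_area <= area <= max_area
                    and rectangularity >= min_rect
                    and elongatedness <= max_elong)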

  5. Analysis of the distribution of the brain cells of the fruit fly by an automatic cell counting algorithm

    Science.gov (United States)

    Shimada, Takashi; Kato, Kentaro; Kamikouchi, Azusa; Ito, Kei

    2005-05-01

    The fruit fly is the smallest model animal that possesses a brain. Its brain is said to consist of only about 250,000 neurons, yet it shows “the rudiments of consciousness” in addition to higher abilities such as learning and memory. As the starting point for an exhaustive analysis of its brain-circuit information, we have developed a new algorithm for counting cells automatically in source 2D/3D images. In our algorithm, cell counting is realized by embedding objects (typically, disks/balls), each of which has an exclusive volume. Using this method, we have succeeded in counting thousands of cells accurately. This method provides the information necessary for the analysis of brain circuits: the precise distribution of cells across the whole brain.
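
    A minimal sketch of the exclusive-volume idea, assuming SciPy: balls of radius r are greedily embedded into the thresholded foreground, each claiming its own volume, and the ball count approximates the cell count. The radius is a hypothetical tuning parameter:

        import numpy as np
        from scipy import ndimage as ndi

        def count_by_ball_embedding(binary, r):
            mask = np.asarray(binary, dtype=bool).copy()
            count = 0
            while True:
                dist = ndi.distance_transform_edt(mask)   # distance to background
                if dist.max() < r:                        # no room for another ball
                    break
                center = np.unravel_index(np.argmax(dist), dist.shape)
                grids = np.ogrid[tuple(slice(0, s) for s in mask.shape)]
                ball = sum((g - c) ** 2 for g, c in zip(grids, center)) <= r ** 2
                mask[ball] = False                        # carve out the ball's volume
                count += 1
            return count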

  6. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for the automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conducted an experiment in exactly the same settings and found that the method, based on a Web mining technique, lost over 30 percentage points of its performance since being proposed in 2013. Thus, we hypothesize about the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We found that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words are collected and filtered.

  7. GALA: an automatic tool for the abundance analysis of stellar spectra

    CERN Document Server

    Mucciarelli, A; Lovisi, L; Ferraro, F R; Lapenna, E

    2013-01-01

    GALA is a freely distributed Fortran code to automatically derive the atmospheric parameters (temperature, gravity, microturbulent velocity and overall metallicity) and abundances of individual species from stellar spectra using the classical method based on the equivalent widths of metallic lines. The abundances of individual spectral lines are derived using the WIDTH9 code developed by R. L. Kurucz. GALA is designed to obtain the best model atmosphere by optimizing temperature, surface gravity, microturbulent velocity and metallicity after rejecting the discrepant lines. Finally, it computes accurate internal errors for each atmospheric parameter and abundance. The code makes it possible to obtain chemical abundances and atmospheric parameters for large stellar samples in a very short time, making GALA a useful tool in the epoch of multi-object spectrographs and large surveys. An extensive set of tests with both synthetic and observed spectra is performed and discussed to explore the capabilities and robustness of the code.
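
    The rejection of discrepant lines can be pictured as iterative sigma-clipping of the per-line abundances; a generic stand-in for GALA's scheme, not its actual Fortran implementation:

        import numpy as np

        def clipped_abundance(line_abundances, k=3.0, max_iter=10):
            a = np.asarray(line_abundances, float)
            keep = np.ones(a.size, dtype=bool)
            for _ in range(max_iter):
                mu, sd = a[keep].mean(), a[keep].std()
                new = np.abs(a - mu) <= k * sd      # reject lines beyond k sigma
                if (new == keep).all():
                    break
                keep = new
            return a[keep].mean(), a[keep].std(), keep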

  8. Automatic parameterization and analysis of stellar atmospheres: a study of the DA white dwarfs

    Energy Technology Data Exchange (ETDEWEB)

    McMahan, R.K. Jr.

    1986-01-01

    A method for automatically calculating atmospheric parameters of hydrogen-rich degenerate stars from low resolution spectra is advanced and then applied to the spectra of 53 DA white dwarfs. All data were taken using the Mark II spectrograph on the McGraw-Hill 1.3 m telescope and cover the spectral range λλ4100-7000 at a resolution of eight Angstroms. The model grid was generated at Dartmouth using the atmosphere code LUCIFER; it contained over 275 synthetic spectra extending from 6000 to 100,000 K in effective temperature and 7.4-9.3 in log g. A new value for the width of the DA mass distribution was achieved using the techniques presented here. Accuracies in the atmospheric parameters greater than twice those previously published were obtained. These results place strict constraints on the magnitude of mass loss in stars in the red giant phase, as well as on the mechanisms responsible for the loss.
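
    A generic sketch of grid-based parameterization of this kind: each synthetic spectrum is scaled to the observation and the chi-square minimum over the grid picks the best (Teff, log g). Array names are hypothetical and the scheme is simplified relative to the study:

        import numpy as np

        def best_grid_match(obs_flux, obs_err, grid_flux, grid_params):
            """grid_flux: (n_models, n_pix); grid_params: (n_models, 2) = Teff, log g."""
            w = 1.0 / obs_err ** 2
            # optimal free normalization of each model to the data
            scale = (grid_flux * obs_flux * w).sum(1) / (grid_flux ** 2 * w).sum(1)
            resid = obs_flux - scale[:, None] * grid_flux
            chi2 = (resid ** 2 * w).sum(1)
            best = int(np.argmin(chi2))
            return grid_params[best], chi2[best]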

  9. Automatic derivation of domain terms and concept location based on the analysis of the identifiers

    CERN Document Server

    Vaclavik, Peter; Mezei, Marek

    2010-01-01

    Developers express the meaning of domain ideas in the specifically selected identifiers and comments that form the implemented code. Software maintenance requires knowledge and understanding of the encoded ideas. This paper presents a way to create a domain vocabulary automatically. Knowledge of the domain vocabulary supports the comprehension of a specific domain for later code maintenance or evolution. We present experiments conducted in two selected domains: application servers and web frameworks. Knowledge of domain terms enables easy localization of the chunks of code that belong to a certain term. We consider these chunks of code as "concepts" and their placement in the code as "concept location". Application developers may also benefit from the obtained domain terms. These terms are parts of speech that characterize a certain concept. Concepts are encoded in "classes" (OO paradigm), and the obtained vocabulary of terms supports the selection and comprehension of a class' appropriate identifiers. ...
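
    A minimal sketch of the first step of such vocabulary building: split camelCase and snake_case identifiers into words and rank them by frequency. This illustrates the idea only; the paper's full method is more elaborate:

        import re
        from collections import Counter

        def domain_terms(identifiers, top=20):
            words = Counter()
            for ident in identifiers:
                parts = re.findall(r"[A-Z]?[a-z]+|[A-Z]+(?![a-z])|\d+", ident)
                words.update(w.lower() for w in parts if len(w) > 2)
            return words.most_common(top)

        # e.g. domain_terms(["HttpServletRequest", "request_dispatcher"])
        # -> [('request', 2), ('http', 1), ('servlet', 1), ('dispatcher', 1)]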

  10. Automatic classification of pulmonary function in COPD patients using trachea analysis in chest CT scans

    Science.gov (United States)

    van Rikxoort, E. M.; de Jong, P. A.; Mets, O. M.; van Ginneken, B.

    2012-03-01

    Chronic Obstructive Pulmonary Disease (COPD) is a chronic lung disease characterized by airflow limitation. COPD is clinically diagnosed and monitored using pulmonary function testing (PFT), which measures the global inspiration and expiration capabilities of patients and is time-consuming and labor-intensive. It is becoming standard practice to obtain paired inspiration-expiration CT scans of COPD patients. Predicting the PFT results from the CT scans would alleviate the need for PFT testing. It is hypothesized that the change of the trachea during breathing might be an indicator of tracheomalacia in COPD patients and correlate with COPD severity. In this paper, we propose to automatically measure morphological changes in the trachea from paired inspiration and expiration CT scans and investigate the influence on COPD GOLD stage classification. The trachea is automatically segmented and the trachea shape is encoded using the lengths of rays cast from the center of gravity of the trachea. These features are used in a classifier, combined with emphysema scoring, to attempt to classify subjects into their COPD stage. A database of 187 subjects, well distributed over the COPD GOLD stages 0 through 4, was used for this study. The data was randomly divided into a training and a test set. Using the training scans, a nearest mean classifier was trained to classify the subjects into their correct GOLD stage using either the emphysema score, the tracheal shape features, or a combination. Combining the proposed trachea shape features with the emphysema score improved the classification performance into GOLD stages by 11%, to 51%. In addition, an 80% accuracy was achieved in distinguishing healthy subjects from COPD patients.
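
    A minimal sketch of the ray-based shape encoding and the nearest mean classification, assuming a binary trachea cross-section; the ray count, step size and feature choices are illustrative:

        import numpy as np

        def ray_lengths(mask, n_rays=36, step=0.5):
            cy, cx = np.argwhere(mask).mean(axis=0)        # center of gravity
            lengths = []
            for theta in np.linspace(0, 2 * np.pi, n_rays, endpoint=False):
                r = 0.0
                while True:
                    y = int(round(cy + r * np.sin(theta)))
                    x = int(round(cx + r * np.cos(theta)))
                    inside = (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]
                              and mask[y, x])
                    if not inside:
                        break
                    r += step
                lengths.append(r)                          # distance to boundary
            return np.array(lengths)

        def nearest_mean_predict(x, class_means):
            """class_means: {GOLD_stage: mean feature vector from training set}."""
            return min(class_means, key=lambda c: np.linalg.norm(x - class_means[c]))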

  11. Automatic Extraction of Optimal Endmembers from Airborne Hyperspectral Imagery Using Iterative Error Analysis (IEA) and Spectral Discrimination Measurements

    Directory of Open Access Journals (Sweden)

    Ahram Song

    2015-01-01

    Full Text Available Pure surface materials, denoted by endmembers, play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot handle the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two different airborne images, Airborne Imaging Spectrometer for Application (AISA) and Compact Airborne Spectrographic Imager (CASI) data, were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers or errors from mixed materials.
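
    The RMSE test at the heart of such IEA-style procedures can be sketched as follows, assuming an unconstrained least-squares unmixing for brevity (practical unmixing usually adds non-negativity constraints):

        import numpy as np

        def rmse_map(cube, endmembers):
            """cube: (rows, cols, bands); endmembers: (n_end, bands)."""
            rows, cols, bands = cube.shape
            X = cube.reshape(-1, bands).T            # (bands, pixels)
            E = endmembers.T                         # (bands, n_end)
            A, *_ = np.linalg.lstsq(E, X, rcond=None)
            resid = X - E @ A                        # reconstruction residual
            rmse = np.sqrt((resid ** 2).mean(axis=0))
            return rmse.reshape(rows, cols)          # high RMSE -> missing endmember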

  12. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    This study intends to develop a more sophisticated tool that advances the current event tree method used in all PSA, and focuses on non-catastrophic events, specifically non-core-melt sequence scenarios not included in an ordinary PSA. In a non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. It is therefore anticipated that the analysis workload must be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To realize scenario generation as a technical tool, a simulation model combining AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator was demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, were added utilizing the human scenario generator concept. The feasibility of the improved scenario generator was then tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)
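
    At its simplest, event sequence construction enumerates success/failure combinations of the modelled systems and prunes those ruled out by the construction logic; a toy sketch with hypothetical system names, far simpler than the AI-based generator described:

        from itertools import product

        SYSTEMS = ["SI_signal", "CMT", "accumulator", "IRWST_injection"]  # illustrative

        def generate_sequences(is_consistent=lambda seq: True):
            for states in product(("success", "failure"), repeat=len(SYSTEMS)):
                seq = dict(zip(SYSTEMS, states))
                if is_consistent(seq):        # prune physically impossible paths
                    yield seq

        # len(list(generate_sequences())) == 2 ** len(SYSTEMS) without pruning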

  13. Automatic analysis and reduction of reaction mechanisms for complex fuel combustion

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Daniel

    2001-05-01

    This work concentrates on automatic procedures for simplifying chemical models of realistic fuels using skeletal mechanism construction and the Quasi Steady-State Approximation (QSSA) applied to detailed reaction mechanisms. To automate the selection of species for removal or approximation, different indices for species ranking have been proposed. Reaction flow rates are combined with sensitivity information for targeting a certain quantity and used to determine a level of redundancy for automatic skeletal mechanism construction by exclusion of redundant species. For QSSA reduction, a measure of species lifetime can be used for species ranking as-is, weighted by concentrations or molecular transport timescales, and/or combined with species sensitivity. Maximum values of the indices are accumulated over ranges of parameters (e.g. fuel-air ratio and octane number), and species with low accumulated index values are selected for removal or steady-state approximation. In the case of QSSA, a model with a certain degree of reduction is automatically implemented as FORTRAN code by setting a certain index limit. The code calculates source terms of explicitly handled species from reaction rates and the steady-state concentrations by internal iteration. Homogeneous-reactor and one-dimensional laminar-flame models were used as test cases. A staged combustor fuelled by ethylene with monomethylamine addition is modelled by two homogeneous reactors in sequence, i.e. a PSR (Perfectly Stirred Reactor) followed by a PFR (Plug Flow Reactor). A modified PFR model was applied for simulation of a Homogeneous Charge Compression Ignition (HCCI) engine fuelled with four-component natural gas, whereas a two-zone model was required for a knocking Spark Ignition (SI) engine powered by Primary Reference Fuel (PRF). Finally, a laminar one-dimensional model was used to simulate premixed flames burning methane and an aeroturbine kerosene surrogate consisting of n-decane and toluene.

  14. Semi-automatic analysis of standard uptake values in serial PET/CT studies in patients with lung cancer and lymphoma

    Directory of Open Access Journals (Sweden)

    Ly John

    2012-04-01

    Full Text Available Abstract. Background: Changes in maximum standardised uptake values (SUVmax) between serial PET/CT studies are used to determine disease progression or regression in oncologic patients. Measuring these changes manually can be time consuming in clinical routine. A semi-automatic method for the calculation of SUVmax in serial PET/CT studies was developed and compared to a conventional manual method. The semi-automatic method first aligns the serial PET/CT studies based on the CT images. Thereafter, the reader selects an abnormal lesion in one of the PET studies. After this manual step, the program automatically detects the corresponding lesion in the other PET study, segments the two lesions and calculates the SUVmax in both studies as well as the difference between the SUVmax values. The results of the semi-automatic analysis were compared to those of a manual SUVmax analysis using a Philips PET/CT workstation. Three readers did the SUVmax readings with both methods. Sixteen patients with lung cancer or lymphoma who had undergone two PET/CT studies were included, with a total of 26 lesions. Results: Linear regression analysis of changes in SUVmax shows that intercepts and slopes are close to the line of identity for all readers (reader 1: intercept = 1.02, R2 = 0.96; reader 2: intercept = 0.97, R2 = 0.98; reader 3: intercept = 0.99, R2 = 0.98). The manual and semi-automatic methods agreed in all cases on whether SUVmax had increased or decreased between the serial studies. The average time to measure SUVmax changes in two serial PET/CT examinations was four to five times longer for the manual method than for the semi-automatic method for all readers (reader 1: 53.7 vs. 10.5 s; reader 2: 27.3 vs. 6.9 s; reader 3: 47.5 vs. 9.5 s). Conclusions: Good agreement was shown in the assessment of SUVmax changes between the manual and semi-automatic methods. The semi-automatic analysis was four to five times faster to perform than the manual analysis.

  15. Analysis of the automatic depressurization system (ADS) tests for the AP600 design

    International Nuclear Information System (INIS)

    The AP600 is a Westinghouse advanced pressurized water reactor (PWR) designed with passive plant safety features that rely on natural driving forces, such as gravity and natural circulation, which allows significant simplification of the plant systems, equipment and operation. As part of the passive safety concept, the AP600 utilizes an Automatic Depressurization System (ADS) to depressurize the Reactor Coolant System (RCS), allowing long-term gravity injection to be initiated and maintained for passive reflood and long-term core cooling. The ADS design consists of four flow paths, two of which are connected to the top of the pressurizer, and a flow path from each of the two RCS hot legs. During a postulated accident, the two flow paths from the hot legs discharge directly to containment. The two paths from the pressurizer discharge steam and/or water from the RCS into the In-containment Refueling Water Storage Tank (IRWST) through spargers located underwater, where the steam is normally condensed with no increase in containment pressure or temperature. The ADS tests are one part of the planned AP600 Westinghouse test program for the passive core cooling system (PXS). The ADS tests are full-scale simulations of AP600 ADS components. They provide dynamic performance data of the ADS for use in computer code validation and design verification.

  16. Analysis of electric energy consumption of automatic milking systems in different configurations and operative conditions.

    Science.gov (United States)

    Calcante, Aldo; Tangorra, Francesco M; Oberti, Roberto

    2016-05-01

    Automatic milking systems (AMS) have been a revolutionary innovation in dairy cow farming. Currently, more than 10,000 dairy cow farms worldwide use AMS to milk their cows. Electric consumption is one of the most relevant and least controllable operational costs of AMS, ranging between 35 and 40% of their total annual operational costs. The aim of the present study was to measure and analyze the electric energy consumption of 4 AMS with different configurations: single box, and central unit featuring a central vacuum system for 1 cow unit or for 2 cow units. The electrical consumption (daily consumption, daily consumption per cow milked, consumption per milking, and consumption per 100 L of milk) of each AMS (milking unit + air compressor) was measured using 2 energy analyzers. The measurement period lasted 24 h with a sampling frequency of 0.2 Hz. The daily total energy consumption (milking unit + air compressor) ranged between 45.4 and 81.3 kWh; the consumption per cow milked ranged between 0.59 and 0.99 kWh; the consumption per milking ranged between 0.21 and 0.33 kWh; and the consumption per 100 L of milk ranged between 1.80 and 2.44 kWh, according to the different configurations and operational contexts considered. Results showed that AMS electric consumption was mainly conditioned by farm management rather than by machine characteristics/architectures. PMID:26971145
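
    The reported indices follow from simple integration of the sampled power; a minimal sketch, assuming power in kW sampled at 0.2 Hz (one reading every 5 s) over 24 h, with hypothetical function and argument names:

        import numpy as np

        def consumption_indices(power_kw, n_cows, n_milkings, milk_litres, dt_s=5.0):
            daily_kwh = np.sum(power_kw) * dt_s / 3600.0    # kW * s -> kWh
            return {
                "daily_kWh": daily_kwh,
                "kWh_per_cow": daily_kwh / n_cows,
                "kWh_per_milking": daily_kwh / n_milkings,
                "kWh_per_100L": 100.0 * daily_kwh / milk_litres,
            }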

  17. Automatic testing system design and data analysis of permafrost temperature in Qinghai-Tibet Railway

    Institute of Scientific and Technical Information of China (English)

    尚迎春; 齐红元

    2008-01-01

    Given that permafrost temperature influences the safety of the Qinghai-Tibet Railway and its on-line testing system, and drawing on a comparison of domestic permafrost studies with those worldwide, an automatic testing system for permafrost temperature, comprising a master computer and several slave computers, was designed. By choosing high-precision thermistors as temperature sensors, designing and positioning the depth and interval of the testing sections, having the slave computers measure, store and transmit the permafrost temperature data on schedule, and having the master computer receive, process and analyze those data, the change of the permafrost temperature can be described and analyzed, which can provide information for permafrost railway engineering design. Moreover, taking permafrost temperature testing in a certain section of the Qinghai-Tibet Railway as an example, the collected permafrost temperature data were analyzed and the behavior of the permafrost under the railway was depicted; in addition, a BP model was set up to predict the permafrost characteristics. This testing system will provide timely information about changes in the permafrost to support safe operation of the Qinghai-Tibet Railway.

  18. Automatic parameterization and analysis of stellar atmospheres: a study of the DA white dwarfs

    International Nuclear Information System (INIS)

    A method for automatically calculating atmospheric parameters of hydrogen-rich degenerate stars from low resolution spectra is advanced and then applied to the spectra of 53 DA white dwarfs. All data were taken using the Mark II spectrograph on the McGraw-Hill 1.3 m telescope and cover the spectral range λλ4100-7000 at a resolution of eight Angstroms. The model grid was generated at Dartmouth using the atmosphere code LUCIFER; it contained over 275 synthetic spectra extending from 6000 to 100,000 K in effective temperature and 7.4-9.3 in log g. A new value for the width of the DA mass distribution was achieved using the techniques presented here. Accuracies in the atmospheric parameters greater than twice those previously published were obtained. These results place strict constraints on the magnitude of mass loss in stars in the red giant phase, as well as on the mechanisms responsible for the loss.

  19. Analysis of Fourth Stage of Automatic Depressurization System Failure to Open in AP1000 LOCA

    Directory of Open Access Journals (Sweden)

    Zhao Guozhi

    2014-01-01

    Full Text Available The Automatic Depressurization System (ADS) is a very important part of the passive core cooling system in the passive safety nuclear plant AP1000. The ADS has four stages, with each stage having two series, and only ADS4 utilizes squib valves. During an accident, emergency core injection is realized by the gravity-driven passive safety injection system: the core makeup tank (CMT), the accumulator, and the In-Containment Refueling Water Storage Tank (IRWST). The objective of this study is to analyze the system response and phenomena under partial failure of the ADS in AP1000 LOCA. The plant model is built using the SCDAP/RELAP5/MOD 3.4 code. The chosen accident scenario is small and medium LOCAs followed by failure of ADS4 to open, whose location differs from that of the other three stages. The results indicate that long-term core cooling from the IRWST is greatly postponed when intentional depressurization relies only on ADS1, 2 and 3. In addition, LOCAs with equivalent diameters of 25.4 cm and 34.1 cm will not lead to core melt, while a 5.08 cm break LOCA will. Meanwhile, a high water level in the pressurizer will appear during all three LOCAs.

  20. ANALYSIS OF THE DISTANCES COVERED BY FIRST DIVISION BRAZILIAN SOCCER PLAYERS OBTAINED WITH AN AUTOMATIC TRACKING METHOD

    Directory of Open Access Journals (Sweden)

    Ricardo M. L. Barros

    2007-06-01

    Full Text Available Methods based on visual estimation are still the most widely used approach to analyzing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the very first results are available for comparison. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), the results of 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. The results of a three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half.
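
    Turning tracked positions into covered distance and speed categories is straightforward; a sketch with illustrative speed bands in km/h, not the study's exact definitions:

        import numpy as np

        def distance_and_bands(xy, fps=30.0,
                               bands=((0, 11), (11, 14), (14, 19), (19, 23), (23, 99))):
            """xy: (n_frames, 2) player positions in metres."""
            step = np.linalg.norm(np.diff(xy, axis=0), axis=1)   # metres per frame
            speed_kmh = step * fps * 3.6
            total = step.sum()
            per_band = [step[(speed_kmh >= lo) & (speed_kmh < hi)].sum()
                        for lo, hi in bands]
            return total, per_band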

  1. Automatic analysis of slips of the tongue: Insights into the cognitive architecture of speech production.

    Science.gov (United States)

    Goldrick, Matthew; Keshet, Joseph; Gustafson, Erin; Heller, Jordana; Needle, Jeremy

    2016-04-01

    Traces of the cognitive mechanisms underlying speaking can be found within subtle variations in how we pronounce sounds. While speech errors have traditionally been seen as categorical substitutions of one sound for another, acoustic/articulatory analyses show they partially reflect the intended sound. When "pig" is mispronounced as "big," the resulting /b/ sound differs from correct productions of "big," moving towards the intended "pig" and revealing the role of graded sound representations in speech production. Investigating the origins of such phenomena requires detailed estimation of speech sound distributions; this has been hampered by reliance on subjective, labor-intensive manual annotation. Computational methods can address these issues by providing objective, automatic measurements. We develop a novel high-precision computational approach, based on a set of machine learning algorithms, for the measurement of elicited speech. The algorithms are trained on existing manually labeled data to detect and locate linguistically relevant acoustic properties with high accuracy. Our approach is robust, is designed to handle mis-productions, and overall matches the performance of expert coders. It allows us to analyze a very large dataset of speech errors (containing far more errors than the total in the existing literature), illuminating properties of speech sound distributions previously impossible to observe reliably. We argue that this provides novel evidence that two sources both contribute to deviations in speech errors: planning processes specifying the targets of articulation and articulatory processes specifying the motor movements that execute this plan. These findings illustrate how a much richer picture of speech provides an opportunity to gain novel insights into language processing. PMID:26779665

  2. Automatic vision system for analysis of microscopic behavior of flow and transport in porous media

    Energy Technology Data Exchange (ETDEWEB)

    Rashidi, M.; Dehmeshid, J.; Dickenson, E.; Daemi, F.

    1997-07-01

    This paper describes the development of a novel automated and efficient vision system to obtain velocity and concentration measurements within a porous medium. An aqueous fluid laced with a fluorescent dye or microspheres flows through a transparent, refractive-index-matched column packed with transparent crystals. For illumination purposes, a planar laser sheet passes through the column as a CCD camera records the laser-illuminated planes. Detailed microscopic velocity and concentration fields have been computed within a 3D volume of the column. For the velocity measurements, while the aqueous fluid, laced with fluorescent microspheres, flows through the transparent medium, a CCD camera records the motions of the fluorescing particles on a video cassette recorder. The recorded images are acquired frame by frame and transferred to the computer for processing, using a frame grabber and purpose-written algorithms, through an RS-232 interface. Since the grabbed images are of poor quality at this stage, some preprocessing is used to enhance the particles within the images. For the concentration measurements, while the aqueous fluid, laced with a fluorescent organic dye, flows through the transparent medium, a CCD camera sweeps back and forth across the column and records concentration slices in the planes illuminated by the laser beam traveling simultaneously with the camera. Subsequently, these recorded images are transferred to the computer for processing in a similar fashion to the velocity measurements. In order to have a fully automatic vision system, several detailed image processing techniques were developed to exactly match images (taken at different times during the experiments) that have different intensity values but the same topological characteristics. This results in normalized interstitial chemical concentration as a function of time within the porous column.

  3. Biodosimetry estimation using the ratio of the longest: shortest length in the premature chromosome condensation (PCC) method applying autocapture and automatic image analysis

    International Nuclear Information System (INIS)

    The combination of automatic image acquisition and automatic image analysis of premature chromosome condensation (PCC) spreads was tested as a rapid biodosimetry protocol. Human peripheral lymphocytes were irradiated with 60Co gamma rays at single doses of between 1 and 20 Gy, stimulated with phytohaemagglutinin and incubated for 48 h, division-blocked with Colcemid, and PCC was induced by Calyculin A. Images of chromosome spreads were captured and analysed automatically by combining the Metafer 4 and CellProfiler platforms. Automatic measurement of chromosome lengths allows the calculation of the length ratio (LR) of the longest and the shortest piece, which can be used for dose estimation since this ratio is correlated with ionizing radiation dose. The LR of the longest and the shortest chromosome pieces showed the best goodness-of-fit to a linear model in the dose interval tested. The application of the automatic analysis increases the potential use of the PCC method for triage in the event of mass radiation casualties. (author)
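
    The dose estimation step reduces to a linear calibration of LR against known doses, inverted for an unknown sample; a minimal sketch, where the calibration data are the user's own measurements:

        import numpy as np

        def fit_lr_calibration(doses_gy, lr_values):
            slope, intercept = np.polyfit(doses_gy, lr_values, 1)  # LR = a*dose + b
            return slope, intercept

        def estimate_dose(lr, slope, intercept):
            return (lr - intercept) / slope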

  4. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    International Nuclear Information System (INIS)

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies have mainly focused on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software, the Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for the automatic determination of ischemic deficits.
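
    The ROC summary used above can be reproduced with a few lines of NumPy; a generic sketch, with AUC computed via the Mann-Whitney statistic:

        import numpy as np

        def auc(scores, labels):
            pos, neg = scores[labels == 1], scores[labels == 0]
            gt = (pos[:, None] > neg[None, :]).mean()
            eq = (pos[:, None] == neg[None, :]).mean()
            return gt + 0.5 * eq     # P(random positive outscores random negative)

        def sens_spec(scores, labels, cutoff):
            pred = scores >= cutoff
            return pred[labels == 1].mean(), (~pred[labels == 0]).mean()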

  5. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Y [National Cheng Kung University, Tainan, Taiwan (China); Huang, H [Chang Gung University, Taoyuan, Taiwan (China); Su, T [Chang Gung Memorial Hospital, Taoyuan, Taiwan (China)

    2015-06-15

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies have mainly focused on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software, the Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for the automatic determination of ischemic deficits.

  6. Analysis on the Influence of Automatic Station Temperature Data on the Sequence Continuity of Historical Meteorological Data

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The research aimed to study the influence of automatic station data on the sequence continuity of historical meteorological data. [Method] Based on the temperature data measured by the automatic meteorological station and the corresponding manual observation data during January-December 2001, the monthly average, maximum and minimum temperatures from the automatic station were compared with the corresponding manually observed temperature data from the parallel observation period...

  7. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers

    NARCIS (Netherlands)

    Salimi-Khorshidi, G.; Douaud, G.; Beckmann, C.F.; Glasser, M.F.; Griffanti, L.; Smith, S.M.

    2014-01-01

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data.

  8. Automatic denoising of functional MRI data: Combining independent component analysis and hierarchical fusion of classifiers

    NARCIS (Netherlands)

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaelle; Beckmann, Christian F.; Glasser, Matthew F.; Griffanti, Ludovica; Smith, Stephen M.

    2014-01-01

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) – one of the most widely used techniques for the exploratory analysis of fMRI data – has shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data.

  9. Automatic scatter detection in fluorescence landscapes by means of spherical principal component analysis

    DEFF Research Database (Denmark)

    Kotwa, Ewelina Katarzyna; Jørgensen, Bo Munk; Brockhoff, Per B.; Frosch, Stina

    In this paper, we introduce a new method, based on spherical principal component analysis (S‐PCA), for the identification of Rayleigh and Raman scatters in fluorescence excitation–emission data. These scatters should be found and eliminated as a prestep before fitting parallel factor analysis...

  10. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking in clinical trials

    Science.gov (United States)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support the integration of subjects' image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system for clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images collected in the study so far have been correctly identified and successfully integrated into the corresponding subjects' eCRFs. Using this system, manual steps for the study personnel are reduced, and therefore errors, latency and costs are decreased. Our approach also increases data security and privacy.
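
    Card-based color calibration of the kind described can be sketched as a linear (affine) map fitted from the measured card patches to their known reference values; patch extraction is assumed done elsewhere, and this is not the App's actual implementation:

        import numpy as np

        def fit_color_correction(measured_rgb, reference_rgb):
            """measured_rgb, reference_rgb: (n_patches, 3)."""
            X = np.hstack([measured_rgb, np.ones((len(measured_rgb), 1))])  # bias term
            M, *_ = np.linalg.lstsq(X, reference_rgb, rcond=None)           # (4, 3)
            return M

        def apply_color_correction(image, M):
            h, w, _ = image.shape
            flat = np.hstack([image.reshape(-1, 3), np.ones((h * w, 1))])
            return np.clip(flat @ M, 0, 255).reshape(h, w, 3)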

  11. Compression Algorithm Analysis of In-Situ (S)TEM Video: Towards Automatic Event Detection and Characterization

    Energy Technology Data Exchange (ETDEWEB)

    Teuton, Jeremy R.; Griswold, Richard L.; Mehdi, Beata L.; Browning, Nigel D.

    2015-09-23

    Precise analysis of both (S)TEM images and video is a time- and labor-intensive process. As an example, determining when crystal growth and shrinkage occur during the dynamic process of Li dendrite deposition and stripping involves manually scanning through each frame in the video to extract a specific set of frames/images. For large numbers of images, this process can be very time consuming, so a fast and accurate automated method is desirable. Given this need, we developed software that uses analysis of video compression statistics for detecting and characterizing events in large data sets. This software works by converting the data into a series of images which it compresses into an MPEG-2 video using the open-source “avconv” utility [1]. The software does not use the video itself, but rather analyzes the video statistics from the first pass of the video encoding that avconv records in the log file. This file contains statistics for each frame of the video, including the frame quality, intra-texture and predicted-texture bits, and forward and backward motion vector resolution, among others. In all, avconv records 15 statistics for each frame. By combining different statistics, we have been able to detect events in various types of data. We have developed an interactive tool for exploring the data and the statistics that aids the analyst in selecting useful statistics for each analysis. Going forward, an algorithm for detecting and possibly describing events automatically can be written based on the statistic(s) for each data type.
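
    Once the per-frame statistics have been parsed from the first-pass log, event detection can be as simple as flagging frames that deviate from a rolling baseline; the window and threshold below are illustrative, not the software's actual parameters:

        import numpy as np

        def flag_events(stat, window=25, k=4.0):
            """stat: one per-frame statistic (e.g. intra-texture bits) as an array."""
            stat = np.asarray(stat, float)
            events = []
            for i in range(window, len(stat)):
                base = stat[i - window:i]
                if abs(stat[i] - base.mean()) > k * (base.std() + 1e-9):
                    events.append(i)          # sudden statistical change -> event
            return events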

  12. Automatic utilities auditing

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Colin Boughton [Energy Metering Technology (United Kingdom)

    2000-08-01

    At present, energy audits represent only snapshots of the flow of energy. The normal pattern of energy audits, as seen through the eyes of an experienced energy auditor, is described, and a brief history of energy auditing is given. It is claimed that the future of energy auditing lies in automatic meter reading with expert data analysis, providing continuous automatic auditing and thereby reducing the skill element. Ultimately, it will be feasible to carry out auditing at intervals of, say, 30 minutes rather than five years.

  13. Automatic Karyotype Analysis Technology and Its Application in Plants

    Institute of Scientific and Technical Information of China (English)

    何小周; 郭东林

    2009-01-01

    Karyotype analysis is an important research method in cytogenetics. Automatic chromosome analysis technology is simple, rapid and reliable, and thus represents the future direction of karyotype analysis technology. The strategies, system composition, and automatic image processing and analysis techniques of biological karyotype analysis, together with their applications in plant karyotype analysis, are reviewed.

  14. Data Analysis of Metering System and Automatic Regulation of Heat Supply to Five- and More Storey Housing Buildings

    Directory of Open Access Journals (Sweden)

    F. R. Latypov

    2014-06-01

    Full Text Available The paper considers the regulation of heat supply to housing buildings using automatic regulators. The integral index of automatic regulator action on the heat-supply system of a multi-storey housing building is proportional to the Pomerantsev thermal similarity criterion.

  15. FamPipe: An Automatic Analysis Pipeline for Analyzing Sequencing Data in Families for Disease Studies

    Science.gov (United States)

    Chung, Ren-Hua; Tsai, Wei-Yun; Kang, Chen-Yu; Yao, Po-Ju; Tsai, Hui-Ju; Chen, Chia-Hsiang

    2016-01-01

    In disease studies, family-based designs have become an attractive approach to analyzing next-generation sequencing (NGS) data for the identification of rare mutations enriched in families. Substantial research effort has been devoted to developing pipelines for automating sequence alignment, variant calling, and annotation. However, fewer pipelines have been designed specifically for disease studies. Most of the current analysis pipelines for family-based disease studies using NGS data focus on a specific function, such as identifying variants with Mendelian inheritance or identifying shared chromosomal regions among affected family members. Consequently, some other useful family-based analysis tools, such as imputation, linkage, and association tools, have yet to be integrated and automated. We developed FamPipe, a comprehensive analysis pipeline, which includes several family-specific analysis modules, including the identification of shared chromosomal regions among affected family members, prioritizing variants assuming a disease model, imputation of untyped variants, and linkage and association tests. We used simulation studies to compare properties of some modules implemented in FamPipe, and based on the results, we provided suggestions for the selection of modules to achieve an optimal analysis strategy. The pipeline is under the GNU GPL License and can be downloaded for free at http://fampipe.sourceforge.net. PMID:27272119

  16. Automatic Denoising of Functional MRI Data: Combining Independent Component Analysis and Hierarchical Fusion of Classifiers

    OpenAIRE

    Salimi-Khorshidi, Gholamreza; Douaud, Gwenaëlle; Beckmann, Christian F.; Glasser, Matthew F.; Griffanti, Ludovica; Smith, Stephen M.

    2014-01-01

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has shown to be a powerful technique in identifying various sources of neuronally-related and artefactual fluctuation in fMRI data (both with the application of external stimuli and with the subject “at rest”).
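
    The decompose-classify-reconstruct pattern behind such denoising can be sketched with scikit-learn's FastICA; the hierarchical classifier itself is not shown, so the list of artefactual components is a placeholder:

        import numpy as np
        from sklearn.decomposition import FastICA

        def ica_denoise(data, n_components, bad_components):
            """data: (n_timepoints, n_voxels) fMRI matrix."""
            ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
            sources = ica.fit_transform(data)            # (time, components)
            sources[:, bad_components] = 0.0             # drop artefactual components
            return sources @ ica.mixing_.T + ica.mean_   # reconstruct cleaned data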

  17. Automatic analysis of uranium-bearing extracts in amine solvent extraction plants processing sulfate leach liquors

    International Nuclear Information System (INIS)

    Instrumentation based on continuous segmented-flow analysis is suggested for the control of uranium loading in the amine phase of solvent extraction plants processing sulfate leach liquors. It can be installed with relatively little capital outlay and operational costs are expected to be low. The uranium(VI) in up to 60 samples of extract (approximately 0.1 to 5 g l-1 U) per hour can be determined. The application of spectrophotometry to the analysis of various process streams is discussed, and it is concluded that it compares favourably in several important respects with alternative techniques. (orig.)

  18. Automatic Case-Based Reasoning Approach for Landslide Detection: Integration of Object-Oriented Image Analysis and a Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Jie Dou

    2015-04-01

    Full Text Available This paper proposes an automatic method for detecting landslides by using an integrated approach comprising object-oriented image analysis (OOIA), a genetic algorithm (GA), and a case-based reasoning (CBR) technique. It consists of three main phases: (1) image processing and multi-image segmentation; (2) feature optimization; and (3) detecting landslides. The proposed approach was employed in a fast-growing urban region, the Pearl River Delta in South China. The results of detection were validated with the help of field surveys. The experimental results indicated that the proposed OOIA-GA-CBR approach (0.87) demonstrates higher classification performance than the stand-alone OOIA (0.75) method for detecting landslides. The area under the curve (AUC) value was also higher than that of the simple OOIA, indicating the high efficiency of the proposed landslide detection approach. The case library created using the integrated model can be reused for time-independent analysis, thus rendering our approach superior to other traditional methods, such as the maximum likelihood classifier. The results of this study thus facilitate the fast generation of accurate landslide inventory maps, which will eventually extend our understanding of the evolution of landscapes shaped by landslide processes.

  19. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis.

    Science.gov (United States)

    Stamile, Claudio; Kocevar, Gabriel; Cotton, François; Durand-Dubief, Françoise; Hannoun, Salem; Frindel, Carole; Guttmann, Charles R G; Rousseau, David; Sappey-Marinier, Dominique

    2016-01-01

    Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect local and global longitudinal changes of diffusivity metrics in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: 1) co-registration and diffusion metrics computation, 2) tractography, bundle extraction and processing, and 3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections and allowing a sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on the cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber-scale approach to detect small longitudinal alterations. PMID:27224308

  20. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    Directory of Open Access Journals (Sweden)

    Giovanni Castellazzi

    2015-07-01

    Full Text Available In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected with the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used in structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
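
    The core idea of going from a point cloud to voxel elements is an occupancy grid; a minimal sketch (meshing the occupied voxels into finite elements is a separate step not shown here):

        import numpy as np

        def voxelize(points, voxel_size):
            """points: (n, 3) coordinates; returns occupancy grid and its origin."""
            origin = points.min(axis=0)
            idx = np.floor((points - origin) / voxel_size).astype(int)
            grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
            grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True   # mark occupied voxels
            return grid, origin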

  1. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis

    Science.gov (United States)

    Stamile, Claudio; Kocevar, Gabriel; Cotton, François; Durand-Dubief, Françoise; Hannoun, Salem; Frindel, Carole; Guttmann, Charles R. G.; Rousseau, David; Sappey-Marinier, Dominique

    2016-01-01

    Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect local and global longitudinal changes of diffusivity metrics in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: 1) co-registration and diffusion metrics computation, 2) tractography, bundle extraction and processing, and 3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections and allowing a sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on the cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber-scale approach to detect small longitudinal alterations. PMID:27224308

  2. TU-F-17A-01: BEST IN PHYSICS (JOINT IMAGING-THERAPY) - An Automatic Toolkit for Efficient and Robust Analysis of 4D Respiratory Motion

    International Nuclear Information System (INIS)

    Purpose: To provide an automatic image analysis toolkit to process thoracic 4-dimensional computed tomography (4DCT) and extract patient-specific motion information to facilitate investigational or clinical use of 4DCT. Methods: We developed an automatic toolkit in MATLAB to overcome the extra workload arising from the time dimension in 4DCT. This toolkit employs image/signal processing, computer vision, and machine learning methods to visualize, segment, register, and characterize lung 4DCT automatically or interactively. A fully automated 3D lung segmentation algorithm was designed, and 4D lung segmentation was achieved in batch mode. Voxel counting was used to calculate volume variations of the torso, the lung and its air component, and local volume changes at the diaphragm and chest wall to characterize the breathing pattern. Segmented lung volumes in 12 patients were compared with those from a treatment planning system (TPS). Voxel conversion was introduced from CT# to other physical parameters, such as gravity-induced pressure, to create a secondary 4D image. A demons algorithm was applied for deformable image registration, and motion trajectories were extracted automatically. Calculated motion parameters were plotted with various templates. Machine learning algorithms, such as Naive Bayes and random forests, were implemented to study respiratory motion. This toolkit is complementary to and will be integrated with the Computational Environment for Radiotherapy Research (CERR). Results: The automatic 4D image/data processing toolkit provides a platform for analysis of 4D images and datasets. It processes 4D data automatically in batch mode and provides interactive visual verification for manual adjustments. The discrepancy in lung volume calculation between this toolkit and the TPS is within ±2%, and the time saving is 1–2 orders of magnitude. Conclusion: A framework of 4D toolkit has been developed to analyze thoracic 4DCT automatically or interactively, facilitating both investigational and clinical use
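
    The voxel-counting step is simple enough to show directly; in this sketch (mask shape and voxel spacing are invented values) a lung volume in millilitres follows from a binary segmentation and the CT voxel spacing.

        import numpy as np

        def lung_volume_ml(mask, spacing_mm):
            """Volume of a binary lung segmentation by voxel counting."""
            voxel_ml = np.prod(spacing_mm) / 1000.0   # mm^3 per voxel -> ml
            return mask.sum() * voxel_ml

        mask = np.zeros((100, 100, 60), dtype=bool)
        mask[20:80, 20:80, 10:50] = True              # stand-in segmentation
        print(round(lung_volume_ml(mask, spacing_mm=(0.97, 0.97, 2.5)), 1), "ml")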

  3. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    Science.gov (United States)

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  4. Meteor automatic imager and analyzer: Analysis of noise characteristics and possible noise suppression

    Czech Academy of Sciences Publication Activity Database

    Švihlík, J.; Fliegel, K.; Páta, P.; Vítek, S.; Koten, Pavel

    Bellingham: SPIE, 2010, 779821/1-779821/9. (Proceedings of SPIE. 7798). ISBN 978-0-8194-8294-5. ISSN 0277-786X. [Applications of Digital Image Processing /33./. San Diego (US), 02.08.2010-04.08.2010] Institutional support: RVO:67985815 Keywords : MAIA * meteor * noise analysis Subject RIV: JA - Electronics ; Optoelectronics, Electrical Engineering

  5. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    Science.gov (United States)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.
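
    The co-occurrence verification can be illustrated with a toy sketch (not the paper's system): two detection lists are cross-checked by distance, and unmatched items flag potentially inconsistently signaled road situations. The coordinates, search radius and unmatched helper are assumptions.

        from math import hypot

        # hypothetical detections: (x, y) positions in road coordinates (metres)
        sign_detections = [(12.0, 3.1), (240.5, -1.2)]    # pedestrian-crossing signs
        marking_detections = [(11.4, 2.8), (512.0, 0.4)]  # zebra markings

        def unmatched(a, b, radius=25.0):
            """Items of `a` with no counterpart in `b` within `radius` metres."""
            return [p for p in a
                    if all(hypot(p[0] - q[0], p[1] - q[1]) > radius for q in b)]

        # a sign without a marking (or vice versa) is a candidate safety action
        print("signs lacking markings:", unmatched(sign_detections, marking_detections))
        print("markings lacking signs:", unmatched(marking_detections, sign_detections))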

  6. Automatic co-registration of space-based sensors for precision change detection and analysis

    Science.gov (United States)

    Bryant, N.; Zobrist, A.; Logan, T.

    2003-01-01

    A variety of techniques were developed at JPL to assure sub-pixel co-registration of scenes and ortho-rectification of satellite imagery to other georeferenced information to permit precise change detection and analysis of low and moderate resolution space sensors.

  7. Study of Automatic Extraction, Classification, and Ranking of Product Aspects Based on Sentiment Analysis of Reviews

    Directory of Open Access Journals (Sweden)

    Muhammad Rafi

    2015-10-01

    Full Text Available It is very common for a customer to read reviews about a product before making a final decision to buy it. Customers are always eager to get the best and most objective information about the product they wish to purchase, and reviews are the major source of this information. Although reviews are easily accessible on the web, most of them carry ambiguous opinions and differ in structure, so it is often very difficult for a customer to filter the information he actually needs. This paper suggests a framework which provides a single user-interface solution to this problem based on sentiment analysis of reviews. First, it extracts all the reviews from different websites with varying structures and gathers information about the relevant aspects of the product. Next, it performs sentiment analysis around those aspects and assigns them sentiment scores. Finally, it ranks all extracted aspects and clusters them into positive and negative classes. The final output is a graphical visualization of all positive and negative aspects, which provides the customer with easy, comparable, and visual information about the important aspects of the product. The experimental results on five different products carrying 5000 reviews show 78% accuracy. Moreover, the paper also explains the effect of negation, valence shifters, and diminishers combined with a sentiment lexicon on sentiment analysis, and concludes that they are all independent of the case problem and have no effect on the accuracy of sentiment analysis.
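
    Aspect-level scoring with negation handling can be sketched minimally; the tiny lexicon, aspect list and context window below are invented for illustration and stand in for the framework's full extraction pipeline.

        LEXICON = {"great": 1, "sharp": 1, "poor": -1, "blurry": -1}
        NEGATORS = {"not", "never"}
        ASPECTS = {"battery", "screen", "camera"}

        def aspect_scores(review):
            """Score each aspect by the polarity of the opinion words that
            follow it, flipping the sign when a negator is present."""
            tokens = review.lower().replace(".", " ").split()
            scores = {}
            for i, tok in enumerate(tokens):
                if tok in ASPECTS:
                    window = tokens[i + 1: i + 5]
                    score = sum(LEXICON.get(w, 0) for w in window)
                    if any(w in NEGATORS for w in window):
                        score = -score            # simple valence shifting
                    scores[tok] = scores.get(tok, 0) + score
            return scores

        print(aspect_scores("The screen is great but the camera is not sharp."))
        # -> {'screen': 1, 'camera': -1}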

  8. Automatic analysis of quality of images from X-ray digital flat detectors

    International Nuclear Information System (INIS)

    Over the last decade, medical imaging has grown with the development of new digital imaging techniques. In the field of X-ray radiography, new detectors are progressively replacing older techniques based on film or X-ray image intensifiers. These digital detectors offer higher sensitivity and reduced overall dimensions. This work was prepared with Trixell, the world-leading company in flat detectors for medical radiography. It deals with quality control of the digital images produced by these detectors. The high quality standards of medical imaging impose a close analysis of the defects that can appear in the images. This work describes a complete process for the quality analysis of such images. Particular focus is given to the defect detection task, using methods well adapted to our context of spatially correlated defects against a noisy background. (author)

  9. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    Science.gov (United States)

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140

  10. Automatic analysis of CR-39 track detectors for selective assessment of radon and its decay products

    International Nuclear Information System (INIS)

    A system for analyzing data from CR-39 track detectors exposed to radon and its daughter products was developed. The system performs both alpha particle spectroscopic analysis, where for each track the essential geometric parameters are evaluated, and mere counting of tracks; each condition is distinguished by its own chemical etching, microscope and scanning parameters. The spectroscopic technique was applied to the assessment of 210Po embedded in glass and to the discrimination of 222Rn, 218Po and 214Po contributions in passive dosimetry. The counting technique was applied to the determination of indoor radon concentration with passive dosemeters containing CR-39 detectors

  11. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    A microgrid is a cluster of power generation, consumption and storage systems capable of operating either independently or as part of a macrogrid. The mechanical condition of the power production units, such as the small wind turbines, is considered of crucial importance especially in the case of...... islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA the fault related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5...

  12. Dynamic Analysis of a Vehicular-mounted Automatic Weapon–Planar Case

    Directory of Open Access Journals (Sweden)

    Huai-Ku Sun

    2009-05-01

    Full Text Available This study analyses the dynamic behaviour of a machine gun mounted on a four-wheeled vehicle. The entire system comprises three parts: the gun, the flexible monopod, and the vehicle. The weapon is a multi-rigid-body mechanism comprising a rigid receiver, a rigid bolt, a bullet, a buffer, and a recoil spring. The vehicle model features a rigid vehicle body, suspension springs, shock absorbers, and wheels. The finite element method is used to model the flexible monopod connecting the gun and the vehicle. This study combines computer-aided analysis of rigid-body mechanisms with finite element analysis of the flexible structure to derive the total equations of motion, incorporating Lagrange multipliers. The total equations of motion are solved by numerical integration to simulate the transient response of the whole system. This approach readily resolves the problem of rigid-flexible coupling effects and promotes the performance of the whole system in the engineering design phase. Defence Science Journal, 2009, 59(3), pp. 265-272, DOI: http://dx.doi.org/10.14429/dsj.59.1520

  13. Performance portability study of an automatic target detection and classification algorithm for hyperspectral image analysis using OpenCL

    Science.gov (United States)

    Bernabe, Sergio; Igual, Francisco D.; Botella, Guillermo; Garcia, Carlos; Prieto-Matias, Manuel; Plaza, Antonio

    2015-10-01

    Recent advances in heterogeneous high performance computing (HPC) have opened new avenues for demanding remote sensing applications. Perhaps one of the most popular algorithms in target detection and identification is the automatic target detection and classification algorithm (ATDCA), widely used in the hyperspectral image analysis community. Previous research has already investigated the mapping of ATDCA on graphics processing units (GPUs) and field programmable gate arrays (FPGAs), showing impressive speedup factors that allow its exploitation in time-critical scenarios. Based on these studies, our work explores the performance portability of a tuned OpenCL implementation across a range of processing devices including multicore processors, GPUs and other accelerators. This approach differs from previous papers, which focused on achieving the optimal performance on each platform. Here, we are more interested in the following issues: (1) evaluating whether a single code written in OpenCL allows us to achieve acceptable performance across all of them, and (2) assessing the gap between our portable OpenCL code and the hand-tuned versions previously investigated. Our study includes the analysis of different tuning techniques that expose data parallelism as well as enable an efficient exploitation of the complex memory hierarchies found in these new heterogeneous devices. Experiments have been conducted using hyperspectral data sets collected by NASA's Airborne Visible Infrared Imaging Spectrometer (AVIRIS) and the Hyperspectral Digital Imagery Collection Experiment (HYDICE) sensors. To the best of our knowledge, this kind of analysis has not been previously conducted in the hyperspectral imaging processing literature, and in our opinion it is very important in order to really calibrate the possibility of using heterogeneous platforms for efficient hyperspectral imaging processing in real remote sensing missions.

  14. Performance analysis of automatic generation control of interconnected power systems with delayed mode operation of area control error

    Directory of Open Access Journals (Sweden)

    Janardan Nanda

    2015-05-01

    Full Text Available This study presents automatic generation control (AGC) of an interconnected power system comprising two thermal areas and one hydro area, each with integral controllers. Emphasis is given to a delay in the area control error for the actuation of the supplementary controller, and to examining its impact on the dynamic response compared with no delay, which is the usual practice. The analysis is based on a 50% loading condition in all the areas. The system performance is examined considering a 1% step load perturbation. Results reveal that delayed-mode operation provides better system dynamic performance than that obtained without delay and has several distinct merits for the governor. The delay is linked with a reduction in wear and tear of the secondary controller and hence increases the life of the governor. The controller gains are optimised by particle swarm optimisation. The performance of delayed-mode operation of AGC at other loading conditions is also analysed. An attempt has also been made to find the impact of the weights for different components in the cost function used to optimise the controller gains. A modified cost function having different weights for different components, when used for controller gain optimisation, improves the system performance.

  15. Regionalization of epididymal duct and epithelium in rats and mice by automatic computer-aided morphometric analysis

    Institute of Scientific and Technical Information of China (English)

    C. Soler; J. J. de Monserrat; M. Núñez; R. Gutiérrez; J. Núñez; M. Sancho; T. G. Cooper

    2005-01-01

    Aim: To establish a rat and mouse epididymal map based on the use of the Epiquatre automatic software for histologic image analysis. Methods: Epididymides from five adult rats and five adult mice were fixed in alcoholic Bouin's fixative and embedded in paraffin. Serial longitudinal sections through the medial aspect of the organ were cut at 10 μm and stained with hematoxylin and eosin. As delimited by major connective tissue septa, nine subdivisions were defined for the rat epididymis and seven for the mouse, consisting of five sub-regions in the caput (rat and mouse), one (mouse) or three (rat) in the corpus, and one in the cauda (rat and mouse). Using the Epiquatre software, several tubular, luminal and epithelial morphometric parameters were evaluated. Results: Statistical comparison of the quantitative parameters revealed regional differences (2-5 in the rat, 3-6 in the mouse, depending on the parameters), with caput regions 1 and 2 being largely distinguishable from the remaining, mutually similar, caput and corpus regions, which were in turn recognizable from the cauda regions in both species. Conclusion: The use of the Epiquatre software allowed us to establish regression curves for different morphometric parameters that can permit the detection of changes in their values under different pathological or experimental conditions.

  16. Automatic Recognition of Human Parasite Cysts on Microscopic Stools Images using Principal Component Analysis and Probabilistic Neural Network

    Directory of Open Access Journals (Sweden)

    Beaudelaire Saha Tchinda

    2015-09-01

    Full Text Available Parasites live in a host and get their food from or at the expense of that host. Cysts represent a form of resistance and spread of parasites. The manual diagnosis of microscopic stool images is time-consuming and depends on the human expert. In this paper, we propose an automatic recognition system that can be used to identify various intestinal parasite cysts from their microscopic digital images. We employ image pixel features to train a probabilistic neural network (PNN). Probabilistic neural networks are suitable for classification problems. The main novelty is the use of feature vectors extracted directly from the image pixels. To this end, the microscopic images are first segmented to separate the parasite image from the background. The extracted parasite is then resized to a 12x12 image feature vector. For dimensionality reduction, a principal component analysis basis projection has been used: the 144 extracted features were orthogonalized into two principal component variables that constitute the input vector of the PNN. The PNN is trained using 540 microscopic images of the parasites. The proposed approach was tested successfully on 540 samples of protozoan cysts obtained from 9 kinds of intestinal parasites.
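
    A compact sketch of this pipeline under stated assumptions: random vectors stand in for the flattened 12x12 cyst images, scikit-learn's PCA reduces them to two components, and a small Parzen-window classifier plays the role of the PNN.

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(2)
        # stand-ins for segmented 12x12 cyst images, flattened to 144-vectors
        X = np.vstack([rng.normal(m, 1.0, (60, 144)) for m in (0.0, 2.0, 4.0)])
        y = np.repeat([0, 1, 2], 60)

        pca = PCA(n_components=2).fit(X)
        Z = pca.transform(X)

        def pnn_predict(z, Ztrain, ytrain, sigma=0.5):
            """Parzen-window PNN: one Gaussian kernel per training pattern,
            summed per class; predict the class with the highest density."""
            d2 = ((Ztrain - z) ** 2).sum(axis=1)
            k = np.exp(-d2 / (2 * sigma ** 2))
            return max(set(ytrain.tolist()), key=lambda c: k[ytrain == c].sum())

        print(pnn_predict(Z[0], Z, y))   # expected to recover class 0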

  17. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural-network-based machine learning process. The second method consists of evaluating video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  18. AUTOMATIC HUMAN FACE RECOGNITION USING MULTIVARIATE GAUSSIAN MODEL AND FISHER LINEAR DISCRIMINATIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Surya Kasturi

    2014-08-01

    Full Text Available Face recognition plays an important role in surveillance and biometrics and is a popular application of computer vision. In this paper, color-based skin segmentation is proposed to detect faces, which are then matched with faces from the dataset. The proposed color-based segmentation method is tested in different color spaces to identify a suitable color space for the identification of faces. Based on the sample skin distribution, a Multivariate Gaussian Model is fitted to identify skin regions, from which face regions are detected using connected components. The detected face is matched with a template and verified. The proposed method, Multivariate Gaussian Model – Fisher Linear Discriminative Analysis (MGM – FLDA), is compared with the machine-learning-based Viola & Jones algorithm, and it gives better results in terms of time.
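
    The multivariate Gaussian skin model can be sketched as follows: fit a mean and covariance to sample skin chromaticities, then threshold the squared Mahalanobis distance. The colour space, samples and threshold are assumptions, not the paper's values.

        import numpy as np

        rng = np.random.default_rng(3)
        # stand-in skin samples in a normalised chromaticity space (r, g)
        skin = rng.multivariate_normal([0.45, 0.31],
                                       [[0.002, 0.0], [0.0, 0.001]], 500)

        mu = skin.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(skin, rowvar=False))

        def is_skin(pixel, threshold=9.21):        # ~99% chi-square, 2 dof
            d = pixel - mu
            return d @ cov_inv @ d <= threshold    # squared Mahalanobis distance

        print(is_skin(np.array([0.46, 0.30])), is_skin(np.array([0.20, 0.45])))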

  19. Automatic detection and high resolution fine structure analysis of conic X-ray diffraction lines

    Energy Technology Data Exchange (ETDEWEB)

    Bauch, J.; Henschel, F. [TU Dresden, Institut fuer Werkstoffwissenschaft, 01069 Dresden (Germany); Schulze, M. [TU Dresden, Institut fuer Photogrammetrie und Fernerkundung, 01069 Dresden (Germany)

    2011-05-15

    The presented method demonstrates a first step in the development of a high-resolution "Residual stress microscope" and, through the implementation of largely automated procedures, facilitates fast detection of diffraction lines in the form of conic sections. It has been implemented for, but is not restricted to, the Kossel technique and the "X-ray Rotation-Tilt Method" (XRT). The resulting multifaceted database of many X-ray diffraction radiographs can be used not only for the systematic analysis of anomalies in diffraction lines (reflection fine structure), but also for the direct calculation and output of precision residual stress tensors. (copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  20. Automatic simplification of solid models for engineering analysis independent of modeling sequences

    International Nuclear Information System (INIS)

    Although solid models can represent the complex and detailed geometry of parts, it is often necessary to simplify solid models by removing detailed geometry in applications such as finite element analysis and similarity assessment of CAD models. There are no standards for judging the goodness of a simplification method, but one essential criterion is that it should generate a consistent and acceptable simplification for the same solid model, regardless of how the solid model was created. Since a design-feature-based approach is tightly dependent on modeling sequences and the designer's modeling preferences, it sometimes produces inconsistent and unacceptable simplifications. In this paper, a new method is proposed to simplify solid models of machined parts. Independently of user-specified design features, this method directly recognizes and generates subtractive features from the final model of the part, and then simplifies the solid model by removing the detailed geometry using these subtractive features.

  1. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    Science.gov (United States)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for the systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created from dedicated structures such as a static random-access memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for extracting specific layout parameters, which were then input to SPICE compact-model simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well lying too close to a P-well. Building on this example, for process monitoring and variability analyses we used this method extensively to analyze transistor gates having different shapes. In addition, an analysis of a large area of a high-density standard cell library was done. Another set of monitoring runs, focused on a high-density SRAM array, is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified asymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted using contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  2. Suitability of UK Biobank Retinal Images for Automatic Analysis of Morphometric Properties of the Vasculature.

    Directory of Open Access Journals (Sweden)

    Thomas J MacGillivray

    Full Text Available To assess the suitability of retinal images held in the UK Biobank--the largest retinal data repository in a prospective population-based cohort--for computer-assisted vascular morphometry, generating measures that are commonly investigated as candidate biomarkers of systemic disease. Non-mydriatic fundus images from both eyes of 2,690 participants--people with a self-reported history of myocardial infarction (n=1,345) and a matched control group (n=1,345)--were analysed using VAMPIRE software. These images were drawn from those of 68,554 UK Biobank participants who underwent retinal imaging at recruitment. Four operators were trained in the use of the software to measure retinal vascular tortuosity and bifurcation geometry. Total operator time was approximately 360 hours (4 minutes per image). 2,252 (84%) of participants had at least one image of sufficient quality for the software to process, i.e. there was sufficient detection of retinal vessels in the image by the software to attempt the measurement of the target parameters. 1,604 (60%) of participants had an image of at least one eye that was adequately analysed by the software, i.e. the measurement protocol was successfully completed. Increasing age was associated with a reduced proportion of images that could be processed (p=0.0004) and analysed (p<0.0001). Cases exhibited more acute arteriolar branching angles (p=0.02) as well as lower arteriolar and venular tortuosity (p<0.0001). A proportion of the retinal images in UK Biobank are of insufficient quality for automated analysis. However, the large size of the UK Biobank means that tens of thousands of images are available and suitable for computational analysis. Parametric information measured from the retinas of participants with suspected cardiovascular disease was significantly different from that measured from a matched control group.

  3. Analysis of crystallographic slip and grain boundary sliding in a Ti–45Al–2Nb–2Mn (at%)–0.8 vol%TiB2 alloy by high temperature in situ mechanical testing

    International Nuclear Information System (INIS)

    This work aims to contribute to a further understanding of the fundamentals of crystallographic slip and grain boundary sliding in the γ-TiAl Ti–45Al–2Nb–2Mn (at%)–0.8 vol%TiB2 intermetallic alloy, by means of in situ high-temperature tensile testing combined with electron backscatter diffraction (EBSD). Several microstructures, containing different fractions and sizes of lamellar colonies and equiaxed γ-grains, were fabricated by either centrifugal casting or powder metallurgy, followed by heat treatment at 1300 °C and furnace cooling. In situ tensile and tensile-creep experiments were performed in a scanning electron microscope (SEM) at temperatures ranging from 580 °C to 700 °C. EBSD was carried out in selected regions before and after straining. Our results suggest that, during constant strain rate tests, true twin γ/γ interfaces are the weakest barriers to dislocations and, thus, that the relevant length scale might be influenced by the distance between non-true twin boundaries. Under creep conditions both grain/colony boundary sliding (G/CBS) and crystallographic slip are observed to contribute to deformation. The incidence of boundary sliding is particularly high in γ grains of duplex microstructures. The slip activity during creep deformation in different microstructures was evaluated by trace analysis. Special emphasis was placed on distinguishing the compliance of different slip events with the Schmid law with respect to the applied stress.
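
    The Schmid-law compliance check in the trace analysis reduces to computing m = cos(phi)·cos(lambda) for each candidate slip system; a generic sketch follows, with the slip system and load axis chosen arbitrarily for illustration.

        import numpy as np

        def schmid_factor(slip_normal, slip_dir, load_axis):
            """m = cos(phi) * cos(lambda) for a slip system under uniaxial load."""
            n = slip_normal / np.linalg.norm(slip_normal)
            d = slip_dir / np.linalg.norm(slip_dir)
            t = load_axis / np.linalg.norm(load_axis)
            return abs(n @ t) * abs(d @ t)

        # example: {111}<110> ordinary slip, load along an arbitrary [123] axis
        m = schmid_factor(np.array([1.0, 1.0, 1.0]),
                          np.array([1.0, -1.0, 0.0]),
                          np.array([1.0, 2.0, 3.0]))
        print(f"Schmid factor: {m:.3f}")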

  4. Analysis of crystallographic slip and grain boundary sliding in a Ti–45Al–2Nb–2Mn (at%)–0.8 vol%TiB{sub 2} alloy by high temperature in situ mechanical testing

    Energy Technology Data Exchange (ETDEWEB)

    Muñoz-Moreno, R. [IMDEA Materials Institute, C/Eric Kandel 2, 28906 Getafe, Madrid (Spain); Department of Materials Science and Engineering, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés (Spain); Ruiz-Navas, E.M. [Department of Materials Science and Engineering, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés (Spain); Boehlert, C.J. [IMDEA Materials Institute, C/Eric Kandel 2, 28906 Getafe, Madrid (Spain); Department of Chemical Engineering and Materials Science, Michigan State University, 2527 Engineering Building, East Lansing, MI 48824 (United States); Llorca, J. [IMDEA Materials Institute, C/Eric Kandel 2, 28906 Getafe, Madrid (Spain); Department of Materials Science, Polytechnic University of Madrid, 28040 Madrid (Spain); Torralba, J.M. [IMDEA Materials Institute, C/Eric Kandel 2, 28906 Getafe, Madrid (Spain); Department of Materials Science and Engineering, Universidad Carlos III de Madrid, Avda. Universidad 30, 28911 Leganés (Spain); Pérez-Prado, M.T., E-mail: teresa.perez.prado@imdea.org [IMDEA Materials Institute, C/Eric Kandel 2, 28906 Getafe, Madrid (Spain)

    2014-06-01

    This work aims to contribute to a further understanding of the fundamentals of crystallographic slip and grain boundary sliding in the γ-TiAl Ti–45Al–2Nb–2Mn (at%)–0.8 vol%TiB{sub 2} intermetallic alloy, by means of in situ high-temperature tensile testing combined with electron backscatter diffraction (EBSD). Several microstructures, containing different fractions and sizes of lamellar colonies and equiaxed γ-grains, were fabricated by either centrifugal casting or powder metallurgy, followed by heat treatment at 1300 °C and furnace cooling. In situ tensile and tensile-creep experiments were performed in a scanning electron microscope (SEM) at temperatures ranging from 580 °C to 700 °C. EBSD was carried out in selected regions before and after straining. Our results suggest that, during constant strain rate tests, true twin γ/γ interfaces are the weakest barriers to dislocations and, thus, that the relevant length scale might be influenced by the distance between non-true twin boundaries. Under creep conditions both grain/colony boundary sliding (G/CBS) and crystallographic slip are observed to contribute to deformation. The incidence of boundary sliding is particularly high in γ grains of duplex microstructures. The slip activity during creep deformation in different microstructures was evaluated by trace analysis. Special emphasis was placed on distinguishing the compliance of different slip events with the Schmid law with respect to the applied stress.

  5. CFD Automatic Analysis Process of Engine Ports

    Institute of Scientific and Technical Information of China (English)

    姜涛; 罗马吉; 向梁山; 宋秀萍

    2012-01-01

    Based on the mature analysis process for engine port CFD, and building on the CFD software Star-ccm+ and the Java programming language, a solution for automating the engine port CFD analysis process was proposed and explored, and a simple platform for automatic port CFD analysis was developed. It can improve the efficiency of port CFD analysis to a certain extent, and the solution also provides a reference for automating the workflows of other CFD analysis problems.

  6. Automatic estimation of optimal autoregressive filters for the analysis of volcanic seismic activity

    Directory of Open Access Journals (Sweden)

    P. Lesage

    2008-04-01

    Full Text Available Long-period (LP) events observed on volcanoes provide important information for volcano monitoring and for studying the physical processes in magmatic and hydrothermal systems. Of all the methods used to analyse this kind of seismicity, autoregressive (AR) modelling is particularly valuable, as it produces precise estimations of the frequencies and quality factors of the spectral peaks that are generated by resonance effects at seismic sources and, via deconvolution of the observed record, it allows the excitation function of the resonator to be determined. However, with AR modelling methods it is difficult to determine the order of the AR filter that will yield the best model of the signal. This note presents an algorithm to overcome this problem, together with some examples of applications. The approach described uses the kurtosis (fourth-order cumulant) of the deconvolved signal to provide an objective criterion for selecting the filter order. This approach allows the partial automation of the AR analysis and thus provides interesting possibilities for improving volcano monitoring methods.
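
    A rough sketch of the kurtosis criterion (the least-squares AR fit and the synthetic damped resonator driven by sparse impulses are assumptions): the order whose deconvolved residual is most impulsive, i.e. has the highest kurtosis, is selected.

        import numpy as np
        from scipy.stats import kurtosis

        def ar_residual(x, p):
            """Least-squares AR(p) fit; the residual approximates the
            deconvolved excitation function of the resonator."""
            X = np.column_stack([x[p - k - 1: len(x) - k - 1] for k in range(p)])
            a, *_ = np.linalg.lstsq(X, x[p:], rcond=None)
            return x[p:] - X @ a

        rng = np.random.default_rng(4)
        n = 2000
        impulses = (rng.random(n) < 0.01) * rng.normal(0.0, 5.0, n)  # sparse source
        x = np.zeros(n)
        for i in range(2, n):                    # damped two-pole resonator
            x[i] = 1.8 * x[i - 1] - 0.97 * x[i - 2] + impulses[i]

        # choose the order whose residual has the highest kurtosis
        best = max(range(1, 12), key=lambda p: kurtosis(ar_residual(x, p)))
        print("selected AR order:", best)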

  7. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features

    International Nuclear Information System (INIS)

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and has so far managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time- and frequency-domain features, are investigated and assessed based on their statistical significance in the task of classifying the FHR into three FIGO classes. We assess the features on a large data set (552 records) and, unlike other published papers, we use a three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual ranks of importance according to a meta-analysis of three different ranking methods. The number of accelerations and decelerations, the interval index, as well as Lempel–Ziv complexity and Higuchi's fractal dimension are among the top five features.

  8. Inter-observer Variability Analysis of Automatic Lung Delineation in Normal and Disease Patients.

    Science.gov (United States)

    Saba, Luca; Than, Joel C M; Noor, Norliza M; Rijal, Omar M; Kassim, Rosminah M; Yunus, Ashari; Ng, Chue R; Suri, Jasjit S

    2016-06-01

    Human interaction has become almost mandatory for an automated medical system wishing to be accepted by clinical regulatory agencies such as the Food and Drug Administration. Since this interaction causes variability in the gathered data, the inter-observer and intra-observer variability must be analyzed in order to validate the accuracy of the system. This study focuses on the variability from different observers who interact with an automated lung delineation system that relies on human interaction in the form of delineation of the lung borders. The database consists of High Resolution Computed Tomography (HRCT) images of 15 normal and 81 diseased patients, taken retrospectively at five levels per patient. Three observers independently delineated the lung borders manually, using software called ImgTracer™ (AtheroPoint™, Roseville, CA, USA), in all five levels of the 3-D lung volume. The three observers were: Observer-1, a less experienced novice tracer who is a resident in radiology under the guidance of a radiologist; and Observer-2 and Observer-3, lung image scientists trained by a lung radiologist and biomedical imaging experts. The inter-observer variability can be shown by comparing each observer's tracings to the automated delineation, and also by comparing the manual tracings of the observers with one another. The normality of the tracings was tested using the D'Agostino-Pearson test, and all observers' tracings showed a normal P-value higher than 0.05. The analysis of variance (ANOVA) test between the three observers and the automated system showed a P-value higher than 0.89 and 0.81 for the right lung (RL) and left lung (LL), respectively. The performance of the automated system was evaluated using the Dice Similarity Coefficient (DSC), Jaccard Index (JI) and Hausdorff (HD) Distance measures. Although Observer-1 has less experience compared to Observer-2 and Observer-3, the Observer Deterioration Factor (ODF) shows that
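
    The three overlap measures are standard; a self-contained sketch on toy masks (not the study's data) might look as follows.

        import numpy as np

        def dice_jaccard(a, b):
            inter = np.logical_and(a, b).sum()
            return 2 * inter / (a.sum() + b.sum()), inter / np.logical_or(a, b).sum()

        def hausdorff(a_pts, b_pts):
            """Symmetric Hausdorff distance between two point sets
            (here: all foreground voxel coordinates)."""
            d = np.linalg.norm(a_pts[:, None, :] - b_pts[None, :, :], axis=2)
            return max(d.min(axis=1).max(), d.min(axis=0).max())

        auto = np.zeros((64, 64), dtype=bool);   auto[10:50, 12:48] = True
        manual = np.zeros((64, 64), dtype=bool); manual[12:52, 12:50] = True
        dsc, ji = dice_jaccard(auto, manual)
        hd = hausdorff(np.argwhere(auto), np.argwhere(manual))
        print(f"DSC={dsc:.3f} JI={ji:.3f} HD={hd:.1f} voxels")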

  9. Statistical analysis of automatically detected ion density variations recorded by DEMETER and their relation to seismic activity

    Directory of Open Access Journals (Sweden)

    Michel Parrot

    2012-04-01

    Full Text Available Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, there are also ionospheric variations without seismic activity. The present study is devoted to a statistical analysis of the night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted over the 15 days before the earthquake, whenever the track of the satellite orbit was less than 1,500 km from the earthquake epicenter. The result of this first step provided, for each of the 15 days before each earthquake, the variations relative to the background in the vicinity of the epicenter. In the second step, comparisons were carried out between the largest variations over the 15 days and the earthquake magnitudes. The statistical analysis is based on the calculation of median values as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases where, on the one hand, the locations of the epicenters were randomly modified and, on the other hand, the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to the earthquakes than prior to random events, and that the perturbations increase with the earthquake magnitudes.

  10. Automatic Spiral Analysis for Objective Assessment of Motor Symptoms in Parkinson’s Disease

    Directory of Open Access Journals (Sweden)

    Mevludin Memedi

    2015-09-01

    well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.

  11. [Reliability of % vol. declarations on labels of wine bottles].

    Science.gov (United States)

    Schütz, Harald; Erdmann, Freidoon; Verhoff, Marcel A; Weiler, Günter

    2005-01-01

    The Council Regulation (EC) no. 1493/1999 of 17 May 1999 on the common organisation of the market in wine (OJ L 179 of 14/7/1999) and the GMO Wine 2000 (Annex VII A) stipulate that the labels of wine bottles have to indicate, among other things, information on the sales designation of the product, the nominal volume and the alcoholic strength. The latter must not differ by more than 0.5% vol. from the alcoholic strength established by analysis. Only when quality wines have been stored in bottles for more than three years is the accepted tolerance limit +/- 0.8% vol. The presented results show that deviations have to be taken into account which may be highly relevant for forensic practice. PMID:15887778
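
    The tolerance rule quoted above is straightforward to encode; a small sketch (function name and example values are illustrative):

        def label_within_tolerance(declared_vol, measured_vol, aged_quality_wine=False):
            """EC tolerance check: 0.5% vol normally, 0.8% vol for quality
            wines stored in bottles for more than three years."""
            limit = 0.8 if aged_quality_wine else 0.5
            return abs(declared_vol - measured_vol) <= limit

        print(label_within_tolerance(12.5, 13.2))        # False: off by 0.7% vol
        print(label_within_tolerance(12.5, 13.2, True))  # True: within 0.8% vol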

  12. Main: 1VOL [RPSD[Archive

    Lifescience Database Archive (English)

    Full Text Available 1VOL Arabidopsis (thale cress) Arabidopsis thaliana (L.) Heynh. Transcription Initiation Factor ... ucture Of A Tfiib-Tbp-Tata-Element Ternary Complex Nature ... V. 377 119 1995 Transcription Initiation, Molecula ...

  13. Automatic Differentiation Variational Inference

    OpenAIRE

    Kucukelbir, Alp; Tran, Dustin; Ranganath, Rajesh; Gelman, Andrew; Blei, David M.

    2016-01-01

    Probabilistic modeling is iterative. A scientist posits a simple model, fits it to her data, refines it according to her analysis, and repeats. However, fitting complex models to large data is a bottleneck in this process. Deriving algorithms for new models can be both mathematically and computationally challenging, which makes it difficult to efficiently cycle through the steps. To this end, we develop automatic differentiation variational inference (ADVI). Using our method, the scientist on...

  14. A Contrastive Analysis in the Meaning of the Linguistic Units in the Contemporary German and Macedonian Language. Procedia – Social and Behavioural Sciences, Vol. 46, 1198-1202

    OpenAIRE

    Ivanovska, Biljana; Daskalovska, Nina; Celik, Mahmut

    2012-01-01

    Abstract: We analyze the similarities and differences of the linguistic units in the contemporary German and Macedonian languages and the lexical structure of both languages. This paper focuses on the analysis and comparison of the semantic features of the items related to the term "kinship" (Verwandtschaftsbezeichnungen). The approach to the analysis varies from language to language and depends on the different language systems of the languages being compared. It demands careful preparation a...

  15. AUTOMATIC ANALYSIS AND CLASSIFICATION OF THE ROOF SURFACES FOR THE INSTALLATION OF SOLAR PANELS USING A MULTI-DATA SOURCE AND MULTI-SENSOR AERIAL PLATFORM

    OpenAIRE

    López, L.; Lagüela, S.; Picon, I.; D. González-Aguilera

    2015-01-01

    A low-cost multi-sensor aerial platform, aerial trike, equipped with visible and thermographic sensors is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbour solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, and the temperatures measured on thermographic images are decisive to evaluate the surfaces, slo...

  16. Large Scale Automatic Analysis and Classification of Roof Surfaces for the Installation of Solar Panels Using a Multi-Sensor Aerial Platform

    OpenAIRE

    Luis López-Fernández; Susana Lagüela; Inmaculada Picón; Diego González-Aguilera

    2015-01-01

    A low-cost multi-sensor aerial platform, aerial trike, equipped with visible and thermographic sensors is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbor solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, and the temperatures measured on thermographic images are decisive to evaluate the areas, tilts, ...

  17. Study of medical isotope production facility stack emissions and noble gas isotopic signature using automatic gamma-spectra analysis platform

    Science.gov (United States)

    Zhang, Weihua; Hoffmann, Emmy; Ungar, Kurt; Dolinar, George; Miley, Harry; Mekarski, Pawel; Schrom, Brian; Hoffman, Ian; Lawrie, Ryan; Loosz, Tom

    2013-04-01

    The nuclear industry's emissions of the four CTBT (Comprehensive Nuclear-Test-Ban Treaty) relevant radioxenon isotopes are unavoidably detected by the IMS along with possible treaty violations. Another civil source of radioxenon emissions contributing to the global background is radiopharmaceutical production companies. To better understand the source terms of these background emissions, a joint project between HC, ANSTO, PNNL and CRL was formed to install real-time detection systems supporting 135Xe, 133Xe, 131mXe and 133mXe measurements at the ANSTO and CRL 99Mo production facility stacks, as well as at the CANDU (CANada Deuterium Uranium) primary coolant monitoring system at CRL. At each site, high-resolution gamma spectra were collected every 15 minutes using an HPGe detector to continuously monitor a bypass feed from the stack or the CANDU primary coolant system as it passed through a sampling cell. HC also conducted atmospheric monitoring for radioxenon approximately 200 km from CRL. A program was written to transfer each spectrum into a text file format suitable for the automatic gamma-spectra analysis platform and then email the file to a server. Once the email was received by the server, it was automatically analysed with the gamma-spectrum software UniSampo/Shaman to perform radionuclide identification and activity calculation for a large number of gamma-spectra in a short period of time (less than 10 seconds per spectrum). The results of nuclide activity, together with other spectrum parameters, were saved into the Linssi database. This database contains a large amount of radionuclide information, which is a valuable resource for the analysis of radionuclide distribution within the noble gas fission product emissions. The results could be useful for identifying the specific mechanisms of the activity release. The isotopic signatures of the various radioxenon species can be determined as a function of release time. Comparison of 133mXe and 133Xe activity

  18. A semantic approach to the efficient integration of interactive and automatic target recognition systems for the analysis of complex infrastructure from aerial imagery

    Science.gov (United States)

    Bauer, A.; Peinsipp-Byma, E.

    2008-04-01

    The analysis of complex infrastructure from aerial imagery, for instance a detailed analysis of an airfield, requires the interpreter, besides being familiar with the sensor's imaging characteristics, to have a detailed understanding of the infrastructure domain. The required domain knowledge includes knowledge about the processes and functions involved in the operation of the infrastructure, the potential objects used to provide those functions, and their spatial and functional interrelations. Since it is not yet possible to provide reliable automatic object recognition (AOR) for the analysis of such complex scenes, we developed systems to support a human interpreter with either interactive approaches, able to assist the interpreter with previously acquired expert knowledge about the domain in question, or AOR methods, capable of detecting, recognizing or analyzing certain classes of objects for certain sensors. We believe that, to achieve an optimal result from an interpretation process in terms of efficiency and effectiveness, it is essential to integrate both interactive and automatic approaches to image interpretation. In this paper we present an approach, inspired by advancing semantic web technology, to represent domain knowledge, the capabilities of available AOR modules and the image parameters in an explicit way. This enables us to seamlessly extend an interactive image interpretation environment with AOR modules in such a way that we can automatically select suitable AOR methods for the current subtask, focus them on an appropriate area of interest, and reintegrate their results into the environment.

  19. Automatic Functional Harmonic Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Magalhães, J.P.; Wiering, F.; Veltkamp, R.C.

    2013-01-01

    Music scholars have been studying tonal harmony intensively for centuries, yielding numerous theories and models. Unfortunately, a large number of these theories are formulated in a rather informal fashion and lack mathematical precision. In this article we present HarmTrace, a functional model of Western tonal harmony that builds on well-known theories of tonal harmony.

  20. Automatic Functional Harmonic Analysis

    OpenAIRE

    de Haas, W.B.; Magalhães, J.P.; Wiering, F.; Veltkamp, R.C.

    2013-01-01

    Music scholars have been studying tonal harmony intensively for centuries, yielding numerous theories and models. Unfortunately, a large number of these theories are formulated in a rather informal fashion and lack mathematical precision. In this article we present HarmTrace, a functional model of Western tonal harmony that builds on well-known theories of tonal harmony. In contrast to other approaches that remain purely theoretical, we present an implemented system that is evaluated empirica...

  1. English Grammar, A Combined Tagmemic and Transformational Approach. A Constrastive Analysis of English and Vietnamese, Vol. 1. Linguistic Circle of Canberra Publications, Series C--Books, No. 3.

    Science.gov (United States)

    Nguyen, Dang Liem

    This is the first volume of a contrastive analysis of English and Vietnamese in the light of a combined tagmemic and transformational approach. The dialects contrasted are Midwest Standard American English and Standard Saigon Vietnamese. The study has been designed chiefly for pedagogical applications. A general introduction gives the history of…

  2. A computer program to automatically generate state equations and macro-models. [for network analysis and design

    Science.gov (United States)

    Garrett, S. J.; Bowers, J. C.; Oreilly, J. E., Jr.

    1978-01-01

    A computer program, PROSE, that produces nonlinear state equations from a simple topological description of an electrical or mechanical network is described. Unnecessary states are also automatically eliminated, so that a simplified terminal circuit model is obtained. The program also prints out the eigenvalues of a linearized system and the sensitivities of the eigenvalue of largest magnitude.

  3. A Level 1+ Probabilistic Safety Assessment of the high flux Australian reactor. Vol. 2. Appendix C: System analysis models and results

    International Nuclear Information System (INIS)

    This section contains the results of the quantitative system/top event analysis. Section C.1 gives the basic event coding scheme. Section C.2 shows the master frequency file (MFF), which contains the split fraction names, the top events they belong to, the mean values of the uncertainty distribution that is generated by the Monte Carlo quantification in the System Analysis module of RISKMAN, and a brief description of each split fraction. The MFF is organized by the systems modeled and, within each system, the top events associated with the system. Section C.3 contains the fault trees developed for the system/top event models and the RISKMAN reports for each of the system/top event models. The reports are organized under the following system headings: Compressed/Service Air Supply (AIR); Containment Isolation System (CIS); Heavy Water Cooling System (D2O); Emergency Core Cooling System (ECCS); Electric Power System (EPS); Light Water Cooling System (H2O); Helium Gas System (HE); Mains Water System (MW); Miscellaneous Top Events (MISC); Operator Actions (OPER); Reactor Protection System (RPS); Space Conditioner System (SCS); Condition/Status Switch (SWITCH); RCB Ventilation System (VENT); No. 1 Storage Block Cooling System (SB)

  4. A Level 1+ Probabilistic Safety Assessment of the high flux Australian reactor. Vol. 2. Appendix C: System analysis models and results

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    This section contains the results of the quantitative system/top event analysis. Section C.1 gives the basic event coding scheme. Section C.2 shows the master frequency file (MFF), which contains the split fraction names, the top events they belong to, the mean values of the uncertainty distribution that is generated by the Monte Carlo quantification in the System Analysis module of RISKMAN, and a brief description of each split fraction. The MFF is organized by the systems modeled and, within each system, the top events associated with the system. Section C.3 contains the fault trees developed for the system/top event models and the RISKMAN reports for each of the system/top event models. The reports are organized under the following system headings: Compressed/Service Air Supply (AIR); Containment Isolation System (CIS); Heavy Water Cooling System (D2O); Emergency Core Cooling System (ECCS); Electric Power System (EPS); Light Water Cooling System (H2O); Helium Gas System (HE); Mains Water System (MW); Miscellaneous Top Events (MISC); Operator Actions (OPER); Reactor Protection System (RPS); Space Conditioner System (SCS); Condition/Status Switch (SWITCH); RCB Ventilation System (VENT); No. 1 Storage Block Cooling System (SB)

  5. Development and validation of automatic tools for interactive recurrence analysis in radiation therapy: optimization of treatment algorithms for locally advanced pancreatic cancer

    International Nuclear Information System (INIS)

    In radiation oncology, recurrence analysis is an important part of the evaluation process and clinical quality assurance of treatment concepts. Using the example of 9 patients with locally advanced pancreatic cancer, we developed and validated interactive analysis tools to support the evaluation workflow. After automatic registration of the radiation-planning CTs with the follow-up images, the recurrence volumes are segmented manually. Based on these volumes the DVH (dose-volume histogram) statistic is calculated, followed by determination of the dose applied to the region of recurrence and the distance between the boost and recurrence volumes. We calculated the percentage of the recurrence volume within the 80%-isodose volume and compared it to the location of the recurrence within the boost volume, boost + 1 cm, boost + 1.5 cm and boost + 2 cm volumes. Recurrence analysis of the 9 patients demonstrated that all recurrences except one occurred within the defined GTV/boost volume; one recurrence developed beyond the field border (out-of-field). With the defined distance volumes in relation to the recurrences, we could show that 7 recurrent lesions were within the 2 cm radius of the primary tumor. Two large recurrences extended beyond the 2 cm radius; however, this might be due to very rapid growth and/or late detection of tumor progression. The main goal of using automatic analysis tools is to reduce the time and effort of conducting clinical analyses. We have shown a first approach to, and use of, a semi-automated workflow for recurrence analysis, which will be continuously optimized. In conclusion, despite the limitations of the automatic calculations, we contributed to the in-house optimization of subsequent study concepts based on an improved and validated target volume definition
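
    The percentage-within-isodose computation reduces to a mask intersection; a sketch on a toy dose grid (all values assumed) follows.

        import numpy as np

        def fraction_inside(recurrence_mask, region_mask):
            """Fraction of the recurrence volume inside a reference region."""
            inter = np.logical_and(recurrence_mask, region_mask).sum()
            return inter / recurrence_mask.sum()

        dose = np.zeros((60, 60, 40))
        dose[20:40, 20:40, 10:30] = 50.0                 # Gy, toy dose cube
        recurrence = np.zeros_like(dose, dtype=bool)
        recurrence[25:35, 25:35, 15:25] = True           # segmented recurrence

        isodose80 = dose >= 0.8 * dose.max()
        pct = 100 * fraction_inside(recurrence, isodose80)
        print(f"{pct:.0f}% of the recurrence lies within the 80% isodose volume")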

  6. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Two systems resulted from the need to study the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, for speed of chemical separation after irradiation, and for protection from the high radiation fields of the samples. An MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit. This approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  7. Preliminary comparison with 40 CFR Part 191, Subpart B for the Waste Isolation Pilot Plant, December 1991. Vol. 4: Uncertainty and sensitivity analysis results

    International Nuclear Information System (INIS)

    The most appropriate conceptual model for performance assessment at the Waste Isolation Pilot Plant (WIPP) is believed to include gas generation due to corrosion and microbial action in the repository and a dual-porosity (matrix and fracture porosity) representation for solute transport in the Culebra Dolomite Member of the Rustler Formation. Under these assumptions, complementary cumulative distribution functions (CCDFs) summarizing radionuclide releases to the accessible environment due to both cuttings removal and groundwater transport fall substantially below the release limits promulgated by the Environmental Protection Agency (EPA). This is the case even when the current estimates of the uncertainty in analysis inputs are incorporated into the performance assessment. The best-estimate performance-assessment results are dominated by cuttings removal. The releases to the accessible environment due to groundwater transport make very small contributions to the total release. The variability in the distribution of CCDFs that must be considered in comparisons with the EPA release limits is dominated by the variable LAMBDA (rate constant in the Poisson model for drilling intrusions). The variability in releases to the accessible environment due to individual drilling intrusions is dominated by DBDIAM (drill bit diameter). Most of the imprecisely known variables considered in the 1991 WIPP performance assessment relate to radionuclide releases to the accessible environment due to groundwater transport. For a single borehole (i.e., an E2-type scenario), whether or not a release from the repository to the Culebra even occurs is controlled by the variable SALPERM (Salado permeability), with no releases for small values (i.e., of the order of 10^-21 m2 or less) of this variable. When SALPERM is small, the repository never fills with brine and so there is no flow up an intruding borehole that can transport radionuclides to the Culebra. Further, releases that do reach the Culebra for larger values of
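
    A CCDF of the kind compared against the EPA release limits gives, for each release magnitude R, the probability that the total normalized release exceeds R. A minimal sketch, assuming (as a hypothetical stand-in) Poisson-distributed drilling intrusions with rate LAMBDA and lognormal per-intrusion releases:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def sample_total_release(lam, n_futures=20_000):
        """Sample the total normalized release over many possible futures."""
        totals = np.empty(n_futures)
        for i in range(n_futures):
            n_intrusions = rng.poisson(lam)  # Poisson drilling intrusions (rate LAMBDA)
            totals[i] = rng.lognormal(-3.0, 1.0, n_intrusions).sum()  # hypothetical releases
        return totals

    def ccdf(samples, thresholds):
        """P(total release > R) for each threshold R."""
        return [(samples > r).mean() for r in thresholds]

    totals = sample_total_release(lam=2.0)
    for r, p in zip([0.01, 0.1, 1.0], ccdf(totals, [0.01, 0.1, 1.0])):
        print(f"P(release > {r:g}) = {p:.3f}")
    ```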

  8. An image analysis and classification system for automatic weed species identification in different crops for precision weed management

    OpenAIRE

    Weis, Martin

    2010-01-01

    A system for the automatic detection of weeds in arable fields was developed in this thesis. With the resulting maps, weeds in fields can be controlled on a sub-field level, according to their abundance. The system contributes to the emerging field of Precision Farming technologies. Precision Farming technologies have been developed during the last two decades to refine agricultural management practice. The goal of Precision Farming is to vary treatments within fields, according to the local ...

  9. Contribution to automatic speech recognition. Analysis of the direct acoustical signal. Recognition of isolated words and phoneme identification

    International Nuclear Information System (INIS)

    This report deals with the acoustical-phonetic step of automatic speech recognition. The parameters used are the extrema of the acoustical signal, coded in amplitude and duration. This coding method, whose properties are described, is simple and well adapted to digital processing. The quality and intelligibility of the coded signal after reconstruction are particularly satisfactory. An experiment in the automatic recognition of isolated words has been carried out using this coding system. We have designed a filtering algorithm operating on the parameters of the coding. Thus the characteristics of the formants can be derived under certain conditions, which are discussed. Using these characteristics, the identification of a large part of the phonemes for a given speaker was achieved. Carrying on these studies required the development of a particular real-time processing methodology which allowed immediate evaluation of program improvements. Such processing on temporal coding of the acoustical signal is extremely powerful and could represent, used in connection with other methods, an efficient tool for the automatic processing of speech. (author)
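
    The coding described keeps only the local extrema of the waveform, each represented by its amplitude and its duration, i.e. the time elapsed since the previous extremum. A minimal sketch of such an (amplitude, duration) coder on a synthetic signal (the sampling rate and test signal are arbitrary):

    ```python
    import numpy as np

    def extrema_code(signal, sample_rate):
        """Code a signal as (amplitude, duration) pairs of its local extrema."""
        pairs, last_idx = [], 0
        for i in range(1, len(signal) - 1):
            is_max = signal[i - 1] < signal[i] >= signal[i + 1]
            is_min = signal[i - 1] > signal[i] <= signal[i + 1]
            if is_max or is_min:
                pairs.append((signal[i], (i - last_idx) / sample_rate))
                last_idx = i
        return pairs

    fs = 8000
    t = np.arange(0, 0.01, 1 / fs)
    x = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 1320 * t)
    print(extrema_code(x, fs)[:5])  # first few (amplitude, duration) pairs
    ```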

  10. Automatic Spectroscopic Data Categorization by Clustering Analysis (ASCLAN): A Data-Driven Approach for Distinguishing Discriminatory Metabolites for Phenotypic Subclasses.

    Science.gov (United States)

    Zou, Xin; Holmes, Elaine; Nicholson, Jeremy K; Loo, Ruey Leng

    2016-06-01

    We propose a novel data-driven approach aiming to reliably distinguish discriminatory metabolites from nondiscriminatory metabolites for a given spectroscopic data set containing two biological phenotypic subclasses. The automatic spectroscopic data categorization by clustering analysis (ASCLAN) algorithm aims to categorize spectral variables within a data set into three clusters corresponding to noise, nondiscriminatory and discriminatory metabolite regions. This is achieved by clustering each spectral variable based on the r(2) value representing the loading weight of each spectral variable as extracted from an orthogonal partial least-squares discriminant analysis (OPLS-DA) model of the data set. The variables are ranked according to r(2) values and a series of principal component analysis (PCA) models are then built for subsets of these spectral data corresponding to ranges of r(2) values. The Q(2)X value for each PCA model is extracted. K-means clustering is then applied to the Q(2)X values to generate two clusters based on the minimum Euclidean distance criterion. The cluster consisting of lower Q(2)X values is deemed devoid of metabolic information (noise), while the cluster consisting of higher Q(2)X values is further subclustered into two groups based on the r(2) values. We considered the cluster with high Q(2)X but low r(2) values as nondiscriminatory, and the cluster with high Q(2)X and high r(2) values as discriminatory. The boundaries between these three clusters of spectral variables, on the basis of the r(2) values, were considered as the cut-off values for defining the noise, nondiscriminatory and discriminatory variables. We evaluated the ASCLAN algorithm using six simulated (1)H NMR spectroscopic data sets representing small, medium and large data sets (N = 50, 500, and 1000 samples per group, respectively), each with a reduced and full resolution set of variables (0.005 and 0.0005 ppm, respectively). ASCLAN correctly identified all discriminatory
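
    Stripped to its core, the clustering logic is a K-means split of the per-variable Q(2)X values followed by a second split on r(2) within the high-Q(2)X cluster. A minimal sketch on synthetic values (not the authors' implementation; the simulated r(2)/Q(2)X values are hypothetical):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(1)
    # Hypothetical per-variable statistics: r2 from OPLS-DA loadings, Q2X from PCA subsets
    r2 = rng.uniform(0, 1, 300)
    q2x = np.where(r2 > 0.1, rng.uniform(0.5, 0.9, 300), rng.uniform(0.0, 0.2, 300))

    # Stage 1: K-means on Q2X separates noise from informative variables
    km1 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(q2x.reshape(-1, 1))
    informative = km1.labels_ == np.argmax(km1.cluster_centers_)

    # Stage 2: K-means on r2 within the informative cluster separates
    # nondiscriminatory from discriminatory variables
    km2 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(r2[informative].reshape(-1, 1))
    discriminatory = km2.labels_ == np.argmax(km2.cluster_centers_)

    print(f"noise: {(~informative).sum()}, "
          f"nondiscriminatory: {(~discriminatory).sum()}, "
          f"discriminatory: {discriminatory.sum()}")
    ```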

  11. Road Safety Data, Collection, Transfer and Analysis DaCoTa. Deliverable 1.5. Vol.1 — Analysis of the stakeholder survey: perceived priority and availability of data and tools and relation to the stakeholders' characteristics. Vol.II: Analysis of Road Safety Management in the European countries.

    OpenAIRE

    Papadimitriou, E.; Yannis, G.; Bijleveld, F.D. & Cardoso, J.L.

    2015-01-01

    Volume I: This report is part of the ‘Policy’ Work Package of the DaCoTA project (www.dacotaproject.eu). The ‘Policy’ Work Package is designed to fill in the gap in knowledge on road safety policy making processes, their institutional framework and the data, methods and technical tools needed to base policy formulation and adoption on scientifically-established evidence. This document provides the results of a detailed analysis of a survey conducted with a large panel of stakeholders. The aim...

  12. AccPbFRET: An ImageJ plugin for semi-automatic, fully corrected analysis of acceptor photobleaching FRET images

    Directory of Open Access Journals (Sweden)

    Vereb György

    2008-08-01

    Background: The acceptor photobleaching fluorescence resonance energy transfer (FRET) method is widely used for monitoring molecular interactions in cells. This method of FRET, while among those with the simplest mathematics, is robust, self-controlled and independent of fluorophore amounts and ratios. Results: AccPbFRET is a user-friendly, efficient ImageJ plugin which allows fully corrected, pixel-wise calculation and detailed, ROI (region of interest)-based analysis of FRET efficiencies in microscopic images. Furthermore, automatic registration and semi-automatic analysis of large image sets are provided, which are not available in any existing FRET evaluation software. Conclusion: Despite the widespread applicability of the acceptor photobleaching FRET technique, this is the first paper in which all possible sources of major errors in the measurement and analysis are considered, and AccPbFRET is the only program which provides the complete suite of corrections: for registering image pairs, for unwanted photobleaching of the donor, for cross-talk of the acceptor and/or its photoproduct into the donor channel, and for partial photobleaching of the acceptor. The program efficiently speeds up the analysis of large image sets even for novice users and is freely available.
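
    In acceptor photobleaching FRET, the pixel-wise efficiency follows from the donor intensity before and after bleaching the acceptor, E = 1 - I_pre/I_post, since the donor is dequenched by the bleach. A minimal sketch of this uncorrected core calculation (the plugin's registration, photobleaching and cross-talk corrections are omitted):

    ```python
    import numpy as np

    def fret_efficiency(donor_pre, donor_post, eps=1e-6):
        """Pixel-wise acceptor photobleaching FRET efficiency, E = 1 - pre/post.

        Donor fluorescence is dequenched (increases) after the acceptor is
        bleached, so E falls in [0, 1) where energy transfer occurred.
        """
        donor_post = np.maximum(donor_post.astype(float), eps)  # avoid division by zero
        return np.clip(1.0 - donor_pre.astype(float) / donor_post, 0.0, 1.0)

    # Toy 2x2 image pair: donor brightens by 25% after bleaching -> E = 0.2
    pre = np.full((2, 2), 80.0)
    post = np.full((2, 2), 100.0)
    print(fret_efficiency(pre, post))
    ```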

  13. Dynamics of structures '89. Vol. 3

    International Nuclear Information System (INIS)

    The proceedings, comprising 3 volumes published by the Plzen Centre of the Czechoslovak Society for Science and Technology (Vol. 1 and 2) and by Skoda Works in Plzen (Vol. 3), contain 107 papers, out of which 8 fall within the INIS Subject Scope; these deal with problems related to the earthquake resistance of nuclear power plants. Attention is paid to the evaluation of seismic characteristics of nuclear power plant equipment, to the equipment testing and to calculations of its dynamic characteristics under simulated seismic stress. (Z.M.)

  14. Crisis Communication (Handbooks of Communication Science Vol. 23)

    DEFF Research Database (Denmark)

    Vol. 23 - The Handbook of Communication Science. General editors: Peter J. Schultz and Paul Cobley.

  15. Carotid artery stenosis quantification: Concordance analysis between radiologist and semi-automatic computer software by using Multi-Detector-Row CT angiography

    Energy Technology Data Exchange (ETDEWEB)

    Saba, Luca, E-mail: lucasaba@tiscali.it [Department of Radiology, Policlinico Universitario, A.O.U. Cagliari, s.s. 554 Monserrato, Cagliari 09045 (Italy); Department of Vascular Surgery, Policlinico Universitario, A.O.U. Cagliari, s.s. 554 Monserrato, Cagliari 09045 (Italy); Sanfilippo, Roberto; Montisci, Roberto [Department of Vascular Surgery, Policlinico Universitario, A.O.U. Cagliari, s.s. 554 Monserrato, Cagliari 09045 (Italy); Calleo, Giancarlo [Institute of Radiology, Ospedale San Giovanni di Dio, A.O.U. Cagliari 46 Hospital Street, 09126 Cagliari (Italy); Mallarini, Giorgio [Department of Radiology, Policlinico Universitario, A.O.U. Cagliari, s.s. 554 Monserrato, Cagliari 09045 (Italy); Institute of Radiology, Ospedale San Giovanni di Dio, A.O.U. Cagliari 46 Hospital Street, 09126 Cagliari (Italy)

    2011-07-15

    Purpose: Carotid artery stenosis quantification is still considered a leading parameter in the choice of the therapeutic option. Our purpose was to assess the concordance between a radiologist and a semi-automatic computer software in the stenosis quantification of carotid arteries studied using Multi-Detector-Row CT angiography (MDCTA). Methods and material: 45 patients studied using a 40-detector-row CT scanner were retrospectively analyzed. Carotid artery stenosis was quantified by one radiologist highly experienced in vessel analysis and by using a dedicated software. Carotid artery stenosis was calculated according to the ECST method. Bland-Altman statistics were used to measure the inter- and intra-rater concordance between radiologist and software, and correlation between measures was assessed using the nonparametric Spearman correlation statistic. A p value < 0.05 was considered to indicate statistical significance. Results: A strong correlation (Spearman's ρ = 0.975; p < 0.0001) between the radiologist and the vessel-analysis software was observed. Between the first and second stenosis quantifications performed by the radiologist and by the vessel-analysis software, we observed Spearman's ρ = 0.943 (p < 0.0001) and ρ = 0.9879 (p < 0.0001), respectively. Conclusions: Our results indicated a strong correlation between the stenosis quantifications performed by the radiologist and by the semi-automatic software. Reproducibility of measurements performed by the semi-automatic software is higher than that of the radiologist.
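
    Concordance of paired stenosis measurements of this kind is conventionally assessed with a Spearman correlation plus Bland-Altman bias and limits of agreement. A minimal sketch with simulated paired measurements (the data are hypothetical, not the study's):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(7)
    # Hypothetical paired ECST stenosis measurements (%), radiologist vs. software
    radiologist = rng.uniform(20, 90, 45)
    software = radiologist + rng.normal(0, 3, 45)

    rho, p = spearmanr(radiologist, software)
    print(f"Spearman rho = {rho:.3f}, p = {p:.2g}")

    # Bland-Altman statistics: bias and 95% limits of agreement
    diff = software - radiologist
    bias, sd = diff.mean(), diff.std(ddof=1)
    print(f"bias = {bias:.2f}%, limits of agreement = "
          f"[{bias - 1.96 * sd:.2f}, {bias + 1.96 * sd:.2f}]%")
    ```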

  16. Characterization of Tilia species by gas-chromatographic profiles of the volatile components extracted from the vapour in equilibrium with the plant material ("headspace analysis")

    OpenAIRE

    Sala, Gabriela; Mandrile, Eloy L.; Cafferata, Lázaro F.R.

    1992-01-01

    The volatile components of the bracts and flowers of 3 Tilia (linden) species were studied by gas chromatography, using a column packed with "Porapak Q", in order to contribute to their chemotaxonomic characterization. The relative efficiencies of the volatile-compound extraction methods were compared: steam distillation, distillation under reduced pressure, and static sampling of the chamber occupied by the vapour in equilibrium with the material at 80 °C ("Hea...

  17. How to Achieve Automatic Analysis of Test Scores in EXCEL

    Institute of Scientific and Technical Information of China (English)

    王方云

    2012-01-01

    School teaching administrators require teachers to compile statistics each semester on the final-examination scores of the subjects they teach and to produce a score histogram. Done by hand, this work is tedious, inaccurate and error-prone, whereas EXCEL can perform the automatic analysis of test scores precisely and generate the score histogram. This article describes in detail how to implement automatic analysis of test scores in EXCEL.
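
    In EXCEL this is done with worksheet functions (AVERAGE, COUNTIF, FREQUENCY) and a chart; the sketch below reproduces the same score statistics and histogram binning in Python for illustration (the scores and bin edges are hypothetical):

    ```python
    import pandas as pd

    # Hypothetical final-examination scores for one class
    scores = pd.Series([56, 62, 71, 75, 78, 81, 84, 88, 90, 95], name="score")

    print(scores.describe())                    # count, mean, std, min, quartiles, max
    print("pass rate:", (scores >= 60).mean())  # like COUNTIF(range,">=60")/COUNT(range)

    # Histogram bins corresponding to a FREQUENCY() table in EXCEL
    bins = [0, 60, 70, 80, 90, 100]
    print(pd.cut(scores, bins=bins, right=False).value_counts().sort_index())
    ```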

  18. Soil-structure interaction Vol.3. Influence of ground water

    International Nuclear Information System (INIS)

    This study has been performed for the Nuclear Regulatory Commission (NRC) by the Structural Analysis Division of Brookhaven National Laboratory (BNL). The study was conducted during fiscal year 1985 on the program entitled 'Benchmarking of Structural Engineering Problems' sponsored by the NRC. The program considered three separate but complementary problems, each associated with the soil-structure interaction (SSI) phase of the seismic response analysis of nuclear plant facilities. The reports, all entitled Soil-Structure Interaction, are presented in three separate volumes, namely: Vol. 1, Influence of Layering, by A.J. Philippacopoulos; Vol. 2, Influence of Lift-Off, by C.A. Miller; Vol. 3, Influence of Ground Water, by C.J. Costantino. The two problems presented in Volumes 2 and 3 were conducted at the City University of New York (CUNY) under subcontract to BNL. This report, Volume 3, presents a summary of the first year's effort on the subject of the influence of foundation ground water on the SSI phenomenon. A finite element computer program was developed for the two-phased formulation of the combined soil-water problem. This formulation is based on the Biot dynamic equations of motion for both the solid and fluid phases of a typical soil. Frequency-dependent interaction coefficients were generated for the two-dimensional plane problem of a rigid surface footing moving against a saturated linear soil. The results indicate that the interaction coefficients are significantly modified as compared to the comparable values for a dry soil, particularly for the rocking mode of response. Calculations were made to study the impact of the modified interaction coefficients on the response of a typical nuclear reactor building. The amplification factors for a stick model placed atop a dry and a saturated soil were computed. It was found that pore water caused the rocking response to decrease and the translational response to increase over the frequency range of interest, as

  19. Proceedings of the third arab conference on the peaceful uses of atomic energy, vol.a,b

    International Nuclear Information System (INIS)

    The publication has been set up as a textbook for peaceful uses of atomic energy. Vol. A: (1) reactors, materials, energy; (2) nuclear raw materials; (3) radiocesium waste; (4) nuclear safety; (5) nuclear physics; (6) radiochemistry; (7) radiobiology. Vol. B: (1) nuclear medicine; (2) agriculture and soil science; (3) isotope hydrology; (4) food preservation; (5) insect eradication; (6) industrial applications; (7) nuclear activation analysis; (8) health physics and environmental studies

  20. A prostate CAD system based on multiparametric analysis of DCE T1-w, and DW automatically registered images

    Science.gov (United States)

    Giannini, Valentina; Vignati, Anna; Mazzetti, Simone; De Luca, Massimo; Bracco, Christian; Stasi, Michele; Russo, Filippo; Armando, Enrico; Regge, Daniele

    2013-02-01

    Prostate specific antigen (PSA)-based screening reduces the rate of death from prostate cancer (PCa) by 31%, but this benefit is associated with a high risk of overdiagnosis and overtreatment. As prostate transrectal ultrasound-guided biopsy, the standard procedure for prostate histological sampling, has a sensitivity of 77% with a considerable false-negative rate, more accurate methods need to be found to detect or rule out significant disease. Prostate magnetic resonance imaging has the potential to improve the specificity of PSA-based screening scenarios as a non-invasive detection tool, in particular by exploiting the combination of anatomical and functional information in a multiparametric framework. The purpose of this study was to describe a computer aided diagnosis (CAD) method that automatically produces a malignancy likelihood map by combining information from dynamic contrast-enhanced MR images and diffusion-weighted images. The CAD system consists of multiple sequential stages, from a preliminary registration of images of different sequences, in order to correct for susceptibility deformation and/or movement artifacts, to a Bayesian classifier, which fuses all the extracted features into a probability map. The promising results (AUROC = 0.87) should be validated on a larger dataset, but they suggest that discrimination on a voxel basis between benign and malignant tissues is feasible with good performance. This method can be of benefit in improving the diagnostic accuracy of the radiologist, reducing reader variability and speeding up reading time, by automatically highlighting probable cancer-suspicious regions.
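
    The final stage fuses the per-voxel features into a malignancy probability map with a Bayesian classifier. A minimal sketch of that idea using a Gaussian naive Bayes model as an illustrative stand-in (the features, training values and image are synthetic, not the authors' classifier):

    ```python
    import numpy as np
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(3)
    # Hypothetical training voxels with two features (e.g. a DCE- and a DW-derived value)
    X_benign = rng.normal([1.0, 1.2], 0.2, (500, 2))
    X_malign = rng.normal([1.6, 0.7], 0.2, (500, 2))
    X = np.vstack([X_benign, X_malign])
    y = np.array([0] * 500 + [1] * 500)

    clf = GaussianNB().fit(X, y)

    # Apply voxel-wise to a registered two-channel "image" -> malignancy likelihood map
    image = rng.normal(1.3, 0.4, (4, 4, 2))
    prob_map = clf.predict_proba(image.reshape(-1, 2))[:, 1].reshape(4, 4)
    print(prob_map.round(2))
    ```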

  1. Automatic extraction of corpus callosum from midsagittal head MR image and examination of Alzheimer-type dementia objective diagnostic system in feature analysis

    International Nuclear Information System (INIS)

    We studied the objective diagnosis of Alzheimer-type dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 40 Alzheimer-type dementia patients (15 men and 25 women; mean age, 75.4±5.5 years) and 31 healthy elderly persons (10 men and 21 women; mean age, 73.4±7.5 years), 71 subjects altogether. First, the corpus callosum was automatically extracted from the midsagittal head MR images. Next, Alzheimer-type dementia patients were compared with the healthy elderly individuals using shape-factor features and six Co-occurrence Matrix features of the corpus callosum. Automatic extraction of the corpus callosum succeeded in 64 of the 71 individuals, for an extraction rate of 90.1%. A statistically significant difference was found in 7 of the 9 features between Alzheimer-type dementia patients and the healthy elderly adults. Discriminant analysis using the 7 features demonstrated a sensitivity of 82.4%, specificity of 89.3%, and overall accuracy of 85.5%. These results indicated the possibility of an objective diagnostic system for Alzheimer-type dementia using feature analysis based on changes in the corpus callosum. (author)
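
    Co-occurrence Matrix texture features of the kind used here can be computed with scikit-image's grey-level co-occurrence matrix (GLCM) functions; a minimal sketch on a toy 8-bit region (the region and the particular feature set are illustrative):

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(5)
    # Random stand-in for the extracted corpus callosum region
    region = rng.integers(0, 256, (64, 64), dtype=np.uint8)

    # Normalized GLCM for a single one-pixel horizontal offset
    glcm = graycomatrix(region, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)

    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, graycoprops(glcm, prop)[0, 0])
    ```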

  2. Development of automatic extraction of the corpus callosum from magnetic resonance imaging of the head and examination of the early dementia objective diagnostic technique in feature analysis

    International Nuclear Information System (INIS)

    We examined the objective diagnosis of dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 17 early dementia patients (2 men and 15 women; mean age, 77.2±3.3 years) and 18 healthy elderly controls (2 men and 16 women; mean age, 73.8±6.5 years), 35 subjects altogether. First, the corpus callosum was automatically extracted from the MR images. Next, early dementia was compared with the healthy elderly individuals using 5 features of the straight-line methods, 5 features of the Run-Length Matrix, and 6 features of the Co-occurrence Matrix from the corpus callosum. Automatic extraction of the corpus callosum showed an accuracy rate of 84.1±3.7%. A statistically significant difference was found in 6 of the 16 features between early dementia patients and healthy elderly controls. Discriminant analysis using the 6 features demonstrated a sensitivity of 88.2% and specificity of 77.8%, with an overall accuracy of 82.9%. These results indicate that feature analysis based on changes in the corpus callosum can be used as an objective diagnostic technique for early dementia. (author)

  3. Improved automatic steam distillation combined with oscillation-type densimetry for determining alcoholic strength in spirits and liqueurs.

    Science.gov (United States)

    Lachenmeier, Dirk W; Plato, Leander; Suessmann, Manuela; Di Carmine, Matthew; Krueger, Bjoern; Kukuck, Armin; Kranz, Markus

    2015-01-01

    The determination of the alcoholic strength in spirits and liqueurs is required to control the labelling of alcoholic beverages. The reference methodology prescribes a distillation step followed by densimetric measurement. The classic distillation using a Vigreux rectifying column and a West condenser is time-consuming and error-prone, especially for liqueurs, which may have problems with entrainment and charring. For this reason, this methodology suggests the use of an automated steam distillation device as an alternative. Compared with previously available devices, the novel instrument offers increased steam power, a redesigned condenser geometry and a larger cooling coil with controllable flow. Method optimization applying D-optimal and central composite designs showed a significant influence of sample volume, distillation time and coolant flow, while other investigated parameters such as steam power, receiver volume, or the use of pipettes or flasks for sample measurement did not significantly influence the results. The method validation was conducted using the following settings: steam power 70%, sample volume 25 mL transferred using pipettes, receiver volume 50 mL, coolant flow 7 L/min, and distillation time as long as possible, just below the calibration mark. For four different liqueurs covering the typical range of these products between 15 and 35% vol, the method showed adequate precision, with relative standard deviations below 0.4% (intraday) and below 0.6% (interday). The absolute standard deviations were between 0.06% vol and 0.08% vol (intraday) and between 0.07% vol and 0.10% vol (interday). The improved automatic steam distillation devices offer an excellent alternative for sample cleanup of volatiles from complex matrices. A major advantage is the low cost of consumables per analysis (only distilled water is needed). For alcoholic strength determination, the method has become more rugged than before, and there are only

  4. An Analysis of an Automatic Coolant Bypass in the International Space Station Node 2 Internal Active Thermal Control System

    Science.gov (United States)

    Clanton, Stephen E.; Holt, James M.; Turner, Larry D. (Technical Monitor)

    2001-01-01

    A challenging part of International Space Station (ISS) thermal control design is the ability to incorporate design changes into an integrated system without negatively impacting performance. The challenge presents itself in that the typical ISS Internal Active Thermal Control System (IATCS) consists of an integrated hardware/software system that provides active coolant resources to a variety of users. Software algorithms control the IATCS to specific temperatures, flow rates, and pressure differentials in order to meet the user-defined requirements. What may seem to be small design changes imposed on the system may in fact result in system instability or the temporary inability to meet user requirements. The purpose of this paper is to provide a brief description of the solution process and analyses used to implement one such design change that required the incorporation of an automatic coolant bypass in the ISS Node 2 element.

  5. An Automatic and Dynamic Approach for Personalized Recommendation of Learning Objects Considering Students Learning Styles: An Experimental Analysis

    Directory of Open Access Journals (Sweden)

    Fabiano A. DORÇA

    2016-04-01

    Content personalization in educational systems is a growing research area. Studies show that students tend to perform better when the content is customized to their preferences. One important aspect of students' particularities is how they prefer to learn. In this context, students' learning styles should be considered, owing to the importance of this feature to the adaptivity process in such systems. Thus, this work presents an efficient approach for personalizing the teaching process based on learning styles. Our approach is based on an expert system that implements a set of rules which classify learning objects according to their teaching style, and then automatically filters learning objects according to students' learning styles. The best-adapted learning objects are ranked and recommended to the student. Preliminary experiments suggest promising results.
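
    The filter-and-rank step can be pictured as scoring each learning object by how closely its teaching-style profile matches the student's learning-style profile. A minimal sketch under that assumption (the four style dimensions follow the common Felder-Silverman model; the scoring rule and profiles are hypothetical):

    ```python
    # Styles encoded on four dimensions, each in [-1, 1]
    # (active/reflective, sensing/intuitive, visual/verbal, sequential/global)
    def match_score(student_style, object_style):
        """Higher when the object's teaching style matches the student's preferences."""
        return -sum(abs(s - o) for s, o in zip(student_style, object_style))

    student = (0.8, -0.2, 0.6, 0.1)
    objects = {
        "video_lecture":  (0.2, -0.1, 0.9, 0.0),
        "worked_example": (0.9,  0.3, 0.1, 0.4),
        "reading_text":   (-0.7, -0.5, -0.8, -0.2),
    }

    ranked = sorted(objects, key=lambda name: match_score(student, objects[name]),
                    reverse=True)
    print(ranked)  # recommended order, best match first
    ```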

  6. Automatic extraction analysis of the anatomical functional area for normal brain 18F-FDG PET imaging

    International Nuclear Information System (INIS)

    Using self-designed software for automatic extraction of brain functional areas, the grey-scale distribution of 18F-FDG imaging and the relationship between the 18F-FDG accumulation of each brain anatomic functional area and the 18F-FDG injected dose, the glucose level, the age, etc., were studied. According to the Talairach coordinate system, after rotation, translation and elastic deformation, the 18F-FDG PET images were registered to the Talairach coordinate atlas, and then the ratios of the average grey value of each brain anatomic functional area to that of the whole brain were calculated. Furthermore, the relationships between the 18F-FDG accumulation of each brain anatomic functional area and the 18F-FDG injected dose, the glucose level and the age were tested using a multiple stepwise regression model. After image registration, smoothing and extraction, the main cerebral cortical areas of the 18F-FDG PET brain images could be successfully localized and extracted, including the frontal lobe, parietal lobe, occipital lobe, temporal lobe, cerebellum, ventricles, thalamus and hippocampus. The average ratios to the internal reference for the brain anatomic functional areas were 1.01 ± 0.15. By multiple stepwise regression, with the exception of the thalamus and hippocampus, the grey scale of all brain functional areas was negatively correlated with age, but showed no correlation with blood glucose or injected dose in any area. For 18F-FDG PET imaging, the brain functional area extraction program could automatically delineate most of the cerebral cortical areas and usefully support brain blood flow and metabolism studies, but extraction of more detailed areas needs further investigation
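
    The reported quantity, the ratio of each functional area's average grey value to the whole-brain average, is direct to compute once the volume is registered to the atlas. A minimal numpy sketch with hypothetical VOI masks standing in for atlas-defined areas:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)
    pet = rng.gamma(4.0, 1.0, (32, 32, 32))   # stand-in for a registered FDG-PET volume
    brain = np.ones(pet.shape, dtype=bool)    # whole-brain mask (trivial here)

    vois = {"frontal": np.zeros(pet.shape, dtype=bool),
            "thalamus": np.zeros(pet.shape, dtype=bool)}  # hypothetical atlas VOIs
    vois["frontal"][20:30, 8:24, 8:24] = True
    vois["thalamus"][14:18, 14:18, 14:18] = True

    whole_brain_mean = pet[brain].mean()
    for name, mask in vois.items():
        print(f"{name}: ratio to whole brain = {pet[mask].mean() / whole_brain_mean:.2f}")
    ```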

  7. Quantitative Analysis of Heavy Metals in Water Based on LIBS with an Automatic Device for Sample Preparation

    Science.gov (United States)

    Hu, Li; Zhao, Nanjing; Liu, Wenqing; Meng, Deshuo; Fang, Li; Wang, Yin; Yu, Yang; Ma, Mingjun

    2015-08-01

    Heavy metals in water can be deposited on graphite flakes, which can be used as an enrichment method for laser-induced breakdown spectroscopy (LIBS); this approach is studied in this paper. The graphite samples were prepared with an automatic device composed of a loading and unloading module, a quantitative solution-adding module, a rapid heating and drying module and a precise rotating module. The experimental results showed that the sample preparation method had no significant effect on sample distribution and that the LIBS signal accumulated over 20 pulses was stable and repeatable. With an increasing amount of sample solution on the graphite flake, the peak intensity at Cu I 324.75 nm followed an exponential function with a correlation coefficient of 0.9963, while the background intensity remained unchanged. The limit of detection (LOD) was calculated through linear fitting of the peak intensity versus the concentration. The LOD decreased rapidly with an increasing amount of sample solution until the amount exceeded 20 mL, and the correlation coefficient of the exponential function fit was 0.991. The LODs of Pb, Ni, Cd, Cr and Zn after evaporating different amounts of sample solution on the graphite flakes were measured, and the variation of their LODs with sample solution amount was similar to the tendency for Cu. The experimental data and conclusions could provide a reference for automatic sample preparation and in situ detection of heavy metals. Supported by National Natural Science Foundation of China (No. 60908018), National High Technology Research and Development Program of China (No. 2013AA065502) and Anhui Province Outstanding Youth Science Fund of China (No. 1108085J19)
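
    A LOD from a calibration line is conventionally taken as 3σ/k, with σ the standard deviation of the blank (background) signal and k the fitted slope of peak intensity versus concentration. A minimal sketch with hypothetical calibration data:

    ```python
    import numpy as np

    # Hypothetical calibration: concentrations (mg/L) vs. Cu I 324.75 nm peak intensities
    conc = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
    intensity = np.array([120.0, 230.0, 460.0, 1110.0, 2240.0])
    sigma_background = 15.0  # standard deviation of the blank signal

    slope, intercept = np.polyfit(conc, intensity, 1)
    lod = 3.0 * sigma_background / slope
    print(f"slope = {slope:.1f} counts per mg/L, LOD = {lod:.3f} mg/L")
    ```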

  8. A fully-automatic caudate nucleus segmentation of brain MRI: Application in volumetric analysis of pediatric attention-deficit/hyperactivity disorder

    Directory of Open Access Journals (Sweden)

    Igual Laura

    2011-12-01

    Background: Accurate automatic segmentation of the caudate nucleus in magnetic resonance images (MRI) of the brain is of great interest in the analysis of developmental disorders. Segmentation methods based on a single atlas or on multiple atlases have been shown to suitably localize the caudate structure. However, the atlas prior information may not represent the structure of interest correctly. It may therefore be useful to introduce a more flexible technique for accurate segmentations. Method: We present CaudateCut: a new fully-automatic method of segmenting the caudate nucleus in MRI. CaudateCut combines an atlas-based segmentation strategy with the Graph Cut energy-minimization framework. We adapt the Graph Cut model to make it suitable for segmenting small, low-contrast structures, such as the caudate nucleus, by defining new energy-function data and boundary potentials. In particular, we exploit information concerning intensity and geometry, and we add supervised energies based on contextual brain structures. Furthermore, we reinforce boundary detection using a new multi-scale edgeness measure. Results: We apply the novel CaudateCut method to the segmentation of the caudate nucleus in a new set of 39 pediatric attention-deficit/hyperactivity disorder (ADHD) patients and 40 control children, as well as to a public database of 18 subjects. We evaluate the quality of the segmentation using several volumetric and voxel-by-voxel measures. Our results show improved segmentation performance compared to state-of-the-art approaches, obtaining a mean overlap of 80.75%. Moreover, we present a quantitative volumetric analysis of caudate abnormalities in pediatric ADHD, the results of which show strong correlation with expert manual analysis. Conclusion: CaudateCut generates segmentation results that are comparable to gold-standard segmentations and which are reliable in the analysis of differentiating neuroanatomical abnormalities
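
    Graph Cut segmentation minimizes an energy whose unary (region) terms become capacities to the source/sink terminals and whose pairwise (boundary) terms become capacities between neighbouring pixels; the minimum s-t cut then yields the labeling. A tiny 1-D illustration with networkx, using a generic quadratic energy rather than the CaudateCut potentials:

    ```python
    import networkx as nx
    import numpy as np

    intensities = np.array([0.1, 0.2, 0.15, 0.8, 0.9, 0.85])  # toy 1-D "image"
    mu_bg, mu_fg, lam = 0.1, 0.9, 0.2  # class means and boundary weight

    G = nx.DiGraph()
    for i, v in enumerate(intensities):
        G.add_edge("s", i, capacity=(v - mu_bg) ** 2)  # paid if pixel i ends up background
        G.add_edge(i, "t", capacity=(v - mu_fg) ** 2)  # paid if pixel i ends up foreground
    for i in range(len(intensities) - 1):              # pairwise boundary terms
        G.add_edge(i, i + 1, capacity=lam)
        G.add_edge(i + 1, i, capacity=lam)

    cut_value, (source_side, _) = nx.minimum_cut(G, "s", "t")
    labels = [1 if i in source_side else 0 for i in range(len(intensities))]
    print(f"energy = {cut_value:.3f}, labels = {labels}")  # -> [0, 0, 0, 1, 1, 1]
    ```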

  9. Z physics at LEP 1. Vol. 1

    International Nuclear Information System (INIS)

    The contents of this final report from the Workshop on Z Physics at LEP can be divided into two parts. The first part, comprising Vols. 1 and 2, is a relatively concise but fairly complete handbook on the physics of e+e- annihilation near the Z peak (with normal LEP luminosity and unpolarized beams, appropriate for the first phase of LEP operation). The second part (Vol. 3) is devoted to a review of the existing Monte Carlo event generators for LEP physics. A special effort has been made to co-ordinate the different parts of this report, with the aim of achieving a systematic and balanced review of the subject, rather than having simply a collection of separate contributions. (orig.)

  10. Z Physics at LEP 1. Vol. 3

    International Nuclear Information System (INIS)

    The contents of this final report from the Workshop on Z Physics at LEP can be divided into two parts. The first part, comprising Vols. 1 and 2, is a relatively concise but fairly complete handbook on the physics of e+e- annihilation near the Z peak (with normal LEP luminosity and unpolarized beams, appropriate for the first phase of LEP operation). The second part (Vol. 3) is devoted to a review of the existing Monte Carlo event generators for LEP physics. A special effort has been made to co-ordinate the different parts of this report, with the aim of achieving a systematic and balanced review of the subject, rather than having simply a collection of separate contributions. (orig.)

  11. Z physics at LEP 1. Vol. 2

    International Nuclear Information System (INIS)

    The contents of this final report from the Workshop on Z Physics at LEP can be divided into two parts. The first part, comprising Vols. 1 and 2, is a relatively concise but fairly complete handbook on the physics of e+e- annihilation near the Z peak (with normal LEP luminosity and unpolarized beams, appropriate for the first phase of LEP operation). The second part (Vol. 3) is devoted to a review of the existing Monte Carlo event generators for LEP physics. A special effort has been made to co-ordinate the different parts of this report, with the aim of achieving a systematic and balanced review of the subject, rather than having simply a collection of separate contributions. (orig.)

  12. Automatic Performance Debugging of SPMD Parallel Programs

    CERN Document Server

    Liu, Xu; Zhan, Jianfeng; Tu, Bibo; Meng, Dan

    2010-01-01

    Automatic performance debugging of parallel applications usually involves two steps: automatic detection of performance bottlenecks and uncovering their root causes for performance optimization. Previous work fails to resolve this challenging issue in several ways: first, several previous efforts automate analysis processes, but present the results in a confined way that only identifies performance problems with a priori knowledge; second, several tools take exploratory or confirmatory data analysis to automatically discover relevant performance data relationships. However, these efforts do not focus on locating performance bottlenecks or uncovering their root causes. In this paper, we design and implement an innovative system, AutoAnalyzer, to automatically debug the performance problems of single program multi-data (SPMD) parallel programs. Our system is unique in terms of two dimensions: first, without any a priori knowledge, we automatically locate bottlenecks and uncover their root causes for performance o...

  13. Large hadron collider workshop. Proceedings. Vol. 3

    International Nuclear Information System (INIS)

    The aim of the LHC workshop at Aachen was to discuss the 'discovery potential' of a high-luminosity hadron collider (the Large Hadron Collider) and to define the requirements of the detectors. Of central interest was whether a Higgs particle with mass below 1 TeV could be seen using detectors potentially available within a few years from now. Other topics included supersymmetry, heavy quarks, excited gauge bosons, and exotica in proton-proton collisions, as well as physics to be observed in electron-proton and heavy-ion collisions. A large part of the workshop was devoted to the discussion of instrumental and detector concepts, including simulation, signal processing, data acquisition, tracking, calorimetry, lepton identification and radiation hardness. The workshop began with parallel sessions of working groups on physics and instrumentation and continued, in the second half, with plenary talks giving overviews of the LHC project and the SSC, RHIC, and HERA programmes, summaries of the working groups, presentations from industry, and conclusions. Vol. 1 of these proceedings contains the papers presented at the plenary sessions, Vol. 2 the individual contributions to the physics sessions, and Vol. 3 those to the instrumentation sessions. (orig.)

  14. Large hadron collider workshop. Proceedings. Vol. 1

    International Nuclear Information System (INIS)

    The aim of the LHC workshop at Aachen was to discuss the 'discovery potential' of a high-luminosity hadron collider (the Large Hadron Collider) and to define the requirements of the detectors. Of central interest was whether a Higgs particle with mass below 1 TeV could be seen using detectors potentially available within a few years from now. Other topics included supersymmetry, heavy quarks, excited gauge bosons, and exotica in proton-proton collisions, as well as physics to be observed in electron-proton and heavy-ion collisions. A large part of the workshop was devoted to the discussion of instrumental and detector concepts, including simulation, signal processing, data acquisition, tracking, calorimetry, lepton identification and radiation hardness. The workshop began with parallel sessions of working groups on physics and instrumentation and continued, in the second half, with plenary talks giving overviews of the LHC project and the SSC, RHIC, and HERA programmes, summaries of the working groups, presentations from industry, and conclusions. Vol. 1 of these proceedings contains the papers presented at the plenary sessions, Vol. 2 the individual contributions to the physics sessions, and Vol. 3 those to the instrumentation sessions. (orig.)

  15. Large hadron collider workshop. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    The aim of the LHC workshop at Aachen was to discuss the 'discovery potential' of a high-luminosity hadron collider (the Large Hadron Collider) and to define the requirements of the detectors. Of central interest was whether a Higgs particle with mass below 1 TeV could be seen using detectors potentially available within a few years from now. Other topics included supersymmetry, heavy quarks, excited gauge bosons, and exotica in proton-proton collisions, as well as physics to be observed in electron-proton and heavy-ion collisions. A large part of the workshop was devoted to the discussion of instrumental and detector concepts, including simulation, signal processing, data acquisition, tracking, calorimetry, lepton identification and radiation hardness. The workshop began with parallel sessions of working groups on physics and instrumentation and continued, in the second half, with plenary talks giving overviews of the LHC project and the SSC, RHIC, and HERA programmes, summaries of the working groups, presentations from industry, and conclusions. Vol.1 of these proceedings contains the papers presented at the plenary sessions, Vol.2 the individual contributions to the physics sessions, and Vol.3 those to the instrumentation sessions. (orig.)

  16. Contribution to automatic image recognition. Application to analysis of plain scenes of overlapping parts in robot technology

    International Nuclear Information System (INIS)

    A method for object modeling and automatic recognition of overlapping objects is presented. Our work is composed of three essential parts: image processing, object modeling, and evaluation of the implementation of the stated concepts. In the first part, we present a method of edge encoding which is based on a re-sampling of the data encoded according to Freeman; this method generates an isotropic, homogeneous and very precise representation. The second part relates to object modeling. This important step makes the recognition work much easier. The new method proposed characterizes a model with two groups of information: the description group containing the primitives, and the discrimination group containing data packs called 'transition vectors'. Based on this original method of information organization, a 'relative learning' scheme can select, ignore and update the information concerning objects already learned, according to the new information to be included in the database. The recognition is a two-pass process: the first pass determines very efficiently the presence of objects by making use of each object's particularities, and this hypothesis is either confirmed or rejected by the following fine verification pass. The last part describes the experimental results in detail. We demonstrate the robustness of the algorithms on images under both poor lighting and object-overlap conditions. The system, named SOFIA, has been installed in an industrial vision system series and works in real time. (author)
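
    Freeman encoding represents a contour as a sequence of direction codes (0-7) between successive 8-connected edge points; the re-sampling described above operates on top of such a code. A minimal sketch of the plain Freeman coder (the contour is a toy example):

    ```python
    # Direction codes: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE
    DIRECTIONS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
                  (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

    def freeman_code(points):
        """Chain-code a contour given as successive 8-connected (x, y) points."""
        codes = []
        for (x0, y0), (x1, y1) in zip(points, points[1:]):
            codes.append(DIRECTIONS[(x1 - x0, y1 - y0)])
        return codes

    # Small square contour traversed counter-clockwise
    contour = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1), (0, 0)]
    print(freeman_code(contour))  # [0, 0, 2, 2, 4, 4, 6, 6]
    ```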

  17. AUTOMATIC CLASSIFICATION OF X-RATED VIDEOS USING OBSCENE SOUND ANALYSIS BASED ON A REPEATED CURVE-LIKE SPECTRUM FEATURE

    Directory of Open Access Journals (Sweden)

    JaeDeok Lim

    2011-11-01

    This paper addresses the automatic classification of X-rated videos by analyzing their obscene sounds. In this paper, obscene sounds refer to audio signals generated from sexual moans and screams during sexual scenes. By analyzing various sound samples, we determined the distinguishable characteristics of obscene sounds and propose a repeated curve-like spectrum feature that represents the characteristics of such sounds. We constructed 6,269 audio clips to evaluate the proposed feature, and separately constructed 1,200 X-rated and general videos for classification. The proposed feature has an F1-score, precision, and recall rate of 96.6%, 98.2%, and 95.2%, respectively, for the original dataset, and 92.6%, 97.6%, and 88.0% for a noisy dataset of 5 dB SNR. In classifying videos, the feature achieves more than a 90% F1-score, 97% precision, and an 84% recall rate. From the measured performance, X-rated videos can be classified with the audio features alone, and the repeated curve-like spectrum feature is suitable for detecting obscene sounds.
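
    The proposed feature targets spectra whose shape repeats over time. As a crude stand-in for that idea, the sketch below computes a spectrogram and estimates the repetition period of its energy envelope from an autocorrelation (the signal and the periodicity measure are illustrative, not the paper's feature):

    ```python
    import numpy as np
    from scipy.signal import spectrogram

    fs = 16000
    t = np.arange(0, 2.0, 1 / fs)
    # Toy signal: a 300 Hz tone switched on and off every 0.25 s (0.5 s repetition period)
    x = (0.5 + 0.5 * np.sign(np.sin(2 * np.pi * 2 * t))) * np.sin(2 * np.pi * 300 * t)

    f, times, S = spectrogram(x, fs=fs, nperseg=512)
    envelope = S.sum(axis=0) - S.sum(axis=0).mean()  # mean-removed energy per frame

    # Autocorrelation of the envelope; the first peak after the zero crossing
    # estimates the repetition period of the spectral energy pattern
    ac = np.correlate(envelope, envelope, mode="full")[len(envelope) - 1:]
    first_neg = np.argmax(ac < 0)
    lag = first_neg + np.argmax(ac[first_neg:])
    print(f"estimated repetition period: {times[lag] - times[0]:.2f} s")
    ```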

  18. Development of an automatic scanning system for nuclear emulsion analysis in the OPERA experiment and study of neutrino interactions location

    International Nuclear Information System (INIS)

    Following the Super-Kamiokande and K2K experiments, Opera (Oscillation Project with Emulsion tracking Apparatus) aims to confirm neutrino oscillation in the atmospheric sector. Taking advantage of a technique already employed in Chorus and in Donut, the Emulsion Cloud Chamber (ECC), Opera will be able to observe the νμ → ντ oscillation through the ντ appearance in a pure νμ beam. The Opera experiment, with its ∼100000 m2 of nuclear emulsions, needs a very fast automatic scanning system. Optical and mechanical components have been customized in order to achieve a speed of about 20 cm2/hour per emulsion layer (44 μm thick), while keeping sub-micrometric resolution. The first part of this thesis was dedicated to the optimization of the 4 scanning systems at the French scanning station, based in Lyon. An experimental study of a dry-objective scanning system has also been carried out. The results obtained show that the performance of dry scanning is similar to that of the traditional oil scanning, so that it can be successfully used for Opera. The second part of this work was devoted to the study of the neutrino interaction location and reconstruction strategy currently used in Opera. A dedicated test beam was run at CERN in order to simulate Opera conditions. The results obtained definitely confirm that the proposed strategy is well adapted to the tau search. (author)

  19. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    Science.gov (United States)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

    Recently, GPS has become more and more widely applied to open-pit mine slope safety monitoring. The Daye Iron Mine open-pit high-steep slope automatic monitoring system mainly consists of three modules, namely a GPS data processing module, a monitoring and warning module, and an emergency plans module. According to the rock mass structural features and the side slope stability evaluation, seven GPS deformation monitoring points are arranged on the scarp of Fault F9 at Daye Iron Mine, and observation is carried out with single-frequency static GPS receivers combined with data-transmission radio. The data processing mainly uses a three-transect interpolation method to solve the problems of discontinuity and validity in the data series. Based on the displacement monitoring data from 1990 to 1996 for Landslide A2 on Shizi mountain in the Daye Iron Mine East Open Pit, the displacement, rate, acceleration and creep-curve tangent-angle criteria of landslide failure were studied. The results show that Landslide A2 is a collapse-type rock landslide whose movement passes through three phases, namely a creep stage, an accelerated stage and a failure stage. The failure criteria differ between stages and between positions at the rear, central and front margins of the landslide. Putting forward a comprehensive failure criterion for the seven newly installed monitoring points, combining slope deformation and failure behaviour with macroscopic evidence, has important guiding significance.

  20. Multiexponential analysis of experimental data by an automatic peeling technique followed by non-linear least-squares adaption

    International Nuclear Information System (INIS)

    This report is concerned with multi-exponential fitting of a model function $f(t) = \sum_{j=1}^{n} a_j e^{-\alpha_j t} + \eta(t)$ (*), with $a_j, \alpha_j > 0$ for $1 \le j \le n$ and $\eta(t) = a + bt$, to given experimental data $(t_k, y_k)$, $1 \le k \le m$, where the number $n$ of exponential terms contained in (*) is not known in advance. An automatic version of the well-known manually performed peeling technique is realized and implemented in the subroutine SEARCH. This program yields the above-mentioned number $n$ and, in addition, initial values for the parameters $a, b, a_j, \alpha_j$, $1 \le j \le n$, which serve as input data for a final non-linear fitting of model (*) by a convenient non-linear fit program, e.g. VARPRO (from FORTLIB of KFA) or VA13AD (from the Harwell Subroutine Library). Moreover, auxiliary programs for evaluation of $f$, the partial exponential terms in $f$, and the appertaining (possibly weighted) least-squares functional $F$, respectively, as well as subroutines for determination of the first and second partial derivatives of $f$ and $F$ with respect to the parameters are made accessible. Characteristic examples of multi-exponential fitting to simulated and experimental data demonstrate the efficiency of the presented method. (orig.)
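
    The peeling idea: the slowest exponential dominates the tail, so a log-linear fit of the tail yields one $(a_j, \alpha_j)$ pair, its contribution is subtracted, and the procedure repeats on the residual. A minimal sketch of an automatic peeling pass (the baseline $\eta(t)$ and the tail-selection heuristic are simplified assumptions):

    ```python
    import numpy as np

    def peel(t, y, n_terms, window=0.4):
        """Estimate (a_j, alpha_j) pairs by repeatedly log-fitting the usable tail."""
        params, residual = [], y.astype(float).copy()
        for _ in range(n_terms):
            usable = np.nonzero(residual > residual.max() * 1e-3)[0]  # above "noise"
            tail = usable[int(len(usable) * (1 - window)):]           # last part of it
            coef = np.polyfit(t[tail], np.log(residual[tail]), 1)
            alpha, a = -coef[0], np.exp(coef[1])
            params.append((a, alpha))
            residual = residual - a * np.exp(-alpha * t)  # peel this component off
        return params  # slowest component first

    t = np.linspace(0, 10, 500)
    y = 5.0 * np.exp(-0.3 * t) + 2.0 * np.exp(-2.0 * t)
    print(peel(t, y, n_terms=2))  # approximately [(5.0, 0.3), (2.0, 2.0)]
    ```

    These estimates are then handed to the non-linear least-squares stage as starting values, exactly as SEARCH feeds VARPRO or VA13AD in the report.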

  1. Automatic Classification of X-rated Videos using Obscene Sound Analysis based on a Repeated Curve-like Spectrum Feature

    CERN Document Server

    Lim, JaeDeok; Han, SeungWan; Lee, ChoelHoon

    2011-01-01

    This paper addresses the automatic classification of X-rated videos by analyzing their obscene sounds. In this paper, obscene sounds refer to audio signals generated from sexual moans and screams during sexual scenes. By analyzing various sound samples, we determined the distinguishable characteristics of obscene sounds and propose a repeated curve-like spectrum feature that represents the characteristics of such sounds. We constructed 6,269 audio clips to evaluate the proposed feature, and separately constructed 1,200 X-rated and general videos for classification. The proposed feature has an F1-score, precision, and recall rate of 96.6%, 98.2%, and 95.2%, respectively, for the original dataset, and 92.6%, 97.6%, and 88.0% for a noisy dataset of 5 dB SNR. In classifying videos, the feature achieves more than a 90% F1-score, 97% precision, and an 84% recall rate. From the measured performance, X-rated videos can be classified with the audio features alone, and the repeated curve-like spectrum feature is suitable to...

  2. Large Scale Automatic Analysis and Classification of Roof Surfaces for the Installation of Solar Panels Using a Multi-Sensor Aerial Platform

    Directory of Open Access Journals (Sweden)

    Luis López-Fernández

    2015-09-01

    A low-cost multi-sensor aerial platform (an aerial trike) equipped with visible and thermographic sensors is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbor solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, together with the temperatures measured on thermographic images, is decisive for evaluating the areas, tilts, orientations and the existence of obstacles, in order to locate the optimal zones on each roof surface for the installation of solar panels. This information is complemented with an estimation of the solar irradiation received by each surface. In this way, large areas may be analyzed efficiently, yielding as the final result the optimal locations for the placement of solar panels as well as the information (location, orientation, tilt, area and solar irradiation) necessary to estimate the productivity of a solar panel from its technical characteristics.

  3. Automatic Condition Monitoring of Industrial Rolling-Element Bearings Using Motor’s Vibration and Current Analysis

    DEFF Research Database (Denmark)

    Yang, Zhenyu

    2015-01-01

    extensively studied under diverse operating conditions: different sensor locations, motor speeds, loading conditions, and data samples from different time segments. The experimental results showed the powerful capability of vibration analysis in the bearing point defect fault diagnosis. The current analysis...

  4. Automatic input rectification

    OpenAIRE

    Long, Fan; Ganesh, Vijay; Carbin, Michael James; Sidiroglou, Stelios; Rinard, Martin

    2012-01-01

    We present a novel technique, automatic input rectification, and a prototype implementation, SOAP. SOAP learns a set of constraints characterizing typical inputs that an application is highly likely to process correctly. When given an atypical input that does not satisfy these constraints, SOAP automatically rectifies the input (i.e., changes the input so that it satisfies the learned constraints). The goal is to automatically convert potentially dangerous inputs into typical inputs that the ...

  5. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces the money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  6. Analysis of image factors of x-ray films : study for the intelligent replenishment system of automatic film processor

    International Nuclear Information System (INIS)

    We analyzed image factors to determine the characteristic factors needed for an intelligent replenishment system for automatic film processors. We processed a series of 300 sheets of radiographic film of a chest phantom without replenishment of the developer and fixer. We acquired digital data using a film digitizer, which scanned the films and automatically summed their pixel values. We analyzed the characteristic curves, average gradients and relative speeds of the individual films using a densitometer and step densitometry. We also evaluated the pH of the developer, fixer and washer fluid with a digital pH meter. Fixer residual rate and washing effect were measured by densitometer using reagent methods. There was no significant reduction in the digital density numbers of the serial films without replenishment of developer and fixer. The average gradients gradually decreased by 0.02 and the relative speeds gradually decreased by 6.96% relative to the initial standard step-densitometric measurement. The pHs of the developer and fixer reflected the inactivation of each fluid. The fixer residual rates and washing effects after processing each 25 sheets of film were within the normal range. We suggest that the digital data are not reliable owing to limitations of the hardware and software of the film digitizer. We conclude that average gradient and relative speed, which represent the film's contrast and sensitivity respectively, are reliable factors for determining the need for replenishment in the automatic film processor. Further study of simpler equations and programming is needed for a more intelligent replenishment system.
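
    The average gradient is the slope of the characteristic (density versus log-exposure) curve between two reference densities, conventionally 0.25 and 2.00 above base-plus-fog. A minimal sketch on a hypothetical sensitometric strip:

    ```python
    import numpy as np

    def average_gradient(log_exposure, density, base_fog, d_low=0.25, d_high=2.00):
        """Slope of the characteristic curve between two net-density reference points."""
        net = density - base_fog
        # log-exposure values where the curve crosses the two reference densities
        le_low = np.interp(d_low, net, log_exposure)
        le_high = np.interp(d_high, net, log_exposure)
        return (d_high - d_low) / (le_high - le_low)

    # Hypothetical 21-step sensitometric strip readings (S-shaped curve)
    log_e = np.linspace(0.0, 3.0, 21)
    density = 0.18 + 3.2 / (1.0 + np.exp(-3.0 * (log_e - 1.5)))
    print(f"average gradient = {average_gradient(log_e, density, base_fog=0.18):.2f}")
    ```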

  7. Automatic Radiation Monitoring in Slovenia

    International Nuclear Information System (INIS)

    Full text: The automatic radiation monitoring system in Slovenia started in the early nineties and now comprises measurements of: 1. External gamma radiation: for the time being there are forty-three probes with GM tubes integrated into a common automatic network operated at the SNSA. The probes measure dose rate in 30-minute intervals. 2. Aerosol radioactivity: three automatic aerosol stations measure the concentration of artificial alpha and beta activity in the air, gamma-emitting radionuclides, radioactive iodine-131 in the air (in all chemical forms), and natural radon and thoron progeny. 3. Radon progeny concentration: radon progeny concentration is measured hourly and the results are displayed as equilibrium equivalent concentrations (EEC). 4. Radioactive deposition measurements: as a support to gamma dose rate measurements, the SNSA developed and installed an automatic measuring station for surface contamination equipped with a gamma spectrometry system (with a 3''x3'' NaI(Tl) detector). All data are transferred through different communication pathways to the SNSA. They are collected in 30-minute intervals. Within these intervals the central computer analyses and processes the collected data and creates different reports. Every month a QA/QC analysis of the data is performed, showing the statistics of acquisition errors and the availability of measuring results. All results are promptly available on our WEB pages. The data are checked and sent daily to the EURDEP system at Ispra (Italy) and also to the Austrian, Croatian and Hungarian authorities. (author)

  8. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    Full Text Available According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA is a superset of GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design); (b) information/knowledge representation; (c) algorithm design; and (d) implementation. As proof of concept of a symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™) is selected from the existing literature. To the best of these authors' knowledge, this is the first time a...

  9. Hybrid Search Methods for Automatic Discovery of Computational Agent Schemes

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman

    Vol. 3. Los Alamitos: IEEE Computer Society, 2008 - (Li, Y.; Pasi, G.; Zhang, C.; Cercone, N.; Cao, L.), s. 579-582 ISBN 978-0-7695-3496-1. [WI-IAT 2008 Workshops. IEEE/WIC/ACM 2008 International Conference on Web Intelligence and Intelligent Agent Technology. Sydney (AU), 09.12.2008-12.12.2008] R&D Projects: GA AV ČR 1ET100300419 Institutional research plan: CEZ:AV0Z10300504 Keywords : multi-agent systems * intelligent agents * automatic configurations Subject RIV: IN - Informatics, Computer Science

  10. Development of an optimized procedure bridging design and structural analysis codes for the automatized design of the SMART

    International Nuclear Information System (INIS)

    In this report, an optimized design and analysis procedure is established for the development of SMART (System-integrated Modular Advanced ReacTor). The optimized procedure aims to minimize time consumption and engineering effort by streamlining the design and feedback interactions. To achieve this goal, the data and information generated through design development should be transferred directly to the analysis programs with minimum manual operation. Verification of the design concept requires considerable effort, since the communication between design and analysis involves a time-consuming stage of input-information conversion. In this report, an optimized procedure bridging the design and analysis stages is established utilizing IDEAS, ABAQUS and ANSYS. (author). 3 refs., 2 tabs., 5 figs

  11. Multielement X-ray radiometric analysis with application of semiconductor detectors and automatic processing of the results of measurements

    International Nuclear Information System (INIS)

    Problems of the complex extraction of useful components from ores of compound composition demand multielement analysis with accuracy sufficient for practical purposes. X-ray radiometric analysis with semiconductor detectors (SD) and processing of the measurement results on mini- or micro-computers offers great possibilities here. The present state of detection and computing techniques permits the introduction of such instruments into practical use in the analytical laboratories of mining enterprises. Based on a discussion of the practical tasks in the analysis of different types of ores, the paper formulates the basic principles of multielement X-ray radiometric analysis for industrial purposes. First of all, it calls for an installation with several channels, whose main design requirement is high reliability and stability of performance. A variant of such an analyzer is presented, constructed with Si(Li) or Ge detecting blocks. The possibility of quickly changing the excitation sources, chosen from a set of iron-55, cadmium-109, americium-241 or cobalt-57, ensures effective excitation of elements in the range from calcium to uranium. Some practical methods of analysis are discussed in the paper, based on both passive and active experiments at the calibration stage. The accuracy of these methods is sufficient to replace ordinary chemical analysis by the radiometric one. Problems of applying mini- and micro-computers to process the information according to the developed methods are discussed. Examples are given of the practical realization of multielement X-ray radiometric analysis of lead-zinc, copper-molybdenum, lead-barite and some other types of ores, as well as of the products of ore processing

  12. An activation analysis system for short-lived radioisotopes including automatic dead-time corrections with a microcomputer

    International Nuclear Information System (INIS)

    A system based on an IBM-PC microcomputer coupled to a Canberra Series 80 multichannel analyser was developed for activation analysis with short-lived radioisotopes. The data transfer program can store up to 77 gamma-ray spectra on a floppy disc. A spectrum analysis program, DVC, was written to determine peak areas interactively, to correct the counting losses, and to calculate elemental concentrations. (author)
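
    For readers unfamiliar with counting-loss corrections: the record does not specify the dead-time model, so the sketch below assumes the common non-paralyzable formula, where the true rate n relates to the measured rate m by n = m / (1 - m·τ).

```python
# Hedged sketch: the record does not specify the dead-time model, so the
# common non-paralyzable formula n = m / (1 - m * tau) is assumed here.
def true_rate(measured_cps, dead_time_s):
    """Correct a measured count rate (counts/s) for counting losses."""
    loss_fraction = measured_cps * dead_time_s
    if loss_fraction >= 1.0:
        raise ValueError("measured rate is inconsistent with the model")
    return measured_cps / (1.0 - loss_fraction)

# e.g. 50 kcps measured with a 5 microsecond dead time -> about 66.7 kcps true
print(round(true_rate(5.0e4, 5.0e-6)))
```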

  13. Physics at LEP2. Vol. 1

    International Nuclear Information System (INIS)

    This is the final report of the Workshop on Physics at LEP2, held at CERN during 1995. The first part of vol. 1 is devoted to aspects of machine physics of particular relevance to experiments, including the energy, luminosity and interaction regions, as well as the measurement of beam energy. The second part of vol. 1 is a relatively concise, but fairly complete, handbook on the physics of e+e- annihilation above the WW threshold and up to √s∼200 GeV. It contains discussions on WW cross-sections and distributions, W mass determination, Standard Model processes, QCD and gamma-gamma physics, as well as aspects of discovery physics, such as Higgs, new particle searches, triple gauge boson couplings and Z'. The second volume contains a review of the existing Monte Carlo generators for LEP2 physics. These include generators for WW physics, QCD and gamma-gamma processes, Bhabha scattering and discovery physics. A special effort was made to co-ordinate the different parts, with a view to achieving a systematic and balanced review of the subject, rather than just publishing a collection of separate contributions. (orig.)

  14. Physics at LEP2. Vol. 2

    International Nuclear Information System (INIS)

    This is the final report of the Workshop on Physics at LEP2, held at CERN during 1995. The first part of vol. 1 is devoted to aspects of machine physics of particular relevance to experiments, including the energy, luminosity and interaction regions, as well as the measurement of beam energy. The second part of vol. 1 is a relatively concise, but fairly complete, handbook on the physics of e+e- annihilation above the WW threshold and up to √s∼200 GeV. It contains discussions on WW cross-sections and distributions, W mass determination, Standard Model processes, QCD and gamma-gamma physics, as well as aspects of discovery physics, such as Higgs, new particle searches, triple gauge boson couplings and Z'. The second volume contains a review of the existing Monte Carlo generators for LEP2 physics. These include generators for WW physics, QCD and gamma-gamma processes, Bhabha scattering and discovery physics. A special effort was made to co-ordinate the different parts, with a view to achieving a systematic and balanced review of the subject, rather than just publishing a collection of separate contributions. (orig.)

  15. Cross sections for atomic processes, vol. 2

    International Nuclear Information System (INIS)

    This data collection book contains data on all processes involving hydrogen and helium isotopes, their ions, electrons and photons, collected systematically and comprehensively; it is compiled as a sequel to Vol. 1, as one of the works of the data collection study group at the Institute of Plasma Physics, Nagoya University, Japan. The contents will include energy levels, multiply excited states, radiation processes, electron collisions, ionic collisions, recombination, collisions of neutral atoms, collision processes involving molecules, and other processes. However, this first edition contains energy levels, radiation processes, electron collisions and ionic collisions; the data on the remaining items are still being collected. Though some criticisms were voiced about Vol. 1, the authors consider that such a comprehensive collection based on systematic classification is the foundation of the generalized data bank expected to become necessary in the future. Thus the data collection book includes all relevant processes and records the experimental data and theoretically calculated results, in principle without modification, selecting them systematically. This year, investigation of data evaluation is also taken up as one of the tasks of the study group. (Wakatsuki, Y.)

  16. Toward dynamic isotopomer analysis in the rat brain in vivo: automatic quantitation of 13C NMR spectra using LCModel

    OpenAIRE

    Henry, Pierre-Gilles; Oz, Gülin; Provencher, Stephen; Gruetter, Rolf

    2003-01-01

    The LCModel method was adapted to analyze localized in vivo (13)C NMR spectra obtained from the rat brain in vivo at 9.4 T. Prior knowledge of chemical shifts, J-coupling constants and J-evolution was included in the analysis. Up to 50 different isotopomer signals corresponding to 10 metabolites were quantified simultaneously in 400 µl volumes in the rat brain in vivo during infusion of [1,6-(13)C(2)]glucose. The analysis remained accurate even at low signal-to-noise ratio of the order of...

  17. Semi-automatic motion compensation of contrast-enhanced ultrasound images from abdominal organs for perfusion analysis

    Czech Academy of Sciences Publication Activity Database

    Schafer, S.; Nylund, K.; Saevik, F.; Engjom, T.; Mézl, M.; Jiřík, Radovan; Dimcevski, G.; Gilja, O.H.; Tönnies, K.

    2015-01-01

    Roč. 63, AUG 1 (2015), s. 229-237. ISSN 0010-4825 R&D Projects: GA ČR GAP102/12/2380 Institutional support: RVO:68081731 Keywords : ultrasonography * motion analysis * motion compensation * registration * CEUS * contrast-enhanced ultrasound * perfusion * perfusion modeling Subject RIV: FS - Medical Facilities ; Equipment Impact factor: 1.240, year: 2014

  18. Development of automatic reactor vessel inspection systems: development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. H.; Lim, H. T.; Um, B. G. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine reactor vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, the ultrasonic signal processing system was designed, and the ultrasonic data acquisition and analysis software was developed. 11 refs., 6 figs., 9 tabs. (Author)

  19. A Genetic Algorithms-based Procedure for Automatic Tolerance Allocation Integrated in a Commercial Variation Analysis Software

    OpenAIRE

    2012-01-01

    Indexed by SCOPUS. In the functional design process of a mechanical component, the tolerance allocation stage is of primary importance in making the component respond to the functional requirements and to cost constraints. The present state-of-the-art approach to tolerance allocation is based on the use of Statistical Tolerance Analysis (STA) software packages which, by means of Monte Carlo simulation, allow forecasting the result of a set of user-selected geometrical and dimensional toler...

  20. Validation protocol applied to an automatic co-registration method based on multi-resolution analysis and local deformation models

    OpenAIRE

    Blanc, Philippe; Wald, Lucien

    1998-01-01

    The issue of co-registration distortions between images is one of the major problems involved in data fusion processes. The same holds for change detection, which also generally operates on a pixel basis. Accurate methods are therefore required for the co-registration of images in these cases. This is the reason why we present a co-registration method using multi-resolution analysis and local deformation models. This work includes a validation protocol that enables the assessment ...

  1. Design, Construction and Effectiveness Analysis of Hybrid Automatic Solar Tracking System for Amorphous and Crystalline Solar Cells

    OpenAIRE

    Bhupendra Gupta

    2013-01-01

    - This paper concerns the design and construction of a hybrid solar tracking system. The constructed device was implemented by integrating an amorphous & crystalline solar panel, a three-dimensional freedom mechanism and a microcontroller. The amount of power available from a photovoltaic panel is determined by three parameters: the type of solar tracker, the material of the solar panel and the intensity of the sunlight. The objective of this paper is to present an analysis of the use of two differ...

  2. Determination of Free and Total Sulfites in Wine using an Automatic Flow Injection Analysis System with Voltammetric Detection

    OpenAIRE

    Gonçalves, Luís Moreira; Pacheco, João Grosso; Magalhães, Paulo Jorge; Rodrigues, José António; Barros, Aquiles Araújo

    2009-01-01

    Abstract: An automated Flow Injection Analysis (FIA) system based on an initial analyte separation by gas-diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell is proposed for the determination of the total and free content of sulphur dioxide (SO2) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and the simplified method commonly used by the wine industry). The developed method showed repeatability (RSD lower ...

  3. AN AUTOMATIZED IN-PLACE ANALYSIS OF A HEAVY LIFT JACK-UP VESSEL UNDER SURVIVAL CONDITIONS

    Directory of Open Access Journals (Sweden)

    Gil Rama

    2014-08-01

    Full Text Available Heavy lift jack-up vessels (HLJV) are used for the installation of components of large offshore wind farms. A systematic FE analysis is presented for the HLJV THOR (owned by Hochtief Infrastructure GmbH) under extreme weather conditions. A parametric finite element (FE) model and analysis are developed using the ANSYS-APDL programming environment. The analysis comprises static and dynamic nonlinear FE calculations, which are carried out according to the relevant standard (ISO 19905) for in-place analyses of jack-up vessels. Besides strategies of model abstraction, a guide for the determination of the relevant loads is given. In order to calculate the dynamic loads, the single degree of freedom (SDOF) analogy and dynamic nonlinear FE calculations are used. As a result of the detailed determination of dynamic loads and the consideration of soil properties by spring elements, the utilized capacities could be reduced by 28%. This significantly improves the environmental operating limits of the HLJV THOR for the considered load scenario.
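
    The SDOF analogy mentioned above condenses the wave-excited jack-up response into a single oscillator, from which a dynamic amplification factor (DAF) scales the quasi-static loads. A minimal sketch with the textbook harmonic-excitation formula follows; the periods and damping ratio are illustrative, not the THOR values.

```python
import math

# Hedged sketch of the SDOF analogy (textbook harmonic-excitation formula);
# the natural period, wave period and damping ratio below are illustrative,
# not the values of the THOR analysis.
def daf(t_n, t_wave, damping_ratio):
    """Dynamic amplification factor of a harmonically excited SDOF system."""
    r = t_n / t_wave  # ratio of natural period to wave excitation period
    return 1.0 / math.sqrt((1.0 - r**2) ** 2 + (2.0 * damping_ratio * r) ** 2)

print(f"DAF = {daf(t_n=5.0, t_wave=9.0, damping_ratio=0.07):.2f}")  # ~1.44
```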

  4. An automatic system for multielement solvent extractions

    International Nuclear Information System (INIS)

    The automatic system described is suitable for multi-element separations by solvent extraction techniques with organic solvents heavier than water. The analysis is run automatically by a central control unit and includes steps such as pH regulation and reduction or oxidation. As an example, the separation of radioactive Hg2+, Cu2+, Mo6+, Cd2+, As5+, Sb5+, Fe3+, and Co3+ by means of diethyldithiocarbamate complexes is reported. (Auth.)

  5. Automatic registration of satellite imagery

    Science.gov (United States)

    Fonseca, Leila M. G.; Costa, Max H. M.; Manjunath, B. S.; Kenney, C.

    1997-01-01

    Image registration is one of the basic image processing operations in remote sensing. With the increase in the number of images collected every day from different sensors, automated registration of multi-sensor/multi-spectral images has become an important issue. A wide range of registration techniques has been developed for many different types of applications and data. The objective of this paper is to present an automatic registration algorithm which uses a multiresolution analysis procedure based upon the wavelet transform. The procedure is completely automatic and relies on the grey-level information content of the images and their local wavelet transform modulus maxima. The registration algorithm is simple and easy to apply because it basically needs only one parameter. We have obtained very encouraging results on test data sets from TM and SPOT sensor images of forest, urban and agricultural areas.
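
    As a simplified stand-in for the paper's procedure (the paper matches local wavelet transform modulus maxima; the sketch below uses a plain averaging pyramid and an SSD search), this shows the coarse-to-fine logic of multiresolution registration for a global integer translation. Searching a small window at each level keeps the cost low while the coarse levels supply the rough shift.

```python
import numpy as np

# Simplified stand-in for the paper's method: a coarse-to-fine search for a
# global integer translation on an averaging pyramid (the paper itself
# matches local wavelet transform modulus maxima instead of raw SSD).
def downsample(img):
    h, w = img.shape[0] // 2 * 2, img.shape[1] // 2 * 2
    return img[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

def best_shift(ref, mov, center, radius=2):
    best, arg = np.inf, center
    for dy in range(center[0] - radius, center[0] + radius + 1):
        for dx in range(center[1] - radius, center[1] + radius + 1):
            ssd = np.mean((ref - np.roll(mov, (dy, dx), axis=(0, 1))) ** 2)
            if ssd < best:
                best, arg = ssd, (dy, dx)
    return arg

def register(ref, mov, levels=3):
    pyramid = [(ref, mov)]
    for _ in range(levels - 1):
        ref, mov = downsample(ref), downsample(mov)
        pyramid.append((ref, mov))
    shift = (0, 0)
    for r, m in reversed(pyramid):  # coarsest level first
        shift = best_shift(r, m, (2 * shift[0], 2 * shift[1]))
    return shift

rng = np.random.default_rng(0)
a = rng.random((128, 128))
b = np.roll(a, (9, -5), axis=(0, 1))
print(register(a, b))  # shift aligning b to a: expect (-9, 5)
```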

  6. Automatic screening of obstructive sleep apnea from the ECG based on empirical mode decomposition and wavelet analysis

    International Nuclear Information System (INIS)

    This study analyses two different methods to detect obstructive sleep apnea (OSA) during sleep time based only on the ECG signal. OSA is a common sleep disorder caused by repetitive occlusions of the upper airways, which produces a characteristic pattern on the ECG. ECG features, such as the heart rate variability (HRV) and the QRS peak area, contain information suitable for a fast, non-invasive and simple screening of sleep apnea. Fifty recordings freely available on Physionet have been included in this analysis, subdivided into a training and a testing set. We investigated the possibility of using the recently proposed method of empirical mode decomposition (EMD) for this application, comparing the results with those obtained through the well-established wavelet analysis (WA). With these decomposition techniques, several features have been extracted from the ECG signal and complemented with a series of standard HRV time-domain measures. The best-performing feature subset, selected through a sequential feature selection (SFS) method, was used as the input of linear and quadratic discriminant classifiers. In this way we were able to classify the signals on a minute-by-minute basis as apneic or non-apneic with different best-subset sizes, obtaining an accuracy up to 89% with WA and 85% with EMD. Furthermore, 100% correct discrimination of apneic patients from normal subjects was achieved independently of the feature extractor. Finally, the same procedure was repeated by pooling features from the standard HRV time domain, EMD and WA together in order to investigate whether the two decomposition techniques could provide complementary features. The obtained accuracy was 89%, similar to the one achieved using only wavelet analysis as the feature extractor; however, some complementary features in EMD and WA are evident
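
    A minimal sketch of the minute-by-minute classification stage: standard HRV time-domain features fed to a linear discriminant, in the spirit of the study. The RR-interval data below are synthetic stand-ins, not the PhysioNet recordings, and the feature set is far smaller than the paper's EMD/wavelet features.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Hedged sketch: minute-wise HRV time-domain features plus a linear
# discriminant classifier; data are synthetic, not the PhysioNet set.
def hrv_features(rr):                      # rr: one minute of RR intervals (s)
    d = np.diff(rr)
    return [rr.mean(), rr.std(),           # mean NN, SDNN
            np.sqrt(np.mean(d ** 2)),      # RMSSD
            np.mean(np.abs(d) > 0.05)]     # pNN50

rng = np.random.default_rng(1)
X, y = [], []
for label in (0, 1):                       # 0 = normal minute, 1 = apneic minute
    for _ in range(200):
        rr = (0.9 + 0.15 * label
              + 0.05 * rng.standard_normal(60)
              + 0.08 * label * np.sin(np.linspace(0, 4 * np.pi, 60)))
        X.append(hrv_features(rr)); y.append(label)
X, y = np.array(X), np.array(y)

clf = LinearDiscriminantAnalysis().fit(X[::2], y[::2])    # train on half
print("held-out accuracy:", clf.score(X[1::2], y[1::2]))  # test on the rest
```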

  7. GAIT-ER-AID: An Expert System for Analysis of Gait with Automatic Intelligent Pre-Processing of Data

    OpenAIRE

    Bontrager, EL.; Perry, J.; Bogey, R.; Gronley, J.; Barnes, L.; Bekey, G.; Kim, JW

    1990-01-01

    This paper describes the architecture and applications of an expert system designed to identify the specific muscles responsible for a given dysfunctional gait pattern. The system consists of two parts: a data analysis expert system (DA/ES) and a gait pathology expert system (GP/ES). The DA/ES processes raw data on joint angles, foot-floor contact patterns and EMG's from relevant muscles and synthesizes them into a data frame for use by the GP/ES. Various aspects of the intelligent data pre-p...

  8. Automatic fault tree construction with RIKKE - a compendium of examples. Vol. 2

    International Nuclear Information System (INIS)

    This second volume describes the construction of fault trees for systems with loops, including control and safety loops. It also gives a short summary of the event coding scheme used in the FTLIB component model library. (author)

  9. An environmental friendly method for the automatic determination of hypochlorite in commercial products using multisyringe flow injection analysis.

    Science.gov (United States)

    Soto, N Ornelas; Horstkotte, B; March, J G; López de Alba, P L; López Martínez, L; Cerdá Martín, V

    2008-03-24

    A multisyringe flow injection analysis system was used for the determination of hypochlorite in cleaning agents, by measurement of the native absorbance of hypochlorite at 292 nm. The methodology was based on the selective decomposition of hypochlorite by a cobalt oxide catalyst, giving chloride and oxygen. The difference between the absorbance of the sample before and after its passage through a cobalt oxide column was selected as the analytical signal. As no further reagent was required, this work can be considered a contribution to environmentally friendly analytical chemistry. The entire analytical procedure, including in-line sample dilution in three steps, was automated: first, dilution in a stirred miniature vessel; second, dispersion; and third, in-line addition of water using the multisyringe flow injection technique. The dynamic concentration range was 0.04-0.78 g L(-1) (relative standard deviation lower than 3%), and the extent of hypochlorite decomposition was 90+/-4%. The proposed method was successfully applied to the analysis of commercial cleaning products. The accuracy of the method was established by iodometric titration. PMID:18328319
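
    A hedged sketch of the quantitation step: the absorbance difference before/after the catalyst column is mapped to concentration through a linear calibration. The calibration numbers are invented; the published method's calibration details may differ.

```python
import numpy as np

# Hedged sketch of the quantitation step (calibration numbers invented):
# the drop in 292 nm absorbance across the catalyst column is mapped to
# hypochlorite concentration through a linear calibration.
standards_g_per_l = np.array([0.05, 0.20, 0.40, 0.60, 0.78])
delta_a = np.array([0.031, 0.124, 0.250, 0.372, 0.482])  # toy calibration

slope, intercept = np.polyfit(standards_g_per_l, delta_a, 1)

def concentration(abs_before, abs_after):
    """Sample concentration from the before/after absorbance difference."""
    return (abs_before - abs_after - intercept) / slope

print(f"{concentration(0.510, 0.205):.3f} g/L")
```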

  10. Automatic Implantable Cardiac Defibrillator

    Medline Plus

    Full Text Available Automatic Implantable Cardiac Defibrillator February 19, 2009 Halifax Health Medical Center, Daytona Beach, FL Welcome to Halifax Health Daytona Beach, Florida. Over the next hour you' ...

  11. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  12. A radar-based regional extreme rainfall analysis to derive the thresholds for a novel automatic alert system in Switzerland

    Science.gov (United States)

    Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis

    2016-06-01

    This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here aims to issue heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds some predefined thresholds, but also as soon as the sum of past and forecast precipitation is larger than the threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as the predicted one, such as early-warning systems for urban floods. The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of...
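
    A minimal sketch of the threshold derivation described above, assuming a standard GEV fit to monthly maxima (scipy's genextreme) from which sub-annual return levels are read off; the 10-year record below is synthetic.

```python
import numpy as np
from scipy.stats import genextreme

# Hedged sketch: fit a GEV to monthly maxima of a regional rainfall
# statistic and read off sub-annual return levels (synthetic record).
rng = np.random.default_rng(42)
monthly_maxima = genextreme.rvs(c=-0.1, loc=20.0, scale=8.0,
                                size=120, random_state=rng)  # mm, 10 years

c, loc, scale = genextreme.fit(monthly_maxima)

def return_level(months):
    """Amount exceeded on average once every `months` months."""
    return genextreme.ppf(1.0 - 1.0 / months, c, loc=loc, scale=scale)

for t in (2, 6, 12):  # sub-annual return periods
    print(f"{t:>2}-month return level: {return_level(t):.1f} mm")
```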

  13. Analysis of volatile compounds of Ilex paraguariensis A. St. - Hil. and its main adulterating species Ilex theizans Mart. ex Reissek and Ilex dumosa Reissek

    Directory of Open Access Journals (Sweden)

    Rogério Marcos Dallago

    2011-12-01

    Full Text Available The adulteration of the product Ilex paraguariensis with other Ilex species is a major problem for maté tea producers. In this work, three species of Ilex were evaluated for their volatile composition by headspace solid-phase microextraction coupled to gas chromatography with a mass spectrometry detector (HS-SPME/GC-MS). The adulterating species I. dumosa and I. theizans Mart. ex Reissek presented a different profile of volatile organic compounds when compared to I. paraguariensis. The aldehydes methyl-butanal, pentanal, hexanal, heptanal and nonanal were detected only in the adulterating species. This result suggests that such compounds are potential chemical markers for the identification of adulteration and for quality analysis of products based on Ilex paraguariensis.

  14. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    CERN Document Server

    Bainbridge, Matthew B

    2016-01-01

    A new and fully-automated method is presented for the analysis of high-resolution absorption spectra (GVPFIT). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. GVPFIT is also motivated by the importance of obtaining a large statistical sample of measurements of $\\Delta\\alpha/\\alpha$. Interactive analyses are both time consuming and complex and automation makes obtaining a large sample feasible. Three numerical methods are unified into one artificial intelligence process: a genetic algorithm that emulates the Darwinian processes of reproduction, mutation and selection, non-linear least-squares with parameter constraints (VPFIT), and Bayesian model averaging. In contrast to previous methodologies, which relied on a particular solution as being the most likely model, GVPFIT plus Bayesian model averaging derives results from a large set of models, and helps overcome systema...
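
    A hedged sketch of the Bayesian model averaging step: candidate models are weighted by an information-criterion approximation to the evidence. GVPFIT's actual weighting may differ, and the estimates and BIC values below are invented.

```python
import numpy as np

# Hedged sketch of Bayesian model averaging, using the common exp(-BIC/2)
# approximation to the model evidence; numbers below are invented.
def bma(estimates, bics):
    """Model-averaged estimate, spread, and weights from per-model BICs."""
    estimates, bics = np.asarray(estimates, float), np.asarray(bics, float)
    w = np.exp(-(bics - bics.min()) / 2.0)  # shift by the minimum for stability
    w /= w.sum()
    mean = w @ estimates
    spread = np.sqrt(w @ (estimates - mean) ** 2)  # between-model scatter only
    return mean, spread, w

# toy: three candidate absorption-system models, each giving da/a (x 1e-5)
mean, spread, w = bma([0.42, 0.35, 0.55], [102.3, 103.1, 109.8])
print(f"da/a = {mean:.2f} +/- {spread:.2f}, weights {np.round(w, 3)}")
```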

  15. Design, Construction and Effectiveness Analysis of Hybrid Automatic Solar Tracking System for Amorphous and Crystalline Solar Cells

    Directory of Open Access Journals (Sweden)

    Bhupendra Gupta

    2013-10-01

    Full Text Available This paper concerns the design and construction of a hybrid solar tracking system. The constructed device was implemented by integrating an amorphous & crystalline solar panel, a three-dimensional freedom mechanism and a microcontroller. The amount of power available from a photovoltaic panel is determined by three parameters: the type of solar tracker, the material of the solar panel and the intensity of the sunlight. The objective of this paper is to present an analysis of the use of two different solar panel materials, amorphous & crystalline, in a solar tracking system with stationary, single-axis, dual-axis & hybrid-axis trackers, to obtain better performance with minimum losses to the surroundings, as this device ensures maximum intensity of sun rays hitting the surface of the panel from sunrise to sunset

  16. Determination of free and total sulfites in wine using an automatic flow injection analysis system with voltammetric detection.

    Science.gov (United States)

    Goncalves, Luis Moreira; Grosso Pacheco, Joao; Jorge Magalhaes, Paulo; Antonio Rodrigues, Jose; Araujo Barros, Aquiles

    2010-02-01

    An automated flow injection analysis (FIA) system, based on an initial analyte separation by gas-diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell, was developed for the determination of total and free sulfur dioxide (SO(2)) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and a simplified method commonly used by the wine industry). The developed method displayed good repeatability (RSD lower than 6%) and linearity (between 10 and 250 mg l(-1)) as well as a suitable LOD (3 mg l(-1)) and LOQ (9 mg l(-1)). A major advantage of this system is that SO(2) is directly detected by flow SWV. PMID:20013444
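
    How the reported LOD and LOQ follow from a calibration line can be sketched as below; the standards, signals and blank noise are toy numbers, and the LOD = 3*s_blank/slope and LOQ = 3*LOD conventions are assumptions chosen to be consistent with the reported 3 and 9 mg/L.

```python
import numpy as np

# Hedged sketch (toy numbers): estimating LOD/LOQ from a calibration line,
# assuming LOD = 3*s_blank/slope and LOQ = 3*LOD.
conc = np.array([10.0, 50.0, 100.0, 150.0, 250.0])  # standards (mg/L)
peak = np.array([0.9, 4.6, 9.1, 13.4, 22.8])        # SWV peak current (uA)

slope, intercept = np.polyfit(conc, peak, 1)
s_blank = 0.09                                      # sd of the blank signal (uA)

lod = 3 * s_blank / slope
print(f"slope {slope:.4f} uA/(mg/L); LOD {lod:.1f} mg/L; LOQ {3 * lod:.1f} mg/L")
```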

  17. Automatic Arabic Text Classification

    OpenAIRE

    Al-harbi, S; Almuhareb, A.; Al-Thubaity , A; Khorsheed, M. S.; Al-Rajeh, A.

    2008-01-01

    Automated document classification is an important text mining task especially with the rapid growth of the number of online documents present in Arabic language. Text classification aims to automatically assign the text to a predefined category based on linguistic features. Such a process has different useful applications including, but not restricted to, e-mail spam detection, web page content filtering, and automatic message routing. This paper presents the results of experiments on documen...

  18. Histological analysis of tissue structures of the internal organs of steppe tortoises following their exposure to spaceflight conditions while circumnavigating the moon aboard the Zond-7 automatic station

    Science.gov (United States)

    Sutulov, L. S.; Sutulov, Y. L.; Trukhina, L. V.

    1975-01-01

    Tortoises flown around the Moon on the 6-1/2 day voyage of the Zond-7 automatic space station evidently did not suffer any pathological changes to their peripheral blood picture, heart, lungs, intestines, or liver.

  19. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels should be processed on-line. In addition, the ultrasonic transducer scanning device should be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, the ultrasonic signal processing system was designed, and the ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  20. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

    The main goal of the presented work was to develop a multifunctional beam composed of fiber-reinforced plastics (FRP) with an embedded optical fiber carrying various fiber Bragg grating sensors (FBG). These beams are developed for use as structural members for bridges or industrial applications. It is now possible to realize large-scale cross sections, the embedding is part of a fully automated process, and jumpers can be omitted so as not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analysis were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). Next to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.
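
    The strain readout of an embedded FBG follows the textbook relation strain = (lambda - lambda0) / (lambda0 * (1 - p_e)); a minimal sketch is below. The photo-elastic coefficient is a typical silica-fiber value, an assumption rather than the project's calibration.

```python
# Hedged sketch of FBG strain readout (textbook relation, not the project's
# calibration): strain = (lambda - lambda0) / (lambda0 * (1 - p_e)).
P_E = 0.22  # typical effective photo-elastic coefficient of silica fiber

def strain_microstrain(lambda0_nm, lambda_nm, p_e=P_E):
    """Axial strain (microstrain) from the relative Bragg wavelength shift."""
    return (lambda_nm - lambda0_nm) / (lambda0_nm * (1.0 - p_e)) * 1e6

# e.g. a 1550.00 nm grating shifted to 1550.60 nm -> about 496 microstrain
print(round(strain_microstrain(1550.00, 1550.60)))
```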

  1. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    Science.gov (United States)

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis is explored, and possible applications are discussed. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path of dissipating heat by evaporation. Thus, the proportions of pigs in the dunging area and resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The learning-based approach applied is based on extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method produces a probability per pixel rather than a hard decision, which overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method, in comparison with greyscale methods, improves the possibility of reliably identifying the proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness. PMID:26189971
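
    A minimal sketch of the per-pixel classification idea: elastic-net-regularized logistic regression producing a pig probability per pixel. The colour features are synthetic, and the paper's extended feature set and structured-prediction stage are omitted.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hedged sketch: elastic-net logistic regression emitting a probability per
# pixel (synthetic features; far simpler than the paper's pipeline).
rng = np.random.default_rng(0)
n = 5000
X = np.vstack([rng.normal(0.3, 0.15, (n, 3)),    # toy background features
               rng.normal(0.6, 0.15, (n, 3))])   # toy pig features
y = np.repeat([0, 1], n)

clf = LogisticRegression(penalty="elasticnet", solver="saga",
                         l1_ratio=0.5, C=1.0, max_iter=2000).fit(X, y)

frame = rng.normal(0.45, 0.2, (480, 640, 3))     # one toy video frame
proba = clf.predict_proba(frame.reshape(-1, 3))[:, 1].reshape(480, 640)
print("mean pig probability per pixel:", round(float(proba.mean()), 3))
```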

  2. Identification of Biocontrol Bacteria against Soybean Root Rot with the Biolog Automatic Microbiology Analysis System

    Institute of Scientific and Technical Information of China (English)

    许艳丽; 刘海龙; 李春杰; 潘凤娟; 李淑娴; 刘新晶

    2012-01-01

    In order to identify the taxonomic position of two biocontrol bacteria against soybean root rot, traditional morphological identification and the BIOLOG automatic microbiology analysis system were used to identify strains B021a and B04b. The results showed that the similarity value of strain B021a with Vibrio tubiashii was 0.634, with a probability of 86% and a genetic distance of 4.00, and the similarity value of strain B04b with Pasteurella trehalosi was 0.610, with a probability of 75% and a genetic distance of 2.77. Combining the colony morphological properties and the BIOLOG analysis results, strain B021a was identified as Vibrio tubiashii and strain B04b as Pasteurella trehalosi.

  3. Recommended number of strides for automatic assessment of gait symmetry and regularity in above-knee amputees by means of accelerometry and autocorrelation analysis

    Directory of Open Access Journals (Sweden)

    Tura Andrea

    2012-02-01

    Full Text Available Abstract. Background: Symmetry and regularity of gait are essential outcomes of gait retraining programs, especially in lower-limb amputees. This study presents an algorithm to automatically compute symmetry and regularity indices, and assesses the minimum number of strides for appropriate evaluation of gait symmetry and regularity through autocorrelation of acceleration signals. Methods: Ten transfemoral amputees (AMP) and ten control subjects (CTRL) were studied. Subjects wore an accelerometer and were asked to walk for 70 m at their natural speed (twice). Reference values of the step and stride regularity indices (Ad1 and Ad2) were obtained by autocorrelation analysis of the vertical and antero-posterior acceleration signals, excluding the initial and final strides. The Ad1 and Ad2 coefficients were then computed at different stages by analyzing increasing portions of the signals (considering both the signals cleaned of initial and final strides, and the whole signals). At each stage, the difference between the Ad1 and Ad2 values and the corresponding reference values was compared with the minimum detectable difference (MDD) of the index. If that difference was less than the MDD, it was assumed that the portion of signal used in the analysis was of sufficient length to allow reliable estimation of the autocorrelation coefficient. Results: All Ad1 and Ad2 indices were lower in AMP than in CTRL. Conclusions: Without the need to identify and eliminate the phases of gait initiation and termination, twenty strides can provide a reasonable amount of information to reliably estimate gait regularity in transfemoral amputees.
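
    A minimal sketch of the step/stride regularity indices from the unbiased autocorrelation of a vertical acceleration signal, in the spirit of the approach this study builds on; the gait signal below is synthetic.

```python
import numpy as np

# Hedged sketch: Ad1 (step) and Ad2 (stride) as peaks of the unbiased
# autocorrelation of vertical acceleration (synthetic signal).
def unbiased_autocorr(x):
    x = x - x.mean()
    n = len(x)
    ac = np.correlate(x, x, mode="full")[n - 1:]
    return ac / (x.var() * np.arange(n, 0, -1))  # normalized, lag-unbiased

fs, step_f = 100, 1.8                    # sampling rate (Hz), steps per second
t = np.arange(0, 30, 1 / fs)
acc = (np.sin(2 * np.pi * step_f * t)            # step-frequency component
       + 0.35 * np.sin(np.pi * step_f * t)       # left/right asymmetry
       + 0.10 * np.random.default_rng(3).standard_normal(t.size))

ac = unbiased_autocorr(acc)
lag = int(fs / step_f)                           # samples per step
ad1 = ac[lag - 5: lag + 5].max()                 # step regularity
ad2 = ac[2 * lag - 5: 2 * lag + 5].max()         # stride regularity
print(f"Ad1 = {ad1:.2f}, Ad2 = {ad2:.2f}")
```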

  4. Automatic extraction of the cingulum bundle in diffusion tensor tract-specific analysis. Feasibility study in Parkinson's disease with and without dementia

    International Nuclear Information System (INIS)

    Tract-specific analysis (TSA) measures diffusion parameters along a specific fiber tract extracted by fiber tracking using manual regions of interest (ROIs), but TSA is limited by its requirement for manual operation, poor reproducibility, and high time consumption. We aimed to develop a fully automated extraction method for the cingulum bundle (CB) and to apply the method to TSA in neurobehavioral disorders such as Parkinson's disease (PD). We introduce the voxel classification (VC) and auto diffusion tensor fiber-tracking (AFT) extraction methods. The VC method directly extracts the CB, skipping the fiber-tracking step, whereas the AFT method uses fiber tracking from automatically selected ROIs. We compared the results of VC and AFT to those obtained by manual diffusion tensor fiber tracking (MFT) performed by 3 operators. We quantified the Jaccard similarity index among the 3 methods in data from 20 subjects (10 normal controls [NC] and 10 patients with Parkinson's disease dementia [PDD]). We used all 3 extraction methods (VC, AFT, and MFT) to calculate the fractional anisotropy (FA) values of the anterior and posterior CB for 15 NC subjects, 15 with PD, and 15 with PDD. The Jaccard index between the results of AFT and MFT, 0.72, was similar to the inter-operator Jaccard index of MFT; however, the Jaccard indices between VC and MFT and between VC and AFT were lower. Nevertheless, the VC method discriminated among the 3 groups (NC, PD, and PDD), whereas the other methods discriminated only 2 (NC vs. PD or PDD). For TSA in Parkinson's disease, the VC method can be more useful than the AFT and MFT methods for extracting the CB. In addition, the results of the patient data analysis suggest that a reduction of FA in the posterior CB may represent a useful biological index for monitoring PD and PDD. (author)
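
    The Jaccard similarity index used above compares two binary tract masks; a minimal sketch on toy volumes:

```python
import numpy as np

# Minimal sketch: Jaccard similarity between two binary tract masks, the
# index used above to compare VC, AFT and MFT (toy volumes here).
def jaccard(a, b):
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / union if union else 1.0

rng = np.random.default_rng(7)
m1 = rng.random((32, 32, 16)) > 0.8       # e.g. CB voxels from one method
m2 = m1.copy()
flip = rng.random(m1.shape) > 0.95        # perturb about 5% of voxels
m2[flip] = ~m2[flip]
print(round(jaccard(m1, m2), 2))
```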

  5. Research on an Automatic Extraction and Analysis System for College Student Information

    Institute of Scientific and Technical Information of China (English)

    聂明辉

    2013-01-01

    Digital campus construction has reached its third stage, with large-scale system integration and data sharing. Through a shared data platform, using data extraction and cleaning technologies, we can selectively obtain the information we need, and also understand and evaluate students' campus life and learning more comprehensively and objectively. Based on ODI technology, MapReduce and data warehouse technology, this paper proposes and builds a system model for the automatic extraction and analysis of information on university students' study and life, providing a feasible technical scheme for universities to manage students more intelligently in an information-rich environment.

  6. Automatic Program Development

    DEFF Research Database (Denmark)

    Automatic Program Development is a tribute to Robert Paige (1947-1999), our accomplished and respected colleague, and moreover our good friend, whose untimely passing was a loss to our academic and research community. We have collected the revised, updated versions of the papers published in his honor in the Higher-Order and Symbolic Computation Journal in the years 2003 and 2005. Among them there are two papers by Bob: (i) a retrospective view of his research lines, and (ii) a proposal for future studies in the area of automatic program derivation. The book also includes some papers by members of the IFIP Working Group 2.1, of which Bob was an active member. All papers are related to some of the research interests of Bob and, in particular, to the transformational development of programs and their algorithmic derivation from formal specifications. Automatic Program Development offers a...

  7. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...

  8. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations and the different algorithms for automatic document summarization (ADS). We present a recent state of the art. The book shows the main problems of ADS, the difficulties, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of automatic document summarization are not recent. Powerful algorithms have been developed...

  9. Expert system for the automatic analysis of the Eddy current signals from the monitoring of vapor generators of a PWR, type reactor

    International Nuclear Information System (INIS)

    The automation of the monitoring of steam generator tubes required developments in the field of data processing. The monitoring is performed by means of eddy current tests. Improvements in signal processing and in pattern recognition associated with artificial intelligence techniques led EDF (the French electricity company) to develop an automatic signal processing system. The system, named EXTRACSION (French acronym for Expert System for the Processing and Classification of Signals of Nuclear Nature), ensures consistency between the different fields of knowledge (metallurgy, measurement, signals) during data processing by applying an object-oriented representation

  10. Workshop Arboretum Volčji potok

    Directory of Open Access Journals (Sweden)

    Ana Kučan

    2012-01-01

    Full Text Available From its constitution onwards, the Volčji Potok Arboretum has been caught between various conflicting orientations. It is both a scientific, research and educational institution and a cultural monument with exquisite garden and landscape design features, areas of great natural value and built cultural heritage, as well as a commercial venue. At the same time, it functions as a park and an area for mass events, a garden centre and a nursery. This variety of functions has helped the Arboretum survive the pressures of time; however, partial and uncoordinated interventions have threatened its original mission and its image and generated a number of conflicting situations. The workshop, organised on the initiative of the Institute for the Protection of Cultural Heritage of Slovenia, which involved students from the Faculty of Architecture and students from the Department of Landscape Architecture of the Biotechnical Faculty in mixed groups, generated eight proposals to solve some of the most urgent problems by introducing optimised development with clearly defined goals and priorities.

  11. Nuclear Reactor RA Safety Report, Vol. 16, Maximum hypothetical accident

    International Nuclear Information System (INIS)

    The fault tree analysis of the maximum hypothetical accident covers the basic elements: accident initiation and the accident development phases, i.e. the scheme of the possible accident flow. The initiating cause of the accident is a break of a pipe in the primary cooling (heavy water) system. Loss of primary coolant causes a loss of pressure in the primary circuit at the coolant inlet to the reactor vessel. This initiates the safety protection system, which should automatically shut down the reactor. Separate chapters are devoted to: after-heat removal; coolant and moderator loss; accident effects on the reactor core; effects in the reactor building; and the release of radioactive wastes

  12. Automatic Dance Lesson Generation

    Science.gov (United States)

    Yang, Yang; Leung, H.; Yue, Lihua; Deng, LiQun

    2012-01-01

    In this paper, an automatic lesson generation system is presented which is suitable in a learning-by-mimicking scenario where the learning objects can be represented as multiattribute time series data. The dance is used as an example in this paper to illustrate the idea. Given a dance motion sequence as the input, the proposed lesson generation…

  13. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings with intervals as initial values. A modification of the mean value enclosure of discrete mappings is considered, namely the extended mean value enclosure, which in most cases leads to even better enclosures. These methods have previously been described in connection with discretizing solutions of ordinary differential equations. The thesis also applies the mean value enclosure to an integral operator and uses interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and using different numbers of discretization points.
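
    A minimal sketch of the mean value form underlying such enclosures, for f(x) = x^2 - x on a small interval, with a toy interval type; real self-validating code needs outward (directed) rounding, which is omitted here. The point of the example is that the mean value form can beat the natural interval extension when the expression suffers from the dependency problem.

```python
from dataclasses import dataclass

# Hedged sketch of the mean value form F(X) = f(c) + f'(X)(X - c) for
# f(x) = x^2 - x, with a tiny hand-rolled interval type (no outward rounding).
@dataclass(frozen=True)
class Interval:
    lo: float
    hi: float
    def __add__(self, o): return Interval(self.lo + o.lo, self.hi + o.hi)
    def __sub__(self, o): return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        p = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(p), max(p))

def natural(x):
    """Natural interval extension of f(x) = x^2 - x (has a dependency problem)."""
    return x * x - x

def mean_value_form(x):
    """f(c) + f'(X)(X - c) with f'(x) = 2x - 1 and c the interval midpoint."""
    c = 0.5 * (x.lo + x.hi)
    fc = Interval(c * c - c, c * c - c)
    dfX = Interval(2 * x.lo - 1.0, 2 * x.hi - 1.0)  # f' is monotone here
    return fc + dfX * (x - Interval(c, c))

X = Interval(0.9, 1.1)
print(natural(X))          # about [-0.29, 0.31]
print(mean_value_form(X))  # about [-0.12, 0.12] -- a tighter enclosure
```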

  14. Automatic generation of 3D fine mesh geometries for the analysis of the venus-3 shielding benchmark experiment with the Tort code

    Energy Technology Data Exchange (ETDEWEB)

    Pescarini, M.; Orsi, R.; Martinelli, T. [ENEA, Ente per le Nuove Tecnologie, l' Energia e l' Ambiente, Centro Ricerche Ezio Clementel Bologna (Italy)

    2003-07-01

    In many practical radiation transport applications today, the cost of solving refined, large-size and complex multi-dimensional problems is not so much the computing but the cumbersome effort required by an expert to prepare a detailed geometrical model and to verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is particularly relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general-purpose radiation transport codes, which require a very large number of meshes and high-performance computers. The need has clearly emerged for tools that make the task easier for the physicist or engineer by reducing the time required, by facilitating the verification of correctness through effective graphical display and, finally, by helping the interpretation of the results obtained. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and now released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system of the DORT (2D) and TORT (3D) SN codes, permits a flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system demonstrated its validity also in core criticality analyses, as for example the Lewis MOX fuel benchmark, permitting to easily...

  15. Automatic generation of 3D fine mesh geometries for the analysis of the venus-3 shielding benchmark experiment with the Tort code

    International Nuclear Information System (INIS)

    In many practical radiation transport applications today, the cost of solving refined, large-size and complex multi-dimensional problems is not so much the computing but the cumbersome effort required by an expert to prepare a detailed geometrical model and to verify and validate that it is correct and represents, to a specified tolerance, the real design or facility. This situation is particularly relevant and frequent in reactor core criticality and shielding calculations with three-dimensional (3D) general-purpose radiation transport codes, which require a very large number of meshes and high-performance computers. The need has clearly emerged for tools that make the task easier for the physicist or engineer by reducing the time required, by facilitating the verification of correctness through effective graphical display and, finally, by helping the interpretation of the results obtained. The paper shows the results of efforts in this field through detailed simulations of a complex shielding benchmark experiment. In the context of the activities proposed by the OECD/NEA Nuclear Science Committee (NSC) Task Force on Computing Radiation Dose and Modelling of Radiation-Induced Degradation of Reactor Components (TFRDD), the ENEA-Bologna Nuclear Data Centre contributed an analysis of the VENUS-3 low-flux neutron shielding benchmark experiment (SCK/CEN-Mol, Belgium). One of the targets of the work was to test the BOT3P system, originally developed at the Nuclear Data Centre in ENEA-Bologna and now released to the OECD/NEA Data Bank for free distribution. BOT3P, an ancillary system of the DORT (2D) and TORT (3D) SN codes, permits a flexible automatic generation of spatial mesh grids in Cartesian or cylindrical geometry, through combinatorial geometry algorithms, following a simplified user-friendly approach. This system demonstrated its validity also in core criticality analyses, as for example the Lewis MOX fuel benchmark, permitting to easily...

  16. Automatic indexing, compiling and classification

    International Nuclear Information System (INIS)

    A review of the principles of automatic indexing is followed by a comparison and summing-up of work by the authors and by a Soviet team from the Moscow INFORM-ELECTRO Institute. The mathematical and linguistic problems of automatic thesaurus construction and automatic classification are examined

  17. Automatic gamma spectrometry analytical apparatus

    International Nuclear Information System (INIS)

    This invention falls within the area of quantitative or semi-quantitative analysis by gamma spectrometry and particularly refers to a device for bringing samples into the counting position. The purpose of this invention is precisely to provide an automatic apparatus specifically adapted to the analysis of hard gamma radiation. To this effect, the invention relates to a gamma spectrometry analytical device comprising a lead containment, a detector whose sensitive part is located inside the containment, and additionally a transfer system for bringing the analyzed samples in succession to a counting position inside the containment above the detector. A feed compartment enables the samples to be brought one by one onto the transfer system through a duct connecting the compartment to the transfer system. Sequential systems for the coordinated forward feed of the samples in the compartment and the transfer system complete this device

  18. Automatic image processing as a means of safeguarding nuclear material

    International Nuclear Information System (INIS)

    Problems involved in computerized analysis of pictures taken by automatic film or video cameras in the context of international safeguards implementation are described. They include technical ones as well as the need to establish objective criteria for assessing image information. In the near future automatic image processing systems will be useful in verifying the identity and integrity of IAEA seals. (author)

  19. The automatic NMR gaussmeter

    International Nuclear Information System (INIS)

    The paper describes an automatic gaussmeter operating on the principle of nuclear magnetic resonance. The operating principle, the block diagram and the operating parameters of the meter are discussed. It can be applied to measurements of induction in electromagnets of wide-line EPR and NMR radio-spectrometers and in calibration stands for magnetic induction values. The frequency range of the autodyne oscillator, from 0.6 up to 86 MHz for protons, corresponds to a field range from 0.016 up to 2 T. Application of other nuclei, such as 7Li and 2D, is also foreseen. The induction measurement is carried out automatically, and the NMR signal and the value of the measured induction are displayed on a monitor screen. (author)

  20. Automatic trend estimation

    CERN Document Server

    Vamoş, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains a clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.

  1. Automatic Wall Painting Robot

    OpenAIRE

    P. Keerthanaa, K. Jeevitha, V. Navina, G. Indira, S. Jayamani

    2013-01-01

    The primary aim of the project is to design, develop and implement an automatic wall painting robot which helps to achieve low-cost painting equipment. Despite the advances in robotics and its wide-spreading applications, interior wall painting has shared little in research activities. The painting chemicals can cause hazards to the human painters such as eye and respiratory system problems. Also, the nature of the painting procedure, which requires repeated work and hand raising, makes it boring, time and effort consuming...

  2. Automatic Program Reports

    OpenAIRE

    Lígia Maria da Silva Ribeiro; Gabriel de Sousa Torcato David

    2007-01-01

    To profit from the data collected by the SIGARRA academic IS, a systematic set of graphs and statistics has been added to it and is available on-line. This analytic information can be automatically included in a flexible yearly report for each program as well as in a synthesis report for the whole school. Some difficulties in the interpretation of some graphs led to the definition of new key indicators and the development of a data warehouse across the university where effective data consolidation...

  3. Automatic Inductive Programming Tutorial

    OpenAIRE

    Aler, Ricardo

    2006-01-01

    Computers that can program themselves are an old dream of Artificial Intelligence, but only nowadays has some remarkable progress been made. In relation to Machine Learning, a computer program is the most powerful structure that can be learned, pushing the final goal well beyond neural networks or decision trees. There are currently many separate areas, working independently, related to automatic programming, both deductive and inductive. The first goal of this tutorial is to give the attendees ...

  4. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  5. Automaticity or active control

    DEFF Research Database (Denmark)

    Tudoran, Ana Alina; Olsen, Svein Ottar

    This study addresses the quasi-moderating role of habit strength in explaining action loyalty. A model of loyalty behaviour is proposed that extends the traditional satisfaction–intention–action loyalty network. Habit strength is conceptualised as a cognitive construct referring to the psychological ..., respectively, between intended loyalty and action loyalty. At high levels of habit strength, consumers are more likely to free up cognitive resources and incline the balance from controlled to routine and automatic-like responses.

  6. Automatic digital image registration

    Science.gov (United States)

    Goshtasby, A.; Jain, A. K.; Enslin, W. R.

    1982-01-01

    This paper introduces a general procedure for automatic registration of two images which may have translational, rotational, and scaling differences. This procedure involves (1) segmentation of the images, (2) isolation of dominant objects from the images, (3) determination of corresponding objects in the two images, and (4) estimation of transformation parameters using the centers of gravity of objects as control points. An example is given which uses this technique to register two images which have translational, rotational, and scaling differences.
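    A minimal sketch of step (4) above, assuming the object correspondences from steps (1)-(3) are already available: the matched centroids (hypothetical values here) are fed to a least-squares similarity-transform fit, with scikit-image's estimate_transform used as a stand-in solver.

```python
# Sketch: estimate translation, rotation and scale from matched object
# centroids (step 4 of the procedure). Centroid values are hypothetical.
import numpy as np
from skimage.transform import estimate_transform

src = np.array([[12.0, 30.5], [80.2, 44.1], [55.7, 90.3]])   # centroids, image A
dst = np.array([[20.1, 35.2], [88.9, 60.4], [52.3, 100.8]])  # matched centroids, image B

tform = estimate_transform('similarity', src, dst)  # least-squares fit
print(tform.scale, np.degrees(tform.rotation), tform.translation)
```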

  7. Upgrade of the automatic analysis system in the TJ-II Thomson Scattering diagnostic: New image recognition classifier and fault condition detection

    International Nuclear Information System (INIS)

    An automatic image classification system based on support vector machines (SVM) has been in operation for years in the TJ-II Thomson Scattering diagnostic. It recognizes five different types of images: CCD camera background, measurement of stray light without plasma or in a collapsed discharge, image during ECH phase, image during NBI phase and image after reaching the cut-off density during ECH heating. Each kind of image implies the execution of different application software. Due to the fact that the recognition system is based on a learning system and major modifications have been carried out in both the diagnostic (optics) and TJ-II plasmas (injected power), the classifier model is no longer valid. A new SVM model has been developed under the current conditions. Also, specific error conditions in the data acquisition process can now automatically be detected and managed. The recovery process has been automated, thereby avoiding the loss of data in ensuing discharges.

  8. Automatic scanning for nuclear emulsion

    International Nuclear Information System (INIS)

    Automatic scanning systems have recently been developed for application in neutrino experiments exploiting nuclear emulsion detectors of particle tracks. These systems substantially speed up the analysis of events in emulsion, allowing the realisation of experiments with unprecedented statistics. The pioneering work on automatic scanning has been done by the University of Nagoya (Japan). The so-called new track selector has a very good reproducibility in position (∼1 μm) and angle (∼3 mrad), with the possibility to reconstruct, in about 3 s, all the tracks in a view of 150x150 μm2 and 1 mm of thickness. A new system (ultratrack selector), with speed higher by one order of magnitude, has entered operation. R&D programs on new systems are under way in Nagoya and in other laboratories. The scanning speed in nuclear emulsion can be further increased by an order of magnitude. The recent progress in the technology of digital signal processing and of image acquisition systems (CCDs and fast frame grabbers) allows the realisation of systems with high performance. New interesting applications of the technique in other fields (e.g. in biophysics) have recently been envisaged

  9. Analysis of results obtained using the automatic chemical control of the quality of the water heat carrier in the drum boiler of the Ivanovo CHP-3 power plant

    Science.gov (United States)

    Larin, A. B.; Kolegov, A. V.

    2012-10-01

    Results of industrial tests of a new method for the automatic chemical control of the quality of boiler water of a drum-type power boiler (Pd = 13.8 MPa) are described. The possibility of using an H-cationite column for measuring the electric conductivity of an H-cationized sample of boiler water over a long period of time is shown.

  10. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognition of a specific expression, pain, from human faces. We employ an automatic face detector which detects the face in the stored video frame using a skin color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural network based and eigenimage based automatic pain recognition systems. The experimental results indicate that using a support vector machine as classifier can certainly improve the performance of an automatic pain recognition system.
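    As an illustrative sketch only (not the authors' implementation), the snippet below trains an RBF-kernel SVM on hypothetical location/shape feature vectors; upstream face detection and feature computation are assumed to have produced X and y.

```python
# Sketch: SVM classification stage for pain recognition. The feature
# matrix is random stand-in data; real features come from the detector.
import numpy as np
from sklearn.svm import SVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 12))    # 12 location/shape features per frame
y = rng.integers(0, 2, size=200)  # 1 = pain expression, 0 = neutral

clf = make_pipeline(StandardScaler(), SVC(kernel='rbf', C=1.0))
print(cross_val_score(clf, X, y, cv=5).mean())
```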

  11. AUTOMATIC WEB SCRAPPING USING VISUAL SELECTORS

    Directory of Open Access Journals (Sweden)

    Rashmi Bhosale

    2015-12-01

    Full Text Available The amount of information that is currently available on the net grows at a very fast pace, thus the web can be considered the largest knowledge repository ever developed and made available to the public. A web data extraction system is a system that extracts data from web pages automatically. Web data analysis applications, such as extracting mutual funds information from a website or extracting the daily opening and closing prices of stocks from a web page, involve web data extraction. Early techniques constructed wrappers to visit those sites and collect data, which is time-consuming. Thus a technique called Automatic Web Scrapping Using Visual Selectors (AWSUVS) is proposed. For selected data sections, AWSUVS discovers extraction patterns automatically. AWSUVS uses visual cues to identify data records while ignoring noise items such as advertisements and navigation bars.
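    AWSUVS itself is not publicly available; as a rough analogue of selector-driven extraction that discards noise items, the sketch below uses BeautifulSoup with a hypothetical URL and hypothetical CSS class names.

```python
# Sketch: extract repeated data records from a page while dropping
# navigation bars and adverts. URL and selectors are hypothetical.
import requests
from bs4 import BeautifulSoup

html = requests.get("https://example.com/funds").text
soup = BeautifulSoup(html, "html.parser")
for noise in soup.select("nav, .advert"):  # remove noise items first
    noise.decompose()
records = [row.get_text(" ", strip=True) for row in soup.select(".fund-row")]
print(records)
```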

  12. An image-based automatic mesh generation and numerical simulation for a population-based analysis of aerosol delivery in the human lungs

    Science.gov (United States)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2013-11-01

    The authors propose a method to automatically generate three-dimensional subject-specific airway geometries and meshes for computational fluid dynamics (CFD) studies of aerosol delivery in the human lungs. The proposed method automatically expands computed tomography (CT)-based airway skeleton to generate the centerline (CL)-based model, and then fits it to the CT-segmented geometry to generate the hybrid CL-CT-based model. To produce a turbulent laryngeal jet known to affect aerosol transport, we developed a physiologically-consistent laryngeal model that can be attached to the trachea of the above models. We used Gmsh to automatically generate the mesh for the above models. To assess the quality of the models, we compared the regional aerosol distributions in a human lung predicted by the hybrid model and the manually generated CT-based model. The aerosol distribution predicted by the hybrid model was consistent with the prediction by the CT-based model. We applied the hybrid model to 8 healthy and 16 severe asthmatic subjects, and average geometric error was 3.8% of the branch radius. The proposed method can be potentially applied to the branch-by-branch analyses of a large population of healthy and diseased lungs. NIH Grants R01-HL-094315 and S10-RR-022421, CT data provided by SARP, and computer time provided by XSEDE.

  13. Using airborne LiDAR in geoarchaeological contexts: Assessment of an automatic tool for the detection and the morphometric analysis of grazing archaeological structures (French Massif Central).

    Science.gov (United States)

    Roussel, Erwan; Toumazet, Jean-Pierre; Florez, Marta; Vautier, Franck; Dousteyssier, Bertrand

    2014-05-01

    Airborne laser scanning (ALS) of archaeological regions of interest is nowadays a widely used and established method for accurate topographic and microtopographic survey. The penetration of the vegetation cover by the laser beam allows the reconstruction of reliable digital terrain models (DTMs) of forested areas where traditional prospection methods are inefficient, time-consuming and non-exhaustive. The ALS technology provides the opportunity to discover new archaeological features hidden by vegetation and provides a comprehensive survey of cultural heritage sites within their environmental context. However, the post-processing of LiDAR point clouds produces a huge quantity of data in which relevant archaeological features are not easily detectable with common visualizing and analysing tools. Undoubtedly, there is an urgent need for automation of structure detection and morphometric extraction techniques, especially for the "archaeological desert" in densely forested areas. This presentation deals with the development of automatic detection procedures applied to archaeological structures located in the French Massif Central, in the western forested part of the Puy-de-Dôme volcano between 950 and 1100 m a.s.l. These unknown archaeological sites were discovered by the March 2011 ALS mission and display a high density of subcircular depressions with corridor access. The spatial organization of these depressions varies from isolated to aggregated or aligned features. Functionally, they appear to be former grazing constructions built from the medieval to the modern period. Similar grazing structures are known in other locations of the French Massif Central (Sancy, Artense, Cézallier) where the ground is vegetation-free. In order to develop a reliable process of automatic detection and mapping of these archaeological structures, a learning zone has been delineated within the ALS-surveyed area, in which the grazing features were mapped and typical morphometric attributes...
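    One plausible core for such a detector (an assumption, not the authors' tool) is a sink-fill difference on the DTM: closed depressions are the pixels where a morphologically filled surface stands above the terrain, and simple morphometric attributes can then be reported per candidate structure.

```python
# Sketch: detect sub-circular depressions in a LiDAR DTM by differencing
# a sink-filled surface against the terrain. Thresholds are hypothetical.
import numpy as np
from skimage.morphology import reconstruction
from skimage.measure import label, regionprops

def detect_depressions(dtm, min_depth=0.3, min_area_px=20):
    seed = np.copy(dtm)
    seed[1:-1, 1:-1] = dtm.max()                 # flood from the borders
    filled = reconstruction(seed, dtm, method='erosion')
    depth = filled - dtm                         # >0 inside closed depressions
    mask = depth > min_depth
    candidates = []
    for r in regionprops(label(mask)):
        if r.area < min_area_px:
            continue
        depths = depth[r.coords[:, 0], r.coords[:, 1]]
        candidates.append({'centroid': r.centroid,
                           'area_px': int(r.area),
                           'max_depth_m': float(depths.max()),
                           'eccentricity': r.eccentricity})
    return candidates
```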

  14. Dynamics of structures '89. Vol. 1 and 2

    International Nuclear Information System (INIS)

    The proceedings, comprising 3 volumes published by the Plzen Centre of the Czechoslovak Society for Science and Technology (Vol. 1 and 2) and by Skoda Works in Plzen (Vol. 3), contain 107 papers, out of which 8 fall within the INIS Subject Scope; these deal with problems related to the earthquake resistance of nuclear power plants. Attention is paid to the evaluation of seismic characteristics of nuclear power plant equipment, to the equipment testing and to calculations of its dynamic characteristics under simulated seismic stress. (Z.M.)

  15. Proceedings of the 1. Arabic conference on chemical applications (Chemia 2). Vol. 2

    International Nuclear Information System (INIS)

    The conference on chemical applications was held on 1-5 November 1997 in Cairo. This second volume contains papers on chemical applications concerning nuclear materials.

  16. Automatic radioactive waste recycling

    International Nuclear Information System (INIS)

    The production of a plutonium ingot by calcium reduction process at CEA/Valduc generates a residue called 'slag'. This article introduces the recycling unit which is dedicated to the treatment of slags. The aim is to separate and to recycle the plutonium trapped in this bulk on the one hand, and to generate a disposable waste from the slag on the other hand. After a general introduction of the facilities, some elements will be enlightened, particularly the dissolution step, the filtration and the drying equipment. Reflections upon technological constraints will be proposed, and the benefits of a fully automatic recycling unit of nuclear waste will also be stressed. (authors)

  17. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol, which aims at synchronizing the clocks of computers in a network and keeping the accuracy and validity of the time information transmitted in the network. Without an automatic configuration mechanism, the stability and flexibility of a synchronization network built upon the NTP protocol are not satisfying. P2P's resource discovery mechanism is used to look for time sources in a synchronization network, and according to the network environment and node quality, the synchronization network is constructed dynamically.

  18. Proceedings of the second international conference on environmental impact assessment of all economical activities. Vol. 1

    International Nuclear Information System (INIS)

    Proceedings of the conference consist of 3 volumes: Vol. 1 - 'Environmental Impact Assessment of all Economical Activities including Industry'; Vol. 2 - 'Air Pollution Control and Prevention'; Vol. 3 - Waste Management and Environmental Problems in Construction Industry'. Out of 32 papers contained in Vol. 1, 2 were inputted to INIS. They deal with models of radionuclide transport in food chains and the use of aerial monitoring in the study of environmental contamination. (Z.S.)

  19. Proceedings of the second international conference on environmental impact assessment of all economical activities. Vol. 2

    International Nuclear Information System (INIS)

    Proceedings of the conference consist of 3 volumes: Vol. 1 - 'Environmental Impact Assessment of all Economical Activities including Industry'; Vol. 2 - 'Air Pollution Control and Prevention'; Vol. 3 - Waste Management and Environmental Problems in Construction Industry'. Out of 32 papers contained in Vol. 2, 4 were inputted to INIS. They deal with nuclear fusion as a potential energy source, with environmental aspects of disposal of ashes from power plants in the Czech Republic, and with land reclamation after mining activities. (Z.S.)

  20. Upgrade of the Automatic Analysis System in the TJ-II Thomson Scattering Diagnostic: New Image Recognition Classifier and Fault Condition Detection

    International Nuclear Information System (INIS)

    Full text of publication follows: An automatic image classification system has been in operation for years in the TJ-II Thomson diagnostic. It recognizes five different types of images: CCD camera background, measurement of stray light without plasma or in a collapsed discharge, image during ECH phase, image during NBI phase and image after reaching the cut-off density during ECH heating. Each kind of image implies the execution of different application software. Therefore, the classification system was developed to launch the corresponding software in an automatic way. The method to recognize the several classes was based on a learning system, in particular Support Vector Machines (SVM). Since the first implementation of the classifier, a relevant improvement has been accomplished in the diagnostic: a new notch filter is in operation, having a larger stray-light rejection at the ruby wavelength than the previous filter. On the other hand, its location in the optical system has been modified. As a consequence, the stray-light pattern in the CCD image is located in a different position. In addition to these transformations, the power of the neutral beams injected in the TJ-II plasma has been increased by about a factor of 2. Due to the fact that the recognition system is based on a learning system and major modifications have been carried out in both the diagnostic (optics) and TJ-II plasmas (injected power), the classifier model is no longer valid. The creation of a new model (also based on SVM) under the present conditions has been necessary. Finally, specific error conditions in the data acquisition process can now automatically be detected. The recovery process can be automated, thereby avoiding the loss of data in ensuing discharges. (authors)

  1. Automatic readout micrometer

    International Nuclear Information System (INIS)

    A measuring system is disclosed for surveying and very accurately positioning objects with respect to a reference line. A principal use of this surveying system is for accurately aligning the electromagnets which direct a particle beam emitted from a particle accelerator. Prior-art surveying systems require highly skilled surveyors. They include, for example, optical surveying systems, which are susceptible to operator reading errors, and celestial-navigation-type surveying systems, with their inherent complexities. The present invention provides an automatic readout micrometer which can very accurately measure distances. The invention has a simplicity of operation which practically eliminates the possibility of operator optical reading error, owing to the elimination of traditional optical alignments for making measurements. The invention has an extendable arm which carries a laser surveying target. The extendable arm can be continuously positioned over its entire length of travel by either a coarse or a fine adjustment, without having the fine adjustment outrun the coarse adjustment, until a reference laser beam is centered on the target as indicated by a digital readout. The length of the micrometer can then be accurately and automatically read by a computer and compared with a standardized set of alignment measurements. Due to its construction, the micrometer eliminates any errors due to temperature changes when the system is operated within a standard operating temperature range

  2. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand-and-shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company-controlled boundaries. Investigation of the commercially available portal and hand-and-shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  3. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    Science.gov (United States)

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated with complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery. PMID:24473345

  4. Automatic extraction of left ventricle in SPECT myocardial perfusion imaging

    International Nuclear Information System (INIS)

    An automatic method of extracting the left ventricle from SPECT myocardial perfusion data is introduced. The method is based on a least-squares analysis of the positions of all short-axis slice pixels against a half sphere-cylinder myocardial model, and uses an iterative reconstruction technique to automatically cut off the non-left-ventricular tissue from the perfusion images. This technique thereby provides the basis for further quantitative analysis

  5. JAPS: an automatic parallelizing system based on JAVA

    Institute of Scientific and Technical Information of China (English)

    杜建成; 陈道蓄; 谢立

    1999-01-01

    JAPS is an automatic parallelizing system based on JAVA running on NOW. It implements the automatic process from dependence analysis to parallel execution. The current version of JAPS can exploit functional parallelism and the detection of data parallelism will be incorporated in the new version, which is underway. The framework and key techniques of JAPS are presented. Specific topics discussed are task partitioning, summary information collection, data dependence analysis, pre-scheduling and dynamic scheduling, etc.

  6. Comment on "Why reduced-form regression models of health effects versus exposures should not replace QRA: livestock production and infant mortality as an example," by Louis Anthony (Tony) Cox, Jr., Risk Analysis 2009, Vol. 29, No. 12.

    Science.gov (United States)

    Sneeringer, Stacy

    2010-04-01

    While a recent paper by Cox in this journal uses as its motivating factor the benefits of quantitative risk assessment, its content is entirely devoted to critiquing Sneeringer's article in the American Journal of Agricultural Economics. Cox's two main critiques of Sneeringer are fundamentally flawed and misrepresent the original article. Cox posits that Sneeringer did A and B, and then argues why A and B are incorrect. However, Sneeringer in fact did C and D; thus critiques of A and B are not applicable to Sneeringer's analysis. PMID:20345577

  7. Quality and completeness of risk analyses. Vol. 1

    International Nuclear Information System (INIS)

    The program described was started in 1974 at Risoe National Laboratory. The motivation was criticism then being directed at the Reactor Safety Study WASH-1400, and the view that if risk analysis were to have a future as a scientific study, its procedures would need to be verified. The material described is a record of a prolonged set of experiments and experiences, and this second edition of the report includes an update of the research to cover a study of 35 risk analyses reviewed and checked between 1988 and 1992. A survey is presented of the ways in which incompleteness, lacunae, and oversights arise in risk analysis. Areas where detailed knowledge of disturbance causes and consequences is needed include alarm priority setting, suppression of nuisance alarms and status annunciation signals, advanced shutdown system design and runback systems, alarm and disturbance analysis, automatic plant supervision, disturbance diagnosis and testing support, and monitoring of safe operation margins. Despite improvements in risk analysis technique, in safety management, and in safety design, there will always be problems of ignorance, of material properties, of chemical reactions, of behavioural patterns and safety decision making, which leave plants vulnerable and hazardous. (AB) (24 refs.)

  8. Automatic Speaker Recognition System

    Directory of Open Access Journals (Sweden)

    Parul, R. B. Dubey

    2012-12-01

    Full Text Available Spoken language is used by humans to convey many types of information. Primarily, speech conveys messages via words. Owing to advanced speech technologies, people's interactions with remote machines, such as phone banking, internet browsing, and secured information retrieval by voice, are becoming popular today. Speaker verification and speaker identification are important for authentication and verification for security purposes. Speaker identification methods can be divided into text-independent and text-dependent. Speaker recognition is the process of automatically recognizing a speaker's voice on the basis of individual information included in the input speech waves. It consists of comparing a speech signal from an unknown speaker to a set of stored data of known speakers. This process recognizes who has spoken by matching the input signal with pre-stored samples. The work focuses on improving the performance of speaker verification under noisy conditions.

  9. Automatic Wall Painting Robot

    Directory of Open Access Journals (Sweden)

    P. Keerthanaa, K. Jeevitha, V. Navina, G. Indira, S. Jayamani

    2013-07-01

    Full Text Available The primary aim of the project is to design, develop and implement an automatic wall painting robot which helps to achieve low-cost painting equipment. Despite the advances in robotics and its wide-spreading applications, interior wall painting has shared little in research activities. The painting chemicals can cause hazards to the human painters such as eye and respiratory system problems. Also, the nature of the painting procedure, which requires repeated work and hand raising, makes it boring, time and effort consuming. When construction workers and robots are properly integrated in building tasks, the whole construction process can be better managed and savings in human labour and timing are obtained as a consequence. In addition, it would offer the opportunity to reduce or eliminate human exposure to difficult and hazardous environments, which would solve most of the problems connected with safety when many activities occur at the same time. These factors motivate the development of an automated robotic painting system.

  10. Automatic alkaloid removal system.

    Science.gov (United States)

    Yahaya, Muhammad Rizuwan; Hj Razali, Mohd Hudzari; Abu Bakar, Che Abdullah; Ismail, Wan Ishak Wan; Muda, Wan Musa Wan; Mat, Nashriyah; Zakaria, Abd

    2014-01-01

    This automated alkaloid-removal machine was developed at the Instrumentation Laboratory, Universiti Sultan Zainal Abidin, Malaysia, purposely for removing alkaloid toxicity from Dioscorea hispida (DH) tubers. DH is a poisonous plant whose tubers have been scientifically shown to contain the toxic alkaloid dioscorine; the tubers can only be consumed after the poison is removed. In this experiment, the tubers must be blended into powder form before being placed in the machine basket. The user pushes the START button on the machine controller to switch the water pump on, creating a turbulent wave of water in the machine tank. The water flow stops automatically when the outlet solenoid valve is triggered. The tuber powder is washed for 10 minutes while 1 liter of toxin-contaminated water flows out. The controller then automatically triggers the inlet solenoid valve, and fresh water flows into the tank until it reaches the desired level, as determined by an ultrasonic sensor. This process is repeated for 7 h; a positive, significant result was achieved according to several biological parameters: pH, temperature, dissolved oxygen, turbidity, conductivity and fish survival rate or time. These parameters were close to, or the same as, those of the control water, and the toxin is assumed to be fully removed when the pH of the DH powder wash is near that of the control water. The control water pH is about 5.3, the water from this process is 6.0, and before the machine is run the pH of the contaminated water is about 3.8, which is too acidic. This automated machine saves time in removing toxicity from DH compared with the traditional method, while requiring less observation by the user. PMID:24783795

  11. Unsupervised Threshold for Automatic Extraction of Dolphin Dorsal Fin Outlines from Digital Photographs in DARWIN (Digital Analysis and Recognition of Whale Images on a Network)

    CERN Document Server

    Hale, Scott A

    2012-01-01

    At least two software packages---DARWIN, Eckerd College, and FinScan, Texas A&M---exist to facilitate the identification of cetaceans---whales, dolphins, porpoises---based upon the naturally occurring features along the edges of their dorsal fins. Such identification is useful for biological studies of population, social interaction, migration, etc. The process whereby fin outlines are extracted in current fin-recognition software packages is manually intensive and represents a major user-input bottleneck: it is both time consuming and visually fatiguing. This research aims to develop automated methods (employing unsupervised thresholding and morphological processing techniques) to extract cetacean dorsal fin outlines from digital photographs, thereby reducing manual user input. Ideally, automatic outline generation will improve the overall user experience and improve the ability of the software to correctly identify cetaceans. Various transformations from color to gray space were examined to determine whi...
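    A minimal sketch of such a pipeline, under the assumptions of Otsu thresholding and a fin darker than the surrounding water (the thesis examines several color-to-gray transformations; only one is shown):

```python
# Sketch: unsupervised fin-outline extraction via automatic thresholding,
# morphological clean-up, and contour tracing of the largest object.
from skimage import io, color, filters, morphology, measure

def extract_fin_outline(path):
    gray = color.rgb2gray(io.imread(path))
    mask = gray < filters.threshold_otsu(gray)        # dark fin on light water
    mask = morphology.binary_opening(mask, morphology.disk(3))
    mask = morphology.remove_small_objects(mask, min_size=500)
    contours = measure.find_contours(mask.astype(float), 0.5)
    return max(contours, key=len)                     # longest outline = fin edge
```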

  12. Automatic classification of defects in weld pipe

    International Nuclear Information System (INIS)

    With the advancement of computer imaging technology, the image on hard radiographic film can be digitized and stored in a computer, and the manual process of defect recognition and classification may be replaced by the computer. In this paper a computerized method for automatic detection and classification of common defects in film radiography of weld pipe is described. The detection and classification processes consist of automatic selection of the area of interest on the image and then classification of common defects using image processing and special algorithms. Analysis of the attributes of each defect, such as area, size, shape and orientation, is carried out by the feature analysis process. These attributes reveal the type of each defect. This method of defect classification results in a high success rate. Our experience showed that sharp film images produced better results

  13. Automatic Classification of Attacks on IP Telephony

    Directory of Open Access Journals (Sweden)

    Jakub Safarik

    2013-01-01

    Full Text Available This article proposes an algorithm for automatic analysis of attack data in an IP telephony network with a neural network. Data for the analysis is gathered from various monitoring applications running in the network. These monitoring systems are a typical part of today's networks, but information from them is usually used only after an attack. Automatic classification of IP telephony attacks makes near real-time classification and counter-attack, or mitigation of potential attacks, possible. The classification uses the proposed neural network; the article covers the design of the neural network and its practical implementation. It also contains methods for neural network learning and data-gathering functions from a honeypot application.
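    The article's exact topology and feature set are not reproduced here; as a hedged sketch, a small scikit-learn multilayer perceptron trained on hypothetical traffic-feature vectors captures the classification idea:

```python
# Sketch: neural-network classification of attack events. Feature
# vectors and class labels are random stand-ins for honeypot data.
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 8))     # 8 traffic features per event
y = rng.integers(0, 3, size=300)  # e.g. scan / flood / SPIT classes

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, y)
print(net.score(X, y))
```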

  14. Automatic stratification of well logging curves based on principal component analysis and Mahalanobis distance

    Institute of Scientific and Technical Information of China (English)

    涂必超; 杨枫林

    2012-01-01

    Principal component analysis combined with discriminant analysis is applied to stratify well logging curves automatically. First, data preprocessing is performed on all wells, and the common curves are extracted by programmatic comparison of character strings. Then, principal component analysis of the processed data is implemented in R; the principal components replace the original data so as to reduce dimensionality. Finally, distance-based discriminant analysis is performed using the stratification of a standard well as the training sample, the remaining wells are stratified automatically, and the result is compared with manual stratification to give the final stratified result.
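    A compact sketch of this flow, written in Python rather than the paper's R, with the standard well's manual stratification as training data; all inputs are hypothetical stand-ins.

```python
# Sketch: PCA for dimension reduction, then Mahalanobis-distance
# discriminant analysis trained on a manually stratified standard well.
import numpy as np
from sklearn.decomposition import PCA

def mahalanobis_stratify(X_train, y_train, X_new, n_components=3):
    pca = PCA(n_components=n_components).fit(X_train)
    Z_train, Z_new = pca.transform(X_train), pca.transform(X_new)
    strata = np.unique(y_train)
    stats = [(Z_train[y_train == c].mean(axis=0),
              np.linalg.inv(np.cov(Z_train[y_train == c], rowvar=False)))
             for c in strata]                 # per-stratum mean and inverse covariance
    def d2(z, mu, icov):                      # squared Mahalanobis distance
        v = z - mu
        return float(v @ icov @ v)
    return np.array([strata[np.argmin([d2(z, mu, ic) for mu, ic in stats])]
                     for z in Z_new])         # nearest stratum per new sample
```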

  15. Keystone feasibility study. Final report. Vol. 4

    Energy Technology Data Exchange (ETDEWEB)

    1982-12-01

    Volume four of the Keystone coal-to-methanol project includes the following: (1) project management; (2) economic and financial analyses; (3) market analysis; (4) process licensing and agreements; and (5) appendices. 24 figures, 27 tables.

  16. Chemical analysis of Pinus oocarpa wood part I: quantification of macromolecular components and volatile extractives

    Directory of Open Access Journals (Sweden)

    Sérgio Antônio Lemos de Morais

    2005-06-01

    Full Text Available The chemical composition of Pinus oocarpa wood cultivated in the Brazilian cerrado was established. The results were: α-cellulose (59.05%), hemicelluloses A and B (21.22%), lignin (25.18%), dichloromethane extractives (2.78%), ethanol:toluene extractives (4.38%), hot-water extractives (4.31%) and ash (1.26%). The cellulose content was relatively high, which opens perspectives for using Pinus oocarpa wood in the pulp and paper industries. Most of the dichloromethane extractives were diterpenic, palmitic and oleic acids. The volatile fraction, obtained by the Clevenger method followed by GC-MS analysis, was constituted mainly of aromadendrene, ledane, hexadecanal and oleic acid.

  17. Automatic liquid-liquid extraction system

    International Nuclear Information System (INIS)

    This invention concerns an automatic liquid-liquid extraction system ensuring great reproducibility on a number of samples, stirring and decanting of the two liquid phases, then the quantitative removal of the entire liquid phase present in the extraction vessel at the end of the operation. This type of system has many applications, particularly in carrying out analytical processes comprising a stage for the extraction, by means of an appropriate solvent, of certain components of the sample under analysis

  18. Generating Semi-Markov Models Automatically

    Science.gov (United States)

    Johnson, Sally C.

    1990-01-01

    The Abstract Semi-Markov Specification Interface to the SURE Tool (ASSIST) program was developed to generate semi-Markov models automatically from a description in an abstract, high-level language. ASSIST reads an input file describing the failure behavior of a system in the abstract language and generates Markov models in the format needed for input to the Semi-Markov Unreliability Range Evaluator (SURE) program (COSMIC program LAR-13789). It facilitates analysis of the behavior of fault-tolerant computers. Written in PASCAL.

  19. Making automatic differentiation truly automatic: coupling PETSc with ADIC

    International Nuclear Information System (INIS)

    Despite its name, automatic differentiation (AD) is often far from an automatic process. Often one must specify independent and dependent variables, indicate the derivative quantities to be computed, and perhaps even provide information about the structure of the Jacobians or Hessians being computed. However, when AD is used in conjunction with a toolkit with well-defined interfaces, many of these issues do not arise. The authors describe recent research into coupling the ADIC automatic differentiation tool with PETSc, a toolkit for the parallel numerical solution of PDEs. This research leverages the interfaces and objects of PETSc to make the AD process very nearly transparent
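    The ADIC/PETSc coupling cannot be reproduced in a few lines, but the "truly automatic" idea can be illustrated with JAX as a stand-in: the user supplies only the function, and derivative code is generated without hand-annotating independent and dependent variables.

```python
# Sketch: automatic differentiation with no user-written derivative code.
# The residual function is a toy stand-in for a PDE nonlinear residual.
import jax
import jax.numpy as jnp

def residual(u):
    return jnp.sum(u**3 - jnp.roll(u, 1) * u)

grad_residual = jax.grad(residual)   # derivative generated automatically
print(grad_residual(jnp.ones(5)))
```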

  20. Composite materials: Fatigue and fracture. Vol. 3

    Science.gov (United States)

    O'Brien, T. K. (Editor)

    1991-01-01

    The present volume discusses topics in the fields of matrix cracking and delamination, interlaminar fracture toughness, delamination analysis, strength and impact characteristics, and fatigue and fracture behavior. Attention is given to cooling rate effects in carbon-reinforced PEEK, the effect of porosity on flange-web corner strength, mode II delamination in toughened composites, the combined effect of matrix cracking and free edge delamination, and a 3D stress analysis of plain weave composites. Also discussed are the compression behavior of composites, damage-based notched-strength modeling, fatigue failure processes in aligned carbon-epoxy laminates, and the thermomechanical fatigue of a quasi-isotropic metal-matrix composite.

  1. Empirical Research in Theatre, Vol 3.

    Science.gov (United States)

    Addington, David W., Ed.; Kepke, Allen N., Ed.

    This journal provides a focal point for the collection and distribution of systematically processed information about theory and practice in theatre. Part of an irregularly published series, this issue contains investigations of the application of transactional analysis to the theatre, the psychological effect of counterattitudinal acting in…

  2. Information Management and Market Engineering. Vol. II

    OpenAIRE

    Dreier, Thomas; Krämer, Jan; Studer, Rudi; Weinhardt, Christof; [Hrsg.

    2010-01-01

    The research program Information Management and Market Engineering focuses on the analysis and the design of electronic markets. Taking a holistic view of the conceptualization and realization of solutions, the research integrates the disciplines business administration, economics, computer science, and law. Topics of interest range from the implementation, quality assurance, and advancement of electronic markets to their integration into business processes and legal frameworks.

  3. Application Analysis of Live Line Tool Automatic Management System

    Institute of Scientific and Technical Information of China (English)

    曹国文; 蒋标

    2015-01-01

    In view of the complicated formalities of tool issue and return, the long time consumed, and the low work efficiency in the live-line tool storehouse of the Bayannur Electric Power Bureau, an automatic live-line tool management system (RFID) was adopted. The RFID system simplifies the procedure for checking tools in and out: staff simply carry tools fitted with radio-frequency tags through doors equipped with radio-frequency readers. The system records the staff member, the time, the names of the tools taken out (or returned) and their number, and automatically uploads and stores this information on the tool-warehouse computer. Staff can query issue, return and stock information on this computer, or on office computers through the private network, with no manual registration needed, which not only saves working time and improves work efficiency but also guarantees the safe use of the tools.

  4. Supersymmetric mechanics. Vol. 1. Supersymmetry, noncommutativity and matrix models

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, S. (ed.) [Istituto Nazionale di Fisica Nucleare, Rome (Italy)

    2006-07-01

    This is the first volume in a series of books on the general theme of Supersymmetric Mechanics; the series is based on lectures and discussions held in 2005 and 2006 at the INFN-Laboratori Nazionali di Frascati. The selected topics include supersymmetry and supergravity, the attractor mechanism, black holes, noncommutative mechanics, super-Hamiltonian formalism and matrix models. All lectures are intended for beginners at the graduate level and nonspecialists from related fields of research and a substantial effort was made to incorporate in the extensive write-ups the results of the animated discussion sessions which followed the individual lectures. A second volume appears as Lecture Notes Physics, Vol. 701 ''Supersymmetric Mechanics - Vol. 2: The Attractor Mechanism'' (2006), ISBN: 3-540-34156-0. (orig.)

  5. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher-energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  6. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Full Text Available Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. Particularly, this becomes crucial where the dialects are mutually unintelligible. Therefore, to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects. It is written using several different scripts, and it lacks a standard orthography. This situation makes Kurdish dialect identification more interesting and more needed, both from the research and from the application perspectives. In this research, we have applied a classification method, based on supervised machine learning, to identify the dialects of Kurdish texts. The research has focused on the two most widely spoken and dominant Kurdish dialects, namely Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well. The method is also applicable to languages which are similar to Kurdish in their dialectal diversity and differences.
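    The paper's exact features and classifier are not restated here; as a hedged sketch, a common supervised setup for dialect identification (character n-gram TF-IDF with a linear SVM) looks like the following, with placeholder strings standing in for real Kurmanji and Sorani samples.

```python
# Sketch: supervised dialect identification. The training texts below
# are placeholders; a real corpus of labeled sentences is assumed.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

texts = ["kurmanji sample sentence one", "kurmanji sample sentence two",
         "sorani sample sentence one", "sorani sample sentence two"]
labels = ["kurmanji", "kurmanji", "sorani", "sorani"]

clf = make_pipeline(
    TfidfVectorizer(analyzer='char_wb', ngram_range=(2, 4)),  # script-level cues
    LinearSVC())
clf.fit(texts, labels)
print(clf.predict(["another unseen sentence"]))
```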

  7. Index to Nuclear Safety. A technical progress review by chronology, permuted title, and author. Vol. 11, No. 1 through Vol. 15, No. 6

    International Nuclear Information System (INIS)

    This issue of the Index to Nuclear Safety covers only articles included in Nuclear Safety, Vol. 11, No. 1, through Vol. 15, No. 6. This index is presented in three sections as follows: Chronological List of Articles by Volume; Permuted Title (KWIC) Index; and Author Index. (U.S.)

  8. Kinematic Analysis of Incomplete Gear Automatic Reverse Mechanism

    Institute of Scientific and Technical Information of China (English)

    王猛; 李长春

    2012-01-01

    By exploiting the characteristics of the incomplete gear mechanism, a gear-type automatic reversing device is designed in which the input shaft rotates continuously in one direction while the output shaft rotates alternately forward and backward, with the ratio of input-shaft speed to output-shaft speed kept at a fixed value. The length of the line of action of the first and last teeth of the incomplete gear is analysed; with the contact ratio guaranteed equal to one, an addendum-modification method is applied to avoid interference in the incomplete gear transmission.

  9. Exercises in dental radiology. Vol. 3

    International Nuclear Information System (INIS)

    The book is addressed to paediatric dentists and other dentists who have children among their patients; it presents a survey of normal and pathological development of teeth and surrounding tissues. Imaging errors, eruption problems, anomalies, the radiological picture of primary and secondary crowding during eruption, analysis of the deciduous teeth, teleradiography, traumas and temporomandibular diseases are discussed. Each chapter contains questions concerning the interpretation of the radiological findings. (orig./MG)

  10. Information Management and Market Engineering. [Vol. I

    OpenAIRE

    Dreier, Thomas; Studer, Rudi; Weinhardt, Christof

    2006-01-01

    The research program "Information Management and Market Engineering" focuses on the analysis and the design of electronic markets. Taking a holistic view of the conceptualization and realization of solutions, the research integrates the disciplines business administration, economics, computer science, and law. Topics of interest range from the implementation, quality assurance, and further development of electronic markets to their integration into business processes, innovative...

  11. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity. This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  12. JOYO reactor technology progress report, vol. 5

    International Nuclear Information System (INIS)

    This report summarizes the technical progress made by the PNC Experimental Fast Reactor Division in the period April through June 1981. The main contents are as follows. 3.1 Radiation Shielding Performance Test: Radiation rates in and around the containment vessel were measured at up to 75 MW power without radiation shield concrete blocks above the reactor pit. 3.2 Measurement and Analysis of the Reactor Core Characteristics: A power coefficient, a burn-up coefficient and an excess reactivity were measured in the fourth and fifth 75 MW duty operation cycles. 3.3 Transient Response Test: A transient response test was conducted in the fourth 75 MW duty operation cycle. 3.4 Operational Data Banking System: The operational data banking system is ready for service. 3.5 Test Plan of Natural Circulation: A natural circulation test is planned to confirm decay heat removal capability by natural circulation. 3.6 Impurity Control and Analysis: Results of the impurity control and analysis are shown for the sodium coolant and argon cover gas for the fourth 75 MW duty operation cycle. 3.7 Surveillance Tests. 3.8 Operation Planning. 8. Special Article: A Long-Range Plan of Reactor Technology Section as of 1980. (author)

  13. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    OpenAIRE

    Raymond Mui; Phyllis Frankl

    2010-01-01

    Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and progra...

  14. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Directory of Open Access Journals (Sweden)

    Siham AMROUCH

    2013-11-01

    Full Text Available In the last decade, ontologies have played a key technological role for information sharing and agent interoperability in different application domains. In the semantic web domain, ontologies are efficiently used to face the great challenge of representing the semantics of data, in order to bring the actual web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, mapping ontologies is a solution that is not to be avoided. Indeed, ontology mapping builds a meta-layer that allows different applications and information systems to access and share their information, of course after resolving the different forms of syntactic, semantic and lexical mismatches. In the contribution presented in this paper, we have integrated the semantic aspect based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character is the main difference between our contribution and most of the existing semi-automatic algorithms for ontology mapping, such as Chimaera, Prompt, Onion, Glue, etc. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyses the concepts' names and the latter analyses their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
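    The name-analysis sub-module can be sketched with NLTK's WordNet interface (an assumption; the paper does not specify its API): two concept names are scored by the best pairwise Wu-Palmer similarity over their candidate senses.

```python
# Sketch: lexical/semantic similarity of two concept names via WordNet.
# Requires: import nltk; nltk.download('wordnet')
from nltk.corpus import wordnet as wn

def name_similarity(name_a, name_b):
    syns_a, syns_b = wn.synsets(name_a), wn.synsets(name_b)
    if not syns_a or not syns_b:
        return 0.0   # name unknown to WordNet: no lexical evidence
    return max((a.wup_similarity(b) or 0.0)
               for a in syns_a for b in syns_b)

print(name_similarity('car', 'automobile'))  # synonyms share a synset -> 1.0
```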

  15. Development of a System for Automatic Recognition of Speech

    Directory of Open Access Journals (Sweden)

    Michal Kuba

    2003-01-01

    Full Text Available The article gives a review of research on processing and automatic recognition of speech signals (ARR) at the Department of Telecommunications of the Faculty of Electrical Engineering, University of Žilina. On-going research is oriented to speech parametrization using 2-dimensional cepstral analysis, and to the application of HMMs and neural networks for speech recognition in the Slovak language. The article summarizes the achieved results and outlines the future orientation of our research in automatic speech recognition.
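
    The record does not spell out the 2-dimensional cepstral parametrization, so the sketch below shows only the standard one-dimensional real cepstrum that such parametrizations build on, in NumPy; the 16 kHz rate, 25 ms frames and 13 retained coefficients are illustrative assumptions.

        # Hedged sketch: framewise real cepstrum as a speech parametrization ingredient.
        import numpy as np

        def real_cepstrum(frame: np.ndarray) -> np.ndarray:
            # cepstrum = inverse FFT of the log magnitude spectrum
            spectrum = np.fft.rfft(frame * np.hanning(len(frame)))
            log_mag = np.log(np.abs(spectrum) + 1e-10)  # epsilon avoids log(0)
            return np.fft.irfft(log_mag)

        signal = np.random.randn(16000)      # stand-in for 1 s of speech at 16 kHz
        frames = signal.reshape(-1, 400)     # 25 ms frames, no overlap for brevity
        features = np.array([real_cepstrum(f)[:13] for f in frames])  # low quefrencies
        print(features.shape)                # (40, 13) feature matrix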

  16. Savannah River Site production reactor safety analysis report. Vol. VIII

    International Nuclear Information System (INIS)

    The Savannah River Site (SRS) production reactors are unique in their methods of charging and discharging fuel assemblies, target assemblies, and other components. All components are charged and discharged in the air using remotely operated precision cranes. The following sections describe the systems used to store, charge, discharge, and disassemble reactor components and assemblies, and the redundant systems designed to ensure the reliability of crane cooling and control systems

  17. Neutron Diffraction Texture Analysis of 1 Vol.% Cu in Aluminium

    OpenAIRE

    Brokmeier, H.-G.; Bunge, H. J.

    1988-01-01

    Neutron diffraction texture determination was carried out in two-phase composites, AlCu1%, prepared by powder metallurgical methods and extruded by 84 and 89%, respectively. The texture of the minor Cu-phase can be measured with a sufficient degree of accuracy within reasonable time. This texture is weaker than the texture of the Al-matrix phase, corresponding to the lower internal deformation degree of the former. It is estimated that the method can be extended to even lower volume fractions ...

  18. Semi-automatic drawings surveying system

    International Nuclear Information System (INIS)

    A system for the semi-automatic survey of drawings is presented. Its design is oriented to reducing the stored information required for drawing reproduction. The equipment consists mainly of a plotter driven by a micro-computer, with the pen of the plotter replaced by a circular photodiode array. Line drawings are first viewed as a concatenation of vectors, with a constant angle between consecutive vectors, and then divided into arcs of circles and line segments. A dynamic analysis of line intersections with the circular sensor permits identification of starting points and end points of a line, so that connected lines in a drawing can be followed automatically. The advantage of the described method is that precision depends practically only on the plotter performance; the sensor resolution matters only for the thickness of strokes and the distance between two strokes. (author)
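
    One plausible reading of the vector-concatenation step, sketched in Python under the assumption that a digitized line is available as a point sequence: runs of near-zero turning angle become line segments, runs of constant non-zero turning angle become circular arcs. The tolerance value is an illustrative assumption.

        # Hedged sketch: split a digitized line into arcs and segments by turning angle.
        import numpy as np

        def split_arcs_and_segments(points: np.ndarray, tol: float = 0.02):
            v = np.diff(points, axis=0)                # consecutive short vectors
            angles = np.arctan2(v[:, 1], v[:, 0])
            turns = np.diff(angles)                    # turning angle between vectors
            primitives, start = [], 0
            for i in range(1, len(turns)):
                if abs(turns[i] - turns[start]) > tol: # turning rate changed: new primitive
                    kind = "segment" if abs(turns[start]) < tol else "arc"
                    primitives.append((kind, start, i))
                    start = i
            kind = "segment" if abs(turns[start]) < tol else "arc"
            primitives.append((kind, start, len(turns)))
            return primitives

        theta = np.linspace(0, np.pi / 2, 20)
        quarter_circle = np.c_[np.cos(theta), np.sin(theta)]   # constant turning angle
        print(split_arcs_and_segments(quarter_circle))         # -> one "arc" primitive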

  19. Retrotrapezoid nucleus, respiratory chemosensitivity and breathing automaticity

    Science.gov (United States)

    Guyenet, Patrice G.; Bayliss, Douglas A.; Stornetta, Ruth L.; Fortuna, Michal G.; Abbott, Stephen B.; Depuy, Seth D.

    2009-01-01

    SUMMARY Breathing automaticity and CO2 regulation are inseparable neural processes. The retrotrapezoid nucleus (RTN), a group of glutamatergic neurons that express the transcription factor Phox2b, may be a crucial nodal point through which breathing automaticity is regulated to maintain CO2 constant. This review updates the analysis presented in prior publications. Additional evidence that RTN neurons have central respiratory chemoreceptor properties is presented but this is only one of many factors that determine their activity. The RTN is also regulated by powerful inputs from the carotid bodies and, at least in the adult, by many other synaptic inputs. We also analyze how RTN neurons may control the activity of the downstream central respiratory pattern generator. Specifically, we review the evidence which suggests that RTN neurons a) innervate the entire ventral respiratory column, and b) control both inspiration and expiration. Finally, we argue that the RTN neurons are the adult form of the parafacial respiratory group in neonate rats. PMID:19712903

  20. Automatic modulation recognition of communication signals

    CERN Document Server

    Azzouz, Elsayed Elsayed

    1996-01-01

    Automatic modulation recognition is a rapidly evolving area of signal analysis. In recent years, interest from academic and military research institutes has focused on the research and development of modulation recognition algorithms. Any communication intelligence (COMINT) system comprises three main blocks: receiver front-end, modulation recogniser and output stage. Considerable work has been done in the area of receiver front-ends. The work at the output stage is concerned with information extraction, recording and exploitation, and begins with signal demodulation, which requires accurate knowledge of the signal modulation type. There are two main reasons for knowing the current modulation type of a signal: to preserve the signal information content and to decide upon a suitable counter-action, such as jamming. Automatic Modulation Recognition of Communications Signals describes this modulation recognition process in depth. Drawing on several years of research, the authors provide a cr...
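
    One classic decision feature from this literature can be sketched briefly: gamma_max, the maximum spectral power density of the normalized-and-centred instantaneous amplitude, which helps separate amplitude-modulated from angle-modulated signals. The toy signals and parameters below are illustrative assumptions.

        # Hedged sketch: the gamma_max feature computed from the analytic signal.
        import numpy as np
        from scipy.signal import hilbert

        def gamma_max(x: np.ndarray) -> float:
            a = np.abs(hilbert(x))             # instantaneous amplitude
            a_cn = a / a.mean() - 1.0          # normalized and centred
            return np.max(np.abs(np.fft.fft(a_cn)) ** 2) / len(x)

        t = np.arange(10_000) / 10_000         # 1 s at 10 kHz
        am = (1 + 0.5 * np.sin(2 * np.pi * 50 * t)) * np.cos(2 * np.pi * 1000 * t)
        fm = np.cos(2 * np.pi * 1000 * t + 5 * np.sin(2 * np.pi * 50 * t))
        print(gamma_max(am), gamma_max(fm))    # AM scores far higher than FM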

  1. Channel selection for automatic seizure detection

    DEFF Research Database (Denmark)

    Duun-Henriksen, Jonas; Kjaer, Troels Wesenberg; Madsen, Rasmus Elsborg; Remvig, Line Sofie; Thomsen, Carsten Eckhart; Sørensen, Helge Bjarup Dissing

    2012-01-01

    Objective: To investigate the performance of epileptic seizure detection using only a few of the recorded EEG channels and the ability of software to select these channels compared with a neurophysiologist. Methods: Fifty-nine seizures and 1419 h of interictal EEG are used for training and testing of an automatic channel selection method. The characteristics of the seizures are extracted by the use of a wavelet analysis and classified by a support vector machine. The best channel selection method is based upon maximum variance during the seizure. Results: Using only three channels, a seizure ... recorded directly on the epileptic focus. Conclusions: Based on our dataset, automatic seizure detection can be done using only three EEG channels without loss of performance. These channels should be selected based on maximum variance and not, as often done, using the focal channels. Significance: With...
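
    The described pipeline is compact enough to sketch: keep the three highest-variance channels during the seizure, take wavelet sub-band energies as features, and classify with a support vector machine. The synthetic data, db4 wavelet, decomposition level and window length below are illustrative assumptions, not the study's settings.

        # Hedged sketch: variance-based channel selection + wavelet features + SVM.
        import numpy as np
        import pywt
        from sklearn.svm import SVC

        def select_channels(eeg: np.ndarray, n: int = 3) -> np.ndarray:
            # eeg: (channels, samples) during a seizure; keep n highest-variance channels
            return np.argsort(eeg.var(axis=1))[-n:]

        def wavelet_features(window: np.ndarray) -> np.ndarray:
            # log energy of each wavelet sub-band for every selected channel
            coeffs = pywt.wavedec(window, "db4", level=4, axis=-1)
            return np.concatenate([np.log((c ** 2).sum(axis=-1) + 1e-9) for c in coeffs])

        rng = np.random.default_rng(0)
        eeg = rng.standard_normal((25, 512))             # toy 25-channel EEG window
        chans = select_channels(eeg)
        X = wavelet_features(eeg[chans]).reshape(1, -1)
        clf = SVC().fit(np.vstack([X, X + 1.0]), [1, 0]) # placeholder training pair
        print(chans, clf.predict(X))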

  2. IJIMAI Editor's Note - Vol. 3 Issue 5

    Directory of Open Access Journals (Sweden)

    Rubén Gonzalez-Crespo

    2015-12-01

    Full Text Available The research works presented in this issue are based on various topics of interest, among which are included: DSL, Machine Learning, Information hiding, Steganography, SMA, RTECTL, SMT-based bounded model checking, STS, Spatial sound, X3D, X3DOM, Web Audio API, Web3D, Real-time, Realistic 3D, 3D Audio, Apache Wave, API, Collaborative, Pedestrian Inertial, Navigation System, Indoor Location, Learning Algorithms, Information Fusion, Agile development, Scrum, Cross Functional Teams, Knowledge Transfer, Technological Innovation, Technology Transfer, Social Networks Analysis, Project Management, Links in Social Networks, Rights of Knowledge Sharing and Web 2.0.

  3. Exercises in dental radiology. Vol. 1

    Energy Technology Data Exchange (ETDEWEB)

    Langlais, R.P.; Kasle, M.J.

    1980-01-01

    With concise questions for diagnosis and differential diagnosis, the authors present 298 radiographs, completed by notes from the anamnesis and other findings. Five fields are discussed: X-ray anatomy, film faults, identification of materials and foreign bodies, anomalies, pathologic alterations and localisation tasks. Without any doubt the voluminous collection of pathologic findings is the most important part, but the other chapters also give numerous valuable hints. They all lead to precise and systematic analysis and careful interpretation of radiographs. In many cases the answers in the appendix provide, through additional explanations, information on technical details, radiopathologic peculiarities and the mode of diagnosis.

  4. Exercises in dental radiology. Vol. 1

    International Nuclear Information System (INIS)

    With concise questions for diagnosis and differential diagnosis, the authors present 298 radiographs, completed by notes from the anamnesis and other findings. Five fields are discussed: X-ray anatomy, film faults, identification of materials and foreign bodies, anomalies, pathologic alterations and localisation tasks. Without any doubt the voluminous collection of pathologic findings is the most important part, but the other chapters also give numerous valuable hints. They all lead to precise and systematic analysis and careful interpretation of radiographs. In many cases the answers in the appendix provide, through additional explanations, information on technical details, radiopathologic peculiarities and the mode of diagnosis. (orig./MG)

  5. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  6. Soil-structure interaction Vol.2. Influence of lift-off

    International Nuclear Information System (INIS)

    This study has been performed for the Nuclear Regulatory Commission (NRC) by the Structural Analysis Division of Brookhaven National Laboratory (BNL). The study was conducted during the fiscal year 1985 under the program entitled 'Benchmarking of Structural Engineering Problems' sponsored by NRC. The program considered three separate but complementary problems, each associated with the soil-structure interaction (SSI) phase of the seismic response analysis of nuclear plant facilities. The reports are presented in three separate volumes under the general title 'Soil Structure Interaction', with the following subtitles: Vol. 1 Influence of Layering by A.J. Philippacopoulos, Vol. 2 Influence of Lift-Off by C.A. Miller, Vol. 3 Influence of Ground Water by C.J. Costantino. The two problems presented in Volumes 2 and 3 were conducted at the City University of New York (CUNY) under subcontract to BNL. This report, Volume 2, summarizes the work performed to define the influence of lift-off on the seismic response of nuclear power plant structures. The standard lumped-parameter analysis method was modified by representing the lumped soil/structure interaction horizontal and rocking dampers with springs and dampers distributed over the foundation area. The distributed springs and dampers are then modified so that they can only transmit compressive stresses. Additional interaction damping is included to account for the energy dissipated as a portion of the foundation which has separated comes back into contact with the soil. The validity of the model is evaluated by comparing its predictions with data measured during the SIMQUAKE II experiment. The predictions were found to correlate quite well with the measured data except for some discrepancies at the higher frequencies (greater than 10 cps), attributed to the relatively crude model used for impact effects. Data is presented which identifies the peak
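
    The core modification, interaction springs and dampers that transmit only compression so the foundation can lift off, reduces to a few lines; the parameter values below are illustrative assumptions.

        # Hedged sketch: a compression-only Winkler spring-damper at one foundation point.
        def contact_stress(k: float, c: float, u: float, v: float) -> float:
            """k: subgrade modulus, c: damping; u: penetration (>0 in contact), v: its rate."""
            if u <= 0.0:
                return 0.0           # separated: no tension transmitted across the gap
            return k * u + c * v     # in contact: compressive spring-damper stress

        # a foundation edge during rocking: alternating contact and lift-off
        for u in (0.002, 0.0005, -0.001):
            print(contact_stress(k=5e7, c=2e5, u=u, v=0.0))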

  7. Vision-based industrial automatic vehicle classifier

    Science.gov (United States)

    Khanipov, Timur; Koptelov, Ivan; Grigoryev, Anton; Kuznetsova, Elena; Nikolaev, Dmitry

    2015-02-01

    The paper describes an automatic video-stream-based motor vehicle classification system. The system determines vehicle type at payment collection plazas on toll roads. Classification is performed in accordance with a preconfigured set of rules which determine type by the number of wheel axles, vehicle length, height over the first axle and full height. These characteristics are calculated using various computer vision algorithms: contour detectors, correlational analysis, fast Hough transform, Viola-Jones detectors, connected components analysis, elliptic shape detectors and others. Input data contain video streams and induction loop signals. Output signals are vehicle enter and exit events, vehicle type, motion direction, speed and the above-mentioned features.
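
    The final rule-based stage lends itself to a short sketch once the vision algorithms have produced the four measurements; the class names and thresholds below are illustrative assumptions, not the system's configured rule set.

        # Hedged sketch: rule-based vehicle typing from measured features.
        from dataclasses import dataclass

        @dataclass
        class Measurement:
            axles: int
            length_m: float
            height_over_first_axle_m: float
            full_height_m: float

        def classify(m: Measurement) -> str:
            if m.axles >= 3:
                return "heavy_truck"
            if m.full_height_m < 2.0 and m.length_m < 6.0:
                return "car"
            if m.height_over_first_axle_m >= 1.3:
                return "bus_or_light_truck"
            return "unclassified"

        print(classify(Measurement(axles=2, length_m=4.4,
                                   height_over_first_axle_m=0.9, full_height_m=1.5)))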

  8. Railway Automatic Ticketing and Gate Monitoring System based on big data analysis

    Institute of Scientific and Technical Information of China (English)

    王成; 史天运

    2015-01-01

    This article proposes the general framework of the Railway Automatic Ticketing and Gate Monitoring System (RATGS). The system consists of 4 layers: the infrastructure layer, the management layer, the analysis layer and the application layer. The system introduces technologies such as multidimensional data analysis, distributed file system storage and MapReduce computation, complex event processing (CEP) and data mining to implement fault early warning, failure rate analysis, equipment utilization analysis and business optimization analysis, as well as value-added services based on passenger behavior analysis such as OD hotspot analysis, abnormal passenger recognition and equipment usability analysis. All of these point out a new direction for the future development of RATGS.
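
    One analysis-layer task, per-device failure rates, can be sketched as a toy map/reduce-style aggregation; the record format, device names and event types are illustrative assumptions.

        # Hedged sketch: failure-rate aggregation over gate event logs.
        from collections import Counter

        events = [                           # (device_id, event_type) log stand-ins
            ("gate-01", "ok"), ("gate-01", "fault"), ("gate-02", "ok"),
            ("gate-01", "ok"), ("gate-02", "fault"), ("gate-02", "fault"),
        ]

        counts = Counter(events)             # "map" one count per pair, "reduce" by summing
        for device in sorted({d for d, _ in events}):
            total = sum(n for (d, _), n in counts.items() if d == device)
            faults = counts.get((device, "fault"), 0)
            print(device, f"failure rate = {faults / total:.2f}")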

  9. Analysis of carotid artery plaque and wall boundaries on CT images by using a semi-automatic method based on level set model

    International Nuclear Information System (INIS)

    The purpose of this study was to evaluate the potential of a semi-automated technique for the detection and measurement of carotid artery plaque. Twenty-two consecutive patients (18 males, 4 females; mean age 62 years) examined with MDCTA from January 2011 to March 2011 were included in this retrospective study. Carotid arteries were examined with a 16-multi-detector-row CT system, and for each patient the most diseased carotid was selected. In the first phase, the carotid plaque was identified and one experienced radiologist manually traced the inner and outer boundaries using the polyline distance method (PDM) and the radial distance method (RDM). In the second phase, the carotid inner and outer boundaries were traced with an automated algorithm: the level set method (LSM). Data were compared by using Pearson rho correlation, Bland-Altman analysis, and regression. A total of 715 slices were analyzed. The mean thickness of the plaque using the reference PDM was 1.86 mm, whereas using the LSM-PDM it was 1.96 mm; using the reference RDM it was 2.06 mm, whereas using the LSM-RDM it was 2.03 mm. The correlation values between the references, the LSM, the PDM and the RDM were 0.8428, 0.9921, 0.745 and 0.6425. Bland-Altman demonstrated a very good agreement, in particular with the RDM method. The results of our study indicate that the LSM can automatically measure the thickness of the plaque and that the best results are obtained with the RDM. Our results suggest that advanced computer-based algorithms can identify and trace the plaque boundaries like an experienced human reader. (orig.)
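
    The study's own level set model is not reproduced in the record; purely as an illustration of automated boundary tracing, the sketch below runs scikit-image's morphological Chan-Vese variant on a synthetic bright ring standing in for a vessel wall.

        # Hedged sketch: level-set-style segmentation with morphological Chan-Vese.
        import numpy as np
        from skimage.segmentation import morphological_chan_vese

        yy, xx = np.mgrid[:128, :128]
        r = np.hypot(yy - 64, xx - 64)
        image = ((r > 20) & (r < 35)).astype(float)   # synthetic ring on dark background

        # evolve the level set for 100 iterations from the default checkerboard init
        mask = morphological_chan_vese(image, 100, smoothing=2)
        print(mask.shape, int(mask.sum()))            # binary segmentation of the ring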

  10. Analysis of Mobile Internet Platform Automatic Question-answering Mode

    Institute of Scientific and Technical Information of China (English)

    郭毅; 涂婧璐

    2015-01-01

    In the process of promoting quality-oriented teaching reform in schools, after-class question answering is an essential part of teaching. With the popularity of Internet applications, online question answering has become a necessary complement to classroom teaching. At present, however, most online Q&A is limited by environment and by the timeliness of answers, which dampens students' interest in learning. Our solution applies the traditional online Q&A approach to mobile Internet devices such as smartphones and automatically answers submitted questions. This frees the Q&A from time, space and environmental constraints, expands the students' learning space, and can greatly stimulate and promote students' interest in learning.

  11. On-line dynamic fractionation and automatic determination of inorganic phosphorus in environmental solid substrates exploiting sequential injection microcolumn extraction and flow injection analysis

    International Nuclear Information System (INIS)

    Sequential injection microcolumn extraction (SI-MCE) based on the implementation of a soil-containing microcartridge as an external reactor in a sequential injection network is, for the first time, proposed for dynamic fractionation of macronutrients in environmental solids, as exemplified by the partitioning of inorganic phosphorus in agricultural soils. The on-line fractionation method capitalises on the accurate metering and sequential exposure of the various extractants to the solid sample by application of programmable flow as precisely coordinated by a syringe pump. Three different soil phase associations for phosphorus, that is, exchangeable, Al- and Fe-bound, and Ca-bound fractions, were elucidated by accommodation in the flow manifold of the three steps of the Hieltjes-Lijklema (HL) scheme involving the use of 1.0 M NH4Cl, 0.1 M NaOH and 0.5 M HCl, respectively, as sequential leaching reagents. The precise timing and versatility of SI for tailoring various operational extraction modes were utilized for investigating the extractability and the extent of phosphorus re-distribution for variable partitioning times. Automatic spectrophotometric determination of soluble reactive phosphorus in soil extracts was performed by a flow injection (FI) analyser based on the Molybdenum Blue (MB) chemistry. The 3σ detection limit was 0.02 mg P L-1 while the linear dynamic range extended up to 20 mg P L-1 regardless of the extracting media. Despite the variable chemical composition of the HL extracts, a single FI set-up was assembled with no need for either manifold re-configuration or modification of the chemical composition of reagents. The mobilization of trace elements, such as Cd, often present in grazed pastures as a result of the application of phosphate fertilizers, was also explored in the HL fractions by electrothermal atomic absorption spectrometry

  12. Prospects for de-automatization.

    Science.gov (United States)

    Kihlstrom, John F

    2011-06-01

    Research by Raz and his associates has repeatedly found that suggestions for hypnotic agnosia, administered to highly hypnotizable subjects, reduce or even eliminate Stroop interference. The present paper sought unsuccessfully to extend these findings to negative priming in the Stroop task. Nevertheless, the reduction of Stroop interference has broad theoretical implications, both for our understanding of automaticity and for the prospect of de-automatizing cognition in meditation and other altered states of consciousness. PMID:20356765

  13. Process automatization in system administration

    OpenAIRE

    Petauer, Janja

    2013-01-01

    The aim of the thesis is to present the automation of user management in the company Studio Moderna. The company has grown exponentially in recent years, which is why we needed to find a faster, easier and cheaper way of managing user accounts. We automated the processes of creating, changing and removing user accounts within Active Directory. We prepared a user interface inside an existing application, used JavaScript for drop-down menus, and wrote a script in a scripting programming langu...
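
    As a sketch of what such scripted account management can look like against a directory service, the snippet below uses the ldap3 library; the server address, credentials, DN layout and attribute values are invented placeholders, not the company's setup.

        # Hedged sketch: creating a directory user with ldap3 (all values are placeholders).
        from ldap3 import ALL, Connection, Server

        server = Server("ldaps://dc.example.local", get_info=ALL)
        conn = Connection(server, user="EXAMPLE\\svc_admin", password="secret",
                          auto_bind=True)

        conn.add(
            "CN=Jane Doe,OU=Users,DC=example,DC=local",
            object_class=["top", "person", "organizationalPerson", "user"],
            attributes={"sAMAccountName": "jdoe", "givenName": "Jane", "sn": "Doe"},
        )
        print(conn.result["description"])   # "success" when the entry was created
        conn.unbind()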

  14. Automatic Number Plate Recognition System

    OpenAIRE

    Rajshree Dhruw; Dharmendra Roy

    2014-01-01

    Automatic Number Plate Recognition (ANPR) is a mass surveillance system that captures images of vehicles and recognizes their license numbers. The objective is to design an efficient automatic authorized-vehicle identification system using Indian vehicle number plates. In this paper we discuss different methodologies for number plate localization, character segmentation and recognition of the number plate. The system is mainly applicable to non-standard Indian number plates by recognizing...
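
    A rough sketch of the three stages, plate localization, character segmentation and recognition, using OpenCV plus Tesseract OCR; the thresholds, aspect-ratio test and input filename are illustrative assumptions, and real non-standard plates need far more robust handling.

        # Hedged sketch: minimal ANPR pipeline with OpenCV and pytesseract.
        import cv2
        import pytesseract

        image = cv2.imread("vehicle.jpg")                       # placeholder input
        gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
        edges = cv2.Canny(cv2.bilateralFilter(gray, 11, 17, 17), 30, 200)

        contours, _ = cv2.findContours(edges, cv2.RETR_LIST, cv2.CHAIN_APPROX_SIMPLE)
        for c in sorted(contours, key=cv2.contourArea, reverse=True)[:10]:
            x, y, w, h = cv2.boundingRect(c)
            if 2.0 < w / h < 6.0:                               # plate-like aspect ratio
                plate = gray[y:y + h, x:x + w]
                _, plate = cv2.threshold(plate, 0, 255,
                                         cv2.THRESH_BINARY + cv2.THRESH_OTSU)
                text = pytesseract.image_to_string(plate, config="--psm 7")
                print("candidate plate:", text.strip())
                break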

  15. Eating as an Automatic Behavior

    OpenAIRE

    Deborah A. Cohen, MD, MPH; Thomas A. Farley, MD, MPH

    2007-01-01

    The continued growth of the obesity epidemic at a time when obesity is highly stigmatizing should make us question the assumption that, given the right information and motivation, people can successfully reduce their food intake over the long term. An alternative view is that eating is an automatic behavior over which the environment has more control than do individuals. Automatic behaviors are those that occur without awareness, are initiated without intention, tend to continue without contr...

  16. IJIMAI Editor's Note - Vol. 2 Issue 7

    Directory of Open Access Journals (Sweden)

    Luis de-la-Fuente-Valentín

    2014-09-01

    Full Text Available This special issue, the Special Issue on Multisensor User Tracking and Analytics to Improve Education and Other Application Fields, concentrates on the practical and experimental use of data mining and analytics techniques, especially focusing on the educational area. The selected papers deal with the most relevant issues in the field, such as the integration of data from different sources, the identification of data suitable for the problem analysis, and the validation of the analytics techniques as support in the decision-making process. The application fields of the analytics techniques presented in this issue have a clear focus on the educational area (where Learning Analytics has emerged as a buzzword in recent years) but are not restricted to it. The result is a collection of use cases, experimental validations and analytics systems with a clear contribution to the state of the art.

  17. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    CERN Document Server

    Mui, Raymond; 10.4204/EPTCS.35.3

    2010-01-01

    Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.

  18. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation is used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
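
    ASSIST itself instruments Java web applications; the Python sqlite3 sketch below only illustrates the transformation being automated, from string-concatenated SQL to a parameterized query whose input can no longer alter the query structure.

        # Hedged sketch: the before/after of query sanitization, shown by hand.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT, secret TEXT)")
        conn.execute("INSERT INTO users VALUES ('alice', 's3cret')")

        user_input = "' OR '1'='1"   # classic injection payload

        # vulnerable: the payload rewrites the WHERE clause and matches every row
        rows = conn.execute("SELECT * FROM users WHERE name = '" + user_input + "'").fetchall()
        print("unsanitized:", rows)

        # sanitized: the payload is bound as a plain value and matches nothing
        rows = conn.execute("SELECT * FROM users WHERE name = ?", (user_input,)).fetchall()
        print("parameterized:", rows)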

  19. Analysis on Effect of Weld Profile on Submerged Arc Weld Automatic Ultrasonic Testing Results

    Institute of Scientific and Technical Information of China (English)

    桂光正

    2011-01-01

    The internal quality of double-sided submerged arc welds is mainly detected by an automatic ultrasonic testing system. This article introduces the pipe-body weld automatic ultrasonic testing equipment, imported from abroad and used in the Baosteel UOE production line, and its weld tracking principle. Through analysis of this system and of the structure of the sampled pipes used for ultrasonic testing, it explains the factors affecting the accuracy of ultrasonic testing results, such as the outside weld profile, the inside and outside weld width, center misalignment of the inside and outside welds, and anomalous weld profiles resulting from edge offset. Finally, it gives measures and methods for improving the ultrasonic testing accuracy for submerged-arc-welded pipe.

  20. Automatic Detection of Dominance and Expected Interest

    Directory of Open Access Journals (Sweden)

    M. Teresa Anguera

    2010-01-01

    Full Text Available Social Signal Processing is an emergent area of research that focuses on the analysis of social constructs. Dominance and interest are two of these social constructs. Dominance refers to the level of influence a person has in a conversation. Interest, when referred to in terms of group interactions, can be defined as the degree of engagement that the members of a group collectively display during their interaction. In this paper, we argue that, using only behavioral motion information, we are able to predict the interest of observers when looking at face-to-face interactions, as well as the dominant people. First, we propose a simple set of movement-based features from body, face, and mouth activity in order to define a higher-level set of interaction indicators. The considered indicators are manually annotated by observers. Based on the opinions obtained, we define an automatic binary dominance detection problem and a multiclass interest quantification problem. The Error-Correcting Output Codes framework is used to learn to rank the perceived observers' interest in face-to-face interactions, while AdaBoost is used to solve the dominance detection problem. The automatic system shows good correlation between the automatic categorization results and the manual ranking made by the observers in both the dominance and interest detection problems.
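
    Both learners named above are available off the shelf, so the wiring can be sketched on synthetic indicator features; the data, code size and estimator counts are illustrative assumptions, not the paper's configuration.

        # Hedged sketch: ECOC for multiclass interest, AdaBoost for binary dominance.
        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.multiclass import OutputCodeClassifier
        from sklearn.svm import LinearSVC

        rng = np.random.default_rng(0)
        X = rng.standard_normal((120, 6))         # body/face/mouth activity indicators
        interest = rng.integers(0, 4, size=120)   # four annotated interest levels
        dominant = rng.integers(0, 2, size=120)   # dominant speaker or not

        ecoc = OutputCodeClassifier(LinearSVC(), code_size=2, random_state=0).fit(X, interest)
        ada = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X, dominant)
        print(ecoc.predict(X[:5]), ada.predict(X[:5]))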