WorldWideScience

Sample records for automatic vol analysis

  1. The change of cerebral blood flow after heart transplantation in congestive heart failure: a voxel-based and automatic VOI analysis of Tc-99m ECD SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Hong, I. K.; Kim, J. J.; Lee, C. H.; Lim, K. C.; Moon, D. H.; Rhu, J. S.; Kim, J. S. [Asan Medical Center, Seoul (Korea, Republic of)]

    2007-07-01

    To investigate the change of global and regional cerebral blood flow after heart transplantation (HT) in congestive heart failure (CHF) patients. Twenty-one patients with CHF who underwent HT (45±12 yrs, M/F = 19/2) and 10 healthy volunteers (39±13 yrs, M/F = 7/3) were prospectively included. All patients underwent echocardiography and radionuclide angiography including brain and aorta, with brain SPECT performed after iv bolus injection of Tc-99m ECD (740 MBq) before (175±253 days) and after (129±82 days) HT. Patients were divided into two groups according to the interval between HT and postoperative SPECT [early follow-up (f/u): <6 mo, n=14; late f/u: >6 mo, n=7]. Global CBF (gCBF) of the bilateral hemispheres was calculated by Patlak graphical analysis. An absolute rCBF map was obtained from brain SPECT by Lassen's correction algorithm. Age-corrected voxel-based analysis using SPM2 and automatic VOI analysis were performed to assess the rCBF change. Cardiac ejection fraction of all patients improved after HT (20.8% → 64.0%). gCBF was reduced compared to normal before HT (35.7±3.9 vs. 49.1±3.0 ml/100g/min; p<0.001) and improved postoperatively (46.6±5.4, p<0.001). The preoperative gCBFs of the early and late f/u groups were not different (34.6±3.2 vs. 38.0±4.4, p=0.149), but the postoperative gCBF of the late f/u group (52.0±4.0) was higher than that of the early f/u group (43.9±3.7) (p<0.001). On voxel-based analysis, preoperative rCBF was reduced in the entire brain but most severely in the bilateral superior and inferior frontal cortex, supplementary motor area, precuneus and anterior cingulum, compared to normals (uncorrected p<0.001). After HT, rCBF of these areas improved more in the late f/u group than in the early f/u group but remained lower than in normals. Global CBF was significantly reduced in CHF patients and improved after HT. rCBF of the frontal cortex, precuneus and cingulum was most severely reduced and slowly improved after

  2. A background to risk analysis. Vol. 3

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 3 contains chapters on quantification of risk, failure and accident probability, risk analysis and design, and examples of risk analysis for process plant. (BP)

  3. A background to risk analysis. Vol. 1

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 1 contains a short history of risk analysis, and chapters on risk, failures, errors and accidents, and general procedures for risk analysis. (BP)

  4. A background to risk analysis. Vol. 2

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 2 treats generic methods of qualitative failure analysis. (BP)

  5. A background to risk analysis. Vol. 4

    International Nuclear Information System (INIS)

    This four-volume report gives a background of ideas, principles, and examples which might be of use in developing practical methods for risk analysis. Some of the risk analysis techniques described are somewhat experimental. The report is written in an introductory style, but where some point needs further justification or evaluation, this is given in the form of a chapter appendix. In this way, it is hoped that the report can serve two purposes: as a basis for starting risk analysis work and as a basis for discussing the effectiveness of risk analysis procedures. The report should be seen as a preliminary stage, prior to a program of industrial trials of risk analysis methods. Vol. 4 treats human error in plant operation. (BP)

  6. Automatic Complexity Analysis

    DEFF Research Database (Denmark)

    Rosendahl, Mads

    1989-01-01

    One way to analyse programs is to derive expressions for their computational behaviour. A time bound function (or worst-case complexity) gives an upper bound for the computation time as a function of the size of the input. We describe a system to derive such time bounds automatically using abstract...
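A minimal sketch of the idea in the abstract (not the paper's abstract-interpretation system): a derived worst-case time bound T(n) = n(n-1)/2 key comparisons for insertion sort, checked empirically against an instrumented implementation on its worst-case (reversed) input.

```python
# Illustrative assumption: insertion sort stands in for the analysed program.

def insertion_sort_count(xs):
    """Sort a copy of xs; return (sorted list, number of key comparisons)."""
    a = list(xs)
    comparisons = 0
    for i in range(1, len(a)):
        j = i
        while j > 0:
            comparisons += 1                    # one key comparison
            if a[j - 1] > a[j]:
                a[j - 1], a[j] = a[j], a[j - 1]
                j -= 1
            else:
                break
    return a, comparisons

def time_bound(n):
    """Derived worst-case bound: n*(n-1)/2 comparisons."""
    return n * (n - 1) // 2

n = 8
worst = list(range(n, 0, -1))                   # reversed input is the worst case
_, cost = insertion_sort_count(worst)
assert cost == time_bound(n)                    # the bound is tight here
```

The automatic system described in the record derives such bounds symbolically rather than by measurement; the instrumentation above only illustrates what the bound asserts.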

  7. An automatic visual analysis system for tennis

    OpenAIRE

    Connaghan, Damien; Moran, Kieran; O'Connor, Noel E.

    2013-01-01

    This article presents a novel video analysis system for coaching tennis players of all levels, which uses computer vision algorithms to automatically edit and index tennis videos into meaningful annotations. Existing tennis coaching software lacks the ability to automatically index a tennis match into key events, and therefore, a coach who uses existing software is burdened with time-consuming manual video editing. This work aims to explore the effectiveness of a system to automatically de...

  8. Automatic malware analysis an emulator based approach

    CERN Document Server

    Yin, Heng

    2012-01-01

    Malicious software (i.e., malware) has been a severe threat to interconnected computer systems for decades and causes billions of dollars in damage each year. A large volume of new malware samples is discovered daily. Even worse, malware is rapidly evolving, becoming more sophisticated and evasive in order to strike against current malware analysis and defense systems. Automatic Malware Analysis presents a virtualized malware analysis framework that addresses common challenges in malware analysis. Building on this new analysis framework, a series of analysis techniques for automatic malware analysis...

  9. Automatic analysis of multiparty meetings

    Indian Academy of Sciences (India)

    Steve Renals

    2011-10-01

    This paper is about the recognition and interpretation of multiparty meetings captured as audio, video and other signals. This is a challenging task since the meetings consist of spontaneous and conversational interactions between a number of participants: it is a multimodal, multiparty, multistream problem. We discuss the capture and annotation of the Augmented Multiparty Interaction (AMI) meeting corpus, the development of a meeting speech recognition system, and systems for the automatic segmentation, summarization and social processing of meetings, together with some example applications based on these systems.

  10. An Automatic Hierarchical Delay Analysis Tool

    Institute of Scientific and Technical Information of China (English)

    Farid Mheir-El-Saadi; Bozena Kaminska

    1994-01-01

    The performance analysis of VLSI integrated circuits (ICs) with flat tools is slow and sometimes even impossible to complete. Some hierarchical tools have been developed to speed up the analysis of these large ICs. However, these hierarchical tools suffer from poor interaction with the CAD database and poorly automated operations. We introduce a general hierarchical framework for performance analysis to solve these problems. Circuit analysis is automatic under the proposed framework. Information that has been automatically abstracted in the hierarchy is kept in database properties along with the topological information. A limited software implementation of the framework, PREDICT, has also been developed to analyze delay performance. Experimental results show that hierarchical analysis CPU time and memory requirements are low if heuristics are used during the abstraction process.

  11. Automatic emotional expression analysis from eye area

    Science.gov (United States)

    Akkoç, Betül; Arslan, Ahmet

    2015-02-01

    Eyes play an important role in expressing emotions in nonverbal communication. In the present study, emotional expression classification was performed based on features that were automatically extracted from the eye area. First, the face area and the eye area were automatically extracted from the captured image. Afterwards, the parameters to be used for the analysis were obtained from the eye area through discrete wavelet transformation. Using these parameters, emotional expression analysis was performed through artificial intelligence techniques. As a result of the experimental studies, six universal emotions consisting of expressions of happiness, sadness, surprise, disgust, anger and fear were classified at a success rate of 84% using artificial neural networks.
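A hedged sketch of the feature-extraction step the abstract names: one level of a 2-D Haar wavelet decomposition, the simplest discrete wavelet transform. The 4x4 "eye patch" and the choice of sub-bands as features are illustrative assumptions, not the paper's pipeline.

```python
def haar2d_level(img):
    """One orthonormal Haar step on an even-sized grayscale grid (list of lists).
    Returns the LL (approximation) and LH/HL/HH (detail) sub-bands."""
    h, w = len(img), len(img[0])
    LL, LH, HL, HH = [], [], [], []
    for r in range(0, h, 2):
        ll_row, lh_row, hl_row, hh_row = [], [], [], []
        for c in range(0, w, 2):
            a, b = img[r][c], img[r][c + 1]
            d, e = img[r + 1][c], img[r + 1][c + 1]
            ll_row.append((a + b + d + e) / 2)   # local average -> approximation
            lh_row.append((a - b + d - e) / 2)   # horizontal detail
            hl_row.append((a + b - d - e) / 2)   # vertical detail
            hh_row.append((a - b - d + e) / 2)   # diagonal detail
        LL.append(ll_row); LH.append(lh_row); HL.append(hl_row); HH.append(hh_row)
    return LL, LH, HL, HH

flat = [[10] * 4 for _ in range(4)]              # featureless patch
LL, LH, HL, HH = haar2d_level(flat)
assert all(v == 0 for row in LH + HL + HH for v in row)  # no detail energy
```

In a classifier pipeline, statistics of the detail sub-bands (energy, variance) would typically be fed to the neural network as the texture features.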

  12. Handbook of nuclear engineering: vol 1: nuclear engineering fundamentals; vol 2: reactor design; vol 3: reactor analysis; vol 4: reactors of waste disposal and safeguards

    CERN Document Server

    2013-01-01

    The Handbook of Nuclear Engineering is an authoritative compilation of information regarding methods and data used in all phases of nuclear engineering. Addressing nuclear engineers and scientists at all academic levels, this five-volume set provides the latest findings in nuclear data and experimental techniques, reactor physics, kinetics, dynamics and control. Readers will also find a detailed description of data assimilation, model validation and calibration, sensitivity and uncertainty analysis, fuel management and cycles, nuclear reactor types and radiation shielding. A discussion of radioactive waste disposal, safeguards and non-proliferation, and fuel processing with partitioning and transmutation is also included. As nuclear technology becomes an important source of non-polluting, sustainable energy in the future, The Handbook of Nuclear Engineering is an excellent reference for practicing engineers, researchers and professionals.

  13. Microprocessors in automatic chemical analysis

    International Nuclear Information System (INIS)

    The application of microprocessors to the programming and computation of automatic chemical analysis of solutions by a sequential technique is examined. Safety, performance and reliability are compared with those of other methods. An example is given of uranium titration by spectrophotometry.

  14. Automatic Prosodic Break Detection and Feature Analysis

    Institute of Scientific and Technical Information of China (English)

    Chong-Jia Ni; Ai-Ying Zhang; Wen-Ju Liu; Bo Xu

    2012-01-01

    Automatic prosodic break detection and annotation are important for both speech understanding and natural speech synthesis. In this paper, we discuss automatic prosodic break detection and feature analysis. The contributions of the paper are two-fold. One is that we use a classifier combination method to detect Mandarin and English prosodic breaks using acoustic, lexical and syntactic evidence. Our proposed method achieves better performance on both the Mandarin prosodic annotation corpus (Annotated Speech Corpus of Chinese Discourse) and the English prosodic annotation corpus (Boston University Radio News Corpus) when compared with the baseline system and other researchers' experimental results. The other is feature analysis for prosodic break detection. The functions of different features, such as duration, pitch, energy, and intensity, are analyzed and compared in Mandarin and English prosodic break detection. Based on the feature analysis, we also verify some linguistic conclusions.

  15. ANALYSIS METHOD OF AUTOMATIC PLANETARY TRANSMISSION KINEMATICS

    Directory of Open Access Journals (Sweden)

    Józef DREWNIAK

    2014-06-01

    In the present paper, a planetary automatic transmission is modeled by means of contour graphs. The goals of modeling are versatile: ratio calculation via algorithmic equation generation, and analysis of velocities and accelerations. Exemplary gear runs are analyzed; several drives/gears are consecutively taken into account, discussing functional schemes, assigned contour graphs, and the generated systems of equations and their solutions. The advantages of the method are its algorithmic approach and its generality, where particular drives are cases of the generally created model. Moreover, the method allows for further analysis and synthesis tasks, e.g., checking the isomorphism of design solutions.
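For context on what such ratio calculations produce, here is a hedged sketch of the classical Willis relation for a single planetary stage, which contour-graph methods generalise to whole transmissions. The tooth counts are illustrative assumptions.

```python
# Willis relation for one stage: (n_sun - n_carrier)/(n_ring - n_carrier) = -z_ring/z_sun.
# With the ring gear held fixed (n_ring = 0), solving for n_sun/n_carrier gives:

def planetary_ratio_sun_to_carrier(z_sun, z_ring):
    """Speed ratio n_sun / n_carrier with the ring gear fixed,
    sun as input and carrier as output."""
    return 1 + z_ring / z_sun

# A 30-tooth sun with a 90-tooth fixed ring gives a 4:1 reduction.
assert planetary_ratio_sun_to_carrier(30, 90) == 4.0
```

A contour-graph model would generate one such kinematic equation per contour automatically and solve the resulting system, rather than applying the closed form by hand.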

  16. Automatic abundance analysis of high resolution spectra

    CERN Document Server

    Bonifacio, P; Bonifacio, Piercarlo; Caffau, Elisabetta

    2003-01-01

    We describe an automatic procedure for determining abundances from high resolution spectra. Such procedures are becoming increasingly important as large amounts of data are delivered from 8m telescopes and their high-multiplexing fiber facilities, such as FLAMES on ESO-VLT. The present procedure is specifically targeted for the analysis of spectra of giants in the Sgr dSph; however, the procedure may be, in principle, tailored to analyse stars of any type. Emphasis is placed on the algorithms and on the stability of the method; the external accuracy rests, ultimately, on the reliability of the theoretical models (model-atmospheres, synthetic spectra) used to interpret the data. Comparison of the results of the procedure with the results of a traditional analysis for 12 Sgr giants shows that abundances accurate at the level of 0.2 dex, comparable with that of traditional analysis of the same spectra, may be derived in a fast and efficient way. Such automatic procedures are not meant to replace the traditional ...

  17. Semi-automatic analysis of fire debris

    Science.gov (United States)

    Touron; Malaquin; Gardebas; Nicolai

    2000-05-01

    Automated analysis of fire residues involves a strategy which deals with the wide variety of criminalistic samples received. Because of the unknown concentration of accelerant in a sample and the wide range of flammable products, full attention from the analyst is required. Primary detection with a photoionisation detector resolves the first problem by determining the right method to use: either the less sensitive classical head-space determination or absorption on an active charcoal tube, a method better suited to low concentrations. The latter method is suitable for automatic thermal desorption (ATD400), avoiding any risk of cross contamination. A PONA column (50 m x 0.2 mm i.d.) allows the separation of volatile hydrocarbons from C(1) to C(15) and the updating of a database. A specific second column is used for heavy hydrocarbons. Heavy products (C(13) to C(40)) were extracted from residues using a very small amount of pentane, concentrated to 1 ml at 50 degrees C and then placed on an automatic carousel. Comparison of flammables with reference chromatograms provided the expected identification, possibly using mass spectrometry. This analytical strategy belongs to the IRCGN quality program, resulting in the analysis of 1500 samples per year by two technicians. PMID:10802196

  18. Automatic analysis of distance bounding protocols

    CERN Document Server

    Malladi, Sreekanth; Kothapalli, Kishore

    2010-01-01

    Distance bounding protocols are used by nodes in wireless networks to calculate upper bounds on their distances to other nodes. However, dishonest nodes in the network can turn the calculations both illegitimate and inaccurate when they participate in protocol executions. It is important to analyze protocols for the possibility of such violations. Past efforts to analyze distance bounding protocols have only been manual. However, automated approaches are important since they are quite likely to find flaws that manual approaches cannot, as witnessed in literature for analysis pertaining to key establishment protocols. In this paper, we use the constraint solver tool to automatically analyze distance bounding protocols. We first formulate a new trace property called Secure Distance Bounding (SDB) that protocol executions must satisfy. We then classify the scenarios in which these protocols can operate considering the (dis)honesty of nodes and location of the attacker in the network. Finally, we extend the const...
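The quantity these protocols establish can be illustrated with a short sketch (this is the bound itself, not the constraint-solver analysis the record describes): a verifier times a challenge-response round trip and bounds the prover's distance by d <= c * (t_rtt - t_proc) / 2.

```python
# Illustrative assumption: signals propagate at the speed of light and the
# prover's claimed processing time t_proc is honest (dishonest values are
# exactly what the protocol analysis must rule out).

C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_upper_bound(t_rtt_s, t_proc_s=0.0):
    """Upper bound (metres) on the prover's distance, from the measured
    round-trip time and the claimed processing time, both in seconds."""
    return C * (t_rtt_s - t_proc_s) / 2

# A 2-microsecond round trip with negligible processing bounds d below ~300 m.
assert distance_upper_bound(2e-6) < 300
```

A dishonest node that understates its processing time inflates the apparent distance budget, which is one of the violations an automated trace-property check would search for.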

  19. Transboundary diagnostic analysis. Vol. 2. Background and environmental assessment

    OpenAIRE

    2012-01-01

    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in the BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; and pollution and water quality. Volume 2 contains background material that sets out the bio-physical and socio-economic characteristics of the BOBLME; an analysi...

  20. Experiment data acquisition and analysis system. Vol. 1

    International Nuclear Information System (INIS)

    The Experiment Data Acquisition and Analysis System EDAS was created to acquire and analyze data collected in experiments carried out at the heavy ion accelerator UNILAC. It has been available since 1975 and has become the most frequently used system for evaluating experiments at GSI. EDAS has undergone constant development, and the many enhancements make this completely revised third edition of the EDAS manual necessary. EDAS consists of two sub-systems: GOLDA for experimental data acquisition on PDP-11's and SATAN mainly for off-line analysis in replay mode on large IBM mainframes. The capacity of one IBM 3081 CPU is mainly dedicated to EDAS processing and is almost fully utilized by this application. More than 200 users from GSI as well as from collaborating laboratories and universities use SATAN in more than 100 sessions daily needing 10 to 20 hours of user CPU time. EDAS is designed as an open system. (orig./HSI)

  1. Transboundary diagnostic analysis. Vol. 1. Issues, proximate and root causes

    OpenAIRE

    2012-01-01

    The Transboundary Diagnostic Analysis (TDA) quantifies and ranks water-related environmental transboundary issues and their causes according to the severity of environmental and/or socio-economic impacts. The three main issues in the BOBLME are: overexploitation of marine living resources; degradation of mangroves, coral reefs and seagrasses; and pollution and water quality. Volume 1 describes the transboundary issues in the BOBLME and their proximate and underlying root causes. These will be used to deve...

  2. Handbook of nuclear data for neutron activation analysis. Vol. I

    International Nuclear Information System (INIS)

    The first part of a two-volume book which is meant for experimentalists working in instrumental activation analysis and related fields, such as nuclear metrology, materials testing and environmental studies. The volume describes the basic processes of gamma-ray interaction with matter as well as the important phenomena affecting gamma-ray spectra formation in semiconductor spectrometers. A brief account is also given of computation methods commonly employed for spectra evaluation. The results rather than detailed derivations are stressed. A great deal of material is divided into five chapters and nine appendices. The inclusion of many tables of significant spectroscopic data should make the text a useful handbook for those dealing with multi-channel gamma-ray spectra. (author) 26 figs., 82 tabs., 334 refs

  3. Automatic terrain modeling using transfinite element analysis

    KAUST Repository

    Collier, Nathaniel O.

    2010-05-31

    An automatic procedure for modeling terrain is developed based on L2 projection-based interpolation of discrete terrain data onto transfinite function spaces. The function space is refined automatically by the use of image processing techniques to detect regions of high error and the flexibility of the transfinite interpolation to add degrees of freedom to these areas. Examples are shown of a section of the Palo Duro Canyon in northern Texas.

  4. Towards automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, M.; Quist, M.; Spreeuwers, L.J.; Paetsch, I.; Al-Saadi, N.; Nagel, E.

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and reliable automatic image analysis methods. This paper focuses on the automatic evaluation of th

  5. Dynamic Analysis of a Pendulum Dynamic Automatic Balancer

    Directory of Open Access Journals (Sweden)

    Jin-Seung Sohn

    2007-01-01

    The automatic dynamic balancer is a device to reduce vibration from the unbalanced mass of rotors. Instead of the prevailing ball-type automatic dynamic balancer, a pendulum automatic dynamic balancer is analyzed. For the analysis of dynamic stability and behavior, the nonlinear equations of motion for the system are derived with respect to polar coordinates by Lagrange's equations. The perturbation method is applied to investigate the dynamic behavior of the system around the equilibrium position. Based on the linearized equations, the dynamic stability of the system around the equilibrium positions is investigated by eigenvalue analysis.
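A hedged sketch of the final step the abstract describes: judging stability of an equilibrium from the eigenvalues of the linearised equations. A single-degree-of-freedom linearisation x'' + c x' + k x = 0 is used here as an illustrative stand-in, not the paper's pendulum-balancer model.

```python
import cmath

def eigenvalues_2x2(k, c):
    """Eigenvalues of the state matrix [[0, 1], [-k, -c]] that arises from
    rewriting x'' + c x' + k x = 0 in first-order form."""
    disc = cmath.sqrt(c * c - 4 * k)
    return ((-c + disc) / 2, (-c - disc) / 2)

def is_asymptotically_stable(k, c):
    """Stable equilibrium iff every eigenvalue has a negative real part."""
    return all(ev.real < 0 for ev in eigenvalues_2x2(k, c))

assert is_asymptotically_stable(4.0, 0.5)       # damped: stable equilibrium
assert not is_asymptotically_stable(4.0, 0.0)   # undamped: only marginal
```

For the multi-degree-of-freedom balancer the state matrix is larger, but the criterion, negative real parts for all eigenvalues of the linearised system, is the same.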

  6. Automatic basal slice detection for cardiac analysis

    Science.gov (United States)

    Paknezhad, Mahsa; Marchesseau, Stephanie; Brown, Michael S.

    2016-03-01

    Identification of the basal slice in cardiac imaging is a key step to measuring the ejection fraction (EF) of the left ventricle (LV). Despite research on cardiac segmentation, basal slice identification is routinely performed manually. Manual identification, however, has been shown to have high inter-observer variability, with a variation of the EF by up to 8%. Therefore, an automatic way of identifying the basal slice is still required. Prior published methods operate by automatically tracking the mitral valve points from the long-axis view of the LV. These approaches assumed that the basal slice is the first short-axis slice below the mitral valve. However, guidelines published in 2013 by the Society for Cardiovascular Magnetic Resonance indicate that the basal slice is the uppermost short-axis slice with more than 50% myocardium surrounding the blood cavity. Consequently, these existing methods are at times identifying the incorrect short-axis slice. Correct identification of the basal slice under these guidelines is challenging due to the poor image quality and blood movement during image acquisition. This paper proposes an automatic tool that focuses on the two-chamber slice to find the basal slice. To this end, an active shape model is trained to automatically segment the two-chamber view for 51 samples using the leave-one-out strategy. The basal slice was detected using temporal binary profiles created for each short-axis slice from the segmented two-chamber slice. From the 51 successfully tested samples, 92% and 84% of detection results were accurate at the end-systolic and the end-diastolic phases of the cardiac cycle, respectively.

  7. Automatic Gait Recognition by Symmetry Analysis

    OpenAIRE

    Hayfron-Acquah, James B.; Nixon, Mark S.; Carter, John N.

    2001-01-01

    We describe a new method for automatic gait recognition based on analysing the symmetry of human motion, by using the Generalised Symmetry Operator. This operator, rather than relying on the borders of a shape or on general appearance, locates features by their symmetrical properties. This approach is reinforced by the psychologists' view that gait is a symmetrical pattern of motion and by other works. We applied our new method to two different databases and derived gait signatures for silhou...

  8. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 3, Practical Use

    DEFF Research Database (Denmark)

    Jensen, Kurt

    ... for operations that are used in many arc expressions. These modifications make the CP-nets more appropriate as study material, but they do not change the essential behaviour of the CPN models. The terminology in the original material has been modified to fit the terminology introduced in the first two volumes ... of the CPN models and some of the analysis results. This has been possible since Vols. 1 and 2 have given the readers a much more thorough knowledge of CP-nets than readers of ordinary research papers. Finally, it is discussed how some of the problems from the projects can be overcome or circumvented. Many ... of these problems have already been removed, e.g., by improvements of the CPN tools. Other problems can be avoided by a careful choice of modelling and analysis techniques. The material has been modified in cooperation with the original authors and the final result has been approved by them. The conclusions...

  9. Automatic learning strategies and their application to electrophoresis analysis

    OpenAIRE

    Roch, Christian Maurice; Pun, Thierry; Hochstrasser, Denis; Pellegrini, Christian

    1989-01-01

    Automatic learning plays an important role in image analysis and pattern recognition. A taxonomy of automatic learning strategies is presented; this categorization is based on the amount of inference the learning element must perform to bridge the gap between the environmental and system knowledge representation levels. Four main categories are identified and described: rote learning, learning by deduction, learning by induction, and learning by analogy. An application of learning by induction to...

  10. Automatic quantitative analysis of morphology of apoptotic HL-60 cells

    OpenAIRE

    Liu, Yahui; Lin, Wang; Yang, Xu; Liang, Weizi; Zhang, Jun; Meng, Maobin; Rice, John R.; Sa, Yu; Feng, Yuanming

    2014-01-01

    Morphological identification is a widespread procedure to assess the presence of apoptosis by visual inspection of morphological characteristics or fluorescence images. The procedure is lengthy and the results are observer-dependent. A quantitative automatic analysis is objective and would greatly help the routine work. We developed an image processing and segmentation method which combines Otsu thresholding and morphological operators for apoptosis study. An automatic determina...
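A hedged sketch of one ingredient the abstract names, Otsu thresholding, implemented directly on an 8-bit intensity histogram (the morphological clean-up step is omitted). The tiny bimodal "image" is an illustrative assumption, not the authors' data.

```python
def otsu_threshold(pixels):
    """Return the 8-bit threshold t maximising the between-class variance
    w0*w1*(m0-m1)^2 of the two classes {<= t} and {> t}."""
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = 0      # pixel count at or below t
    sum0 = 0    # intensity mass at or below t
    for t in range(256):
        w0 += hist[t]
        sum0 += t * hist[t]
        w1 = total - w0
        if w0 == 0 or w1 == 0:
            continue
        m0 = sum0 / w0
        m1 = (total_sum - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Bimodal data: dim background (~40) vs bright stained cells (~200).
img = [40] * 60 + [200] * 20
t = otsu_threshold(img)
assert 40 <= t < 200   # the threshold separates the two modes
```

In the pipeline the abstract describes, the resulting binary mask would then be cleaned with morphological operators (e.g., opening) before measuring per-cell morphology.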

  11. Automatic Facial Expression Analysis A Survey

    Directory of Open Access Journals (Sweden)

    C.P. Sumathi

    2013-01-01

    Automatic facial expression recognition has been an active research topic since the 1990s. There have been recent advances in face detection and in facial expression recognition and classification. There are multiple methods devised for facial feature extraction which help in identifying faces and facial expressions. This paper surveys some of the published work from 2003 to date. Various methods are analysed to identify facial expressions. The paper also discusses facial parameterization using Facial Action Coding System (FACS) action units and the methods which recognize the action-unit parameters from extracted facial expression data. Various kinds of facial expressions present in the human face can be identified based on their geometric features, appearance features and hybrid features. The two basic concepts of extracting features are based on facial deformation and facial motion. This article also identifies techniques based on the characteristics of expressions and classifies the suitable methods that can be implemented.

  12. Automatic analysis of signals during eddy current testing

    International Nuclear Information System (INIS)

    A method and the corresponding instrument have been developed for the automatic analysis of eddy current testing signals. This apparatus enables the simultaneous analysis, every 2 milliseconds, of two signals at two different frequencies. It can be used either on line with an eddy current testing instrument or with a magnetic tape recorder.

  13. Automatic Analysis of Critical Incident Reports: Requirements and Use Cases.

    Science.gov (United States)

    Denecke, Kerstin

    2016-01-01

    Increasingly, critical incident reports are used as a means to increase patient safety and quality of care. The full potential of these sources of experiential knowledge often remains unexploited, since retrieval and analysis are difficult and time-consuming, and the reporting systems often do not provide support for these tasks. The objective of this paper is to identify potential use cases for automatic methods that analyse critical incident reports. In more detail, we describe how faceted search could offer intuitive retrieval of critical incident reports and how text mining could support the analysis of relations among events. To realise an automated analysis, natural language processing needs to be applied. Therefore, we analyse the language of critical incident reports and derive requirements for automatic processing methods. We learned that there is huge potential for an automatic analysis of incident reports, but there are still challenges to be solved. PMID:27139389

  14. Automatic analysis of a skull fracture based on image content

    Science.gov (United States)

    Shao, Hong; Zhao, Hong

    2003-09-01

    Automatic analysis based on image content is an active and promising area of medical image diagnosis research. Analysis of skull fractures can help doctors diagnose. In this paper, a new approach is proposed to automatically detect skull fractures based on CT image content. First, a region-growing method, whose seeds and growing rules are chosen dynamically by k-means clustering, is applied for automatic image segmentation. The segmented region boundary is found by boundary tracing. Then the shape of the boundary is analyzed, and the circularity measure is taken as the description parameter. Finally, the rules for computer-automated diagnosis of skull fractures are derived by an entropy function. This method is used to analyze images from the layer below the third ventricle to the top layer of the cerebral cortex. Experimental results show a recognition rate of 100% for the 100 images, which were chosen randomly from a medical image database and were not included in the training examples. This method integrates color and shape features and is not affected by image size and position. This research achieves a high recognition rate and sets a basis for automatic analysis of brain images.
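A hedged sketch of the shape descriptor the abstract mentions: circularity 4*pi*A/P^2, which equals 1 for a perfect circle and decreases for less circular boundaries (an intact skull cross-section scores near 1; a fractured, irregular boundary scores lower). The shapes below are illustrative assumptions.

```python
import math

def circularity(area, perimeter):
    """Dimensionless circularity of a closed boundary (1.0 = ideal circle)."""
    return 4 * math.pi * area / perimeter ** 2

r = 5.0  # a circle of any radius scores exactly 1
assert abs(circularity(math.pi * r * r, 2 * math.pi * r) - 1.0) < 1e-12

s = 4.0  # a square scores pi/4 ~ 0.785, strictly less circular
assert abs(circularity(s * s, 4 * s) - math.pi / 4) < 1e-12
```

In the full pipeline, area and perimeter would come from the traced boundary of the region-grown segment, and the entropy-derived rules would threshold this measure.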

  15. Project Report: Automatic Sequence Processor Software Analysis

    Science.gov (United States)

    Benjamin, Brandon

    2011-01-01

    The Mission Planning and Sequencing (MPS) element of Multi-Mission Ground System and Services (MGSS) provides space missions with multi-purpose software to plan spacecraft activities, sequence spacecraft commands, and then integrate these products and execute them on spacecraft. The Jet Propulsion Laboratory (JPL) is currently flying many missions. The processes for building, integrating, and testing the multi-mission uplink software need to be improved to meet the needs of the missions and the operations teams that command the spacecraft. The Multi-Mission Sequencing Team is responsible for collecting and processing the observations, experiments and engineering activities that are to be performed on a selected spacecraft. The collection of these activities is called a sequence, and ultimately a sequence becomes a sequence of spacecraft commands. The operations teams check the sequence to make sure that no constraints are violated. The workflow process involves sending a program start command, which activates the Automatic Sequence Processor (ASP). The ASP is currently a file-based system comprised of scripts written in Perl, C shell and AWK. Once this start process is complete, the system checks for errors and aborts if there are any; otherwise it converts the commands to binary and sends the resultant information to be radiated to the spacecraft.

  16. Profiling School Shooters: Automatic Text-Based Analysis

    Directory of Open Access Journals (Sweden)

    Yair eNeuman

    2015-06-01

    Full Text Available School shooters present a challenge to both forensic psychiatry and law enforcement agencies. The relatively small number of school shooters, their varied characteristics, and the lack of in-depth analysis of all of the shooters prior to the shooting add complexity to our understanding of this problem. In this short paper, we introduce a new methodology for automatically profiling school shooters. The methodology involves automatic analysis of texts and the production of several measures relevant for the identification of the shooters. Comparing texts written by six school shooters to 6056 texts written by a comparison group of male subjects, we found that the shooters' texts scored significantly higher on the Narcissistic Personality dimension as well as on the Humiliated and Revengeful dimensions. Using a ranking/prioritization procedure, similar to the one used for the automatic identification of sexual predators, we provide support for the validity and relevance of the proposed methodology.

  17. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-01-01

    This study used scientometric methods to conduct an automatic content analysis on the development trends of science education research from the published articles in the four journals of "International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education" from 1990 to 2007. The…

  18. Automatic analysis of trabecular bone structure from knee MRI

    DEFF Research Database (Denmark)

    Marques, Joselene; Granlund, Rabia; Lillholm, Martin;

    2012-01-01

    We investigated the feasibility of quantifying osteoarthritis (OA) by analysis of the trabecular bone structure in low-field knee MRI. Generic texture features were extracted from the images and subsequently selected by sequential floating forward selection (SFFS), following a fully automatic, un...

  19. Automatic zebrafish heartbeat detection and analysis for zebrafish embryos.

    Science.gov (United States)

    Pylatiuk, Christian; Sanchez, Daniela; Mikut, Ralf; Alshut, Rüdiger; Reischl, Markus; Hirth, Sofia; Rottbauer, Wolfgang; Just, Steffen

    2014-08-01

    A fully automatic detection and analysis method of heartbeats in videos of nonfixed and nonanesthetized zebrafish embryos is presented. This method reduces the manual workload and time needed for preparation and imaging of the zebrafish embryos, as well as for evaluating heartbeat parameters such as frequency, beat-to-beat intervals, and arrhythmicity. The method is validated by a comparison of the results from automatic and manual detection of the heart rates of wild-type zebrafish embryos 36-120 h postfertilization and of embryonic hearts with bradycardia and pauses in the cardiac contraction.

  20. Automatic analysis of double coronal mass ejections from coronagraph images

    Science.gov (United States)

    Jacobs, Matthew; Chang, Lin-Ching; Pulkkinen, Antti; Romano, Michelangelo

    2015-11-01

    Coronal mass ejections (CMEs) can have major impacts on man-made technology and humans, both in space and on Earth. These impacts have created a high interest in the study of CMEs in an effort to detect and track events and forecast the CME arrival time to provide time for proper mitigation. A robust automatic real-time CME processing pipeline is greatly desired to avoid laborious and subjective manual processing. Automatic methods have been proposed to segment CMEs from coronagraph images and estimate CME parameters such as their heliocentric location and velocity. However, existing methods suffered from several shortcomings such as the use of hard thresholding and an inability to handle two or more CMEs occurring within the same coronagraph image. Double-CME analysis is a necessity for forecasting the many CME events that occur within short time frames. Robust forecasts for all CME events are required to fully understand space weather impacts. This paper presents a new method to segment CME masses and pattern recognition approaches to differentiate two CMEs in a single coronagraph image. The proposed method is validated on a data set of 30 halo CMEs, with results showing comparable ability in transient arrival time prediction accuracy and the new ability to automatically predict the arrival time of a double-CME event. The proposed method is the first automatic method to successfully calculate CME parameters from double-CME events, making this automatic method applicable to a wider range of CME events.

  1. The Avenging Females: A Comparative Analysis of Kill Bill Vol.1-2, Death Proof and Sympathy for Lady Vengeance

    Directory of Open Access Journals (Sweden)

    Basak Göksel Demiray

    2012-04-01

    Full Text Available This paper provides a comparative analysis of Quentin Tarantino’s Kill Bill Vol.1-2 (2003, 2004), Death Proof (2007) and Park Chan Wook’s Sympathy for Lady Vengeance (Chinjeolhan Geumjassi, 2005). The primary objectives of this study are: (1) to reveal the gender-biases inherent to the fundamental discursive structures of the foregoing films; (2) to compare and contrast the films through an analysis of the ‘gaze(s)’ and possible ‘pleasures’, which are inherent in their narratives, in relation to Laura Mulvey’s and Carol Clover’s approaches; and (3) to distinguish Kill Bill Vol.1-2 from the foregoing two and the ‘avenging female’ clichés of the other horror/violence movies in the context of the replaced positionings of its protagonist and antagonist inherent in its distinct narrative style.

  2. Formalising responsibility modelling for automatic analysis

    OpenAIRE

    Simpson, Robbie; Storer, Tim

    2015-01-01

    Modelling the structure of social-technical systems as a basis for informing software system design is a difficult compromise. Formal methods struggle to capture the scale and complexity of the heterogeneous organisations that use technical systems. Conversely, informal approaches lack the rigour needed to inform the software design and construction process or enable automated analysis. We revisit the concept of responsibility modelling, which models social technical systems as a collec...

  3. Automatic quantitative morphological analysis of interacting galaxies

    CERN Document Server

    Shamir, Lior; Wallin, John

    2013-01-01

    The large number of galaxies imaged by digital sky surveys reinforces the need for computational methods for analyzing galaxy morphology. While the morphology of most galaxies can be associated with a stage on the Hubble sequence, morphology of galaxy mergers is far more complex due to the combination of two or more galaxies with different morphologies and the interaction between them. Here we propose a computational method based on unsupervised machine learning that can quantitatively analyze morphologies of galaxy mergers and associate galaxies by their morphology. The method works by first generating multiple synthetic galaxy models for each galaxy merger, and then extracting a large set of numerical image content descriptors for each galaxy model. These numbers are weighted using Fisher discriminant scores, and then the similarities between the galaxy mergers are deduced using a variation of Weighted Nearest Neighbor analysis such that the Fisher scores are used as weights. The similarities between the ga...
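
    The Fisher-score weighting step described here can be sketched generically: compute a per-feature Fisher discriminant score from two groups of descriptor vectors, then use those scores as weights in the nearest-neighbour distance. This is a two-class toy illustration, not the authors' descriptor set or code.

```python
# Sketch of Fisher-score-weighted nearest-neighbour similarity, as a
# generic two-class illustration; the descriptor vectors below are
# hypothetical, not the authors' image-content descriptors.

def fisher_scores(class_a, class_b):
    """Per-feature Fisher discriminant score: squared mean difference
    divided by the summed (population) variances."""
    def stats(rows):
        n = len(rows)
        cols = list(zip(*rows))
        means = [sum(c) / n for c in cols]
        variances = [sum((v - m) ** 2 for v in c) / n
                     for c, m in zip(cols, means)]
        return means, variances

    mean_a, var_a = stats(class_a)
    mean_b, var_b = stats(class_b)
    return [(ma - mb) ** 2 / (va + vb) if (va + vb) > 0 else 0.0
            for ma, mb, va, vb in zip(mean_a, mean_b, var_a, var_b)]

def weighted_distance(a, b, weights):
    """Euclidean distance with per-feature Fisher scores as weights."""
    return sum(w * (x - y) ** 2
               for w, x, y in zip(weights, a, b)) ** 0.5

class_a = [[0.0, 0.0], [2.0, 0.0]]    # toy descriptors, class A
class_b = [[10.0, 1.0], [12.0, 1.0]]  # toy descriptors, class B
weights = fisher_scores(class_a, class_b)
print(weights)  # [50.0, 0.0] -- only feature 0 separates the classes
```

    Features that discriminate well get large weights, so the nearest-neighbour comparison is dominated by informative descriptors and noisy ones are effectively suppressed.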

  4. Analysis of Phonetic Transcriptions for Danish Automatic Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    2013-01-01

    Automatic speech recognition (ASR) relies on three resources: audio, orthographic transcriptions and a pronunciation dictionary. The dictionary or lexicon maps orthographic words to sequences of phones or phonemes that represent the pronunciation of the corresponding word. The quality of a speech … The analysis indicates that transcribing e.g. stress or vowel duration has a negative impact on performance. The best performance is obtained with coarse phonetic annotation, improving performance by 1% word error rate and 3.8% sentence error rate.

  5. Facilitator control as automatic behavior: A verbal behavior analysis

    OpenAIRE

    Hall, Genae A.

    1993-01-01

    Several studies of facilitated communication have demonstrated that the facilitators were controlling and directing the typing, although they appeared to be unaware of doing so. Such results shift the focus of analysis to the facilitator's behavior and raise questions regarding the controlling variables for that behavior. This paper analyzes facilitator behavior as an instance of automatic verbal behavior, from the perspective of Skinner's (1957) book Verbal Behavior. Verbal behavior is autom...

  6. Development of an automatic identification algorithm for antibiogram analysis.

    Science.gov (United States)

    Costa, Luan F R; da Silva, Eduardo S; Noronha, Victor T; Vaz-Moreira, Ivone; Nunes, Olga C; Andrade, Marcelino M de

    2015-12-01

    Routinely, diagnostic and microbiology laboratories perform antibiogram analysis, which can present difficulties leading to misreadings and intra- and inter-reader deviations. An Automatic Identification Algorithm (AIA) is proposed as a solution to overcome some issues associated with the disc diffusion method, which is the main goal of this work. The AIA allows automatic scanning of inhibition zones obtained by antibiograms. More than 60 environmental isolates were tested in susceptibility tests performed for 12 different antibiotics, for a total of 756 readings. Plate images were acquired and classified as standard or oddity. The inhibition zones were measured using the AIA and results were compared with the reference method (human reading), using the weighted kappa index and statistical analysis to evaluate, respectively, inter-reader agreement and the correlation between AIA-based and human-based readings. Agreement was observed in 88% of cases, and 89% of the tests showed no difference, even in the presence of reading problems such as overlapping inhibition zones, imperfect microorganism seeding, non-homogeneity of the circumference, partial action of the antimicrobial, and formation of a second halo of inhibition. Furthermore, the AIA proved to overcome some of the limitations observed in other automatic methods. Therefore, the AIA may be a practical tool for automated reading of antibiograms in diagnostic and microbiology laboratories. PMID:26513468
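
    The weighted kappa index used here to quantify inter-reader agreement can be sketched in pure Python. This is the standard linearly weighted Cohen's kappa for ordinal categories, not necessarily the exact configuration used in the paper, and the rating data is hypothetical.

```python
# Linearly weighted Cohen's kappa for ordinal ratings -- a stdlib-only
# sketch of the inter-reader agreement statistic the abstract reports;
# the S/I/R rating sequences below are hypothetical.

def linear_weighted_kappa(rater1, rater2, categories):
    """1 - (weighted observed disagreement / weighted chance
    disagreement), with linear weights |i - j| / (k - 1)."""
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(rater1)
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater1, rater2):
        obs[idx[a]][idx[b]] += 1
    row = [sum(r) for r in obs]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    disagree = expected = 0.0
    for i in range(k):
        for j in range(k):
            w = abs(i - j) / (k - 1)
            disagree += w * obs[i][j]
            expected += w * row[i] * col[j] / n
    return 1.0 - disagree / expected

perfect = ["S", "I", "R", "S", "I", "R"]   # susceptibility calls
print(linear_weighted_kappa(perfect, perfect, ["S", "I", "R"]))  # 1.0
```

    Perfect agreement yields 1.0; disagreements between adjacent categories are penalised less than distant ones, which suits ordinal susceptibility readings.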

  7. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may allow, for instance, finding genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer, a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automation of the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) automatic annotation of DMET-SNP data and search in existing databases of SNPs (e.g. dbSNP); (iii) association of SNPs with pathways through search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different

  8. Corpus analysis and automatic detection of emotion-inducing keywords

    Science.gov (United States)

    Yuan, Bo; He, Xiangqing; Liu, Ying

    2013-12-01

    Emotion words play a vital role in many sentiment analysis tasks. Previous research uses sentiment dictionaries to detect the subjectivity or polarity of words. In this paper, we dive into Emotion-Inducing Keywords (EIK), the words in use that convey emotion. We first analyze an emotion corpus to explore the pragmatic aspects of EIK. Then we design an effective framework for automatically detecting EIK in sentences by utilizing linguistic features and context information. Our system dramatically outperforms traditional dictionary-based methods in Precision, Recall and F1-score.

  9. Entropy analysis of OCT signal for automatic tissue characterization

    Science.gov (United States)

    Wang, Yahui; Qiu, Yi; Zaki, Farzana; Xu, Yiqing; Hubbi, Basil; Belfield, Kevin D.; Liu, Xuan

    2016-03-01

    Optical coherence tomography (OCT) signals can provide microscopic characterization of biological tissue and assist clinical decision making in real time. However, raw OCT data is noisy and complicated. It is challenging to extract information that is directly related to the pathological status of tissue through visual inspection of the huge volume of OCT signals streaming from a high-speed OCT engine. Therefore, it is critical to discover concise, comprehensible information from massive OCT data through novel strategies for signal analysis. In this study, we perform Shannon entropy analysis on OCT signals for automatic tissue characterization, which can be applied in intraoperative tumor margin delineation for surgical excision of cancer. The principle of this technique is based on the fact that normal tissue is usually more structured, with a higher entropy value, compared to pathological tissue such as cancer tissue. We also develop high-speed software based on graphics processing units (GPUs) for real-time entropy analysis of OCT signals.
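
    The entropy principle described here can be sketched with a histogram-based Shannon entropy: a signal whose intensities spread over many levels scores higher than a homogeneous one. The bin count and the toy signals below are illustrative assumptions, not the paper's parameters.

```python
import math

# Histogram-based Shannon entropy of an intensity signal -- a sketch of
# the abstract's principle that structured signals score higher than
# homogeneous ones. Bin count and toy signals are illustrative choices.

def shannon_entropy(signal, bins=16):
    """Shannon entropy (bits) of the signal's intensity histogram."""
    lo, hi = min(signal), max(signal)
    width = (hi - lo) / bins or 1.0   # avoid /0 for constant signals
    hist = [0] * bins
    for v in signal:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    n = len(signal)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

structured = [i % 16 for i in range(160)]  # intensities spread evenly
homogeneous = [5] * 160                    # constant "tissue" signal
print(shannon_entropy(structured) > shannon_entropy(homogeneous))  # True
```

    Running this over sliding windows of the depth profile would give a per-region entropy map of the kind the abstract uses to separate normal from pathological tissue.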

  10. Spectral saliency via automatic adaptive amplitude spectrum analysis

    Science.gov (United States)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.

  11. Automatic visual tracking and social behaviour analysis with multiple mice.

    Directory of Open Access Journals (Sweden)

    Luca Giancardo

    Full Text Available Social interactions are made of complex behavioural actions that might be found in all mammalians, including humans and rodents. Recently, mouse models are increasingly being used in preclinical research to understand the biological basis of social-related pathologies or abnormalities. However, reliable and flexible automatic systems able to precisely quantify social behavioural interactions of multiple mice are still missing. Here, we present a system built on two components: a module able to accurately track the position of multiple interacting mice from videos, regardless of their fur colour or light settings, and a module that automatically characterises social and non-social behaviours. The behavioural analysis is obtained by deriving a new set of specialised spatio-temporal features from the tracker output. These features are further employed by a learning-by-example classifier, which predicts for each frame and for each mouse in the cage one of the behaviours learnt from the examples given by the experimenters. The system is validated on an extensive set of experimental trials involving multiple mice in an open arena. In a first evaluation we compare the classifier output with the independent evaluation of two human graders, obtaining comparable results. Then, we show the applicability of our technique to multiple-mice settings, using up to four interacting mice. The system is also compared with a solution recently proposed in the literature that, similarly to ours, addresses the problem with a learning-by-examples approach. Finally, we further validated our automatic system to differentiate between C57B/6J (a commonly used reference inbred strain) and BTBR T+tf/J (a mouse model for autism spectrum disorders). Overall, these data demonstrate the validity and effectiveness of this new machine learning system in the detection of social and non-social behaviours in multiple (>2) interacting mice, and its versatility to deal with different

  12. [Automatic analysis pipeline of next-generation sequencing data].

    Science.gov (United States)

    Wenke, Li; Fengyu, Li; Siyao, Zhang; Bin, Cai; Na, Zheng; Yu, Nie; Dao, Zhou; Qian, Zhao

    2014-06-01

    The development of next-generation sequencing has generated high demand for data processing and analysis. Although there is a lot of software for analyzing next-generation sequencing data, most of it is designed for one specific function (e.g., alignment, variant calling or annotation). Therefore, it is necessary to combine these tools for data analysis and to generate interpretable results for biologists. This study designed a pipeline to process Illumina sequencing data based on the Perl programming language and the SGE system. The pipeline takes original sequence data (fastq format) as input, calls the standard data processing software (e.g., BWA, Samtools, GATK, and Annovar), and finally outputs a list of annotated variants that researchers can further analyze. The pipeline simplifies manual operation and improves efficiency through automation and parallel computation. Users can easily run the pipeline by editing the configuration file or clicking the graphical interface. Our work will facilitate research projects using sequencing technology.

  13. Coloured Petri Nets: Basic Concepts, Analysis Methods and Practical Use. Vol. 2, Analysis Methods

    DEFF Research Database (Denmark)

    Jensen, Kurt

    This three-volume work presents a coherent description of the theoretical and practical aspects of coloured Petri nets (CP-nets). The second volume contains a detailed presentation of the analysis methods for CP-nets. They allow the modeller to investigate dynamic properties of CP-nets. The main ...

  14. Automatic Monitoring Electronic Tongue with MEAs for Environmental Analysis

    Institute of Scientific and Technical Information of China (English)

    Shaofang Zou; Hong Men; Yi Li; Yinping Wang; Ping Wang

    2006-01-01

    An automatic monitoring electronic tongue based on differential pulse stripping voltammetry (DPSV) was developed for heavy metal analysis. Simultaneous detection of trace Zn(Ⅱ), Cd(Ⅱ), Pb(Ⅱ), Cu(Ⅱ), Fe(Ⅲ) and Cr(Ⅲ) in water samples was performed with three electrochemical sensors. The sensor chip combined a silicon-based Hg-coated Au microelectrode array (MEA) as the working electrode on one side with an Ag/AgCl reference electrode and a Pt counter electrode on the other side. With a computer-controlled multipotentiostat, pumps and valves, the electronic tongue realized in-situ real-time detection of the six metals mentioned above at parts-per-billion levels without manual operation.

  15. Neural network for automatic analysis of motility data

    DEFF Research Database (Denmark)

    Jakobsen, Erik; Kruse-Andersen, S; Kolberg, Jens Godsk

    1994-01-01

    … events. Due to great variation in events, this method often fails to detect biologically relevant pressure variations. We have tried to develop a new concept for recognition of pressure events based on a neural network. Pressures were recorded for over 23 hours in 29 normal volunteers by means of a portable data recording system. A number of pressure events and non-events were selected from 9 recordings and used for training the network. The performance of the trained network was then verified on recordings from the remaining 20 volunteers. The accuracy and sensitivity of the two systems were comparable. However, the neural network recognized pressure peaks clearly generated by muscular activity that had escaped detection by the conventional program. In conclusion, we believe that neurocomputing has potential advantages for automatic analysis of gastrointestinal motility data.

  16. Automatic Analysis of Cellularity in Glioblastoma and Correlation with ADC Using Trajectory Analysis and Automatic Nuclei Counting

    Science.gov (United States)

    Burth, Sina; Kieslich, Pascal J.; Jungk, Christine; Sahm, Felix; Kickingereder, Philipp; Kiening, Karl; Unterberg, Andreas; Wick, Wolfgang; Schlemmer, Heinz-Peter; Bendszus, Martin; Radbruch, Alexander

    2016-01-01

    Objective: Several studies have analyzed a correlation between the apparent diffusion coefficient (ADC) derived from diffusion-weighted MRI and the tumor cellularity of corresponding histopathological specimens in brain tumors, with inconclusive findings. Here, we compared a large dataset of ADC and cellularity values of stereotactic biopsies of glioblastoma patients using a new postprocessing approach including trajectory analysis and automatic nuclei counting. Materials and Methods: Thirty-seven patients with newly diagnosed glioblastomas were enrolled in this study. ADC maps were acquired preoperatively at 3T and coregistered to the intraoperative MRI that contained the coordinates of the biopsy trajectory. 561 biopsy specimens were obtained; corresponding cellularity was calculated by semi-automatic nuclei counting and correlated to the respective preoperative ADC values along the stereotactic biopsy trajectory, which included areas of T1-contrast-enhancement and necrosis. Results: There was a weak to moderate inverse correlation between ADC and cellularity in glioblastomas that varied depending on the approach towards statistical analysis: for mean values per patient, Spearman’s ρ = -0.48 (p = 0.002); for all trajectory values in one joint analysis, Spearman’s ρ = -0.32 (p < 0.001). The inverse correlation was additionally verified by a linear mixed model. Conclusions: Our data confirms a previously reported inverse correlation between ADC and tumor cellularity. However, the correlation in the current article is weaker than the pooled correlation of comparable previous studies. Hence, besides cell density, other factors, such as necrosis and edema, might influence ADC values in glioblastomas. PMID:27467557
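
    Spearman's ρ, the statistic reported above, is the Pearson correlation of the ranks. A stdlib-only sketch with average ranks for ties, applied to hypothetical ADC/cellularity pairs (not data from the study):

```python
# Spearman's rho as the Pearson correlation of average ranks -- a
# stdlib-only sketch; the ADC/cellularity values are hypothetical,
# not data from the study.

def average_ranks(values):
    """1-based ranks; ties share the mean of their rank positions."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho of two paired samples."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

adc = [1.2, 1.0, 0.9, 0.8, 1.4]          # hypothetical ADC values
cellularity = [200, 300, 380, 450, 150]  # hypothetical nuclei counts
print(round(spearman(adc, cellularity), 2))  # -1.0
```

    A perfectly monotone decreasing relationship yields -1.0; the study's reported ρ of -0.32 to -0.48 corresponds to a much weaker inverse trend.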

  17. Automatic Beam Path Analysis of Laser Wakefield Particle Acceleration Data

    Energy Technology Data Exchange (ETDEWEB)

    Rubel, Oliver; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Wu, Kesheng; Prabhat; Weber, Gunther H.; Ushizima, Daniela M.; Messmer, Peter; Hagen, Hans; Hamann, Bernd; Bethel, E. Wes

    2009-10-19

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps and, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  18. AUTOMATIC LICENSE PLATE LOCALISATION AND IDENTIFICATION VIA SIGNATURE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Lorita Angeline

    2014-02-01

    Full Text Available A new algorithm for license plate localisation and identification is proposed on the basis of Signature analysis. Signature analysis has been used to locate license plate candidates, and its properties can be further utilised in supporting and affirming the license plate character recognition. This paper presents Signature analysis and an improved conventional Connected Component Analysis (CCA) to design an automatic license plate localisation and identification system. A procedure called the Euclidean Distance Transform is added to the conventional CCA in order to tackle the multiple bounding boxes that occurred. The developed algorithm, SAICCA, achieved a 92% success rate, with an 8% localisation failure rate due to restrictions such as insufficient light levels, poor clarity and degraded license plate perceptual information. The processing time for license plate localisation and recognition is a crucial criterion that needs to be considered. Therefore, this paper has utilised several approaches to decrease the processing time to an optimal value. The results obtained show that the proposed system is capable of being implemented in both ideal and non-ideal environments.
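
    The conventional Connected Component Analysis stage mentioned here can be sketched as a 4-connected labelling pass over a binary mask; the Euclidean Distance Transform refinement from the paper is omitted, and the mask below is a toy example, not plate data.

```python
from collections import deque

# 4-connected component labelling of a binary mask -- a sketch of the
# conventional CCA stage; the Euclidean-distance-transform refinement
# described in the paper is omitted, and the mask is a toy example.

def connected_components(image):
    """Return (label map, component count) for a binary image."""
    h, w = len(image), len(image[0])
    labels = [[0] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if image[y][x] and not labels[y][x]:
                count += 1                      # new component found
                labels[y][x] = count
                queue = deque([(y, x)])
                while queue:                    # breadth-first flood fill
                    cy, cx = queue.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and image[ny][nx] and not labels[ny][nx]):
                            labels[ny][nx] = count
                            queue.append((ny, nx))
    return labels, count

mask = [[1, 1, 0, 0, 1],
        [1, 0, 0, 0, 1],
        [0, 0, 0, 1, 1]]
labels, n = connected_components(mask)
print(n)  # 2 candidate regions
```

    Each labelled component yields a bounding box; the paper's distance-transform step then merges the spurious multiple boxes that plain CCA can produce over one plate.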

  19. Automatic beam path analysis of laser wakefield particle acceleration data

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver; Wu, Kesheng; Prabhat; Weber, Gunther H; Ushizima, Daniela M; Hamann, Bernd; Bethel, Wes [Computational Research Division, Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720 (United States); Geddes, Cameron G R; Cormier-Michel, Estelle [LOASIS program of Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720 (United States); Messmer, Peter [Tech-X Corporation, 5621 Arapahoe Avenue Suite A, Boulder, CO 80303 (United States); Hagen, Hans [International Research Training Group 'Visualization of Large and Unstructured Data Sets - Applications in Geospatial Planning, Modeling, and Engineering', Technische Universitaet Kaiserslautern, Erwin-Schroedinger-Strasse, D-67653 Kaiserslautern (Germany)], E-mail: oruebel@lbl.gov

    2009-01-01

    Numerical simulations of laser wakefield particle accelerators play a key role in the understanding of the complex acceleration process and in the design of expensive experimental facilities. As the size and complexity of simulation output grows, an increasingly acute challenge is the practical need for computational techniques that aid in scientific knowledge discovery. To that end, we present a set of data-understanding algorithms that work in concert in a pipeline fashion to automatically locate and analyze high-energy particle bunches undergoing acceleration in very large simulation datasets. These techniques work cooperatively by first identifying features of interest in individual timesteps, then integrating features across timesteps and, based on the derived information, performing analysis of temporally dynamic features. This combination of techniques supports accurate detection of particle beams enabling a deeper level of scientific understanding of physical phenomena than has been possible before. By combining efficient data analysis algorithms and state-of-the-art data management we enable high-performance analysis of extremely large particle datasets in 3D. We demonstrate the usefulness of our methods for a variety of 2D and 3D datasets and discuss the performance of our analysis pipeline.

  20. Automatic quantitative analysis of cardiac MR perfusion images

    Science.gov (United States)

    Breeuwer, Marcel M.; Spreeuwers, Luuk J.; Quist, Marcel J.

    2001-07-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the myocardium (the heart muscle) from MR images, using contrast-enhanced ECG-triggered MRI. We have developed an automatic quantitative analysis method, which works as follows. First, image registration is used to compensate for translation and rotation of the myocardium over time. Next, the boundaries of the myocardium are detected and for each position within the myocardium a time-intensity profile is constructed. The time interval during which the contrast agent passes for the first time through the left ventricle and the myocardium is detected and various parameters are measured from the time-intensity profiles in this interval. The measured parameters are visualized as color overlays on the original images. Analysis results are stored, so that they can later on be compared for different stress levels of the heart. The method is described in detail in this paper and preliminary validation results are presented.
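
    The per-position parameters measured from the time-intensity profiles can be sketched for a single profile. Peak enhancement, time-to-peak and maximum upslope are common perfusion parameters; the exact set in the paper may differ, and the profile values below are hypothetical.

```python
# Common perfusion parameters from a single time-intensity profile --
# a sketch; the paper's exact parameter set may differ, and the profile
# values below are hypothetical.

def perfusion_parameters(times, intensities):
    """Peak enhancement, time-to-peak and maximum upslope."""
    baseline = intensities[0]
    peak = max(intensities)
    time_to_peak = times[intensities.index(peak)]
    max_upslope = max(
        (intensities[i + 1] - intensities[i]) / (times[i + 1] - times[i])
        for i in range(len(times) - 1))
    return {"peak_enhancement": peak - baseline,
            "time_to_peak": time_to_peak,
            "max_upslope": max_upslope}

t = [0, 1, 2, 3, 4, 5]        # seconds after contrast injection
s = [10, 12, 30, 55, 50, 48]  # signal intensity at one myocardial position
print(perfusion_parameters(t, s))
# {'peak_enhancement': 45, 'time_to_peak': 3, 'max_upslope': 25.0}
```

    Computed per position, these values form the parameter maps that are shown as color overlays on the original images.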

  1. Fully automatic apparatus for proximate analysis and total sulfur analysis of coal and coke

    Energy Technology Data Exchange (ETDEWEB)

    Ishibashi, Y.; Fukumoto, K.; Maeda, K.; Ogawa, A.; Goto, K.; Ishii, T.

    1987-02-01

    For the purpose of saving labor and improving working conditions, a fully automatic apparatus for the proximate analysis and total sulfur analysis of coal and coke was developed. The objectives of this automation were to determine inherent moisture, ash, volatile matter and total sulfur. Quantitative determination of inherent moisture, ash and volatile matter was made gravimetrically. Total sulfur was measured by combustion followed by acid-base titration, using a conductivity glass electrode sensor. A robot was set at the center of the automatic apparatus, and the associated instruments were arranged around it in concentric circles. The computer carried out condition control of each instrument, sequence control based on operation-request and operation-complete signals, and progress control of the analysis. Very high correlation between the conventional method and this automatic method was confirmed. 4 refs., 9 figs., 3 tabs.

  2. volBrain: An Online MRI Brain Volumetry System.

    Science.gov (United States)

    Manjón, José V; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand the specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results. PMID:27512372

  3. volBrain: An Online MRI Brain Volumetry System

    Science.gov (United States)

    Manjón, José V.; Coupé, Pierrick

    2016-01-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand the specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  4. volBrain: an online MRI brain volumetry system

    Directory of Open Access Journals (Sweden)

    Jose V. Manjon

    2016-07-01

    The amount of medical image data produced in clinical and research settings is rapidly growing, resulting in vast amounts of data to analyze. Automatic and reliable quantitative analysis tools, including segmentation, make it possible to analyze brain development and to understand the specific patterns of many neurological diseases. This field has recently seen many advances, with successful techniques based on non-linear warping and label fusion. In this work we present a novel and fully automatic pipeline for volumetric brain analysis based on multi-atlas label fusion technology that is able to provide accurate volumetric information at different levels of detail in a short time. This method is available through the volBrain online web interface (http://volbrain.upv.es), which is publicly and freely accessible to the scientific community. Our new framework has been compared with current state-of-the-art methods, showing very competitive results.

  5. Automatic analysis of gamma spectra using a desk computer

    International Nuclear Information System (INIS)

    A code for the analysis of gamma spectra obtained with a Ge(Li) detector was developed for use with a desk computer (Hewlett-Packard Model 9810 A). The process is performed in a totally automatic way: the data are conveniently smoothed and the background is generated by a convolutive equation. A calibration of the equipment with well-known standard sources gives the data needed to fit, by least squares, a third-degree equation relating energy to peak position. Criteria are given for deciding whether a given group of values constitutes a peak and whether it is a double line. All peaks are fitted to a Gaussian curve and, if necessary, decomposed into their components. Data entry is by punched tape in ASCII code. An alphanumeric printer provides (a) the position of the peak and its energy, (b) its resolution if it is larger than expected, and (c) the area of the peak with its statistical error determined by the method of Wasson. As an option, the complete spectrum with the determined background can be plotted. (author)
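
    The third-degree least-squares energy calibration described above can be sketched as follows. The channel/energy pairs are invented (exactly 0.5 keV per channel) purely to illustrate the fit, not taken from the paper.

```python
import numpy as np

# Hypothetical calibration points: peak positions (channels) of
# well-known standard sources and their known gamma energies (keV).
channels = np.array([ 119.0,  605.8, 1323.4, 2346.4, 2665.0])
energies = np.array([  59.5,  302.9,  661.7, 1173.2, 1332.5])

# Least-squares fit of a third-degree polynomial:
# energy = a*ch**3 + b*ch**2 + c*ch + d
coeffs = np.polyfit(channels, energies, deg=3)

def channel_to_energy(ch):
    """Convert a peak position (channel) to energy (keV)."""
    return np.polyval(coeffs, ch)
```

Because the toy data are linear, the cubic terms come out near zero; with real detector data the higher-order terms absorb small nonlinearities of the ADC chain.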

  6. Automatic analysis of ciliary beat frequency using optical flow

    Science.gov (United States)

    Figl, Michael; Lechner, Manuel; Werther, Tobias; Horak, Fritz; Hummel, Johann; Birkfellner, Wolfgang

    2012-02-01

    Ciliary beat frequency (CBF) can be a useful parameter for the diagnosis of several diseases, e.g. primary ciliary dyskinesia (PCD). CBF computation is usually done by manual evaluation of high-speed video sequences, a tedious, observer-dependent and not very accurate procedure. We used OpenCV's pyramidal implementation of the Lucas-Kanade algorithm for optical flow computation and applied it to selected objects to follow their movements. The objects were chosen by their contrast, using the corner detector of Shi and Tomasi. Discriminating between background/noise and cilia by means of a frequency histogram allowed the CBF to be computed. Frequency analysis was done using the Fourier transform in MATLAB. The correct number of Fourier summands was found from the slope of an approximation curve. The method proved usable for distinguishing between healthy and diseased samples. However, difficulties remain in automatically identifying the cilia and in finding enough high-contrast cilia in the image. Furthermore, some of the higher-contrast cilia are lost (and sometimes found again) by the method; an easy way to identify the correct sub-path of a point's path has yet to be found for the cases where the slope method does not work.
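
    The frequency-estimation stage of this pipeline can be sketched as below. The tracking stage itself would use OpenCV (Shi-Tomasi corner detection plus pyramidal Lucas-Kanade flow); here a synthetic 15 Hz displacement signal stands in for a tracked point, so only the Fourier step is shown.

```python
import numpy as np

def dominant_frequency(signal, fps):
    """Return the dominant frequency (Hz) of a tracked point's
    displacement signal, as used to estimate ciliary beat frequency."""
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                 # remove DC component
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fps)
    return freqs[spectrum.argmax()]

# Synthetic displacement of one high-contrast point: 15 Hz beating,
# sampled by a 500 fps high-speed camera, plus noise.
fps = 500.0
t = np.arange(0, 2.0, 1.0 / fps)
rng = np.random.default_rng(0)
displacement = np.sin(2 * np.pi * 15.0 * t) + 0.2 * rng.standard_normal(t.size)
cbf = dominant_frequency(displacement, fps)
```

In a full implementation this would be applied to every tracked point, and the per-point frequencies accumulated into the histogram that separates cilia from background.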

  7. Automatic comic page image understanding based on edge segment analysis

    Science.gov (United States)

    Liu, Dong; Wang, Yongtao; Tang, Zhi; Li, Luyuan; Gao, Liangcai

    2013-12-01

    Comic page image understanding aims to analyse the layout of the comic page images by detecting the storyboards and identifying the reading order automatically. It is the key technique to produce the digital comic documents suitable for reading on mobile devices. In this paper, we propose a novel comic page image understanding method based on edge segment analysis. First, we propose an efficient edge point chaining method to extract Canny edge segments (i.e., contiguous chains of Canny edge points) from the input comic page image; second, we propose a top-down scheme to detect line segments within each obtained edge segment; third, we develop a novel method to detect the storyboards by selecting the border lines and further identify the reading order of these storyboards. The proposed method is performed on a data set consisting of 2000 comic page images from ten printed comic series. The experimental results demonstrate that the proposed method achieves satisfactory results on different comics and outperforms the existing methods.
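
    A much-simplified stand-in for storyboard detection is a projection-profile split on gutter rows. This is not the paper's edge-segment method, only an illustration of how panel bounds can be recovered from a binarized page image.

```python
import numpy as np

def split_panels_by_gutters(page, background=0):
    """Split a binarized comic page into horizontal panel strips by
    finding gutter rows (rows containing only background pixels).
    Simplified projection-profile baseline, not the edge-segment method."""
    is_gutter = (page == background).all(axis=1)
    panels, start = [], None
    for y, gutter in enumerate(is_gutter):
        if not gutter and start is None:
            start = y                       # panel strip begins
        elif gutter and start is not None:
            panels.append((start, y))       # panel strip ends
            start = None
    if start is not None:
        panels.append((start, page.shape[0]))
    return panels

# Toy page: two ink strips separated by a white gutter.
page = np.zeros((10, 8), dtype=int)
page[1:4, :] = 1
page[6:9, :] = 1
print(split_panels_by_gutters(page))  # → [(1, 4), (6, 9)]
```

The edge-segment approach in the paper is far more robust: it also handles non-rectangular storyboards and recovers a reading order, which a projection profile cannot.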

  8. Automatic Feature Interaction Analysis in PacoSuite

    Directory of Open Access Journals (Sweden)

    Wim Vanderperren

    2004-10-01

    In this paper, we build upon previous work that aims at recuperating aspect oriented ideas into component based software development. In that research, a composition adapter was proposed in order to capture crosscutting concerns in the PacoSuite component based methodology. A composition adapter is visually applied onto a given component composition and the changes it describes are automatically applied. Stacking multiple composition adapters onto the same component composition can however lead to unpredictable and undesired side-effects. In this paper, we propose a solution for this issue, widely known as the feature interaction problem. We present a classification of different interaction levels among composition adapters and the algorithms required to verify them. The proposed algorithms are however of exponential nature and depend on both the composition adapters and the component composition as a whole. In order to enhance the performance of our feature interaction analysis, we present a set of theorems that define the interaction levels solely in terms of the properties of the composition adapters themselves.

  9. Trends of Science Education Research: An Automatic Content Analysis

    Science.gov (United States)

    Chang, Yueh-Hsia; Chang, Chun-Yen; Tseng, Yuen-Hsien

    2010-08-01

    This study used scientometric methods to conduct an automatic content analysis of the development trends of science education research, based on the articles published from 1990 to 2007 in four journals: International Journal of Science Education, Journal of Research in Science Teaching, Research in Science Education, and Science Education. A multi-stage clustering technique was employed to investigate which topics, which development trends, and whose contributions have constituted science education as a research field. This study found that Conceptual Change & Concept Mapping was the most studied topic, although the number of publications on it declined slightly in the 2000s. Studies on the themes of Professional Development, Nature of Science and Socio-Scientific Issues, and Conceptual Change and Analogy were found to be gaining attention over the years. This study also found that, as embedded in the most cited references, the supporting disciplines and theories of science education research are constructivist learning, cognitive psychology, pedagogy, and philosophy of science.

  10. Identification and quantification of coffee volatile components by high resolution gas chromatography / mass spectrometry using an automatic headspace sampler

    Directory of Open Access Journals (Sweden)

    Leonardo César AMSTALDEN

    2001-01-01

    Employing an automatic headspace sampler, the headspaces of three commercial brands of ground roasted coffee were qualitatively and quantitatively analyzed for the volatile components responsible for aroma by gas chromatography / mass spectrometry. Since the methodology did not involve aroma isolation or concentration, the natural proportions of the volatiles were maintained, providing a more accurate picture of the flavor composition and simplifying sample preparation. The automatic sampler also gave good resolution of the chromatographic peaks without cryofocusing the samples at the head of the column during injection, reducing analysis time. Ninety-one compounds were identified; some known coffee volatiles, such as dimethyl sulphide, methional and furfuryl mercaptan, were not detected. The more concentrated volatiles could be quantified with the aid of two internal standards. The technique proved viable, both for the characterization and for the quantification of coffee volatiles.

  11. Automatic analysis of BEBC pictures using the flying spot digitizer

    CERN Document Server

    Bacilieri, P; Luvisetto, M L; Masetti, M; Matteuzzi, P; Simoni, L

    1977-01-01

    In future experiments at CERN using the SPS (Super Proton Synchrotron), pictures will be obtained from the big bubble chambers, BEBC and Gargamelle. Until now only a few thousand BEBC pictures have been taken, in experiments with the existing PS accelerator. BEBC pictures are much more difficult to analyse by means of automatic devices than those from the 2 m bubble chamber. It is therefore necessary either to refine the automatic measuring system using an HPD (a mechanical Flying Spot Digitizer) or to develop new systems such as Erasme, recently built at CERN. Through a refinement of the HPD device, the use of a home-made processor to extract track segments, and the development of a new program chain which allows one to follow spiralizing tracks, a new system has been developed that allows the automatic processing of BEBC pictures. (1 ref).

  12. Automatic analysis of BEBC pictures using the flying spot digitizer

    International Nuclear Information System (INIS)

    In future experiments at CERN using the SPS (Super Proton Synchrotron), pictures will be obtained from the big bubble chambers, BEBC and Gargamelle. Until now only a few thousand BEBC pictures have been taken, in experiments with the existing PS accelerator. BEBC pictures are much more difficult to analyse by means of automatic devices than those from the 2 m bubble chamber. It is therefore necessary either to refine the automatic measuring system using an HPD (a mechanical Flying Spot Digitizer) or to develop new systems such as Erasme, recently built at CERN. Through a refinement of the HPD device, the use of a home-made processor to extract track segments, the development of a new program chain which allows one to follow spiralizing tracks, and the use of an interactive refreshing display for assistance, a new system has been developed that allows the automatic processing of BEBC pictures. (Auth.)

  13. Automatic proximate analyzer of coal based on isothermal thermogravimetric analysis (TGA) with twin-furnace

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, Youhui; Jiang, Taiyi; Zou, Xianhong [National Laboratory of Coal Combustion, Huazhong University of Science and Technology, Wuhan, Hubei 430074 (China)

    2003-12-17

    A new type of rapid and automatic proximate analyzer for coal based on isothermal thermogravimetric analysis (TGA) with a twin furnace is introduced in this paper. This automatic proximate analyzer was developed by combining several novel technologies, such as an automatic weighing method for multiple samples in a high-temperature, dynamic gas-flow environment, a self-protection system for the electronic balance, and an optimized method and procedure for the coal analysis process. In addition, a comparison between the certified values of standard coals and the values measured with the new instrument is presented.

  14. Spectral Curve Fitting for Automatic Hyperspectral Data Analysis

    CERN Document Server

    Brown, Adrian J

    2014-01-01

    Automatic discovery and curve fitting of absorption bands in hyperspectral data can enable the analyst to identify materials present in a scene by comparison with library spectra. This procedure is common in laboratory spectra, but is challenging for sparse hyperspectral data. A procedure for robust discovery of overlapping bands in hyperspectral data is described in this paper. The method is capable of automatically discovering and fitting symmetric absorption bands, can separate overlapping absorption bands in a stable manner, and has relatively low sensitivity to noise. A comparison with techniques already available in the literature is presented using simulated spectra. An application is demonstrated utilizing the shortwave infrared (2.0-2.5 micron or 5000-4000 cm-1) region. A small hyperspectral scene is processed to demonstrate the ability of the method to detect small shifts in absorption wavelength caused by varying white mica chemistry in a natural setting.
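
    Fitting a single symmetric (Gaussian) absorption band, the building block of the overlapping-band procedure above, can be done in closed form by fitting a parabola to the log of the band depth. The 2.20-micron band parameters below are invented, not taken from the paper.

```python
import numpy as np

def fit_gaussian_band(wavelength, depth):
    """Fit depth = A * exp(-(x - mu)**2 / (2 * sigma**2)) by taking logs
    and solving a quadratic least-squares problem. Assumes a single,
    low-noise band with positive depth (a simplification of the paper)."""
    x, y = np.asarray(wavelength, float), np.asarray(depth, float)
    mask = y > 1e-6 * y.max()                 # avoid log(0) in the wings
    c2, c1, c0 = np.polyfit(x[mask], np.log(y[mask]), deg=2)
    sigma = np.sqrt(-1.0 / (2.0 * c2))        # width from quadratic term
    mu = c1 * sigma**2                        # center from linear term
    amp = np.exp(c0 + mu**2 / (2.0 * sigma**2))
    return amp, mu, sigma

# Synthetic 2.20-micron absorption band (e.g. a white-mica Al-OH feature).
x = np.linspace(2.0, 2.5, 200)
band = 0.3 * np.exp(-(x - 2.20) ** 2 / (2 * 0.02 ** 2))
amp, mu, sigma = fit_gaussian_band(x, band)
```

Detecting the small mineralogy-driven shifts mentioned in the abstract amounts to comparing the fitted `mu` across pixels; separating overlapping bands, as the paper does, requires iterating a fit like this over multiple components.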

  15. Automatic analysis of ventilation and perfusion pulmonary scintigrams

    International Nuclear Information System (INIS)

    A fully automatic program is used to analyse pulmonary ventilation and perfusion scintigrams. The ventilation study is performed using a standard washin-washout 133Xe method; multiple-view late xenon washout images are also recorded. The perfusion study is performed using 99mTc serum albumin. The FORTRAN program recognizes the different steps of the test, whatever their durations. It performs background subtraction, draws pulmonary regions of interest (ROIs) and calculates ventilation and perfusion parameters for each ROI and each lung. It also processes the multiple-view late xenon washout images in such a way that they give not only topographic information about hypoventilated regions, but also semi-quantitative information about the strength of xenon retention. During the processing, the operator has only to check two intermediate results (e.g. the automatically determined pulmonary ROIs). All the numerical and processed iconographic results are obtained within 10 minutes after the end of the test. This program has already been used to analyse 1,000 pulmonary studies, during which correction of intermediate results was very rarely necessary. This efficient and reliable automatic program is very useful in the daily practice of a nuclear medicine department.

  16. Balance of the uranium market. Contribution to an economic analysis. Vol. 2

    International Nuclear Information System (INIS)

    The second volume of this thesis on the economic analysis of the uranium market contains all the technical descriptions concerning reactors and the fuel cycle, as well as the detailed results of the two models of uranium supply and demand.

  17. Sampling and analysis in the Arctic marine benthic environment. Vol. 2. Guide to practice

    Energy Technology Data Exchange (ETDEWEB)

    1985-03-01

    There is an increasing requirement for environmental information on Arctic marine sediments because of natural resource development, particularly offshore hydrocarbon exploration and associated proposed transport mechanisms. At the same time, it is recognized that methods that have been used for collection, analysis and data reporting need improvement if the data are to have long term value. This is the second part of a study to produce a guide for managers and planners involved in the collection and analysis of environmental data from marine sediments in Arctic regions. The first phase of this study (Volume 1) reviewed existing methods and was intended to serve as a background document to the present report. The format is designed to outline in a step-by-step manner the factors to be considered and offers suggestions as to possible methods to achieve various results. This guide includes sampling strategy and means of quality control, sampling and subsampling methods for chemical analysis, sampling methods for benthos, means of sample preservation and storage, sampling logistics (transport, positioning and sampling from ice), methods of sample analysis for metals, organics, grain size, Pb-210 dating, methods of laboratory quality assurance and methods for the analysis of benthic data. 319 refs., 14 figs., 44 tabs.

  18. Economic analysis of automatic flood irrigation for dairy farms in northern Victoria

    OpenAIRE

    Armstrong, Dan P.; Ho, Christie K.M.

    2011-01-01

    Interest in automatic flood irrigation is strong, given the labour and lifestyle benefits it can provide. An economic analysis of three automated flood irrigation systems for a dairy farm in northern Victoria indicated that automatic irrigation can be a profitable labour-saving investment in many cases. However, profitability was sensitive to the amount and value of the labour saved. Pneumatic and timer systems were good investments regardless of the area they were installed to service. The ...

  19. EXPLODET Project: Methods of Automatic Data Processing and Analysis for the Detection of Hidden Explosives

    Science.gov (United States)

    Lecca, Paola

    2003-12-01

    The research of the INFN Gruppo Collegato di Trento within the EXPLODET project for humanitarian demining is devoted to the development of a software procedure for the automation of data analysis and decision making about the presence of hidden explosives. Innovative algorithms for likely-background calculation, a system based on neural networks for energy calibration, and simple statistical methods for the qualitative consistency check of the signals are the main parts of the software performing the automatic data processing.

  20. Automatic behaviour analysis system for honeybees using computer vision

    DEFF Research Database (Denmark)

    Tu, Gang Jun; Hansen, Mikkel Kragh; Kryger, Per;

    2016-01-01

    We present a fully automatic online video system, which is able to detect the behaviour of honeybees at the beehive entrance. Our monitoring system focuses on observing the honeybees as naturally as possible (i.e. without disturbing the honeybees). It is based on the Raspberry Pi that is a low...... demonstrate that this system can be used as a tool to detect the behaviour of honeybees and assess their state in the beehive entrance. In addition, the computation-time results show that the Raspberry Pi is a viable solution for such a real-time video processing system....

  1. Structuring Lecture Videos by Automatic Projection Screen Localization and Analysis.

    Science.gov (United States)

    Li, Kai; Wang, Jue; Wang, Haoqian; Dai, Qionghai

    2015-06-01

    We present a fully automatic system for extracting the semantic structure of a typical academic presentation video, which captures the whole presentation stage with abundant camera motions such as panning, tilting, and zooming. Our system automatically detects and tracks both the projection screen and the presenter whenever they are visible in the video. By analyzing the image content of the tracked screen region, our system is able to detect slide progressions and extract a high-quality, non-occluded, geometrically-compensated image for each slide, resulting in a list of representative images that reconstruct the main presentation structure. Afterwards, our system recognizes text content and extracts keywords from the slides, which can be used for keyword-based video retrieval and browsing. Experimental results show that our system is able to generate more stable and accurate screen localization results than commonly-used object tracking methods. Our system also extracts more accurate presentation structures than general video summarization methods, for this specific type of video. PMID:26357345
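
    The keyword-extraction step mentioned above can be approximated with a simple term-frequency heuristic over the recognized slide text. This is a generic sketch; the stopword list, scoring, and example text are ours, not the paper's pipeline.

```python
import re
from collections import Counter

# Minimal illustrative stopword list (a real system would use a larger one).
STOPWORDS = {"the", "a", "an", "of", "and", "to", "in", "for", "on", "is"}

def slide_keywords(slide_text, k=3):
    """Return the k most frequent non-stopword terms in a slide's
    recognized text -- a crude stand-in for keyword extraction."""
    words = re.findall(r"[a-z]+", slide_text.lower())
    counts = Counter(w for w in words if w not in STOPWORDS and len(w) > 2)
    return [w for w, _ in counts.most_common(k)]

text = ("Optical flow for screen tracking. Optical flow estimates motion "
        "between frames; screen localization uses the tracked corners.")
print(slide_keywords(text))  # → ['optical', 'flow', 'screen']
```

In the described system, keywords like these would be indexed per slide to support keyword-based retrieval and browsing of the lecture video.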

  2. Balance of the uranium market. Contribution to an economic analysis. Vol. 1

    International Nuclear Information System (INIS)

    The evolution of the world economic and energy situation over the last 40 years is first described, with emphasis on electricity production. The place of nuclear energy in the energy market is analyzed, taking socio-political factors into account. In the second part, the costs of electricity production from different sources (coal, nuclear power, fuel oil) are compared, and two models of world uranium supply and demand are developed. In the third part, the balance of the uranium market is examined. The analysis comprises three steps: determination of total uranium demand from 1983 to 2000; determination of total uranium supply by production-cost bracket; and comparison of supply and demand to project future evolution under different hypotheses.

  3. Finite element fatigue analysis of rectangular clutch spring of automatic slack adjuster

    Science.gov (United States)

    Xu, Chen-jie; Luo, Zai; Hu, Xiao-feng; Jiang, Wen-song

    2015-02-01

    The failure of the rectangular clutch spring of an automatic slack adjuster directly affects the operation of the automatic slack adjuster. We establish a structural mechanics model of the automatic slack adjuster's rectangular clutch spring based on its working principle and mechanical structure. We then import this model into the ANSYS Workbench FEA system to predict the fatigue life of the rectangular clutch spring. The FEA results show that the fatigue life of the rectangular clutch spring is 2.0403×10^5 cycles under braking loads. In addition, fatigue tests of 20 automatic slack adjusters were carried out on a fatigue test bench to verify the conclusions of the structural mechanics model. The experimental results show that the mean fatigue life of the rectangular clutch spring is 1.9101×10^5 cycles, which agrees with the finite element analysis performed with the ANSYS Workbench FEA system.

  4. Automatic analysis of macerals and reflectance; Analisis Automatico de Macerales y Reflectancia

    Energy Technology Data Exchange (ETDEWEB)

    Catalina, J.C.; Alarcon, D.; Gonzalez Prado, J.

    1998-12-01

    A new system has been developed to automatically perform maceral and reflectance analyses of single-seam bituminous coals, improving the interlaboratory accuracy of these types of analyses. The system follows the same steps as the manual method, requiring a human operator for the preparation of coal samples and system startup; sample scanning, microscope focusing and field-centre analysis are then fully automatic. The main and most innovative idea of this approach is to coordinate an expert system with an image processing system, using both reflectance and morphological information. In this way, the system tries to reproduce the analysis procedure followed by a human expert in petrography. (Author)

  5. Development of a System for Automatic Facial Expression Analysis

    Science.gov (United States)

    Diago, Luis A.; Kitaoka, Tetsuko; Hagiwara, Ichiro

    Automatic recognition of facial expressions can be an important component of natural human-machine interactions. While a large number of samples is desirable for accurately estimating a person's feelings (e.g. liking) about a machine interface, in real-world situations only a small number of samples can be obtained, because of the high cost of collecting emotions from observed persons. This paper proposes a system that solves this problem while conforming to individual differences. A new method is developed for facial expression classification based on the combination of Holographic Neural Networks (HNN) and Type-2 Fuzzy Logic. For the recognition of emotions induced by facial expressions, the proposed method achieved better generalization performance than former HNN and Support Vector Machine (SVM) classifiers, while using less learning time than the SVM classifiers.

  6. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    The SAIS (Safety Analysis and Information System) programme system is based on an integrated database, which consists of a plant-data part and a PSA-related data part. Using SAIS, analyses can be performed with special tools connected directly to the database. Two main editors, RISA+ and DEDIT, are used for database management. Access to the database is via different types of pages displayed on a computer screen; the pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees and event trees. All input information, models and results needed for updated PSA results (Living PSA) can be stored in SAIS. The programme system contains the editor KVIEW, which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the database is called up by a standardized user-guide programme, called the Page Editor. (Reference NPP: Brunsbuettel.) (orig./HP)

  7. ANALYSIS OF EXISTING AND PROSPECTIVE TECHNICAL CONTROL SYSTEMS OF NUMERIC CODES AUTOMATIC BLOCKING

    Directory of Open Access Journals (Sweden)

    A. M. Beznarytnyy

    2013-09-01

    Purpose. To identify the characteristic features of engineering control systems for numeric-code automatic blocking, to identify their advantages and disadvantages, to analyze the possibility of their use for diagnosing the status of automatic blocking devices, and to set targets for the development of new diagnostic systems. Methodology. To achieve these targets, the theoretical-analytical method and the method of functional analysis have been used. Findings. The analysis of existing and future facilities for the remote control and diagnostics of automatic blocking devices showed that the existing diagnostic systems are not sufficiently informative, being designed primarily to monitor discrete parameters, which in turn does not allow a decision-support subsystem to be built on them. For the development of new technical diagnostic systems, it was proposed to use the principle of centralized distributed processing of diagnostic data and to include a decision-support subsystem in the diagnostic system; this will reduce the amount of maintenance work on the blocking devices and reduce the recovery time after a failure occurs. Originality. The currently existing engineering control facilities for automatic blocking cannot provide a full assessment of the state of the signalling and blocking devices. Criteria were proposed for the development of new technical diagnostic systems with increased amounts of diagnostic information and its automatic analysis. Practical value. The results of this analysis can be used in practice to select technical controls for automatic blocking devices, as well as for the further development of automatic blocking diagnostic systems, allowing a gradual transition from a planned preventive maintenance model to maintenance based on the actual state of the monitored devices.

  8. Vibration Theory, Vol. 1B

    DEFF Research Database (Denmark)

    Asmussen, J. C.; Nielsen, Søren R. K.

    The present collection of MATLAB exercises has been published as a supplement to the textbook, Svingningsteori, Bind 1, and the collection of exercises in Vibration Theory, Vol. 1A, Solved Problems. Throughout the exercises, references are made to these books. The purpose of the MATLAB exercises is to give a better understanding of the physical problems in linear vibration theory and to suppress the mathematical analysis used to solve the problems. For this purpose the MATLAB environment is excellent.

  9. Statistical language analysis for automatic exfiltration event detection.

    Energy Technology Data Exchange (ETDEWEB)

    Robinson, David Gerald

    2010-04-01

    This paper discusses the recent development of a statistical approach for the automatic identification of anomalous network activity that is characteristic of exfiltration events. This approach is based on the language processing method referred to as latent Dirichlet allocation (LDA). Cyber security experts currently depend heavily on a rule-based framework for initial detection of suspect network events. The application of the rule set typically results in an extensive list of suspect network events that are then further explored manually for suspicious activity. The ability to identify anomalous network events is heavily dependent on the experience of the security personnel wading through the network log. Limitations of this approach are clear: rule-based systems only apply to exfiltration behavior that has previously been observed, and experienced cyber security personnel are rare commodities. Since the new methodology is not a discrete rule-based approach, it is more difficult for an insider to disguise the exfiltration events. A further benefit is that the methodology provides a risk-based approach that can be implemented in a continuous, dynamic or evolutionary fashion. This permits suspect network activity to be identified early, with a quantifiable risk associated with decision making when responding to suspicious activity.
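The abstract names LDA as the underlying topic model. A minimal sketch of how such scoring might look, using scikit-learn's `LatentDirichletAllocation`; the toy log lines, vectoriser settings, and the "lowest likelihood is most suspect" rule are all illustrative assumptions, not the paper's implementation:

```python
# Hypothetical sketch: scoring network-log lines by LDA topic likelihood.
# Corpus, component count and the flagging rule are invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

logs = [
    "user login success host a",
    "user login success host b",
    "user logout host a",
    "bulk file transfer external host midnight",  # atypical event
]

X = CountVectorizer().fit_transform(logs)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# Per-document approximate log-likelihood under the fitted topic model;
# unusually low scores are candidates for exfiltration-like anomalies.
scores = [lda.score(X[i]) for i in range(X.shape[0])]
suspects = [logs[i] for i, s in enumerate(scores) if s == min(scores)]
```

In a risk-based deployment the scores would feed a threshold or ranking rather than a hard rule, which matches the continuous, evolutionary use the abstract describes.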

  10. Automatic chemical analysis of traces of boron in steel

    International Nuclear Information System (INIS)

    The analyzer is composed of a sample changer, reagent addition devices, a distillation vessel, a color reaction vessel, a spectrophotometer, a controller, etc. The automatic procedure is performed according to the predetermined distillation and color reaction programs after dissolving 0.5 g of steel sample in aqua regia and fuming with sulfuric acid-phosphoric acid. The sample solution on the sample changer is transferred into the distillation vessel, where boron is distilled with methyl alcohol by heating and aeration. The distillate is collected in the distillate vessel, and a 1/2 aliquot is transferred into the color reaction vessel with small amounts of water. After the addition of glacial acetic acid and propionic anhydride, the distillate is circulated through the circulating pipe which is composed of an air blowing tube, a bubble remover, a flow cell and a drain valve. Oxalyl chloride (to eliminate water), sulfuric acid, the curcumin reagent (to form the boron complex) and an acetate buffer are added, and the absorbance of the solution is measured at 545 nm. The analytical results of steel samples were in good agreement with those obtained by the conventional method and with certified values. (auth.)

  11. Full automatic clean-up robot for dioxin/PCB analysis

    Energy Technology Data Exchange (ETDEWEB)

    Matsumura, T.; Masuzaki, Y.; Takahashi, A.; Koizumi, A. [METOCEAN Environment Inc., Shizuoka (Japan). Environmental Risk Research Center, Inst. of General Science for Environment; Okuyama, H.; Kawada, Y.; Higashiguchi, T. [Moritex Corporation, Yokohama (Japan)

    2004-09-15

    Dioxin analysis requires several clean-up steps combining several types of column chromatography (e.g. silica gel and carbon column chromatography) with sulfuric acid treatment. A fully automatic clean-up robot for dioxin and PCB analysis was developed.

  12. Alleviating Search Uncertainty through Concept Associations: Automatic Indexing, Co-Occurrence Analysis, and Parallel Computing.

    Science.gov (United States)

    Chen, Hsinchun; Martinez, Joanne; Kirchhoff, Amy; Ng, Tobun D.; Schatz, Bruce R.

    1998-01-01

    Grounded on object filtering, automatic indexing, and co-occurrence analysis, an experiment was performed using a parallel supercomputer to analyze over 400,000 abstracts in an INSPEC computer engineering collection. A user evaluation revealed that system-generated thesauri were better than the human-generated INSPEC subject thesaurus in concept…

  13. ATC Operations Analysis via Automatic Recognition of Clearances Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools and data sources for analysis of Air Traffic Control (ATC) operations. The...

  14. Porosity determination on pyrocarbon using automatic quantitative image analysis

    International Nuclear Information System (INIS)

    Methods of porosity determination are reviewed and applied to the measurement of the porosity of pyrocarbon. Specifically, the mathematical basis of stereology and the procedures involved in quantitative image analysis are detailed

  15. ATC Operations Analysis via Automatic Recognition of Clearances Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Recent advances in airport surface surveillance have motivated the creation of new tools for analysis of Air Traffic Control (ATC) operations, such as the Surface...

  16. Automatic target recognition in SAR images using multilinear analysis

    OpenAIRE

    Porgès, Tristan; Favier, Gérard

    2011-01-01

    Multilinear analysis provides a powerful mathematical framework for analyzing synthetic aperture radar (SAR) images resulting from the interaction of multiple factors, such as sky luminosity and viewing angles, while preserving their original shape. In this paper, we propose a multilinear principal component analysis (MPCA) algorithm for target recognition in SAR images. First, we form a high-order tensor with the training image set and we apply the higher-order singular...
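The first step described, stacking training images into a tensor and taking higher-order singular vectors, can be sketched with NumPy. This is an illustrative HOSVD-style projection on synthetic data; the paper's full MPCA additionally iterates the per-mode projections, which is omitted here:

```python
import numpy as np

# Toy stack of 8 synthetic 16x16 "images" forming a 3rd-order tensor.
rng = np.random.default_rng(0)
tensor = rng.standard_normal((8, 16, 16))

def mode_unfold(t, mode):
    """Unfold a 3rd-order tensor along the given mode into a matrix."""
    return np.moveaxis(t, mode, 0).reshape(t.shape[mode], -1)

# Leading left singular vectors of each image-mode unfolding give the
# per-mode projection bases (here keeping 4 components per mode).
U1, _, _ = np.linalg.svd(mode_unfold(tensor, 1), full_matrices=False)
U2, _, _ = np.linalg.svd(mode_unfold(tensor, 2), full_matrices=False)
P1, P2 = U1[:, :4], U2[:, :4]

# Project every image onto the reduced multilinear basis: 16x16 -> 4x4,
# preserving the 2-D structure instead of vectorising the images.
features = np.einsum('nij,ik,jl->nkl', tensor, P1, P2)
```

The point of the multilinear route, as the abstract notes, is that the 4x4 feature maps keep the images' original two-mode shape, unlike classical PCA on flattened vectors.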

  17. CAD system for automatic analysis of CT perfusion maps

    Science.gov (United States)

    Hachaj, T.; Ogiela, M. R.

    2011-03-01

    In this article, the authors present novel algorithms developed for a computer-assisted diagnosis (CAD) system for the analysis of dynamic brain perfusion computed tomography (CT) maps: cerebral blood flow (CBF) and cerebral blood volume (CBV). These methods perform both quantitative analysis [detection, measurement, and description, with a brain anatomy atlas (AA), of potential asymmetries/lesions] and qualitative analysis (semantic interpretation of visualized symptoms). The semantic interpretation (decision about the type of lesion: ischemic/hemorrhagic, and whether the brain tissue is at risk of infarction or not) of visualized symptoms is done by so-called cognitive inference processes, allowing for reasoning on the character of pathological regions based on specialist image knowledge. The whole system is implemented on the .NET platform (C# programming language) and can be used on any standard PC with the .NET framework installed.

  18. Automatic quantitative analysis of cardiac MR perfusion images

    NARCIS (Netherlands)

    Breeuwer, Marcel; Spreeuwers, Luuk; Quist, Marcel

    2001-01-01

    Magnetic Resonance Imaging (MRI) is a powerful technique for imaging cardiovascular diseases. The introduction of cardiovascular MRI into clinical practice is however hampered by the lack of efficient and accurate image analysis methods. This paper focuses on the evaluation of blood perfusion in the

  19. Automatic Binding Time Analysis for a Typed Lambda-Calculus

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    1988-01-01

    A binding time analysis imposes a distinction between the computations to be performed early (e.g. at compile-time) and those to be performed late (e.g. at run-time). For the lambda-calculus this distinction is formalized by a two-level lambda-calculus. The authors present an algorithm for static analysis of the binding times of a typed lambda-calculus with products, sums, lists and general recursive types. Given partial information about the binding times of some of the subexpressions it will complete that information such that (i) early bindings may be turned into late bindings but not vice versa, (ii) the resulting two-level lambda-expression reflects our intuition about binding times, e.g. that early bindings are performed before late bindings, and (iii) as few changes as possible have been made compared with the initial binding information. The results can be applied in the implementation...

  20. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    OpenAIRE

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by the manual data acquisition and analysis processes, which are labor intensive and time consuming. Building on our original design of the biochemical reactions, we propose here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patt...

  1. Automatic ECG Analysis for Preliminary and Detailed Diagnostics Based on Scale-space Representation

    OpenAIRE

    Belous, Natalie; Kobzar, Gleb

    2008-01-01

    A novel approach to automatic ECG analysis based on scale-space signal representation is proposed. The approach uses a curvature scale-space representation to locate the main ECG waveform limits and peaks, and may be used to correct the results of other ECG analysis techniques or independently. Moreover, dynamic matching of ECG CSS representations provides robust preliminary recognition of ECG abnormalities, which has been proven by experimental results.

  2. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    OpenAIRE

    Rhodri Cusack; Alejandro Vicente-Grabovetsky; Daniel J Mitchell; Peelle, Jonathan E.

    2015-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by m...

  3. Automatic Fatigue Detection of Drivers through Yawning Analysis

    Science.gov (United States)

    Azim, Tayyaba; Jaffar, M. Arfan; Ramzan, M.; Mirza, Anwar M.

    This paper presents a non-intrusive fatigue detection system based on the video analysis of drivers. The focus of the paper is on how to detect yawning, which is an important cue for determining a driver's fatigue. Initially, the face is located through the Viola-Jones face detection method in a video frame. Then, a mouth window is extracted from the face region, in which lips are searched through spatial fuzzy c-means (s-FCM) clustering. The degree of mouth openness is extracted on the basis of mouth features, to determine the driver's yawning state. If the yawning state of the driver persists for several consecutive frames, the system concludes that the driver is non-vigilant due to fatigue and is thus warned through an alarm. The system reinitializes when occlusion or misdetection occurs. Experiments were carried out using real data, recorded in day and night lighting conditions, and with users belonging to different races and genders.
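The final decision rule, warn only when the yawning state persists over consecutive frames, is simple enough to sketch directly. The per-frame openness values, the 0.6 threshold, and the five-frame window below are all invented for illustration:

```python
# Sketch of the consecutive-frame persistence rule described above.
# Threshold and window length are illustrative assumptions.
OPENNESS_THRESHOLD = 0.6   # degree of mouth openness counted as yawning
PERSIST_FRAMES = 5         # consecutive yawning frames required to warn

def detect_fatigue(openness_per_frame):
    run = 0
    for value in openness_per_frame:
        run = run + 1 if value > OPENNESS_THRESHOLD else 0
        if run >= PERSIST_FRAMES:
            return True   # driver judged non-vigilant: sound the alarm
    return False

frames = [0.2, 0.7, 0.8, 0.3, 0.7, 0.7, 0.8, 0.9, 0.7, 0.1]
print(detect_fatigue(frames))  # five consecutive yawning frames -> True
```

In the real system the per-frame value would come from the s-FCM lip segmentation, and an occlusion or misdetection would reset the counter, mirroring the reinitialisation behaviour described.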

  4. Automatic detection and analysis of nuclear plant malfunctions

    International Nuclear Information System (INIS)

    In this paper a system is proposed which dynamically performs the detection and analysis of malfunctions in a nuclear plant. The proposed method was developed and implemented on a reactor simulator, instead of on a real reactor, thus allowing a wide range of tests. For all variables under control, a simulation module was identified and implemented on the reactor's on-line computer. In the malfunction identification phase all modules run separately, processing plant input variables and producing their output variables in real time; continuous comparison of the computed variables with the plant variables allows malfunction detection. At this moment the second phase can occur: when a malfunction is detected, all modules are connected, except the module simulating the faulty variable, and a fast simulation is carried out to analyse the consequences. (author)

  5. Automatic movie skimming with story units via general tempo analysis

    Science.gov (United States)

    Lee, Shih-Hung; Yeh, Chia H.; Kuo, C.-C. J.

    2003-12-01

    A skimming system for movie content exploration is proposed using story units extracted via general tempo analysis of audio and visual data. Quite a few schemes have been proposed to segment video data into shots using low-level features, yet the grouping of shots into meaningful units, called story units here, is important and challenging. In this work, we detect similar shots using key frames and include these similar shots as a node in the scene transition graph. Then, an importance measure is calculated based on the total length of each node. Finally, we select sinks and shots according to this measure. Based on these semantic shots, meaningful skims can be successfully generated. Simulation results will be presented to show that the proposed video skimming scheme can preserve the essential and significant content of the original video data.

  6. Gaussian curvature analysis allows for automatic block placement in multi-block hexahedral meshing.

    Science.gov (United States)

    Ramme, Austin J; Shivanna, Kiran H; Magnotta, Vincent A; Grosland, Nicole M

    2011-10-01

    Musculoskeletal finite element analysis (FEA) has been essential to research in orthopaedic biomechanics. The generation of a volumetric mesh is often the most challenging step in a FEA. Hexahedral meshing tools that are based on a multi-block approach rely on the manual placement of building blocks for their mesh generation scheme. We hypothesise that Gaussian curvature analysis could be used to automatically develop a building block structure for multi-block hexahedral mesh generation. The Automated Building Block Algorithm incorporates principles from differential geometry, combinatorics, statistical analysis and computer science to automatically generate a building block structure to represent a given surface without prior information. We have applied this algorithm to 29 bones of varying geometries and successfully generated a usable mesh in all cases. This work represents a significant advancement in automating the definition of building blocks.
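Gaussian curvature on a triangulated surface is commonly estimated by the angle-defect formula (2π minus the sum of incident triangle angles at a vertex). The sketch below uses a toy square pyramid; the paper's actual pipeline applies such curvature measures to bone surface meshes, which is not reproduced here:

```python
import numpy as np

# Discrete Gaussian curvature via angle defect, a standard estimator;
# the apex/ring geometry is a toy example, not bone surface data.
def angle_defect(apex, ring):
    """Angle defect at `apex` for the fan of triangles over `ring`."""
    total = 0.0
    for i in range(len(ring)):
        a = ring[i] - apex
        b = ring[(i + 1) % len(ring)] - apex
        cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        total += np.arccos(np.clip(cos, -1.0, 1.0))
    return 2 * np.pi - total

apex = np.array([0.0, 0.0, 1.0])
ring = [np.array(p, dtype=float)
        for p in ((1, 0, 0), (0, 1, 0), (-1, 0, 0), (0, -1, 0))]
defect = angle_defect(apex, ring)   # positive: locally convex, sphere-like
```

Regions of strongly positive or negative defect mark corners and saddles of the surface, which is the kind of geometric signal an automated block-placement scheme can latch onto.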

  7. Two dimensional barcode-inspired automatic analysis for arrayed microfluidic immunoassays

    Science.gov (United States)

    Zhang, Yi; Qiao, Lingbo; Ren, Yunke; Wang, Xuwei; Gao, Ming; Tang, Yunfang; Jeff Xi, Jianzhong; Fu, Tzung-May; Jiang, Xingyu

    2013-01-01

    The usability of many high-throughput lab-on-a-chip devices in point-of-care applications is currently limited by the manual data acquisition and analysis process, which are labor intensive and time consuming. Based on our original design in the biochemical reactions, we proposed here a universal approach to perform automatic, fast, and robust analysis for high-throughput array-based microfluidic immunoassays. Inspired by two-dimensional (2D) barcodes, we incorporated asymmetric function patterns into a microfluidic array. These function patterns provide quantitative information on the characteristic dimensions of the microfluidic array, as well as mark its orientation and origin of coordinates. We used a computer program to perform automatic analysis for a high-throughput antigen/antibody interaction experiment in 10 s, which was more than 500 times faster than conventional manual processing. Our method is broadly applicable to many other microchannel-based immunoassays. PMID:24404030

  8. Automatic localization of cerebral cortical malformations using fractal analysis

    Science.gov (United States)

    De Luca, A.; Arrigoni, F.; Romaniello, R.; Triulzi, F. M.; Peruzzo, D.; Bertoldo, A.

    2016-08-01

    Malformations of cortical development (MCDs) encompass a variety of brain disorders affecting the normal development and organization of the brain cortex. The relatively low incidence and the extreme heterogeneity of these disorders hamper the application of classical group-level approaches for the detection of lesions. Here, we present a geometrical descriptor for a voxel-level analysis based on fractal geometry, then define two similarity measures to detect the lesions at the single-subject level. The pipeline was applied to 15 normal children and nine pediatric patients affected by MCDs following two criteria, maximum accuracy (WACC) and minimization of false positives (FPR), and proved that our lesion detection algorithm is able to detect and locate abnormalities of the brain cortex with high specificity (WACC = 85%, FPR = 96%), sensitivity (WACC = 83%, FPR = 63%) and accuracy (WACC = 85%, FPR = 90%). The combination of global and local features proves to be effective, making the algorithm suitable for the detection of both focal and diffuse malformations. Compared to other existing algorithms, this method shows higher accuracy and sensitivity.
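The abstract's descriptor rests on fractal geometry. One standard estimator in that family is the box-counting dimension, sketched below on a toy binary image; the authors' actual voxel-level descriptor is not reproduced, and the mask and box sizes are assumptions for illustration:

```python
import numpy as np

# Box-counting fractal dimension estimate on a binary mask.
# A straight line has dimension ~1, so the estimate should recover that.
def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    counts = []
    for s in sizes:
        h, w = mask.shape
        # Tile the mask into s-by-s boxes and count occupied boxes.
        boxes = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(boxes.any(axis=(1, 3))))
    # Slope of log(count) against log(1/size) estimates the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

img = np.zeros((64, 64), dtype=bool)
img[32, :] = True                      # a straight line across the image
dim = box_counting_dimension(img)      # expected near 1.0
```

In a cortical application the same counting is done over a 3-D voxel neighbourhood, and deviations of the local estimate from healthy-cortex values flag candidate lesions.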

  9. Analysis of Fiber deposition using Automatic Image Processing Method

    Science.gov (United States)

    Belka, M.; Lizal, F.; Jedelsky, J.; Jicha, M.

    2013-04-01

    Fibers are a permanent threat to human health. They have the ability to penetrate deep into the human lung, deposit there and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways with an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. Deposited fibers were rinsed from the model and placed on nitrocellulose filters after the delivery. A novel method based on the principle of image analysis was established for deposition data acquisition. The images were captured by a high-definition camera attached to a phase-contrast microscope. Results of the new method were compared with the standard PCM method, which follows methodology NIOSH 7400, and a good match was found. The new method was found applicable for the evaluation of fibers, and deposition fraction and deposition efficiency were calculated afterwards.

  10. Analysis of Fiber deposition using Automatic Image Processing Method

    Directory of Open Access Journals (Sweden)

    Jicha M.

    2013-04-01

    Full Text Available Fibers are a permanent threat to human health. They have the ability to penetrate deep into the human lung, deposit there and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways with an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. Deposited fibers were rinsed from the model and placed on nitrocellulose filters after the delivery. A novel method based on the principle of image analysis was established for deposition data acquisition. The images were captured by a high-definition camera attached to a phase-contrast microscope. Results of the new method were compared with the standard PCM method, which follows methodology NIOSH 7400, and a good match was found. The new method was found applicable for the evaluation of fibers, and deposition fraction and deposition efficiency were calculated afterwards.

  11. Automatic proximate analysis of coal using a general-purpose robot

    Energy Technology Data Exchange (ETDEWEB)

    Maeda, K.

    1986-01-01

    In order to ensure supplies of blast furnace coke at Nippon Kokan's Fukuyama Works, the results of analysis of coal (as-received and blended) and coke are used in the selection of appropriate operating conditions for the coke ovens and blast furnaces. The author discusses the following topics: 1) selection of items for automatic analysis; 2) analytic methods used (weight loss, combustion, volumetric analysis and titration); 3) the nature of the automated system adopted (arrangement of apparatus, functions of the robot, system configuration and software used); and 4) the degree of precision obtained in analysis. Typical analytic results are given. 4 refs., 16 figs., 5 tabs.

  12. Fully Automatic System for Accurate Localisation and Analysis of Cephalometric Landmarks in Lateral Cephalograms.

    Science.gov (United States)

    Lindner, Claudia; Wang, Ching-Wei; Huang, Cheng-Ta; Li, Chung-Hsing; Chang, Sheng-Wei; Cootes, Tim F

    2016-01-01

    Cephalometric tracing is a standard analysis tool for orthodontic diagnosis and treatment planning. The aim of this study was to develop and validate a fully automatic landmark annotation (FALA) system for finding cephalometric landmarks in lateral cephalograms and its application to the classification of skeletal malformations. Digital cephalograms of 400 subjects (age range: 7-76 years) were available. All cephalograms had been manually traced by two experienced orthodontists with 19 cephalometric landmarks, and eight clinical parameters had been calculated for each subject. A FALA system to locate the 19 landmarks in lateral cephalograms was developed. The system was evaluated via comparison to the manual tracings, and the automatically located landmarks were used for classification of the clinical parameters. The system achieved an average point-to-point error of 1.2 mm, and 84.7% of landmarks were located within the clinically accepted precision range of 2.0 mm. The automatic landmark localisation performance was within the inter-observer variability between two clinical experts. The automatic classification achieved an average classification accuracy of 83.4% which was comparable to an experienced orthodontist. The FALA system rapidly and accurately locates and analyses cephalometric landmarks in lateral cephalograms, and has the potential to significantly improve the clinical work flow in orthodontic treatment. PMID:27645567

  13. Instruments and scope of possible action of the Federal States for global climate protection. Vol. 1: Analysis. Vol. 2: Baseline data. Vol. 3: Legal opinion; Instrumente und Handlungsmoeglichkeiten der Bundeslaender zum Klimaschutz. Bd. 1: Analyseband. Bd. 2: Materialband. Bd. 3: Rechtsgutachten

    Energy Technology Data Exchange (ETDEWEB)

    Blechschmidt, K.; Herbert, W.; Barzantny, K.; Dittmann, W.; Gehm, C.; Menges, R.; Moehring-Hueser, W.; Wortmann, K.; Herbert, W. [Energiestiftung Schleswig-Holstein, Kiel (Germany); Penschke, A. [Zentrum fuer Rationelle Energieanwendung und Umwelt GmbH, Regensburg (Germany); Groth, A.; Kusche, C.

    2001-07-01

    In the analysis volume, (vol. 1), the legal and political frameworks governing the greenhouse gas reduction policy adopted by the Federal states of Germany are explained. The evolution of the policies, goals pursued and points of major emphasis, the orientation for the future and existing impediments for implementation are analysed. Volume 2 contains a comprehensive baseline data library characterising the present situation, as well as examples selected by the Federal states, illustrating policy schemes which proved to be successful. The legal opinion presented in volume 3 explains the distribution of competences for climate policy of the Federal Government and the state governments in accordance with constitutional law, as well as channels of implementing policy schemes via legislation issued by the Federal states, or administrative action in execution of the law, or other administrative action. (orig./CB) [Translated from German] In the analysis volume (volume 1), the legal and political framework conditions of the states' climate protection policy are presented, followed by an analysis of this policy with regard to its development, goals and focal points, existing obstacles, future orientation, options for action, and examples of opening up new fields of activity for the states. The materials volume (volume 2) contains, in addition to comprehensive data describing the situation, successful examples selected by the states of the implementation of climate protection measures. A legal opinion (volume 3) sets out in detail the constitutional basis for the distribution of competences between the Federal Government and the states in the field of climate protection and examines possibilities for implementing climate protection policy through state legislation as well as law-executing and other administrative activity. (orig./CB)

  14. CADLIVE toolbox for MATLAB: automatic dynamic modeling of biochemical networks with comprehensive system analysis.

    Science.gov (United States)

    Inoue, Kentaro; Maeda, Kazuhiro; Miyabe, Takaaki; Matsuoka, Yu; Kurata, Hiroyuki

    2014-09-01

    Mathematical modeling has become a standard technique to understand the dynamics of complex biochemical systems. To promote such modeling, we had developed the CADLIVE dynamic simulator, which automatically converted a biochemical map into its associated mathematical model, simulated its dynamic behaviors and analyzed its robustness. To enhance the feasibility of CADLIVE and extend its functions, we propose the CADLIVE toolbox for MATLAB, which implements not only the existing functions of the CADLIVE dynamic simulator, but also the latest tools, including global parameter search methods with robustness analysis. The seamless, bottom-up processes consisting of biochemical network construction, automatic construction of its dynamic model, simulation, optimization, and S-system analysis greatly facilitate dynamic modeling, contributing to research in systems biology and synthetic biology. This application can be freely downloaded from http://www.cadlive.jp/CADLIVE_MATLAB/ together with an instruction.

  15. Automatic diagnosis of pneumoconiosis by texture analysis of chest X-ray images

    International Nuclear Information System (INIS)

    This paper presents a study on automatic diagnosis of pneumoconiosis by texture analysis of chest X-ray images. A pre-processing method for normalizing distribution of film density is proposed and its effectiveness is studied experimentally. Dominant causes of the deviations of texture features for evaluating the profusion of small opacities are excluded by the pre-processing. New texture features based on the distribution of gradient vectors are also proposed. Experiments to evaluate the proposed system are demonstrated

  16. Analysis of outdoor radon progeny concentration measured at the Spanish radioactive aerosol automatic monitoring network

    International Nuclear Information System (INIS)

    An analysis of 10-year radon progeny data, provided by the Spanish automatic radiological surveillance network, in relation to meteorology is presented. Results show great spatial variability depending mainly on the station location and thus, the surrounding radon exhalation rate. Hourly averages show the typical diurnal cycle with an early morning maximum and a minimum at noon, except for one mountain station, which shows an inverse behaviour. Monthly averaged values show lower concentrations during months with higher atmospheric instability.

  17. Technical characterization by image analysis: an automatic method of mineralogical studies

    International Nuclear Information System (INIS)

    The application of a modern, fully automated image analysis method to the study of grain size distributions, modal analyses, degree of liberation and mineralogical associations is discussed. The image analyser is interfaced with a scanning electron microscope and an energy-dispersive X-ray analyser. The image generated by backscattered electrons is analysed automatically, and the system has been used in assessment studies of applied mineralogy as well as in process control in the mining industry. (author)

  18. Automatic Evaluation for E-Learning Using Latent Semantic Analysis: A Use Case

    Directory of Open Access Journals (Sweden)

    Mireia Farrús

    2013-03-01

    Full Text Available Assessment in education allows for obtaining, organizing, and presenting information about how much and how well the student is learning. The current paper analyses and discusses some state-of-the-art assessment systems in education. This work then presents a specific use case developed for the Universitat Oberta de Catalunya, an online university. An automatic evaluation tool is proposed that allows students to evaluate themselves at any time and receive instant feedback. This tool is a web-based platform, and it has been designed for engineering subjects (i.e., with math symbols and formulas) in Catalan and Spanish. Particularly, the technique used for automatic assessment is latent semantic analysis. Although the experimental framework from the use case is quite challenging, results are promising.
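The core of latent semantic analysis is a truncated SVD of a term-document matrix, with answers compared in the latent space. A minimal NumPy sketch, where the toy answers, rank-2 truncation, and similarity-based scoring are all assumptions for illustration (the tool described above is not reproduced):

```python
import numpy as np

# Toy term-document matrix: a reference answer, a paraphrased student
# answer, and an off-topic one. All strings are invented examples.
docs = [
    "ohms law relates voltage current resistance",      # reference answer
    "voltage equals current times resistance",          # on-topic answer
    "the mitochondria is the powerhouse of the cell",   # off-topic answer
]
vocab = sorted({w for d in docs for w in d.split()})
counts = np.array([[d.split().count(w) for w in vocab] for d in docs], float)

# Rank-2 truncated SVD: documents projected into the latent semantic space.
U, S, Vt = np.linalg.svd(counts, full_matrices=False)
latent = U[:, :2] * S[:2]

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

good = cosine(latent[0], latent[1])   # paraphrase scores high
bad = cosine(latent[0], latent[2])    # off-topic answer scores low
```

An automatic grader can then map the cosine similarity against the reference answer onto a mark or instant feedback message.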

  19. Correlation analysis-based image segmentation approach for automatic agriculture vehicle

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    It is important to segment images correctly to extract guidance information for an automatic agriculture vehicle. If the computer can determine where the crops are, the guidance line can be extracted easily. Images were divided into small rectangular windows, and a pair of 1-D arrays was constructed in each window. The correlation coefficients of the windows formed the features used to segment the images. The results showed that correlation analysis is a promising approach for processing complex farmland scenes for a guidance system, and further correlation analysis methods remain to be researched.
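The window-wise correlation feature can be sketched on synthetic data: regular crop-row texture correlates strongly between neighbouring rows, while random soil texture does not. The window construction, 32x32 size, and 0.5 threshold below are illustrative assumptions, not the paper's exact procedure:

```python
import numpy as np

# Synthetic textures: vertically uniform "crop row" stripes vs random soil.
rng = np.random.default_rng(1)
crop = np.tile(np.sin(np.linspace(0, 8 * np.pi, 32)), (32, 1))  # stripes
soil = rng.standard_normal((32, 32))

def window_correlation(window):
    """Correlation between a window's two interleaved row averages,
    a simple stand-in for the paper's pair of 1-D arrays."""
    a = window[0::2].mean(axis=0)   # average of even rows
    b = window[1::2].mean(axis=0)   # average of odd rows
    return float(np.corrcoef(a, b)[0, 1])

crop_score = window_correlation(crop)   # strongly correlated texture
soil_score = window_correlation(soil)   # weakly correlated noise
is_crop = crop_score > 0.5              # illustrative threshold
```

Thresholding the per-window score yields a crop/background mask from which a guidance line can be fitted.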

  20. NGS-Trex: an automatic analysis workflow for RNA-Seq data.

    Science.gov (United States)

    Boria, Ilenia; Boatti, Lara; Saggese, Igor; Mignone, Flavio

    2015-01-01

    RNA-Seq technology allows the rapid analysis of whole transcriptomes, taking advantage of next-generation sequencing platforms. Moreover, with the constantly decreasing cost of NGS analysis, RNA-Seq is becoming very popular and widespread. Unfortunately, data analysis is quite demanding in terms of the bioinformatic skills and infrastructure required, thus limiting the potential users of this method. Here we describe the complete analysis of sample data, from raw sequences to data mining of results, using the NGS-Trex platform, a low-user-interaction, fully automatic analysis workflow. Used through a web interface, NGS-Trex processes data and profiles the transcriptome of the samples, identifying expressed genes, transcripts, and new and known splice variants. It also detects differentially expressed genes and transcripts across different experiments.

  1. Automatic evaluation and data generation for analytical chemistry instrumental analysis exercises

    Directory of Open Access Journals (Sweden)

    Arsenio Muñoz de la Peña

    2014-01-01

    Full Text Available In general, laboratory activities are costly in terms of time, space, and money. As such, the ability to provide realistically simulated laboratory data that enables students to practice data analysis techniques as a complementary activity would be expected to reduce these costs while opening up very interesting possibilities. In the present work, a novel methodology is presented for the design of analytical chemistry instrumental analysis exercises that can be automatically personalized for each student and the results evaluated immediately. The proposed system provides each student with a different set of experimental data, generated randomly while satisfying a set of constraints, rather than using data obtained from actual laboratory work. This allows the instructor to provide students with a set of practical problems to complement their regular laboratory work, along with the corresponding feedback provided by the system's automatic evaluation process. To this end, the Goodle Grading Management System (GMS, an innovative web-based educational tool for automating the collection and assessment of practical exercises for engineering and scientific courses, was developed. The proposed methodology takes full advantage of the Goodle GMS fusion code architecture. The design of a particular exercise is provided ad hoc by the instructor and requires basic Matlab knowledge. The system has been employed with satisfactory results in several university courses. To demonstrate the automatic evaluation process, three exercises are presented in detail. The first exercise involves a linear regression analysis of data and the calculation of the quality parameters of an instrumental analysis method. The second and third exercises address two different comparison tests, a comparison test of the mean and a paired t-test.
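The first exercise combines constrained random data generation with automatic grading of a linear regression fit. A minimal sketch of that idea follows; the platform itself is Matlab-based, so this Python version, with its invented calibration ranges, noise level, and tolerance, is only an illustration of the workflow:

```python
import numpy as np

# Hypothetical per-student calibration data: random slope within a
# constrained range, plus small measurement noise.
def generate_calibration(seed, n=6):
    rng = np.random.default_rng(seed)          # one dataset per student
    slope = rng.uniform(0.8, 1.2)              # constrained true slope
    conc = np.linspace(0.0, 10.0, n)           # standard concentrations
    signal = slope * conc + rng.normal(0.0, 0.02, n)
    return conc, signal

# Automatic grading: the submitted slope is checked against the
# least-squares fit of that student's own dataset.
def grade_slope(conc, signal, submitted, tol=0.05):
    ref_slope, _ = np.polyfit(conc, signal, 1)
    return abs(submitted - ref_slope) <= tol

conc, signal = generate_calibration(seed=42)
ref_slope, _ = np.polyfit(conc, signal, 1)
print(grade_slope(conc, signal, ref_slope))        # exact answer passes
print(grade_slope(conc, signal, ref_slope + 0.2))  # off answer fails
```

Because each seed produces a different yet constraint-satisfying dataset, copying a classmate's numbers is useless while the grading logic stays identical for everyone.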

  2. Automatic generation of stop word lists for information retrieval and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Stuart J

    2013-01-08

    Methods and systems for automatically generating lists of stop words for information retrieval and analysis. Generation of the stop words can include providing a corpus of documents and a plurality of keywords. From the corpus of documents, a term list of all terms is constructed and both a keyword adjacency frequency and a keyword frequency are determined. If a ratio of the keyword adjacency frequency to the keyword frequency for a particular term on the term list is less than a predetermined value, then that term is excluded from the term list. The resulting term list is truncated based on predetermined criteria to form a stop word list.
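The ratio rule described above can be sketched in a few lines of Python. This is an illustrative simplification of the described scheme, not the actual implementation: the function name, the one-word adjacency window, the handling of terms never seen inside a keyword, and the default threshold are all assumptions.

```python
from collections import Counter

def build_stop_words(documents, keywords, ratio_threshold=1.0):
    """Keep a term as a stop-word candidate unless its keyword-adjacency
    frequency divided by its keyword frequency falls below the threshold
    (terms never seen inside a keyword are kept by convention here)."""
    keyword_terms = {t for kw in keywords for t in kw.lower().split()}
    adjacency = Counter()  # times the term occurs next to a keyword term
    frequency = Counter()  # times the term occurs inside a keyword
    for doc in documents:
        words = doc.lower().split()
        for i, w in enumerate(words):
            if w in keyword_terms:
                frequency[w] += 1
            neighbours = words[max(0, i - 1):i] + words[i + 1:i + 2]
            if any(n in keyword_terms for n in neighbours):
                adjacency[w] += 1
    terms = sorted({w for d in documents for w in d.lower().split()})
    # exclude terms whose adjacency/frequency ratio is below the threshold
    return [t for t in terms
            if not (frequency[t] and adjacency[t] / frequency[t] < ratio_threshold)]
```

Content-bearing terms, which occur inside keywords more often than next to them, fall below the ratio and are dropped, while function words such as "the" survive into the stop word list.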

  3. Determination of porosity of pyrocarbon by means of the automatic quantitative image analysis

    International Nuclear Information System (INIS)

Quantitative image analysis has long been known as a method for quantifying the results of materials investigations based on ceramography. The development of automatic image analysers has made it a fast and elegant evaluation procedure. It is used to determine easily and routinely the macroporosity, and thereby the density, of the pyrocarbon coatings of nuclear fuel particles. This report describes the definition of the measuring parameters, the measuring procedure, the mathematical calculations, and first experimental and mathematical results. (orig.)

  4. Modeling Earthen Dike Stability: Sensitivity Analysis and Automatic Calibration of Diffusivities Based on Live Sensor Data

    CERN Document Server

    Melnikova, N B; Sloot, P M A

    2012-01-01

The paper describes the concept and implementation details of integrating Virtual Dike, a finite element module for dike stability analysis, into an early warning system for flood protection. The module operates in real-time mode and includes fluid and structural sub-models for simulating porous flow through the dike and for dike stability analysis. Real-time measurements obtained from pore pressure sensors are fed into the simulation module to be compared with simulated pore pressure dynamics. The module has been implemented for a real-world test case: an earthen levee protecting a sea-port in Groningen, the Netherlands. Sensitivity analysis and calibration of diffusivities have been performed for tidal fluctuations. An algorithm for automatic calibration of diffusivities for a heterogeneous dike is proposed and studied. Analytical solutions describing tidal propagation in a one-dimensional saturated aquifer are employed in the algorithm to generate initial estimates of the diffusivities.

  5. Multidirectional channeling analysis of epitaxial CdTe layers using an automatic RBS/channeling system

    Energy Technology Data Exchange (ETDEWEB)

    Wielunski, L.S.; Kenny, M.J. [CSIRO, Lindfield, NSW (Australia). Applied Physics Div.

    1993-12-31

Rutherford Backscattering Spectrometry (RBS) is an ion beam analysis technique used in many fields. The high depth and mass resolution of RBS make this technique very useful in semiconductor material analysis [1]. The use of ion channeling in combination with RBS creates a powerful technique which can provide information about crystal quality and structure in addition to mass and depth resolution [2]. The presence of crystal defects such as interstitial atoms, dislocations or dislocation loops can be detected and profiled [3,4]. Semiconductor materials such as CdTe, HgTe and Hg{sub x}Cd{sub 1-x}Te generate considerable interest due to applications as infrared detectors in many technological areas. The present paper demonstrates how automatic RBS and multidirectional channeling analysis can be used to evaluate crystal quality and near surface defects. 6 refs., 1 fig.

  6. Automatic analysis of image of surface structure of cell wall-deficient EVC.

    Science.gov (United States)

    Li, S; Hu, K; Cai, N; Su, W; Xiong, H; Lou, Z; Lin, T; Hu, Y

    2001-01-01

Some computer applications for cell characterization in medicine and biology, such as the analysis of the surface structure of cell wall-deficient EVC (El Tor Vibrio of Cholera), operate with cell samples taken from very small areas of interest. In order to perform texture characterization in such an application, only a few texture operators can be employed: the operators should be insensitive to noise and image distortion and be reliable in estimating texture quality from images. Therefore, we introduce wavelet theory and mathematical morphology to analyse the cellular surface micro-area images obtained by SEM (Scanning Electron Microscope). In order to describe the quality of the surface structure of cell wall-deficient EVC, we propose a fully automatic computerized method. The image analysis process is carried out in two steps. In the first, we decompose the given image by dyadic wavelet transform and form an image approximation with higher resolution; by doing so, we perform edge detection of the given images efficiently. In the second, we apply operations of mathematical morphology to obtain morphological quantitative parameters of the surface structure of cell wall-deficient EVC. The obtained results prove that the method can eliminate noise, detect edges and extract the feature parameters reliably. In this work, we have built automatic analysis software named "EVC.CELL".
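The morphological half of such a pipeline can be illustrated with a minimal, dependency-free sketch: a 3x3 binary dilation and erosion whose difference (the morphological gradient) marks boundary pixels. The function names and the structuring element are assumptions for illustration, not the authors' code.

```python
def dilate(img):
    """3x3 binary dilation: a pixel becomes 1 if any neighbour is 1."""
    h, w = len(img), len(img[0])
    return [[int(any(img[ny][nx]
                     for ny in range(max(0, y - 1), min(h, y + 2))
                     for nx in range(max(0, x - 1), min(w, x + 2))))
             for x in range(w)] for y in range(h)]

def erode(img):
    """3x3 binary erosion: a pixel stays 1 only if all neighbours are 1."""
    h, w = len(img), len(img[0])
    return [[int(all(img[ny][nx]
                     for ny in range(max(0, y - 1), min(h, y + 2))
                     for nx in range(max(0, x - 1), min(w, x + 2))))
             for x in range(w)] for y in range(h)]

def morphological_edge(img):
    """Morphological gradient: dilation minus erosion marks the boundary."""
    d, e = dilate(img), erode(img)
    return [[d[y][x] - e[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]
```

Applied to a filled square, the gradient keeps only the rim, which is the kind of edge map further morphological measurements operate on.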

  7. Selection of Entropy Based Features for Automatic Analysis of Essential Tremor

    Directory of Open Access Journals (Sweden)

    Karmele López-de-Ipiña

    2016-05-01

Biomedical systems produce biosignals that arise from interaction mechanisms. In general, those mechanisms occur across multiple scales, both spatial and temporal, and contain linear and non-linear information. In this framework, entropy measures are good candidates for providing useful evidence about disorder in the system, lack of information in time-series and/or irregularity of the signals. The most common movement disorder is essential tremor (ET), which occurs 20 times more frequently than Parkinson's disease. Interestingly, about 50%–70% of the cases of ET have a genetic origin. One of the standard tests most widely used for clinical diagnosis of ET is Archimedes' spiral drawing. This work focuses on the selection of non-linear biomarkers from such drawings and handwriting; it is part of a wider cross study on the diagnosis of essential tremor, in which our piece of research presents the selection of entropy features for early ET diagnosis. Classic entropy features are compared with features based on permutation entropy. An automatic analysis system built on several machine learning paradigms is applied, while automatic feature selection is implemented by means of the ANOVA (analysis of variance) test. The obtained results for early detection are promising and appear applicable to real environments.
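Permutation entropy, the feature family the study compares against classic entropies, is straightforward to compute: count the ordinal patterns of short windows of the signal and take the Shannon entropy of their distribution (Bandt-Pompe). A minimal sketch, with the embedding order and normalization defaults assumed:

```python
from math import log2, factorial

def permutation_entropy(series, order=3, normalize=True):
    """Shannon entropy of the distribution of ordinal patterns of
    length `order`; normalized to [0, 1] by log2(order!)."""
    counts = {}
    n = len(series) - order + 1
    for i in range(n):
        window = series[i:i + order]
        # ordinal pattern: argsort of the window values
        pattern = tuple(sorted(range(order), key=lambda k: window[k]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = [c / n for c in counts.values()]
    h = -sum(p * log2(p) for p in probs)
    if normalize:
        h /= log2(factorial(order))
    return h
```

A monotone drawing trace yields a single pattern and entropy 0, while an irregular trace spreads probability over many patterns and pushes the value towards 1, which is what makes the measure a candidate tremor biomarker.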

  8. Design of phantoms and software for automatic image analysis applied to digital radiographic equipments

    International Nuclear Information System (INIS)

In quality control of radiographic equipment, the quality of the obtained image is very useful for characterizing the physical properties of the radiographic imaging chain. In the radiographic technique, the evaluation of the image must guarantee the constancy of its quality so that a suitable diagnosis can be made. The use of digital systems allows automatic analysis of the obtained radiographic images, increasing the objectivity of the image evaluation. In this work we have designed radiographic phantoms for different digital radiographic devices: dental and conventional equipment with computed radiography (phosphor plate) and direct radiography (sensor) technology. Additionally, we have developed software to analyse the images obtained by the radiographic equipment using digital processing techniques such as edge detectors, morphological operators, and statistical tests for the detected combinations. The images have been acquired in DICOM, TIFF, and other formats, and they can be analysed with objective parameters such as an image quality index and the contrast-detail curve. The design of these phantoms allows evaluation over a wide range of operating conditions of voltage, current and time of the digital equipment. Moreover, the automatic software allows the image quality and the functioning of the imaging chain of the digital system to be studied with objective parameters. (author)

  9. Analysis and Development of FACE Automatic Apparatus for Rapid Identification of Transuranium Isotopes

    Energy Technology Data Exchange (ETDEWEB)

    Sebesta, E.H.

    1978-09-01

A description of and operating manual for the FACE Automatic Apparatus have been written, along with documentation of the FACE machine operating program, to provide a user manual for the FACE Automatic Apparatus. In addition, FACE machine performance was investigated to improve transuranium throughput. An analysis of the causes of transuranium isotope loss, both chemical and radioactive, was undertaken. To lower the radioactive loss, the dynamics of the most time-consuming step of the FACE machine was investigated: drying and flaming the chromatographic column output droplets in preparation of the sample for alpha spectroscopy and counting. A series of droplets was dried in an experimental apparatus, demonstrating that droplets could be dried significantly faster through more intense heating, enabling the FACE machine cycle to be shortened by 30-60 seconds. Proposals incorporating these ideas were provided for FACE machine development. The 66% chemical loss of product was analyzed and changes were proposed to reduce the radioisotope product loss. An analysis of the chromatographic column was also provided. All operating steps in the FACE machine are described and analyzed to provide a complete guide, along with the proposals for machine improvement.

  10. [Design and analysis of automatic measurement instrument for diffraction efficiency of plane reflection grating].

    Science.gov (United States)

    Wang, Fang; Qi, Xiang-Dong; Yu, Hong-Zhu; Yu, Hai-Li

    2009-02-01

A new system that automatically measures the diffraction efficiency of plane reflection gratings was designed. A continuous light source was adopted for illumination, a duplex grating spectrograph structure was applied, and a linear NMOS array was used as the receiving component. Applying the relevant principles of the grating spectrograph, a theoretical analysis of the testing system was carried out. Drawing on the aberration theory of geometrical optics, the image quality of the optical system was analyzed. The analysis indicated that the device structure is compact and the electronics are simplified. The system does not suffer from the wavelength-sweep synchronization problem of the two grating spectrographs, and its wavelength repeatability is very good, so precision is easy to guarantee. Compared with the former automated scheme, the production cost is reduced; moreover, it is easy to operate and working efficiency is enhanced. The study showed that this automatic measurement system covers a spectral range of 190-1100 nm with a resolution of less than 3 nm, which entirely satisfies the design requirements. It is an economical and feasible design. PMID:19445251

  11. Computer program for analysis of impedance cardiography signals enabling manual correction of points detected automatically

    Science.gov (United States)

    Oleksiak, Justyna; Cybulski, Gerard

    2014-11-01

The aim of this work was to create a computer program, written in LabVIEW, which enables the visualization and analysis of hemodynamic parameters. It allows the user to import data collected using ReoMonitor, an ambulatory impedance cardiography (AICG) monitoring device. The data include one channel of the ECG and one channel of the first derivative of the impedance signal (dz/dt), both sampled at 200 Hz, and the base impedance signal (Z0), sampled every 8 s. The program consists of two parts: a bioscope allowing the presentation of traces (ECG, AICG, Z0) and an analytical portion enabling the detection of characteristic points on the signals and automatic calculation of hemodynamic parameters. The detection of characteristic points in both signals is done automatically, with the option to make manual corrections, which may be necessary to avoid "false positive" recognitions. The application is used to determine the values of basic hemodynamic variables: pre-ejection period (PEP), left ventricular ejection time (LVET), stroke volume (SV), cardiac output (CO), and heart rate (HR). It leaves room for further development of additional features, for both the analysis panel and the data acquisition function.
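Once the characteristic points are placed, the derived variables follow from simple arithmetic. A sketch of that arithmetic is given below; the Kubicek stroke-volume formula is one common choice for impedance cardiography, but the paper does not state which formula the program uses, so both the formula choice and the parameter values are assumptions.

```python
def heart_rate(rr_intervals_s):
    """Mean heart rate in beats per minute from R-R intervals (seconds)."""
    return 60.0 / (sum(rr_intervals_s) / len(rr_intervals_s))

def stroke_volume_kubicek(rho, l_cm, z0, dzdt_max, lvet_s):
    """Kubicek estimate of stroke volume (ml) from blood resistivity rho,
    electrode distance L, base impedance Z0, peak dz/dt and LVET --
    an assumed formula, not necessarily the one used by the program."""
    return rho * (l_cm / z0) ** 2 * dzdt_max * lvet_s

def cardiac_output(sv_ml, hr_bpm):
    """Cardiac output in l/min from stroke volume (ml) and heart rate."""
    return sv_ml * hr_bpm / 1000.0
```

With typical resting values (SV around 80 ml, HR around 75 bpm) this yields a cardiac output near 6 l/min, the usual physiological range.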

  12. Automatic quantitative analysis of microstructure of ductile cast iron using digital image processing

    Directory of Open Access Journals (Sweden)

    Abhijit Malage

    2015-09-01

Ductile cast iron is also referred to as nodular iron or spheroidal graphite iron. It contains graphite in the form of discrete nodules in a matrix of ferrite and pearlite. In order to determine the mechanical properties, one needs to determine the volume of the phases in the matrix and the nodularity in the microstructure of the metal sample. The manual methods available for this are time-consuming, and their accuracy depends on expertise. The paper proposes a novel method for automatic quantitative analysis of the microstructure of ferritic-pearlitic ductile iron which calculates the volume of phases and the nodularity of the sample. It gives results within a very short time (approximately 5 s), with 98% accuracy for the volume of the matrix phases and 90% accuracy for nodule detection and analysis, which are within the ranges specified in the standard for SG 500/7 and were validated by a metallurgist.
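The two quantities such a method reports, the volume (area) fraction of each phase and the nodularity, can be illustrated with a toy sketch on a labelled micrograph. The label coding and the circularity-based nodularity definition below are assumptions for illustration, not the paper's exact formulas.

```python
from math import pi

def phase_fractions(labels):
    """Area fraction of each phase in a labelled image; the coding
    0 = ferrite, 1 = pearlite, 2 = graphite is assumed here."""
    flat = [p for row in labels for p in row]
    return {phase: flat.count(phase) / len(flat) for phase in set(flat)}

def nodularity(areas, perimeters):
    """Mean circularity 4*pi*A/P^2 of the graphite particles;
    1.0 for a perfect circle (one common shape-based definition)."""
    circ = [4 * pi * a / (p * p) for a, p in zip(areas, perimeters)]
    return sum(circ) / len(circ)
```

By stereology, the area fraction measured on a random section estimates the volume fraction of the phase, which is why a 2D segmentation suffices for the mechanical-property correlations.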

  13. Automatic simplification of systems of reaction-diffusion equations by a posteriori analysis.

    Science.gov (United States)

    Maybank, Philip J; Whiteley, Jonathan P

    2014-02-01

    Many mathematical models in biology and physiology are represented by systems of nonlinear differential equations. In recent years these models have become increasingly complex in order to explain the enormous volume of data now available. A key role of modellers is to determine which components of the model have the greatest effect on a given observed behaviour. An approach for automatically fulfilling this role, based on a posteriori analysis, has recently been developed for nonlinear initial value ordinary differential equations [J.P. Whiteley, Model reduction using a posteriori analysis, Math. Biosci. 225 (2010) 44-52]. In this paper we extend this model reduction technique for application to both steady-state and time-dependent nonlinear reaction-diffusion systems. Exemplar problems drawn from biology are used to demonstrate the applicability of the technique. PMID:24418010

  14. Automatic geocoding of high-value targets using structural image analysis and GIS data

    Science.gov (United States)

    Soergel, Uwe; Thoennessen, Ulrich

    1999-12-01

Geocoding based merely on navigation data and a sensor model is often not possible or not precise enough. In these cases, improving the preregistration through image-based approaches is a solution. Due to the large amount of data in remote sensing, automatic geocoding methods are necessary. For geocoding purposes, appropriate tie points, present in both image and map, have to be detected and matched. The tie points are the basis of the transformation function. Assigning the tie points is a combinatorial problem whose size depends on the number of tie points. This number can be reduced by using structural tie points such as corners or crossings of prominent extended targets (e.g. harbors, airfields); additionally, the reliability of the tie points is improved. Our approach extracts structural tie points independently in the image and in the vector map by model-based image analysis. The vector map is provided by a GIS using the ATKIS data base. The model parameters are extracted from maps or collateral information about the scenario. The two sets of tie points are automatically matched with a Geometric Hashing algorithm. The algorithm was successfully applied to VIS, IR and SAR data.

  15. Simple automatic strategy for background drift correction in chromatographic data analysis.

    Science.gov (United States)

    Fu, Hai-Yan; Li, He-Dong; Yu, Yong-Jie; Wang, Bing; Lu, Peng; Cui, Hua-Peng; Liu, Ping-Ping; She, Yuan-Bin

    2016-06-01

Chromatographic background drift correction, which influences peak detection and time-shift alignment results, is a critical stage in chromatographic data analysis. In this study, an automatic background drift correction methodology was developed. Local minimum values in a chromatogram were initially detected and organized as a new baseline vector. Iterative optimization was then employed to recognize outliers in this vector, which belong to the chromatographic peaks, and to update them in the baseline until convergence. The optimized baseline vector was finally expanded over the original chromatogram, with linear interpolation employed to estimate the background drift. The principle underlying the proposed method was confirmed using a complex gas chromatographic dataset. Finally, the proposed approach was applied to eliminate background drift in liquid chromatography quadrupole time-of-flight data from a metabolic study of Escherichia coli samples. The proposed method was compared with three classical techniques: morphological weighted penalized least squares, the moving-window minimum value strategy and background drift correction by orthogonal subspace projection. The proposed method allows almost fully automatic implementation of background drift correction, which is convenient for practical use.
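The three stages described (local-minima baseline vector, iterative outlier rejection, linear interpolation back to full length) can be sketched as below. The simple z-score rejection rule stands in for the paper's iterative optimization and is an assumption, as are the parameter defaults.

```python
def correct_baseline(signal, z=2.0, max_iter=100):
    """Estimate and subtract background drift from a chromatogram."""
    # 1. baseline anchors: local minima, plus the two endpoints
    idx = [0] + [i for i in range(1, len(signal) - 1)
                 if signal[i] <= signal[i - 1] and signal[i] <= signal[i + 1]]
    idx += [len(signal) - 1]
    base = [signal[i] for i in idx]
    # 2. iterative outlier rejection: minima sitting on a peak flank are
    #    pulled down towards the mean until the vector stops changing
    for _ in range(max_iter):
        m = sum(base) / len(base)
        s = (sum((b - m) ** 2 for b in base) / len(base)) ** 0.5
        new = [m if b > m + z * s else b for b in base]
        if new == base:
            break
        base = new
    # 3. linear interpolation back to full length, then subtraction
    corrected = []
    for j in range(len(signal)):
        k = max(i for i, ii in enumerate(idx) if ii <= j)
        if k == len(idx) - 1 or idx[k] == j:
            b = base[k]
        else:
            t = (j - idx[k]) / (idx[k + 1] - idx[k])
            b = base[k] + t * (base[k + 1] - base[k])
        corrected.append(signal[j] - b)
    return corrected
```

On a flat background the anchors span the peak, the interpolated baseline passes underneath it, and subtraction leaves the peak intact while zeroing the drift.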

  16. Semi-automatic system for UV images analysis of historical musical instruments

    Science.gov (United States)

    Dondi, Piercarlo; Invernizzi, Claudia; Licchelli, Maurizio; Lombardi, Luca; Malagodi, Marco; Rovetta, Tommaso

    2015-06-01

The selection of representative areas to be analyzed is a common problem in the study of Cultural Heritage items. UV fluorescence photography is an extensively used technique to highlight specific surface features which cannot be observed in visible light (e.g. parts that were restored or treated with different materials), and it proves to be very effective in the study of historical musical instruments. In this work we propose a new semi-automatic solution for selecting areas with the same perceived color (a simple clue of similar materials) on UV photos, using a specifically designed interactive tool. The proposed method works in two steps: (i) the user selects a small rectangular area of the image; (ii) the program automatically highlights all the areas that have the same color as the selected input. The identification is performed by analyzing the image in the HSV color model, the one closest to human perception. The achievable result is more accurate than a manual selection, because the program can also detect points that users fail to recognize as similar due to perceptual illusions. The application has been developed following usability guidelines, and its human-computer interface was improved after a series of tests performed by expert and non-expert users. All the experiments were performed on UV imagery of the Stradivari violin collection held by the "Museo del Violino" in Cremona.
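Step (ii) reduces to a tolerance test in HSV space around the seed colour. A minimal sketch using the standard library's `colorsys`; the tolerance values are assumptions, and a real tool would operate on a 2D pixel grid rather than a flat list.

```python
import colorsys

def select_similar(pixels_rgb, seed_rgb, h_tol=0.05, s_tol=0.2, v_tol=0.2):
    """Flag every pixel whose HSV colour lies within a tolerance box
    around the colour of the user-selected seed."""
    sh, ss, sv = colorsys.rgb_to_hsv(*[c / 255.0 for c in seed_rgb])
    mask = []
    for r, g, b in pixels_rgb:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        dh = min(abs(h - sh), 1.0 - abs(h - sh))  # hue is circular
        mask.append(dh <= h_tol and abs(s - ss) <= s_tol
                    and abs(v - sv) <= v_tol)
    return mask
```

Working in HSV makes the similarity test perceptually meaningful: two reds of slightly different brightness match, while a green of the same brightness does not.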

  17. Hardware and software system for automatic microemulsion assay evaluation by analysis of optical properties

    Science.gov (United States)

    Maeder, Ulf; Schmidts, Thomas; Burg, Jan-Michael; Heverhagen, Johannes T.; Runkel, Frank; Fiebich, Martin

    2010-03-01

A new hardware device called the Microemulsion Analyzer (MEA), which facilitates the preparation and evaluation of microemulsions, was developed. Microemulsions, consisting of three phases (oil, surfactant and water) and prepared on deep-well plates according to the PDMPD method, can be automatically evaluated by means of their optical properties. The ratio of ingredients needed to form a microemulsion strongly depends on the properties and amounts of the ingredients used. A microemulsion assay is set up on deep-well plates to determine these ratios. The optical properties of the ingredients change from turbid to transparent as soon as a microemulsion is formed. The MEA consists of a frame and an image-processing and analysis algorithm. The frame itself comprises aluminum parts, an electroluminescent foil (ELF) and a camera. While the frame keeps the well plate at the correct position and angle, the ELF provides constant illumination of the plate from below. The camera provides an image that is processed by the algorithm to automatically evaluate the turbidity in the wells. Using the determined parameters, a phase diagram is created that visualizes the information. This setup can be used to analyze microemulsion assays and obtain results in a standardized way. In addition, it is possible to perform stability tests of the assay by creating differential stability diagrams after a period of time.

  18. Automatic yield-line analysis of slabs using discontinuity layout optimization.

    Science.gov (United States)

    Gilbert, Matthew; He, Linwei; Smith, Colin C; Le, Canh V

    2014-08-01

The yield-line method of analysis is a long-established and extremely effective means of estimating the maximum load sustainable by a slab or plate. However, although numerous attempts to automate the process of directly identifying the critical pattern of yield-lines have been made over the past few decades, to date none has proved capable of reliably analysing slabs of arbitrary geometry. Here, it is demonstrated that the discontinuity layout optimization (DLO) procedure can successfully be applied to such problems. The procedure involves discretization of the problem using nodes inter-connected by potential yield-line discontinuities, with the critical layout of these then identified using linear programming. The procedure is applied to various benchmark problems, demonstrating that highly accurate solutions can be obtained, and showing that DLO provides a truly systematic means of directly and reliably identifying yield-line patterns automatically. Finally, since the critical yield-line patterns for many problems are found to be quite complex in form, a means of automatically simplifying them is presented.

  19. [SMEAC Newsletters, Science Education, Vol. 1, No. 1--Vol. 2, No. 1, 1967-1968].

    Science.gov (United States)

    ERIC Clearinghouse for Science, Mathematics, and Environmental Education, Columbus, OH.

    Each of these newsletters, produced by the ERIC Information Analysis Center for Science, Mathematics, and Environmental Education, contains information concerning center publications and other items considered of interest to researchers and educators of various education levels. Vol. 1, No. 1 highlights selected bibliographies (no longer produced…

  20. [SMEAC Newsletters, Science Education, Vol. 2, No. 2--Vol. 2, No. 3, 1969].

    Science.gov (United States)

    ERIC Clearinghouse for Science, Mathematics, and Environmental Education, Columbus, OH.

Each of these newsletters, produced by the ERIC Information Analysis Center for Science, Mathematics, and Environmental Education, contains information concerning center publications and activities, as well as other items considered of interest to researchers and educators of various educational levels. One of the emphases in Vol. 2, No. 2, is a…

  1. Automatic Generation of Algorithms for the Statistical Analysis of Planetary Nebulae Images

    Science.gov (United States)

    Fischer, Bernd

    2004-01-01

Analyzing data sets collected in experiments or by observation is a core scientific activity. Typically, experimental and observational data are fraught with uncertainty, and the analysis is based on a statistical model of the conjectured underlying processes. The large data volumes collected by modern instruments make computer support indispensable for this. Consequently, scientists spend significant amounts of their time on the development and refinement of data analysis programs. AutoBayes [GF+02, FS03] is a fully automatic synthesis system for generating statistical data analysis programs. Externally, it looks like a compiler: it takes an abstract problem specification and translates it into executable code. Its input is a concise description of a data analysis problem in the form of a statistical model as shown in Figure 1; its output is optimized and fully documented C/C++ code which can be linked dynamically into the Matlab and Octave environments. Internally, however, it is quite different: AutoBayes derives a customized algorithm implementing the given model using a schema-based process, and then further refines and optimizes the algorithm into code. A schema is a parameterized code template with associated semantic constraints which define and restrict the template's applicability. The schema parameters are instantiated in a problem-specific way during synthesis as AutoBayes checks the constraints against the original model or, recursively, against emerging sub-problems. The AutoBayes schema library contains problem decomposition operators (which are justified by theorems in a formal logic in the domain of Bayesian networks) as well as machine learning algorithms (e.g., EM, k-Means) and numeric optimization methods (e.g., Nelder-Mead simplex, conjugate gradient). AutoBayes augments this schema-based approach with symbolic computation to derive closed-form solutions whenever possible. This is a major advantage over other statistical data analysis systems.

  2. Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders

    CERN Document Server

    Baghai-Ravary, Ladan

    2013-01-01

    Automatic Speech Signal Analysis for Clinical Diagnosis and Assessment of Speech Disorders provides a survey of methods designed to aid clinicians in the diagnosis and monitoring of speech disorders such as dysarthria and dyspraxia, with an emphasis on the signal processing techniques, statistical validity of the results presented in the literature, and the appropriateness of methods that do not require specialized equipment, rigorously controlled recording procedures or highly skilled personnel to interpret results. Such techniques offer the promise of a simple and cost-effective, yet objective, assessment of a range of medical conditions, which would be of great value to clinicians. The ideal scenario would begin with the collection of examples of the clients’ speech, either over the phone or using portable recording devices operated by non-specialist nursing staff. The recordings could then be analyzed initially to aid diagnosis of conditions, and subsequently to monitor the clients’ progress and res...

  3. Automatic quantification of morphological features for hepatic trabeculae analysis in stained liver specimens.

    Science.gov (United States)

    Ishikawa, Masahiro; Murakami, Yuri; Ahi, Sercan Taha; Yamaguchi, Masahiro; Kobayashi, Naoki; Kiyuna, Tomoharu; Yamashita, Yoshiko; Saito, Akira; Abe, Tokiya; Hashiguchi, Akinori; Sakamoto, Michiie

    2016-04-01

    This paper proposes a digital image analysis method to support quantitative pathology by automatically segmenting the hepatocyte structure and quantifying its morphological features. To structurally analyze histopathological hepatic images, we isolate the trabeculae by extracting the sinusoids, fat droplets, and stromata. We then measure the morphological features of the extracted trabeculae, divide the image into cords, and calculate the feature values of the local cords. We propose a method of calculating the nuclear-cytoplasmic ratio, nuclear density, and number of layers using the local cords. Furthermore, we evaluate the effectiveness of the proposed method using surgical specimens. The proposed method was found to be an effective method for the quantification of the Edmondson grade. PMID:27335894

  4. Development of automatic image analysis algorithms for protein localization studies in budding yeast

    Science.gov (United States)

    Logg, Katarina; Kvarnström, Mats; Diez, Alfredo; Bodvard, Kristofer; Käll, Mikael

    2007-02-01

Microscopy of fluorescently labeled proteins has become a standard technique for live cell imaging. However, it is still a challenge to systematically extract quantitative data from large sets of images in an unbiased fashion, which is particularly important in high-throughput or time-lapse studies. Here we describe the development of a software package aimed at automatic quantification of the abundance and spatio-temporal dynamics of fluorescently tagged proteins in vivo in the budding yeast Saccharomyces cerevisiae, one of the most important model organisms in proteomics. The image analysis methodology is based on first identifying cell contours from bright field images, and then using this information to measure and statistically analyse protein abundance in specific cellular domains from the corresponding fluorescence images. The applicability of the procedure is exemplified for two nuclear localized GFP-tagged proteins, Mcm4p and Nrm1p.

  5. SAUNA—a system for automatic sampling, processing, and analysis of radioactive xenon

    Science.gov (United States)

    Ringbom, A.; Larson, T.; Axelsson, A.; Elmgren, K.; Johansson, C.

    2003-08-01

A system for automatic sampling, processing, and analysis of atmospheric radioxenon has been developed. From an air sample of about 7 m3 collected during 12 h, 0.5 cm3 of xenon is extracted, and the atmospheric activities of the four xenon isotopes 133Xe, 135Xe, 131mXe, and 133mXe are determined with a beta-gamma coincidence technique. The collection is performed using activated charcoal and molecular sieves at ambient temperature. The sample preparation and quantification are performed using preparative gas chromatography. The system was tested under routine conditions for a 5-month period, with average minimum detectable concentrations below 1 mBq/m3 for all four isotopes.

  6. Quantitative Study on Nonmetallic Inclusion Particles in Steels by Automatic Image Analysis With Extreme Values Method

    Institute of Scientific and Technical Information of China (English)

    Cássio Barbosa; José Brant de Campos; J(ǒ)neo Lopes do Nascimento; Iêda Maria Vieira Caminha

    2009-01-01

Nonmetallic inclusion particles, which form during the steelmaking process, are harmful to the properties of steels, mainly as a function of the size, volume fraction, shape, and distribution of the particles. Automatic image analysis is one of the most important tools for the quantitative determination of these parameters. The classical Student approach and the Extreme Values Method (EVM) were used to determine inclusion size and shape and to evaluate the distance between the inclusion particles. The results obtained indicated significant differences in the characteristics of the inclusion particles in the analyzed products. The two methods produced somewhat different results, indicating that EVM could be used as a faster and more reliable statistical methodology.
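A common way to apply extreme value statistics to inclusion ratings is to fit a Gumbel distribution to the largest inclusion observed in each control area and extrapolate to a return level. The sketch below uses the method-of-moments estimator; the estimator choice is an assumption (maximum likelihood is also common), and the abstract does not specify which variant was used.

```python
from math import pi, sqrt, log

EULER_GAMMA = 0.5772156649  # Euler-Mascheroni constant

def gumbel_fit_moments(maxima):
    """Method-of-moments fit of a Gumbel distribution to block maxima,
    e.g. the largest inclusion size per inspected field."""
    n = len(maxima)
    mean = sum(maxima) / n
    std = sqrt(sum((x - mean) ** 2 for x in maxima) / (n - 1))
    beta = std * sqrt(6) / pi        # scale parameter
    mu = mean - EULER_GAMMA * beta   # location parameter
    return mu, beta

def gumbel_quantile(mu, beta, p):
    """Size not exceeded with probability p (the return level)."""
    return mu - beta * log(-log(p))
```

The fitted quantile at a high probability (say p = 0.99) estimates the largest inclusion expected over a much greater inspected area than was actually measured, which is the practical payoff of the extreme value approach.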

  7. Evaluating the effectiveness of treatment of corneal ulcers via computer-based automatic image analysis

    Science.gov (United States)

    Otoum, Nesreen A.; Edirisinghe, Eran A.; Dua, Harminder; Faraj, Lana

    2012-06-01

Corneal ulcers are a common eye disease that requires prompt treatment. Recently a number of treatment approaches have been introduced that have proven to be very effective. Unfortunately, the monitoring of the treatment procedure remains manual and hence time consuming and prone to human error. In this research we propose an automatic image analysis based approach to measure the size of an ulcer, with subsequent further investigation to determine the effectiveness of the treatment process followed. In ophthalmology, an ulcer area is detected for further inspection via luminous excitation of a dye. In the imaging systems usually utilised for this purpose (i.e. a slit lamp with an appropriate dye), the ulcer area is excited to appear luminous green as compared to the rest of the cornea, which appears blue/brown. In the proposed approach we analyse the image in the HSV colour space. Initially, a pre-processing stage that carries out local histogram equalisation is used to bring back detail in any over- or under-exposed areas. Secondly, we deal with the removal of potential reflections from the affected areas by making use of image registration of two candidate corneal images based on the detected corneal areas. Thirdly, the exact corneal boundary is detected by initially registering an ellipse to the candidate corneal boundary detected via edge detection and subsequently allowing the user to modify the boundary to overlap with the boundary of the ulcer being observed. Although this step makes the approach semi-automatic, it removes the impact of breakages of the corneal boundary due to occlusion, noise and image quality degradation. The ratio of the ulcer area confined within the cornea to the total corneal area is used as the measure of comparison. We demonstrate the use of the proposed tool in analysing the effectiveness of a treatment procedure adopted for corneal ulcers in patients by comparing the variation of ulcer size over time.

  8. Control-oriented Automatic System for Transport Analysis (ASTRA)-Matlab integration for Tokamaks

    International Nuclear Information System (INIS)

    The exponential growth in energy consumption has led to renewed interest in the development of alternatives to fossil fuels. Among the unconventional resources that may help to meet this energy demand, nuclear fusion has arisen as a promising source, which has given way to unprecedented interest in solving the various control problems existing in nuclear fusion reactors such as Tokamaks. The aim of this manuscript is to show how one of the most popular codes used to simulate the performance of Tokamaks, the Automatic System For Transport Analysis (ASTRA) code, can be integrated into the Matlab-Simulink tool to ease the development of suitable controllers for Tokamaks. As a demonstrative case study showing the feasibility and effectiveness of the proposed ASTRA-Matlab integration, a modified anti-windup Proportional Integral Derivative (PID)-based controller for the loop voltage of a Tokamak has been implemented. The integration achieved represents an original and innovative work in the Tokamak control area, and it provides new possibilities for the development and application of advanced control schemes to the standardized and widely extended ASTRA transport code for Tokamaks. -- Highlights: → The paper presents a useful tool for rapid prototyping of different solutions to the control problems arising in Tokamaks. → The proposed tool embeds the standardized Automatic System For Transport Analysis (ASTRA) code for Tokamaks within the well-known Matlab-Simulink software. → This allows testing and combining diverse control schemes in a unified way, with ASTRA acting as the plant of the system. → A demonstrative Proportional Integral Derivative (PID)-based case study is provided to show the feasibility and capabilities of the proposed integration.
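The anti-windup PID idea mentioned in the record can be sketched independently of ASTRA or Matlab. The following minimal discrete PID uses conditional integration (the integrator is frozen while the actuator is saturated); the gains, limits, and first-order plant are invented for illustration and bear no relation to the paper's actual loop-voltage controller.

```python
class AntiWindupPID:
    """Discrete PID with clamping anti-windup (conditional integration)."""
    def __init__(self, kp, ki, kd, dt, u_min, u_max):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        u_unsat = (self.kp * err
                   + self.ki * (self.integral + err * self.dt)
                   + self.kd * deriv)
        u = min(max(u_unsat, self.u_min), self.u_max)
        # anti-windup: only accumulate the integral when not saturated
        if u == u_unsat:
            self.integral += err * self.dt
        return u

# drive a trivial first-order plant dx/dt = u - x toward setpoint 1.0
pid = AntiWindupPID(kp=2.0, ki=1.0, kd=0.0, dt=0.1, u_min=-1.0, u_max=1.0)
x = 0.0
for _ in range(200):
    u = pid.step(1.0, x)
    x += 0.1 * (u - x)
print(abs(1.0 - x) < 0.05)  # True: the output settles near the setpoint
```

Without the `if u == u_unsat` guard, the integral would keep growing during the initial saturation and cause a large overshoot once the error changes sign.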

  9. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    Science.gov (United States)

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis of 4-chamber (4CH) cine MR imaging, and to investigate the agreement between semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy and repaired tetralogy of Fallot who underwent cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis demonstrated optimal cut-offs of the minimum longitudinal strain value (εL_min) for diagnosing LV and RV dysfunction with high accuracy (LV εL_min = -7.8 %: area under the curve, 0.89; sensitivity, 83 %; specificity, 91 %; RV εL_min = -15.7 %: area under the curve, 0.82; sensitivity, 92 %; specificity, 68 %). Excellent correlations between manual and semi-automatic measurements for the LV and RV free wall were observed (LV, r = 0.97). Semi-automatic longitudinal strain analysis of 4CH cine MR imaging can evaluate LV and RV dysfunction with simple and easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.
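The ROC-based cutoff selection used in this record is commonly done by maximising Youden's J (sensitivity + specificity − 1) over candidate thresholds. A minimal sketch with invented strain values follows; here dysfunction corresponds to less negative (higher) strain, so a case is called positive when the value is at or above the cutoff.

```python
def youden_cutoff(values, labels):
    """Return (cutoff, J) maximising Youden's J = sens + spec - 1.
    labels: 1 = dysfunction (positive), 0 = normal. A case is positive
    when value >= cutoff. Illustrative only; not the authors' code."""
    pos = [v for v, y in zip(values, labels) if y == 1]
    neg = [v for v, y in zip(values, labels) if y == 0]
    best_j, best_cut = -1.0, None
    for cut in sorted(set(values)):
        sens = sum(v >= cut for v in pos) / len(pos)
        spec = sum(v < cut for v in neg) / len(neg)
        j = sens + spec - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j

# toy strain values (%): dysfunction tends to give less negative strain
strain = [-20, -18, -16, -9, -8, -7, -6, -15]
dysfct = [0, 0, 0, 1, 1, 1, 1, 0]
cut, j = youden_cutoff(strain, dysfct)
print(cut, round(j, 2))  # -9 1.0 (perfect separation in this toy data)
```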

  10. Toward automatic regional analysis of pulmonary function using inspiration and expiration thoracic CT

    DEFF Research Database (Denmark)

    Murphy, Keelin; Pluim, Josien P. W.; Rikxoort, Eva M. van;

    2012-01-01

    Purpose: To analyze pulmonary function using a fully automatic technique which processes pairs of thoracic CT scans acquired at breath-hold inspiration and expiration, respectively. The following research objectives are identified to: (a) describe and systematically analyze the processing pipeline...... disorder). Lungs, fissures, airways, lobes, and vessels are automatically segmented in both scans and the expiration scan is registered with the inspiration scan using a fully automatic nonrigid registration algorithm. Segmentations and registrations are examined and scored by expert observers to analyze...

  11. Numerical analysis of resonances induced by s wave neutrons in transmission time-of-flight experiments with a computer IBM 7094 II; Methodes d'analyse des resonances induites par les neutrons s dans les experiences de transmission par temps de vol et automatisation de ces methodes sur ordinateur IBM 7094 II

    Energy Technology Data Exchange (ETDEWEB)

    Corge, Ch. [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1969-01-01

    Numerical analysis of transmission resonances induced by s-wave neutrons in time-of-flight experiments can be achieved in a fairly automatic way on an IBM 7094/II computer. The computations are carried out following a four-step scheme: 1 - experimental raw data are processed to obtain the resonant transmissions; 2 - values of the experimental quantities for each resonance are derived from the above transmissions; 3 - resonance parameters are determined using a least-squares method to solve the overdetermined system obtained by equating theoretical functions to the corresponding experimental values (four analysis methods are gathered in the same code); 4 - graphical control of the results is performed. (author)
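Step 3 above solves an overdetermined system in the least-squares sense. The resonance line shapes are nonlinear, but the core idea can be shown with a linear two-parameter toy: more equations than unknowns, solved through the 2×2 normal equations. The basis functions and data below are invented for illustration.

```python
def lstsq_2param(xs, ys, f1, f2):
    """Fit a*f1(x) + b*f2(x) ~= y by least squares via the normal equations.
    A miniature of the overdetermined-system fit in step 3 (the real code
    fits nonlinear theoretical resonance shapes)."""
    s11 = sum(f1(x) ** 2 for x in xs)
    s12 = sum(f1(x) * f2(x) for x in xs)
    s22 = sum(f2(x) ** 2 for x in xs)
    b1 = sum(f1(x) * y for x, y in zip(xs, ys))
    b2 = sum(f2(x) * y for x, y in zip(xs, ys))
    det = s11 * s22 - s12 * s12
    return (s22 * b1 - s12 * b2) / det, (s11 * b2 - s12 * b1) / det

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]          # y = 2x + 1: four equations, two unknowns
a, b = lstsq_2param(xs, ys, lambda x: x, lambda x: 1.0)
print(round(a, 6), round(b, 6))    # 2.0 1.0
```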

  12. An automatic granular structure generation and finite element analysis of heterogeneous semi-solid materials

    Science.gov (United States)

    Sharifi, Hamid; Larouche, Daniel

    2015-09-01

    The quality of cast metal products depends on the capacity of the semi-solid metal to sustain the stresses generated during casting. Predicting the evolution of these stresses with accuracy over the solidification interval should be highly helpful in avoiding the formation of defects like hot tearing. This task is however very difficult because of the heterogeneous nature of the material. In this paper, we propose to evaluate the mechanical behaviour of a metal during solidification using a mesh-generation technique for the heterogeneous semi-solid material, coupled with a finite element analysis at the microscopic level. This is done on a two-dimensional (2D) domain in which the granular structure of the solid phase is generated surrounded by an intergranular and interdendritic liquid phase. Some basic solid grains are first constructed and projected into the 2D domain with random orientations and scale factors. Depending on their orientation, the basic grains are combined to produce larger grains or separated by a liquid film. Different basic grain shapes can produce different granular structures of the mushy zone. As a result, using this automatic grain-generation procedure, we can investigate the effect of grain shapes and sizes on the thermo-mechanical behaviour of the semi-solid material. The granular models are automatically converted to finite element meshes. The solid grains and the liquid phase are meshed properly using quadrilateral elements. This method has been used to simulate the microstructure of a binary aluminium-copper alloy (Al-5.8 wt% Cu) at a solid fraction of 0.92. Using the finite element method and the Mie-Grüneisen equation of state for the liquid phase, the transient mechanical behaviour of the mushy zone under tensile loading has been investigated. The stress distribution and the bridges formed during tensile loading have been identified.

  13. Writ Large on Your Face: Observing Emotions Using Automatic Facial Analysis

    Directory of Open Access Journals (Sweden)

    Dieckmann Anja

    2014-05-01

    Full Text Available Emotions affect all of our daily decisions and, of course, they also influence our evaluations of brands, products and advertisements. But what exactly do consumers feel when they watch a TV commercial, visit a website or when they interact with a brand in different ways? Measuring such emotions is not an easy task. In the past, the effectiveness of marketing material was evaluated mostly by subsequent surveys. Now, with the emergence of neuroscientific approaches like EEG, the measurement of real-time reactions is possible, for instance, when watching a commercial. However, most neuroscientific procedures are fairly invasive and irritating. For an EEG, for instance, numerous electrodes need to be placed on the participant's scalp. Furthermore, data analysis is highly complex. Scientific expertise is necessary for interpretation, so the procedure remains a black box to most practitioners and the results are still rather controversial. By contrast, automatic facial analysis provides similar information without having to wire study participants. In addition, the results of such analyses are intuitive and easy to interpret even for laypeople. These convincing advantages led GfK Company to decide on facial analysis and to develop a tool suitable for measuring emotional responses to marketing stimuli, making it easily applicable in marketing research practice.

  14. The proportionator: unbiased stereological estimation using biased automatic image analysis and non-uniform probability proportional to size sampling

    DEFF Research Database (Denmark)

    Gardi, Jonathan Eyal; Nyengaard, Jens Randel; Gundersen, Hans Jørgen Gottlieb

    2008-01-01

    The proportionator is a novel and radically different approach to sampling with microscopes based on well-known statistical theory (probability proportional to size - PPS sampling). It uses automatic image analysis, with a large range of options, to assign to every field of view in the section...... of its entirely different sampling strategy, based on known but non-uniform sampling probabilities, the proportionator for the first time allows the real CE at the section level to be automatically estimated (not just predicted), unbiased - for all estimators and at no extra cost to the user....
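The sampling principle behind the proportionator (PPS: each field of view is drawn with probability proportional to an automatically computed "size" score, and each observation is weighted by 1/p to stay unbiased) can be sketched in a few lines. This is a toy illustration of the statistical idea, not the proportionator software; the size scores and particle counts are invented.

```python
import bisect
import random

def pps_sample(sizes, n, rng):
    """Draw n indices with replacement, probability proportional to size."""
    total = sum(sizes)
    probs = [s / total for s in sizes]
    cum, acc = [], 0.0
    for p in probs:
        acc += p
        cum.append(acc)
    picks = [min(bisect.bisect_left(cum, rng.random()), len(cum) - 1)
             for _ in range(n)]
    return picks, probs

# true particle counts per field, and a biased image-analysis size proxy
counts = [0, 1, 3, 6, 10]                  # true total = 20
proxy = [1, 2, 4, 7, 11]                   # correlated with counts, not equal
rng = random.Random(0)
picks, probs = pps_sample(proxy, 4000, rng)
# Hansen-Hurwitz estimator: weighting by 1/p removes the proxy's bias
est = sum(counts[i] / probs[i] for i in picks) / len(picks)
print(round(est, 2))                       # close to the true total of 20
```

Note that the proxy only has to be *correlated* with the quantity of interest: a poor proxy inflates the variance, never the bias.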

  15. The IDA-80 measurement evaluation programme on mass spectrometric isotope dilution analysis of uranium and plutonium. Vol. 1

    International Nuclear Information System (INIS)

    The main objective was the acquisition of basic data on the uncertainties involved in the mass spectrometric isotope dilution analysis as applied to the determination of uranium and plutonium in active feed solutions of reprocessing plants. The element concentrations and isotopic compositions of all test materials used were determined by CBNM and NBS with high accuracy. The more than 60000 analytical data reported by the participating laboratories were evaluated by statistical methods applied mainly to the calculation of estimates of the variances for the different uncertainty components contributing to the total uncertainty of this analytical technique. Attention was given to such topics as sample ageing, influence of fission products, spike calibration, ion fractionation, Pu-241 decay correction, minor isotope measurement and errors in data transfer. Furthermore, the performance of the 'dried sample' technique and the 'in-situ' spiking method of undiluted samples of reprocessing fuel solution with U-235/Pu-242 metal alloy spikes, were tested successfully. Considerable improvement of isotope dilution analysis in this safeguards relevant application during the last decade is shown as compared to the results obtained in the IDA-72 interlaboratory experiment, organized by KfK in 1972 on the same subject. (orig./HP)

  16. Automatic Behavior Analysis During a Clinical Interview with a Virtual Human.

    Science.gov (United States)

    Rizzo, Albert; Lucas, Gale; Gratch, Jonathan; Stratou, Giota; Morency, Louis-Philippe; Chavez, Kenneth; Shilling, Russ; Scherer, Stefan

    2016-01-01

    SimSensei is a Virtual Human (VH) interviewing platform that uses off-the-shelf sensors (i.e., webcams, Microsoft Kinect and a microphone) to capture and interpret real-time audiovisual behavioral signals from users interacting with the VH system. The system was specifically designed for clinical interviewing and health care support by providing a face-to-face interaction between a user and a VH that can automatically react to the inferred state of the user through analysis of behavioral signals gleaned from the user's facial expressions, body gestures and vocal parameters. Akin to how non-verbal behavioral signals have an impact on human-to-human interaction and communication, SimSensei aims to capture and infer user state from signals generated from user non-verbal communication to improve engagement between a VH and a user and to quantify user state from the data captured across a 20 minute interview. Results from a sample of service members (SMs) who were interviewed before and after a deployment to Afghanistan indicate that SMs reveal more PTSD symptoms to the VH than they report on the Post Deployment Health Assessment. Pre/post-deployment facial expression analysis indicated more sad expressions and fewer happy expressions at post-deployment. PMID:27046598

  17. Automatic Analysis of Facial Affect: A Survey of Registration, Representation, and Recognition.

    Science.gov (United States)

    Sariyanidi, Evangelos; Gunes, Hatice; Cavallaro, Andrea

    2015-06-01

    Automatic affect analysis has attracted great interest in various contexts including the recognition of action units and basic or non-basic emotions. In spite of major efforts, there are several open questions on what the important cues to interpret facial expressions are and how to encode them. In this paper, we review the progress across a range of affect recognition applications to shed light on these fundamental questions. We analyse the state-of-the-art solutions by decomposing their pipelines into fundamental components, namely face registration, representation, dimensionality reduction and recognition. We discuss the role of these components and highlight the models and new trends that are followed in their design. Moreover, we provide a comprehensive analysis of facial representations by uncovering their advantages and limitations; we elaborate on the type of information they encode and discuss how they deal with the key challenges of illumination variations, registration errors, head-pose variations, occlusions, and identity bias. This survey allows us to identify open issues and to define future directions for designing real-world affect recognition systems. PMID:26357337

  18. Proceedings of the Seventh Conference of Nuclear Sciences and Applications. Vol.1,2,3

    International Nuclear Information System (INIS)

    The publication has been set up as a textbook for nuclear sciences and applications. Vol. 1: (1) radiochemistry; (2) radiation chemistry; (3) isotope production; (4) waste management. Vol. 2: (1) nuclear and reactor physics; (2) plasma physics; (3) instrumentation and devices; (4) trace and ultra-trace analysis; (5) environmental. Vol. 3: (1) radiation protection; (2) radiation health hazards; (3) nuclear safety; (4) biology; (5) agriculture

  19. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML.

    Science.gov (United States)

    Cusack, Rhodri; Vicente-Grabovetsky, Alejandro; Mitchell, Daniel J; Wild, Conor J; Auer, Tibor; Linke, Annika C; Peelle, Jonathan E

    2014-01-01

    Recent years have seen neuroimaging data sets becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complicated to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast, and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.
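The dependency-tracking idea at the heart of such a processing engine, working out what must be (re)run when an upstream module's inputs change, and in what order, can be sketched with a small graph routine. This is not aa's actual code; the module names are invented and an acyclic dependency graph is assumed.

```python
def stale_modules(deps, done, changed):
    """Given module -> prerequisites, the set of completed modules, and the
    set of modules whose inputs changed, return what must be (re)run,
    in a valid execution order."""
    # everything downstream of a changed module is stale
    stale = set(changed)
    grew = True
    while grew:
        grew = False
        for mod, reqs in deps.items():
            if mod not in stale and stale & set(reqs):
                stale.add(mod)
                grew = True
    todo = {m for m in deps if m not in done or m in stale}
    # Kahn-style topological ordering restricted to the todo set
    order, placed = [], set()
    while len(order) < len(todo):
        for mod in sorted(todo - placed):
            if all(r in placed or r not in todo for r in deps[mod]):
                order.append(mod)
                placed.add(mod)
    return order

deps = {"realign": [], "coregister": ["realign"], "smooth": ["coregister"],
        "firstlevel": ["smooth"], "grouplevel": ["firstlevel"]}
done = {"realign", "coregister", "smooth", "firstlevel"}
print(stale_modules(deps, done, {"coregister"}))
# ['coregister', 'smooth', 'firstlevel', 'grouplevel']; 'realign' is untouched
```

Independent branches of the `todo` set (none in this linear toy) are exactly the tasks a real engine could dispatch in parallel to a cluster.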

  20. Automatic analysis (aa): efficient neuroimaging workflows and parallel processing using Matlab and XML

    Directory of Open Access Journals (Sweden)

    Rhodri eCusack

    2015-01-01

    Full Text Available Recent years have seen neuroimaging data becoming richer, with larger cohorts of participants, a greater variety of acquisition techniques, and increasingly complex analyses. These advances have made data analysis pipelines complex to set up and run (increasing the risk of human error) and time consuming to execute (restricting what analyses are attempted). Here we present an open-source framework, automatic analysis (aa), to address these concerns. Human efficiency is increased by making code modular and reusable, and managing its execution with a processing engine that tracks what has been completed and what needs to be (re)done. Analysis is accelerated by optional parallel processing of independent tasks on cluster or cloud computing resources. A pipeline comprises a series of modules that each perform a specific task. The processing engine keeps track of the data, calculating a map of upstream and downstream dependencies for each module. Existing modules are available for many analysis tasks, such as SPM-based fMRI preprocessing, individual and group level statistics, voxel-based morphometry, tractography, and multi-voxel pattern analyses (MVPA). However, aa also allows for full customization, and encourages efficient management of code: new modules may be written with only a small code overhead. aa has been used by more than 50 researchers in hundreds of neuroimaging studies comprising thousands of subjects. It has been found to be robust, fast and efficient, for simple single-subject studies up to multimodal pipelines on hundreds of subjects. It is attractive to both novice and experienced users. aa can reduce the amount of time neuroimaging laboratories spend performing analyses and reduce errors, expanding the range of scientific questions it is practical to address.

  1. Petroleum product refining: plant level analysis of costs and competitiveness. Implications of greenhouse gas emission reductions. Vol 1

    International Nuclear Information System (INIS)

    Implications for the Canadian refining industry of reducing greenhouse gas (GHG) emissions to meet Canada's Kyoto commitment are assessed, based on a plant-level analysis of costs, benefits and economic and competitive impacts. It was determined, on the basis of demand estimates prepared by Natural Resources Canada, that refining industry carbon dioxide emissions could be as much as 38 per cent higher than 1990 levels in 2010. A six per cent reduction below 1990 levels from this business-as-usual case is considered a very difficult target, unless refinery shutdowns occur. This would require higher imports to meet Canada's petroleum products demand, leaving total carbon dioxide emissions virtually unchanged. A range of options, classified as (1) low capital, operating efficiency projects, (2) medium capital, process/utility optimization projects, (3) high capital, refinery specific projects, and (4) high operating cost GHG projects, were evaluated. Of these four alternatives, the low capital or operating efficiency projects were the only ones judged to have the potential to be economically viable. Energy efficiency projects in these four groups were evaluated under several policy initiatives, including accelerated depreciation and a $200 per tonne carbon tax. Results showed that an accelerated depreciation policy would lower the hurdle rate for refinery investments, and could achieve a four per cent reduction in GHG emissions below 1990 levels, assuming no further shutdown of refinery capacity. The carbon tax was judged to be potentially damaging to the Canadian refinery industry since it would penalize cracking refineries (most Canadian refineries are of this type); it would introduce further uncertainty and risk, such that industry might not be able to justify investments to reduce emissions. The overall assessment is that the Canadian refinery industry could not meet the pro-rata Kyoto GHG reduction target through implementation of economically

  2. An empirical analysis of the methodology of automatic imitation research in a strategic context.

    Science.gov (United States)

    Aczel, Balazs; Kekecs, Zoltan; Bago, Bence; Szollosi, Aba; Foldes, Andrei

    2015-08-01

    Since the discovery of the mirror neuron system, it has been proposed that the automatic tendency to copy observed actions exists in humans and that this mechanism might be responsible for a range of social behavior. A strong argument for automatic behavior can be made when actions are executed against motivation to do otherwise. Strategic games in which imitation is disadvantageous serve as ideal designs for studying the automatic nature of participants' behavior. Most recently, Belot, Crawford, and Heyes (2013) conducted an explorative study using a modified version of the Rock-Paper-Scissors game, and suggested that in the case of asynchrony in the execution of the gestures, automatic imitation can be observed early on after the opponent's presentation. In our study, we video recorded the games, which allowed us to examine the effect of delay on imitative behavior as well as the sensitivity of the previously employed analyses. The examination of the recorded images revealed that more than 80% of the data were irrelevant to the study of automatic behavior. Additional bias in the paradigm became apparent, as previously presented gestures were found to affect the behavior of the players. After noise filtering, we found no evidence of automatic imitation in either the whole filtered data set or in selected time windows based on delay length. Besides questioning the strength of the results of previous analyses, we propose several experimental and statistical modifications for further research on automatic imitation. PMID:26010594

  3. An empirical analysis of the methodology of automatic imitation research in a strategic context.

    Science.gov (United States)

    Aczel, Balazs; Kekecs, Zoltan; Bago, Bence; Szollosi, Aba; Foldes, Andrei

    2015-08-01

    Since the discovery of the mirror neuron system, it has been proposed that the automatic tendency to copy observed actions exists in humans and that this mechanism might be responsible for a range of social behavior. A strong argument for automatic behavior can be made when actions are executed against motivation to do otherwise. Strategic games in which imitation is disadvantageous serve as ideal designs for studying the automatic nature of participants' behavior. Most recently, Belot, Crawford, and Heyes (2013) conducted an explorative study using a modified version of the Rock-Paper-Scissors game, and suggested that in the case of asynchrony in the execution of the gestures, automatic imitation can be observed early on after the opponent's presentation. In our study, we video recorded the games, which allowed us to examine the effect of delay on imitative behavior as well as the sensitivity of the previously employed analyses. The examination of the recorded images revealed that more than 80% of the data were irrelevant to the study of automatic behavior. Additional bias in the paradigm became apparent, as previously presented gestures were found to affect the behavior of the players. After noise filtering, we found no evidence of automatic imitation in either the whole filtered data set or in selected time windows based on delay length. Besides questioning the strength of the results of previous analyses, we propose several experimental and statistical modifications for further research on automatic imitation.

  4. Estimating the quality of pasturage in the municipality of Paragominas (PA) by means of automatic analysis of LANDSAT data

    Science.gov (United States)

    Dejesusparada, N. (Principal Investigator); Dossantos, A. P.; Novo, E. M. L. D.; Duarte, V.

    1981-01-01

    The use of LANDSAT data to evaluate pasture quality in the Amazon region is demonstrated. Pasture degradation in deforested areas of a traditional tropical forest cattle-raising region was estimated. Automatic analysis using interactive multispectral analysis (IMAGE-100) shows that 24% of the deforested areas were occupied by natural vegetation regrowth, 24% by exposed soil, 15% by degraded pastures, and 46% was suitable grazing land.

  5. The development of an automatic sample-changer and control instrumentation for isotope-source neutron-activation analysis

    International Nuclear Information System (INIS)

    An automatic sample-changer was developed at the Council for Mineral Technology for use in isotope-source neutron-activation analysis. Tests show that the sample-changer can transfer a sample of up to 3 kg in mass over a distance of 3 m within 5 s. In addition, instrumentation in the form of a three-stage sequential timer was developed to control the sequence of irradiation transfer and analysis

  6. Automatic differentiation bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Corliss, G.F. (comp.)

    1992-07-01

    This is a bibliography of work related to automatic differentiation. Automatic differentiation is a technique for the fast, accurate propagation of derivative values using the chain rule. It is neither symbolic nor numeric. Automatic differentiation is a fundamental tool for scientific computation, with applications in optimization, nonlinear equations, nonlinear least-squares approximation, stiff ordinary differential equations, partial differential equations, continuation methods, and sensitivity analysis. This report is an updated version of the bibliography which originally appeared in Automatic Differentiation of Algorithms: Theory, Implementation, and Application.
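The "neither symbolic nor numeric" point is easiest to see in forward-mode automatic differentiation with dual numbers: each value carries its derivative through the chain rule exactly, with no expression manipulation and no finite-difference truncation error. A minimal sketch (supporting only `+` and `*`):

```python
class Dual:
    """A value paired with its derivative; arithmetic applies the chain rule."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)  # product rule
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

x = Dual(4.0, 1.0)                 # seed derivative dx/dx = 1
y = f(x)
print(y.val, y.der)                # 57.0 26.0, i.e. f(4) and f'(4), exactly
```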

  7. A unified framework for automatic wound segmentation and analysis with deep convolutional neural networks.

    Science.gov (United States)

    Wang, Changhan; Yan, Xinchen; Smith, Max; Kochhar, Kanika; Rubin, Marcie; Warren, Stephen M; Wrobel, James; Lee, Honglak

    2015-08-01

    Wound surface area changes over multiple weeks are highly predictive of the wound healing process. Furthermore, the quality and quantity of the tissue in the wound bed also offer important prognostic information. Unfortunately, accurate measurements of wound surface area changes are out of reach in the busy wound practice setting. Currently, clinicians estimate wound size by estimating wound width and length using a scalpel after wound treatment, which is highly inaccurate. To address this problem, we propose an integrated system to automatically segment wound regions and analyze wound conditions in wound images. Different from previous segmentation techniques which rely on handcrafted features or unsupervised approaches, our proposed deep learning method jointly learns task-relevant visual features and performs wound segmentation. Moreover, learned features are applied to further analysis of wounds in two ways: infection detection and healing progress prediction. To the best of our knowledge, this is the first attempt to automate long-term predictions of general wound healing progress. Our method is computationally efficient and takes less than 5 seconds per wound image (480 by 640 pixels) on a typical laptop computer. Our evaluations on a large-scale wound database demonstrate the effectiveness and reliability of the proposed system. PMID:26736781

  8. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications

    Directory of Open Access Journals (Sweden)

    Francesco Nex

    2009-05-01

    Full Text Available In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for badly textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performance of the SIFT operator has been compared with that of the feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed in order to improve the performance of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A2 SIFT) has been validated on several aerial images, with particular attention to large-scale aerial images acquired using mini-UAV systems.

  9. Automatic identification of mobile and rigid substructures in molecular dynamics simulations and fractional structural fluctuation analysis.

    Directory of Open Access Journals (Sweden)

    Leandro Martínez

    Full Text Available The analysis of structural mobility in molecular dynamics plays a key role in data interpretation, particularly in the simulation of biomolecules. The most common mobility measures computed from simulations are the Root Mean Square Deviation (RMSD) and Root Mean Square Fluctuations (RMSF) of the structures. These are computed after the alignment of atomic coordinates in each trajectory step to a reference structure. This rigid-body alignment is not robust, in the sense that if a small portion of the structure is highly mobile, the RMSD and RMSF increase for all atoms, possibly resulting in poor quantification of the structural fluctuations and, often, in overlooking important fluctuations associated with biological function. The motivation of this work is to provide a robust measure of structural mobility that is practical and easy to interpret. We propose a Low-Order-Value-Optimization (LOVO) strategy for the robust alignment of the least mobile substructures in a simulation. These substructures are automatically identified by the method. The algorithm consists of the iterative superposition of the fraction of the structure displaying the smallest displacements. Therefore, the least mobile substructures are identified, providing a clearer picture of the overall structural fluctuations. Examples are given to illustrate the interpretative advantages of this strategy. The software for performing the alignments was named MDLovoFit and it is available as free software at: http://leandro.iqm.unicamp.br/mdlovofit.
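The iterative "superpose on the least-mobile fraction, then re-rank by displacement" loop described in this record can be shown in one dimension, where the optimal superposition reduces to a translation. This is a toy of the LOVO idea, not MDLovoFit itself (real alignments are 3-D and involve rotations); the coordinates are invented.

```python
def align_least_mobile(ref, mov, fraction=0.8, iters=10):
    """Alternate between (1) superposing using only the currently
    least-mobile fraction of atoms and (2) re-ranking atoms by displacement.
    Returns the final shift and the indices of the least-mobile subset."""
    n_keep = max(1, int(fraction * len(ref)))
    keep = list(range(len(ref)))[-n_keep:]     # arbitrary starting subset
    shift = 0.0
    for _ in range(iters):
        # optimal 1-D translation for the kept subset: mean difference
        shift = sum(ref[i] - mov[i] for i in keep) / len(keep)
        disp = [abs(mov[i] + shift - ref[i]) for i in range(len(ref))]
        keep = sorted(range(len(ref)), key=disp.__getitem__)[:n_keep]
    return shift, sorted(keep)

# atoms 0-3 form a rigid core displaced by +2; atom 4 is a mobile loop
ref = [0.0, 1.0, 2.0, 3.0, 10.0]
mov = [-2.0, -1.0, 0.0, 1.0, 12.0]
shift, core = align_least_mobile(ref, mov)
print(shift, core)   # 2.0 [0, 1, 2, 3]: the rigid core is recovered
```

A plain rigid-body fit over all five atoms would split the loop's motion across every atom; restricting the fit to the least-mobile fraction isolates it in atom 4, which is the interpretability gain the record describes.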

  10. Multi-level Bayesian safety analysis with unprocessed Automatic Vehicle Identification data for an urban expressway.

    Science.gov (United States)

    Shi, Qi; Abdel-Aty, Mohamed; Yu, Rongjie

    2016-03-01

    In traffic safety studies, crash frequency modeling of total crashes is the cornerstone before proceeding to more detailed safety evaluation. The relationship between crash occurrence and factors such as traffic flow and roadway geometric characteristics has been extensively explored for a better understanding of crash mechanisms. In this study, a multi-level Bayesian framework has been developed in an effort to identify the crash contributing factors on an urban expressway in the Central Florida area. Two types of traffic data from the Automatic Vehicle Identification system, the processed data capped at the speed limit and the unprocessed data retaining the original speeds, were incorporated in the analysis along with road geometric information. The model framework was proposed to account for the hierarchical data structure and the heterogeneity among the traffic and roadway geometric data. Multi-level and random parameters models were constructed and compared with the Negative Binomial model under the Bayesian inference framework. Results showed that the unprocessed traffic data was superior. Both multi-level models and random parameters models outperformed the Negative Binomial model and the models with random parameters achieved the best model fitting. The contributing factors identified imply that on the urban expressway lower speed and higher speed variation could significantly increase the crash likelihood. Other geometric factors were significant including auxiliary lanes and horizontal curvature. PMID:26722989
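    The case for a Negative Binomial rather than a Poisson crash-frequency model rests on overdispersion: crash counts vary more than their mean. As a hedged illustration (not the authors' model), the sketch below fits the standard NB parameterization Var = m + m^2/k by the method of moments to simulated, invented count data:

    ```python
    import numpy as np

    def nb_moment_fit(counts):
        """Method-of-moments fit of a Negative Binomial with mean m and
        dispersion k, where Var = m + m**2 / k.  k -> inf means Poisson-like."""
        m = counts.mean()
        v = counts.var(ddof=1)
        if v <= m:
            return m, float("inf")   # no overdispersion detected
        return m, m * m / (v - m)

    # Simulated overdispersed crash counts: a Gamma-mixed Poisson is exactly NB.
    rng = np.random.default_rng(42)
    k_true, m_true = 2.0, 5.0
    lam = rng.gamma(shape=k_true, scale=m_true / k_true, size=20000)
    counts = rng.poisson(lam)

    m_hat, k_hat = nb_moment_fit(counts)
    ```

    A finite, small k-hat signals overdispersion that a Poisson model would misrepresent, which is the usual motivation for the NB baseline the paper compares against.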

  12. Automatic Sleep Stage Scoring Using Time-Frequency Analysis and Stacked Sparse Autoencoders.

    Science.gov (United States)

    Tsinalis, Orestis; Matthews, Paul M; Guo, Yike

    2016-05-01

    We developed a machine learning methodology for automatic sleep stage scoring. Our time-frequency analysis-based feature extraction is fine-tuned to capture sleep stage-specific signal features as described in the American Academy of Sleep Medicine manual that the human experts follow. We used ensemble learning with an ensemble of stacked sparse autoencoders for classifying the sleep stages. We used class-balanced random sampling across sleep stages for each model in the ensemble to avoid skewed performance in favor of the most represented sleep stages, and addressed the problem of misclassification errors due to class imbalance while significantly improving worst-stage classification. We used an openly available dataset from 20 healthy young adults for evaluation. We used a single channel of EEG from this dataset, which makes our method a suitable candidate for longitudinal monitoring using wearable EEG in real-world settings. Our method has both high overall accuracy (78%, range 75-80%) and high mean [Formula: see text]-score (84%, range 82-86%) and mean accuracy across individual sleep stages (86%, range 84-88%) over all subjects. The performance of our method appears to be uncorrelated with the sleep efficiency and percentage of transitional epochs in each recording. PMID:26464268
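    Class-balanced random sampling, as used above to keep rare sleep stages from being swamped by the majority stages, can be sketched generically. This is not the authors' code; the per-class sample size is an assumed parameter:

    ```python
    import numpy as np

    def class_balanced_sample(labels, n_per_class, rng):
        """Draw an equal number of training indices from each class.
        Minority classes smaller than n_per_class are oversampled
        with replacement."""
        idx = []
        for c in np.unique(labels):
            pool = np.flatnonzero(labels == c)
            idx.append(rng.choice(pool, size=n_per_class,
                                  replace=len(pool) < n_per_class))
        return np.concatenate(idx)
    ```

    Training each ensemble member on such a draw gives every class equal weight in the loss, at the cost of repeating minority-class examples.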

  13. Analysis of an Automatic Accessibility Evaluator to Validate a Virtual and Authenticated Environment

    Directory of Open Access Journals (Sweden)

    Elisa Maria Pivetta

    2013-05-01

    Full Text Available This article’s objective is to analyze an automatic validation tool compatible with the Web Content Accessibility Guidelines (WCAG) 2.0 in an authenticated environment. The authenticated environment of Moodle, an open-source platform created for educational environments, was used as the test platform. Initially, accessibility and the operation of these guidelines are briefly described, and then the software to be tested, WAVE, was chosen. In the next step, the tool’s operation was evaluated and the study’s analysis was carried out, allowing the testable errors reported by WAVE to be compared with the WCAG 2.0 guidelines. The research concluded that the WAVE tool performed well, even though it does not cover several WCAG 2.0 guidelines and does not classify its results within the accessibility principles of the Web Accessibility Initiative (WAI). It also proved more suitable for developers than for ordinary users, who have no knowledge of Web programming languages.

  14. Automatic aerial image shadow detection through the hybrid analysis of RGB and HIS color space

    Science.gov (United States)

    Wu, Jun; Li, Huilin; Peng, Zhiyong

    2015-12-01

    This paper presents our research on automatic shadow detection from high-resolution aerial images through the hybrid analysis of the RGB and HIS color spaces. To this end, the spectral characteristics of shadow are first discussed, and three kinds of spectral components, the difference between the normalized blue and normalized red components (B-R), intensity, and saturation, are selected as criteria to obtain an initial segmentation of the shadow region (called the primary segmentation). After that, within the normalized RGB color space and the HIS color space, the shadow region is extracted again (called the auxiliary segmentation) using the OTSU operation. Finally, the primary and auxiliary segmentations are combined through a logical AND operation to obtain a reliable shadow region. In this step, small shadow areas are removed from the combined shadow region and morphological algorithms are applied to fill small holes as well. The experimental results show that the proposed approach can effectively detect the shadow region from high-resolution aerial images with a high degree of automation.
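    As a hedged illustration of the pipeline above, the sketch below thresholds an intensity channel and a normalized blue-minus-red channel with a from-scratch Otsu implementation and combines the two masks with a logical AND. It is a simplification of the paper's method (no saturation criterion and no morphological post-processing), and all parameter choices are assumptions:

    ```python
    import numpy as np

    def otsu_threshold(values, bins=256):
        """Otsu's method: the threshold maximizing between-class variance."""
        hist, edges = np.histogram(values, bins=bins)
        p = hist / hist.sum()
        centers = (edges[:-1] + edges[1:]) / 2
        w0 = np.cumsum(p)                # class-0 (below threshold) weight
        w1 = 1.0 - w0                    # class-1 weight
        mu = np.cumsum(p * centers)      # cumulative first moment
        mu_t = mu[-1]
        with np.errstate(divide="ignore", invalid="ignore"):
            sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * w1)
        sigma_b[~np.isfinite(sigma_b)] = 0.0
        return centers[np.argmax(sigma_b)]

    def shadow_mask(rgb):
        """Hybrid sketch: shadow = low intensity AND high normalized B-R."""
        rgb = rgb.astype(float)
        s = rgb.sum(axis=-1) + 1e-9
        br = rgb[..., 2] / s - rgb[..., 0] / s   # normalized blue minus red
        intensity = rgb.mean(axis=-1)
        dark = intensity < otsu_threshold(intensity.ravel())
        bluish = br > otsu_threshold(br.ravel())
        return dark & bluish
    ```

    The AND combination is what makes the result conservative: a pixel must look shadow-like in both criteria before it enters the final mask.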

  15. Preoperative automatic visual behavioural analysis as a tool for intraocular lens choice in cataract surgery

    Directory of Open Access Journals (Sweden)

    Heloisa Neumann Nogueira

    2015-04-01

    Full Text Available Purpose: Cataract is the main cause of blindness, affecting 18 million people worldwide, with the highest incidence in the population above 50 years of age. Low visual acuity caused by cataract may have a negative impact on patient quality of life. The current treatment is surgery to replace the natural lens with an artificial intraocular lens (IOL), which can be mono- or multifocal. However, due to potential side effects, IOLs must be carefully chosen to ensure higher patient satisfaction. Thus, studies on the visual behaviour of these patients may be an important tool to determine the best type of IOL implantation. This study proposed an anamnestic add-on for optimizing the choice of IOL. Methods: We used a camera that automatically takes pictures, documenting the patient’s visual routine in order to obtain additional information about the frequency of distant, intermediate, and near sights. Results: The results yielded estimated frequency percentages for each viewing distance, suggesting that visual analysis of routine photographic records of a patient with cataract may be useful for understanding gaze behaviour and for choosing a visual management strategy after cataract surgery, while also stimulating interest in customized IOL manufacturing according to individual needs.

  16. Performance Analysis of the SIFT Operator for Automatic Feature Extraction and Matching in Photogrammetric Applications.

    Science.gov (United States)

    Lingua, Andrea; Marenchino, Davide; Nex, Francesco

    2009-01-01

    In the photogrammetry field, interest in region detectors, which are widely used in Computer Vision, is quickly increasing due to the availability of new techniques. Images acquired by Mobile Mapping Technology, Oblique Photogrammetric Cameras or Unmanned Aerial Vehicles do not observe normal acquisition conditions. Feature extraction and matching techniques, which are traditionally used in photogrammetry, are usually inefficient for these applications as they are unable to provide reliable results under extreme geometrical conditions (convergent taking geometry, strong affine transformations, etc.) and for bad-textured images. A performance analysis of the SIFT technique in aerial and close-range photogrammetric applications is presented in this paper. The goal is to establish the suitability of the SIFT technique for automatic tie point extraction and approximate DSM (Digital Surface Model) generation. First, the performances of the SIFT operator have been compared with those provided by feature extraction and matching techniques used in photogrammetry. All these techniques have been implemented by the authors and validated on aerial and terrestrial images. Moreover, an auto-adaptive version of the SIFT operator has been developed, in order to improve the performances of the SIFT detector in relation to the texture of the images. The Auto-Adaptive SIFT operator (A(2) SIFT) has been validated on several aerial images, with particular attention to large scale aerial images acquired using mini-UAV systems. PMID:22412336

  17. Analysis of Simpson's automatic transmission gears

    Institute of Scientific and Technical Information of China (English)

    黄代仲

    2012-01-01

      The traditional method for analyzing the drive mechanism and gear positions of an automatic transmission is introduced, using the Toyota Lexus A341E four-speed electronically controlled automatic transmission as an example. The main contents include the composition of the automatic transmission's drive mechanism, the process analysis, and the role of the traditional analysis method.

  18. Neuronal Spectral Analysis of EEG and Expert Knowledge Integration for Automatic Classification of Sleep Stages

    OpenAIRE

    Kerkeni, Nizar; Alexandre, Frédéric; Bedoui, Mohamed Hédi; Bougrain, Laurent; Dogui, Mohamed

    2005-01-01

    http://www.wseas.org Being able to analyze and interpret the signal coming from electroencephalogram (EEG) recordings can be of high interest for many applications, including medical diagnosis and Brain-Computer Interfaces. Indeed, human experts are today able to extract from this signal many hints related to physiological as well as cognitive states of the recorded subject, and it would be very interesting to perform such a task automatically, but today no completely automatic system exists. In pre...

  19. Social Risk and Depression: Evidence from Manual and Automatic Facial Expression Analysis

    OpenAIRE

    Girard, Jeffrey M.; Cohn, Jeffrey F.; Mahoor, Mohammad H.; Mavadati, Seyedmohammad; Rosenwald, Dean P.

    2013-01-01

    Investigated the relationship between change over time in severity of depression symptoms and facial expression. Depressed participants were followed over the course of treatment and video recorded during a series of clinical interviews. Facial expressions were analyzed from the video using both manual and automatic systems. Automatic and manual coding were highly consistent for FACS action units, and showed similar effects for change over time in depression severity. For both systems, when s...

  20. Automatic sequences

    CERN Document Server

    Haeseler, Friedrich

    2003-01-01

    Automatic sequences are sequences which are produced by a finite automaton. Although they are not random, they may look random. They are complicated, in the sense of not being ultimately periodic, and it may not be easy to name the rule by which a sequence is generated; however, such a rule always exists. The concept of automatic sequences has applications in algebra, number theory, finite automata and formal languages, and combinatorics on words. The text deals with different aspects of automatic sequences, in particular: a general introduction to automatic sequences; their basic (combinatorial) properties; the algebraic approach to automatic sequences; and geometric objects related to automatic sequences.
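    The defining property, a sequence whose n-th term is computed by feeding the digits of n to a finite automaton, is easy to demonstrate. The sketch below generates the Thue-Morse sequence, the classic example of a 2-automatic sequence, with a two-state automaton that toggles on each 1-digit (a standard example, not specific to this book):

    ```python
    # Thue-Morse as a 2-automatic sequence: run the binary digits of n
    # through a two-state automaton and read off the final state.
    def thue_morse(n):
        state = 0
        for bit in bin(n)[2:]:            # most significant digit first
            if bit == "1":
                state ^= 1                # the 1-digit transition swaps states
        return state

    seq = [thue_morse(n) for n in range(8)]
    ```

    The final state equals the parity of the number of 1s in the binary expansion of n, which is exactly the usual definition of the Thue-Morse sequence.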

  1. Automatization of the neutron activation analysis method in the nuclear analysis laboratory

    International Nuclear Information System (INIS)

    In the present paper, the work done to automate the Neutron Activation Analysis technique with a neutron generator is described. An interface between an IBM-compatible microcomputer and the equipment used to make this kind of measurement was developed, including the specialized software for this system.

  2. Automatic detection of local arterial input functions through Independent Component Analysis on Dynamic Contrast enhanced Magnetic Resonance Imaging.

    Science.gov (United States)

    Narvaez, Mario; Ruiz-Espana, Silvia; Arana, Estanislao; Moratal, David

    2015-08-01

    The Arterial Input Function (AIF) is obtained from perfusion studies as a basic parameter for the calculation of hemodynamic variables used as surrogate markers of the vascular status of tissues. However, at present, its identification is made manually, leading to high subjectivity, low repeatability and considerable time consumption. We propose an alternative method to automatically identify local AIFs in perfusion images using Independent Component Analysis. PMID:26737244

  3. Automatic detection of outlines. Application to the quantitative analysis of renal scintiscanning pictures

    International Nuclear Information System (INIS)

    The purpose of the work described is the development of a method for automatically extracting the significant outlines from a renal scintiscanning picture. The algorithms must be simple and perform well, and their routine execution on a minicomputer must be fast enough to compete effectively with human performance. However, the method that has been developed is general enough to be adapted, with slight modifications, to other types of pictures. The first chapter is a brief introduction to the principle of scintiscanning, the equipment used and the type of picture obtained from it. In the second chapter the various approaches used for pattern recognition and scene analysis are very briefly described with the help of examples. The third chapter deals with the preprocessing techniques (particularly the machine operators) used for segmenting the pictures. Chapter four presents techniques which segment the picture by parallel processing of all its points. In chapter five a description is given of the sequential search techniques for outline elements, drawing inspiration from the methods used in artificial intelligence for solving optimization problems. The sixth chapter shows the difficulties encountered in extracting the renal outlines and the planning technique adopted to overcome these difficulties. Chapter seven describes in detail the two search methods employed for generating the plan. In chapter eight, the methods used for extending the areas obtained from the plan and for refining the outlines that bound them are dealt with. Chapter nine is a short presentation of the organization of the programs and of their data structures. Finally, examples of results are given in chapter ten.

  4. Development of user interface to support automatic program generation of nuclear power plant analysis by module-based simulation system

    International Nuclear Information System (INIS)

    The Module-based Simulation System (MSS) has been developed to realize a new software work environment enabling versatile dynamic simulation of complex nuclear power systems. The MSS makes full use of modern software technology to replace a large fraction of human software work in complex, large-scale program development by computer automation. The fundamental methods utilized in MSS and the developmental study on the human interface system SESS-1, which helps users generate integrated simulation programs automatically, are summarized as follows: (1) To enhance the usability and commonality of program resources, the basic mathematical models in common use in nuclear power plant analysis are programmed as 'modules' and stored in a module library. The information on the usage of individual modules is stored in a module database with easy registration, update and retrieval by the interactive management system. (2) Target simulation programs and the input/output files are automatically generated with simple block-wise languages by a precompiler system for module integration purposes. (3) The working time for program development and analysis in an example study of an LMFBR plant thermal-hydraulic transient analysis was demonstrated to be remarkably shortened with the introduction of the interface system SESS-1, developed as an automatic program generation environment. (author)

  5. Analysis and Design of PLC-based Control System for Automatic Beverage Filling Machine

    Directory of Open Access Journals (Sweden)

    Yundan Lu

    2015-01-01

    Full Text Available The automatic filling system is the main equipment in the food machinery industry. With the development of the beverage industry and the increasing demands placed on filling systems, the relay control method used in traditional filling machines, which has a low level of automation and integration, can no longer satisfy the rapid development of automatic production. The PLC control method, with its advantages of simple programming, strong anti-interference and high working reliability, has gradually replaced the relay control method. In this study, the hardware and software for an automatic filling system based on PLC control are designed; in particular, the injection section servo control system, which adopts a servo-motor-driven metering pump, is carefully analyzed, and the filling precision is greatly improved.

  6. Design of advanced automatic inspection system for turbine blade FPI analysis

    Science.gov (United States)

    Zheng, J.; Xie, W. F.; Viens, M.; Birglen, L.; Mantegh, I.

    2013-01-01

    The aircraft engine turbine blade is the part most susceptible to discontinuities, as it works under extremely high pressure and temperature. Among the various types of NDT methods, Fluorescent Penetrant Inspection (FPI) is comparatively cheap and efficient and thus suitable for detecting turbine blade surface discontinuities. In this paper, we have developed an Advanced Automatic Inspection System (AAIS) with image processing and pattern recognition techniques to aid the human inspector. The system can automatically detect, measure and classify the discontinuities from turbine blade FPI images. Tests on the sample images provided by an industrial partner have been performed to evaluate the system.
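    The detect-and-measure step of such an FPI system typically reduces to segmenting the fluorescent indications and then running connected-component analysis on the binary mask. As a hedged, generic sketch (not the AAIS implementation), the following labels 4-connected regions and reports their pixel areas, from which size-based classification could start:

    ```python
    import numpy as np
    from collections import deque

    def label_regions(mask):
        """4-connected component labeling of a binary defect mask via
        BFS flood fill.  Returns a label image and per-region areas."""
        labels = np.zeros(mask.shape, dtype=int)
        areas = []
        for seed in zip(*np.nonzero(mask)):
            if labels[seed]:
                continue                      # already part of a region
            lab = len(areas) + 1
            labels[seed] = lab
            q = deque([seed])
            area = 0
            while q:
                r, c = q.popleft()
                area += 1
                for rr, cc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                    if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                            and mask[rr, cc] and not labels[rr, cc]):
                        labels[rr, cc] = lab
                        q.append((rr, cc))
            areas.append(area)
        return labels, areas
    ```

    In a real inspection pipeline, each labeled region would then be measured (area, elongation, orientation) and fed to the classifier.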

  7. Analysis of individual classification of lameness using automatic measurement of back posture in dairy cattle

    NARCIS (Netherlands)

    Viazzi, S.; Schlageter Tello, A.A.; Hertem, van T.; Romanini, C.E.B.; Pluk, A.; Halachmi, I.; Lokhorst, C.; Berckmans, D.

    2013-01-01

    Currently, diagnosis of lameness at an early stage in dairy cows relies on visual observation by the farmer, which is time consuming and often omitted. Many studies have tried to develop automatic cow lameness detection systems. However, those studies apply thresholds to the whole population to dete

  8. A Meta-Analysis on the Malleability of Automatic Gender Stereotypes

    Science.gov (United States)

    Lenton, Alison P.; Bruder, Martin; Sedikides, Constantine

    2009-01-01

    This meta-analytic review examined the efficacy of interventions aimed at reducing automatic gender stereotypes. Such interventions included attentional distraction, salience of within-category heterogeneity, and stereotype suppression. A small but significant main effect (g = 0.32) suggests that these interventions are successful but that their…
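    The pooled effect size reported above, g = 0.32, is a Hedges' g, i.e. a bias-corrected standardized mean difference. As a hedged aside, the sketch below computes g for a single two-group study using the standard formulas (these are the usual textbook definitions, not taken from this review):

    ```python
    import math

    def hedges_g(m1, m2, sd1, sd2, n1, n2):
        """Bias-corrected standardized mean difference (Hedges' g)."""
        df = n1 + n2 - 2
        s_pooled = math.sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df)
        d = (m1 - m2) / s_pooled            # Cohen's d
        J = 1.0 - 3.0 / (4.0 * df - 1.0)    # small-sample correction factor
        return J * d
    ```

    A meta-analysis then combines such per-study g values, weighted by their inverse variances, into the pooled estimate.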

  9. Fast automatic analysis of antenatal dexamethasone on micro-seizure activity in the EEG

    International Nuclear Information System (INIS)

    Full text: In this work we develop an automatic scheme for studying the effect of antenatal dexamethasone on EEG activity. To do so, an FFT (Fast Fourier Transform) based detector was designed and applied to EEG recordings obtained from two groups of fetal sheep. Both groups received two injections with a time delay of 24 h between them; however, the applied medicine was different for each group (dexamethasone and saline). The detector was used to automatically identify and classify micro-seizures occurring in the frequency bands corresponding to the EEG transients known as slow waves (2.5-14 Hz). For each second of the data recordings the spectrum was computed, and a rise of the energy in each predefined frequency band was counted when the energy level exceeded a predefined corresponding threshold level (the threshold level was obtained from the long-term average of the spectral points in each band). Our results demonstrate that it was possible to automatically count the micro-seizures for the three different bands in a time-effective manner. It was found that the number of transients did not strongly depend on the nature of the injected medicine, which was consistent with the results obtained manually by an EEG expert. In conclusion, the automatic detection scheme presented here allows rapid micro-seizure event identification in hours of highly sampled EEG data, thus providing a valuable time-saving device.
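    The detector described, a per-second FFT with a band-energy count against a threshold derived from the long-term average, can be sketched as follows. The sampling rate, the threshold factor and the synthetic test signal are assumptions, not values from the abstract:

    ```python
    import numpy as np

    FS = 256  # sampling rate in Hz -- an assumed value, not from the record

    def count_band_events(eeg, bands, fs=FS, factor=3.0):
        """Per-second FFT; count the seconds whose band energy exceeds a
        multiple of that band's long-term average energy."""
        n_sec = len(eeg) // fs
        segs = eeg[: n_sec * fs].reshape(n_sec, fs)          # 1-s epochs
        spec = np.abs(np.fft.rfft(segs, axis=1)) ** 2        # power spectra
        freqs = np.fft.rfftfreq(fs, d=1.0 / fs)
        counts = {}
        for name, (lo, hi) in bands.items():
            sel = (freqs >= lo) & (freqs <= hi)
            energy = spec[:, sel].sum(axis=1)
            threshold = factor * energy.mean()               # long-term baseline
            counts[name] = int((energy > threshold).sum())
        return counts
    ```

    Running one such counter per predefined band reproduces the per-band event tallies the abstract describes.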

  10. Automatic data processing and analysis system for monitoring region around a planned nuclear power plant

    Science.gov (United States)

    Kortström, Jari; Tiira, Timo; Kaisko, Outi

    2016-03-01

    The Institute of Seismology of the University of Helsinki is building a new local seismic network, called the OBF network, around a planned nuclear power plant in Northern Ostrobothnia, Finland. The network will consist of nine new stations and one existing station. The network should be dense enough to provide azimuthal coverage better than 180° and automatic detection capability down to ML -0.1 within a radius of 25 km from the site. The network construction work began in 2012 and the first four stations started operation at the end of May 2013. We applied an automatic seismic signal detection and event location system to a network of 13 stations consisting of the four new stations and the nearest stations of the Finnish and Swedish national seismic networks. Between the end of May and December 2013 the network detected 214 events inside the predefined area of 50 km radius surrounding the planned nuclear power plant site. Of those detections, 120 were identified as spurious events. A total of 74 events were associated with known quarries and mining areas. The average location error, calculated as the difference between the location announced by environmental authorities and companies and the automatic location, was 2.9 km. During the same time period eight earthquakes in the magnitude range 0.1-1.0 occurred within the area. Of these, seven could be automatically detected. The results from the phase 1 stations of the OBF network indicate that the planned network can achieve its goals.

  11. Ceramography and segmentation of polycristalline ceramics: application to grain size analysis by automatic methods

    Energy Technology Data Exchange (ETDEWEB)

    Arnould, X.; Coster, M.; Chermant, J.L.; Chermant, L. [LERMAT, ISMRA, Caen (France); Chartier, T. [SPCTS, ENSCI, Limoges (France)

    2002-07-01

    Knowing the mean grain size of ceramics is a very important problem to solve in the ceramic industry. Some specific segmentation methods are presented for analysing, in an automatic way, the granulometry and morphological parameters of ceramic materials. The example presented concerns cerine materials. Such investigations lead to important information on the sintering process. (orig.)

  12. Comparative analysis of automatic approaches to building detection from multi-source aerial data

    NARCIS (Netherlands)

    Frontoni, E.; Khoshelham, K.; Nardinocchi, C.; Nedkov, S.; Zingaretti, P.

    2008-01-01

    Automatic building detection has been a hot topic since the early 1990’s. Early approaches were based on a single aerial image. Detecting buildings is a difficult task so it can be more effective when multiple sources of information are obtained and fused. The objective of this paper is to provide a

  13. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    Science.gov (United States)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.

  14. Automatic stress-relieving music recommendation system based on photoplethysmography-derived heart rate variability analysis.

    Science.gov (United States)

    Shin, Il-Hyung; Cha, Jaepyeong; Cheon, Gyeong Woo; Lee, Choonghee; Lee, Seung Yup; Yoon, Hyung-Jin; Kim, Hee Chan

    2014-01-01

    This paper presents an automatic stress-relieving music recommendation system (ASMRS) for individual music listeners. The ASMRS uses a portable, wireless photoplethysmography module with a finger-type sensor, and a program that translates heartbeat signals from the sensor to the stress index. The sympathovagal balance index (SVI) was calculated from heart rate variability to assess the user's stress levels while listening to music. Twenty-two healthy volunteers participated in the experiment. The results have shown that the participants' SVI values are highly correlated with their prespecified music preferences. The sensitivity and specificity of the favorable music classification also improved as the number of music repetitions increased to 20 times. Based on the SVI values, the system automatically recommends favorable music lists to relieve stress for individuals. PMID:25571461
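    The sympathovagal balance index used above is conventionally computed as the LF/HF power ratio of the RR-interval tachogram. Below is a hedged numpy sketch under the standard band definitions (LF 0.04-0.15 Hz, HF 0.15-0.40 Hz); the 4 Hz resampling rate is an assumption and this is not the authors' implementation:

    ```python
    import numpy as np

    def lf_hf_ratio(rr, fs=4.0):
        """Sympathovagal balance as the LF/HF power ratio of heart rate
        variability.  rr: successive RR intervals in seconds."""
        t = np.cumsum(rr)                        # beat times
        grid = np.arange(t[0], t[-1], 1.0 / fs)  # uniform resampling grid
        tach = np.interp(grid, t, rr)            # resampled tachogram
        tach = tach - tach.mean()                # remove DC before the FFT
        psd = np.abs(np.fft.rfft(tach)) ** 2
        freqs = np.fft.rfftfreq(len(tach), d=1.0 / fs)
        lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()
        hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()
        return lf / hf
    ```

    A ratio well above 1 indicates sympathetic (stress-related) dominance; values below 1 indicate parasympathetic dominance, which is the direction a stress-relieving recommendation would aim for.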

  16. Automatic Prediction of Cardiovascular and Cerebrovascular Events Using Heart Rate Variability Analysis

    OpenAIRE

    Melillo, Paolo; Izzo, Raffaele; Orrico, Ada; Scala, Paolo; Attanasio, Marcella; Mirra, Marco; De Luca, Nicola; Pecchia, Leandro

    2015-01-01

    Background There is consensus that Heart Rate Variability is associated with the risk of vascular events. However, Heart Rate Variability predictive value for vascular events is not completely clear. The aim of this study is to develop novel predictive models based on data-mining algorithms to provide an automatic risk stratification tool for hypertensive patients. Methods A database of 139 Holter recordings with clinical data of hypertensive patients followed up for at least 12 months were c...

  17. Application for the technology overview of web projects based on automatic analysis of repositories

    OpenAIRE

    POLANC, ALJANA

    2016-01-01

    This thesis comprises the development and presentation of an application which aims to facilitate the overview of the technical and other data of web projects that are being developed or maintained in a company. The application automatically updates the collection of relevant data of the web projects by integrating with the GitHub web service, where it obtains the information regarding programming languages, libraries, other technologies, project contributors and other important data. In the ...

  18. Automatic Condition Monitoring of Industrial Rolling-Element Bearings Using Motor’s Vibration and Current Analysis

    DEFF Research Database (Denmark)

    Yang, Zhenyu

    2015-01-01

    An automatic condition monitoring for a class of industrial rolling-element bearings is developed based on vibration as well as stator current analysis. The considered fault scenarios include a single-point defect, multiple-point defects, and a type of distributed defect. Motivated by the potential commercialization, the developed system is promoted mainly using off-the-shelf techniques, that is, the high-frequency resonance technique with envelope detection and the average of short-time Fourier transform. In order to test the flexibility and robustness, the monitoring performance is extensively studied under diverse operating conditions: different sensor locations, motor speeds, loading conditions, and data samples from different time segments. The experimental results showed the powerful capability of vibration analysis in the bearing point defect fault diagnosis. The current analysis...
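    The high-frequency resonance technique mentioned above demodulates the resonance-band vibration and inspects the spectrum of its envelope for bearing defect frequencies. A minimal FFT-based sketch of envelope detection via the analytic signal (a generic illustration, not the paper's implementation; the test signal and frequencies are invented):

    ```python
    import numpy as np

    def envelope(x):
        """Amplitude envelope via the analytic signal, i.e. an FFT-based
        Hilbert transform: zero the negative frequencies, double the
        positive ones, and take the magnitude of the inverse FFT."""
        n = len(x)
        X = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = 1.0
        if n % 2 == 0:
            h[n // 2] = 1.0
            h[1:n // 2] = 2.0
        else:
            h[1:(n + 1) // 2] = 2.0
        return np.abs(np.fft.ifft(X * h))

    def dominant_envelope_freq(x, fs):
        """Strongest non-DC line in the envelope spectrum; for a bearing this
        would be compared against the defect characteristic frequencies."""
        env = envelope(x)
        env = env - env.mean()
        spec = np.abs(np.fft.rfft(env))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        return freqs[1:][np.argmax(spec[1:])]
    ```

    A point defect modulates the high-frequency resonance at its characteristic repetition rate, so that rate shows up as the dominant line in the envelope spectrum even when it is buried in the raw spectrum.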

  19. Evaluation of Characteristics of Non-Metallic Inclusions in P/M Ni-Base Superalloy by Automatic Image Analysis

    Institute of Scientific and Technical Information of China (English)

    Li Xinggang; Ge Changchun; Shen Weiping

    2007-01-01

    Non-metallic inclusions, especially the large ones, within P/M Ni-base superalloy have a major influence on fatigue characteristics, but are not directly measurable by routine inspection. In this paper, a method, automatic image analysis, is proposed for estimation of the content, size and amount of non-metallic inclusions in superalloy. The methodology for the practical application of this method is described and the factors affecting the precision of the estimation are discussed. In the experiment, the characteristics of the non-metallic inclusions in Ni-base P/M superalloy are analyzed.

  20. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Science.gov (United States)

    Wieding, Jan; Souffrant, Robert; Fritsche, Andreas; Mittelmeier, Wolfram; Bader, Rainer

    2012-01-01

    The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for the automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with screws in the adjacent
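
    The structural-element approach described in the abstract replaces each solid screw with a line of beam elements along the screw axis. A minimal standalone sketch of generating such a screw representation (a hypothetical geometry helper, not the authors' Abaqus/CAE script):

```python
import numpy as np

def beam_screw(head, tip, n_elems):
    """Nodes and 2-node beam element connectivity along one screw axis,
    the kind of structural (beam) screw representation generated
    automatically in the study (hypothetical standalone version)."""
    nodes = np.linspace(head, tip, n_elems + 1)      # (n_elems+1, 3) coordinates
    elems = [(i, i + 1) for i in range(n_elems)]     # local node index pairs
    return nodes, elems

# a 40 mm screw along the z-axis, discretized into 8 beam elements
nodes, elems = beam_screw(np.array([0.0, 0.0, 0.0]),
                          np.array([0.0, 0.0, 40.0]), 8)
print(len(nodes), len(elems), nodes[4].tolist())  # → 9 8 [0.0, 0.0, 20.0]
```

    In a real workflow these nodes would then be tied or coupled to the plate and bone meshes; that solver-specific step is omitted here.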

  1. Finite element analysis of osteosynthesis screw fixation in the bone stock: an appropriate method for automatic screw modelling.

    Directory of Open Access Journals (Sweden)

    Jan Wieding

    Full Text Available The use of finite element analysis (FEA) has grown into an increasingly important method in the field of biomedical engineering and biomechanics. Although increased computational performance allows new ways to generate more complex biomechanical models, in the area of orthopaedic surgery, solid modelling of screws and drill holes represents a limitation of their use for individual cases and an increase of computational costs. To cope with these requirements, different methods for numerical screw modelling have therefore been investigated to improve its application diversity. Exemplarily, fixation was performed for stabilization of a large segmental femoral bone defect by an osteosynthesis plate. Three different numerical modelling techniques for implant fixation were used in this study, i.e. without screw modelling, screws as solid elements, and screws as structural elements. The latter offers the possibility to implement automatically generated screws with variable geometry on arbitrary FE models. Structural screws were parametrically generated by a Python script for the automatic generation in the FE software Abaqus/CAE on both a tetrahedral and a hexahedral meshed femur. Accuracy of the FE models was confirmed by experimental testing using a composite femur with a segmental defect and an identical osteosynthesis plate for primary stabilisation with titanium screws. Both deflection of the femoral head and the gap alteration were measured with an optical measuring system with an accuracy of approximately 3 µm. For both screw modelling techniques a sufficient correlation of approximately 95% between numerical and experimental analysis was found. Furthermore, using structural elements for screw modelling, the computational time could be reduced by 85% using hexahedral elements instead of tetrahedral elements for femur meshing. The automatically generated screw modelling offers a realistic simulation of the osteosynthesis fixation with

  2. An Automatic Cycle-Slip Processing Method and Its Precision Analysis

    Institute of Scientific and Technical Information of China (English)

    ZHENG Zuoya; LU Xiushan

    2006-01-01

    On the basis of analyzing and researching the current algorithms of cycle-slip detection and correction, a new method of cycle-slip detection and correction is put forward in this paper: a reasonable cycle-slip detection condition and algorithm, implemented in the corresponding program COMPRE (COMpass PRE-processing), to detect and correct cycle slips automatically. Comparison with the GIPSY and GAMIT software shows that this method is effective and credible for cycle-slip detection and correction in GPS data pre-processing.
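
    The record does not disclose COMPRE's actual detection condition. As an illustration only, a common family of cycle-slip detectors thresholds epoch differences of the carrier-phase series, since a slip appears as a step discontinuity while geometry and clock trends vary smoothly. A hedged sketch (the threshold and the synthetic series are hypothetical):

```python
import numpy as np

def detect_cycle_slips(phase, threshold=0.5):
    """Flag epochs where the second-order epoch difference of a
    carrier-phase series (in cycles) exceeds a threshold; smooth
    geometry/clock trends are largely cancelled by differencing."""
    d2 = np.diff(phase, n=2)
    return np.where(np.abs(d2) > threshold)[0] + 2  # indices of slipped epochs

# smooth quadratic trend with a 3-cycle jump introduced at epoch 60
t = np.arange(100, dtype=float)
phase = 0.001 * t**2 + 0.05 * t
phase[60:] += 3.0
print(detect_cycle_slips(phase))  # flags epochs around 60
```

    Real detectors work on noise-suppressing linear combinations (e.g. geometry-free phase) rather than raw phase, but the thresholding logic is the same.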

  3. Computational text analysis and reading comprehension exam complexity towards automatic text classification

    CERN Document Server

    Liontou, Trisevgeni

    2014-01-01

    This book delineates a range of linguistic features that characterise the reading texts used at the B2 (Independent User) and C1 (Proficient User) levels of the Greek State Certificate of English Language Proficiency exams in order to help define text difficulty per level of competence. In addition, it examines whether specific reader variables influence test takers' perceptions of reading comprehension difficulty. The end product is a Text Classification Profile per level of competence and a formula for automatically estimating text difficulty and assigning levels to texts consistently and re

  4. A fast automatic plate changer for the analysis of nuclear emulsions

    International Nuclear Information System (INIS)

    This paper describes the design and performance of a computer controlled emulsion Plate Changer for the automatic placement and removal of nuclear emulsion films for the European Scanning System microscopes. The Plate Changer is used for mass scanning and measurement of the emulsions of the OPERA neutrino oscillation experiment at the Gran Sasso lab on the CNGS neutrino beam. Unlike other systems it works with both dry and oil objectives. The film changing takes less than 20 s and the accuracy on the positioning of the emulsion films is about 10 μm. The final accuracy in retrieving track coordinates after fiducial marks measurement is better than 1 μm

  5. Automatic sampling and analysis of organics and biomolecules by capillary action-supported contactless atmospheric pressure ionization mass spectrometry.

    Directory of Open Access Journals (Sweden)

    Cheng-Huan Hsieh

    Full Text Available A contactless atmospheric pressure ionization (C-API) method has recently been developed for mass spectrometric analysis. A tapered capillary is used as both the sampling tube and spray emitter in C-API. No electric contact is required on the capillary tip during C-API mass spectrometric analysis. The simple design of the ionization method enables the automation of the C-API sampling system. In this study, we propose an automatic C-API sampling system consisting of a capillary (∼1 cm), an aluminium sample holder, and a movable XY stage for the mass spectrometric analysis of organics and biomolecules. The aluminium sample holder is controlled by the movable XY stage. The outlet of the C-API capillary is placed in front of the orifice of a mass spectrometer, whereas the sample well on the sample holder is moved underneath the capillary inlet. The sample droplet on the well can be readily infused into the C-API capillary through capillary action. When the sample solution reaches the capillary outlet, the sample spray is readily formed in the proximity of the mass spectrometer, to which a high electric field is applied. The gas phase ions generated from the spray can be readily monitored by the mass spectrometer. We demonstrate that six samples can be analyzed in sequence within 3.5 min using this automatic C-API MS setup. Furthermore, a well containing rinsing solvent is alternately arranged between the sample wells, so the C-API capillary can be readily flushed between runs. No carryover problems are observed during the analyses. The sample volume required for C-API MS analysis is minimal, with less than 1 nL of sample solution being sufficient for analysis. The feasibility of using this setup for quantitative analysis is also demonstrated.

  6. Automatic prediction of cardiovascular and cerebrovascular events using heart rate variability analysis.

    Directory of Open Access Journals (Sweden)

    Paolo Melillo

    Full Text Available There is consensus that Heart Rate Variability is associated with the risk of vascular events. However, the predictive value of Heart Rate Variability for vascular events is not completely clear. The aim of this study is to develop novel predictive models based on data-mining algorithms to provide an automatic risk stratification tool for hypertensive patients. A database of 139 Holter recordings with clinical data of hypertensive patients followed up for at least 12 months was collected ad hoc. Subjects who experienced a vascular event (i.e., myocardial infarction, stroke, syncopal event) were considered high-risk subjects. Several data-mining algorithms (such as support vector machine, tree-based classifier, artificial neural network) were used to develop automatic classifiers, and their accuracy was tested by assessing the receiver operating characteristic curve. Moreover, we tested the echographic parameters, which have been shown to be powerful predictors of future vascular events. The best predictive model was based on random forest and enabled identification of high-risk hypertensive patients with sensitivity and specificity rates of 71.4% and 87.8%, respectively. The Heart Rate Variability based classifier showed higher predictive values than the conventional echographic parameters, which are considered significant cardiovascular risk factors. A combination of Heart Rate Variability measures, analyzed with a data-mining algorithm, could be a reliable tool for identifying hypertensive patients at high risk of developing future vascular events.
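
    The reported sensitivity (71.4%) and specificity (87.8%) are the standard confusion-matrix rates. A minimal sketch of how such rates are computed from a classifier's predictions, using toy labels (the data here are hypothetical, not the study's):

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    with 1 = high-risk (event occurred) and 0 = low-risk."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# toy example: 7 true events (5 detected), 10 non-events (9 ruled out)
y_true = [1] * 7 + [0] * 10
y_pred = [1] * 5 + [0] * 2 + [0] * 9 + [1] * 1
sens, spec = sensitivity_specificity(y_true, y_pred)
print(round(sens, 3), round(spec, 3))  # → 0.714 0.9
```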

  7. Automatic geometric modeling, mesh generation and FE analysis for pipelines with idealized defects and arbitrary location

    Energy Technology Data Exchange (ETDEWEB)

    Motta, R.S.; Afonso, S.M.B.; Willmersdorf, R.B.; Lyra, P.R.M. [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil); Cabral, H.L.D. [TRANSPETRO, Rio de Janeiro, RJ (Brazil); Andrade, E.Q. [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2009-07-01

    Although the Finite Element Method (FEM) has proved to be a powerful tool to predict the failure pressure of corroded pipes, the generation of good computational models of pipes with corrosion defects can take several days. This makes the computational simulation procedure difficult to apply in practice. The main purpose of this work is to develop a set of computational tools to automatically produce models of pipes with defects, ready to be analyzed with commercial FEM programs, starting from a few parameters that locate and provide the main dimensions of the defect or a series of defects. These defects can be internal and external and can also assume general spatial locations along the pipe. Idealized rectangular and elliptic geometries can be generated. These tools are based on the MSC.PATRAN pre- and post-processing programs and were written in PCL (Patran Command Language). The program for the automatic generation of models (PIPEFLAW) has a simplified and customized graphical interface, so that an engineer with basic notions of computational simulation with the FEM can rapidly generate models that result in precise and reliable simulations. Some examples of models of pipes with defects generated by the PIPEFLAW system are shown, and the results of numerical analyses, done with the tools presented in this work, are compared with empirical results. (author)

  8. Force Analysis of Geneva Wheel and Face Cam Used In Automat

    Directory of Open Access Journals (Sweden)

    Madhoo G

    2014-06-01

    Full Text Available The Glazerol automat is a dedicated machine used in the insulator pre-assembly line. This automat is driven by a single motor for different operations. The focus here is on two main parts, the Geneva wheel and the face cam, which are used for their respective operations. The Geneva wheel is used to index the drum, which consists of 96 spindles; through this Geneva mechanism each spindle holds a ceramic body while the drum is being indexed, which generates a force in the Geneva wheel at the maximum and minimum positions. The face cam is used while indexing the workpiece carrier; two tangential forces act on it, one at the indexing side and the other at the driving/cam side. The effective resultant force acting on the face cam while indexing the workpiece carrier is calculated. These forces are analyzed using ANSYS, and the respective von Mises stress and displacement plots are obtained for both conditions based on the boundary and loading conditions.

  9. Study on triterpenoic acids distribution in Ganoderma mushrooms by automatic multiple development high performance thin layer chromatographic fingerprint analysis.

    Science.gov (United States)

    Yan, Yu-Zhen; Xie, Pei-Shan; Lam, Wai-Kei; Chui, Eddie; Yu, Qiong-Xi

    2010-01-01

    Ganoderma--"Lingzhi" in Chinese--is one of the superior Chinese tonic materia medicas in China, Japan, and Korea. Two species, Ganoderma lucidum (Red Lingzhi) and G. sinense (Purple Lingzhi), have been included in the Chinese Pharmacopoeia since its 2000 Edition. However, some other species of Ganoderma are also available in the market. For example, there are five species divided by color called "Penta-colors Lingzhi" that have been advocated as being the most invigorating among the Lingzhi species; but there is no scientific evidence for such a claim. Morphological identification can serve as an effective practice for differentiating the various species, but the inherent quality has to be delineated by chemical analysis. Among the diverse constituents in Lingzhi, triterpenoids are commonly recognized as the major active ingredients. An automatic triple development HPTLC fingerprint analysis was carried out for detecting the distribution consistency of the triterpenoic acids in various Lingzhi samples. The chromatographic conditions were optimized as follows: stationary phase, precoated HPTLC silica gel 60 plate; mobile phase, toluene-ethyl acetate-methanol-formic acid (15 + 15 + 1 + 0.1); and triple-development using automatic multiple development equipment. The chromatograms showed good resolution, and the color images provided more specific HPTLC fingerprints than have been previously published. It was observed that the abundance of triterpenoic acids and consistent fingerprint pattern in Red Lingzhi (fruiting body of G. lucidum) outweighs the other species of Lingzhi. PMID:21140647

  10. [Digital storage and semi-automatic analysis of esophageal pressure signals. Evaluation of a commercialized system (PC Polygraft, Synectics)].

    Science.gov (United States)

    Bruley des Varannes, S; Pujol, P; Salim, B; Cherbut, C; Cloarec, D; Galmiche, J P

    1989-11-01

    The aim of this work was to evaluate a new commercially available pressure recording system (PC Polygraf, Synectics) and to compare this system with a classical method using perfused catheters. The PC Polygraf uses microtransducers and allows direct digitized storage and semi-automatic analysis of data. In the first part of this study, manometric assessment was conducted using only perfused catheters. The transducers were connected both to an analog recorder and to a PC Polygraf. Using the two methods of analysis, contraction amplitudes were strongly correlated (r = 0.99; p less than 0.0001) whereas durations were significantly but loosely correlated (r = 0.51; p less than 0.001). Resting LES pressure was significantly correlated (r = 0.87; p less than 0.05). In the second part of this study, simultaneous recordings of esophageal pressure were conducted in 7 patients, by placing the two tubes (microtransducers and perfused catheters) side by side with the sideholes at the same level. The characteristics of the waves were determined both by visual analysis of the analog tracing and by semi-automatic analysis of the digitized recording with an adequate program. Mean amplitude was lower with the microtransducers than with the perfused catheters (60 vs 68 cm H2O; p less than 0.05), but the duration of waves was not significantly different between the two systems. Values obtained for each of these parameters using both methods were significantly correlated (amplitude: r = 0.74; duration: r = 0.51). The localization of the sphincter and the measurement of its basal tone were found to be difficult when using microtransducers. These results show that the PC Polygraf allows a satisfactory analysis of esophageal pressure signals. However, only perfused catheters offer excellent reliability for complete studies of both sphincter and peristalsis. PMID:2612832

  11. Automatic system for quantification and visualization of lung aeration on chest computed tomography images: the Lung Image System Analysis - LISA

    Energy Technology Data Exchange (ETDEWEB)

    Felix, John Hebert da Silva; Cortez, Paulo Cesar, E-mail: jhsfelix@gmail.co [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Dept. de Engenharia de Teleinformatica; Holanda, Marcelo Alcantara [Universidade Federal do Ceara (UFC), Fortaleza, CE (Brazil). Hospital Universitario Walter Cantidio. Dept. de Medicina Clinica

    2010-12-15

    High Resolution Computed Tomography (HRCT) is the exam of choice for the diagnostic evaluation of lung parenchyma diseases. There is an increasing interest in computational systems able to automatically analyze the radiological densities of the lungs in CT images. The main objective of this study is to present a system for the automatic quantification and visualization of lung aeration in HRCT images of different degrees of aeration, called Lung Image System Analysis (LISA). The secondary objective is to compare LISA to the Osiris system and to a specific lung segmentation algorithm (ALS) with respect to the accuracy of lung segmentation. The LISA system automatically extracts the following image attributes: lung perimeter, cross-sectional area, volume, the radiological density histograms, the mean lung density (MLD) in Hounsfield units (HU), the relative area of the lungs with voxels with density values lower than -950 HU (RA950), and the 15th percentile of the least dense voxels (PERC15). Furthermore, LISA has a colored mask algorithm that applies pseudo-colors to the lung parenchyma according to the pre-defined radiological density chosen by the system user. The lung segmentations of 102 images of 8 healthy volunteers and 141 images of 11 patients with Chronic Obstructive Pulmonary Disease (COPD) were compared for accuracy and concordance among the three methods. LISA was more effective at lung segmentation than the other two methods. LISA's color mask tool improves the spatial visualization of the degrees of lung aeration, and the various attributes of the image that can be extracted may help physicians and researchers to better assess lung aeration both quantitatively and qualitatively. LISA may have important clinical and research applications in the assessment of global and regional lung aeration and therefore deserves further development and validation studies. (author)
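
    The three density indices named in the abstract (MLD, RA950, PERC15) are simple statistics of the Hounsfield-unit histogram of the segmented lung voxels. A minimal sketch (the synthetic HU values are hypothetical, not LISA's implementation):

```python
import numpy as np

def aeration_indices(hu_voxels):
    """MLD, RA950 and PERC15 computed from the HU values of
    already-segmented lung voxels."""
    hu = np.asarray(hu_voxels, dtype=float)
    mld = hu.mean()                        # mean lung density (HU)
    ra950 = 100.0 * np.mean(hu < -950)     # % of voxels below -950 HU
    perc15 = np.percentile(hu, 15)         # 15th percentile of the density histogram
    return mld, ra950, perc15

# toy lung: mostly normally aerated tissue plus a small emphysematous cluster
rng = np.random.default_rng(0)
hu = np.concatenate([rng.normal(-850, 40, 9000), rng.normal(-975, 10, 1000)])
mld, ra950, perc15 = aeration_indices(hu)
print(f"MLD={mld:.0f} HU, RA950={ra950:.1f}%, PERC15={perc15:.0f} HU")
```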

  12. Automatic image analysis methods for the determination of stereological parameters - application to the analysis of densification during solid state sintering of WC-Co compacts

    Science.gov (United States)

    Missiaen; Roure

    2000-08-01

    Automatic image analysis methods which were used to determine microstructural parameters of sintered materials are presented. Estimation of stereological parameters at interfaces, when the system contains more than two phases, is particularly detailed. It is shown that the specific surface areas and mean curvatures of the various interfaces can be estimated in the numerical space of the images. The methods are applied to the analysis of densification during solid state sintering of WC-Co compacts. The microstructural evolution is commented on. Application of microstructural measurements to the analysis of densification kinetics is also discussed. PMID:10947907
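
    Specific surface area is the classical stereological quantity here; it can be estimated from boundary crossings of test lines via the standard relation S_V = 2 P_L. A minimal sketch on a binary image (assuming an isotropic structure; this is the textbook estimator, not the authors' exact pipeline):

```python
import numpy as np

def specific_surface_area(binary, spacing=1.0):
    """Estimate S_V = 2 * P_L, where P_L is the number of phase
    boundary crossings per unit length of horizontal test lines
    (classical stereology; isotropy of the structure is assumed)."""
    crossings = np.sum(binary[:, 1:] != binary[:, :-1])
    total_line_length = binary.shape[0] * (binary.shape[1] - 1) * spacing
    return 2.0 * crossings / total_line_length

# toy microstructure: a vertical stripe phase crossed twice per row
img = np.zeros((10, 100), dtype=int)
img[:, 40:60] = 1
print(specific_surface_area(img))  # → 2*20/990 ≈ 0.0404
```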

  13. Automatic Vehicle Extraction from Airborne LiDAR Data Using an Object-Based Point Cloud Analysis Method

    Directory of Open Access Journals (Sweden)

    Jixian Zhang

    2014-09-01

    Full Text Available Automatic vehicle extraction from an airborne laser scanning (ALS) point cloud is very useful for many applications, such as digital elevation model generation and 3D building reconstruction. In this article, an object-based point cloud analysis (OBPCA) method is proposed for vehicle extraction from an ALS point cloud. First, a segmentation-based progressive TIN (triangular irregular network) densification is employed to detect the ground points, and the potential vehicle points are detected based on the normalized heights of the non-ground points. Second, 3D connected component analysis is performed to group the potential vehicle points into segments. Finally, vehicle segments are detected based on three features: area, rectangularity and elongatedness. Experiments suggest that the proposed method is capable of achieving higher accuracy than the existing mean-shift-based method for vehicle extraction from an ALS point cloud. Moreover, the larger the point density, the higher the achieved accuracy.
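
    The three segment features named in the abstract can be sketched on a 2D footprint of one candidate segment. The sketch below uses an axis-aligned bounding box and a simple rasterized area estimate for brevity (an oriented bounding box would make the features rotation invariant; the paper's exact definitions are not given in the record):

```python
import numpy as np

def segment_features(points_xy, cell=0.25):
    """Area, rectangularity and elongatedness of one candidate
    vehicle segment's 2D footprint (axis-aligned box, for brevity)."""
    pts = np.asarray(points_xy, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    # rasterize footprint: area ≈ occupied cells * cell^2 (can slightly
    # exceed the exact box area because boundary cells count fully)
    ij = np.floor((pts - mins) / cell).astype(int)
    area = len({tuple(c) for c in ij}) * cell * cell
    w, h = np.maximum(maxs - mins, cell)
    rectangularity = area / (w * h)
    elongatedness = max(w, h) / min(w, h)
    return area, rectangularity, elongatedness

# toy car-like footprint: a dense 4.4 m x 1.7 m rectangle of points
xs, ys = np.meshgrid(np.arange(0, 4.5, 0.1), np.arange(0, 1.75, 0.1))
pts = np.column_stack([xs.ravel(), ys.ravel()])
area, rect, elong = segment_features(pts)
```

    A car-sized, box-like segment scores high on rectangularity with an elongatedness around 2-3, which is what separates vehicles from vegetation blobs and roof fragments.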

  14. Automatic derivation of domain terms and concept location based on the analysis of the identifiers

    CERN Document Server

    Vaclavik, Peter; Mezei, Marek

    2010-01-01

    Developers express the meaning of domain ideas in specifically selected identifiers and comments that form the implemented code. Software maintenance requires knowledge and understanding of the encoded ideas. This paper presents a way to automatically create a domain vocabulary. Knowledge of the domain vocabulary supports the comprehension of a specific domain for later code maintenance or evolution. We present experiments conducted in two selected domains: application servers and web frameworks. Knowledge of domain terms enables easy localization of chunks of code that belong to a certain term. We consider these chunks of code as "concepts" and their placement in the code as "concept location". Application developers may also benefit from the obtained domain terms. These terms are parts of speech that characterize a certain concept. Concepts are encoded in "classes" (OO paradigm), and the obtained vocabulary of terms supports the selection and the comprehension of a class' appropriate identifiers. ...
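
    The first step of any such vocabulary derivation is splitting identifiers into their constituent words and counting term frequencies. A minimal sketch of that step (the identifiers are hypothetical; the paper's ranking and filtering are not reproduced):

```python
import re
from collections import Counter

def domain_terms(identifiers):
    """Split camelCase / snake_case identifiers into words and count
    them; frequent words are candidate domain terms."""
    words = []
    for ident in identifiers:
        # insert a space before each camelCase hump, then split on '_'
        parts = re.sub(r'([a-z0-9])([A-Z])', r'\1 \2', ident).replace('_', ' ').split()
        words.extend(w.lower() for w in parts)
    return Counter(words)

idents = ["ServletRequest", "request_handler", "HttpServletResponse", "handleRequest"]
print(domain_terms(idents).most_common(2))  # → [('request', 3), ('servlet', 2)]
```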

  15. Statistical Analysis of Automatic Seed Word Acquisition to Improve Harmful Expression Extraction in Cyberbullying Detection

    Directory of Open Access Journals (Sweden)

    Suzuha Hatakeyama

    2016-04-01

    Full Text Available We study the social problem of cyberbullying, defined as a new form of bullying that takes place in the Internet space. This paper proposes a method for automatic acquisition of seed words to improve the performance of the original method for cyberbullying detection by Nitta et al. [1]. We conduct an experiment in exactly the same settings and find that the method, based on a Web mining technique, has lost over 30 percentage points of its performance since being proposed in 2013. Thus, we hypothesize on the reasons for the decrease in performance and propose a number of improvements, from which we experimentally choose the best one. Furthermore, we collect several seed word sets using different approaches and evaluate their precision. We found that the influential factor in the extraction of harmful expressions is not the number of seed words, but the way the seed words were collected and filtered.

  16. Automatic classification of pulmonary function in COPD patients using trachea analysis in chest CT scans

    Science.gov (United States)

    van Rikxoort, E. M.; de Jong, P. A.; Mets, O. M.; van Ginneken, B.

    2012-03-01

    Chronic Obstructive Pulmonary Disease (COPD) is a chronic lung disease that is characterized by airflow limitation. COPD is clinically diagnosed and monitored using pulmonary function testing (PFT), which measures global inspiration and expiration capabilities of patients and is time-consuming and labor-intensive. It is becoming standard practice to obtain paired inspiration-expiration CT scans of COPD patients. Predicting the PFT results from the CT scans would alleviate the need for PFT testing. It is hypothesized that the change of the trachea during breathing might be an indicator of tracheomalacia in COPD patients and correlate with COPD severity. In this paper, we propose to automatically measure morphological changes in the trachea from paired inspiration and expiration CT scans and investigate the influence on COPD GOLD stage classification. The trachea is automatically segmented and the trachea shape is encoded using the lengths of rays cast from the center of gravity of the trachea. These features are used in a classifier, combined with emphysema scoring, to attempt to classify subjects into their COPD stage. A database of 187 subjects, well distributed over the COPD GOLD stages 0 through 4, was used for this study. The data was randomly divided into a training and a test set. Using the training scans, a nearest mean classifier was trained to classify the subjects into their correct GOLD stage using either emphysema score, tracheal shape features, or a combination. Combining the proposed trachea shape features with emphysema score, the classification performance into GOLD stages improved by 11%, to 51%. In addition, an 80% accuracy was achieved in distinguishing healthy subjects from COPD patients.
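
    The ray-length shape encoding described above can be sketched on a single binary cross-section: cast equally spaced rays from the centre of gravity and record the distance to the boundary along each. A hedged sketch (simple pixel-stepping without sub-pixel accuracy; not the authors' implementation):

```python
import numpy as np

def ray_lengths(mask, n_rays=16):
    """Distances from the centre of gravity of a binary cross-section
    to its boundary along n_rays equally spaced directions."""
    ys, xs = np.nonzero(mask)
    cy, cx = ys.mean(), xs.mean()
    lengths = []
    for a in np.linspace(0.0, 2.0 * np.pi, n_rays, endpoint=False):
        r = 0.0
        while True:  # step outward until the ray leaves the mask
            y = int(round(cy + r * np.sin(a)))
            x = int(round(cx + r * np.cos(a)))
            if not (0 <= y < mask.shape[0] and 0 <= x < mask.shape[1]) or mask[y, x] == 0:
                break
            r += 0.5
        lengths.append(r)
    return np.array(lengths)

# toy circular lumen of radius 10 centred in a 64x64 slice
yy, xx = np.mgrid[:64, :64]
mask = ((yy - 32) ** 2 + (xx - 32) ** 2 <= 100).astype(int)
feats = ray_lengths(mask)   # all lengths close to the radius
```

    Comparing such vectors between inspiration and expiration scans quantifies how much the lumen collapses during breathing.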

  17. Automatic Extraction of Optimal Endmembers from Airborne Hyperspectral Imagery Using Iterative Error Analysis (IEA) and Spectral Discrimination Measurements

    Directory of Open Access Journals (Sweden)

    Ahram Song

    2015-01-01

    Full Text Available Pure surface materials denoted by endmembers play an important role in hyperspectral processing in various fields. Many endmember extraction algorithms (EEAs) have been proposed to find appropriate endmember sets. Most studies involving the automatic extraction of appropriate endmembers without a priori information have focused on N-FINDR. Although there are many different versions of N-FINDR algorithms, computational complexity issues still remain, and these algorithms cannot consider the case where spectrally mixed materials are extracted as final endmembers. A sequential endmember extraction-based algorithm may be more effective when the number of endmembers to be extracted is unknown. In this study, we propose a simple but accurate method to automatically determine the optimal endmembers using such an approach. The proposed method consists of three steps for determining the proper number of endmembers and for removing endmembers that are repeated or contain mixed signatures, using the Root Mean Square Error (RMSE) images obtained from Iterative Error Analysis (IEA) and spectral discrimination measurements. A synthetic hyperspectral image and two different airborne images, Airborne Imaging Spectrometer for Application (AISA) and Compact Airborne Spectrographic Imager (CASI) data, were tested using the proposed method, and our experimental results indicate that the final endmember set contained all of the distinct signatures without redundant endmembers and errors from mixed materials.
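
    The RMSE image at the core of IEA is the per-pixel reconstruction error after unmixing with the current endmember set; IEA then takes the worst-reconstructed pixel as the next endmember. A minimal sketch using unconstrained least-squares unmixing (the synthetic spectra are hypothetical; IEA normally uses constrained unmixing):

```python
import numpy as np

def rmse_image(pixels, endmembers):
    """Per-pixel reconstruction RMSE after least-squares unmixing.
    pixels: (n_pixels, n_bands); endmembers: (n_em, n_bands)."""
    A, _, _, _ = np.linalg.lstsq(endmembers.T, pixels.T, rcond=None)
    recon = (endmembers.T @ A).T
    return np.sqrt(np.mean((pixels - recon) ** 2, axis=1))

rng = np.random.default_rng(1)
em = rng.random((2, 20))            # two endmember spectra, 20 bands
abund = rng.random((100, 2))
X = abund @ em                      # pixels perfectly mixed from the two spectra
X[0] += rng.random(20)              # one pixel contains an unmodelled material
err = rmse_image(X, em)
print(int(np.argmax(err)))  # → 0 (IEA would select this pixel next)
```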

  18. Development on quantitative safety analysis method of accident scenario. The automatic scenario generator development for event sequence construction of accident

    International Nuclear Information System (INIS)

    This study intends to develop a more sophisticated tool that will advance the current event tree method used in all PSA, and to focus on non-catastrophic events, specifically a non-core-melt sequence scenario not included in an ordinary PSA. In non-catastrophic event PSA, it is necessary to consider various end states and failure combinations for the purpose of multiple scenario construction. The analysis workload should therefore be reduced, and an automated method and tool are required. A scenario generator was developed that can automatically handle scenario construction logic and generate the enormous number of sequences logically identified by state-of-the-art methodology. To fulfill the scenario generation as a technical tool, a simulation model associated with AI techniques and a graphical interface was introduced. The AI simulation model in this study was verified for the feasibility of its capability to evaluate actual systems. In this feasibility study, a spurious SI signal was selected to test the model's applicability. As a result, the basic capability of the scenario generator could be demonstrated and important scenarios were generated. The human interface with a system and its operation, as well as time-dependent factors and their quantification in scenario modeling, was added utilizing the human scenario generator concept. Then the feasibility of an improved scenario generator was tested for actual use. Automatic scenario generation with a certain level of credibility was achieved by this study. (author)

  19. Automatic analysis and reduction of reaction mechanisms for complex fuel combustion

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Daniel

    2001-05-01

    This work concentrates on automatic procedures for simplifying chemical models for realistic fuels using skeletal mechanism construction and Quasi Steady-State Approximation (QSSA) applied to detailed reaction mechanisms. To automate the selection of species for removal or approximation, different indices for species ranking have thus been proposed. Reaction flow rates are combined with sensitivity information for targeting a certain quantity, and used to determine a level of redundancy for automatic skeletal mechanism construction by exclusion of redundant species. For QSSA reduction, a measure of species lifetime can be used for species ranking as-is, weighted by concentrations or molecular transport timescales, and/or combined with species sensitivity. Maximum values of the indices are accumulated over ranges of parameters (e.g. fuel-air ratio and octane number), and species with low accumulated index values are selected for removal or steady-state approximation. In the case of QSSA, a model with a certain degree of reduction is automatically implemented as FORTRAN code by setting a certain index limit. The code calculates source terms of explicitly handled species from reaction rates and the steady-state concentrations by internal iteration. Homogeneous-reactor and one-dimensional laminar-flame models were used as test cases. A staged combustor fuelled by ethylene with monomethylamine addition is modelled by two homogeneous reactors in sequence, i.e. a PSR (Perfectly Stirred Reactor) followed by a PFR (Plug Flow Reactor). A modified PFR model was applied for simulation of a Homogeneous Charge Compression Ignition (HCCI) engine fuelled with four-component natural gas, whereas a two-zone model was required for a knocking Spark Ignition (SI) engine powered by Primary Reference Fuel (PRF). Finally, a laminar one-dimensional model was used to simulate premixed flames burning methane and an aeroturbine kerosene surrogate consisting of n-decane and toluene. In
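
    The lifetime-based ranking mentioned above rests on the chemical lifetime tau_i = c_i / D_i (concentration over destruction rate): species with the shortest lifetimes are the natural steady-state candidates. A minimal sketch with hypothetical numbers (the thesis combines this index with sensitivity and transport weightings, which are omitted here):

```python
import numpy as np

def qssa_candidates(concentrations, destruction_rates, n_keep):
    """Rank species by chemical lifetime tau_i = c_i / D_i and
    return the indices of the n_keep shortest-lived species,
    i.e. the candidates for the steady-state approximation."""
    tau = np.asarray(concentrations) / np.asarray(destruction_rates)
    order = np.argsort(tau)          # shortest lifetime first
    return order[:n_keep], tau

c = np.array([1e-6, 5e-8, 2e-9, 3e-7])   # mol/cm^3 (hypothetical)
d = np.array([1e-3, 5e-2, 4e-1, 6e-4])   # destruction rates, mol/cm^3/s (hypothetical)
cand, tau = qssa_candidates(c, d, n_keep=2)
print(cand.tolist())  # → [2, 1]: the two shortest-lived species
```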

  20. Semi-automatic analysis of standard uptake values in serial PET/CT studies in patients with lung cancer and lymphoma

    Directory of Open Access Journals (Sweden)

    Ly John

    2012-04-01

    Full Text Available Abstract Background Changes in maximum standardised uptake values (SUVmax) between serial PET/CT studies are used to determine disease progression or regression in oncologic patients. Measuring these changes manually can be time consuming in clinical routine. A semi-automatic method for calculation of SUVmax in serial PET/CT studies was developed and compared to a conventional manual method. The semi-automatic method first aligns the serial PET/CT studies based on the CT images. Thereafter, the reader selects an abnormal lesion in one of the PET studies. After this manual step, the program automatically detects the corresponding lesion in the other PET study, segments the two lesions and calculates the SUVmax in both studies as well as the difference between the SUVmax values. The results of the semi-automatic analysis were compared to those of a manual SUVmax analysis using a Philips PET/CT workstation. Three readers performed the SUVmax readings with both methods. Sixteen patients with lung cancer or lymphoma who had undergone two PET/CT studies were included. There were a total of 26 lesions. Results Linear regression analysis of changes in SUVmax shows that intercepts and slopes are close to the line of identity for all readers (reader 1: intercept = 1.02, R2 = 0.96; reader 2: intercept = 0.97, R2 = 0.98; reader 3: intercept = 0.99, R2 = 0.98). Manual and semi-automatic methods agreed in all cases on whether SUVmax had increased or decreased between the serial studies. The average time to measure SUVmax changes in two serial PET/CT examinations was four to five times longer for the manual method than for the semi-automatic method for all readers (reader 1: 53.7 vs. 10.5 s; reader 2: 27.3 vs. 6.9 s; reader 3: 47.5 vs. 9.5 s; p Conclusions Good agreement was shown in the assessment of SUVmax changes between the manual and semi-automatic methods. The semi-automatic analysis was four to five times faster to perform than the manual analysis. 
These findings show the
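    Once registration and segmentation (the hard parts) are done, the SUVmax bookkeeping at the core of both methods reduces to taking the maximum voxel value in each segmented lesion and differencing across studies. A minimal sketch with hypothetical voxel values:

```python
def suv_max(voi):
    """Maximum standardised uptake value over a segmented lesion's voxels."""
    return max(voi)

def suv_change(voi_baseline, voi_followup):
    """Absolute and percentage SUVmax change between serial studies."""
    s0, s1 = suv_max(voi_baseline), suv_max(voi_followup)
    return s1 - s0, 100.0 * (s1 - s0) / s0

# Hypothetical voxel SUVs of the same lesion in two serial studies:
baseline = [2.1, 5.4, 8.7, 4.3]
followup = [2.0, 6.1, 11.2, 5.0]
delta, pct = suv_change(baseline, followup)   # SUVmax rose from 8.7 to 11.2
```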

  1. Analysis on machine tool systems using spindle vibration monitoring for automatic tool changer

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2015-12-01

    Full Text Available Intelligent functions have recently become one of the major items in the development of machine tools. One crucial technology is machinery status monitoring, which is required for abnormal-condition warnings and for improving cutting efficiency. During machining, the motion of the spindle unit drives the most frequently used and most critical components, such as the automatic tool changer. The vibration detection system described here comprises both hardware and software: a vibration meter, a signal acquisition card, a data processing platform, and a machine control program. Because mechanical configurations and the desired monitoring characteristics differ between machines, commercially available kits are difficult to apply directly. For this reason, the system was developed in-house, together with a parametric study to identify parameters sufficient to represent the machine's characteristics and states. The functional parts of the system were developed in parallel. Finally, the conditions and parameters derived from both the machine states and characteristics were entered into the developed system to verify its feasibility.

  2. Automatic testing system design and data analysis of permafrost temperature in Qinghai-Tibet Railway

    Institute of Scientific and Technical Information of China (English)

    尚迎春; 齐红元

    2008-01-01

    Motivated by the influence of permafrost temperature on the safety of the Qinghai-Tibet Railway and the need for on-line monitoring, and drawing on permafrost studies in China and abroad, an automatic permafrost temperature testing system consisting of a master computer and several slave computers was designed. High-precision thermistors were chosen as temperature sensors, and the depth and spacing of the testing sections were designed and positioned; the slave computers measure, store and transmit permafrost temperature data at regular intervals, while the master computer receives, processes and analyzes the collected data, so that changes in permafrost temperature can be described and analyzed to inform permafrost railway engineering design. Taking temperature testing in one section of the Qinghai-Tibet Railway as an example, the collected data were analyzed, the behavior of the permafrost beneath the railway was characterized, and a BP (back-propagation) network model was built to predict permafrost characteristics. This testing system will provide timely information about permafrost changes to support safe operation of the Qinghai-Tibet Railway.

  3. Analysis of Fourth Stage of Automatic Depressurization System Failure to Open in AP1000 LOCA

    Directory of Open Access Journals (Sweden)

    Zhao Guozhi

    2014-01-01

    Full Text Available The Automatic Depressurization System (ADS) is a very important part of the passive core cooling system in the passive-safety nuclear plant AP1000. The ADS has four stages, each stage having two trains, and only ADS4 uses squib valves. During an accident, emergency core injection is provided by the gravity-driven passive safety injection systems: the core makeup tanks (CMT), the accumulators and the In-Containment Refueling Water Storage Tank (IRWST). The objective of this study is to analyze the system response and phenomena under partial failure of the ADS in AP1000 LOCAs. The plant model is built using the SCDAP/RELAP5/MOD 3.4 code. The chosen accident scenarios are small and medium LOCAs followed by failure of ADS4 to open; the location of ADS4 differs from that of the other three stages. The results indicate that long-term core cooling from the IRWST is postponed greatly when intentional depressurization occurs only through ADS1, 2 and 3. In addition, LOCAs with equivalent diameters of 25.4 cm and 34.1 cm will not lead to core melt, while the 5.08 cm break LOCA will. Meanwhile, a high water level in the pressurizer will appear during all three LOCAs.

  4. Antenna system analysis and design for automatic detection and real-time tracking of electron Bernstein waves in FTU

    Science.gov (United States)

    Bin, W.; Alessi, E.; Bruschi, A.; D'Arcangelo, O.; Figini, L.; Galperti, C.; Garavaglia, S.; Granucci, G.; Moro, A.

    2014-05-01

    The algorithm for the automatic control of the new front steering antenna of the Frascati Tokamak Upgrade device has been improved, in view of forthcoming experiments aimed at testing the mode conversion of electron cyclotron waves at a frequency of 140 GHz. The existing antenna system has been prepared to provide two-point real-time measurements of electron Bernstein waves and to allow real-time tracking of the optimal conversion region. This required an accurate analysis of the antenna to minimize the risk of a mechanical damage of the movable launching mirrors, when accessing the high toroidal launching angles needed for this kind of experiment. A detailed description is presented of the work carried out to safely reach and validate the desired range of steering angles, which include the region of interest, and a technique is proposed to track and chase the correct line of sight for electron Bernstein waves detection during the shot.

  5. ANALYSIS OF THE DISTANCES COVERED BY FIRST DIVISION BRAZILIAN SOCCER PLAYERS OBTAINED WITH AN AUTOMATIC TRACKING METHOD

    Directory of Open Access Journals (Sweden)

    Ricardo M. L. Barros

    2007-06-01

    Full Text Available Methods based on visual estimation are still the most widely used approach to analysing the distances covered by soccer players during matches, and most descriptions available in the literature were obtained using such an approach. Recently, systems based on computer vision techniques have appeared and the very first results are available for comparisons. The aim of the present study was to analyse the distances covered by Brazilian soccer players and compare the results to those of European players, both measured by automatic tracking systems. Four regular Brazilian First Division Championship matches between different teams were filmed. Applying a previously developed automatic tracking system (DVideo, Campinas, Brazil), results for the 55 outfield players who participated in the whole game (n = 55) are presented. The mean distance covered, standard deviation (s) and coefficient of variation (cv) after 90 minutes were 10,012 m, s = 1,024 m and cv = 10.2%, respectively. The results of a three-way ANOVA according to playing position showed that the distances covered by external defenders (10,642 ± 663 m), central midfielders (10,476 ± 702 m) and external midfielders (10,598 ± 890 m) were greater than those of forwards (9,612 ± 772 m), and forwards covered greater distances than central defenders (9,029 ± 860 m). The greatest distances were covered standing, walking or jogging, 5,537 ± 263 m, followed by moderate-speed running, 1,731 ± 399 m; low-speed running, 1,615 ± 351 m; high-speed running, 691 ± 190 m; and sprinting, 437 ± 171 m. The mean distance covered in the first half was 5,173 m (s = 394 m, cv = 7.6%), significantly greater (p < 0.001) than the mean value of 4,808 m (s = 375 m, cv = 7.8%) in the second half. A minute-by-minute analysis revealed that after eight minutes of the second half, player performance had already decreased, and this reduction was maintained throughout the second half.
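    The per-category totals above (walking/jogging through sprinting) come from classifying each tracked displacement by instantaneous speed and accumulating distance per band. A minimal sketch; the band labels and thresholds (in m/s) are assumptions for illustration, not the DVideo system's actual cut-offs:

```python
import math

# Assumed speed bands as (label, lower bound in m/s), in ascending order.
BANDS = [("walk/jog", 0.0), ("low-speed run", 3.0), ("moderate run", 4.0),
         ("high-speed run", 5.5), ("sprint", 7.0)]

def band_for(speed):
    """Return the label of the highest band whose lower bound the speed reaches."""
    label = BANDS[0][0]
    for name, lower in BANDS:
        if speed >= lower:
            label = name
    return label

def distance_by_band(track, dt):
    """track: (x, y) positions in metres sampled every dt seconds."""
    totals = {name: 0.0 for name, _ in BANDS}
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        totals[band_for(step / dt)] += step
    return totals

track = [(0.0, 0.0), (1.0, 0.0), (9.0, 0.0)]   # a 1 m/s step, then an 8 m/s step
totals = distance_by_band(track, dt=1.0)
```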

  6. Automatic Reading

    Institute of Scientific and Technical Information of China (English)

    胡迪

    2007-01-01

    Reading is the key to school success and, like any skill, it takes practice. A child learns to walk by practising until he no longer has to think about how to put one foot in front of the other. The great athlete practises until he can play quickly, accurately and without thinking. Educators call it automaticity.

  7. Analysis on the Influence of Automatic Station Temperature Data on the Sequence Continuity of Historical Meteorological Data

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    [Objective] The research aimed to study the influence of automatic station data on the sequence continuity of historical meteorological data. [Method] Based on the temperature data which were measured by the automatic meteorological station and the corresponding artificial observation data during January-December in 2001, the monthly average, maximum and minimum temperatures in the automatic station were compared with the corresponding artificial observation temperature data in the parallel observation peri...

  8. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Fang, Y [National Cheng Kung University, Tainan, Taiwan (China); Huang, H [Chang Gung University, Taoyuan, Taiwan (China); Su, T [Chang Gung Memorial Hospital, Taoyuan, Taiwan (China)

    2015-06-15

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring the image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard, with stenosis of more than 70% defining coronary ischemia. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. 
Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination
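    The ROC figures quoted above (sensitivity, specificity, AUC) can be reproduced from any heterogeneity score with a few lines of stdlib Python. The `labels`/`scores` data below are hypothetical, and the AUC uses the rank-sum (Mann-Whitney) identity rather than trapezoidal integration:

```python
def roc_auc(labels, scores):
    """AUC via the rank-sum identity: P(score_pos > score_neg), ties count half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0 for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

def sens_spec(labels, scores, threshold):
    """Sensitivity and specificity when scores >= threshold are called positive."""
    tp = sum(1 for y, s in zip(labels, scores) if y == 1 and s >= threshold)
    tn = sum(1 for y, s in zip(labels, scores) if y == 0 and s < threshold)
    return tp / labels.count(1), tn / labels.count(0)

# Hypothetical stenosis labels (1 = >70% on PCI) and heterogeneity scores:
labels = [1, 1, 0, 0]
scores = [0.9, 0.7, 0.6, 0.2]
```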

  9. Automatic utilities auditing

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Colin Boughton [Energy Metering Technology (United Kingdom)

    2000-08-01

    At present, energy audits represent only snapshot situations of the flow of energy. The normal pattern of energy audits as seen through the eyes of an experienced energy auditor is described. A brief history of energy auditing is given. It is claimed that the future of energy auditing lies in automatic meter reading with expert data analysis providing continuous automatic auditing thereby reducing the skill element. Ultimately, it will be feasible to carry out auditing at intervals of say 30 minutes rather than five years.

  10. Interconnecting smartphone, image analysis server, and case report forms in clinical trials for automatic skin lesion tracking

    Science.gov (United States)

    Haak, Daniel; Doma, Aliaa; Gombert, Alexander; Deserno, Thomas M.

    2016-03-01

    Today, subjects' medical data in controlled clinical trials are captured digitally in electronic case report forms (eCRFs). However, eCRFs only insufficiently support integration of subjects' image data, although medical imaging is looming large in studies today. For bed-side image integration, we present a mobile application (App) that utilizes the smartphone-integrated camera. To ensure high image quality with this inexpensive consumer hardware, color reference cards are placed in the camera's field of view next to the lesion. The cards are used for automatic calibration of geometry, color, and contrast. In addition, a personalized code is read from the cards that allows subject identification. For data integration, the App is connected to a communication and image analysis server that also holds the code-study-subject relation. In a second system interconnection, web services are used to connect the smartphone with OpenClinica, an open-source, Food and Drug Administration (FDA)-approved electronic data capture (EDC) system for clinical trials. Once the photographs have been securely stored on the server, they are released automatically from the mobile device. The workflow of the system is demonstrated by an ongoing clinical trial, in which photographic documentation is frequently performed to measure the effect of wound incision management systems. All 205 images collected in the study so far have been correctly identified and successfully integrated into the corresponding subject's eCRF. Using this system, manual steps for the study personnel are reduced and, therefore, errors, latency and costs are decreased. Our approach also increases data security and privacy.

  11. Temporal Analysis and Automatic Calibration of the Velodyne HDL-32E LiDAR System

    Science.gov (United States)

    Chan, T. O.; Lichti, D. D.; Belton, D.

    2013-10-01

    At the end of the first quarter of 2012, more than 600 Velodyne LiDAR systems had been sold worldwide for various robotic and high-accuracy survey applications. The ultra-compact Velodyne HDL-32E LiDAR has become a predominant sensor for many applications that require lower sensor size/weight and cost. For high-accuracy applications, cost-effective calibration methods with minimal manual intervention are always desired by users. However, the calibrations are complicated by the Velodyne LiDAR's narrow vertical field of view and the very highly time-variant nature of its measurements. In the paper, the temporal stability of the HDL-32E is first analysed as the motivation for developing a new, automated calibration method. This is followed by a detailed description of the calibration method, which is driven by a novel segmentation method for extracting vertical cylindrical features from the Velodyne point clouds. The proposed segmentation method utilizes the Velodyne point cloud's slice-like nature and first decomposes the point clouds into 2D layers. The layers are then treated as 2D images and processed with the Generalized Hough Transform, which extracts the points distributed in circular patterns from the point cloud layers. Subsequently, the vertical cylindrical features can be readily extracted from the whole point clouds based on the previously extracted points. The points are passed to the calibration, which estimates the cylinder parameters and the LiDAR's additional parameters simultaneously by constraining the segmented points to fit the cylindrical geometric model in such a way that the weighted sum of the adjustment residuals is minimized. The proposed calibration is highly automatic, and this allows end users to obtain the time-variant additional parameters instantly and frequently whenever vertical cylindrical features are present in the scenes. The methods were verified with two different real datasets, and the results suggest that up to 78
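    The layer-wise Hough step can be illustrated with the classical circle-voting scheme: each point in a 2D layer votes for candidate centres at a known radius, and well-supported grid cells mark cylinder axes. A simplified sketch (fixed radius, coarse grid; all parameters are illustrative assumptions, and the paper's Generalized Hough Transform is more general):

```python
import math
from collections import Counter

def hough_circle_centres(points, radius, cell=0.1, min_votes=8):
    """Vote for candidate circle centres on a coarse grid, for a known radius."""
    votes = Counter()
    for x, y in points:
        for k in range(36):  # candidate centres lie on a circle around each point
            a = 2.0 * math.pi * k / 36
            key = (round((x + radius * math.cos(a)) / cell),
                   round((y + radius * math.sin(a)) / cell))
            votes[key] += 1
    return [(i * cell, j * cell) for (i, j), v in votes.items() if v >= min_votes]
```

    Stacking the per-layer detections along the vertical axis then yields the cylindrical features used by the calibration.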

  12. Automatic symbolic analysis of SC networks using a modified nodal approach

    NARCIS (Netherlands)

    Zivkovic, V.A.; Petkovic, P.M.; Milanovic, D.P.

    1998-01-01

    This paper presents a symbolic analysis of Switched-Capacitor (SC) circuits in the z-domain using Modified Nodal Approach (MNA). We have selected the MNA method as one of the widely established approaches in circuit analysis. The analyses are performed using SymsimC symbolic simulator which also ena

  13. Automatic denoising of functional MRI data: combining independent component analysis and hierarchical fusion of classifiers

    NARCIS (Netherlands)

    Salimi-Khorshidi, G.; Douaud, G.; Beckmann, C.F.; Glasser, M.F.; Griffanti, L.; Smith, S.M.

    2014-01-01

    Many sources of fluctuation contribute to the fMRI signal, and this makes identifying the effects that are truly related to the underlying neuronal activity difficult. Independent component analysis (ICA) - one of the most widely used techniques for the exploratory analysis of fMRI data - has shown

  15. Automatic Karyotype Analysis Technology and Its Application in Plants

    Institute of Scientific and Technical Information of China (English)

    何小周; 郭东林

    2009-01-01

    Karyotype analysis is an important research tool in cytogenetics. Automatic chromosome analysis is simple to operate, fast and reliable, and it represents the direction in which karyotype analysis technology will develop. This review covers the strategies and system composition of biological karyotype analysis, automatic image processing and analysis techniques, and their application to karyotype analysis in plants.

  16. Analysis of DGNB-DK criteria for BIM-based Model Checking automatization

    DEFF Research Database (Denmark)

    Gade, Peter; Svidt, Kjeld; Jensen, Rasmus Lund

    This report includes the results of an analysis of the automation potential of the Danish edition (DGNB-DK) of the building sustainability assessment method Deutsche Gesellschaft für Nachhaltiges Bauen (DGNB) for office buildings, version 2014 1.1. The analysis investigates the criteria of DGNB-DK and whether they would be suited for automation through the technological concept of BIM-based Model Checking (BMC).

  17. Failure analysis and maintenance of the ZDHW-5 automatic calorimeter

    Institute of Scientific and Technical Information of China (English)

    陈庆鸿; 路长杰

    2016-01-01

    For the determination of the calorific value of coal in a coal analysis laboratory, the structure, composition and working principle of the ZDHW-5 automatic calorimeter are introduced. Common faults arising during use of the calorimeter, such as oxygen bomb leakage, a stirrer that does not rotate, a crucible that fails to ignite, unburned coal samples in the crucible, and tests that take an excessively long time, are analyzed, and effective troubleshooting measures are proposed. Faults related to the oxygen bomb, test water, ambient temperature, inner and outer cylinders, temperature sensor, stirrer and ignition device are covered.

  18. Chi-squared Automatic Interaction Detection Decision Tree Analysis of Risk Factors for Infant Anemia in Beijing, China

    Institute of Scientific and Technical Information of China (English)

    Fang Ye; Zhi-Hua Chen; Jie Chen; Fang Liu; Yong Zhang; Qin-Ying Fan; Lin Wang

    2016-01-01

    Background: In the past decades, studies on infant anemia have mainly focused on rural areas of China. With the increasing heterogeneity of the population in recent years, available information on infant anemia is inconclusive in large cities of China, especially with comparison between native residents and the floating population. This population-based cross-sectional study was implemented to determine the anemic status of infants as well as the risk factors in a representative downtown area of Beijing. Methods: As useful methods to build a predictive model, Chi-squared automatic interaction detection (CHAID) decision tree analysis and logistic regression analysis were introduced to explore risk factors of infant anemia. A total of 1091 infants aged 6-12 months together with their parents/caregivers living at Heping Avenue Subdistrict of Beijing were surveyed from January 1, 2013 to December 31, 2014. Results: The prevalence of anemia was 12.60%, with a range of 3.47%-40.00% across subgroup characteristics. The CHAID decision tree model demonstrated multilevel interaction among risk factors through stepwise pathways to detect anemia. Besides the three predictors identified by the logistic regression model, including maternal anemia during pregnancy, exclusive breastfeeding in the first 6 months, and floating population, CHAID decision tree analysis also identified a fourth risk factor, the maternal educational level, with higher overall classification accuracy and a larger area under the receiver operating characteristic curve. Conclusions: The infant anemic status in a metropolis is complex and should be carefully considered by basic health care practitioners. CHAID decision tree analysis demonstrated a better performance in hierarchical analysis of a population with great heterogeneity. Risk factors identified by this study might be meaningful for the early detection and prompt treatment of infant anemia in large cities.
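    The splitting step at the heart of CHAID can be illustrated by choosing, among candidate predictors, the one whose 2x2 table against the anemia outcome yields the largest chi-squared statistic. A simplified sketch with made-up counts (real CHAID also merges categories and applies adjusted p-values):

```python
def chi2_2x2(a, b, c, d):
    """Chi-squared statistic for the 2x2 table [[a, b], [c, d]], no continuity correction."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

def best_split(tables):
    """tables: predictor name -> (a, b, c, d); return the strongest predictor."""
    return max(tables, key=lambda name: chi2_2x2(*tables[name]))

# Made-up counts: (anemic & exposed, healthy & exposed, anemic & unexposed, healthy & unexposed)
tables = {
    "maternal_anemia": (30, 70, 10, 190),
    "floating_population": (20, 80, 20, 180),
}
```

    Here `best_split(tables)` picks `maternal_anemia`, and the tree recurses within each branch of that split.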

  19. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure

    Directory of Open Access Journals (Sweden)

    Giovanni Castellazzi

    2015-07-01

    Full Text Available In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects into three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims at solving the problems connected to the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used for structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
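    The voxel-element generation at the core of the procedure amounts to mapping points onto a uniform grid and instantiating one hexahedral element per occupied cell. A minimal sketch (voxel size and coordinates are illustrative):

```python
def voxelize(points, size):
    """Map (x, y, z) points to a uniform grid; each occupied cell becomes one element."""
    return {(int(x // size), int(y // size), int(z // size)) for x, y, z in points}

# Two nearby points fall in one voxel, the third in another -> two solid elements.
cloud = [(0.05, 0.02, 0.01), (0.07, 0.03, 0.02), (0.25, 0.02, 0.01)]
elements = voxelize(cloud, 0.1)
```

    The voxel size trades mesh fidelity against element count, which is why the procedure can tune the fineness of the discretization.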

  20. Parameter design and performance analysis of shift actuator for a two-speed automatic mechanical transmission for pure electric vehicles

    Directory of Open Access Journals (Sweden)

    Jianjun Hu

    2016-08-01

    Full Text Available Recent developments of pure electric vehicles have shown that pure electric vehicles equipped with two-speed or multi-speed gearbox possess higher energy efficiency by ensuring the drive motor operates at its peak performance range. This article presents the design, analysis, and control of a two-speed automatic mechanical transmission for pure electric vehicles. The shift actuator is based on a motor-controlled camshaft where a special geometric groove is machined, and the camshaft realizes the axial positions of the synchronizer sleeve for gear engaging, disengaging, and speed control of the drive motor. Based on the force analysis of shift process, the parameters of shift actuator and shift motor are designed. The drive motor’s torque control strategy before shifting, speed governing control strategy before engaging, shift actuator’s control strategy during gear engaging, and drive motor’s torque recovery strategy after shift process are proposed and implemented with a prototype. To validate the performance of the two-speed gearbox, a test bed was developed based on dSPACE that emulates various operation conditions. The experimental results indicate that the shift process with the proposed shift actuator and control strategy could be accomplished within 1 s under various operation conditions, with shift smoothness up to passenger car standard.

  1. [Automatic EEG analysis in the time domain and its possible clinical significance--presentation of a flexible software package].

    Science.gov (United States)

    Spiel, G; Benninger, F

    1986-01-01

    The intention of the automatic EEG analysis is to take several EEG characteristics into account and thereby be usable in different applications for quantifying events in the EEG. The procedure of analysis is based on the estimation of maxima and minima points within the measured data and the calculation of the wavelength of the half-waves. This is done by correcting the actually measured maxima-minima points along the t-axis by means of an interpolation technique, and the frequency of the half-waves is calculated from this solution with an accuracy of half a Hertz. This method was necessary because our equipment allows only a digitalisation rate of 8 ms (Harner, 1977). Using this procedure it is possible to record the frequency distribution, the distribution of amplitudes, and the distribution of steepness as distributions of elementary EEG characteristics. To characterize specified EEG patterns, the EEG data can be classified according to categories of combinations of quantified characteristics. If we consider topological aspects as well, there are the following possibilities: one elementary characteristic in one EEG channel; one elementary characteristic in two or more EEG channels; several elementary characteristics in one EEG channel; several elementary characteristics in two or more channels. There are possibilities of data reduction, exemplified on the distribution of frequencies without taking into account the topological aspects. The above-mentioned methods of data reduction are useful for one EEG channel. On the other hand, a comparison of the EEG activity in different channels can be done. PMID:3774345
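    The half-wave procedure can be sketched as follows: find extrema in the sampled trace, take the spacing between successive extrema as one half-wave, and convert it to a frequency. The 8 ms sampling step matches the text; the sub-sample interpolation refinement that yields the half-Hertz accuracy is omitted for brevity:

```python
def extrema_indices(samples):
    """Indices where the slope changes sign (local maxima and minima)."""
    return [i for i in range(1, len(samples) - 1)
            if (samples[i] - samples[i - 1]) * (samples[i + 1] - samples[i]) < 0]

def half_wave_frequencies(samples, dt=0.008):
    """Frequency implied by each half-wave (extremum-to-extremum interval)."""
    ext = extrema_indices(samples)
    return [1.0 / (2.0 * (i1 - i0) * dt) for i0, i1 in zip(ext, ext[1:])]

# A synthetic trace with extrema every 2 samples (16 ms half-waves):
trace = [0, 1, 0, -1, 0, 1, 0, -1, 0]
freqs = half_wave_frequencies(trace)
```

    Histogramming `freqs` (and the analogous amplitude and steepness values) gives the elementary-characteristic distributions described above.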

  2. A Sensitive and Automatic White Matter Fiber Tracts Model for Longitudinal Analysis of Diffusion Tensor Images in Multiple Sclerosis.

    Directory of Open Access Journals (Sweden)

    Claudio Stamile

    Full Text Available Diffusion tensor imaging (DTI) is a sensitive tool for the assessment of microstructural alterations in brain white matter (WM). We propose a new processing technique to detect local and global longitudinal changes of diffusivity metrics in homologous regions along WM fiber-bundles. To this end, a reliable and automatic processing pipeline was developed in three steps: (1) co-registration and diffusion metrics computation, (2) tractography, bundle extraction and processing, and (3) longitudinal fiber-bundle analysis. The last step was based on an original Gaussian mixture model providing a fine analysis of fiber-bundle cross-sections, and allowing a sensitive detection of longitudinal changes along fibers. This method was tested on simulated and clinical data. High levels of F-Measure were obtained on simulated data. Experiments on the cortico-spinal tract and inferior fronto-occipital fasciculi of five patients with Multiple Sclerosis (MS) included in a weekly follow-up protocol highlighted the greater sensitivity of this fiber-scale approach to detect small longitudinal alterations.

  3. Automatic Detection of CT Perfusion Datasets Unsuitable for Analysis due to Head Movement of Acute Ischemic Stroke Patients

    Directory of Open Access Journals (Sweden)

    Fahmi Fahmi

    2014-01-01

    Full Text Available Head movement during brain Computed Tomography Perfusion (CTP) can deteriorate perfusion analysis quality in acute ischemic stroke patients. We developed a method for automatic detection of CTP datasets with excessive head movement, based on 3D image registration of CTP with non-contrast CT, which provides the transformation parameters. For parameter values exceeding predefined thresholds, the dataset was classified as ‘severely moved’. Threshold values were determined by digital CTP phantom experiments. The automated selection was compared to manual screening by 2 experienced radiologists for 114 brain CTP datasets. Based on receiver operating characteristics, optimal thresholds of 1.0°, 2.8° and 6.9° were found for pitch, roll and yaw, respectively, and 2.8 mm for z-axis translation. The proposed method had a sensitivity of 91.4% and a specificity of 82.3%. This method allows accurate automated detection of brain CTP datasets that are unsuitable for perfusion analysis.
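    With the transformation parameters in hand, the classification rule reduces to a threshold test. A minimal sketch using the optimal cut-offs reported above (the parameter dictionary keys are assumptions about how the values are stored):

```python
# Cut-offs from the study: degrees for the rotations, millimetres for translation.
THRESHOLDS = {"pitch": 1.0, "roll": 2.8, "yaw": 6.9, "z_translation": 2.8}

def severely_moved(params):
    """Flag a CTP dataset when any registration parameter exceeds its cut-off."""
    return any(abs(params[name]) > limit for name, limit in THRESHOLDS.items())

print(severely_moved({"pitch": 0.4, "roll": 1.1, "yaw": 2.0, "z_translation": 0.9}))  # False
print(severely_moved({"pitch": 1.6, "roll": 0.5, "yaw": 1.0, "z_translation": 0.3}))  # True
```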

  4. Exploiting automatically generated databases of traffic signs and road markings for contextual co-occurrence analysis

    Science.gov (United States)

    Hazelhoff, Lykele; Creusen, Ivo M.; Woudsma, Thomas; de With, Peter H. N.

    2015-11-01

    Combined databases of road markings and traffic signs provide a complete and full description of the present traffic legislation and instructions. Such databases contribute to efficient signage maintenance, improve navigation, and benefit autonomous driving vehicles. A system is presented for the automated creation of such combined databases, which additionally investigates the benefit of this combination for automated contextual placement analysis. This analysis involves verification of the co-occurrence of traffic signs and road markings to retrieve a list of potentially incorrectly signaled (and thus potentially unsafe) road situations. This co-occurrence verification is specifically explored for both pedestrian crossings and yield situations. Evaluations on 420 km of road have shown that individual detection of traffic signs and road markings denoting these road situations can be performed with accuracies of 98% and 85%, respectively. Combining both approaches shows that over 95% of the pedestrian crossings and give-way situations can be identified. An exploration toward additional co-occurrence analysis of signs and markings shows that inconsistently signaled situations can successfully be extracted, such that specific safety actions can be directed toward cases lacking signs or markings, while most consistently signaled situations can be omitted from this analysis.
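
The co-occurrence verification can be illustrated with a hypothetical sketch: given the locations where a road situation is signaled by traffic signs and where it is signaled by road markings, flag the locations signaled by only one of the two:

```python
# Hypothetical sketch of co-occurrence analysis: a location is
# consistently signaled only if it appears in both databases; the
# symmetric difference yields potentially unsafe, inconsistent cases.
def inconsistent_locations(sign_locations, marking_locations):
    """Return locations having a sign or a marking, but not both."""
    return sign_locations ^ marking_locations
```

Safety actions can then be directed toward the flagged locations, while consistently signaled situations are omitted from review.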

  5. The ACODEA Framework: Developing Segmentation and Classification Schemes for Fully Automatic Analysis of Online Discussions

    Science.gov (United States)

    Mu, Jin; Stegmann, Karsten; Mayfield, Elijah; Rose, Carolyn; Fischer, Frank

    2012-01-01

    Research related to online discussions frequently faces the problem of analyzing huge corpora. Natural Language Processing (NLP) technologies may allow automating this analysis. However, the state-of-the-art in machine learning and text mining approaches yields models that do not transfer well between corpora related to different topics. Also,…

  6. Automatic analysis of quality of images from X-ray digital flat detectors

    International Nuclear Information System (INIS)

    Over the last decade, medical imaging has grown with the development of new digital imaging techniques. In the field of X-ray radiography, digital detectors are progressively replacing older techniques based on film or X-ray image intensifiers. These digital detectors offer higher sensitivity and reduced overall dimensions. This work was prepared with Trixell, the world-leading company in flat detectors for medical radiography. It deals with quality control of the digital images stemming from these detectors. The high quality standards of medical imaging impose a close analysis of the defects that can appear in the images. This work describes a complete process for the quality analysis of such images, with a particular focus on the defect detection task, using methods well adapted to our context of spatially correlated defects against a noise background. (author)

  7. Automatic mechanical fault assessment of small wind energy systems in microgrids using electric signature analysis

    DEFF Research Database (Denmark)

    Skrimpas, Georgios Alexandros; Marhadi, Kun Saptohartyadi; Jensen, Bogi Bech

    2013-01-01

    of islanded operation. In this paper, the fault assessment is achieved efficiently and consistently via electric signature analysis (ESA). In ESA the fault related frequency components are manifested as sidebands of the existing current and voltage time harmonics. The energy content between the fundamental, 5...... element model where dynamic eccentricity and bearing outer race defect are simulated under varying fault severity and electric loading conditions....

  8. Automatic backscatter analysis of regional left ventricular systolic function using color kinesis.

    Science.gov (United States)

    Schwartz, S L; Cao, Q L; Vannan, M A; Pandian, N G

    1996-06-15

    Assessment of regional wall motion by 2-dimensional echocardiography can be performed either by semiquantitative wall motion scoring or by quantitative analysis. The former is subjective and requires expertise; quantitative methods are too time-consuming for routine use in a busy clinical laboratory. Color kinesis is a new algorithm utilizing acoustic backscatter analysis. It provides a color-encoded map of endocardial motion in real time. In each frame a new color layer is added; the thickness of the color beam represents endocardial motion during that frame. The end-systolic image has multiple color layers, representing regional and temporal heterogeneity of segmental motion. The purpose of this study was to validate the use of color kinesis for semiquantitative analysis of regional left ventricular systolic function and for quantitative measurement of endocardial excursion. Semiquantitative wall motion scoring was performed in 18 patients using both 2-dimensional echo and color kinesis. Scoring was identical in 74% of segments; there was 84% agreement in the classification of normal vs. abnormal. There was less interobserver variability in wall motion scoring using color kinesis. Endocardial excursion was quantified in 21 patients; 70% of the imaged segments were suitable for analysis. Correlation between 2-dimensional echocardiographic measurements and color kinesis was excellent (r = 0.87). The mean difference in excursion as measured by the 2 methods was -0.05 +/- 2.0 mm. In conclusion, color kinesis is a useful method for assessing regional contraction by displaying a color map of systolic endocardial excursion. This algorithm may improve the confidence and accuracy of assessment of segmental ventricular function by echocardiographic methods.

  9. Psychometric evaluation of the Serbian dictionary for automatic text analysis - LIWCser

    Directory of Open Access Journals (Sweden)

    Bjekić Jovana

    2014-01-01

    Full Text Available LIWC (Linguistic Inquiry and Word Count) is widely used word-level content analysis software. It has been used in a large number of studies in the fields of clinical, social and personality psychology, and it has been adapted for text analysis in 11 world languages. The aim of this research was to empirically validate the newly constructed adaptation of the LIWC software for the Serbian language (LIWCser). The sample consisted of 384 texts in Serbian and 141 texts in English, including scientific paper abstracts, newspaper articles, movie subtitles, short stories and essays. Comparative analysis of the Serbian and English versions of the software demonstrated an acceptable level of equivalence (ICCM=.70). Average coverage of the texts with the LIWCser dictionary was 69.93%, and the variability of this measure across different types of texts was in line with expectations. The adaptation of the LIWC software for Serbian opens up entirely new possibilities for the assessment of spontaneous verbal behaviour that is highly relevant for different fields of psychology. [Projekat Ministarstva nauke Republike Srbije, br. 179018 i br. 175012]
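
A hedged sketch of the coverage statistic reported above: the fraction of word tokens in a text that are found in the dictionary. The whitespace tokenization here is deliberately naive and purely illustrative, not LIWC's actual matching logic:

```python
# Illustrative dictionary coverage: share of tokens present in a
# word-list dictionary. Real LIWC-style software also handles word
# stems and punctuation, which this sketch omits.
def coverage(text, dictionary):
    words = text.lower().split()
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in dictionary)
    return hits / len(words)
```

Averaging this value over a corpus gives the kind of figure quoted for LIWCser (69.93%).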

  10. Automatic Detection of Previously-Unseen Application States for Deployment Environment Testing and Analysis

    Science.gov (United States)

    Murphy, Christian; Vaughan, Moses; Ilahi, Waseem; Kaiser, Gail

    2010-01-01

    For large, complex software systems, it is typically impossible in terms of time and cost to reliably test the application in all possible execution states and configurations before releasing it into production. One proposed way of addressing this problem has been to continue testing and analysis of the application in the field, after it has been deployed. A practical limitation of many such automated approaches is the potentially high performance overhead incurred by the necessary instrumentation. However, it may be possible to reduce this overhead by selecting test cases and performing analysis only in previously-unseen application states, thus reducing the number of redundant tests and analyses that are run. Solutions for fault detection, model checking, security testing, and fault localization in deployed software may all benefit from a technique that ignores application states that have already been tested or explored. In this paper, we present a solution that ensures that deployment environment tests are only executed in states that the application has not previously encountered. In addition to discussing our implementation, we present the results of an empirical study that demonstrates its effectiveness, and explain how the new approach can be generalized to assist other automated testing and analysis techniques intended for the deployment environment. PMID:21197140
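
The core idea, executing tests only in previously-unseen application states, can be sketched as a simple filter; the state summary and hashing scheme here are assumptions for illustration, not the paper's implementation:

```python
# Hypothetical sketch: remember hashes of application states already
# tested, and skip any state encountered before, reducing redundant
# deployment-environment tests and their instrumentation overhead.
class UnseenStateFilter:
    def __init__(self):
        self.seen = set()

    def should_test(self, state):
        """state: any hashable summary of the application state."""
        key = hash(state)
        if key in self.seen:
            return False  # redundant: this state was already tested
        self.seen.add(key)
        return True
```

Fault detection, model checking, or security testing hooks would then fire only when `should_test` returns `True`.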

  11. Automatic snow extent extraction in alpine environments: short and medium term 2000-2006 analysis

    Science.gov (United States)

    Gamba, P.; Lisini, G.; Merlin, E.; Riva, F.

    2007-10-01

    Water resources in Northern Italy have diminished dramatically in the past 10 to 20 years, and recent phenomena connected to climate change have further sharpened the trend. To match the observed and collected information with this experience and to find methodologies to improve the water management cycle in the Lombardy Region, the University of Milan Bicocca, Fondazione Lombardia per l'Ambiente and ARPA Lombardia are currently funding a project named "Regional Impact of Climatic Change in Lombardy Water Resources: Modelling and Applications" (RICLIC-WARM). In the framework of this project, the fraction of water made available to the whole regional network by the snow cover of the Alps will be investigated by means of remotely sensed data. While there are already a number of algorithms devoted to this task for data coming from various sensors in the visible and infrared regions, no operative comparison and analytical assessment of the advantages and drawbacks of using different data has been attempted. Such an assessment will pave the way for a fusion of the available information, as well as for a multi-source mapping procedure able to successfully exploit the huge quantity of data available for the past and the even larger amount that may be accessed in the future. To this aim, a comparison on selected dates for the whole 2000-2006 period was performed.

  12. Dynamic Analysis of a Vehicular-mounted Automatic Weapon–Planar Case

    Directory of Open Access Journals (Sweden)

    Huai-Ku Sun

    2009-05-01

    Full Text Available This study analyses the dynamic behaviour of a machine gun mounted on a four-wheeled vehicle. The entire system comprises three parts: the gun, the flexible monopod, and the vehicle. The weapon has a multi-rigid-body mechanism and comprises a rigid receiver, a rigid bolt, a bullet, a buffer, and a recoil spring. The vehicle model features a rigid vehicle body, suspension springs, shock absorbers, and wheels. The finite element method is used to model the flexible monopod connecting the gun and the vehicle. This study combines a computer-aided analysis of rigid-body mechanisms with finite element analysis of a flexible structure to derive the total equations of motion, incorporating the Lagrange multiplier. The total equations of motion are solved with numerical integration to simulate the transient response of the whole system. This approach can easily resolve the problem of the rigid-flexible coupling effect and promote the function of the whole system in the engineering design phase. Defence Science Journal, 2009, 59(3), pp. 265-272. DOI: http://dx.doi.org/10.14429/dsj.59.1520

  13. Regionalization of epididymal duct and epithelium in rats and mice by automatic computer-aided morphometric analysis

    Institute of Scientific and Technical Information of China (English)

    C. Soler; J. J. de Monserrat; M. Núñez; R. Gutiérrez; J. Núñez; M. Sancho; T. G. Cooper

    2005-01-01

    Aim: To establish a rat and mouse epididymal map based on the use of the Epiquatre automatic software for histologic image analysis. Methods: Epididymides from five adult rats and five adult mice were fixed in alcoholic Bouin's fixative and embedded in paraffin. Serial longitudinal sections through the medial aspect of the organ were cut at 10 μm and stained with hematoxylin and eosin. As determined from major connective tissue septa, nine subdivisions of the rat epididymis and seven of the mouse were identified, consisting of five sub-regions in the caput (rat and mouse), one (mouse) or three (rat) in the corpus, and one in the cauda (rat and mouse). Using the Epiquatre software, several tubular, luminal and epithelial morphometric parameters were evaluated. Results: Statistical comparison of the quantitative parameters revealed regional differences (2-5 in the rat, 3-6 in the mouse, depending on the parameters), with caput regions 1 and 2 being largely distinguishable from the similar remaining caput and corpus regions, which were in turn distinguishable from the cauda regions in both species. Conclusion: The use of the Epiquatre software allowed us to establish regression curves for different morphometric parameters that can permit the detection of changes in their values under different pathological or experimental conditions.

  14. Automatic Recognition of Human Parasite Cysts on Microscopic Stools Images using Principal Component Analysis and Probabilistic Neural Network

    Directory of Open Access Journals (Sweden)

    Beaudelaire Saha Tchinda

    2015-09-01

    Full Text Available Parasites live in a host and get their food from, or at the expense of, that host. Cysts represent a form of resistance and spread of parasites. The manual diagnosis of microscopic stool images is time-consuming and depends on the human expert. In this paper, we propose an automatic recognition system that can be used to identify various intestinal parasite cysts from their microscopic digital images. We employ image pixel features to train a probabilistic neural network (PNN). Probabilistic neural networks are suitable for classification problems. The main novelty is the use of feature vectors extracted directly from the image pixels. To this end, microscopic images are first segmented to separate the parasite image from the background. The extracted parasite is then resized to a 12x12 image feature vector. For dimensionality reduction, principal component analysis basis projection has been used: the 12x12 extracted features were orthogonalized into two principal component variables that constitute the input vector of the PNN. The PNN is trained using 540 microscopic images of parasites. The proposed approach was tested successfully on 540 samples of protozoan cysts obtained from 9 kinds of intestinal parasites.
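
An illustrative NumPy sketch (not the authors' pipeline) of the PCA step described above: projecting flattened image vectors onto their first two principal components before classification:

```python
import numpy as np

# Illustrative PCA via SVD of the centered data matrix; the top
# principal-component scores would form the classifier's input vector.
def pca_project(X, n_components=2):
    """X: (n_samples, n_features) array -> (n_samples, n_components) scores."""
    Xc = X - X.mean(axis=0)                      # center each feature
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # scores on top components
```

For 12x12 parasite images, `X` would hold the 144-dimensional flattened pixel vectors and the output the 2-dimensional PNN inputs.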

  15. AUTOMATIC HUMAN FACE RECOGNITION USING MULTIVARIATE GAUSSIAN MODEL AND FISHER LINEAR DISCRIMINATIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Surya Kasturi

    2014-08-01

    Full Text Available Face recognition plays an important role in surveillance and biometrics and is a popular application of computer vision. In this paper, color-based skin segmentation is proposed to detect faces, which are then matched with faces from the dataset. The proposed color-based segmentation method is tested in different color spaces to identify a suitable color space for the identification of faces. Based on the sample skin distribution, a Multivariate Gaussian Model is fitted to identify skin regions, from which face regions are detected using connected components. The detected face is matched with a template and verified. The proposed method, Multivariate Gaussian Model - Fisher Linear Discriminative Analysis (MGM-FLDA), is compared with the machine-learning-based Viola-Jones algorithm, and it gives better results in terms of time.
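
A minimal NumPy sketch of the skin model described above, assuming a multivariate Gaussian fitted to sample skin pixels and a squared-Mahalanobis-distance threshold; the threshold value and two-channel color representation are assumptions for illustration:

```python
import numpy as np

# Fit a multivariate Gaussian to sample skin pixels, then classify new
# pixels by thresholding the squared Mahalanobis distance to the mean.
def fit_gaussian(skin_pixels):
    """skin_pixels: (n, d) array of color values from sample skin regions."""
    mu = skin_pixels.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(skin_pixels, rowvar=False))
    return mu, cov_inv

def is_skin(pixel, mu, cov_inv, threshold=9.0):   # threshold is illustrative
    d = pixel - mu
    return float(d @ cov_inv @ d) < threshold
```

Pixels classified as skin would then be grouped with connected components to propose face regions.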

  16. Automatic detection and high resolution fine structure analysis of conic X-ray diffraction lines

    Energy Technology Data Exchange (ETDEWEB)

    Bauch, J.; Henschel, F. [TU Dresden, Institut fuer Werkstoffwissenschaft, 01069 Dresden (Germany); Schulze, M. [TU Dresden, Institut fuer Photogrammetrie und Fernerkundung, 01069 Dresden (Germany)

    2011-05-15

    The presented method demonstrates a first step in the development of a high-resolution ''residual stress microscope'' and, through the implementation of largely automated procedures, facilitates fast detection of diffraction lines in the form of conic sections. It has been implemented for, but is not restricted to, the Kossel technique and the ''X-ray Rotation-Tilt Method'' (XRT). The resulting multifaceted, evaluable database of many X-ray diffraction radiographs can be used not only for the systematic analysis of anomalies in diffraction lines (reflection fine structure), but also for the direct calculation and output of precision residual stress tensors. (copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  17. Using statistical analysis and artificial intelligence tools for automatic assessment of video sequences

    Science.gov (United States)

    Ekobo Akoa, Brice; Simeu, Emmanuel; Lebowsky, Fritz

    2014-01-01

    This paper proposes two novel approaches to Video Quality Assessment (VQA). Both approaches attempt to develop video evaluation techniques capable of replacing human judgment when rating video quality in subjective experiments. The underlying study consists of selecting fundamental quality metrics based on Human Visual System (HVS) models and using artificial intelligence solutions as well as advanced statistical analysis. This new combination enables suitable video quality ratings while taking multiple quality metrics as input. The first method uses a neural-network-based machine learning process. The second method evaluates video quality using a non-linear regression model. The efficiency of the proposed methods is demonstrated by comparing their results with those of existing work done on synthetic video artifacts. The results obtained by each method are compared with scores from a database resulting from subjective experiments.

  18. Process monitoring using automatic physical measurement based on electrical and physical variability analysis

    Science.gov (United States)

    Shauly, Eitan N.; Levi, Shimon; Schwarzband, Ishai; Adan, Ofer; Latinsky, Sergey

    2015-04-01

    A fully automated silicon-based methodology for systematic analysis of electrical features is shown. The system was developed for process monitoring and electrical variability reduction. A mapping step was created by dedicated structures such as a static-random-access-memory (SRAM) array or a standard cell library, or by using a simple design-rule-checking run-set. The resulting database was then used as an input for choosing locations for critical-dimension scanning electron microscope images and for specific layout parameter extraction, which was then input to SPICE compact modeling simulation. Based on the experimental data, we identified two items that must be checked and monitored using the method described here: the transistor's sensitivity to the distance between the poly end cap and the edge of the active area (AA) due to AA rounding, and SRAM leakage due to an N-well placed too close to a P-well. Based on this example, for process monitoring and variability analyses, we extensively used this method to analyze transistor gates having different shapes. In addition, analysis of a large area of a high-density standard cell library was done. Another set of monitoring focused on a high-density SRAM array is also presented. These examples provided information on the poly and AA layers, using transistor parameters such as leakage current and drive current. We successfully defined "robust" and "less-robust" transistor configurations included in the library and identified asymmetrical transistors in the SRAM bit-cells. These data were compared to data extracted from the same devices at the end of the line. Another set of analyses was done on samples after the Cu M1 etch. Process monitoring information on M1-enclosed contacts was extracted based on contact resistance as feedback. Guidelines for the optimal M1 space for different layout configurations were also extracted. All these data showed the successful in-field implementation of our methodology as a useful process monitoring method.

  19. Suitability of UK Biobank Retinal Images for Automatic Analysis of Morphometric Properties of the Vasculature.

    Directory of Open Access Journals (Sweden)

    Thomas J MacGillivray

    Full Text Available To assess the suitability of retinal images held in the UK Biobank (the largest retinal data repository in a prospective population-based cohort) for computer-assisted vascular morphometry, generating measures that are commonly investigated as candidate biomarkers of systemic disease. Non-mydriatic fundus images from both eyes of 2,690 participants, comprising people with a self-reported history of myocardial infarction (n=1,345) and a matched control group (n=1,345), were analysed using VAMPIRE software. These images were drawn from those of 68,554 UK Biobank participants who underwent retinal imaging at recruitment. Four operators were trained in the use of the software to measure retinal vascular tortuosity and bifurcation geometry. Total operator time was approximately 360 hours (4 minutes per image). 2,252 (84%) of participants had at least one image of sufficient quality for the software to process, i.e. there was sufficient detection of retinal vessels in the image by the software to attempt the measurement of the target parameters. 1,604 (60%) of participants had an image of at least one eye that was adequately analysed by the software, i.e. the measurement protocol was successfully completed. Increasing age was associated with a reduced proportion of images that could be processed (p=0.0004) and analysed (p<0.0001). Cases exhibited more acute arteriolar branching angles (p=0.02) as well as lower arteriolar and venular tortuosity (p<0.0001). A proportion of the retinal images in UK Biobank are of insufficient quality for automated analysis. However, the large size of the UK Biobank means that tens of thousands of images are available and suitable for computational analysis. Parametric information measured from the retinas of participants with suspected cardiovascular disease was significantly different from that measured from a matched control group.

  20. Index to the Journal of American Indian Education, Vol. 1, No. 1 - Vol. 8, No. 1.

    Science.gov (United States)

    Loomis, Charlotte Ann

    All articles (112) that appeared in the "Journal of American Indian Education" (JAIE), Vol. 1., No. 1 (June 1961) through Vol. 8, No 1 (October 1968) are indexed and annotated. The publication is divided into 3 parts: (1) annotations listed in order of appearance in JAIE by volume, number, and page; (2) author index; and (3) subject index. Later…

  1. CFD Automatic Analysis Process of Engine Ports

    Institute of Scientific and Technical Information of China (English)

    姜涛; 罗马吉; 向梁山; 宋秀萍

    2012-01-01

    Based on the mature analysis process of engine port CFD, a solution for automating the engine port CFD analysis workflow was proposed and explored, using the Star-ccm+ CFD software and the Java programming language. A simple platform for automatic port CFD analysis was developed, which can improve the efficiency of port CFD analysis to some extent. The solution also provides a reference for automating other CFD analysis workflows.

  2. Assessment of features for automatic CTG analysis based on expert annotation.

    Science.gov (United States)

    Chudácek, Vacláv; Spilka, Jirí; Lhotská, Lenka; Janku, Petr; Koucký, Michal; Huptych, Michal; Bursa, Miroslav

    2011-01-01

    Cardiotocography (CTG), the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), has been used routinely by obstetricians since the 1960s to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the ever-used features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated, and the features are assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. Annotation derived from a panel of experts, instead of the commonly utilized pH values, was used for evaluation of the features on a large data set (552 records). We conclude the paper by presenting the best uncorrelated features and their individual ranks of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, the interval index, as well as Lempel-Ziv complexity and Higuchi's fractal dimension are among the top five features. PMID:22255719

  3. Automatic evaluation of intrapartum fetal heart rate recordings: a comprehensive analysis of useful features

    International Nuclear Information System (INIS)

    Cardiotocography is the monitoring of fetal heart rate (FHR) and uterine contractions (TOCO), used routinely since the 1960s by obstetricians to detect fetal hypoxia. The evaluation of the FHR in clinical settings is based on an evaluation of macroscopic morphological features and so far has managed to avoid adopting any achievements from the HRV research field. In this work, most of the features utilized for FHR characterization, including FIGO, HRV, nonlinear, wavelet, and time and frequency domain features, are investigated and assessed based on their statistical significance in the task of distinguishing the FHR into three FIGO classes. We assess the features on a large data set (552 records) and unlike in other published papers we use three-class expert evaluation of the records instead of the pH values. We conclude the paper by presenting the best uncorrelated features and their individual rank of importance according to the meta-analysis of three different ranking methods. The number of accelerations and decelerations, interval index, as well as Lempel–Ziv complexity and Higuchi's fractal dimension are among the top five features
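
The meta-analysis over three ranking methods can be illustrated with the simplest aggregation scheme, mean rank; this is a sketch under that assumption, not necessarily the aggregation the authors used:

```python
# Illustrative rank aggregation: average each feature's rank across
# several ranking methods and order features by mean rank (1 = best).
def aggregate_ranks(rankings):
    """rankings: list of dicts mapping feature name -> rank (1 = best)."""
    features = rankings[0].keys()
    mean_rank = {f: sum(r[f] for r in rankings) / len(rankings) for f in features}
    return sorted(mean_rank, key=mean_rank.get)  # lowest mean rank first
```

Applied to per-method rankings of FHR features, the head of the returned list would correspond to the top features reported above.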

  4. Automatic unattended sampling and analysis of background levels of C2-C5 hydrocarbons

    Science.gov (United States)

    Mowrer, Jacques; Lindskog, Anne

    As part of the European program for monitoring anthropogenic air pollutants (EUROTRAC), C2-C5 hydrocarbons (gas phase) are being routinely measured at a background station at Rörvik, Sweden. A 2 ℓ air sample is taken every 4 h, and a compressed air standard and a helium blank are analysed daily. The method is based on adsorption of the hydrocarbons onto an active-charcoal-based adsorbent, desorption/cryofocusing onto a capillary trap, and analysis using capillary gas chromatography with a flame ionization detector. A Perma Pure dryer is used to remove water from the sample, and hydrocarbons > C6 are removed using a Tenax adsorbent. The analytical instrument can be left unattended for up to 2 weeks at a time, depending on the consumption of liquid nitrogen and the compressed gases. Baseline or near-baseline resolution is obtained for the 23 hydrocarbons monitored in this study. Reproducibility is 1-2% for the C2-C4 isomers and 2-15% for the C5 isomers. The detection limit is 1-7 pptv. Preliminary mean hydrocarbon concentrations are presented for the period 21 February-9 April 1989.

  5. Automatic Differentiation Variational Inference

    OpenAIRE

    Kucukelbir, Alp; Tran, Dustin; Ranganath, Rajesh; Gelman, Andrew; Blei, David M.

    2016-01-01

    Probabilistic modeling is iterative. A scientist posits a simple model, fits it to her data, refines it according to her analysis, and repeats. However, fitting complex models to large data is a bottleneck in this process. Deriving algorithms for new models can be both mathematically and computationally challenging, which makes it difficult to efficiently cycle through the steps. To this end, we develop automatic differentiation variational inference (ADVI). Using our method, the scientist on...

  6. Automatic Digital Analysis of Chromogenic Media for Vancomycin-Resistant-Enterococcus Screens Using Copan WASPLab.

    Science.gov (United States)

    Faron, Matthew L; Buchan, Blake W; Coon, Christopher; Liebregts, Theo; van Bree, Anita; Jansz, Arjan R; Soucy, Genevieve; Korver, John; Ledeboer, Nathan A

    2016-10-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-acquired infections (HAIs). Studies have shown that active surveillance of high-risk patients for VRE colonization can aid in reducing HAIs; however, these screens generate a significant cost to the laboratory and health care system. Digital imaging capable of differentiating negative and "nonnegative" chromogenic agar can reduce the labor cost of these screens and potentially improve patient care. In this study, we evaluated the performance of the WASPLab Chromogenic Detection Module (CDM) (Copan, Brescia, Italy) software to analyze VRE chromogenic agar and compared the results to technologist plate reading. Specimens collected at 3 laboratories were cultured using the WASPLab CDM and plated to each site's standard-of-care chromogenic media, which included Colorex VRE (BioMed Diagnostics, White City, OR) or Oxoid VRE (Oxoid, Basingstoke, United Kingdom). Digital images were scored using the CDM software after 24 or 40 h of growth, and all manual reading was performed using digital images on a high-definition (HD) monitor. In total, 104,730 specimens were enrolled and automation agreed with manual analysis for 90.1% of all specimens tested, with sensitivity and specificity of 100% and 89.5%, respectively. Automation results were discordant for 10,348 specimens, and all discordant images were reviewed by a laboratory supervisor or director. After a second review, 499 specimens were identified as representing missed positive cultures falsely called negative by the technologist, 1,616 were identified as containing borderline color results (negative result but with no package insert color visible), and 8,234 specimens were identified as containing colorimetric pigmentation due to residual matrix from the specimen or yeast (Candida). Overall, the CDM was accurate at identifying negative VRE plates, which comprised 84% (87,973) of the specimens in this study.
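
The reported sensitivity (100%) and specificity (89.5%) follow the standard screening definitions, which a short sketch makes explicit:

```python
# Standard screening metrics from confusion counts against the
# reference (here, manual plate reading): sensitivity = TP/(TP+FN),
# specificity = TN/(TN+FP).
def sensitivity_specificity(tp, fn, tn, fp):
    return tp / (tp + fn), tn / (tn + fp)
```

With hypothetical counts of 90 true positives, 10 false negatives, 80 true negatives, and 20 false positives, this yields a sensitivity of 0.9 and a specificity of 0.8.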

  8. Statistical analysis of automatically detected ion density variations recorded by DEMETER and their relation to seismic activity

    Directory of Open Access Journals (Sweden)

    Michel Parrot

    2012-04-01

    Full Text Available

    Many examples of ionospheric perturbations observed during large seismic events were recorded by the low-altitude satellite DEMETER. However, there are also ionospheric variations without seismic activity. The present study is devoted to a statistical analysis of the night-time ion density variations. Software was implemented to detect variations in the data before earthquakes world-wide. Earthquakes with magnitudes >4.8 were selected and classified according to their magnitudes, depths and locations (land, close to the coast, or below the sea). For each earthquake, an automatic search for ion density variations was conducted over the 15 days before the earthquake, whenever the track of the satellite orbit was at less than 1,500 km from the earthquake epicenter. This first step provided the variations relative to the background in the vicinity of the epicenter for each of the 15 days before each earthquake. In the second step, comparisons were carried out between the largest variations over the 15 days and the earthquake magnitudes. The statistical analysis is based on calculation of the median values as a function of the various seismic parameters (magnitude, depth, location). A comparison was also carried out with two other databases, where on the one hand the locations of the epicenters were randomly modified, and on the other hand the longitudes of the epicenters were shifted. The results show that the intensities of the ionospheric perturbations are larger prior to the earthquakes than prior to random events, and that the perturbations increase with the earthquake magnitudes.
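
    The median-based comparison described above can be sketched as follows; the variation values are synthetic and purely illustrative:

```python
# Median-based comparison of perturbation intensity before real earthquakes
# versus randomly relocated events, in the spirit of the DEMETER analysis.
# All numbers below are synthetic, for illustration only.
from statistics import median

real_event_variations = [2.1, 3.4, 1.8, 5.0, 2.9, 4.2]    # relative to background
random_event_variations = [1.0, 1.3, 0.8, 1.1, 1.5, 0.9]  # shuffled epicenters

m_real = median(real_event_variations)
m_random = median(random_event_variations)
# Perturbations are larger before real earthquakes than before random events.
print(m_real > m_random)  # True
```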


  9. MAGE (M-file/Mif Automatic GEnerator): A graphical interface tool for automatic generation of Object Oriented Micromagnetic Framework configuration files and Matlab scripts for results analysis

    Science.gov (United States)

    Chęciński, Jakub; Frankowski, Marek

    2016-10-01

    We present a tool for fully-automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without any of the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions complementing it with magnetoresistance and spin-transfer-torque calculations, as well as local magnetization data selection for output. Our software allows for the creation of advanced simulation conditions like simultaneous parameter sweeps and synchronous excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.

  10. AUTOMATIC ANALYSIS AND CLASSIFICATION OF THE ROOF SURFACES FOR THE INSTALLATION OF SOLAR PANELS USING A MULTI-DATA SOURCE AND MULTI-SENSOR AERIAL PLATFORM

    OpenAIRE

    López, L.; Lagüela, S.; Picon, I.; D. González-Aguilera

    2015-01-01

    A low-cost multi-sensor aerial platform, aerial trike, equipped with visible and thermographic sensors is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbour solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, and the temperatures measured on thermographic images are decisive to evaluate the surfaces, slo...

  11. Large Scale Automatic Analysis and Classification of Roof Surfaces for the Installation of Solar Panels Using a Multi-Sensor Aerial Platform

    OpenAIRE

    Luis López-Fernández; Susana Lagüela; Inmaculada Picón; Diego González-Aguilera

    2015-01-01

    A low-cost multi-sensor aerial platform, aerial trike, equipped with visible and thermographic sensors is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbor solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, and the temperatures measured on thermographic images are decisive to evaluate the areas, tilts, ...

  12. Automatic detection of a hand-held needle in ultrasound via phase-based analysis of the tremor motion

    Science.gov (United States)

    Beigi, Parmida; Salcudean, Septimiu E.; Rohling, Robert; Ng, Gary C.

    2016-03-01

    This paper presents an automatic localization method for a standard hand-held needle in ultrasound based on temporal motion analysis of spatially decomposed data. Subtle displacement arising from tremor motion has a periodic pattern which is usually imperceptible in the intensity image but may convey information in the phase image. Our method aims to detect such periodic motion of a hand-held needle and distinguish it from intrinsic tissue motion, using a technique inspired by video magnification. Complex steerable pyramids allow specific design of the wavelets' orientations according to the insertion angle as well as the measurement of the local phase. We therefore use steerable pairs of even and odd Gabor wavelets to decompose the ultrasound B-mode sequence into various spatial frequency bands. Variations of the local phase measurements in the spatially decomposed input data are then temporally analyzed using a finite impulse response bandpass filter to detect regions with a tremor motion pattern. Results obtained from different pyramid levels are then combined and thresholded to generate the binary mask input for the Hough transform, which determines an estimate of the direction angle and discards some of the outliers. Polynomial fitting is used at the final stage to remove any remaining outliers and improve the trajectory detection. The detected needle is finally added back to the input sequence as an overlay of a cloud of points. We demonstrate the efficiency of our approach to detect the needle using subtle tremor motion in an agar phantom and in-vivo porcine cases where intrinsic motion is also present. The localization accuracy was calculated by comparison to expert manual segmentation, and is presented as (mean, standard deviation, root-mean-square error) values of (0.93°, 1.26° and 0.87°) and (1.53 mm, 1.02 mm and 1.82 mm) for the trajectory and the tip, respectively.
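
    The temporal filtering step can be sketched with an FIR band-pass filter applied to a local-phase signal; the frame rate and tremor band used here are assumptions, not values taken from the paper:

```python
# Sketch of the temporal step: band-pass filter a local-phase signal with an
# FIR filter to isolate tremor-band periodic motion and suppress slow
# intrinsic tissue motion. Frame rate and band edges are assumed values.
import numpy as np
from scipy.signal import firwin, filtfilt

fs = 30.0                                   # ultrasound frame rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
tremor = 0.2 * np.sin(2 * np.pi * 8.0 * t)  # 8 Hz hand tremor component
tissue = 0.5 * np.sin(2 * np.pi * 1.0 * t)  # slow intrinsic tissue motion
phase_signal = tremor + tissue              # local phase at one pixel over time

# Zero-phase FIR band-pass (6-12 Hz, assumed tremor band).
taps = firwin(numtaps=61, cutoff=[6.0, 12.0], fs=fs, pass_zero=False)
tremor_band = filtfilt(taps, 1.0, phase_signal)

# The filtered signal closely tracks the tremor component.
print(np.corrcoef(tremor_band, tremor)[0, 1] > 0.9)  # True
```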

  13. Automatic Functional Harmonic Analysis

    OpenAIRE

    de Haas, W.B.; Magalhães, J.P.; Wiering, F.; Veltkamp, R.C.

    2013-01-01

    Music scholars have been studying tonal harmony intensively for centuries, yielding numerous theories and models. Unfortunately, a large number of these theories are formulated in a rather informal fashion and lack mathematical precision. In this article we present HarmTrace, a functional model of Western tonal harmony that builds on well-known theories of tonal harmony. In contrast to other approaches that remain purely theoretical, we present an implemented system that is evaluated empirica...

  14. Automatic Functional Harmonic Analysis

    NARCIS (Netherlands)

    de Haas, W.B.; Magalhães, J.P.; Wiering, F.; Veltkamp, R.C.

    2013-01-01

    Music scholars have been studying tonal harmony intensively for centuries, yielding numerous theories and models. Unfortunately, a large number of these theories are formulated in a rather informal fashion and lack mathematical precision. In this article we present HarmTrace, a functional model of W

  15. A Level 1+ Probabilistic Safety Assessment of the high flux Australian reactor. Vol. 2. Appendix C: System analysis models and results

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    This section contains the results of the quantitative system/top event analysis. Section C.1 gives the basic event coding scheme. Section C.2 shows the master frequency file (MFF), which contains the split fraction names, the top events they belong to, the mean values of the uncertainty distribution that is generated by the Monte Carlo quantification in the System Analysis module of RISKMAN, and a brief description of each split fraction. The MFF is organized by the systems modeled, and within each system, the top events associated with the system. Section C.3 contains the fault trees developed for the system/top event models and the RISKMAN reports for each of the system/top event models. The reports are organized under the following system headings: Compressed/Service Air Supply (AIR); Containment Isolation System (CIS); Heavy Water Cooling System (D2O); Emergency Core Cooling System (ECCS); Electric Power System (EPS); Light Water Cooling System (H2O); Helium Gas System (HE); Mains Water System (MW); Miscellaneous Top Events (MISC); Operator Actions (OPER); Reactor Protection System (RPS); Space Conditioner System (SCS); Condition/Status Switch (SWITCH); RCB Ventilation System (VENT); No. 1 Storage Block Cooling System (SB).

  16. A comparison of conscious and automatic memory processes for picture and word stimuli: a process dissociation analysis.

    Science.gov (United States)

    McBride, Dawn M; Anne Dosher, Barbara

    2002-09-01

    Four experiments were conducted to evaluate explanations of picture superiority effects previously found for several tasks. In a process dissociation procedure (Jacoby, 1991) with word stem completion, picture fragment completion, and category production tasks, conscious and automatic memory processes were compared for studied pictures and words with an independent retrieval model and a generate-source model. The predictions of a transfer appropriate processing account of picture superiority were tested and validated in "process pure" latent measures of conscious and unconscious, or automatic and source, memory processes. Results from both model fits verified that pictures had a conceptual (conscious/source) processing advantage over words for all tasks. The effects of perceptual (automatic/word generation) compatibility depended on task type, with pictorial tasks favoring pictures and linguistic tasks favoring words. Results show support for an explanation of the picture superiority effect that involves an interaction of encoding and retrieval processes. PMID:12435377
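
    The process dissociation procedure (Jacoby, 1991) referenced above separates conscious and automatic estimates from inclusion and exclusion task performance via two standard equations; a minimal sketch with hypothetical probabilities:

```python
# Jacoby's (1991) process dissociation equations, used in studies like this
# one to separate conscious (C) and automatic (A) memory estimates from
# inclusion and exclusion task completion probabilities.

def process_dissociation(p_inclusion, p_exclusion):
    """Solve C and A from inclusion/exclusion completion probabilities.

    Inclusion = C + A*(1 - C)
    Exclusion = A*(1 - C)
    """
    c = p_inclusion - p_exclusion          # conscious contribution
    a = p_exclusion / (1 - c) if c < 1 else 0.0  # automatic contribution
    return c, a

# Hypothetical probabilities: a strong conscious contribution.
c, a = process_dissociation(p_inclusion=0.60, p_exclusion=0.20)
print(round(c, 2), round(a, 2))  # 0.4 0.33
```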

  17. Crisis Communication (Handbooks of Communication Science Vol. 23)

    DEFF Research Database (Denmark)

    Vol. 23 - The Handbook of Communication Science. General editors: Peter J. Schultz and Paul Cobley

  18. Dynamics of structures '89. Vol. 3

    International Nuclear Information System (INIS)

    The proceedings, comprising 3 volumes published by the Plzen Centre of the Czechoslovak Society for Science and Technology (Vol. 1 and 2) and by Skoda Works in Plzen (Vol. 3), contain 107 papers, out of which 8 fall within the INIS Subject Scope; these deal with problems related to the earthquake resistance of nuclear power plants. Attention is paid to the evaluation of seismic characteristics of nuclear power plant equipment, to the equipment testing and to calculations of its dynamic characteristics under simulated seismic stress. (Z.M.)

  19. Automatic Semantic Image Annotation with Granular Analysis Method

    Institute of Scientific and Technical Information of China (English)

    张素兰; 郭平; 张继福; 胡立华

    2012-01-01

    Bridging the semantic gap between low-level visual features and high-level semantic concepts, so as to improve the accuracy of automatic image annotation and in turn satisfy users' image-retrieval needs, has long been the key issue in automatic semantic image annotation research. Granular analysis is a hierarchical and important data-analysis method, which provides a new idea and method for solving complicated problems. The accuracy of automatic image annotation, and hence the efficiency and accuracy of image retrieval, vary with the granularity at which images are understood and analyzed. In this paper, the state-of-the-art models of automatic semantic image annotation are overviewed, the idea and models of granular analysis and its application in the process of automatic semantic image annotation are discussed, granular-analysis-based automatic image annotation methods are investigated, and promising research directions are given.

  20. Soil-structure interaction Vol.3. Influence of ground water

    International Nuclear Information System (INIS)

    This study has been performed for the Nuclear Regulatory Commission (NRC) by the Structural Analysis Division of Brookhaven National Laboratory (BNL). The study was conducted during the fiscal year 1985 on the program entitled 'Benchmarking of Structural Engineering Problems' sponsored by NRC. The program considered three separate but complementary problems, each associated with the soil-structure interaction (SSI) phase of the seismic response analysis of nuclear plant facilities. The reports, all entitled Soil-Structure Interaction, are presented in three separate volumes, namely: Vol. 1, Influence of Layering, by A.J. Philippacopoulos; Vol. 2, Influence of Lift-Off, by C.A. Miller; Vol. 3, Influence of Ground Water, by C.J. Costantino. The two problems presented in Volumes 2 and 3 were conducted at the City University of New York (CUNY) under subcontract to BNL. This report, Volume 3, presents a summary of the first year's effort on the subject of the influence of foundation ground water on the SSI phenomenon. A finite element computer program was developed for the two-phase formulation of the combined soil-water problem. This formulation is based on the Biot dynamic equations of motion for both the solid and fluid phases of a typical soil. Frequency-dependent interaction coefficients were generated for the two-dimensional plane problem of a rigid surface footing moving against a saturated linear soil. The results indicate that the interaction coefficients are significantly modified as compared to the comparable values for a dry soil, particularly for the rocking mode of response. Calculations were made to study the impact of the modified interaction coefficients on the response of a typical nuclear reactor building. The amplification factors for a stick model placed atop a dry and a saturated soil were computed.
It was found that pore water caused the rocking response to decrease and the translational response to increase over the frequency range of interest, as

  1. Proceedings of the third arab conference on the peaceful uses of atomic energy, vol.a,b

    International Nuclear Information System (INIS)

    The publication has been set up as a textbook for peaceful uses of atomic energy vol.A: (1) reactor, materials, energy; (2) nuclear raw materials; (3) radiocesium-waste; (4) nuclear safety; (5) nuclear physics; (6) radiochemistry; (7) radiobiology; vol.B: (1) nuclear medicine; (2) agriculture and soil science; (3) isotope hydrology; (4) food preservation; (5) insect eradication; (6) industrial application; (7) nuclear activation analysis; (8) health physics and environmental studies

  2. Road Safety Data, Collection, Transfer and Analysis DaCoTa. Deliverable 1.5. Vol.1 — Analysis of the stakeholder survey: perceived priority and availability of data and tools and relation to the stakeholders' characteristics. Vol.II: Analysis of Road Safety Management in the European countries.

    OpenAIRE

    Papadimitriou, E.; Yannis, G.; Bijleveld, F.D. & Cardoso, J.L.

    2015-01-01

    Volume I: This report is part of the ‘Policy’ Work Package of the DaCoTA project (www.dacotaproject.eu). The ‘Policy’ Work Package is designed to fill in the gap in knowledge on road safety policy making processes, their institutional framework and the data, methods and technical tools needed to base policy formulation and adoption on scientifically-established evidence. This document provides the results of a detailed analysis of a survey conducted with a large panel of stakeholders. The aim...

  3. Characterization of Tilia species by gas chromatographic profiles of the volatile components extracted from the vapor in equilibrium with the plant material ("headspace analysis")

    OpenAIRE

    Sala, Gabriela; Mandrile, Eloy L.; Cafferata, Lázaro F.R.

    1992-01-01

    The volatile components of the bracts and flowers of 3 species of Tilia (linden) were studied by gas chromatography, using a column packed with "Porapak Q", in order to contribute to their chemotaxonomic characterization. The relative efficiencies of the methods for extracting volatile compounds were compared: steam distillation, distillation under reduced pressure, and static sampling of the chamber occupied by the vapor in equilibrium with the material at 80 °C ("Hea...

  4. Efficacy and complications of ultrasound-guided percutaneous renal biopsy using 18 G automatic biopsy gun in diffuse renal disease: Analysis of 203 cases

    Energy Technology Data Exchange (ETDEWEB)

    Gwon, Dong Il; Lee, Kang Hoon; Song, Kyung Sup; An, Suk Joo; Son, Sang Bum; Kim, Hyeon Suk [The Catholic University of Korea, St. Paul' s Hospital, Seoul (Korea, Republic of); Kim, Jee Young; Kim, Won Young; Park, Young Ha [The Catholic University of Korea, St. Vincent' s Hospital, Suwon (Korea, Republic of)

    2000-12-15

    To evaluate the efficacy and complications of ultrasound-guided percutaneous renal biopsy using an 18 G automatic biopsy gun in patients with diffuse renal disease. 203 ultrasound-guided renal biopsies using an 18 G automatic biopsy gun were performed in 197 patients for the diagnosis of diffuse renal disease. The success and complication rates were retrospectively evaluated by analysis of the pathologic and clinical records and post-procedure ultrasonograms of the patients. Of the 203 renal biopsies, adequate tissue for pathologic diagnosis was obtained in 184 (90.6%). The mean number of needle passes was 2.08, and the mean number of retrieved glomeruli was 7.71 {+-} 4.23. Minor complications occurred in seven biopsies (3.45%), including asymptomatic macroscopic hematuria in five (2.45%) and small subcapsular hematomas in two (1%). No patient required transfusion or surgery because of a biopsy-related complication. Ultrasound-guided percutaneous renal biopsy using an 18 G automatic biopsy gun was an effective method for the pathologic diagnosis of diffuse renal disease and safe, with a low procedure-related complication rate.

  5. Microcomputer-based systems for automatic control of sample irradiation and chemical analysis of short-lived isotopes

    International Nuclear Information System (INIS)

    Two systems resulted from the need for the study of the nuclear decay of short-lived radionuclides. Automation was required for better repeatability, speed of chemical separation after irradiation and for protection from the high radiation fields of the samples. A MCS-8 computer was used as the nucleus of the automatic sample irradiation system because the control system required an extensive multiple-sequential circuit. This approach reduced the sequential problem to a computer program. The automatic chemistry control system is a mixture of a fixed and a computer-based programmable control system. The fixed control receives the irradiated liquid sample from the reactor, extracts the liquid and disposes of the used sample container. The programmable control executes the chemistry program that the user has entered through the teletype. (U.S.)

  6. Comparative analysis of different implementations of a parallel algorithm for automatic target detection and classification of hyperspectral images

    Science.gov (United States)

    Paz, Abel; Plaza, Antonio; Plaza, Javier

    2009-08-01

    Automatic target detection in hyperspectral images is a task that has attracted a lot of attention recently. In the last few years, several algorithms have been developed for this purpose, including the well-known RX algorithm for anomaly detection, or the automatic target detection and classification algorithm (ATDCA), which uses an orthogonal subspace projection (OSP) approach to extract a set of spectrally distinct targets automatically from the input hyperspectral data. Depending on the complexity and dimensionality of the analyzed image scene, the target/anomaly detection process may be computationally very expensive, a fact that limits the possibility of utilizing this process in time-critical applications. In this paper, we develop computationally efficient parallel versions of both the RX and ATDCA algorithms for near real-time exploitation of these algorithms. In the case of ATGP, we use several distance metrics in addition to the OSP approach. The parallel versions are quantitatively compared in terms of target detection accuracy, using hyperspectral data collected by NASA's Airborne Visible Infra-Red Imaging Spectrometer (AVIRIS) over the World Trade Center in New York, five days after the terrorist attack of September 11th, 2001, and also in terms of parallel performance, using a massively parallel Beowulf cluster available at NASA's Goddard Space Flight Center in Maryland.
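
    The RX algorithm mentioned above scores each pixel by its Mahalanobis distance from the image background statistics; a minimal serial NumPy sketch of that computation (the paper parallelizes this kind of per-pixel workload):

```python
# Minimal serial sketch of the RX anomaly detector: score each pixel by its
# Mahalanobis distance from the global background mean and covariance.
import numpy as np

def rx_scores(cube):
    """cube: (rows, cols, bands) hyperspectral image -> per-pixel RX score."""
    rows, cols, bands = cube.shape
    pixels = cube.reshape(-1, bands).astype(float)
    mu = pixels.mean(axis=0)
    cov = np.cov(pixels, rowvar=False)
    cov_inv = np.linalg.inv(cov + 1e-6 * np.eye(bands))  # regularized inverse
    diff = pixels - mu
    # Quadratic form diff^T * cov_inv * diff for every pixel at once.
    scores = np.einsum('ij,jk,ik->i', diff, cov_inv, diff)
    return scores.reshape(rows, cols)

# Synthetic scene: near-uniform background with one spectrally distinct pixel.
rng = np.random.default_rng(0)
scene = rng.normal(0.5, 0.01, size=(8, 8, 5))
scene[4, 4] += 0.5                       # implanted anomaly
scores = rx_scores(scene)
print(np.unravel_index(scores.argmax(), scores.shape))  # (4, 4)
```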

  7. Contribution to automatic speech recognition. Analysis of the direct acoustical signal. Recognition of isolated words and phoneme identification

    International Nuclear Information System (INIS)

    This report deals with the acoustical-phonetic step of automatic speech recognition. The parameters used are the extrema of the acoustical signal (coded in amplitude and duration). This coding method, whose properties are described, is simple and well adapted to digital processing. The quality and the intelligibility of the coded signal after reconstruction are particularly satisfactory. An experiment on the automatic recognition of isolated words has been carried out using this coding system. We have designed a filtering algorithm operating on the parameters of the coding. Thus the characteristics of the formants can be derived under certain conditions, which are discussed. Using these characteristics, the identification of a large part of the phonemes for a given speaker was achieved. Carrying on the studies has required the development of a particular methodology of real-time processing which allowed immediate evaluation of the improvement of the programs. Such processing on temporal coding of the acoustical signal is extremely powerful and could represent, used in connection with other methods, an efficient tool for the automatic processing of speech. (author)

  8. Automatic Spectroscopic Data Categorization by Clustering Analysis (ASCLAN): A Data-Driven Approach for Distinguishing Discriminatory Metabolites for Phenotypic Subclasses.

    Science.gov (United States)

    Zou, Xin; Holmes, Elaine; Nicholson, Jeremy K; Loo, Ruey Leng

    2016-06-01

    We propose a novel data-driven approach aiming to reliably distinguish discriminatory metabolites from nondiscriminatory metabolites for a given spectroscopic data set containing two biological phenotypic subclasses. The automatic spectroscopic data categorization by clustering analysis (ASCLAN) algorithm aims to categorize spectral variables within a data set into three clusters corresponding to noise, nondiscriminatory and discriminatory metabolite regions. This is achieved by clustering each spectral variable based on the r(2) value representing the loading weight of each spectral variable as extracted from an orthogonal partial least-squares discriminant analysis (OPLS-DA) model of the data set. The variables are ranked according to r(2) values and a series of principal component analysis (PCA) models are then built for subsets of these spectral data corresponding to ranges of r(2) values. The Q(2)X value for each PCA model is extracted. K-means clustering is then applied to the Q(2)X values to generate two clusters based on a minimum Euclidean distance criterion. The cluster consisting of lower Q(2)X values is deemed devoid of metabolic information (noise), while the cluster consisting of higher Q(2)X values is further subclustered into two groups based on the r(2) values. We considered the cluster with high Q(2)X but low r(2) values as nondiscriminatory, and the cluster with high Q(2)X and r(2) values as discriminatory variables. The boundaries between these three clusters of spectral variables, on the basis of the r(2) values, were considered the cut-off values for defining the noise, nondiscriminatory and discriminatory variables. We evaluated the ASCLAN algorithm using six simulated (1)H NMR spectroscopic data sets representing small, medium and large data sets (N = 50, 500, and 1000 samples per group, respectively), each with a reduced and full resolution set of variables (0.005 and 0.0005 ppm, respectively). ASCLAN correctly identified all discriminatory
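
    The K-means step on the Q(2)X values can be sketched with a simple one-dimensional implementation (k = 2, minimum Euclidean distance); the Q(2)X values below are synthetic, purely for illustration:

```python
# Sketch of the ASCLAN clustering step: 1-D k-means (k = 2, Euclidean) on
# Q2X values to split noise variables from information-bearing variables.
# The Q2X values here are synthetic, for illustration only.
import numpy as np

def kmeans_1d_two_clusters(values, iters=50):
    values = np.asarray(values, dtype=float)
    centers = np.array([values.min(), values.max()])  # simple initialization
    for _ in range(iters):
        # Assign each value to its nearest center (minimum Euclidean distance).
        labels = np.abs(values[:, None] - centers[None, :]).argmin(axis=1)
        for k in (0, 1):
            if np.any(labels == k):
                centers[k] = values[labels == k].mean()
    return labels, centers

q2x = [0.05, 0.08, 0.10, 0.72, 0.81, 0.77, 0.06]  # low = noise, high = signal
labels, centers = kmeans_1d_two_clusters(q2x)
print(labels.tolist())  # [0, 0, 0, 1, 1, 1, 0]
```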

  9. How to Achieve Automatic Analysis of Test Scores in EXCEL

    Institute of Scientific and Technical Information of China (English)

    王方云

    2012-01-01

    School teaching-administration departments require teachers to perform a statistical analysis of the final-examination scores in their subjects every semester and to produce a score histogram. Done by hand, this work is tedious, inaccurate and error-prone; with EXCEL, the automatic analysis of test scores and the plotting of the score histogram can be carried out precisely. This article describes in detail how to achieve automatic analysis of test scores in EXCEL.
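
    The per-band score counting that the article implements in EXCEL can be sketched in Python; the scores and band edges here are hypothetical:

```python
# Sketch of the score statistics automated in the article: counts per grade
# band, i.e. the data behind a score histogram. Scores and bands are
# hypothetical, for illustration only.
scores = [45, 58, 62, 71, 77, 83, 88, 91, 66, 54]
bands = [(0, 60), (60, 70), (70, 80), (80, 90), (90, 101)]  # [lo, hi) edges

counts = [sum(lo <= s < hi for s in scores) for lo, hi in bands]
print(counts)  # [3, 2, 2, 2, 1]
```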

  10. Z Physics at LEP 1. Vol. 3

    International Nuclear Information System (INIS)

    The contents of this final report from the Workshop on Z Physics at LEP can be divided into two parts. The first part, comprising Vols. 1 and 2, is a relatively concise but fairly complete handbook on the physics of e+e- annihilation near the Z peak (with normal LEP luminosity and unpolarized beams, appropriate for the first phase of LEP operation). The second part (Vol. 3) is devoted to a review of the existing Monte Carlo event generators for LEP physics. A special effort has been made to co-ordinate the different parts of this report, with the aim of achieving a systematic and balanced review of the subject, rather than having simply a collection of separate contributions. (orig.)

  11. Z physics at LEP 1. Vol. 2

    International Nuclear Information System (INIS)

    The contents of this final report from the Workshop on Z Physics at LEP can be divided into two parts. The first part, comprising Vols. 1 and 2, is a relatively concise but fairly complete handbook on the physics of e+e- annihilation near the Z peak (with normal LEP luminosity and unpolarized beams, appropriate for the first phase of LEP operation). The second part (Vol. 3) is devoted to a review of the existing Monte Carlo event generators for LEP physics. A special effort has been made to co-ordinate the different parts of this report, with the aim of achieving a systematic and balanced review of the subject, rather than having simply a collection of separate contributions. (orig.)

  12. Large hadron collider workshop. Proceedings. Vol. 1

    International Nuclear Information System (INIS)

    The aim of the LCH workshop at Aachen was to discuss the 'discovery potential' of a high-luminosity hadron collider (the Large Hadron Collider) and to define the requirements of the detectors. Of central interest was whether a Higgs particle with mass below 1 TeV could be seen using detectors potentially available within a few years from now. Other topics included supersymmetry, heavy quarks, excited gauge bosons, and exotica in proton-proton collisions, as well as physics to be observed in electron-proton and heavy-ion collisions. A large part of the workshop was devoted to the discussion of instrumental and detector concepts, including simulation, signal processing, data acquisition, tracking, calorimetry, lepton identification and radiation hardness. The workshop began with parallel sessions of working groups on physics and instrumentation and continued, in the second half, with plenary talks giving overviews of the LHC project and the SSC, RHIC, and HERA programmes, summaries of the working groups, presentations from industry, and conclusions. Vol. 1 of these proceedings contains the papers presented at the plenary sessions, Vol. 2 the individual contributions to the physics sessions, and Vol. 3 those to the instrumentation sessions. (orig.)

  13. Large hadron collider workshop. Proceedings. Vol. 3

    International Nuclear Information System (INIS)

    The aim of the LHC workshop at Aachen was to discuss the 'discovery potential' of a high-luminosity hadron collider (the Large Hadron Collider) and to define the requirements of the detectors. Of central interest was whether a Higgs particle with mass below 1 TeV could be seen using detectors potentially available within a few years from now. Other topics included supersymmetry, heavy quarks, excited gauge bosons, and exotica in proton-proton collisions, as well as physics to be observed in electron-proton and heavy-ion collisions. A large part of the workshop was devoted to the discussion of instrumental and detector concepts, including simulation, signal processing, data acquisition, tracking, calorimetry, lepton identification and radiation hardness. The workshop began with parallel sessions of working groups on physics and instrumentation and continued, in the second half, with plenary talks giving overviews of the LHC project and the SSC, RHIC, and HERA programmes, summaries of the working groups, presentations from industry, and conclusions. Vol. 1 of these proceedings contains the papers presented at the plenary sessions, Vol. 2 the individual contributions to the physics sessions, and Vol. 3 those to the instrumentation sessions. (orig.)

  14. Large hadron collider workshop. Proceedings. Vol. 2

    International Nuclear Information System (INIS)

    The aim of the LHC workshop at Aachen was to discuss the 'discovery potential' of a high-luminosity hadron collider (the Large Hadron Collider) and to define the requirements of the detectors. Of central interest was whether a Higgs particle with mass below 1 TeV could be seen using detectors potentially available within a few years from now. Other topics included supersymmetry, heavy quarks, excited gauge bosons, and exotica in proton-proton collisions, as well as physics to be observed in electron-proton and heavy-ion collisions. A large part of the workshop was devoted to the discussion of instrumental and detector concepts, including simulation, signal processing, data acquisition, tracking, calorimetry, lepton identification and radiation hardness. The workshop began with parallel sessions of working groups on physics and instrumentation and continued, in the second half, with plenary talks giving overviews of the LHC project and the SSC, RHIC, and HERA programmes, summaries of the working groups, presentations from industry, and conclusions. Vol. 1 of these proceedings contains the papers presented at the plenary sessions, Vol. 2 the individual contributions to the physics sessions, and Vol. 3 those to the instrumentation sessions. (orig.)

  15. Development of automatic extraction of the corpus callosum from magnetic resonance imaging of the head and examination of the early dementia objective diagnostic technique in feature analysis

    International Nuclear Information System (INIS)

    We examined the objective diagnosis of dementia based on changes in the corpus callosum. We examined midsagittal head MR images of 17 early dementia patients (2 men and 15 women; mean age, 77.2±3.3 years) and 18 healthy elderly controls (2 men and 16 women; mean age, 73.8±6.5 years), 35 subjects altogether. First, the corpus callosum was automatically extracted from the MR images. Next, the early dementia patients were compared with the healthy elderly controls using 5 features of the straight-line methods, 5 features of the Run-Length Matrix, and 6 features of the Co-occurrence Matrix from the corpus callosum. Automatic extraction of the corpus callosum showed an accuracy rate of 84.1±3.7%. A statistically significant difference was found in 6 of the 16 features between early dementia patients and healthy elderly controls. Discriminant analysis using these 6 features demonstrated a sensitivity of 88.2% and specificity of 77.8%, with an overall accuracy of 82.9%. These results indicate that feature analysis based on changes in the corpus callosum can be used as an objective diagnostic technique for early dementia. (author)
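The reported rates are consistent with simple counts over the two groups (17 patients, 18 controls). A quick check, using hypothetical confusion counts implied by the rounded percentages (the exact counts are not stated in the abstract):

```python
# Hypothetical confusion counts consistent with the reported rates
# (17 early-dementia patients, 18 healthy controls, 35 subjects):
tp = 15  # patients correctly classified -> sensitivity 15/17
tn = 14  # controls correctly classified -> specificity 14/18

sensitivity = tp / 17
specificity = tn / 18
accuracy = (tp + tn) / 35

print(round(sensitivity * 100, 1))  # -> 88.2
print(round(specificity * 100, 1))  # -> 77.8
print(round(accuracy * 100, 1))     # -> 82.9
```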

  16. Automatic segmentation of diatom images for classification

    NARCIS (Netherlands)

    Jalba, Andrei C.; Wilkinson, Michael H.F.; Roerdink, Jos B.T.M.

    2004-01-01

    A general framework for automatic segmentation of diatom images is presented. This segmentation is a critical first step in contour-based methods for automatic identification of diatoms by computerized image analysis. We review existing results, adapt popular segmentation methods to this difficult p

  17. Algorithms for skiascopy measurement automatization

    Science.gov (United States)

    Fomins, Sergejs; Trukša, Renārs; Krūmiņa, Gunta

    2014-10-01

    An automatic dynamic infrared retinoscope was developed, which allows the procedure to be run at a much higher rate. Our system uses a USB image sensor with up to 180 Hz refresh rate, equipped with a long-focus objective and an 850 nm infrared light-emitting diode as the light source. Two servo motors driven by a microprocessor control the rotation of the semitransparent mirror and the motion of the retinoscope chassis. The image of the eye pupil reflex is captured via software and analyzed along the horizontal plane. An algorithm for automatic analysis of the accommodative state was developed, based on the intensity changes of the fundus reflex.

  18. An Automatic and Dynamic Approach for Personalized Recommendation of Learning Objects Considering Students Learning Styles: An Experimental Analysis

    Directory of Open Access Journals (Sweden)

    Fabiano A. DORÇA

    2016-04-01

    Full Text Available Content personalization in educational systems is a growing research area. Studies show that students tend to perform better when the content is customized according to their preferences. One important aspect of a student's particularities is how he or she prefers to learn. In this context, students' learning styles should be considered, due to the importance of this feature to the adaptivity process in such systems. Thus, this work presents an efficient approach to personalization of the teaching process based on learning styles. Our approach is based on an expert system that implements a set of rules which classifies learning objects according to their teaching style, and then automatically filters learning objects according to students' learning styles. The best-adapted learning objects are ranked and recommended to the student. Preliminary experiments suggest promising results.

  19. Automatic extraction analysis of the anatomical functional area for normal brain 18F-FDG PET imaging

    International Nuclear Information System (INIS)

    Using self-designed software for automatic extraction of brain functional areas, we studied the grey-scale distribution of 18F-FDG imaging and the relationship between the 18F-FDG accumulation of each brain anatomic functional area and the injected 18F-FDG dose, the glucose level, the age, etc. According to the Talairach coordinate system, after rotation, drift and plastic deformation, the 18F-FDG PET image was registered into the Talairach coordinate atlas, and the average grey-scale ratios between each brain anatomic functional area and the whole brain area were calculated. Furthermore, the relationship between the 18F-FDG accumulation of each brain anatomic functional area and the injected dose, the glucose level and the age was tested using a multiple stepwise regression model. After image registration, smoothing and extraction, the main cerebral cortex areas of the 18F-FDG PET brain image could be successfully localized and extracted, such as the frontal lobe, parietal lobe, occipital lobe, temporal lobe, cerebellum, brain ventricles, thalamus and hippocampus. The average ratio to the inner reference over all brain anatomic functional areas was 1.01 ± 0.15. By multiple stepwise regression, with the exception of the thalamus and hippocampus, the grey scale of all brain functional areas was negatively correlated with age, but showed no correlation with blood sugar or dose in any area. For 18F-FDG PET imaging, the brain functional area extraction program could automatically delineate most of the cerebral cortical areas and also successfully support brain blood-flow and metabolic studies, but extraction of more detailed areas needs further investigation
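The inner-reference normalization described above (each area's mean grey value divided by the whole-brain mean) can be sketched as follows; `voi_ratio` and the toy voxel lists are illustrative names and values, not taken from the original software:

```python
from statistics import mean

def voi_ratio(voi_voxels, whole_brain_voxels):
    # Grey-scale ratio of one anatomic functional area (VOI) to the
    # whole-brain mean, i.e. the inner-reference normalization.
    return mean(voi_voxels) / mean(whole_brain_voxels)

# Toy grey values: a VOI whose mean equals the whole-brain mean gives 1.0,
# in line with the reported average ratio of 1.01 +/- 0.15.
print(voi_ratio([105, 95, 100], [100, 100, 100]))  # -> 1.0
```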

  20. Automatic Performance Debugging of SPMD Parallel Programs

    CERN Document Server

    Liu, Xu; Zhan, Jianfeng; Tu, Bibo; Meng, Dan

    2010-01-01

    Automatic performance debugging of parallel applications usually involves two steps: automatic detection of performance bottlenecks and uncovering their root causes for performance optimization. Previous work falls short of resolving this challenging issue in several ways: first, several previous efforts automate analysis processes, but present the results in a confined way that only identifies performance problems with a priori knowledge; second, several tools use exploratory or confirmatory data analysis to automatically discover relevant performance data relationships. However, these efforts do not focus on locating performance bottlenecks or uncovering their root causes. In this paper, we design and implement an innovative system, AutoAnalyzer, to automatically debug the performance problems of single program multi-data (SPMD) parallel programs. Our system is unique in terms of two dimensions: first, without any a priori knowledge, we automatically locate bottlenecks and uncover their root causes for performance o...

  1. Automatic Fiscal Stabilizers

    Directory of Open Access Journals (Sweden)

    Narcis Eduard Mitu

    2013-11-01

    Full Text Available Policies or institutions (built into an economic system) that automatically tend to dampen economic cycle fluctuations in income, employment, etc., without direct government intervention. For example, in boom times, progressive income tax automatically reduces money supply as incomes and spending rise. Similarly, in recessionary times, payment of unemployment benefits injects more money into the system and stimulates demand. Also called automatic stabilizers or built-in stabilizers.

  2. Automatic input rectification

    OpenAIRE

    Long, Fan; Ganesh, Vijay; Carbin, Michael James; Sidiroglou, Stelios; Rinard, Martin

    2012-01-01

    We present a novel technique, automatic input rectification, and a prototype implementation, SOAP. SOAP learns a set of constraints characterizing typical inputs that an application is highly likely to process correctly. When given an atypical input that does not satisfy these constraints, SOAP automatically rectifies the input (i.e., changes the input so that it satisfies the learned constraints). The goal is to automatically convert potentially dangerous inputs into typical inputs that the ...

  3. Contribution to automatic image recognition. Application to analysis of plain scenes of overlapping parts in robot technology

    International Nuclear Information System (INIS)

    A method for object modeling and automatic recognition of overlapped objects is presented. Our work is composed of three essential parts: image processing, object modeling, and evaluation of the implementation of the stated concepts. In the first part, we present a method of edge encoding based on a re-sampling of the data encoded according to Freeman; this method generates an isotropic, homogeneous and very precise representation. The second part relates to object modeling. This important step makes the recognition work much easier. The new method proposed characterizes a model with two groups of information: the description group containing the primitives, and the discrimination group containing data packs called 'transition vectors'. Based on this original method of information organization, a 'relative learning' is able to select, ignore and update the information concerning the objects already learned, according to the new information to be included in the data base. The recognition is a two-pass process: the first pass very efficiently determines the presence of objects by making use of each object's particularities, and this hypothesis is either confirmed or rejected by the following fine verification pass. The last part describes the experimental results in detail. We demonstrate the robustness of the algorithms with images under both poor lighting and object-overlap conditions. The system, named SOFIA, has been installed into an industrial vision system series and works in real time. (author)

  4. Automatic monitoring system for high-steep slope in open-pit mine based on GPS and data analysis

    Science.gov (United States)

    Zhou, Chunmei; Li, Xianfu; Qin, Sunwei; Qiu, Dandan; Wu, Yanlin; Xiao, Yun; Zhou, Jian

    2008-12-01

    Recently, GPS has become more and more widely applied to open-pit mine slope safety monitoring. The Daye Iron Mine open-pit high-steep slope automatic monitoring system mainly consists of three modules, namely, a GPS data processing module, a monitoring and warning module, and an emergency plans module. According to the rock mass structural features and the side slope stability evaluation, seven GPS deformation monitoring points were arranged on the scarp of Fault F9 at Daye Iron Mine, and observation was carried out with a combination of single-frequency static GPS receivers and data-transmission radio. The data processing mainly uses a three-transect interpolation method to solve the problems of discontinuity and effectiveness in the data series. Based on the displacement monitoring data from 1990 to 1996 of Landslide A2 on Shizi mountain in the Daye Iron Mine East Open Pit, the displacement criterion, rate criterion, acceleration criterion, creep-curve tangent angle criterion, etc., of landslide failure were investigated. The results show that Landslide A2 is a collapse-type rock landslide whose movement has three phases, namely a creep stage, an accelerated stage and a destruction stage. The failure criterion differs between stages and between positions at the rear, central and front margins of the landslide. This has important guiding significance for establishing a comprehensive failure criterion for the seven newly installed monitoring points, combining slope deformation and failure with macroscopic evidence.

  5. AUTOMATIC CLASSIFICATION OF X-RATED VIDEOS USING OBSCENE SOUND ANALYSIS BASED ON A REPEATED CURVE-LIKE SPECTRUM FEATURE

    Directory of Open Access Journals (Sweden)

    JaeDeok Lim

    2011-11-01

    Full Text Available This paper addresses the automatic classification of X-rated videos by analyzing their obscene sounds. In this paper, obscene sounds refer to audio signals generated from sexual moans and screams during sexual scenes. By analyzing various sound samples, we determined the distinguishable characteristics of obscene sounds and propose a repeated curve-like spectrum feature that represents the characteristics of such sounds. We constructed 6,269 audio clips to evaluate the proposed feature, and separately constructed 1,200 X-rated and general videos for classification. The proposed feature has an F1-score, precision, and recall rate of 96.6%, 98.2%, and 95.2%, respectively, for the original dataset, and 92.6%, 97.6%, and 88.0% for a noisy dataset of 5 dB SNR. In classifying videos, the feature achieves more than a 90% F1-score, 97% precision, and an 84% recall rate. From the measured performance, X-rated videos can be classified with only the audio features, and the repeated curve-like spectrum feature is suitable for detecting obscene sounds.
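The reported F1-score follows from the stated precision and recall via the usual harmonic mean; recomputing from the rounded percentages gives about 96.7%, matching the reported 96.6% up to rounding of the underlying counts:

```python
def f1(precision, recall):
    # F1 is the harmonic mean of precision and recall.
    return 2 * precision * recall / (precision + recall)

# Clip-level rates reported for the original dataset:
print(round(f1(0.982, 0.952) * 100, 1))  # close to the reported 96.6
```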

  6. Automatic Classification of X-rated Videos using Obscene Sound Analysis based on a Repeated Curve-like Spectrum Feature

    CERN Document Server

    Lim, JaeDeok; Han, SeungWan; Lee, ChoelHoon

    2011-01-01

    This paper addresses the automatic classification of X-rated videos by analyzing their obscene sounds. In this paper, obscene sounds refer to audio signals generated from sexual moans and screams during sexual scenes. By analyzing various sound samples, we determined the distinguishable characteristics of obscene sounds and propose a repeated curve-like spectrum feature that represents the characteristics of such sounds. We constructed 6,269 audio clips to evaluate the proposed feature, and separately constructed 1,200 X-rated and general videos for classification. The proposed feature has an F1-score, precision, and recall rate of 96.6%, 98.2%, and 95.2%, respectively, for the original dataset, and 92.6%, 97.6%, and 88.0% for a noisy dataset of 5 dB SNR. And, in classifying videos, the feature has more than a 90% F1-score, 97% precision, and an 84% recall rate. From the measured performance, X-rated videos can be classified with only the audio features and the repeated curve-like spectrum feature is suitable to...

  7. Large Scale Automatic Analysis and Classification of Roof Surfaces for the Installation of Solar Panels Using a Multi-Sensor Aerial Platform

    Directory of Open Access Journals (Sweden)

    Luis López-Fernández

    2015-09-01

    Full Text Available A low-cost multi-sensor aerial platform, an aerial trike equipped with visible and thermographic sensors, is used for the acquisition of all the data needed for the automatic analysis and classification of roof surfaces regarding their suitability to harbor solar panels. The geometry of a georeferenced 3D point cloud generated from visible images using photogrammetric and computer vision algorithms, together with the temperatures measured on thermographic images, is decisive for evaluating the areas, tilts, orientations and the existence of obstacles, in order to locate the optimal zones inside each roof surface for the installation of solar panels. This information is complemented with an estimation of the solar irradiation received by each surface. In this way, large areas may be efficiently analyzed, yielding as the final result the optimal locations for the placement of solar panels as well as the information (location, orientation, tilt, area and solar irradiation) necessary to estimate the productivity of a solar panel from its technical characteristics.

  8. Physics at LEP2. Vol. 1

    International Nuclear Information System (INIS)

    This is the final report of the Workshop on Physics at LEP2, held at CERN during 1995. The first part of vol. 1 is devoted to aspects of machine physics of particular relevance to experiments, including the energy, luminosity and interaction regions, as well as the measurement of beam energy. The second part of vol. 1 is a relatively concise, but fairly complete, handbook on the physics of e+e- annihilation above the WW threshold and up to √s∼200 GeV. It contains discussions on WW cross-sections and distributions, W mass determination, Standard Model processes, QCD and gamma-gamma physics, as well as aspects of discovery physics, such as Higgs, new particle searches, triple gauge boson couplings and Z'. The second volume contains a review of the existing Monte Carlo generators for LEP2 physics. These include generators for WW physics, QCD and gamma-gamma processes, Bhabha scattering and discovery physics. A special effort was made to co-ordinate the different parts, with a view to achieving a systematic and balanced review of the subject, rather than just publishing a collection of separate contributions. (orig.)

  9. Physics at LEP2. Vol. 2

    International Nuclear Information System (INIS)

    This is the final report of the Workshop on Physics at LEP2, held at CERN during 1995. The first part of vol. 1 is devoted to aspects of machine physics of particular relevance to experiments, including the energy, luminosity and interaction regions, as well as the measurement of beam energy. The second part of vol. 1 is a relatively concise, but fairly complete, handbook on the physics of e+e- annihilation above the WW threshold and up to √s∼200 GeV. It contains discussions on WW cross-sections and distributions, W mass determination, Standard Model processes, QCD and gamma-gamma physics, as well as aspects of discovery physics, such as Higgs, new particle searches, triple gauge boson couplings and Z'. The second volume contains a review of the existing Monte Carlo generators for LEP2 physics. These include generators for WW physics, QCD and gamma-gamma processes, Bhabha scattering and discovery physics. A special effort was made to co-ordinate the different parts, with a view to achieving a systematic and balanced review of the subject, rather than just publishing a collection of separate contributions. (orig.)

  10. Automatic Radiation Monitoring in Slovenia

    International Nuclear Information System (INIS)

    Full text: The automatic radiation monitoring system in Slovenia was started in the early nineties and now comprises measurements of: 1. External gamma radiation: For the time being there are forty-three probes with GM tubes integrated into a common automatic network, operated at the SNSA. The probes measure dose rate in 30-minute intervals. 2. Aerosol radioactivity: Three automatic aerosol stations measure the concentration of artificial alpha and beta activity in the air, gamma-emitting radionuclides, radioactive iodine-131 in the air (in all chemical forms), and natural radon and thoron progeny. 3. Radon progeny concentration: Radon progeny concentration is measured hourly and the results are displayed as equilibrium equivalent concentrations (EEC). 4. Radioactive deposition measurements: As a support to gamma dose rate measurements, the SNSA developed and installed an automatic measuring station for surface contamination equipped with a gamma spectrometry system (with a 3''x3'' NaI(Tl) detector). All data are transferred through different communication pathways to the SNSA. They are collected in 30-minute intervals. Within these intervals the central computer analyses and processes the collected data, and creates different reports. Every month a QA/QC analysis of the data is performed, showing the statistics of acquisition errors and the availability of measuring results. All results are promptly available on our Web pages. The data are checked and sent daily to the EURDEP system at Ispra (Italy) and also to the Austrian, Croatian and Hungarian authorities. (author)

  11. Substituting environmentally relevant flame retardants: assessment fundamentals. Vol. 2: flame-retardant finishings of selected products - applications-focused analysis: state of the art, trends, alternatives; Erarbeitung von Bewertungsgrundlagen zur Substitution umweltrelevanter Flammschutzmittel. Bd. 2: Flammhemmende Ausruestung ausgewaehlter Produkte - anwendungsbezogene Betrachtung: Stand der Technik, Trend, Alternativen

    Energy Technology Data Exchange (ETDEWEB)

    Leisewitz, A.; Schwarz, W.

    2001-04-01

    The study examines the status, trends and alternatives (substitution and reduction potentials) in the use of flame retardants in selected product sectors: construction; electronics and electrical engineering; rail vehicles; textiles/upholstery. In addition, the study characterises thirteen flame retardants in terms of material flows, applications and toxicology/ecotoxicology. Vol. I: Summary overview of flame retardant applications in Germany in 1999/2000; characterisation of 13 flame retardants in terms of substance properties and application-specific characteristics, range of applications and quantities; derivation of assessment fundamentals for flame retardants, focussing on toxicology/ecotoxicology, suitability for closed-loop substance management, and potential for substitution and reduction; summary assessment of 13 flame retardants; summary overview of flame retardant applications. Vol. II: Analysis of flame retardant applications (state of the art, trends, alternatives) in: unsaturated polyester (UP) resins (rail vehicles); polyurethane (PU) insulating foams and one component foams (OCF) (construction sector); plastics for generic uses in electronic and electrical equipment, in casings for electronic and electrical equipment and in printed circuit boards (electronics/electrical engineering); and in upholstery and mattresses (textile applications). Vol. III: Toxicological/ecotoxicological profiles of substances: Decabromodiphenyl oxide; Tetrabromobisphenol A; Bis[pentabromophenyl]ethane; Hexabromocyclododecane; Tris[chloropropyl]phosphate; Resorcinol-bis-diphenylphosphate; N-Hydroxymethyl-3-dimethylphosphonopropionamide; Red phosphorus; Ammonium polyphosphate; Melamine cyanurate; Aluminium trihydroxide; Sodium borate decahydrate; Antimony trioxide. (orig.) [German] Examined are the status, trends and alternatives (substitution and reduction potentials) in the use of flame retardants (FSM) in selected products from: the construction sector, electrical engineering

  12. Automatic Validation of Protocol Narration

    DEFF Research Database (Denmark)

    Bodei, Chiara; Buchholtz, Mikael; Degano, Pierpaolo;

    2003-01-01

    We perform a systematic expansion of protocol narrations into terms of a process algebra in order to make precise some of the detailed checks that need to be made in a protocol. We then apply static analysis technology to develop an automatic validation procedure for protocols. Finally, we...... demonstrate that these techniques suffice for identifying a number of authentication flaws in symmetric key protocols such as Needham-Schroeder, Otway-Rees, Yahalom and Andrew Secure RPC....

  13. Automatic validation of numerical solutions

    DEFF Research Database (Denmark)

    Stauning, Ole

    1997-01-01

    This thesis is concerned with ``Automatic Validation of Numerical Solutions''. The basic theory of interval analysis and self-validating methods is introduced. The mean value enclosure is applied to discrete mappings for obtaining narrow enclosures of the iterates when applying these mappings...... of an integral operator and uses interval Bernstein polynomials for enclosing the solution. Two numerical examples are given, using two orders of approximation and using different numbers of discretization points....
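The mean value enclosure mentioned in the abstract evaluates f(c) + f'(X)*(X - c) over an interval X with midpoint c. Below is a minimal sketch in plain floats (a real self-validating implementation would use outward rounding, which ordinary floating point does not provide; class and function names are illustrative):

```python
class Interval:
    # Bare-bones interval arithmetic: just what the demo below needs.
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)
    def __mul__(self, other):
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))
    def __repr__(self):
        return f"[{self.lo:.4f}, {self.hi:.4f}]"

def mean_value_enclosure(f, df, X, c):
    # f(c) + f'(X) * (X - c) encloses the range of f over X.
    return f(Interval(c, c)) + df(X) * Interval(X.lo - c, X.hi - c)

# Demo with f(x) = x^2 over X = [0.9, 1.1], centred at c = 1.0;
# the true range is [0.81, 1.21] and the enclosure must contain it.
f = lambda x: x * x
df = lambda X: Interval(2 * X.lo, 2 * X.hi)  # derivative 2x is monotone on X
print(mean_value_enclosure(f, df, Interval(0.9, 1.1), 1.0))
```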

  14. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA. Part 1: Introduction

    Directory of Open Access Journals (Sweden)

    Andrea Baraldi

    2012-09-01

    Full Text Available According to existing literature and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA ⊃ GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the degree of automation, accuracy, efficiency, robustness, scalability and timeliness of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. The present first paper provides a multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches that augments similar analyses proposed in recent years. In line with constraints stemming from human vision, this SWOT analysis promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification. Hence, a symbolic deductive pre-attentive vision first stage accomplishes image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the second part of this work a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design); (b) information/knowledge representation; (c) algorithm design; and (d) implementation. As proof-of-concept of symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time Satellite Image Automatic Mapper™ (SIAM™) is selected from existing literature. To the best of these authors’ knowledge, this is the first time a

  15. Multielement X-ray radiometric analysis with application of semiconductor detectors and automatic processing of the results of measurements

    International Nuclear Information System (INIS)

    Problems of complex extraction of useful components from ores of compound composition demand multielement analysis with accuracy sufficient for practical purposes. X-ray radiometric analysis with semiconductor detectors (SD) and processing of the measurement results by means of mini- or micro-computers offers great possibilities here. The present state of detection and computing techniques permits the introduction of such instruments into practical use in the analytical laboratories of mining enterprises. Based on a discussion of the practical tasks in the analysis of different types of ores, the paper formulates the basic principles of multielement X-ray radiometric analysis for industrial purposes. First of all, it is an installation with several channels, and the main requirement in creating such installations is to ensure high reliability and stability of performance. A variant of such an analyzer, constructed with Si(Li) or Ge detector blocks, is given. The possibility of quickly changing the excitation sources, drawn from a set of iron-55, cadmium-109, americium-241 or cobalt-57, ensures effective excitation of elements in the range from calcium to uranium. Some practical methods of analysis are discussed in the paper. They are based on methods of both passive and active experiments at the calibration stages. The accuracy of these methods is sufficient to replace ordinary chemical analysis with radiometric analysis. The application of mini- and micro-computers, permitting the processing of information according to the developed methods of analysis, is discussed. Some examples are given of the practical realization of multielement X-ray radiometric analysis of lead-zinc, copper-molybdenum, lead-barite and some other types of ores, as well as of ore-processing products

  16. An activation analysis system for short-lived radioisotopes including automatic dead-time corrections with a microcomputer

    International Nuclear Information System (INIS)

    A system based on an IBM-PC microcomputer coupled to a Canberra Series 80 multichannel analyser was developed for activation analysis with short-lived radioisotopes. The data transfer program can store up to 77 gamma-ray spectra on a floppy disc. A spectrum analysis program, DVC, was written to determine peak areas interactively, to correct the counting losses, and to calculate elemental concentrations. (author)
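The abstract does not spell out the counting-loss correction used by DVC; a standard non-paralyzable dead-time model (an assumption here, not necessarily the program's actual algorithm) corrects a measured rate m with dead time tau as n = m / (1 - m*tau):

```python
def dead_time_correct(measured_rate, tau):
    # Non-paralyzable dead-time model (assumed; the abstract does not
    # specify the correction): true = measured / (1 - measured * tau)
    return measured_rate / (1.0 - measured_rate * tau)

# e.g. 9000 counts/s measured with a 10 microsecond dead time:
print(round(dead_time_correct(9000, 10e-6)))  # -> 9890
```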

  17. Analysis of volatile compounds of Ilex paraguariensis A. St. - Hil. and its main adulterating species Ilex theizans Mart. ex Reissek and Ilex dumosa Reissek Análise de compostos voláteis de Ilex paraguariensis A. St. - Hil. e suas principais espécies adulterantes Ilex theizans Mart. ex Reissek e Ilex dumosa Reissek

    Directory of Open Access Journals (Sweden)

    Rogério Marcos Dallago

    2011-12-01

    Full Text Available The adulteration of the product Ilex paraguariensis with other Ilex species is a major problem for maté tea producers. In this work, three species of Ilex were evaluated for their volatile composition by headspace solid-phase microextraction coupled to gas chromatography with mass spectrometric detection (HS-SPME/GC-MS). The adulterating species I. dumosa and I. theizans Mart. ex Reissek presented a different profile of volatile organic compounds when compared to I. paraguariensis. The aldehydes methyl-butanal, pentanal, hexanal, heptanal and nonanal were detected only in the adulterating species. This result suggests that such compounds are potential chemical markers for the identification of adulteration and for quality analysis of products based on Ilex paraguariensis.

  18. Automatic fault tree construction with RIKKE - a compendium of examples. Vol. 2

    International Nuclear Information System (INIS)

    This second volume describes the construction of fault trees for systems with loops, including control and safety loops. It also gives a short summary of the event coding scheme used in the FTLIB component model library. (author)

  19. Toward dynamic isotopomer analysis in the rat brain in vivo: automatic quantitation of 13C NMR spectra using LCModel

    OpenAIRE

    Henry, Pierre-Gilles; Oz, Gülin; Provencher, Stephen; Gruetter, Rolf

    2003-01-01

    The LCModel method was adapted to analyze localized in vivo (13)C NMR spectra obtained from the rat brain in vivo at 9.4 T. Prior knowledge of chemical-shifts, J-coupling constants and J-evolution was included in the analysis. Up to 50 different isotopomer signals corresponding to 10 metabolites were quantified simultaneously in 400 microl volumes in the rat brain in vivo during infusion of [1,6-(13)C(2)]glucose. The analysis remained accurate even at low signal-to-noise ratio of the order of...

  20. Automatic Payroll Deposit System.

    Science.gov (United States)

    Davidson, D. B.

    1979-01-01

    The Automatic Payroll Deposit System in Yakima, Washington's Public School District No. 7, directly transmits each employee's salary amount for each pay period to a bank or other financial institution. (Author/MLF)

  1. Development of automatic reactor vessel inspection systems: development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Park, C. H.; Lim, H. T.; Um, B. G. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine reactor vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels must be processed on-line. In addition, the ultrasonic transducer scanning device must be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In this study, RPV ultrasonic testing equipment developed in foreign countries was investigated, the capability of high-speed ultrasonic signal processing hardware was analyzed, and on this basis the ultrasonic signal processing system was designed. In addition, ultrasonic data acquisition and analysis software was developed. 11 refs., 6 figs., 9 tabs. (Author)

  2. Design, Construction and Effectiveness Analysis of Hybrid Automatic Solar Tracking System for Amorphous and Crystalline Solar Cells

    OpenAIRE

    Bhupendra Gupta

    2013-01-01

    This paper concerns the design and construction of a hybrid solar tracking system. The constructed device was implemented by integrating an amorphous and a crystalline solar panel, a three-dimensional freedom mechanism and a microcontroller. The amount of power available from a photovoltaic panel is determined by three parameters: the type of solar tracker, the materials of the solar panel and the intensity of the sunlight. The objective of this paper is to present an analysis on the use of two differ...

  3. Determination of Free and Total Sulfites in Wine using an Automatic Flow Injection Analysis System with Voltammetric Detection

    OpenAIRE

    Gonçalves, Luís Moreira; Pacheco, João Grosso; Magalhães, Paulo Jorge; Rodrigues, José António; Barros, Aquiles Araújo

    2009-01-01

    Abstract An automated Flow Injection Analysis (FIA) system, based on an initial analyte separation by gas diffusion and subsequent determination by square-wave voltammetry (SWV) in a flow cell, is proposed for the determination of the total and free content of sulphur dioxide (SO2) in wine. The proposed method was compared with two iodometric methodologies (the Ripper method and the simplified method commonly used by the wine industry). The developed method showed repeatability (RSD lower ...

  4. A Genetic Algorithms-based Procedure for Automatic Tolerance Allocation Integrated in a Commercial Variation Analysis Software

    OpenAIRE

    2012-01-01

    Indexed by SCOPUS. In the functional design process of a mechanical component, the tolerance allocation stage is of primary importance to make the component itself respond to the functional requirements and to cost constraints. The present state-of-the-art approach to tolerance allocation is based on the use of Statistical Tolerance Analysis (STA) software packages which, by means of Monte Carlo simulation, allow forecasting the result of a set of user-selected geometrical and dimensional toler...

  5. AN AUTOMATIZED IN-PLACE ANALYSIS OF A HEAVY LIFT JACK-UP VESSEL UNDER SURVIVAL CONDITIONS

    Directory of Open Access Journals (Sweden)

    Gil Rama

    2014-08-01

    Full Text Available Heavy lift jack-up vessels (HLJV) are used for the installation of components of large offshore wind farms. A systematic FE analysis is presented for the HLJV THOR (owned by Hochtief Infrastructure GmbH) under extreme weather conditions. A parametric finite element (FE) model and analysis are developed using the ANSYS-APDL programming environment. The analysis comprises static and dynamic nonlinear FE calculations, which are carried out according to the relevant standard (ISO 19905) for in-place analyses of jack-up vessels. Besides strategies of model abstraction, a guide for the determination of the relevant loads is given. In order to calculate the dynamic loads, a single degree of freedom (SDOF) analogy and dynamic nonlinear FE calculations are used. As a result of the detailed determination of dynamic loads and the consideration of soil properties by spring elements, the utilized capacities could be reduced by 28%. This significantly improves the environmental operating limits of the HLJV THOR for the considered load scenario.
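
    The SDOF analogy mentioned above reduces the dynamic response to a dynamic amplification factor (DAF) applied to the quasi-static load. A minimal sketch of that calculation; the frequency ratio and damping values below are hypothetical, not taken from the paper:

```python
import math

# Dynamic amplification factor of a damped single-degree-of-freedom
# (SDOF) system under harmonic excitation:
#   DAF = 1 / sqrt((1 - r^2)^2 + (2*zeta*r)^2)
# where r = excitation frequency / natural frequency and zeta = damping ratio.
# The numbers below are illustrative only.

def daf(r, zeta):
    """Dynamic amplification factor for frequency ratio r and damping zeta."""
    return 1.0 / math.sqrt((1.0 - r * r) ** 2 + (2.0 * zeta * r) ** 2)

static_case = daf(r=0.0, zeta=0.05)   # static loading: DAF is exactly 1
wave_case = daf(r=0.5, zeta=0.05)     # wave excitation below resonance: DAF > 1
```

    Applying the DAF to the wave load in this way is what allows the dynamic part of the in-place check to be handled without a full transient FE run at every step.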

  6. Motion Analysis and Efficiency Calculation for a CR-CR Gear Automatic Transmission

    Institute of Scientific and Technical Information of China (English)

    陈玲琳; 陈奇

    2012-01-01

    To support further structural improvement and performance enhancement, a four-speed automatic transmission based on a CR-CR double planetary gear mechanism is taken as an example. The characteristic equation describing the general law of motion of a single-row planetary gear mechanism is used to analyze the motion of the automatic transmission, and the meshing power method is used to calculate its transmission efficiency. The meshing power method is an approach that first translates the planetary gear mechanism into a fixed-axis gear train and then solves the transmission efficiency problem. Keywords: four-speed automatic transmission; motion analysis; efficiency calculation; meshing power method.
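
    The characteristic equation for a single planetary row referred to above is the Willis relation. A minimal sketch of how it yields a gear ratio; the tooth counts and speeds below are hypothetical, for illustration only:

```python
# Willis characteristic equation for a single planetary gear row:
#   n_sun + K * n_ring = (1 + K) * n_carrier,   K = Z_ring / Z_sun
# Tooth counts and input speed are hypothetical, not from the paper.

def carrier_speed(n_sun, n_ring, z_sun, z_ring):
    """Solve the characteristic equation for the carrier speed (rpm)."""
    k = z_ring / z_sun
    return (n_sun + k * n_ring) / (1 + k)

# A first-gear-like case: ring held stationary, sun driven at 2000 rpm.
n_carrier = carrier_speed(n_sun=2000.0, n_ring=0.0, z_sun=30, z_ring=90)
ratio = 2000.0 / n_carrier  # with the ring fixed, the reduction is 1 + K
```

    With the ring fixed and K = 90/30 = 3, the carrier turns at 500 rpm and the reduction ratio is 4; repeating this for each clutch/brake state gives the speed ratios of all four gears.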

  7. Automatic screening of obstructive sleep apnea from the ECG based on empirical mode decomposition and wavelet analysis

    International Nuclear Information System (INIS)

    This study analyses two different methods to detect obstructive sleep apnea (OSA) during sleep time based only on the ECG signal. OSA is a common sleep disorder caused by repetitive occlusions of the upper airways, which produces a characteristic pattern on the ECG. ECG features, such as the heart rate variability (HRV) and the QRS peak area, contain information suitable for making a fast, non-invasive and simple screening of sleep apnea. Fifty recordings freely available on Physionet have been included in this analysis, subdivided into a training and a testing set. We investigated the possibility of using the recently proposed method of empirical mode decomposition (EMD) for this application, comparing the results with those obtained through the well-established wavelet analysis (WA). By these decomposition techniques, several features were extracted from the ECG signal and complemented with a series of standard HRV time domain measures. The best-performing feature subset, selected through a sequential feature selection (SFS) method, was used as the input of linear and quadratic discriminant classifiers. In this way we were able to classify the signals on a minute-by-minute basis as apneic or non-apneic with different best-subset sizes, obtaining an accuracy of up to 89% with WA and 85% with EMD. Furthermore, 100% correct discrimination of apneic patients from normal subjects was achieved independently of the feature extractor. Finally, the same procedure was repeated by pooling features from standard HRV time domain, EMD and WA together in order to investigate whether the two decomposition techniques could provide complementary features. The obtained accuracy was 89%, similar to that achieved using only wavelet analysis as the feature extractor; however, some complementary features in EMD and WA are evident.
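
    A minimal sketch of the minute-by-minute classification idea, not the authors' pipeline: synthetic two-dimensional feature vectors stand in for the EMD/wavelet/HRV features, and a Fisher linear discriminant separates apneic from non-apneic minutes.

```python
import numpy as np

# Toy stand-in for per-minute ECG features (e.g. a mean-RR-like and an
# SDNN-like measure); the class means and spreads are invented so the
# two classes are clearly separable.
rng = np.random.default_rng(0)
normal = rng.normal([0.8, 0.05], 0.02, size=(50, 2))
apneic = rng.normal([1.0, 0.15], 0.02, size=(50, 2))

X = np.vstack([normal, apneic])
y = np.array([0] * 50 + [1] * 50)

# Fisher linear discriminant: w = S_w^-1 (m1 - m0), threshold midway
# between the projected class means.
m0, m1 = X[y == 0].mean(0), X[y == 1].mean(0)
Sw = np.cov(X[y == 0].T) + np.cov(X[y == 1].T)
w = np.linalg.solve(Sw, m1 - m0)
thresh = w @ (m0 + m1) / 2

pred = (X @ w > thresh).astype(int)   # 1 = apneic minute, 0 = non-apneic
accuracy = (pred == y).mean()
```

    A quadratic discriminant differs only in allowing each class its own covariance; the paper's feature selection step would choose which features enter X.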

  8. GAIT-ER-AID: An Expert System for Analysis of Gait with Automatic Intelligent Pre-Processing of Data

    OpenAIRE

    Bontrager, EL.; Perry, J.; Bogey, R.; Gronley, J.; Barnes, L.; Bekey, G.; Kim, JW

    1990-01-01

    This paper describes the architecture and applications of an expert system designed to identify the specific muscles responsible for a given dysfunctional gait pattern. The system consists of two parts: a data analysis expert system (DA/ES) and a gait pathology expert system (GP/ES). The DA/ES processes raw data on joint angles, foot-floor contact patterns and EMGs from relevant muscles and synthesizes them into a data frame for use by the GP/ES. Various aspects of the intelligent data pre-p...

  9. An environmental friendly method for the automatic determination of hypochlorite in commercial products using multisyringe flow injection analysis.

    Science.gov (United States)

    Soto, N Ornelas; Horstkotte, B; March, J G; López de Alba, P L; López Martínez, L; Cerdá Martín, V

    2008-03-24

    A multisyringe flow injection analysis system was used for the determination of hypochlorite in cleaning agents, by measurement of the native absorbance of hypochlorite at 292 nm. The methodology was based on the selective decomposition of hypochlorite by a cobalt oxide catalyst, giving chloride and oxygen. The difference between the absorbance of the sample before and after its passage through a cobalt oxide column was selected as the analytical signal. As no further reagent was required, this work can be considered a contribution to environmentally friendly analytical chemistry. The entire analytical procedure, including in-line sample dilution in three steps (first, dilution in a stirred miniature vessel; second, dispersion; and third, in-line addition of water), was automated using the multisyringe flow injection technique. The dynamic concentration range was 0.04-0.78 gL(-1) (relative standard deviation lower than 3%), where the extent of the hypochlorite decomposition was 90+/-4%. The proposed method was successfully applied to the analysis of commercial cleaning products. The accuracy of the method was established by iodometric titration. PMID:18328319
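
    The analytical signal described above is the drop in absorbance at 292 nm across the catalyst column. A minimal sketch of how a concentration would follow from that signal; the calibration slope and absorbance readings are hypothetical, not values from the paper:

```python
# Sketch of the measurement principle: the hypochlorite concentration is
# read from a linear calibration of the absorbance difference before and
# after decomposition on the cobalt oxide column. All numbers are
# hypothetical, for illustration only.

def hypochlorite_conc(abs_before, abs_after, slope, intercept=0.0):
    """Concentration (g/L) from a linear calibration of delta-absorbance."""
    delta_a = abs_before - abs_after
    return (delta_a - intercept) / slope

# Hypothetical calibration: 1.25 AU per g/L within the 0.04-0.78 g/L range.
c = hypochlorite_conc(abs_before=0.62, abs_after=0.12, slope=1.25)
```

    Because the blank (post-column) absorbance is measured on the same sample, matrix absorbance at 292 nm largely cancels out of the difference.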

  10. A radar-based regional extreme rainfall analysis to derive the thresholds for a novel automatic alert system in Switzerland

    Science.gov (United States)

    Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis

    2016-06-01

    This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here aims to issue heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds some predefined thresholds, but also as soon as the sum of past and forecast precipitation is larger than threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as the predicted one, such as urban flood early warning systems. The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of
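
    The alert thresholds above are return levels computed from the fitted generalized extreme value (GEV) distribution. A minimal sketch of that computation for a shape parameter xi != 0; the parameter values are hypothetical, not fitted values from the paper:

```python
import math

# GEV return level for a return period of T blocks (xi != 0):
#   z_T = mu + (sigma / xi) * ((-ln(1 - 1/T))**(-xi) - 1)
# mu = location, sigma = scale, xi = shape. Values below are hypothetical.

def gev_return_level(mu, sigma, xi, T):
    """Rainfall amount exceeded on average once every T blocks."""
    y = -math.log(1.0 - 1.0 / T)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Hypothetical fit to 1 h monthly maxima: mu = 20 mm, sigma = 6 mm, xi = 0.1.
z2 = gev_return_level(20.0, 6.0, 0.1, T=2.0)    # 2-block return level
z10 = gev_return_level(20.0, 6.0, 0.1, T=10.0)  # 10-block return level
```

    Longer return periods give higher thresholds, so a region's alert level is simply the return level at the chosen sub-annual return period and duration.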

  11. Histological analysis of tissue structures of the internal organs of steppe tortoises following their exposure to spaceflight conditions while circumnavigating the moon aboard the Zond-7 automatic station

    Science.gov (United States)

    Sutulov, L. S.; Sutulov, Y. L.; Trukhina, L. V.

    1975-01-01

    Tortoises flown around the Moon on the 6-1/2 day voyage of the Zond-7 automatic space station evidently did not suffer any pathological changes to their peripheral blood picture, heart, lungs, intestines, or liver.

  12. Artificial intelligence applied to the automatic analysis of absorption spectra. Objective measurement of the fine structure constant

    CERN Document Server

    Bainbridge, Matthew B

    2016-01-01

    A new and fully-automated method is presented for the analysis of high-resolution absorption spectra (GVPFIT). The method has broad application but here we apply it specifically to the problem of measuring the fine structure constant at high redshift. For this we need objectivity and reproducibility. GVPFIT is also motivated by the importance of obtaining a large statistical sample of measurements of $\Delta\alpha/\alpha$. Interactive analyses are both time-consuming and complex, and automation makes obtaining a large sample feasible. Three numerical methods are unified into one artificial intelligence process: a genetic algorithm that emulates the Darwinian processes of reproduction, mutation and selection, non-linear least-squares with parameter constraints (VPFIT), and Bayesian model averaging. In contrast to previous methodologies, which relied on a particular solution as being the most likely model, GVPFIT plus Bayesian model averaging derives results from a large set of models, and helps overcome systema...

  13. Workshop Arboretum Volčji potok

    Directory of Open Access Journals (Sweden)

    Ana Kučan

    2012-01-01

    Full Text Available From its constitution onwards, the Volčji Potok Arboretum has been caught between various conflicting orientations. It is a scientific, research and educational institution; a cultural monument with exquisite garden and landscape design features, areas of great natural value and built cultural heritage; and a commercial venue. At the same time, it functions as a park and an area for mass events, as well as a garden centre and nursery. This variety of functions has helped the Arboretum survive the pressures of time; however, partial and uncoordinated interventions have threatened its original mission and its image and generated a number of conflicting situations. The workshop, organised on the initiative of the Institute for the Protection of Cultural Heritage of Slovenia and involving mixed groups of students from the Faculty of Architecture and from the Department of Landscape Architecture of the Biotechnical Faculty, generated eight proposals to solve some of the most urgent problems by introducing optimised development with clearly defined goals and priorities.

  14. Automatically produced FRP beams with embedded FOS in complex geometry: process, material compatibility, micromechanical analysis, and performance tests

    Science.gov (United States)

    Gabler, Markus; Tkachenko, Viktoriya; Küppers, Simon; Kuka, Georg G.; Habel, Wolfgang R.; Milwich, Markus; Knippers, Jan

    2012-04-01

    The main goal of the presented work was to develop a multifunctional beam composed of fiber reinforced plastics (FRP) and an embedded optical fiber with various fiber Bragg grating sensors (FBG). These beams are developed for use as structural members for bridges or industrial applications. It is now possible to realize large-scale cross sections; the embedding is part of a fully automated process, and jumpers can be omitted so as not to negatively influence the laminate. The development includes the smart placement and layout of the optical fibers in the cross section, reliable strain transfer, and finally the coupling of the embedded fibers after production. Micromechanical tests and analyses were carried out to evaluate the performance of the sensor. The work was funded by the German ministry of economics and technology (funding scheme ZIM). Next to the authors of this contribution, Melanie Book with Röchling Engineering Plastics KG (Haren/Germany) and Katharina Frey with SAERTEX GmbH & Co. KG (Saerbeck/Germany) were part of the research group.

  15. Development of automatic reactor vessel inspection systems; development of data acquisition and analysis system for the nuclear vessel weld

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jong Po; Park, C. H.; Kim, H. T.; Noh, H. C.; Lee, J. M.; Kim, C. K.; Um, B. G. [Research Institute of KAITEC, Seoul (Korea)

    2002-03-01

    The objective of this project is to develop an automated ultrasonic data acquisition and data analysis system to examine heavy vessel welds. In order to examine nuclear vessel welds, including the reactor pressure vessel (RPV), a huge amount of ultrasonic data from 6 channels must be processed on-line. In addition, the ultrasonic transducer scanning device must be remotely controlled, because the working place is a high-radiation area. This kind of automated ultrasonic testing equipment has not yet been developed domestically. In order to develop an automated ultrasonic testing system, RPV ultrasonic testing equipment developed in foreign countries was investigated and the capability of high-speed ultrasonic signal processing hardware was analyzed. In this study, the ultrasonic signal processing system was designed and ultrasonic data acquisition software was developed. The new systems were tested on the RPV welds of Ulchin Unit 6 to confirm their functions and capabilities. They worked very well as designed and the tests were successfully completed. 13 refs., 34 figs., 11 tabs. (Author)

  16. Development of automatic surveillance of animal behaviour and welfare using image analysis and machine learned segmentation technique.

    Science.gov (United States)

    Nilsson, M; Herlin, A H; Ardö, H; Guzhva, O; Åström, K; Bergsten, C

    2015-11-01

    In this paper the feasibility of extracting the proportion of pigs located in different areas of a pig pen by advanced image analysis techniques is explored and discussed for possible applications. For example, pigs generally locate themselves in the wet dunging area at high ambient temperatures in order to avoid heat stress, as wetting the body surface is the major path to dissipate heat by evaporation. Thus, the portions of pigs in the dunging area and resting area, respectively, could be used as an indicator of failure to control the climate in the pig environment, as pigs are not supposed to rest in the dunging area. The computer vision methodology utilizes a learning-based segmentation approach using several features extracted from the image. The learning-based approach applied rests on extended state-of-the-art features in combination with a structured prediction framework based on a logistic regression solver using elastic net regularization. In addition, the method is able to produce a probability per pixel rather than form a hard decision. This overcomes some of the limitations found in a setup using grey-scale information only. The pig pen is a difficult imaging environment because of challenging lighting conditions like shadows, poor lighting and poor contrast between pig and background. In order to test practical conditions, a pen containing nine young pigs was filmed from a top-view perspective by an Axis M3006 camera with a resolution of 640 × 480 in three 10-min sessions under different lighting conditions. The results indicate that a learning-based method improves, in comparison with greyscale methods, the ability to reliably identify the proportions of pigs in different areas of the pen. Pigs with a changed behaviour (location) in the pen may indicate changed climate conditions. Changed individual behaviour may also indicate inferior health or acute illness. PMID:26189971
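
    A minimal sketch of the per-pixel probability idea described above, not the authors' trained model: a logistic function maps each pixel's feature vector to a pig/background probability instead of a hard decision. The feature values and weights below are hypothetical.

```python
import numpy as np

# Per-pixel logistic model: probability = sigmoid(w . features + b).
# In the paper the weights come from an elastic-net-regularized logistic
# regression solver; here they are invented for illustration.

def pixel_probabilities(features, weights, bias):
    """features: (H, W, F) array of per-pixel features -> (H, W) probabilities."""
    logits = features @ weights + bias
    return 1.0 / (1.0 + np.exp(-logits))

h, w, f = 4, 4, 3
rng = np.random.default_rng(1)
feats = rng.uniform(0.0, 1.0, size=(h, w, f))   # stand-in feature maps
prob = pixel_probabilities(feats, weights=np.array([2.0, -1.0, 0.5]), bias=-0.5)

# The probability map stays in (0, 1); thresholding gives a segmentation mask,
# and the mask restricted to a pen area gives the proportion of pig pixels there.
mask = prob > 0.5
```

    Keeping the probabilities (rather than the mask) is what lets downstream steps weight uncertain pixels less, which is the advantage over a hard grey-scale threshold.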

  17. Recommended number of strides for automatic assessment of gait symmetry and regularity in above-knee amputees by means of accelerometry and autocorrelation analysis

    Directory of Open Access Journals (Sweden)

    Tura Andrea

    2012-02-01

    Full Text Available Abstract. Background: Symmetry and regularity of gait are essential outcomes of gait retraining programs, especially in lower-limb amputees. This study aims to present an algorithm to automatically compute symmetry and regularity indices, and to assess the minimum number of strides for appropriate evaluation of gait symmetry and regularity through autocorrelation of acceleration signals. Methods: Ten transfemoral amputees (AMP) and ten control subjects (CTRL) were studied. Subjects wore an accelerometer and were asked to walk for 70 m at their natural speed (twice). Reference values of the step and stride regularity indices (Ad1 and Ad2) were obtained by autocorrelation analysis of the vertical and antero-posterior acceleration signals, excluding the initial and final strides. The Ad1 and Ad2 coefficients were then computed at different stages by analyzing increasing portions of the signals (considering both the signals cleaned of initial and final strides and the whole signals). At each stage, the difference between the Ad1 and Ad2 values and the corresponding reference values was compared with the minimum detectable difference (MDD) of the index. If that difference was less than the MDD, it was assumed that the portion of signal used in the analysis was of sufficient length to allow reliable estimation of the autocorrelation coefficient. Results: All Ad1 and Ad2 indices were lower in AMP than in CTRL. Conclusions: Without the need to identify and eliminate the phases of gait initiation and termination, twenty strides can provide a reasonable amount of information to reliably estimate gait regularity in transfemoral amputees.
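
    A minimal sketch of the autocorrelation-based indices described above: the normalized autocorrelation of a vertical acceleration signal, read at the step lag (Ad1) and the stride lag (Ad2). The synthetic, perfectly regular gait signal and its timing parameters are illustrative, not study data.

```python
import numpy as np

def autocorr(x):
    """Biased autocorrelation of x, normalized so that lag 0 equals 1."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:]
    return r / r[0]

fs = 100                                 # Hz, assumed sampling rate
t = np.arange(0, 20, 1 / fs)
step, stride = 0.5, 1.0                  # s: a perfectly regular synthetic gait
# Step-frequency component plus a weaker stride-frequency component,
# mimicking the dominant harmonics of vertical trunk acceleration.
acc = np.sin(2 * np.pi * t / step) + 0.3 * np.sin(2 * np.pi * t / stride)

r = autocorr(acc)
ad1 = r[int(step * fs)]                  # step regularity index
ad2 = r[int(stride * fs)]                # stride regularity index
```

    For a regular gait both coefficients are high; asymmetry between left and right steps lowers Ad1 relative to Ad2, which is why the two lags are read separately.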

  18. Identification of Biocontrol Bacteria against Soybean Root Rot with the Biolog Automatic Microbiology Analysis System

    Institute of Scientific and Technical Information of China (English)

    许艳丽; 刘海龙; 李春杰; 潘凤娟; 李淑娴; 刘新晶

    2012-01-01

    In order to determine the taxonomic position of two biocontrol bacteria against soybean root rot, strains B021a and B04b were identified using traditional morphological identification together with the BIOLOG automatic microbiology analysis system. The results showed that the similarity value of strain B021a with Vibrio tubiashii was 0.634, with a probability of 86% and a genetic distance of 4.00, and that the similarity value of strain B04b with Pasteurella trehalosi was 0.610, with a probability of 75% and a genetic distance of 2.77. On the basis of colony morphological properties and the BIOLOG analysis, strain B021a was identified as Vibrio tubiashii and strain B04b as Pasteurella trehalosi.

  19. Nuclear Reactor RA Safety Report, Vol. 16, Maximum hypothetical accident

    International Nuclear Information System (INIS)

    Fault tree analysis of the maximum hypothetical accident covers the basic elements: accident initiation, accident development phases, and the scheme of possible accident flow. The initiating cause of the accident is a break of a primary cooling pipe in the heavy water system. Loss of primary coolant causes a loss of pressure in the primary circuit at the coolant inlet of the reactor vessel. This initiates the safety protection system, which should automatically shut down the reactor. Separate chapters are devoted to: after-heat removal; coolant and moderator loss; accident effects on the reactor core; effects in the reactor building; and the release of radioactive wastes

  20. Automatic text summarization

    CERN Document Server

    Torres Moreno, Juan Manuel

    2014-01-01

    This new textbook examines the motivations for, and the different algorithms of, automatic document summarization (ADS), and presents a recent state of the art. The book shows the main problems of ADS, the difficulties involved, and the solutions provided by the community. It presents recent advances in ADS, as well as current applications and trends. The approaches are statistical, linguistic and symbolic. Several examples are included in order to clarify the theoretical concepts. The books currently available in the area of automatic document summarization are not recent. Powerful algorithms have been develop
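
    As a toy illustration of the statistical approach mentioned above (not an algorithm from the book): an extractive summarizer that scores each sentence by the corpus frequency of its content words and keeps the top-scoring ones. Sentence splitting, the length-4 content-word filter, and the sample text are all simplifying assumptions.

```python
import re
from collections import Counter

def summarize(text, n_sentences=1):
    """Return the n_sentences highest-scoring sentences of text."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Crude content-word filter: lowercase alphabetic tokens of length >= 4.
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if len(w) >= 4]
    freq = Counter(words)

    def score(sentence):
        toks = [w for w in re.findall(r"[a-z']+", sentence.lower()) if len(w) >= 4]
        return sum(freq[t] for t in toks) / max(len(toks), 1)

    return sorted(sentences, key=score, reverse=True)[:n_sentences]

doc = ("Automatic summarization selects the most informative sentences. "
       "Frequency-based summarization scores each sentence by how often its words occur. "
       "My cat slept through the afternoon.")
top = summarize(doc, 1)
```

    Linguistic and symbolic approaches replace the frequency score with syntactic, discourse or knowledge-based evidence, but the extract-and-rank skeleton is the same.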

  1. Automatic Camera Control

    DEFF Research Database (Denmark)

    Burelli, Paolo; Preuss, Mike

    2014-01-01

    Automatically generating computer animations is a challenging and complex problem with applications in games and film production. In this paper, we investigate how to translate a shot list for a virtual scene into a series of virtual camera configurations, i.e. automatically controlling the virtual camera. We approach this problem by modelling it as a dynamic multi-objective optimisation problem and show how this metaphor allows a much richer expressiveness than a classical single-objective approach. Finally, we showcase the application of a multi-objective evolutionary algorithm to generate a shot...

  2. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    Science.gov (United States)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed in which the NetVISA automatic events that are in addition to the Global Association (GA) results are presented to the analysts. We report on a series of tests where the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set was performed and showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the occurrence of the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: N. S. Arora, S. Russell and E. Sudderth, Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  3. Research on an Automatic Extraction and Analysis System for College Student Information

    Institute of Scientific and Technical Information of China (English)

    聂明辉

    2013-01-01

    By the third stage of digital campus construction, universities have begun large-scale system integration and data sharing. Through a shared data platform, and using data extraction and cleaning technologies, we can selectively obtain the information we need, and thereby understand and evaluate students' campus life and learning more comprehensively and objectively. Based on ODI technology, MapReduce technology and data warehouse technology, this paper proposes and constructs a system model for the automatic extraction and analysis of information on university students' study and life on campus, providing a feasible technical scheme for universities to manage students more intelligently in an information-rich environment.

  4. Automatic Analysis and Mining of Remote Sensing Big Data

    Institute of Scientific and Technical Information of China (English)

    李德仁; 张良培; 夏桂松

    2014-01-01

    With the diversification of imaging methods and the growing categories, quantity and observation frequency of remote sensing data, the ability to observe land cover has reached an unprecedented level, which means a new era of big data in remote sensing is coming. However, the existing methods and processing techniques cannot fulfill the needs of big data applications in remote sensing. Thus, developing automatic analysis and mining theory and techniques for remote sensing big data is among the most advanced international research areas. This paper investigates and analyses the domestic and overseas research status and progress in this field and points out its key problems and developing trends.

  5. Comparative Analysis of Results with Different Automatic Blood Coagulation Analyzers

    Institute of Scientific and Technical Information of China (English)

    李玲; 刘文康; 詹颉; 解娟; 李博; 任健康

    2012-01-01

    Objective: To analyze the comparability of results from different automatic blood coagulation analyzers and to estimate their clinical acceptability, in order to standardize the results across analyzers. Methods: The values of PT, INR, APTT, FIB and TT were obtained and compared in fresh blood samples and in quality controls with instrument-matched assigned values on SYSMEX CA-1500 and CA-7000 analyzers for 30 consecutive days. Results: The coefficients of variation of PT, INR, APTT, FIB and TT in the inter-day quality controls on the SYSMEX CA-1500 and CA-7000 were all less than 5%. For fresh blood samples, t-tests of PT, INR, APTT, FIB and TT all gave P values greater than 0.05, the correlation coefficients r ranged from 0.993 to 0.999, and the bias between the two instruments met one half of the US CLIA '88 proficiency testing quality requirements. Conclusion: The PT, INR, APTT, FIB and TT results of the two instruments correlate well and show no statistically significant differences. Comparative analysis of different coagulation analyzers not only allows timely detection of systematic instrument errors but also ensures consistent results, providing the clinic with accurate and reliable laboratory results and a unified standard for the diagnosis of disease and the observation of therapeutic effect.

  6. Analysis and Improvement of the Ladle Automatic Opening Rate in the VD Process

    Institute of Scientific and Technical Information of China (English)

    代刚; 储晓明

    2015-01-01

    At the EAF plant of Jiangsu Steel Group, the ladle automatic opening rate dropped from nearly 100% to almost zero when the production route was changed from EAF-LF-CCM to EAF-LF-VD-CCM. To solve this problem, the reasons for the reduced automatic opening rate were analyzed and corresponding improvements were made, raising the ladle automatic opening rate to a stable level above 89%.

  7. Proceedings of the 1. Arabic conference on chemical applications (Chemia 2). Vol. 2

    International Nuclear Information System (INIS)

    The conference on chemical applications was held on 1-5 Nov 1997 in Cairo. This second volume covers the papers presented on chemical applications to nuclear materials.

  8. Dynamics of structures '89. Vol. 1 and 2

    International Nuclear Information System (INIS)

    The proceedings, comprising 3 volumes published by the Plzen Centre of the Czechoslovak Society for Science and Technology (Vol. 1 and 2) and by Skoda Works in Plzen (Vol. 3), contain 107 papers, out of which 8 fall within the INIS Subject Scope; these deal with problems related to the earthquake resistance of nuclear power plants. Attention is paid to the evaluation of seismic characteristics of nuclear power plant equipment, to the equipment testing and to calculations of its dynamic characteristics under simulated seismic stress. (Z.M.)

  9. Analysis of Existing Automatic Dispensing Patterns in the Hospital Outpatient Pharmacy

    Institute of Scientific and Technical Information of China (English)

    梁茂本; 王国如; 吕新颜

    2015-01-01

    Objective: To promote the development of automatic dispensing patterns in the hospital outpatient pharmacy. Methods: The outpatient pharmacies of several domestic class-3A general hospitals and several manufacturers of automatic pharmacy systems were visited and, with reference to the pertinent literature, the existing automatic dispensing patterns were analyzed. Results and Conclusion: The advanced automatic dispensing pattern for an outpatient pharmacy is an automatic, intelligent system that integrates real-time dispensing, intelligent pre-allocated dispensing and self-service dispensing. It improves dispensing accuracy, reduces labor intensity, and reflects the direction in which the modern pharmacy in our country is being built and developed.

  10. Automatic summarising factors and directions

    CERN Document Server

    Jones, K S

    1998-01-01

    This position paper suggests that progress with automatic summarising demands a better research methodology and a carefully focussed research strategy. In order to develop effective procedures it is necessary to identify and respond to the context factors, i.e. input, purpose, and output factors, that bear on summarising and its evaluation. The paper analyses and illustrates these factors and their implications for evaluation. It then argues that this analysis, together with the state of the art and the intrinsic difficulty of summarising, imply a nearer-term strategy concentrating on shallow, but not surface, text analysis and on indicative summarising. This is illustrated with current work, from which a potentially productive research programme can be developed.

  11. Proceedings of the second international conference on environmental impact assessment of all economical activities. Vol. 1

    International Nuclear Information System (INIS)

    Proceedings of the conference consist of 3 volumes: Vol. 1 - 'Environmental Impact Assessment of all Economical Activities including Industry'; Vol. 2 - 'Air Pollution Control and Prevention'; Vol. 3 - 'Waste Management and Environmental Problems in Construction Industry'. Out of 32 papers contained in Vol. 1, 2 were inputted to INIS. They deal with models of radionuclide transport in food chains and the use of aerial monitoring in the study of environmental contamination. (Z.S.)

  12. Proceedings of the second international conference on environmental impact assessment of all economical activities. Vol. 2

    International Nuclear Information System (INIS)

    Proceedings of the conference consist of 3 volumes: Vol. 1 - 'Environmental Impact Assessment of all Economical Activities including Industry'; Vol. 2 - 'Air Pollution Control and Prevention'; Vol. 3 - 'Waste Management and Environmental Problems in Construction Industry'. Out of 32 papers contained in Vol. 2, 4 were inputted to INIS. They deal with nuclear fusion as a potential energy source, with environmental aspects of disposal of ashes from power plants in the Czech Republic, and with land reclamation after mining activities. (Z.S.)

  13. Automatic trend estimation

    CERN Document Server

    Vamoș, Călin

    2013-01-01

    Our book introduces a method to evaluate the accuracy of trend estimation algorithms under conditions similar to those encountered in real time series processing. This method is based on Monte Carlo experiments with artificial time series numerically generated by an original algorithm. The second part of the book contains several automatic algorithms for trend estimation and time series partitioning. The source codes of the computer programs implementing these original automatic algorithms are given in the appendix and will be freely available on the web. The book contains clear statement of the conditions and the approximations under which the algorithms work, as well as the proper interpretation of their results. We illustrate the functioning of the analyzed algorithms by processing time series from astrophysics, finance, biophysics, and paleoclimatology. The numerical experiment method extensively used in our book is already in common use in computational and statistical physics.
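    The numerical-experiment method the book describes can be illustrated with a minimal sketch: generate artificial series (a known linear trend plus Gaussian noise), apply a trend estimator, and measure its accuracy over many Monte Carlo runs. The estimator, series model and parameters here are illustrative choices, not the book's own algorithms:

```python
import random
from statistics import mean

def ols_slope(y):
    """Ordinary least-squares slope of y against t = 0..n-1 (a basic trend estimator)."""
    n = len(y)
    tm = (n - 1) / 2.0
    ym = mean(y)
    num = sum((t - tm) * (yt - ym) for t, yt in enumerate(y))
    den = sum((t - tm) ** 2 for t in range(n))
    return num / den

def monte_carlo_rmse(true_slope=0.05, n=200, sigma=1.0, runs=500, seed=1):
    """RMSE of the estimated slope over artificial series: known trend + Gaussian noise."""
    rng = random.Random(seed)
    sq_errors = []
    for _ in range(runs):
        y = [true_slope * t + rng.gauss(0.0, sigma) for t in range(n)]
        sq_errors.append((ols_slope(y) - true_slope) ** 2)
    return mean(sq_errors) ** 0.5

print(monte_carlo_rmse())
```

    Because the true slope of the artificial series is known, the RMSE directly measures the estimator's accuracy under the chosen noise level and series length.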

  14. Automatic image processing as a means of safeguarding nuclear material

    International Nuclear Information System (INIS)

    Problems involved in computerized analysis of pictures taken by automatic film or video cameras in the context of international safeguards implementation are described. They include technical ones as well as the need to establish objective criteria for assessing image information. In the near future automatic image processing systems will be useful in verifying the identity and integrity of IAEA seals. (author)

  15. An automatic hinge system for leg orthoses

    NARCIS (Netherlands)

    Rietman, J.S.; Goudsmit, J.; Meulemans, D.; Halbertsma, J.P.K.; Geertzen, J.H.B.

    2004-01-01

    This paper describes a new, automatic hinge system for leg orthoses, which provides knee stability in stance and allows knee flexion during swing. Indications for the hinge system are a paresis or paralysis of the quadriceps muscles. Instrumented gait analysis was performed in three patients fitted with the hinge system.

  16. Automatic Estimation of Movement Statistics of People

    DEFF Research Database (Denmark)

    Ægidiussen Jensen, Thomas; Rasmussen, Henrik Anker; Moeslund, Thomas B.

    2012-01-01

    Automatic analysis of how people move about in a particular environment has a number of potential applications. However, no system has so far been able to do detection and tracking robustly. Instead, trajectories are often broken into tracklets. The key idea behind this paper is based around...

  17. Automatic Program Reports

    OpenAIRE

    Lígia Maria da Silva Ribeiro; Gabriel de Sousa Torcato David

    2007-01-01

    To profit from the data collected by the SIGARRA academic IS, a systematic set of graphs and statistics has been added to it and is available on-line. This analytic information can be automatically included in a flexible yearly report for each program as well as in a synthesis report for the whole school. Some difficulties in the interpretation of some graphs led to the definition of new key indicators and the development of a data warehouse across the university where effective data consolidation...

  18. Automatic food decisions

    DEFF Research Database (Denmark)

    Mueller Loose, Simone

    Consumers' food decisions are to a large extent shaped by automatic processes, which are either internally directed through learned habits and routines or externally influenced by context factors and visual information triggers. Innovative research methods such as eye tracking, choice experiments and food diaries allow us to better understand the impact of unconscious processes on consumers' food choices. Simone Mueller Loose will provide an overview of recent research insights into the effects of habit and context on consumers' food choices.

  19. Support vector machine for automatic pain recognition

    Science.gov (United States)

    Monwar, Md Maruf; Rezaei, Siamak

    2009-02-01

    Facial expressions are a key index of emotion, and the interpretation of such expressions of emotion is critical to everyday social functioning. In this paper, we present an efficient video analysis technique for recognizing a specific expression, pain, from human faces. We employ an automatic face detector which detects faces in the stored video frames using a skin-color modeling technique. For pain recognition, location and shape features of the detected faces are computed. These features are then used as inputs to a support vector machine (SVM) for classification. We compare the results with neural-network-based and eigenimage-based automatic pain recognition systems. The experimental results indicate that using a support vector machine as the classifier can certainly improve the performance of an automatic pain recognition system.
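    As a toy illustration of the classification stage, the sketch below trains a linear SVM by stochastic sub-gradient descent on the hinge loss and classifies two-dimensional feature vectors. The features, data and training scheme are invented for illustration and are not the authors' pipeline (which uses face location and shape features extracted from video):

```python
# Linear SVM trained by stochastic sub-gradient descent on the hinge loss.
def train_linear_svm(xs, ys, eta=0.1, lam=0.01, epochs=500):
    """Minimize lam/2*||w||^2 + mean hinge loss; labels must be -1 or +1."""
    w = [0.0] * len(xs[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(xs, ys):
            margin = y * (sum(wj * xj for wj, xj in zip(w, x)) + b)
            w = [(1.0 - eta * lam) * wj for wj in w]  # weight decay (L2 term)
            if margin < 1:  # hinge loss active: step toward the violated example
                w = [wj + eta * y * xj for wj, xj in zip(w, x)]
                b += eta * y
    return w, b

def predict(w, b, x):
    """Sign of the decision function gives the predicted class."""
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# "pain" (+1) vs "no pain" (-1) in an invented 2-D feature space
X = [[2.0, 2.2], [1.8, 2.0], [2.2, 1.9], [0.2, 0.4], [0.5, 0.1], [0.1, 0.2]]
Y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, Y)
print([predict(w, b, x) for x in X])
```

    In practice a library implementation (and a kernel, if the classes are not linearly separable in feature space) would replace this hand-rolled trainer; the sketch only shows the max-margin idea behind the classifier.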

  20. Schostakowitsch. Orchesterlieder (Vol. 2), Neeme Järvi / Werner Pfister

    Index Scriptorium Estoniae

    Pfister, Werner

    1996-01-01

    On the new recording "Schostakowitsch. Orchesterlieder (Vol. 2): Sechs Romanzen op. 21, Sechs Gedichte op. 143a, Suite auf Verse von Michelangelo Buonarroti op. 145a. Göteborger Sinfoniker, Neeme Järvi". DG CD 447 085-2 (WD: 71'06") DDD

  1. Shostakovich: The Orchestral Songs Vol. 2 / Michael Tanner

    Index Scriptorium Estoniae

    Tanner, Michael

    1996-01-01

    On the new recording "Shostakovich: The Orchestral Songs Vol. 2: Six Romances on texts by Japanese poets, Op. 21. Six Poems on Marina Tsvetayeva, Op. 143. Suite on Verses of Michelangelo, Op. 145. Gothenburg Symphony Orchestra, Neeme Järvi". DG 447 085-2GH (71 minutes:DDD)

  2. Technical and Economic Analysis of Automatic Transformer Changeover

    Institute of Scientific and Technical Information of China (English)

    田甜; 唐旭东

    2012-01-01

    Taking two transformers operating in parallel as an example, this paper analyzes the technical and economic aspects of automatic transformer changeover. The results show that automatically switching transformers in and out during substation operation, thereby changing the operating mode, can reduce transformer operating losses and improve economic efficiency, providing an effective means of optimizing grid operation.

  3. Analysis of results obtained using the automatic chemical control of the quality of the water heat carrier in the drum boiler of the Ivanovo CHP-3 power plant

    Science.gov (United States)

    Larin, A. B.; Kolegov, A. V.

    2012-10-01

    Results of industrial tests of the new method used for the automatic chemical control of the quality of boiler water of the drum-type power boiler (Pd = 13.8 MPa) are described. The possibility of using an H-cationite column for measuring the electric conductivity of an H-cationized sample of boiler water over a long period of time is shown.

  4. Design and Analysis of the Automatic Control System for a Wastewater Treatment Plant

    Institute of Scientific and Technical Information of China (English)

    何王金

    2015-01-01

    With the rapid development of China's market economy, environmental problems have become serious, and sewage treatment is one of the problems that urgently needs to be solved. At present, China has built many sewage treatment plants, and the automatic control system is the key technology of sewage treatment: it realizes automated control of the plant and collects, analyzes and manages process parameters and data, ensuring that effluent meets discharge standards, improving production efficiency and reducing production cost. This paper introduces the features and functions of the automatic control system of a sewage treatment plant, the overall design principles of such a system, and the design analysis of its individual parts, in the hope of making the design of automatic control systems for sewage treatment plants more reasonable.

  5. An image-based automatic mesh generation and numerical simulation for a population-based analysis of aerosol delivery in the human lungs

    Science.gov (United States)

    Miyawaki, Shinjiro; Tawhai, Merryn H.; Hoffman, Eric A.; Lin, Ching-Long

    2013-11-01

    The authors propose a method to automatically generate three-dimensional subject-specific airway geometries and meshes for computational fluid dynamics (CFD) studies of aerosol delivery in the human lungs. The proposed method automatically expands computed tomography (CT)-based airway skeleton to generate the centerline (CL)-based model, and then fits it to the CT-segmented geometry to generate the hybrid CL-CT-based model. To produce a turbulent laryngeal jet known to affect aerosol transport, we developed a physiologically-consistent laryngeal model that can be attached to the trachea of the above models. We used Gmsh to automatically generate the mesh for the above models. To assess the quality of the models, we compared the regional aerosol distributions in a human lung predicted by the hybrid model and the manually generated CT-based model. The aerosol distribution predicted by the hybrid model was consistent with the prediction by the CT-based model. We applied the hybrid model to 8 healthy and 16 severe asthmatic subjects, and average geometric error was 3.8% of the branch radius. The proposed method can be potentially applied to the branch-by-branch analyses of a large population of healthy and diseased lungs. NIH Grants R01-HL-094315 and S10-RR-022421, CT data provided by SARP, and computer time provided by XSEDE.

  6. Analysis of Factors Influencing the Cleaning Effect of Automatic Cleaning Machines

    Institute of Scientific and Technical Information of China (English)

    徐金荣; 洪范宗; 苏秋玲

    2013-01-01

    Objective: To explore the factors influencing the cleaning effect of automatic cleaning machines. Methods: When instruments were cleaned by an automatic cleaning machine, an STF cleaning-effect detection card was placed in the cleaning clamp to check the cleaning effect. Results: Cleaning parameters such as the cleaning agent, cleaning temperature and cleaning time affect the final cleaning effect; mechanical failures such as spray-arm, enzyme-pump, cleaning-pump and instrument-loading failures also affect the final cleaning effect. Conclusion: The main factors influencing the cleaning effect of automatic cleaning machines are the cleaning parameters and mechanical failures.

  7. Efficacy and Complications of Ultrasound-Guided Percutaneous Renal Biopsy Using Automatic Biopsy Gun in Pediatric Diffuse Renal Disease: Analysis of 97 Cases

    Energy Technology Data Exchange (ETDEWEB)

    Han, Seung Min; Chung, Tae Woong; Yoon, Woong [Chonnam National University Hospital, Gwangju (Korea, Republic of)

    2007-09-15

    To evaluate the diagnostic efficacy and complications of ultrasound-guided percutaneous renal biopsy using an automatic biopsy gun in patients with pediatric diffuse renal disease. Using an 18G automatic biopsy gun, biopsies were performed on 97 pediatric patients with clinically suspected diffuse renal disease. The acquired tissue specimens were analyzed by photomicroscopy, immunofluorescence, and electron microscopy to support the diagnosis. For the 97 biopsies, the success of the histologic diagnosis, the number of glomeruli, and the complication rates were retrospectively evaluated by analyzing the various exams and clinical records. Adequate tissue for histologic diagnosis was obtained in 91 of 97 biopsies (94%) and the mean number of glomeruli was 9.6. Complications such as minor pain, gross hematuria, and small perirenal hematoma presented in 22 of the 97 biopsies (23%), all of which either improved within 5-72 hours or did not need specific treatment. Ultrasound-guided percutaneous renal biopsy using an 18G automatic biopsy gun is an effective and safe method for the histologic diagnosis of pediatric diffuse renal disease, without any major complication.

  8. Using airborne LiDAR in geoarchaeological contexts: Assessment of an automatic tool for the detection and the morphometric analysis of grazing archaeological structures (French Massif Central).

    Science.gov (United States)

    Roussel, Erwan; Toumazet, Jean-Pierre; Florez, Marta; Vautier, Franck; Dousteyssier, Bertrand

    2014-05-01

    Airborne laser scanning (ALS) of archaeological regions of interest is nowadays a widely used and established method for accurate topographic and microtopographic survey. The penetration of the vegetation cover by the laser beam allows the reconstruction of reliable digital terrain models (DTM) of forested areas where traditional prospection methods are inefficient, time-consuming and non-exhaustive. The ALS technology provides the opportunity to discover new archaeological features hidden by vegetation and provides a comprehensive survey of cultural heritage sites within their environmental context. However, the post-processing of LiDAR point clouds produces a huge quantity of data in which relevant archaeological features are not easily detectable with common visualizing and analysing tools. Undoubtedly, there is an urgent need for automation of structure detection and morphometric extraction techniques, especially for the "archaeological desert" in densely forested areas. This presentation deals with the development of automatic detection procedures applied to archaeological structures located in the French Massif Central, in the western forested part of the Puy-de-Dôme volcano between 950 and 1100 m a.s.l. These unknown archaeological sites were discovered by the March 2011 ALS mission and display a high density of subcircular depressions with a corridor access. The spatial organization of these depressions varies from isolated to aggregated or aligned features. Functionally, they appear to be former grazing constructions built from the medieval to the modern period. Similar grazing structures are known in other locations of the French Massif Central (Sancy, Artense, Cézallier) where the ground is vegetation-free. In order to develop a reliable process of automatic detection and mapping of these archaeological structures, a learning zone has been delineated within the ALS surveyed area. The grazing features were mapped and typical morphometric attributes

  9. Automatic Configuration in NTP

    Institute of Scientific and Technical Information of China (English)

    Jiang Zongli(蒋宗礼); Xu Binbin

    2003-01-01

    NTP is nowadays the most widely used distributed network time protocol; it aims at synchronizing the clocks of computers in a network and preserving the accuracy and validity of the time information transmitted in the network. Without an automatic configuration mechanism, the stability and flexibility of a synchronization network built on the NTP protocol are unsatisfactory. A P2P resource-discovery mechanism is used to look for time sources in a synchronization network, and the synchronization network is constructed dynamically according to the network environment and the quality of its nodes.

  10. Automatic personnel contamination monitor

    International Nuclear Information System (INIS)

    United Nuclear Industries, Inc. (UNI) has developed an automatic personnel contamination monitor (APCM), which uniquely combines the design features of both portal and hand and shoe monitors. In addition, this prototype system also has a number of new features, including: microcomputer control and readout, nineteen large-area gas flow detectors, real-time background compensation, self-checking for system failures, and card reader identification and control. UNI's experience in operating the Hanford N Reactor, located in Richland, Washington, has shown the necessity of automatically monitoring plant personnel for contamination after they have passed through the procedurally controlled radiation zones. This final check ensures that each radiation zone worker has been properly checked before leaving company controlled boundaries. Investigation of the commercially available portal and hand and shoe monitors indicated that they did not have the sensitivity or sophistication required for UNI's application; therefore, a development program was initiated, resulting in the subject monitor. Field testing shows good sensitivity to personnel contamination, with the majority of alarms showing contaminants on clothing, face and head areas. In general, the APCM has sensitivity comparable to portal survey instrumentation. The inherent stand-in, walk-on feature of the APCM not only makes it easy to use, but makes it difficult to bypass. (author)

  11. Quality and completeness of risk analyses. Vol. 1

    International Nuclear Information System (INIS)

    The program described was started in 1974 at Risoe National Laboratory. The motivation was criticism then being directed at the Reactor Safety Study WASH 1400, and the view that if risk analysis were to have a future as a scientific study, its procedures would need to be verified. The material described is a record of a prolonged set of experiments and experiences, and this second edition of the report includes an update of the research to cover a study of 35 risk analyses reviewed and checked between 1988 and 1992. A survey is presented of the ways in which incompleteness, lacunae, and oversights arise in risk analysis. Areas where detailed knowledge of disturbance causes and consequences is needed include alarm priority setting, suppression of nuisance alarms and status annunciation signals, advanced shutdown system design and runback systems, alarm and disturbance analysis, automatic plant supervision, disturbance diagnosis and testing support, and monitoring of safe operating margins. Despite improvements in risk analysis technique, in safety management, and in safety design, there will always be problems of ignorance: of material properties, of chemical reactions, of behavioural patterns, and of safety decision making, which leave plants vulnerable and hazardous. (AB) (24 refs.)

  12. Comment on "Why reduced-form regression models of health effects versus exposures should not replace QRA: livestock production and infant mortality as an example," by Louis Anthony (Tony) Cox, Jr., Risk Analysis 2009, Vol. 29, No. 12.

    Science.gov (United States)

    Sneeringer, Stacy

    2010-04-01

    While a recent paper by Cox in this journal uses as its motivating factor the benefits of quantitative risk assessment, its content is entirely devoted to critiquing Sneeringer's article in the American Journal of Agricultural Economics. Cox's two main critiques of Sneeringer are fundamentally flawed and misrepresent the original article. Cox posits that Sneeringer did A and B, and then argues why A and B are incorrect. However, Sneeringer in fact did C and D; thus critiques of A and B are not applicable to Sneeringer's analysis. PMID:20345577

  13. Image feature meaning for automatic key-frame extraction

    Science.gov (United States)

    Di Lecce, Vincenzo; Guerriero, Andrea

    2003-12-01

    Video abstraction and summarization, required in several applications, have directed a number of researchers to automatic video-analysis techniques. Automatic video analysis is based on the recognition of shots, short sequences of contiguous frames that describe the same scene, and of key frames representing the salient content of each shot. Since effective shot-boundary detection techniques exist in the literature, in this paper we focus our attention on key-frame extraction techniques that identify the low-level visual features of the frames that best represent the shot content. To evaluate the features' performance, key frames automatically extracted using these features are compared to human-operator video annotations.

  14. Keystone feasibility study. Final report. Vol. 4

    Energy Technology Data Exchange (ETDEWEB)

    1982-12-01

    Volume four of the Keystone coal-to-methanol project includes the following: (1) project management; (2) economic and financial analyses; (3) market analysis; (4) process licensing and agreements; and (5) appendices. 24 figures, 27 tables.

  15. JAPS: an automatic parallelizing system based on JAVA

    Institute of Scientific and Technical Information of China (English)

    杜建成; 陈道蓄; 谢立

    1999-01-01

    JAPS is an automatic parallelizing system based on JAVA running on NOW. It implements the automatic process from dependence analysis to parallel execution. The current version of JAPS can exploit functional parallelism and the detection of data parallelism will be incorporated in the new version, which is underway. The framework and key techniques of JAPS are presented. Specific topics discussed are task partitioning, summary information collection, data dependence analysis, pre-scheduling and dynamic scheduling, etc.

  16. Analysis and Design of Automatic Fire Alarm Systems

    Institute of Scientific and Technical Information of China (English)

    卢小军; 罗成刚

    2014-01-01

    In current construction engineering, how to detect a fire in time has become a problem attracting people's attention, and the automatic fire alarm system emerged to meet this need. It has made a significant contribution to discovering fires early and reducing their damage. On this basis, this article analyzes the automatic fire alarm system in order to provide a reference.

  17. Automatic identification of agricultural terraces through object-oriented analysis of very high resolution DSMs and multispectral imagery obtained from an unmanned aerial vehicle.

    Science.gov (United States)

    Diaz-Varela, R A; Zarco-Tejada, P J; Angileri, V; Loudjani, P

    2014-02-15

    Agricultural terraces are features that provide a number of ecosystem services. As a result, their maintenance is supported by measures established by the European Common Agricultural Policy (CAP). In the framework of CAP implementation and monitoring, there is a current and future need for the development of robust, repeatable and cost-effective methodologies for the automatic identification and monitoring of these features at farm scale. This is a complex task, particularly when terraces are associated to complex vegetation cover patterns, as happens with permanent crops (e.g. olive trees). In this study we present a novel methodology for automatic and cost-efficient identification of terraces using only imagery from commercial off-the-shelf (COTS) cameras on board unmanned aerial vehicles (UAVs). Using state-of-the-art computer vision techniques, we generated orthoimagery and digital surface models (DSMs) at 11 cm spatial resolution with low user intervention. In a second stage, these data were used to identify terraces using a multi-scale object-oriented classification method. Results show the potential of this method even in highly complex agricultural areas, both regarding DSM reconstruction and image classification. The UAV-derived DSM had a root mean square error (RMSE) lower than 0.5 m when the height of the terraces was assessed against field GPS data. The subsequent automated terrace classification yielded an overall accuracy of 90% based exclusively on spectral and elevation data derived from the UAV imagery.

  19. A Fully Automatic On-line Analysis System Applied to the Catalytic Cracking of Gasoline

    Institute of Scientific and Technical Information of China (English)

    薛青松; 王一萌

    2011-01-01

    Using an idle high-pressure micro-reactor test unit and three gas chromatographs, a novel system for fully automatic on-line analysis of gasoline catalytic-cracking products was developed by re-equipping the unit's single automatic pneumatic controller so that it drives three six-port valves simultaneously. The modification links the micro-reactor, the three chromatographs and three sets of chromatography software, making the reaction-analysis system synchronous, multitasking and highly automated, and it substantially reduces experimental cost. Furthermore, the standby mode of the six-port valves was revised, greatly reducing contamination of the lines.

  20. Composite materials: Fatigue and fracture. Vol. 3

    Science.gov (United States)

    O'Brien, T. K. (Editor)

    1991-01-01

    The present volume discusses topics in the fields of matrix cracking and delamination, interlaminar fracture toughness, delamination analysis, strength and impact characteristics, and fatigue and fracture behavior. Attention is given to cooling rate effects in carbon-reinforced PEEK, the effect of porosity on flange-web corner strength, mode II delamination in toughened composites, the combined effect of matrix cracking and free edge delamination, and a 3D stress analysis of plain weave composites. Also discussed are the compression behavior of composites, damage-based notched-strength modeling, fatigue failure processes in aligned carbon-epoxy laminates, and the thermomechanical fatigue of a quasi-isotropic metal-matrix composite.

  1. Análise da madeira de Pinus oocarpa parte I: estudo dos constituintes macromoleculares e extrativos voláteis Chemical analysis of Pinus oocarpa wood part I: quantification of macromolecular components and volatile extractives

    Directory of Open Access Journals (Sweden)

    Sérgio Antônio Lemos de Morais

    2005-06-01

    Full Text Available The main chemical components of Pinus oocarpa wood cultivated in the Brazilian cerrado were determined: α-cellulose (59.05%), hemicelluloses A and B (21.22%), lignin (25.18%), dichloromethane extractives (2.78%), ethanol:toluene extractives (4.38%), hot-water extractives (4.31%) and ash (1.26%). The cellulose content was relatively high, which opens perspectives for using Pinus oocarpa wood in the pulp and paper industry. The composition of the extractives was also investigated. The main constituents of the dichloromethane extract were diterpenic acids, together with palmitic and oleic acids. In the essential oil, extracted in a Clevenger apparatus and analyzed by GC-MS, the main components identified were aromadendrene, ledane, hexadecanal and oleic acid.

  2. Empirical Research in Theatre, Vol 3.

    Science.gov (United States)

    Addington, David W., Ed.; Kepke, Allen N., Ed.

    This journal provides a focal point for the collection and distribution of systematically processed information about theory and practice in theatre. Part of an irregularly published series, this issue contains investigations of the application of transactional analysis to the theatre, the psychological effect of counterattitudinal acting in…

  3. Information Management and Market Engineering. Vol. II

    OpenAIRE

    Dreier, Thomas; Krämer, Jan; Studer, Rudi; Weinhardt, Christof; [Hrsg.

    2010-01-01

    The research program Information Management and Market Engineering focuses on the analysis and the design of electronic markets. Taking a holistic view of the conceptualization and realization of solutions, the research integrates the disciplines business administration, economics, computer science, and law. Topics of interest range from the implementation, quality assurance, and advancement of electronic markets to their integration into business processes and legal frameworks.

  4. Automatic Classification of Attacks on IP Telephony

    Directory of Open Access Journals (Sweden)

    Jakub Safarik

    2013-01-01

    Full Text Available This article proposes an algorithm for automatic analysis of attack data in an IP telephony network using a neural network. Data for the analysis are gathered from various monitoring applications running in the network. Such monitoring systems are a typical part of today's networks, but the information they collect is usually examined only after an attack. Automatic classification of IP telephony attacks makes near real-time classification possible, enabling counter-attack or mitigation of potential attacks. The classification uses the proposed neural network, and the article covers the design of the neural network and its practical implementation. It also describes methods for neural network training and data-gathering functions from a honeypot application.
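The pipeline described above — feature vectors derived from network monitoring fed to a neural network classifier — can be sketched with a minimal single-layer network (a logistic unit) trained by gradient descent. The feature names, data and topology below are illustrative assumptions, not the paper's actual honeypot data or network design:

```python
import numpy as np

def train(X, y, lr=0.1, epochs=2000):
    """Single-layer neural network (logistic unit) trained by gradient descent."""
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
        w -= lr * X.T @ (p - y) / len(y)         # cross-entropy gradient step
        b -= lr * float(np.mean(p - y))
    return w, b

def predict(w, b, X):
    """Label 1 when the unit's pre-activation is positive."""
    return (X @ w + b > 0).astype(int)

# Hypothetical per-call features: [requests per second, failed-auth ratio]
X = np.array([[0.1, 0.0], [0.2, 0.1], [5.0, 0.9], [4.0, 0.8]])
y = np.array([0, 0, 1, 1])   # 0 = benign traffic, 1 = attack
w, b = train(X, y)
pred = predict(w, b, X)
```

A real deployment would use more features from the monitoring systems and a multi-layer network, but the training loop has the same shape.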

  5. Exercises in dental radiology. Vol. 3

    International Nuclear Information System (INIS)

    The book is addressed to paediatric dentists and other dentists who have children among their patients; it presents a survey of normal and pathological development of teeth and surrounding tissues. Imaging errors, eruption problems, anomalies, the radiological picture of primary and secondary crowding during eruption, analysis of the deciduous teeth, teleradiography, traumas and temporomandibular diseases are discussed. Each chapter contains questions concerning the interpretation of the radiological findings. (orig./MG)

  6. Information Management and Market Engineering. [Vol. I

    OpenAIRE

    Dreier, Thomas; Studer, Rudi; Weinhardt, Christof

    2006-01-01

    The research program "Information Management and Market Engineering" focuses on the analysis and the design of electronic markets. Taking a holistic view of the conceptualization and realization of solutions, the research integrates the disciplines business administration, economics, computer science, and law. Topics of interest range from the implementation, quality assurance, and further development of electronic markets to their integration into business processes, innovative...

  7. Automatic analysis of change detection of multi-temporal ERS-2 SAR images by using two-threshold EM and MRF algorithms

    Institute of Scientific and Technical Information of China (English)

    CHEN Fei; LUO Lin; JIN Yaqiu

    2004-01-01

    To automatically detect and analyze surface change in urban areas from multi-temporal SAR images, an algorithm combining two-threshold expectation maximization (EM) and Markov random fields (MRF) is developed. The difference of the SAR images demonstrates the variation of backscattering caused by surface change over all image pixels. Two thresholds are obtained by the EM iterative process, dividing the pixels into three classes: enhanced scattering, reduced scattering and unchanged regimes. Initialized from the EM result, the iterated conditional modes (ICM) algorithm of the MRF is then used to analyze the detection of contextual change in the urban area. As an example, two ERS-2 SAR images from 1996 and 2002 over Shanghai are studied.
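The two-threshold EM step described above amounts to fitting a mixture model to the difference image and splitting pixels into reduced/unchanged/enhanced classes. A minimal 1-D EM sketch on simulated difference values (an illustration only: the MRF/ICM refinement stage is omitted and the data are synthetic, not ERS-2 imagery):

```python
import numpy as np

def em_gmm_1d(x, k=3, iters=100):
    """Fit a k-component 1-D Gaussian mixture with EM; returns (means, stds, weights)."""
    mu = np.quantile(x, np.linspace(0.1, 0.9, k))   # spread the initial means
    sigma = np.full(k, x.std())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each pixel value
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
        r = dens * pi
        r /= r.sum(axis=1, keepdims=True)
        # M-step: re-estimate parameters from the responsibilities
        n = r.sum(axis=0)
        mu = (r * x[:, None]).sum(axis=0) / n
        sigma = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / n) + 1e-9
        pi = n / len(x)
    return mu, sigma, pi

# Simulated backscatter differences: reduced / unchanged / enhanced scattering
rng = np.random.default_rng(1)
diff = np.concatenate([rng.normal(-5, 1, 300),
                       rng.normal(0, 1, 1000),
                       rng.normal(5, 1, 300)])
mu, sigma, pi = em_gmm_1d(diff)
# Assign each pixel to its most responsible component (the two class
# boundaries that fall out of this are the "two thresholds")
dens = np.exp(-0.5 * ((diff[:, None] - mu) / sigma) ** 2) / sigma * pi
labels = np.argmax(dens, axis=1)
```

The per-pixel labels would then seed the ICM iterations, which add spatial context.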

  8. Unsupervised Threshold for Automatic Extraction of Dolphin Dorsal Fin Outlines from Digital Photographs in DARWIN (Digital Analysis and Recognition of Whale Images on a Network)

    CERN Document Server

    Hale, Scott A

    2012-01-01

    At least two software packages---DARWIN, Eckerd College, and FinScan, Texas A&M---exist to facilitate the identification of cetaceans---whales, dolphins, porpoises---based upon the naturally occurring features along the edges of their dorsal fins. Such identification is useful for biological studies of population, social interaction, migration, etc. The process whereby fin outlines are extracted in current fin-recognition software packages is manually intensive and represents a major user input bottleneck: it is both time consuming and visually fatiguing. This research aims to develop automated methods (employing unsupervised thresholding and morphological processing techniques) to extract cetacean dorsal fin outlines from digital photographs thereby reducing manual user input. Ideally, automatic outline generation will improve the overall user experience and improve the ability of the software to correctly identify cetaceans. Various transformations from color to gray space were examined to determine whi...
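A standard building block for the unsupervised thresholding this abstract mentions is Otsu's method, which picks the gray level that maximizes between-class variance. A generic sketch on a synthetic bimodal "image" — this illustrates the technique class only, not DARWIN's actual pipeline, and the morphological post-processing is omitted:

```python
import numpy as np

def otsu_threshold(gray, nbins=256):
    """Otsu's method: choose the threshold that maximizes between-class variance."""
    hist, _ = np.histogram(gray, bins=nbins, range=(0, 256))
    total = gray.size
    sum_all = float((hist * np.arange(nbins)).sum())
    w0, sum0 = 0, 0.0
    best_t, best_var = 0, -1.0
    for t in range(nbins):
        w0 += hist[t]                      # pixels at or below t (background)
        if w0 == 0 or w0 == total:
            continue
        sum0 += t * hist[t]
        w1 = total - w0                    # remaining (foreground) pixels
        m0, m1 = sum0 / w0, (sum_all - sum0) / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

# Synthetic bimodal grayscale values: dark water background, brighter fin
rng = np.random.default_rng(0)
img = np.clip(np.concatenate([rng.normal(60, 10, 5000),
                              rng.normal(180, 10, 2000)]), 0, 255)
t = otsu_threshold(img)
mask = img > t     # candidate fin pixels for subsequent morphological cleanup
```

Because the threshold is computed from the image's own histogram, no per-photograph tuning by the user is needed.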

  9. Soil-structure interaction Vol.2. Influence of lift-off

    International Nuclear Information System (INIS)

    This study has been performed for the Nuclear Regulatory Commission (NRC) by the Structural Analysis Division of Brookhaven National Laboratory (BNL). The study was conducted during the fiscal year 1985 on the program entitled 'Benchmarking of Structural Engineering Problems' sponsored by NRC. The program considered three separate but complementary problems, each associated with the soil-structure interaction (SSI) phase of the seismic response analysis of nuclear plant facilities. The reports are presented in three separate volumes. The general title for the reports is 'Soil Structure Interaction' with the following subtitles: Vol. 1 Influence of Layering by A.J. Philippacopoulos, Vol. 2 Influence of Lift-Off by C.A. Miller, Vol. 3 Influence of Ground Water by C.J. Costantino. The two problems presented in Volumes 2 and 3 were conducted at the City University of New York (CUNY) under subcontract to BNL. This report, Volume 2 of the report, presents a summary of the work performed defining the influence liftoff has on the seismic response of nuclear power plant structures. The standard lumped parameter analysis method was modified by representing the lumped soil/structure interaction horizontal and rocking dampers with distributed (over the foundation area) springs and dampers. The distributed springs and dampers are then modified so that they can only transmit compressive stresses. Additional interaction damping is included to account for the energy dissipated as a portion of the foundation which has separated comes back into contact with the soil. The validity of the model is evaluated by comparing predictions made with it to data measured during the SIMQUAKE II experiment. The predictions were found to correlate quite well with the measured data except for some discrepancies at the higher frequencies (greater than 10 cps). This discrepancy was attributed to the relatively crude model used for impact effects. Data is presented which identifies the peak

  10. IJIMAI Editor's Note - Vol. 3 Issue 5

    Directory of Open Access Journals (Sweden)

    Rubén Gonzalez-Crespo

    2015-12-01

    Full Text Available The research works presented in this issue are based on various topics of interest, among which are included: DSL, Machine Learning, Information hiding, Steganography, SMA, RTECTL, SMT-based bounded model checking, STS, Spatial sound, X3D, X3DOM, Web Audio API, Web3D, Real-time, Realistic 3D, 3D Audio, Apache Wave, API, Collaborative, Pedestrian Inertial, Navigation System, Indoor Location, Learning Algorithms, Information Fusion, Agile development, Scrum, Cross Functional Teams, Knowledge Transfer, Technological Innovation, Technology Transfer, Social Networks Analysis, Project Management, Links in Social Networks, Rights of Knowledge Sharing and Web 2.0.

  11. Exercises in dental radiology. Vol. 1

    Energy Technology Data Exchange (ETDEWEB)

    Langlais, R.P.; Kasle, M.J.

    1980-01-01

    With concise questions for diagnosis and differential diagnosis the authors present 298 radiographs, which are complemented by notes taken from the anamnesis and other findings. Five fields are discussed: X-ray anatomy, film faults, identification of materials and foreign bodies, anomalies and pathologic alterations, and localisation tasks. Without any doubt the voluminous collection of pathologic findings is the most important part, but the other chapters also offer numerous valuable hints. They all lead to a precise and systematic analysis and careful interpretation of radiographs. In many cases the answers in the appendix give, through additional explanations, information on technical details, radiopathologic specialities and the mode of diagnosis.

  12. 基于主成分分析和马氏距离的测井曲线自动分层方法%Automatic stratification of well logging curves based on principal component analysis and Mahalanobis distance

    Institute of Scientific and Technical Information of China (English)

    涂必超; 杨枫林

    2012-01-01

    Principal component analysis and discriminant analysis are combined to automatically stratify well-logging curves. First, the data of all wells are preprocessed and the curves common to them are extracted by comparing character strings in a program. Principal component analysis of the processed data is then carried out in R, and the principal components replace the original data so as to reduce its dimension. Finally, distance discriminant analysis is performed using the stratification of a standard well as the training sample, and the remaining wells are stratified automatically. The result is compared with manual stratification, and the final processed stratification is obtained.
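The workflow above — PCA to reduce the logging curves, then Mahalanobis-distance discriminant analysis against a standard well — can be sketched as follows. The paper works in R; this is an equivalent NumPy sketch, and the "sand"/"shale" layers and four-curve data are hypothetical stand-ins for real logs:

```python
import numpy as np

def pca_fit(X, n_components=2):
    """PCA via SVD of the centered data; returns (scores, components, mean)."""
    mean = X.mean(axis=0)
    Xc = X - mean
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    comps = Vt[:n_components]
    return Xc @ comps.T, comps, mean

def mahalanobis_label(z, class_scores):
    """Assign z to the class whose training scores give the smallest Mahalanobis distance."""
    best, best_d = None, np.inf
    for label, pts in class_scores.items():
        mu = pts.mean(axis=0)
        cov_inv = np.linalg.inv(np.cov(pts, rowvar=False))
        d = float((z - mu) @ cov_inv @ (z - mu))
        if d < best_d:
            best, best_d = label, d
    return best

# Hypothetical 4-curve log data for two layer types in the standard well
rng = np.random.default_rng(0)
sand = rng.normal([1.0, 1.0, 0.2, 0.1], 0.2, size=(60, 4))
shale = rng.normal([-1.0, -1.0, -0.2, -0.1], 0.2, size=(60, 4))
X = np.vstack([sand, shale])
scores, comps, mean = pca_fit(X, 2)          # reduce 4 curves to 2 components
classes = {"sand": scores[:60], "shale": scores[60:]}

probe = rng.normal([1.0, 1.0, 0.2, 0.1], 0.2)  # a depth sample from a new well
z = (probe - mean) @ comps.T                   # project with the fitted PCA
label = mahalanobis_label(z, classes)
```

New wells are projected with the components and mean fitted on the standard well, so all distances live in the same reduced space.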

  13. 车辆段/停车场增设全自动运行功能的分析%Analysis of Automatic Train Operation Function of Depot and Parking Yard

    Institute of Scientific and Technical Information of China (English)

    黄志红

    2015-01-01

    Based on the current interlocking control scheme for depots and parking yards, the paper analyzes the operational requirements of depots and parking yards. By configuring an automatic train operation area in the depot/parking yard, trains in this area acquire CBTC-grade ATP/ATO functions as well as ATS supervision, and can run into and out of the depot/parking yard automatically under ATO.

  14. Finite Element Analysis on Automatic Transmission Box of Electric Vehicle based on Solidworks Simulation%基于Solidworks Simulation纯电动汽车变速器箱体有限元分析

    Institute of Scientific and Technical Information of China (English)

    王艳

    2012-01-01

    In earlier development work, the transmission ratio of an electric vehicle had been designed on the basis of energy utilization, yielding the transmission scheme and the dimensions of the automatic transmission box. In this paper, a Solidworks structural model of the transmission box was built. Using Solidworks Simulation, finite element analyses of the front and back covers of the box were carried out, giving the maximum stress and the location where it occurs. The results show that the designed automatic transmission not only achieves optimal energy utilization but also meets static strength requirements.

  15. Automatic Kurdish Dialects Identification

    Directory of Open Access Journals (Sweden)

    Hossein Hassani

    2016-02-01

    Full Text Available Automatic dialect identification is a necessary language technology for processing multi-dialect languages in which the dialects are linguistically far from each other. This becomes particularly crucial where the dialects are mutually unintelligible: to perform computational activities on these languages, the system needs to identify the dialect that is the subject of the process. The Kurdish language encompasses various dialects, is written in several different scripts, and lacks a standard orthography. This situation makes Kurdish dialect identification both more interesting and more necessary, from the research as well as the application perspective. In this research, we have applied a classification method based on supervised machine learning to identify the dialect of Kurdish texts. The research has focused on the two most widely spoken and dominant Kurdish dialects, namely Kurmanji and Sorani. The approach could be applied to the other Kurdish dialects as well, and the method is also applicable to languages which are similar to Kurdish in their dialectal diversity and differences.
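One common supervised baseline for this kind of dialect identification is character n-gram profile matching: build an n-gram profile per dialect from labeled text, then assign a sample to the closest profile. A stdlib-only sketch — the toy strings below are placeholders, not real Kurmanji or Sorani text, and the paper's actual classifier and corpus are not reproduced here:

```python
from collections import Counter
import math

def ngrams(text, n=3):
    """Character n-gram counts, padded so word boundaries count too."""
    padded = f"  {text.lower()}  "
    return Counter(padded[i:i + n] for i in range(len(padded) - n + 1))

def cosine(a, b):
    """Cosine similarity between two n-gram count vectors."""
    dot = sum(a[g] * b[g] for g in a.keys() & b.keys())
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def identify(sample, profiles):
    """Pick the dialect whose training profile is closest to the sample."""
    grams = ngrams(sample)
    return max(profiles, key=lambda d: cosine(grams, profiles[d]))

# Toy training corpora -- placeholders, NOT real Kurmanji/Sorani text
profiles = {
    "dialect_A": ngrams("hello world hello there hello again"),
    "dialect_B": ngrams("bonjour monde bonjour encore bonjour"),
}
label = identify("hello friend", profiles)
```

Character n-grams have the advantage of working across the different scripts and spelling conventions the abstract mentions, since no word-level lexicon is required.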

  16. IJIMAI Editor's Note - Vol. 2 Issue 7

    Directory of Open Access Journals (Sweden)

    Luis de-la-Fuente-Valentín

    2014-09-01

    Full Text Available This special issue, the Special Issue on Multisensor User Tracking and Analytics to Improve Education and Other Application Fields, concentrates on the practical and experimental use of data mining and analytics techniques, especially in the educational area. The selected papers deal with the most relevant issues in the field, such as the integration of data from different sources, the identification of data suitable for the problem analysis, and the validation of the analytics techniques as support in the decision-making process. The application fields of the analytics techniques presented have a clear focus on the educational area (where Learning Analytics has emerged as a buzzword in recent years) but are not restricted to it. The result is a collection of use cases, experimental validations and analytics systems with a clear contribution to the state of the art.

  17. An automatic evaluation system for NTA film neutron dosimeters

    CERN Document Server

    Müller, R

    1999-01-01

    At CERN, neutron personal monitoring for over 4000 collaborators is performed with Kodak NTA films, which have been shown to be the most suitable neutron dosimeter in the radiation environment around high-energy accelerators. To overcome the lengthy and strenuous manual scanning process with an optical microscope, an automatic analysis system has been developed. We report on the successful automatic scanning of NTA films irradiated with 238Pu-Be source neutrons, which results in densely ionised recoil tracks, as well as on the extension of the method to higher-energy neutrons causing sparse and fragmentary tracks. The application of the method in routine personal monitoring is discussed. (10 refs).

  18. Proceedings of the second international conference on environmental impact assessment of all economical activities. Vol. 3

    International Nuclear Information System (INIS)

    Proceedings of the conference consist of 3 volumes: Vol. 1 - 'Environmental Impact Assessment of all Economical Activities including Industry'; Vol. 2 - 'Air Pollution Control and Prevention'; Vol. 3 - Waste Management and Environmental Problems in Construction Industry'. Out of 39 papers contained in Vol. 3, 3 were inputted to INIS. They deal with the use of portable radioisotope X-ray fluorescence analyzers in the determination of building material contamination by toxic elements, with underground waste repositories and ground water contamination, and the impact of the Temelin nuclear power plant on the hydrosphere and other environmental components. (Z.S.)

  19. Electronic amplifiers for automatic compensators

    CERN Document Server

    Polonnikov, D Ye

    1965-01-01

    Electronic Amplifiers for Automatic Compensators presents the design and operation of electronic amplifiers for use in automatic control and measuring systems. This book is composed of eight chapters that consider the problems of constructing input and output circuits of amplifiers, suppression of interference and ensuring high sensitivity.This work begins with a survey of the operating principles of electronic amplifiers in automatic compensator systems. The succeeding chapters deal with circuit selection and the calculation and determination of the principal characteristics of amplifiers, as

  20. The Automatic Telescope Network (ATN)

    CERN Document Server

    Mattox, J R

    1999-01-01

    Because of NASA's scheduled GLAST mission, there is strong scientific justification to prepare for very extensive blazar monitoring in the optical bands, to exploit the opportunity to learn about blazars through the correlation of the variability of the gamma-ray flux with the flux at lower frequencies. Current optical facilities do not provide the required capability. Developments in technology have enabled astronomers to readily deploy automatic telescopes. The effort to create an Automatic Telescope Network (ATN) for blazar monitoring in the GLAST era is described. Other scientific applications of networks of automatic telescopes are discussed, as is the potential of the ATN for science education.

  1. Automatic detection of moving objects in video surveillance

    OpenAIRE

    Guezouli, Larbi; Belhani, Hanane

    2016-01-01

    This work is in the field of video surveillance, including motion detection. Video surveillance is one of the essential techniques of automatic video analysis for extracting crucial information or relevant scenes in video surveillance systems. The aim of our work is to propose solutions for the automatic detection of moving objects in real time with a surveillance camera. The detected objects are objects that have some geometric shape (circle, ellipse, square, or rectangle).

  2. SEMANTIC INTEGRATION FOR AUTOMATIC ONTOLOGY MAPPING

    Directory of Open Access Journals (Sweden)

    Siham AMROUCH

    2013-11-01

    Full Text Available In the last decade, ontologies have played a key technological role in information sharing and agent interoperability in different application domains. In the semantic web domain, ontologies are used to face the great challenge of representing the semantics of data, in order to bring the actual web to its full power and hence achieve its objective. However, using ontologies as common and shared vocabularies requires a certain degree of interoperability between them. To meet this requirement, mapping ontologies is a solution that cannot be avoided. Indeed, ontology mapping builds a meta layer that allows different applications and information systems to access and share their information, after resolving the different forms of syntactic, semantic and lexical mismatch. In the contribution presented in this paper, we have integrated the semantic aspect, based on an external lexical resource, WordNet, to design a new algorithm for fully automatic ontology mapping. This fully automatic character is the main difference between our contribution and most existing semi-automatic ontology mapping algorithms, such as Chimaera, Prompt, Onion and Glue. To further enhance the performance of our algorithm, the mapping discovery stage is based on the combination of two sub-modules: the former analyzes the concepts' names and the latter analyzes their properties. Each of these two sub-modules is itself based on the combination of lexical and semantic similarity measures.
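The two-sub-module discovery stage described above (a name-analysis module plus a property-analysis module) can be sketched as a weighted combination of similarity scores. Since WordNet data is not bundled here, a stdlib string-similarity ratio stands in for the paper's WordNet-based semantic measure — an illustrative assumption, not the authors' implementation:

```python
from difflib import SequenceMatcher

def lexical_sim(a, b):
    """Name-string similarity (stdlib stand-in for a WordNet-based measure)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def property_sim(props_a, props_b):
    """Jaccard overlap of the two concepts' property names."""
    pa = {p.lower() for p in props_a}
    pb = {p.lower() for p in props_b}
    return len(pa & pb) / len(pa | pb) if pa | pb else 0.0

def concept_match(name_a, props_a, name_b, props_b, w=0.5):
    """Weighted combination of the name sub-module and the property sub-module."""
    return w * lexical_sim(name_a, name_b) + (1 - w) * property_sim(props_a, props_b)

# Two concepts from different ontologies (hypothetical example)
score = concept_match("Author", ["name", "email"], "Writer", ["name", "email", "bio"])
```

A mapping algorithm would compute this score for every concept pair and keep pairs above a threshold; swapping `lexical_sim` for a true WordNet synonym measure is what would let "Author" and "Writer" score highly on the name side as well.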

  3. 带电作业工器具自动管理系统应用分析%Application Analysis of Live Line Tool Automatic Management System

    Institute of Scientific and Technical Information of China (English)

    曹国文; 蒋标

    2015-01-01

    In view of the complicated procedures, long turnaround time and low work efficiency of issuing and returning tools in the live-line tool storehouse of the Bayannur Electric Power Bureau, a live-line tool automatic management system based on RFID was adopted. The system simplifies the check-out and check-in procedure: staff simply carry tools fitted with radio-frequency tags through doors equipped with radio-frequency readers, and the system records which staff member took which tools in or out, at what time and in what quantity, automatically uploading and storing this information on the storehouse computer. Staff can query check-out, return and inventory information on that computer, or from office computers over the private network, with no manual registration. This not only saves working time and improves work efficiency, but also ensures the safe use of the tools.

  4. Automatic modulation recognition of communication signals

    CERN Document Server

    Azzouz, Elsayed Elsayed

    1996-01-01

    Automatic modulation recognition is a rapidly evolving area of signal analysis. In recent years, interest from academic and military research institutes has focused on the research and development of modulation recognition algorithms. Any communication intelligence (COMINT) system comprises three main blocks: the receiver front-end, the modulation recogniser and the output stage. Considerable work has been done in the area of receiver front-ends. Work at the output stage is concerned with information extraction, recording and exploitation, and begins with signal demodulation, which requires accurate knowledge of the signal's modulation type. There are two main reasons for knowing the current modulation type of a signal: to preserve the signal's information content and to decide upon a suitable counter-action, such as jamming. Automatic Modulation Recognition of Communications Signals describes this modulation recognition process in depth. Drawing on several years of research, the authors provide a cr...

  5. Automatic Phonetic Transcription for Danish Speech Recognition

    DEFF Research Database (Denmark)

    Kirkedal, Andreas Søeborg

    to acquire and expensive to create. For languages with productive compounding or agglutinative languages like German and Finnish, respectively, phonetic dictionaries are also hard to maintain. For this reason, automatic phonetic transcription tools have been produced for many languages. The quality...... of automatic phonetic transcriptions varies greatly with respect to language and transcription strategy. For some languages where the difference between the graphemic and phonetic representations are small, graphemic transcriptions can be used to create ASR systems with acceptable performance. In other languages...... representations, e.g. morphological analysis, decompounding, letter-to-sound rules, etc. Two different phonetic transcribers for Danish will be compared in this study: eSpeak (Duddington, 2010) and Phonix (Henrichsen, 2014). Both transcribers produce a richer transcription than ASR can utilise, such as stress...

  6. A stupid plagiarism in the article « Ethnopharmacology and phytochemical screening of bioactive extracts of Limoniastrum feei ( plombagenaceae » Asian Journal of Natural & Applied Sciences, Vol. 2(1, 5-9, 2013.

    Directory of Open Access Journals (Sweden)

    A. CHERITI

    2013-11-01

    Full Text Available It is well known that plagiarism constitutes research misconduct: it is unethical, and it perturbs and damages the integrity and knowledge of the scientific community. It is also a form of copyright violation and is therefore illegal. Copying and pasting an entire article from another scientific journal is blatant plagiarism, an unjustifiable attitude and a stupid practice. The most striking case is the copy-and-paste of our article, published in 2012 in this journal (PhytoChem & BioSub Journal Vol. 6 N° 2, 83-87, 2012), by other "scientists" - L. Ziane, H. A. Lazouni, A. Moussaoui, N. Hamidi - published in the Asian Journal of Natural & Applied Sciences, Vol. 2(1), 5-9, 2013. Thus, any author or publisher who does not condemn such practices, or who promotes them, should be automatically disqualified from scientific research.

  7. Automatic programming of simulation models

    Science.gov (United States)

    Schroer, Bernard J.; Tseng, Fan T.; Zhang, Shou X.; Dwan, Wen S.

    1988-01-01

    The objective of automatic programming is to improve the overall environment for describing the program. This improved environment is realized by a reduction in the amount of detail that the programmer needs to know and is exposed to. Furthermore, this improved environment is achieved by a specification language that is more natural to the user's problem domain and to the user's way of thinking and looking at the problem. The goal of this research is to apply the concepts of automatic programming (AP) to modeling discrete event simulation system. Specific emphasis is on the design and development of simulation tools to assist the modeler define or construct a model of the system and to then automatically write the corresponding simulation code in the target simulation language, GPSS/PC. A related goal is to evaluate the feasibility of various languages for constructing automatic programming simulation tools.

  8. Clothes Dryer Automatic Termination Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    TeGrotenhuis, Ward E.

    2014-10-01

    Volume 2: Improved Sensor and Control Designs. Many residential clothes dryers on the market today provide automatic cycles that are intended to stop when the clothes are dry, as determined by the final remaining moisture content (RMC). However, testing of automatic termination cycles has shown that many dryers are susceptible to over-drying of loads, leading to excess energy consumption. In particular, tests performed using the DOE Test Procedure in Appendix D2 of 10 CFR 430 subpart B have shown that as much as 62% of the energy used in a cycle may be from over-drying. Volume 1 of this report shows an average of 20% excess energy from over-drying when running automatic cycles with various load compositions and dryer settings. Consequently, improving automatic termination sensors and algorithms has the potential for substantial energy savings in the U.S.

  9. Machining deformation analysis on the rear cover of the automatic transmission%自动变速器后盖壳体加工变形分析

    Institute of Scientific and Technical Information of China (English)

    汤善荣; 吴成龙

    2009-01-01

    This paper analyzes the machining deformation of the thin-walled aluminum-alloy rear cover of an automatic transmission during trial manufacture, and expounds the causes of deformation of thin-walled parts together with the corresponding solutions. Experiments show that a reasonable choice of cutting parameters and an optimized fixture design can effectively solve the deformation problem of thin-walled parts during machining.

  10. The trajectory analysis of GPS automatics homing system in a changing wind%GPS自动归航系统在变化风场中的轨迹

    Institute of Scientific and Technical Information of China (English)

    殷俊; 谷雷

    2001-01-01

    This paper analyzes the trajectory of a GPS automatic homing system in constant wind fields and discusses the conditions under which the gliding parachute system can reach a preassigned location. Several kinds of changing wind fields are presented, and the trajectories of the radial homing system in these fields are calculated. A simulation program is given to determine whether the radial homing system reaches the preassigned location or diverges. Based on an analysis of the simulation results, a head-wind control scheme is proposed that uses GPS positioning and measured velocity data during landing.

  11. Kinematic Analysis of Incomplete Gear Automatic Reverse Mechanism%不完全齿轮自动换向机构的运动分析

    Institute of Scientific and Technical Information of China (English)

    王猛; 李长春

    2012-01-01

    Exploiting the characteristics of incomplete gear mechanisms, a gear-type automatic reversing device is designed in which the input shaft rotates continuously in one direction while the output shaft rotates alternately forward and backward, with a fixed ratio between the input and output shaft speeds. The length of the line of action of the first and last teeth of the incomplete gear is analyzed; on the premise that the contact ratio equals one, addendum modification is used to avoid interference in the incomplete gear transmission.

  12. Automatic Number Plate Recognition System

    OpenAIRE

    Rajshree Dhruw; Dharmendra Roy

    2014-01-01

    Automatic Number Plate Recognition (ANPR) is a mass surveillance system that captures images of vehicles and recognizes their license numbers. The objective is to design an efficient automatic authorized-vehicle identification system based on Indian vehicle number plates. In this paper we discuss different methodologies for number plate localization, character segmentation and recognition of the number plate. The system is mainly applicable to non-standard Indian number plates by recognizing...

  13. Búsquedas en grandes volúmenes de datos

    OpenAIRE

    Britos, Luis; Di Gennaro, María E.; Gil Costa, Graciela Verónica; Kasián, Fernando; Lobos, Jair; Ludueña, Verónica; Molina, Romina; Printista, Alicia Marcela; Reyes, Nora Susana; Roggero, Patricia; Trabes, Guillermo

    2016-01-01

    Today's information systems must not only support efficient searches over different types of data, such as free text, audio, video, DNA sequences, etc., but must also be able to handle large volumes of these data. Given a query, the goal of an information retrieval system is to obtain what could be useful or relevant to the user, using a storage structure specially designed to answer the query efficiently. Our...

  14. From Editor vol 11, No.4

    Directory of Open Access Journals (Sweden)

    Ugur Demiray

    2010-10-01

    Full Text Available Greetings, dear readers of TOJDE. TOJDE now appears on your screen as Volume 11, Number 4. This issue contains 4 Notes for Editor, 12 articles, 2 book reviews and one conference review, with 27 authors from 10 different countries: Bangladesh, Greece, India, Israel, Malaysia, Nigeria, Portugal, Singapore, Turkey and the USA. The first Notes for Editor piece arrived from the USA, written by Kevin YEE and Jace HARGIS. They focus on PREZI: A Different Way to Present. Prezi represents a first step toward other visual tools that are not, strictly speaking, presentations at all, but may yet find uses in classrooms. Browser-based programs that allow concept mapping and brainstorming (sometimes with drawing and even inter-user chat functionality) approximate some of Prezi's best features yet stand on their own as organizing tools that provide inspiration for users. Examples include bubble.us, Thinkature, Mindmeister, and Graphic Organizer. The second Notes for Editor piece, titled "Investigating the Factor Structure of the Blog Attitude Scale", is written by Zahra SHAHSAVAR, Tan Bee HOON and S. Vahid ARYADOUST from Malaysia. Their study reports the design and development of a blog attitude scale (BAS). In exploratory factor analysis, three factors were discovered: blog anxiety, blog desirability, and blog self-efficacy; 14 items were excluded. The extracted items were subjected to a confirmatory factor analysis, which lent further support to the underpinning structure of the BAS. "Implementation of an Online Teacher Assessment/Appraisal in a Technical Education Institution: A Case Study" is the third paper in the Notes for Editor section of TOJDE in this issue. It is written by Sraboni MANDAL, Dr. SANJAY and Dajnish SHRIVASTAVA from the National Institute of Technology, Jamshedpur, INDIA. Its purpose is to discuss a case study of the implementation of a teacher appraisal system which initially non

  15. From Editor vol 11, No.3

    Directory of Open Access Journals (Sweden)

    Ugur Demiray

    2010-07-01

    Full Text Available Greetings, dear readers of TOJDE. TOJDE now appears on your screen as Volume 11, Number 3. This issue contains 3 Notes for Editor, 12 articles and 2 book reviews, with 32 authors from 10 different countries: Barbados, Ghana, Iran, Malaysia, Pakistan, Spain, Turkey, the United Kingdom, the UAE and the USA. "Developing and Validating a Usability Evaluation Tool for Distance Education Websites: Persian Version" was sent to the Notes for Editor section of TOJDE from Iran and is written by Soheila HAFEZI and Ahmad FARAHI from Payame Noor University and Soheil Najafi MEHRI and Hosein MAHMOODI from Baqiyatallah Medical Sciences University, Tehran. Their paper reports that the content validity index (CVI) was measured by a set of ten experts, who evaluated each item individually. According to the CVI, the final version of the instrument was composed of 40 questions divided into 8 domains: Navigation, Functionality, Feedback, Control, Language, Consistency, Error prevention and correction, and Visual clarity. The CVI score for each phrase was more than 0.75. According to these findings, the instrument has enough validity to be applied in evaluating the usability of Persian distance education websites. However, instrument reliability should be measured in a further study. The second Notes for Editor piece, titled "A Critical Analysis of Managerial Skills Competencies of Secondary School Heads Trained through Distance Mode of Allama Iqbal Open University", is written by Muhammad AKHLAQ from Preston University, Islamabad, PAKISTAN and SHAZIA MUNAWAR SULEHRI from the Ministry of Education, PAKISTAN. Their paper analyzes the managerial skills competencies of secondary school heads trained through the distance mode of education in Pakistan. For this purpose a sample of 300 secondary school teachers and 100 secondary school head-teachers trained through distance mode and working in the Federal Government

  16. Analysis of Mobile Internet Platform Automatic Question-Answering Mode

    Institute of Scientific and Technical Information of China (English)

    郭毅; 涂婧璐

    2015-01-01

    In the teaching reforms that schools are carrying out to promote quality education, after-class questions and answers are an essential part of teaching. With the spread of Internet applications, online question answering has become a necessary complement to classroom teaching. At present, most online Q&A is limited by its environment and by the timeliness of answers, which dampens students' interest in learning. Our solution applies the traditional online Q&A approach to mobile Internet devices and provides automatic answering of questions. This frees Q&A from constraints of time, space and environment, expands students' learning space, and can greatly stimulate and promote their interest in learning.

  17. Railway Automatic Ticketing and Gate Monitoring System Based on Big Data Analysis

    Institute of Scientific and Technical Information of China (English)

    王成; 史天运

    2015-01-01

    This article proposes a general framework for a Railway Automatic Ticketing and Gate Monitoring System (RATGS) consisting of 4 layers: an infrastructure layer, a management layer, an analysis layer and an application layer. The system applies technologies such as multidimensional data analysis, distributed file system storage with MapReduce computation, complex event processing (CEP) and data mining to implement value-added services based on passenger behaviour analysis, such as fault early warning, failure rate analysis, equipment utilization analysis, business optimization analysis, OD hotspot analysis, abnormal passenger recognition and equipment usability analysis. These point out a new direction for the future development of RATGS.

  18. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    CERN Document Server

    Mui, Raymond; 10.4204/EPTCS.35.3

    2010-01-01

    Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation are used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.
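
The effect of the transformation ASSIST performs can be illustrated by what the instrumented code should be equivalent to: replacing string-concatenated SQL with bound parameters. A minimal Python/sqlite3 before-and-after illustration (the tool itself targets Java web applications; this only demonstrates the vulnerability class):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

malicious = "x' OR '1'='1"

# Vulnerable: attacker-controlled text is pasted into the SQL string,
# so the OR clause leaks every row.
unsafe = conn.execute(
    "SELECT role FROM users WHERE name = '" + malicious + "'").fetchall()

# Sanitized: the value travels as a bound parameter, never as SQL,
# so no user named "x' OR '1'='1" exists and nothing is returned.
safe = conn.execute(
    "SELECT role FROM users WHERE name = ?", (malicious,)).fetchall()
```

Here `unsafe` returns the admin row while `safe` returns an empty result, which is the behavioural difference automatic sanitization is meant to enforce.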

  19. Preventing SQL Injection through Automatic Query Sanitization with ASSIST

    Directory of Open Access Journals (Sweden)

    Raymond Mui

    2010-09-01

    Full Text Available Web applications are becoming an essential part of our everyday lives. Many of our activities are dependent on the functionality and security of these applications. As the scale of these applications grows, injection vulnerabilities such as SQL injection are major security challenges for developers today. This paper presents the technique of automatic query sanitization to automatically remove SQL injection vulnerabilities in code. In our technique, a combination of static analysis and program transformation are used to automatically instrument web applications with sanitization code. We have implemented this technique in a tool named ASSIST (Automatic and Static SQL Injection Sanitization Tool) for protecting Java-based web applications. Our experimental evaluation showed that our technique is effective against SQL injection vulnerabilities and has a low overhead.

  20. Development of an automatic reactor inspection system

    International Nuclear Information System (INIS)

    Using recent technologies from mobile robotics and computer science, we developed an automatic inspection system for weld lines of the reactor vessel. The ultrasonic inspection of the reactor pressure vessel is currently performed by commercialized robot manipulators. However, since the conventional fixed-type robot manipulator is very large, heavy and expensive, it requires a long inspection time and is hard to handle and maintain. In order to resolve these problems, we developed a new automatic inspection system using a small mobile robot crawling on the vertical wall of the reactor vessel. Following our conceptual design, we developed a reactor inspection system comprising an underwater inspection robot, a laser position-control subsystem, an ultrasonic data acquisition/analysis subsystem and a main control subsystem. We successfully carried out underwater experiments on a reactor vessel mockup and on a real reactor vessel intended for Ulchin nuclear power plant unit 6 at Doosan Heavy Industries in Korea. After this project, we plan to commercialize our inspection system. Using this system, we can expect a large reduction in inspection time, performance enhancement, automatic management of inspection history, etc. From the economic point of view, we can also expect import substitution of more than 4 million dollars. The established essential technologies for intelligent control and automation are expected to be applied to the automation of similar systems in nuclear power plants

  1. Automatic generation of documents

    OpenAIRE

    Rosa Gini; Jacopo Pasquini

    2006-01-01

    This paper describes a natural interaction between Stata and markup languages. Stata’s programming and analysis features, together with the flexibility in output formatting of markup languages, allow generation and/or update of whole documents (reports, presentations on screen or web, etc.). Examples are given for both LaTeX and HTML. Stata’s commands are mainly dedicated to analysis of data on a computer screen and output of analysis stored in a log file available to researchers for later re...

  2. ECONOMICS OF ALEXANDER THE GREAT (15 vols + 4 cdroms) by Gregory Zorzos

    OpenAIRE

    Gregory Zorzos

    2002-01-01

    Research contains many ancient texts (Ancient Greek, Hebrew, Hieroglyphic, Assyrian, Sumerian, Babylonian, Latin, etc.). 1. MICRO-MACRO ECONOMICS OF ALEXANDER THE GREAT (5 vols + cdrom). Microeconomics and macroeconomics of Alexander the Great: economic theories, feasibilities, economic plannings, general description of the campaign's business plan, etc. 2. BANKS OF ALEXANDER THE GREAT (2 vols + cdrom). Describes the banking system, economists, financiers, investors, accountants, bookkeepers, etc,...

  3. Index to Nuclear Safety. A technical progress review by chronology, permuted title, and author. Vol. 11, No. 1--Vol. 17, No. 6

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, W.B.; Klein, A.

    1977-02-23

    This index to Nuclear Safety covers articles in Nuclear Safety Vol. 11, No. 1 (Jan.-Feb. 1970), through Vol. 17, No. 6 (Nov.-Dec. 1976). The index includes a chronological list of articles (including abstract) followed by KWIC and Author Indexes. Nuclear Safety, a bimonthly technical progress review prepared by the Nuclear Safety Information Center, covers all safety aspects of nuclear power reactors and associated facilities. The index lists over 350 technical articles in the last six years of publication.

  4. Index to Nuclear Safety. A technical progress review by chronology, permuted title, and author. Vol. 11, No. 1--Vol. 17, No. 6

    International Nuclear Information System (INIS)

    This index to Nuclear Safety covers articles in Nuclear Safety Vol. 11, No. 1 (Jan.-Feb. 1970), through Vol. 17, No. 6 (Nov.-Dec. 1976). The index includes a chronological list of articles (including abstract) followed by KWIC and Author Indexes. Nuclear Safety, a bimonthly technical progress review prepared by the Nuclear Safety Information Center, covers all safety aspects of nuclear power reactors and associated facilities. The index lists over 350 technical articles in the last six years of publication

  5. Automatic image classification for the urinoculture screening.

    Science.gov (United States)

    Andreini, Paolo; Bonechi, Simone; Bianchini, Monica; Garzelli, Andrea; Mecocci, Alessandro

    2016-03-01

    Urinary tract infections (UTIs) are considered to be the most common bacterial infection, and it is estimated that about 150 million UTIs occur worldwide yearly, giving rise to roughly $6 billion in healthcare expenditures and resulting in 100,000 hospitalizations. Nevertheless, it is difficult to carefully assess the incidence of UTIs, since an accurate diagnosis depends both on the presence of symptoms and on a positive urinoculture, whereas in most outpatient settings this diagnosis is made without an ad hoc analysis protocol. On the other hand, in the traditional urinoculture test, a sample of midstream urine is put onto a Petri dish, where a growth medium favors the proliferation of germ colonies. Then, the infection severity is evaluated by visual inspection by a human expert, an error-prone and lengthy process. In this paper, we propose a fully automated system for urinoculture screening that can provide quick and easily traceable results for UTIs. Based on advanced image processing and machine learning tools, the infection type recognition, together with the estimation of the bacterial load, can be automatically carried out, yielding accurate diagnoses. The proposed AID (Automatic Infection Detector) system provides support during the whole analysis process: first, digital color images of Petri dishes are automatically captured, then specific preprocessing and spatial clustering algorithms are applied to isolate the colonies from the culture ground and, finally, an accurate classification of the infections and their severity evaluation are performed. The AID system speeds up the analysis, contributes to the standardization of the process, allows result repeatability, and reduces the costs. Moreover, the continuous transition between sterile and external environments (typical of the standard analysis procedure) is completely avoided. PMID:26780249
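
The clustering step that separates colonies from the culture ground can be sketched with a plain k-means over pixel colours. This is only a toy stand-in for the AID system's preprocessing and spatial clustering, with a synthetic image in place of a Petri dish photograph:

```python
import numpy as np

def kmeans_pixels(img, k=2, iters=10):
    """Cluster pixel colours with plain k-means; returns a label image.
    Centres are initialised from brightness quantiles so the two
    clusters (dark ground vs bright colonies) are stable."""
    h, w, c = img.shape
    X = img.reshape(-1, c).astype(float)
    order = np.argsort(X.sum(axis=1))
    centers = X[order[np.linspace(0, len(X) - 1, k).astype(int)]]
    for _ in range(iters):
        # Squared distance of every pixel to every centre.
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels.reshape(h, w)

# Synthetic "dish": dark culture ground with one bright colony blob.
img = np.zeros((32, 32, 3), dtype=np.uint8)
img[10:16, 10:16] = 200
labels = kmeans_pixels(img)
colony = labels == labels[12, 12]   # the cluster containing the blob
```

In a real pipeline this would be followed by colony counting and per-colony feature extraction for the infection-type classifier.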

  6. Automatic Melody Segmentation

    OpenAIRE

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation analysis is a widespread practice among musicians: performers use it to help them memorise pieces, music theorists and historians use it to compare works, music students use it to understand the composi...

  7. Analysis on Effect of Weld Profile on Submerged Arc Weld Automatic Ultrasonic Testing Results

    Institute of Scientific and Technical Information of China (English)

    桂光正

    2011-01-01

    The internal quality of double-sided submerged arc welded pipe welds is mainly inspected by an automatic ultrasonic testing system. This article introduces the imported automatic ultrasonic weld testing equipment and its weld-tracking principle as used on the Baosteel UOE production line. Through analysis of this system and of the structure of the calibration pipe used for ultrasonic testing, it explains how several factors affect the accuracy of ultrasonic testing results, including the outside weld profile, the inside and outside weld widths, centre misalignment of the inside and outside welds, and irregular weld profiles caused by edge offset. Finally, it gives measures and methods for improving the ultrasonic testing accuracy for submerged arc welded pipe.

  8. Automatic Melody Segmentation

    NARCIS (Netherlands)

    Rodríguez López, Marcelo

    2016-01-01

    The work presented in this dissertation investigates music segmentation. In the field of Musicology, segmentation refers to a score analysis technique, whereby notated pieces or passages of these pieces are divided into “units” referred to as sections, periods, phrases, and so on. Segmentation analy

  9. Fractal Dimension Quantitative Analysis of Automatic Retinal Vessel Network

    Institute of Scientific and Technical Information of China (English)

    吴辉群; 袁莉莉; 吴幻; 倪晓薇; 邹如意; 陈亚兰; 施李丽; 蒋葵; 董建成

    2015-01-01

    of the curve. In comparison, the fractal dimension curve obtained from the automatically segmented fundus image was a little more regular. Whether Gaussian noise or salt-and-pepper noise was added, the fractal dimension obtained was higher than the original Fourier fractal dimension. When both noises were added simultaneously, the improved Fourier fractal dimension value was far lower than the unmodified Fourier fractal dimension value. The multifractal dimension curve obtained after the wavelet transform fluctuated more violently and its waveform was not smooth compared with the values obtained from the original images. By contrast, the multifractal dimension curve of the original image was relatively stable, with almost no major fluctuations. Conclusion: A quantitative study of the fractal dimension of the retinal vascular network can provide help for retinal image analysis.
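
For readers unfamiliar with fractal dimension estimation on vessel masks: box counting is one common estimator (the record itself uses Fourier and multifractal variants). A compact numpy sketch, verified on a filled square whose dimension should be 2:

```python
import numpy as np

def box_counting_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary image by box counting:
    the slope of log N(s) versus log(1/s), where N(s) is the number of
    s-by-s boxes containing at least one foreground pixel."""
    n, counts = mask.shape[0], []
    for s in sizes:
        # Tile the image into s-by-s blocks and count occupied blocks.
        crop = mask[:n - n % s, :n - n % s]
        grid = crop.reshape(crop.shape[0] // s, s, crop.shape[1] // s, s)
        counts.append(grid.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# A filled square is a 2-D set, so its estimate should be close to 2.
mask = np.ones((64, 64), dtype=bool)
d = box_counting_dimension(mask)
```

Retinal vessel networks typically land between 1 and 2 with this estimator, which is what makes the dimension a useful scalar descriptor.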

  10. Multi-scale Representation and Automatic Generalization of Relief Based on Wavelet Analysis

    Institute of Scientific and Technical Information of China (English)

    吴凡; 祝国瑞

    2001-01-01

    With the ceaseless development of GIS applications, a mass of multi-scale geospatial data needs to be analyzed and represented, because users require spatial data at different levels of detail to deal with different problems and to output maps at different scales. This has become one of the key problems of applied GIS. Logical relations have to be established between spatial data sets at different scales so that one representation of spatial data can be transferred to another completely. Completeness means that spatial precision and characteristics, and an information density suited to the relevant level of abstraction, must be preserved, while the consistency of spatial semantics and spatial relations is maintained. In addition, the derivation of new spatial data sets should be bi-directional under some constraints in GIS: from fine scale to broad scale and vice versa. Automatic generalization of geographical information is the core of multi-scale representation of spatial data, but scale-dependent generalization methods are far from abundant because of their extreme complexity. Most existing algorithms for automatic generalization do not relate to scale directly or accurately, cannot forecast or control the effects of generalization, and cannot assess the holistic consistency of the generalized results. Rational and quantitative methods and criteria for measuring the extent of generalization have still not been found. Wavelet analysis is a branch of mathematics that burgeoned at the end of the 1980s, significant both for the profundity of its theory and for the breadth of its applications. Because it has good local character in both the time (or space) and frequency domains simultaneously, and the sampling interval of a signal can be adjusted automatically for different frequency components, any detail of a function, such as a signal or image, can be analyzed at any scale using wavelet analysis. Therefore, wavelet analysis suggests a new solution to the problems mentioned above
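
The core idea, deriving a broader-scale surface by keeping only the low-frequency wavelet coefficients, can be sketched with the simplest wavelet of all, the Haar basis, whose approximation coefficients are just 2x2 block averages. A toy illustration on a tiny DEM array, not the authors' generalization operator:

```python
import numpy as np

def haar_generalize(dem, levels=1):
    """Multi-scale generalization sketch: repeatedly keep only the Haar
    approximation (2x2 block averages), discarding the detail
    coefficients. Each level halves the resolution, i.e. derives a
    broader-scale surface (assumes even-sized input at each level)."""
    out = dem.astype(float)
    for _ in range(levels):
        h, w = out.shape
        out = 0.25 * (out[0:h:2, 0:w:2] + out[1:h:2, 0:w:2]
                      + out[0:h:2, 1:w:2] + out[1:h:2, 1:w:2])
    return out

dem = np.arange(16, dtype=float).reshape(4, 4)
coarse = haar_generalize(dem, levels=1)   # 2x2 broader-scale surface
```

Keeping the detail coefficients as well would make the transform invertible, which is what gives the bi-directional fine-to-broad and broad-to-fine derivation the abstract asks for.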

  11. Automatic Time Skew Detection and Correction

    OpenAIRE

    Korchagin, Danil

    2011-01-01

    In this paper, we propose a new approach for the automatic time skew detection and correction for multisource audiovisual data, recorded by different cameras/recorders during the same event. All recorded data are successfully tested for potential time skew problem and corrected based on ASR-related features. The core of the algorithm is based on perceptual time-quefrency analysis with a precision of 10 ms. The results show correct time skew detection and elimination in 100% of cases for a rea...
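
The underlying operation, finding the lag that best aligns two feature streams, can be sketched with a plain cross-correlation over 10 ms frames. The paper's method uses ASR-related perceptual time-quefrency features; random features stand in for them here:

```python
import numpy as np

def estimate_skew(ref, other, frame_ms=10):
    """Estimate the time offset between two recordings of the same event
    by maximising the cross-correlation of per-frame feature sequences.
    Positive result: `other` lags behind `ref`."""
    corr = np.correlate(other - other.mean(), ref - ref.mean(), mode="full")
    lag = corr.argmax() - (len(ref) - 1)
    return lag * frame_ms

rng = np.random.default_rng(1)
ref = rng.standard_normal(500)                      # 5 s of 10 ms frames
other = np.concatenate([np.zeros(30), ref])[:500]   # delayed by 30 frames
skew = estimate_skew(ref, other)
```

On this synthetic pair the estimator recovers the 300 ms skew exactly; real recordings additionally need the feature extraction front end and a confidence test on the correlation peak.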

  12. Janus: Automatic Ontology Builder from XSD Files

    CERN Document Server

    Bedini, Ivan; Gardarin, Georges

    2010-01-01

    The construction of a reference ontology for a large domain remains a hard human task. The process is sometimes assisted by software tools that facilitate information extraction from a textual corpus. Despite the widespread use of XML Schema files on the internet, and especially in the B2B domain, tools that offer a complete semantic analysis of XML schemas are rare. In this paper we introduce Janus, a tool for automatically building a reference knowledge base starting from XML Schema files. Janus also provides different useful views to simplify B2B application integration.

  13. New automatic radiation monitoring network in Slovenia

    International Nuclear Information System (INIS)

    The Slovenian Nuclear Safety Administration gathers all on-line dose rate data measured by the various automatic networks operating throughout the territory of Slovenia. With the help of the PHARE financing program and in close cooperation with the Environmental Agency of RS, the upgrade of the existing network began in 2005 and was finished in March 2006. The upgrade provided new measuring sites with all relevant data needed in case of a radiological accident. An even bigger improvement was made in the area of data presentation and analysis, which was the main shortcoming of the old system. (author)

  14. Analysis and Treatment of the Overtime Problem in Automatic Air Supplement of Oil Pressure Device in a Hydropower Station

    Institute of Scientific and Technical Information of China (English)

    郭守峰

    2015-01-01

    The structure and working principles of the automatic air supplement unit adopted in the oil pressure device of the generator governor in a hydropower station are introduced. The overtime problem occurring in the unit's air supplement process is analyzed and, based on this analysis, resolved by optimizing the automatic air supplement parameters.

  15. Analysis and Treatment of the Overtime Problem in Automatic Air Supplement of Oil Pressure Device in a Hydropower Station

    Institute of Scientific and Technical Information of China (English)

    郭守峰

    2015-01-01

    The structure and working principles of the automatic air supplement unit adopted in the oil pressure device of the generator governor in a hydropower station are introduced. The overtime problem occurring in the unit's air supplement process is analyzed and, based on this analysis, resolved by optimizing the automatic air supplement parameters.

  16. Automatic mapping of monitoring data

    DEFF Research Database (Denmark)

    Lophaven, Søren; Nielsen, Hans Bruun; Søndergaard, Jacob

    2005-01-01

    This paper presents an approach, based on universal kriging, for automatic mapping of monitoring data. The performance of the mapping approach is tested on two datasets containing daily mean gamma dose rates in Germany reported by means of the national automatic monitoring network (IMIS). ... In the second dataset an accidental release of radioactivity in the environment was simulated in the South-Western corner of the monitored area. The approach has a tendency to smooth the actual data values, and therefore it underestimates extreme values, as seen in the second dataset. However, it is capable...

  17. Annual review in automatic programming

    CERN Document Server

    Goodman, Richard

    2014-01-01

    Annual Review in Automatic Programming focuses on the techniques of automatic programming used with digital computers. Topics covered range from the design of machine-independent programming languages to the use of recursive procedures in ALGOL 60. A multi-pass translation scheme for ALGOL 60 is described, along with some commercial source languages. The structure and use of the syntax-directed compiler is also considered.Comprised of 12 chapters, this volume begins with a discussion on the basic ideas involved in the description of a computing process as a program for a computer, expressed in

  18. Automatic Construction of Finite Algebras

    Institute of Scientific and Technical Information of China (English)

    张健

    1995-01-01

    This paper deals with model generation for equational theories, i.e., automatically generating (finite) models of a given set of (logical) equations. Our method of finite model generation and a tool for automatic construction of finite algebras are described. Some examples are given to show the applications of our program. We argue that the combination of model generators and theorem provers enables us to get a better understanding of logical theories. A brief comparison between our tool and other similar tools is also presented.
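
The idea of finite model generation can be conveyed by a brute-force toy: enumerate every binary-operation table over a small carrier set and keep those satisfying the axioms. Real tools such as the one described use constraint propagation to scale far beyond this sketch:

```python
from itertools import product

def find_models(n, axioms):
    """Enumerate all binary-operation tables on {0..n-1} that satisfy
    every axiom; a table is a flat tuple f with op(x, y) = f[x*n + y]."""
    models = []
    for flat in product(range(n), repeat=n * n):
        op = lambda x, y, f=flat: f[x * n + y]
        if all(ax(op, n) for ax in axioms):
            models.append(flat)
    return models

def associative(op, n):
    return all(op(op(x, y), z) == op(x, op(y, z))
               for x in range(n) for y in range(n) for z in range(n))

def commutative(op, n):
    return all(op(x, y) == op(y, x) for x in range(n) for y in range(n))

# All commutative semigroup operations on a two-element carrier set.
models = find_models(2, [associative, commutative])
```

On {0, 1} this finds six tables, including AND `(0, 0, 0, 1)`, OR `(0, 1, 1, 1)` and XOR `(0, 1, 1, 0)`, while ruling out non-associative tables such as NAND.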

  19. Operational Automatic Remote Sensing Image Understanding Systems: Beyond Geographic Object-Based and Object-Oriented Image Analysis (GEOBIA/GEOOIA). Part 2: Novel System Architecture, Information/Knowledge Representation, Algorithm Design and Implementation

    Directory of Open Access Journals (Sweden)

    Luigi Boschetti

    2012-09-01

    Full Text Available According to the literature, and despite their commercial success, state-of-the-art two-stage non-iterative geographic object-based image analysis (GEOBIA) systems and three-stage iterative geographic object-oriented image analysis (GEOOIA) systems, where GEOOIA ⊃ GEOBIA, remain affected by a lack of productivity, general consensus and research. To outperform the Quality Indexes of Operativeness (OQIs) of existing GEOBIA/GEOOIA systems in compliance with the Quality Assurance Framework for Earth Observation (QA4EO) guidelines, this methodological work is split into two parts. Based on an original multi-disciplinary Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis of the GEOBIA/GEOOIA approaches, the first part of this work promotes a shift of learning paradigm in the pre-attentive vision first stage of a remote sensing (RS) image understanding system (RS-IUS), from sub-symbolic statistical model-based (inductive) image segmentation to symbolic physical model-based (deductive) image preliminary classification, capable of accomplishing image sub-symbolic segmentation and image symbolic pre-classification simultaneously. In the present second part of this work, a novel hybrid (combined deductive and inductive) RS-IUS architecture featuring a symbolic deductive pre-attentive vision first stage is proposed and discussed in terms of: (a) computational theory (system design), (b) information/knowledge representation, (c) algorithm design and (d) implementation. As proof-of-concept of a symbolic physical model-based pre-attentive vision first stage, the spectral knowledge-based, operational, near real-time, multi-sensor, multi-resolution, application-independent Satellite Image Automatic Mapper™ (SIAM™) is selected from the existing literature. To the best of these authors' knowledge, this is the first time a symbolic syntactic inference system, like SIAM™, is made available to the RS community for operational use in an RS-IUS pre-attentive vision first stage

  20. Automatic Licenses Plate Recognition

    OpenAIRE

    Ronak P Patel; Narendra M Patel; Keyur Brahmbhatt

    2013-01-01

    This paper describes the Smart Vehicle Screening System, which can be installed into a tollbooth for automated recognition of vehicle license plate information using a photograph of a vehicle. An automated system could then be implemented to control the payment of fees, parking areas, highways, bridges or tunnels, etc. This paper contains a new algorithm for recognizing number plates using morphological operations, thresholding, edge detection and bounding box analysis for number plate extract...
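
The bounding box analysis step mentioned above can be sketched as connected-component labelling over a thresholded image, followed by taking each component's extent. A pure-Python flood fill on a synthetic image; a real ANPR pipeline would use an image library and add plate-shape filtering:

```python
from collections import deque

import numpy as np

def bounding_boxes(binary):
    """Return (row0, col0, row1, col1) bounding boxes of 4-connected
    foreground regions, the step used to localise plate candidates."""
    h, w = binary.shape
    seen = np.zeros_like(binary, dtype=bool)
    boxes = []
    for i in range(h):
        for j in range(w):
            if binary[i, j] and not seen[i, j]:
                q, (r0, c0, r1, c1) = deque([(i, j)]), (i, j, i, j)
                seen[i, j] = True
                while q:                      # BFS flood fill
                    r, c = q.popleft()
                    r0, c0 = min(r0, r), min(c0, c)
                    r1, c1 = max(r1, r), max(c1, c)
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < h and 0 <= cc < w
                                and binary[rr, cc] and not seen[rr, cc]):
                            seen[rr, cc] = True
                            q.append((rr, cc))
                boxes.append((r0, c0, r1, c1))
    return boxes

img = np.zeros((10, 20), dtype=bool)
img[2:5, 3:9] = True      # a wide, plate-like bright region
img[7, 15] = True         # a small noise speck
boxes = bounding_boxes(img)
```

Candidate boxes would then be filtered by aspect ratio and area: the wide box survives as a plate candidate, the single-pixel speck is discarded.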

  1. Automatic Compartment Modelling and Segmentation for Dynamical Renal Scintigraphies

    DEFF Research Database (Denmark)

    Ståhl, Daniel; Åström, Kalle; Overgaard, Niels Christian;

    2011-01-01

    Time-resolved medical data has important applications in a large variety of medical applications. In this paper we study automatic analysis of dynamical renal scintigraphies. The traditional analysis pipeline for dynamical renal scintigraphies is to use manual or semiautomatic methods for segmentation of pixels into physical compartments, extract their corresponding time-activity curves and then compute the parameters that are relevant for medical assessment. In this paper we present a fully automatic system that incorporates spatial smoothing constraints, compartment modelling and positivity...

  2. Componentes volátiles de mamey (mammea americana L.

    Directory of Open Access Journals (Sweden)

    Alicia Lucía Morales

    2010-07-01

    Full Text Available The volatile aroma components of mamey (Mammea americana L.) were extracted using simultaneous steam distillation-solvent extraction. The extract was pre-fractionated by column chromatography on silica gel with a discontinuous pentane:diethyl ether gradient to obtain three fractions, which were analyzed by HRGC and HRGC-MS. Thirty-four compounds were detected, of which 22 were identified; the major components were furfural (7281 µg/kg) and E-farnesol (2145 µg/kg).

  3. Image Based Hair Segmentation Algorithm for the Application of Automatic Facial Caricature Synthesis

    OpenAIRE

    2014-01-01

    Hair is a salient feature of the human face region and one of the important cues for face analysis. Accurate detection and representation of the hair region is one of the key components of automatic synthesis of human facial caricatures. In this paper, an automatic hair detection algorithm for the application of automatic facial caricature synthesis based on a single image is proposed. Firstly, hair regions in training images are labeled manually and then the hair position prior distributions an...

  4. Automatic landslides detection on Stromboli volcanic Island

    Science.gov (United States)

    Silengo, Maria Cristina; Delle Donne, Dario; Ulivieri, Giacomo; Cigolini, Corrado; Ripepe, Maurizio

    2016-04-01

    Landslides occurring on active volcanic islands play a key role in triggering tsunamis and other related risks. It is therefore vital for correct and prompt risk assessment to monitor landslide activity and to have an automatic system for robust early warning. We have developed a system based on multi-frequency analysis of seismic signals for automatic detection of landslides occurring at Stromboli volcano, using a network of 4 three-component seismic stations located along the unstable flank of the Sciara del Fuoco. Our method is able to recognize and separate the different sources of seismic signals related to volcanic and tectonic activity (e.g. tremor, explosions, earthquakes) from landslides. This is done using multi-frequency analysis combined with waveform pattern recognition. We applied the method to one year of seismic activity of Stromboli volcano centered on the 2007 effusive eruption. This eruption was characterized by pre-eruptive landslide activity reflecting the slow deformation of the volcano edifice. The algorithm is at the moment running off-line but has proved to be robust and efficient in automatically picking landslides. The method also provides real-time statistics on landslide occurrence, which could be used as a proxy for volcano deformation during pre-eruptive phases. The method is very promising since the number of false detections is quite small. We propose automatic detection as an improving tool for early warnings of tsunami-genic landslide activity. We suggest that a similar approach could also be applied to other unstable non-volcanic slopes.
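    As a toy illustration of the multi-frequency idea (not the authors' algorithm), one can compare spectral energy in two frequency bands to separate narrowband tremor-like signals from broadband ones; the band limits, the threshold of 1, and the synthetic signals are assumptions for the sketch.

```python
import numpy as np

def band_energy_ratio(signal, fs, low_band=(1, 10), high_band=(10, 45)):
    """Toy multi-frequency discriminant: ratio of spectral energy in a
    high band to that in a low band. Broadband events score high, while
    narrowband low-frequency signals (e.g. tremor) score near zero."""
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    def band(f0, f1):
        return power[(freqs >= f0) & (freqs < f1)].sum()
    return band(*high_band) / band(*low_band)

fs = 100.0
t = np.arange(0, 10, 1 / fs)
tremor = np.sin(2 * np.pi * 2 * t)            # narrowband, low frequency
rng = np.random.default_rng(0)
broadband = rng.normal(size=t.size)           # broadband, landslide-like
print(band_energy_ratio(tremor, fs) < 1 < band_energy_ratio(broadband, fs))
```

    A production detector would combine several such band ratios with waveform pattern recognition, as the abstract describes.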

  5. Automatic Identification of Metaphoric Utterances

    Science.gov (United States)

    Dunn, Jonathan Edwin

    2013-01-01

    This dissertation analyzes the problem of metaphor identification in linguistic and computational semantics, considering both manual and automatic approaches. It describes a manual approach to metaphor identification, the Metaphoricity Measurement Procedure (MMP), and compares this approach with other manual approaches. The dissertation then…

  6. Automatically Preparing Safe SQL Queries

    Science.gov (United States)

    Bisht, Prithvi; Sistla, A. Prasad; Venkatakrishnan, V. N.

    We present the first sound program source transformation approach for automatically transforming the code of a legacy web application to employ PREPARE statements in place of unsafe SQL queries. Our approach therefore opens the way for eradicating the SQL injection threat vector from legacy web applications.

  7. The Automatic Measurement of Targets

    DEFF Research Database (Denmark)

    Höhle, Joachim

    1997-01-01

    The automatic measurement of targets is demonstrated by means of a theoretical example and by an interactive measuring program for real imagery from a réseau camera. The strategy used is a combination of two methods: the maximum correlation coefficient and correlation in the subpixel range...

  8. Automatic quantification of iris color

    DEFF Research Database (Denmark)

    Christoffersen, S.; Harder, Stine; Andersen, J. D.;

    2012-01-01

    An automatic algorithm to quantify the eye colour and structural information from standard hi-resolution photos of the human iris has been developed. Initially, the major structures in the eye region are identified including the pupil, iris, sclera, and eyelashes. Based on this segmentation, the ...

  9. Automatic Association of News Items.

    Science.gov (United States)

    Carrick, Christina; Watters, Carolyn

    1997-01-01

    Discussion of electronic news delivery systems and the automatic generation of electronic editions focuses on the association of related items of different media type, specifically photos and stories. The goal is to be able to determine to what degree any two news items refer to the same news event. (Author/LRW)

  10. Automatic milking : a better understanding

    NARCIS (Netherlands)

    Meijering, A.; Hogeveen, H.; Koning, de C.J.A.M.

    2004-01-01

    In 2000 the book Robotic Milking, reflecting the proceedings of an International Symposium held in The Netherlands, came out. At that time, commercial introduction of automatic milking systems was no longer obstructed by technological inadequacies. Particularly in a few west-European countries...

  11. Analysis of Volatile Components in Semen Sojae Praepatum with Automatic Static Headspace and Gas Chromatography-Mass Spectrometry

    Institute of Scientific and Technical Information of China (English)

    柴川; 于生; 崔小兵; 张爱华; 朱栋; 单晨啸; 文红梅

    2013-01-01

    This is the first report on the volatile components of Semen Sojae Praepatum. For the analysis, Semen Sojae Praepatum was analyzed by automatic static headspace sampling combined with gas chromatography-mass spectrometry (HS-GC-MS). A total of 27 compounds were identified through computer retrieval against the NIST5 mass spectral library. They comprised 11 components common to all batches, such as 2-butanone, 3-methylbutanal, 2-methylbutanal and limonene, and 16 batch-specific components, such as copaene, tetramethylpyrazine, 2,3,5-trimethyl-6-ethylpyrazine and (1S-endo)-1,7,7-trimethylbicyclo[2.2.1]heptan-2-ol acetate (bornyl acetate). Quantitative analysis by the area normalization method showed some differences among the six batches of Semen Sojae Praepatum. The results indicate that automatic static headspace GC-MS is a fast, easy, efficient and accurate method for analyzing the volatile components of Semen Sojae Praepatum; the findings may promote fingerprint research on these volatile components and provide a scientific basis for the establishment of a quality standard.
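    The area normalization method used for quantification simply divides each integrated peak area by the total area of all peaks to give a relative content in percent. A minimal sketch with made-up peak areas (the compound names and values are illustrative only):

```python
def area_normalize(peak_areas):
    """Relative content (%) of each component by area normalization:
    each peak area divided by the total area of all integrated peaks."""
    total = sum(peak_areas.values())
    return {name: 100.0 * a / total for name, a in peak_areas.items()}

# hypothetical integrated peak areas from one chromatogram
areas = {"2-butanone": 120.0, "3-methylbutanal": 60.0, "limonene": 20.0}
print(area_normalize(areas))
# {'2-butanone': 60.0, '3-methylbutanal': 30.0, 'limonene': 10.0}
```

    Note that area normalization reports relative, not absolute, amounts, which is why it is suited to comparing batches rather than quantifying concentrations.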

  12. Handbook on criticality. Vol. 1. Criticality and nuclear safety; Handbuch zur Kritikalitaet. Bd. 1. Kritikalitaet und nukleare Sicherheit

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2015-04-15

    This handbook was prepared primarily with the aim of providing information to experts in industry, authorities or research facilities engaged in criticality-safety-related problems, allowing an adequate and rapid assessment of criticality safety issues already during the planning and preparation of nuclear facilities. It is not, however, the intention of the authors to offer ready solutions to complex problems of nuclear safety; such questions must remain subject to in-depth analysis and assessment carried out by dedicated criticality safety experts. Compared with the previous edition of December 1998, this handbook has been further revised and supplemented. The proven basic structure of the handbook remains unchanged. The handbook follows in some ways similar criticality handbooks or instructions published in the USA, UK, France, Japan and the former Soviet Union. The expedient use of the information given in this handbook requires a fundamental understanding of criticality and the terminology of nuclear safety. In Vol. 1, ''Criticality and Nuclear Safety'', the most important terms and fundamentals are therefore first introduced and explained. Subsequently, experimental techniques and calculation methods for evaluating criticality problems are presented. The following chapters of Vol. 1 deal, among other things, with the effect of neutron reflectors and absorbers, neutron interaction, measuring methods for criticality, and organisational safety measures, and provide an overview of criticality-relevant operational experience and of criticality accidents and their potential hazardous impact. Vol. 2 parts 1 and 2 finally compile criticality parameters in graphical and tabular form. The individual graph sheets are provided with a set of identifiers, explained at the outset, to allow quick finding of the information of current interest. Part 1 includes criticality parameters for systems with {sup 235}U as fissile material, while part

  13. Characterization of Ilex Paraguariensis using Gas Chromatography of the Volatile Components

    OpenAIRE

    Carlos R Romero; Nelly L Jorge; Manuel E Gómez

    2006-01-01

    The volatile components of the leaves of Ilex paraguariensis have been studied by gas chromatography, using a capillary column with methylphenylsilicone as the stationary phase, in order to contribute to the taxonomic characterization of the species. Two methods for extracting the volatile components were compared: steam distillation and static sampling of the chamber occupied by the vapour in equilibrium with the material (Headspace Analysis, HSA). The HSA technique is the one that ...

  14. Vers un contrôle de vol d'un oiseau artificiel

    OpenAIRE

    Lenoir, Yves

    2010-01-01

    This article concerns the study of a flapping-wing drone. Observations of the flight of large birds, notably of the kinematics of their wingbeat cycle, together with consideration of their physiological capacities, led to a simplified calculation of the aerodynamic forces generated by a bird in stabilized rectilinear flight. The result is used to find the wing-warping (twist) controls that maintain horizontal flight. The value of the mean power required...

  15. Diffuse X-ray scattering from benzil, C14H10O2: analysis via automatic refinement of a Monte Carlo model

    International Nuclear Information System (INIS)

    Full text: A recently developed method for fitting a Monte Carlo computer simulation model to observed single crystal diffuse X-ray scattering data has been used to study the diffuse scattering in benzil. The analysis has shown that the diffuse lines, that feature so prominently in the observed diffraction patterns, are due to strong longitudinal displacement correlations transmitted from molecule to molecule via a network of contacts involving hydrogen bonding

  16. Solving the AI Planning Plus Scheduling Problem Using Model Checking via Automatic Translation from the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL)

    Science.gov (United States)

    Butler, Ricky W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2007-01-01

    This paper describes a translator from a new planning language named the Abstract Plan Preparation Language (APPL) to the Symbolic Analysis Laboratory (SAL) model checker. This translator has been developed in support of the Spacecraft Autonomy for Vehicles and Habitats (SAVH) project sponsored by the Exploration Technology Development Program, which is seeking to mature autonomy technology for the vehicles and operations centers of Project Constellation.

  17. ARCA: Traffic Classification Method Based on Automatic Reverse and Cluster Analysis

    Institute of Scientific and Technical Information of China (English)

    李城龙; 薛一波; 汪东升

    2012-01-01

    Traffic classification and protocol identification are the premise and essential condition of effective network management. However, more and more encrypted protocols make traditional traffic classification methods less effective. To address this issue, this paper proposes an automatic reverse and message analysis (ARCA) method to identify encrypted protocols. Different from traditional classification approaches, the proposed method exploits the protocol structure by automatically and reversely analyzing the target protocol, obtains the protocol interactive process by clustering messages, and then identifies the protocol using the protocol structure and interactive process together. This method does not need to check the payload, so it can classify encrypted protocols. The paper evaluates the efficacy and accuracy of ARCA with real-world traffic from encrypted protocols such as Thunder, BitTorrent, QQ and GTalk. The experimental results show that the accuracy rates and recall rates are over 96.9% and 93.1% respectively, and that only 0.9% of the byte content of the traffic needs to be inspected. ARCA can therefore identify various encrypted protocol flows effectively and quickly.

  18. Automatic and accurate measurements of P-wave and S-wave polarisation properties with a weighted multi-station complex polarisation analysis

    Science.gov (United States)

    de Meersman, K.; van der Baan, M.; Kendall, J.-M.; Jones, R. H.

    2003-04-01

    We present a weighted multi-station complex polarisation analysis to determine P-wave and S-wave polarisation properties of three-component seismic array data. Complex polarisation analysis of particle motion on seismic data was first introduced by Vidale (1986). In its original form, the method is an interpretation of the eigenvalue decomposition of a 3-by-3 complex data-covariance matrix. We have extended the definition of the data-covariance matrix (C) to C = X^H W^-1 X, where C is now a 3n-by-3n symmetric complex covariance matrix, with n the number of included three-component (3C) stations. X is the data matrix, the columns of which are the analytic signals of the Northern, Eastern and vertical components of the subsequent 3C stations. X^H is the transpose of the complex conjugate of X, and W is a diagonal weighting matrix containing the pre-arrival noise levels of all components and all stations. The signals used in the data matrix are corrected for arrival time differences. The eigenvectors and eigenvalues of C now describe the polarisation properties within the selected analysis window for all included stations. The main advantages of this approach are a better separation of signal and noise in the covariance matrix and the measurement of signal polarisation properties that are not influenced by the presence of polarised white noise. The technique was incorporated in an automated routine to measure the P-wave and S-wave polarisation properties of a microseismic data-set. The data were recorded in the Valhall oilfield in 1998 with a six-level 3C vertical linear array with geophones at 20 m intervals between depths of 2100 m and 2200 m. In total 303 microseismic events were analysed and the results compared with manual interpretations. This comparison showed the advantage and high accuracy of the method.

  19. Research on Bench-marking Analysis Model of Competitive Intelligence Integrating Automatic Text Classification

    Institute of Scientific and Technical Information of China (English)

    张玉峰; 黄姮

    2011-01-01

    This paper analyses the ideas and defects of the traditional benchmarking method and proposes combining traditional intelligence analysis methods with intelligent analysis technology, constructing a benchmarking analysis model of competitive intelligence that integrates automatic text classification. A hierarchical indicator system of benchmarking content is constructed and used as the classification scheme for the automatic text classification. The two methods complement and optimize each other, realizing a virtuous-cycle, scientific and intelligent analysis of competitive intelligence. The functions and tasks of the model and its intelligence analysis process and algorithms are then studied in depth. Finally, the model is evaluated in terms of scientific validity, timeliness, comprehensiveness, accuracy and dynamism.

  20. Automatic Evaluation of Textual Coherence in Police News Using Latent Semantic Analysis

    Directory of Open Access Journals (Sweden)

    SERGIO HERNÁNDEZ OSUNA

    2010-01-01

    Full Text Available This article presents the results of an investigation that aimed to assess textual coherence automatically, using the method of Latent Semantic Analysis, in the domain of police news. For this purpose, a prototype tool was built using only free software that can be obtained from the Internet. To validate the performance of the prototype, its evaluation was compared with that made by eight human evaluators: four journalists and four Spanish teachers with graduate studies in linguistics.
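    A toy illustration of the LSA approach (not the prototype itself): sentences are projected into a low-dimensional latent space via the SVD of a term-sentence count matrix, and coherence is scored as the average cosine similarity of consecutive sentence vectors. The corpus, the dimension k and the scoring rule are made-up assumptions.

```python
import numpy as np

def lsa_coherence(sentences, k=2):
    """Toy LSA coherence score: build a term-sentence count matrix, project
    sentences into a k-dimensional latent space with the SVD, and average
    the cosine similarity of consecutive sentence vectors."""
    vocab = sorted({w for s in sentences for w in s.lower().split()})
    A = np.array([[s.lower().split().count(w) for s in sentences]
                  for w in vocab], float)
    U, S, Vt = np.linalg.svd(A, full_matrices=False)
    sent_vecs = (np.diag(S[:k]) @ Vt[:k]).T      # sentences in latent space
    sims = []
    for a, b in zip(sent_vecs, sent_vecs[1:]):
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        sims.append(float(a @ b / denom) if denom else 0.0)
    return sum(sims) / len(sims)

coherent = ["the police arrested the suspect",
            "the suspect was arrested downtown",
            "police said the suspect confessed"]
incoherent = ["the police arrested the suspect",
              "bananas are rich in potassium",
              "police said the suspect confessed"]
print(lsa_coherence(coherent) > lsa_coherence(incoherent))
```

    A realistic system would use TF-IDF weighting, a much larger corpus to estimate the latent space, and a higher k.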

  1. Scene analysis based automatic background music recommendation for personal videos

    Institute of Scientific and Technical Information of China (English)

    徐鸿雁

    2014-01-01

    Embedding suitable background music into personal videos usually requires rich professional knowledge. For this application, an automatic background music recommendation method based on video scene analysis is proposed. The method analyses features of scene, rhythm, melody and timbre, and builds a structured association model so that the recommended background music best matches the video content. Experiments on real videos verify that the proposed method can effectively recommend professional background music for personally recorded videos, which significantly reduces the background-music editing burden of ordinary users.

  2. An Automatic Proof of Euler's Formula

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2005-05-01

    Full Text Available In this information age, everything is digitalized, so the encoding of functions and the automatic proof of statements about them are important. This paper discusses the automatic calculation of Taylor expansion coefficients; as an example, this can be applied to prove Euler's formula automatically.
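    The idea can be illustrated by comparing Taylor coefficients term by term: exp(ix) has n-th coefficient i^n/n!, and matching it against the coefficients of cos(x) + i sin(x) yields Euler's formula. A small sketch (this is a stand-in illustration, not the paper's encoding):

```python
from math import factorial

I_POW = [1, 1j, -1, -1j]                  # i**n cycles with period 4

def exp_ix_coeff(n):
    """n-th Taylor coefficient of exp(ix) about 0: i**n / n!."""
    return I_POW[n % 4] / factorial(n)

def cos_plus_isin_coeff(n):
    """n-th Taylor coefficient of cos(x) + i*sin(x) about 0."""
    if n % 2 == 0:                        # cos contributes the even orders
        return (-1) ** (n // 2) / factorial(n)
    return 1j * (-1) ** ((n - 1) // 2) / factorial(n)

# the two series agree term by term, which is Euler's formula
print(all(exp_ix_coeff(n) == cos_plus_isin_coeff(n) for n in range(16)))  # True
```

    Representing i**n by its period-4 cycle keeps the comparison exact, avoiding the rounding error of complex exponentiation.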

  3. Self-Compassion and Automatic Thoughts

    Science.gov (United States)

    Akin, Ahmet

    2012-01-01

    The aim of this research is to examine the relationships between self-compassion and automatic thoughts. Participants were 299 university students. In this study, the Self-compassion Scale and the Automatic Thoughts Questionnaire were used. The relationships between self-compassion and automatic thoughts were examined using correlation analysis…

  4. Automatic Addition of Genre Information in a Japanese Dictionary

    Directory of Open Access Journals (Sweden)

    Raoul BLIN

    2012-10-01

    Full Text Available This article presents the method used for the automatic addition of genre information to the Japanese entries in a Japanese-French dictionary. The dictionary is intended for a wide audience, ranging from learners of Japanese as a second language to researchers. The genre characterization is based on the statistical analysis of corpora representing different genres. We will discuss the selection of genres and corpora, the tool and method of analysis, the difficulties encountered during this analysis and their solutions.

  5. Arvustuse piirid ja auhinnad : mõtteid kirjanduskriitikast vol 2 / Jan Kaus

    Index Scriptorium Estoniae

    Kaus, Jan, 1971-

    2007-01-01

    Tänapäeva eesti kirjanduskriitikast. Vt. ka vol. 1: Kaus, Jan. Eetika, taburetiefekt ja jätkuv objektivisatsioon, Sirp, 15. juuni., lk. 7. Vastukaja: Raudam, Toomas. Arvustuse piiritused //Sirp (2007) 10. aug., lk. 9

  6. Siim Nestor soovitab : Tallinn Doom Night vol.1. Odessa Pop / Siim Nestor

    Index Scriptorium Estoniae

    Nestor, Siim, 1974-

    2006-01-01

    The Finnish doom-metal band Reverend Bizarre at the event "Tallinn Doom Night vol.1" on 8 December at the club Rockstar's. The Swedish indie-pop duo My Darling Yoy! at the event "Odessa Pop" on 9 December at the club KuKu in Tallinn

  7. Automatic Metadata Generation using Associative Networks

    CERN Document Server

    Rodriguez, Marko A; Van de Sompel, Herbert

    2008-01-01

    In spite of its tremendous value, metadata is generally sparse and incomplete, thereby hampering the effectiveness of digital information services. Many of the existing mechanisms for the automated creation of metadata rely primarily on content analysis which can be costly and inefficient. The automatic metadata generation system proposed in this article leverages resource relationships generated from existing metadata as a medium for propagation from metadata-rich to metadata-poor resources. Because of its independence from content analysis, it can be applied to a wide variety of resource media types and is shown to be computationally inexpensive. The proposed method operates through two distinct phases. Occurrence and co-occurrence algorithms first generate an associative network of repository resources leveraging existing repository metadata. Second, using the associative network as a substrate, metadata associated with metadata-rich resources is propagated to metadata-poor resources by means of a discrete...
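    The occurrence/co-occurrence and propagation phases described above can be sketched with a hypothetical mini-example (the data, function names and the simple scoring rule are assumptions, not the article's algorithms): edge weights come from metadata terms two resources share, and a metadata-poor resource inherits the highest-scoring terms of its neighbours.

```python
def cooccurrence_network(metadata):
    """Associative network: edge weight = number of metadata terms shared."""
    ids = list(metadata)
    edges = {}
    for i, a in enumerate(ids):
        for b in ids[i + 1:]:
            w = len(metadata[a] & metadata[b])
            if w:
                edges[(a, b)] = edges[(b, a)] = w
    return edges

def propagate(metadata, edges, target, top_k=2):
    """Suggest terms for a metadata-poor resource from its neighbours,
    scored by edge weight; the target's own terms are excluded."""
    scores = {}
    for (a, b), w in edges.items():
        if a == target:
            for term in metadata[b] - metadata[target]:
                scores[term] = scores.get(term, 0) + w
    ranked = sorted(scores.items(), key=lambda kv: (-kv[1], kv[0]))
    return [term for term, _ in ranked][:top_k]

meta = {
    "paper1": {"physics", "neutrino", "detector"},
    "paper2": {"physics", "neutrino", "oscillation"},
    "paper3": {"physics"},                # metadata-poor resource
}
print(propagate(meta, cooccurrence_network(meta), "paper3"))
# -> ['neutrino', 'detector']
```

    Because propagation uses only existing metadata, no content analysis of the resources themselves is needed, which is the efficiency argument the abstract makes.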

  8. Characterizing chaotic melodies in automatic music composition.

    Science.gov (United States)

    Coca, Andrés E; Tost, Gerard O; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.
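    A toy sketch of composing a melody from a chaotic dynamical system (a logistic map quantised onto a scale); the map, parameters and scale are illustrative assumptions, not the authors' algorithm:

```python
def logistic_melody(x0=0.3, r=3.99, length=16,
                    scale=(60, 62, 64, 65, 67, 69, 71, 72)):
    """Toy chaotic melody: iterate the logistic map x -> r*x*(1-x), which is
    chaotic for r near 4, and quantise each state in (0, 1) onto a C-major
    scale of MIDI note numbers."""
    notes, x = [], x0
    for _ in range(length):
        x = r * x * (1 - x)
        notes.append(scale[min(int(x * len(scale)), len(scale) - 1)])
    return notes

melody = logistic_melody()
print(len(melody), all(60 <= n <= 72 for n in melody))  # 16 True
```

    Varying r (and hence the Lyapunov exponent) changes how erratically the melody wanders, which is the kind of dynamical-to-melodic link the paper quantifies with canonical correlation analysis.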

  10. Automatic schema evolution in Root

    International Nuclear Information System (INIS)

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and its objects inspected, even when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file

  11. Automatic spikes detection in seismogram

    Institute of Scientific and Technical Information of China (English)

    王海军; 靳平; 刘贵忠

    2003-01-01

    Data processing for a seismic network is complex and tedious, because a large amount of data is recorded every day, which makes it impossible to process all of it manually. Therefore, seismic data should be processed automatically to produce initial results on event detection and location, which are afterwards reviewed and modified by an analyst. In automatic processing, data quality checking is important. Three main kinds of problem data exist in real seismic records: spikes, repeated data and dropouts. A spike is defined as an isolated large-amplitude point; the other two kinds share the feature that the amplitudes of sample points are uniform over an interval. In data quality checking, the first step is to detect and count problem data in a data segment; if the percentage of problem data exceeds a threshold, the whole segment is masked and excluded from later processing.
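    The two checks described above can be sketched directly: flag isolated large-amplitude points (spikes) and flag runs of identical samples (repeated data or dropouts). The specific thresholds and the synthetic trace are assumptions for illustration.

```python
import numpy as np

def find_spikes(x, factor=10.0):
    """Flag isolated large-amplitude samples: points whose magnitude
    exceeds `factor` times both the neighbouring amplitudes and the
    median amplitude of the trace."""
    x = np.asarray(x, float)
    med = np.median(np.abs(x))
    spikes = []
    for i in range(1, len(x) - 1):
        neighbours = max(abs(x[i - 1]), abs(x[i + 1]))
        if abs(x[i]) > factor * max(neighbours, med, 1e-12):
            spikes.append(i)
    return spikes

def uniform_runs(x, min_len=5):
    """Flag runs of identical samples (repeated data or dropouts),
    returned as (start, end) index pairs."""
    runs, start = [], 0
    for i in range(1, len(x) + 1):
        if i == len(x) or x[i] != x[start]:
            if i - start >= min_len:
                runs.append((start, i - 1))
            start = i
    return runs

trace = [0, 1, -1, 2, 500, 1, 0, 0, 0, 0, 0, 0, 2, -1]
print(find_spikes(trace))    # [4]
print(uniform_runs(trace))   # [(6, 11)]
```

    A segment would then be masked if, say, the flagged samples exceed a set percentage of its length, as the abstract describes.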

  12. Physics of Automatic Target Recognition

    CERN Document Server

    Sadjadi, Firooz

    2007-01-01

    Physics of Automatic Target Recognition addresses the fundamental physical bases of sensing, and information extraction in the state-of-the art automatic target recognition field. It explores both passive and active multispectral sensing, polarimetric diversity, complex signature exploitation, sensor and processing adaptation, transformation of electromagnetic and acoustic waves in their interactions with targets, background clutter, transmission media, and sensing elements. The general inverse scattering, and advanced signal processing techniques and scientific evaluation methodologies being used in this multi disciplinary field will be part of this exposition. The issues of modeling of target signatures in various spectral modalities, LADAR, IR, SAR, high resolution radar, acoustic, seismic, visible, hyperspectral, in diverse geometric aspects will be addressed. The methods for signal processing and classification will cover concepts such as sensor adaptive and artificial neural networks, time reversal filt...

  13. Automatic Schema Evolution in Root

    Institute of Scientific and Technical Information of China (English)

    Rene Brun; Fons Rademakers

    2001-01-01

    ROOT version 3 (spring 2001) supports automatic class schema evolution. In addition this version also produces files that are self-describing. This is achieved by storing in each file a record with the description of all the persistent classes in the file. Being self-describing guarantees that a file can always be read later, its structure browsed and objects inspected, also when the library with the compiled code of these classes is missing. The schema evolution mechanism supports the frequent case when multiple data sets generated with many different class versions must be analyzed in the same session. ROOT supports the automatic generation of C++ code describing the data objects in a file.

  14. Automatic QRS complex detection algorithm designed for a novel wearable, wireless electrocardiogram recording device

    DEFF Research Database (Denmark)

    Saadi, Dorthe Bodholt; Egstrup, Kenneth; Branebjerg, Jens;

    2012-01-01

    We have designed and optimized an automatic QRS complex detection algorithm for electrocardiogram (ECG) signals recorded with the DELTA ePatch platform. The algorithm is able to automatically switch between single-channel and multi-channel analysis mode. This preliminary study includes data from ...

  15. The Automaticity of Social Life

    OpenAIRE

    Bargh, John A.; Williams, Erin L.

    2006-01-01

    Much of social life is experienced through mental processes that are not intended and about which one is fairly oblivious. These processes are automatically triggered by features of the immediate social environment, such as the group memberships of other people, the qualities of their behavior, and features of social situations (e.g., norms, one's relative power). Recent research has shown these nonconscious influences to extend beyond the perception and interpretation of the social world to ...

  16. Automatically-Programed Machine Tools

    Science.gov (United States)

    Purves, L.; Clerman, N.

    1985-01-01

    Software produces cutter location files for numerically-controlled machine tools. APT, acronym for Automatically Programed Tools, is among most widely used software systems for computerized machine tools. APT developed for explicit purpose of providing effective software system for programing NC machine tools. APT system includes specification of APT programing language and language processor, which executes APT statements and generates NC machine-tool motions specified by APT statements.

  17. Automatic Generation of Technical Documentation

    OpenAIRE

    Reiter, Ehud; Mellish, Chris; Levine, John

    1994-01-01

    Natural-language generation (NLG) techniques can be used to automatically produce technical documentation from a domain knowledge base and linguistic and contextual models. We discuss this application of NLG technology from both a technical and a usefulness (costs and benefits) perspective. This discussion is based largely on our experiences with the IDAS documentation-generation project, and the reactions various interested people from industry have had to IDAS. We hope that this summary of ...

  18. Annual review in automatic programming

    CERN Document Server

    Halpern, Mark I; Bolliet, Louis

    2014-01-01

    Computer Science and Technology and their Application is an eight-chapter book that first presents a tutorial on database organization. Subsequent chapters describe the general concepts of Simula 67 programming language; incremental compilation and conversational interpretation; dynamic syntax; the ALGOL 68. Other chapters discuss the general purpose conversational system for graphical programming and automatic theorem proving based on resolution. A survey of extensible programming language is also shown.

  19. The Automatic Galaxy Collision Software

    CERN Document Server

    Smith, Beverly J; Pfeiffer, Phillip; Perkins, Sam; Barkanic, Jason; Fritts, Steve; Southerland, Derek; Manchikalapudi, Dinikar; Baker, Matt; Luckey, John; Franklin, Coral; Moffett, Amanda; Struck, Curtis

    2009-01-01

    The key to understanding the physical processes that occur during galaxy interactions is dynamical modeling, and especially the detailed matching of numerical models to specific systems. To make modeling interacting galaxies more efficient, we have constructed the `Automatic Galaxy Collision' (AGC) code, which requires less human intervention in finding good matches to data. We present some preliminary results from this code for the well-studied system Arp 284 (NGC 7714/5), and address questions of uniqueness of solutions.

  20. Review: Critical Studies on Corporate Responsibility, Governance and Sustainability, Vol. 4 and 6

    Directory of Open Access Journals (Sweden)

    Adrián Zicari

    2014-10-01

    Full Text Available Corporate Social Irresponsibility: A Challenging Concept. Series: Critical Studies on Corporate Responsibility, Governance and Sustainability, Vol. 4. Ralph Tench, William Sun and Brian Jones, 2012, Emerald Group Publishing Limited, 315 pages. Communicating Corporate Social Responsibility Perspectives and Practice. Series: Critical Studies on Corporate Responsibility, Governance and Sustainability, Vol. 6. Ralph Tench, William Sun and Brian Jones, 2014, Emerald Group Publishing Limited, 456 pages

  1. Editor's welcome, PORTAL, Vol. 1, No. 2, July 2004

    Directory of Open Access Journals (Sweden)

    Paul Allatson

    2005-03-01

    Full Text Available Since the highly successful inauguration of PORTAL in January 2004, we have received many kind expressions of support from international studies practitioners in a range of fields, and from such places as Canada, Italy, Mexico, Nigeria, New Zealand, Spain, Trinidad, the U.K., and the U.S.A. Particularly gratifying have been the endorsements of the journal and its publishing aims by people involved in their own electronic publishing enterprises. For their generous responses to PORTAL, the Editorial Committee would like to express its collective appreciation to the following people: Professor Jean-Marie Volet, of the University of Western Australia, and the guiding editor of the ground-breaking e-journal Mots Pluriels (www.arts.uwa.edu.au/MotsPluriels); and Francis Leo Collins, member of the Editorial Committee for the Graduate Journal of Asia Pacific Studies (GJAPS), based in Auckland, New Zealand. PORTAL's readers may be interested in the current call for papers from GJAPS (www.arts.auckland.ac.nz/gjaps), for a special issue on "Imagining the Asia-Pacific" (deadline October 31, 2004). This issue of PORTAL contains essays that cover wide terrain: the Chilean diasporic community in Australia; the world of German intellectuals; contemporary Mexican socio-political movements; rural-urban migration in China; and transnational advocacy networks and election monitoring in the Philippines, Chile, Nicaragua and Mexico. In the cultural section of this issue, we are delighted to present a short story from the noted German Studies scholar Anthony Stephens, and the first half of a beautiful, deeply poetic and haunting novel entitled Son, from the London-based writer and art-critic Jennifer Higgie. The novel’s second and final part will appear in PORTAL vol. 2, no. 1, in January 2005. On a different note, we would like to express our support for the inaugural Ubud Writers and Readers Festival, to be held in Ubud, Bali, from October 11 to 17, 2004. The Festival

  2. AVATAR -- Automatic variance reduction in Monte Carlo calculations

    Energy Technology Data Exchange (ETDEWEB)

    Van Riper, K.A.; Urbatsch, T.J.; Soran, P.D. [and others]

    1997-05-01

    AVATAR™ (Automatic Variance And Time of Analysis Reduction), accessed through the graphical user interface application, Justine™, is a superset of MCNP™ that automatically invokes THREEDANT™ for a three-dimensional deterministic adjoint calculation on a mesh independent of the Monte Carlo geometry, calculates weight windows, and runs MCNP. Computational efficiency increases by a factor of 2 to 5 for a three-detector oil well logging tool model. Human efficiency increases dramatically, since AVATAR eliminates the need for deep intuition and hours of tedious handwork.

  3. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis...

  4. An automatic system for elaboration of chip breaking diagrams

    DEFF Research Database (Denmark)

    Andreasen, Jan Lasson; De Chiffre, Leonardo

    1998-01-01

    A laboratory system for fully automatic elaboration of chip breaking diagrams has been developed and tested. The system is based on automatic chip breaking detection by frequency analysis of cutting forces in connection with programming of a CNC lathe to scan different feeds, speeds and cutting depths. An evaluation of the system based on a total of 1671 experiments has shown that unfavourable snarled chips can be detected with 98% certainty, which indeed makes the system a valuable tool in chip breakability tests. Using the system, chip breaking diagrams can be elaborated with a previously...

  5. Index to Nuclear Safety. A technical progress review by chronology, permuted title, and author. Vol 11, No. 1 through Vol. 16, No. 6

    International Nuclear Information System (INIS)

    This index to Nuclear Safety covers articles in Nuclear Safety Vol. 11, No. 1 (Jan.-Feb. 1970) through Vol. 16, No. 6 (Nov.-Dec. 1975). Included in the index is a chronological list of articles (including abstract) followed by both a KWIC index and an Author Index. Nuclear Safety is a bimonthly technical progress review prepared by the Nuclear Safety Information Center and covers all safety aspects of nuclear power reactors and associated facilities. The index lists over 300 technical articles in the last six years of publication

  6. Index to Nuclear Safety: a technical progress review by chronology, permuted title, and author. Vol. 11(1)--Vol. 18(6)

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, W.B.; Klein, A.

    1978-04-11

    This index to Nuclear Safety covers articles published in Nuclear Safety, Vol. 11, No. 1 (January-February 1970), through Vol. 18, No. 6 (November-December 1977). It is divided into three sections: a chronological list of articles (including abstracts) followed by a permuted-title (KWIC) index and an author index. Nuclear Safety, a bimonthly technical progress review prepared by the Nuclear Safety Information Center (NSIC), covers all safety aspects of nuclear power reactors and associated facilities. Over 450 technical articles published in Nuclear Safety in the last eight years are listed in this index.

  7. Index to Nuclear Safety. A technical progress review by chronology, permuted title, and author. Vol 11, No. 1 through Vol. 16, No. 6

    Energy Technology Data Exchange (ETDEWEB)

    Cottrell, W.B.; Klein, A.

    1976-04-01

    This index to Nuclear Safety covers articles in Nuclear Safety Vol. 11, No. 1 (Jan.-Feb. 1970) through Vol. 16, No. 6 (Nov.-Dec. 1975). Included in the index is a chronological list of articles (including abstract) followed by both a KWIC index and an Author Index. Nuclear Safety is a bimonthly technical progress review prepared by the Nuclear Safety Information Center and covers all safety aspects of nuclear power reactors and associated facilities. The index lists over 300 technical articles in the last six years of publication.

  8. Dynamic Control Accuracy Analysis of Automatic Digital Pressure Controller

    Institute of Scientific and Technical Information of China (English)

    李海兵; 麻锐; 卓华; 陈武卿

    2014-01-01

    An important performance indicator of an automatic standard pressure generator is its accuracy. An additional control accuracy must be taken into account when the generator operates in dynamic control mode and the front-panel display is used. A method for calculating this control accuracy is presented for PACE6000-series pressure generators, based on comparison experiments using a piston gauge as the standard and a high-accuracy digital pressure gauge as the reference. Finally, with a 0.01-class digital pressure generator as the standard, an uncertainty analysis of the indication error of a 0.05-class digital pressure gauge is given; the maximum permissible error of the built-in pressure transducer must be ±0.008%.

  9. Automatic Feature Extraction from Planetary Images

    Science.gov (United States)

    Troglio, Giulia; Le Moigne, Jacqueline; Benediktsson, Jon A.; Moser, Gabriele; Serpico, Sebastiano B.

    2010-01-01

    With the launch of several planetary missions in the last decade, a large number of planetary images has already been acquired and many more will be available for analysis in the coming years. The image data need to be analyzed, preferably by automatic processing techniques, because of the huge amount of data. Although many automatic feature extraction methods have been proposed and utilized for Earth remote sensing images, these methods are not always applicable to planetary data that often present low contrast and uneven illumination characteristics. Different methods have already been presented for crater extraction from planetary images, but the detection of other types of planetary features has not been addressed yet. Here, we propose a new unsupervised method for the extraction of different features from the surface of the analyzed planet, based on the combination of several image processing techniques, including a watershed segmentation and the generalized Hough Transform. The method has many applications, among which is image registration, and it can be applied to arbitrary planetary images.

  10. Automatic system for detecting pornographic images

    Science.gov (United States)

    Ho, Kevin I. C.; Chen, Tung-Shou; Ho, Jun-Der

    2002-09-01

    Due to the dramatic growth of network and multimedia technology, people can more easily obtain a variety of information over the Internet. Unfortunately, this also makes the diffusion of illegal and harmful content much easier. It has therefore become an important topic for the Internet community to protect and safeguard Internet users, especially children, from such content encountered while surfing the Net. Among this content, pornographic images cause the most serious harm. In this study, we propose an automatic system to detect still color pornographic images; starting from this result, we plan to develop an automatic system to search for or filter pornographic images. Almost all pornographic images share one common characteristic: the ratio of the size of the skin region to the non-skin region is high. Based on this characteristic, our system first converts the color space from RGB to HSV so as to segment all possible skin-color regions from the scene background. We also apply texture analysis to the selected skin-color regions to separate skin regions from non-skin regions, and then group the adjacent pixels located in skin regions. If the ratio is over a given threshold, the given image is flagged as possibly pornographic. In our experiments, less than 10% of non-pornographic images were classified as pornography, and over 80% of the most harmful pornographic images were classified correctly.
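    The skin-ratio heuristic described above can be sketched in a few lines. The HSV bounds and the decision threshold below are illustrative assumptions, not the values used in the study:

```python
import colorsys

# Hypothetical HSV skin-tone bounds -- the study does not publish its exact
# thresholds, so these ranges are illustrative only.
H_MAX = 50 / 360.0       # hue below ~50 degrees (colorsys uses 0..1 hue)
S_MIN, S_MAX = 0.23, 0.68

def skin_ratio(pixels):
    """Fraction of pixels whose HSV values fall in the assumed skin range.

    `pixels` is an iterable of (r, g, b) tuples with components in 0..255.
    """
    skin = total = 0
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        if h <= H_MAX and S_MIN <= s <= S_MAX:
            skin += 1
        total += 1
    return skin / total if total else 0.0

def is_suspect(pixels, threshold=0.4):
    # Flag the image when the skin-to-total ratio exceeds a chosen threshold.
    return skin_ratio(pixels) > threshold
```

    In a real system, the texture-analysis and region-grouping stages described above would refine this raw ratio before thresholding.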

  11. Automatic transcription of Turkish microtonal music.

    Science.gov (United States)

    Benetos, Emmanouil; Holzapfel, André

    2015-10-01

    Automatic music transcription, a central topic in music signal analysis, is typically limited to equal-tempered music and evaluated on a quartertone tolerance level. A system is proposed to automatically transcribe microtonal and heterophonic music as applied to the makam music of Turkey. Specific traits of this music that deviate from properties targeted by current transcription tools are discussed, and a collection of instrumental and vocal recordings is compiled, along with aligned microtonal reference pitch annotations. An existing multi-pitch detection algorithm is adapted for transcribing music with 20 cent resolution, and a method for converting a multi-pitch heterophonic output into a single melodic line is proposed. Evaluation metrics for transcribing microtonal music are applied, which use various levels of tolerance for inaccuracies with respect to frequency and time. Results show that the system is able to transcribe microtonal instrumental music at 20 cent resolution with an F-measure of 56.7%, outperforming state-of-the-art methods for the same task. Case studies on transcribed recordings are provided, to demonstrate the shortcomings and the strengths of the proposed method. PMID:26520294
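    The 20 cent resolution quoted above refers to subdividing the semitone (100 cents); a frequency maps to cents via 1200·log2(f/f_ref). A minimal sketch, where the A4 = 440 Hz reference is an assumption:

```python
import math

def hz_to_cents(f, f_ref=440.0):
    """Distance of frequency f from f_ref, in cents (100 cents = 1 semitone)."""
    return 1200.0 * math.log2(f / f_ref)

def quantize_cents(cents, resolution=20):
    """Snap a pitch to the nearest bin of the given resolution in cents."""
    return resolution * round(cents / resolution)
```

    A 20 cent grid therefore places five bins inside each semitone, which is why it can represent microtonal makam intervals that an equal-tempered (100 cent) grid cannot.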

  12. Automatic Segmentation of Dermoscopic Images by Iterative Classification

    Directory of Open Access Journals (Sweden)

    Maciel Zortea

    2011-01-01

    Full Text Available Accurate detection of the borders of skin lesions is a vital first step for computer-aided diagnostic systems. This paper presents a novel automatic approach to segmentation of skin lesions that is particularly suitable for analysis of dermoscopic images. Assumptions about the image acquisition, in particular the approximate location and color, are used to derive an automatic rule to select small seed regions likely to correspond to samples of skin and the lesion of interest. The seed regions are used as initial training samples, and the lesion segmentation problem is treated as a binary classification problem. An iterative hybrid classification strategy, based on a weighted combination of the estimated posteriors of a linear and a quadratic classifier, is used to update both the automatically selected training samples and the segmentation, increasing reliability and final accuracy, especially for those challenging images where the contrast between the background skin and the lesion is low.
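    The weighted combination of posteriors at the heart of the iterative strategy above can be sketched as follows; the fixed weight and the posterior lists standing in for the two classifiers' outputs are illustrative assumptions:

```python
def combine_posteriors(p_linear, p_quadratic, w=0.5):
    """Per-pixel weighted mixture of two classifiers' lesion posteriors.

    w is the weight on the linear classifier (an assumed fixed value; the
    paper's scheme may adapt it between iterations).
    """
    return [w * pl + (1.0 - w) * pq for pl, pq in zip(p_linear, p_quadratic)]

def segment(posteriors, threshold=0.5):
    """Binary lesion mask: a pixel is lesion when its combined posterior
    exceeds the threshold."""
    return [p > threshold for p in posteriors]
```

    Iterating, the resulting mask would supply new training samples for the next round of classifier fitting.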

  13. Automatic integration of confidence in the brain valuation signal.

    Science.gov (United States)

    Lebreton, Maël; Abitbol, Raphaëlle; Daunizeau, Jean; Pessiglione, Mathias

    2015-08-01

    A key process in decision-making is estimating the value of possible outcomes. Growing evidence suggests that different types of values are automatically encoded in the ventromedial prefrontal cortex (VMPFC). Here we extend this idea by suggesting that any overt judgment is accompanied by a second-order valuation (a confidence estimate), which is also automatically incorporated in VMPFC activity. In accordance with the predictions of our normative model of rating tasks, two behavioral experiments showed that confidence levels were quadratically related to first-order judgments (age, value or probability ratings). The analysis of three functional magnetic resonance imaging data sets using similar rating tasks confirmed that the quadratic extension of first-order ratings (our proxy for confidence) was encoded in VMPFC activity, even if no confidence judgment was required of the participants. Such an automatic aggregation of value and confidence in the same brain region might provide insight into many distortions of judgment and choice. PMID:26192748
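    The "quadratic extension" of a first-order rating can be illustrated as a confidence proxy that is minimal at the midpoint of the rating scale and maximal at its extremes; the scale bounds below are assumptions:

```python
def confidence_proxy(rating, lo=0.0, hi=100.0):
    """Quadratic extension of a first-order rating: squared normalized
    distance from the scale midpoint, in [0, 1]. Extreme ratings yield
    maximal confidence; mid-scale ratings yield minimal confidence."""
    mid = (lo + hi) / 2.0
    half = (hi - lo) / 2.0
    return ((rating - mid) / half) ** 2
```

    Such a regressor can be added to a fMRI design matrix alongside the first-order rating itself, which is how a quadratic confidence signal can be detected without asking participants for explicit confidence judgments.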

  14. Automatic design of decision-tree algorithms with evolutionary algorithms.

    Science.gov (United States)

    Barros, Rodrigo C; Basgalupp, Márcio P; de Carvalho, André C P L F; Freitas, Alex A

    2013-01-01

    This study reports the empirical analysis of a hyper-heuristic evolutionary algorithm that is capable of automatically designing top-down decision-tree induction algorithms. Top-down decision-tree algorithms are of great importance, considering their ability to provide an intuitive and accurate knowledge representation for classification problems. The automatic design of these algorithms seems timely, given the large literature accumulated over more than 40 years of research in the manual design of decision-tree induction algorithms. The proposed hyper-heuristic evolutionary algorithm, HEAD-DT, is extensively tested using 20 public UCI datasets and 10 microarray gene expression datasets. The algorithms automatically designed by HEAD-DT are compared with traditional decision-tree induction algorithms, such as C4.5 and CART. Experimental results show that HEAD-DT is capable of generating algorithms which are significantly more accurate than C4.5 and CART.

  15. Radiometric Normalization of Temporal Images Combining Automatic Detection of Pseudo-Invariant Features from the Distance and Similarity Spectral Measures, Density Scatterplot Analysis, and Robust Regression

    Directory of Open Access Journals (Sweden)

    Ana Paula Ferreira de Carvalho

    2013-05-01

    Full Text Available Radiometric precision is difficult to maintain in orbital images due to several factors (atmospheric conditions, Earth-sun distance, detector calibration, illumination, and viewing angles). These unwanted effects must be removed for radiometric consistency among temporal images, leaving only land-leaving radiances, for optimum change detection. A variety of relative radiometric correction techniques were developed for the correction or rectification of images of the same area through the use of reference targets whose reflectance does not change significantly with time, i.e., pseudo-invariant features (PIFs). This paper proposes a new technique for radiometric normalization, which uses three sequential methods for an accurate PIF selection: spectral measures of temporal data (spectral distance and similarity), density scatter plot analysis (ridge method), and robust regression. The spectral measures used are the spectral angle (Spectral Angle Mapper, SAM), spectral correlation (Spectral Correlation Mapper, SCM), and Euclidean distance. The spectral measures between the spectra at times t1 and t2 are calculated for each pixel. After classification using threshold values, it is possible to define points with the same spectral behavior, including PIFs. The distance and similarity measures are complementary and can be calculated together. The ridge method uses a density plot generated from images acquired on different dates for the selection of PIFs. In a density plot, the invariant pixels together form a high-density ridge, while variant pixels (clouds and land cover changes) are spread out, having low density, facilitating their exclusion. Finally, the selected PIFs are subjected to a robust regression (M-estimate) between pairs of temporal bands for the detection and elimination of outliers, and to obtain the optimal linear equation for a given set of target points. The robust regression is insensitive to outliers, i.e., observations that appear to deviate
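    The per-pixel spectral measures named above can be sketched directly from their definitions: SAM compares spectral shape (insensitive to uniform scaling), while the Euclidean distance captures magnitude change.

```python
import math

def spectral_angle(a, b):
    """Spectral Angle Mapper (SAM): angle in radians between two pixel
    spectra treated as vectors; 0 means identical spectral shape."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    # Clamp to [-1, 1] to guard against floating-point drift before acos.
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

def euclidean(a, b):
    """Euclidean distance between the two spectra (magnitude change)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
```

    Thresholding these measures, computed per pixel between the t1 and t2 images, yields the candidate set from which PIFs are selected.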

  16. Automatic landslide length and width estimation based on the geometric processing of the bounding box and the geomorphometric analysis of DEMs

    Science.gov (United States)

    Niculiţǎ, Mihai

    2016-08-01

    The morphology of landslides is influenced by the slide/flow of the material downslope. Usually, the distance of the movement of the material is greater than the width of the displaced material (especially for flows, but also for the majority of slides); the resulting landslides have a greater length than width. In some specific geomorphologic environments (monoclinal regions with cuesta-type landforms), or as is the case for some types of landslides (translational slides, bank failures, complex landslides), the distance of the movement of the displaced material can be smaller than its width; thus the landslides have a smaller length than width. When working with landslide inventories containing both types of landslides presented above, the analysis of the length and width of the landslides computed using usual geographic information system techniques (like bounding boxes) can be flawed. To overcome this flaw, I present an algorithm which uses both the geometry of the landslide polygon's minimum oriented bounding box and a digital elevation model of the landslide topography to identify long vs. wide landslides. I tested the proposed algorithm on a landslide inventory which covers 131.1 km2 of the Moldavian Plateau, eastern Romania. This inventory contains 1327 landslides, of which 518 were manually classified as long and 809 as wide. In a first step, the difference in elevation along the length and width of the minimum oriented bounding box is used to separate long landslides from wide landslides (long landslides having the greatest elevation difference along the length of the bounding box). In a second step, the long landslides are checked as to whether their length is greater than the length of flow downslope (estimated with a flow-routing algorithm), in which case the landslide is classified as wide. By using this approach, the area under the Receiver Operating Characteristic curve value for the classification of the long vs. wide
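    The first step of the algorithm above reduces to comparing elevation drops along the two axes of the minimum oriented bounding box. How the axis-endpoint elevations are sampled from the DEM is an assumption in this sketch, and the second (flow-routing) step is omitted:

```python
def classify_landslide(elev_len_start, elev_len_end,
                       elev_wid_start, elev_wid_end):
    """Step 1 of the classification: a landslide is provisionally 'long'
    when the elevation difference along the bounding-box length axis
    exceeds the difference along its width axis, else 'wide'. (Step 2,
    the flow-routing check that can reclassify long as wide, is omitted.)
    """
    drop_along_length = abs(elev_len_start - elev_len_end)
    drop_along_width = abs(elev_wid_start - elev_wid_end)
    return "long" if drop_along_length > drop_along_width else "wide"
```

    The key design choice is using elevation difference rather than planimetric extent, since a landslide's movement direction follows the slope.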

  17. Marine environment news Vol. 4, no. 1, June 2006

    International Nuclear Information System (INIS)

    The last six months have been a frenetically busy time for us in Monaco. Our Marine Programmes have been positively reviewed by the Standing Advisory Group on Nuclear Applications (SAGNA) and by an External Evaluation of our Programme. Both Groups report to the Director General, Mr Mohammed ElBaradei, and we hope that new investment in personnel and equipment may eventually result from their evaluations and feedback. We were honoured by the visit of His Serene Highness Prince Albert II of Monaco in March 2006 to our facilities. HSH continues to take a personal interest in MEL's isotopic and pollutant analyses of biota and environmental samples from the Arctic environment which we sampled during His Highness' cruise in June 2005 (see Vol. 3. No 2. MEL Newsletter). This issue also shows that MEL has hosted several important workshops and meetings. The US Research Vessel Endeavour visited the port of Monaco in April and MEL hosted an informal reception for the crew. The visit was in connection with ongoing, joint MEL-US studies in ocean carbon sinks in the Mediterranean (the MEDFLUX programme). More recently, MEL has been involved in discussion with Gulf Member States for a Marine Radioactivity Baseline Study. Finally, I am pleased to note that our MEL Newsletter is clearly having a positive outreach with Member States, since we are currently witnessing a doubling in Member States requests through the TC Concept Proposals (2007-2008) for fellowships, courses and capacity building in marine environment

  18. Failure of classical traffic flow theories: Stochastic highway capacity and automatic driving

    Science.gov (United States)

    Kerner, Boris S.

    2016-05-01

    In a mini-review (Kerner, 2013) it has been shown that classical traffic flow theories and models failed to explain empirical traffic breakdown - a phase transition from metastable free flow to synchronized flow at highway bottlenecks. The main objective of this mini-review is to study the consequences of this failure of classical traffic-flow theories for an analysis of empirical stochastic highway capacity as well as for the effect of automatic driving vehicles and cooperative driving on traffic flow. To reach this goal, we show a deep connection between the understanding of empirical stochastic highway capacity and a reliable analysis of automatic driving vehicles in traffic flow. With the use of simulations in the framework of three-phase traffic theory, a probabilistic analysis of the effect of automatic driving vehicles on a mixed traffic flow consisting of a random distribution of automatic driving and manual driving vehicles has been made. We have found that the parameters of automatic driving vehicles can either decrease or increase the probability of the breakdown. The increase in the probability of traffic breakdown, i.e., the deterioration of the performance of the traffic system, can occur already at a small percentage (about 5%) of automatic driving vehicles. The increase in the probability of traffic breakdown through automatic driving vehicles can be realized even if every platoon of automatic driving vehicles satisfies the condition for string stability.

  19. Determination of lead in water samples by an automatic on-line analysis monitor

    Institute of Scientific and Technical Information of China (English)

    洪陵成; 朱金伟; 张红艳; 刘超; 马小茹

    2013-01-01

    First, a pre-treatment device based on polystyrene-dithizone nanofibers for lead enrichment was constructed, which effectively reduces the detection limit and improves sensitivity and selectivity. Next, with a glassy carbon electrode pre-coated with a mercury film as the working electrode, an automatic on-line analysis monitor for lead was developed using anodic stripping voltammetry, and the experimental conditions were optimized, i.e., the conditions for pre-coating the mercury film, the kind and concentration of the buffer solution, and the enrichment time and voltage for lead. Under the optimized conditions, the stripping peak area of lead showed good linearity with concentration over the range 0-2000 μg/L. The linear regression equations were y1 = -0.07843 + 0.00269x (R = 0.998, over the range 5-2000 μg/L) and y2 = -0.0035 + 0.00178x (R = 0.998, over the range 5-100 μg/L), with a limit of detection (LOD) of 0.38 μg/L (at a signal-to-noise ratio of 3). The lead content of water samples was then determined by the standard addition method and compared with results from atomic absorption spectrometry; the comparison shows that the automatic on-line monitor is fast, accurate, simple, and sensitive.
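    The calibration step above amounts to an ordinary least-squares fit of stripping peak area against concentration, with the detection limit taken at a signal-to-noise ratio of 3. The blank standard deviation used below is a hypothetical value, not one reported by the study:

```python
def linear_fit(x, y):
    """Ordinary least-squares calibration fit y = a + b * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
         / sum((xi - mx) ** 2 for xi in x))
    a = my - b * mx
    return a, b

def detection_limit(sd_blank, slope):
    """LOD at S/N = 3: the concentration whose signal equals three
    standard deviations of the blank."""
    return 3.0 * sd_blank / slope
```

    With the published slope of 0.00269 area units per μg/L, a blank standard deviation near 3.4e-4 would reproduce the reported 0.38 μg/L detection limit.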

  20. Unification of automatic target tracking and automatic target recognition

    Science.gov (United States)

    Schachter, Bruce J.

    2014-06-01

    The subject being addressed is how an automatic target tracker (ATT) and an automatic target recognizer (ATR) can be fused together so tightly and so well that their distinctiveness becomes lost in the merger. This has historically not been the case outside of biology and a few academic papers. The biological model of ATT∪ATR arises from dynamic patterns of activity distributed across many neural circuits and structures (including the retina). The information that the brain receives from the eyes is "old news" at the time that it receives it. The eyes and brain forecast a tracked object's future position, rather than relying on received retinal position. Anticipation of the next moment - building up a consistent perception - is accomplished under difficult conditions: motion (eyes, head, body, scene background, target) and processing limitations (neural noise, delays, eye jitter, distractions). Not only does the human vision system surmount these problems, but it has innate mechanisms to exploit motion in support of target detection and classification. Biological vision doesn't normally operate on snapshots. Feature extraction, detection and recognition are spatiotemporal. When vision is viewed as a spatiotemporal process, target detection, recognition, tracking, event detection and activity recognition do not seem as distinct as they are in current ATT and ATR designs. They appear as similar mechanisms taking place at varying time scales. A framework is provided for unifying ATT and ATR.