WorldWideScience

Sample records for automated analysis method

  1. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created to document and communicate the structure and state of the business and information technology elements of an enterprise. Once completed, the models are mainly used to support analysis. Model analysis is an activity that typically relies on human skills, and because of the size and complexity of the models the process can be difficult, with omissions or miscalculations very likely. This situation has fostered research on automated analysis methods to support analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; however, they are based on specific situations and different metamodels, so some analysis methods might not be applicable to all enterprise models. This paper presents the compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool. PMID:27047732

  2. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a method for automatically measuring the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Comparison with expert manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.

  3. A Method of Automated Nonparametric Content Analysis for Social Science

    OpenAIRE

    Hopkins, Daniel J.; King, Gary

    2010-01-01

    The increasing availability of digitized text presents enormous opportunities for social scientists. Yet hand coding many blogs, speeches, government records, newspapers, or other sources of unstructured text is infeasible. Although computer scientists have methods for automated content analysis, most are optimized to classify individual documents, whereas social scientists instead want generalizations about the population of documents, such as the proportion in a given category. Unfortunatel...

  4. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results, and 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; 2. ASTM F2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure the validity of the equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  5. A novel automated image analysis method for accurate adipocyte quantification

    OpenAIRE

    Osman, Osman S.; Selway, Joanne L; Kępczyńska, Małgorzata A; Stocker, Claire J.; O’Dowd, Jacqueline F; Cawthorne, Michael A.; Arch, Jonathan RS; Jassim, Sabah; Langlands, Kenneth

    2013-01-01

    Increased adipocyte size and number are associated with many of the adverse effects observed in metabolic disease states. While methods to quantify such changes in the adipocyte are of scientific and clinical interest, manual methods to determine adipocyte size are both laborious and intractable to large scale investigations. Moreover, existing computational methods are not fully automated. We, therefore, developed a novel automatic method to provide accurate measurements of the cross-section...

  6. Methods of automated cell analysis and their application in radiation biology

    International Nuclear Information System (INIS)

    The present review is concerned with methods for the automated analysis of biological microobjects and covers the two groups into which all systems of automated analysis can be divided: systems of the flow type (flow cytometry) and of the scanning type (image analysis systems). Particular emphasis is placed on their use in radiobiological studies, namely in the micronucleus test, a cytogenetic assay commonly used at present for monitoring the clastogenic action of ionizing radiation. Examples of the use of the described methods and actual setups in other biomedical research are given. An analysis of the advantages and disadvantages of the methods of automated cell analysis makes it possible to choose more confidently between flow-type and scanning-type systems for a particular research application.

  7. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q), or production rate, is one of the important indicator criteria for industrial engineers to improve the system and finished-goods output of a production or assembly line. Mathematical and statistical analysis methods need to be applied to the productivity rate to give a visual overview of the failure factors and of possible further improvement within the production line, especially for an automated flow line, since it is complicated. A mathematical model of productivity rate for a linear arrangement serial structure automated flow line with different failure rates and bottleneck machining time parameters is the basic model for this productivity analysis. This paper presents the engineering mathematical analysis method as applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on the particular mathematical model. The various failure rates that cause loss of productivity and the bottleneck machining time are shown explicitly in mathematical form, and a sustainable solution for productivity improvement of this final assembly automated flow line is presented.

  8. Tool for automated method design in activation analysis

    International Nuclear Information System (INIS)

    A computational approach to the optimization of the adjustable parameters of nuclear activation analysis has been developed for use in comprehensive method design calculations. An estimate of sample composition is used to predict the gamma-ray spectra to be expected for given sets of values of experimental parameters. These spectra are used to evaluate responses such as detection limits and measurement precision for application to optimization by the simplex method. This technique has been successfully implemented for the simultaneous determination of sample size and irradiation, decay and counting times by the optimization of either detection limit or precision. Both single-element and multielement determinations can be designed with the aid of these calculations. The combination of advance prediction and simplex optimization is both flexible and efficient and produces numerical results suitable for use in further computations

  9. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of Polychlorinated Biphenyls (PCB) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four channel Soxhlet extractor, a High Volume Concentrator, column clean up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed.

  10. Comparison of semi-automated image analysis and manual methods for tissue quantification in pancreatic carcinoma

    Energy Technology Data Exchange (ETDEWEB)

    Sims, A.J. [Regional Medical Physics Department, Freeman Hospital, Newcastle upon Tyne (United Kingdom)]. E-mail: a.j.sims@newcastle.ac.uk; Murray, A. [Regional Medical Physics Department, Freeman Hospital, Newcastle upon Tyne (United Kingdom); Bennett, M.K. [Department of Histopathology, Newcastle upon Tyne Hospitals NHS Trust, Newcastle upon Tyne (United Kingdom)

    2002-04-21

    Objective measurements of tissue area during histological examination of carcinoma can yield valuable prognostic information. However, such measurements are not made routinely because the current manual approach is time-consuming and subject to large statistical sampling error. In this paper, a semi-automated image analysis method for measuring tissue area in histological samples is applied to the measurement of stromal tissue, cell cytoplasm and lumen in samples of pancreatic carcinoma and compared with the standard manual point counting method. Histological samples from 26 cases of pancreatic carcinoma were stained using the sirius red, light-green method. Images from each sample were captured using two magnifications. Image segmentation based on colour cluster analysis was used to subdivide each image into representative colours which were classified manually into one of three tissue components. Area measurements made using this technique were compared to corresponding manual measurements and used to establish the comparative accuracy of the semi-automated image analysis technique, with a quality assurance study to measure the repeatability of the new technique. For both magnifications and for each tissue component, the quality assurance study showed that the semi-automated image analysis algorithm had better repeatability than its manual equivalent. No significant bias was detected between the measurement techniques for any of the comparisons made using the 26 cases of pancreatic carcinoma. The ratio of manual to semi-automatic repeatability errors varied from 2.0 to 3.6. Point counting would need to be increased to between 400 and 1400 points to achieve the same repeatability as the semi-automated technique. The results demonstrate that semi-automated image analysis is suitable for measuring tissue fractions in histological samples prepared with coloured stains and is a practical alternative to manual point counting. (author)

  11. Comparison of semi-automated image analysis and manual methods for tissue quantification in pancreatic carcinoma

    International Nuclear Information System (INIS)

    Objective measurements of tissue area during histological examination of carcinoma can yield valuable prognostic information. However, such measurements are not made routinely because the current manual approach is time-consuming and subject to large statistical sampling error. In this paper, a semi-automated image analysis method for measuring tissue area in histological samples is applied to the measurement of stromal tissue, cell cytoplasm and lumen in samples of pancreatic carcinoma and compared with the standard manual point counting method. Histological samples from 26 cases of pancreatic carcinoma were stained using the sirius red, light-green method. Images from each sample were captured using two magnifications. Image segmentation based on colour cluster analysis was used to subdivide each image into representative colours which were classified manually into one of three tissue components. Area measurements made using this technique were compared to corresponding manual measurements and used to establish the comparative accuracy of the semi-automated image analysis technique, with a quality assurance study to measure the repeatability of the new technique. For both magnifications and for each tissue component, the quality assurance study showed that the semi-automated image analysis algorithm had better repeatability than its manual equivalent. No significant bias was detected between the measurement techniques for any of the comparisons made using the 26 cases of pancreatic carcinoma. The ratio of manual to semi-automatic repeatability errors varied from 2.0 to 3.6. Point counting would need to be increased to between 400 and 1400 points to achieve the same repeatability as the semi-automated technique. The results demonstrate that semi-automated image analysis is suitable for measuring tissue fractions in histological samples prepared with coloured stains and is a practical alternative to manual point counting. (author)

  12. Carotid artery stenosis: reproducibility of automated 3D CT angiography analysis method

    International Nuclear Information System (INIS)

    The aim of this study was to assess the reproducibility and anatomical accuracy of automated 3D CT angiography analysis software in the evaluation of carotid artery stenosis with reference to rotational DSA (rDSA). Seventy-two vessels in 36 patients with symptomatic carotid stenosis were evaluated by 3D CT angiography and conventional DSA (cDSA). Thirty-one patients also underwent rotational 3D DSA (rDSA). Multislice CT was performed with bolus tracking and slice thickness of 1.5 mm (1-mm collimation, table feed 5 mm/s) and reconstruction interval of 1.0 mm. Two observers independently performed the stenosis measurements on 3D CTA and on MPR rDSA according to the NASCET criteria. The first measurements on CTA utilized an analysis program with automatic stenosis recognition and quantitation. In the subsequent measurements, manual corrections were applied when necessary. Interfering factors for stenosis quantitation, such as calcifications, ulcerations, and adjacent vessels, were registered. Intraobserver and interobserver correlation for CTA were 0.89 and 0.90, respectively (p<0.001). The interobserver correlation between two observers for MPR rDSA was 0.90 (p<0.001). The intertechnique correlation between CTA and rDSA was 0.69 (p<0.001) using automated measurements but increased to 0.81 (p<0.001) with the manually corrected measurements. Automated stenosis recognition achieved a markedly poorer correlation with MPR rDSA in carotids with interfering factors than in cases where there were no such factors. Automated 3D CT angiography analysis methods are highly reproducible. Manually corrected measurements facilitated avoidance of the interfering factors, such as ulcerations, calcifications, and adjacent vessels, and thus increased the anatomical accuracy of arterial delineation by automated CT angiography with reference to MPR rDSA. (orig.)
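
    The stenosis measurements above follow the NASCET convention, which grades a stenosis against the diameter of the normal distal internal carotid artery. A minimal sketch of that calculation (not part of the evaluated software; the function name and the example values are illustrative):

    ```python
    def nascet_percent_stenosis(residual_lumen_mm: float, distal_ica_mm: float) -> float:
        """Percent stenosis by the NASCET convention.

        residual_lumen_mm: narrowest lumen diameter at the stenosis.
        distal_ica_mm: diameter of the normal internal carotid artery
                       well beyond the stenosis (the NASCET denominator).
        """
        return 100.0 * (1.0 - residual_lumen_mm / distal_ica_mm)

    # Example: a 1.8 mm residual lumen with a 6.0 mm distal ICA -> 70% stenosis.
    print(nascet_percent_stenosis(1.8, 6.0))  # 70.0
    ```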

  13. A new web-based method for automated analysis of muscle histology

    Directory of Open Access Journals (Sweden)

    Pertl Cordula

    2013-01-01

    Full Text Available Abstract Background Duchenne Muscular Dystrophy is an inherited degenerative neuromuscular disease characterised by rapidly progressive muscle weakness. Currently, curative treatment is not available. Approaches for new treatments that improve muscle strength and quality of life depend on preclinical testing in animal models. The mdx mouse model is the most frequently used animal model for preclinical studies in muscular dystrophy research. Standardised pathology-relevant parameters of dystrophic muscle in mdx mice for histological analysis have been developed in international, collaborative efforts, but automation has not been accessible to most research groups. A standardised and mainly automated quantitative assessment of histopathological parameters in the mdx mouse model is desirable to allow an objective comparison between laboratories. Methods Immunological and histochemical reactions were used to obtain a double staining for fast and slow myosin. Additionally, fluorescence staining of the myofibre membranes allows the minimal Feret’s diameter to be defined. Staining of myonuclei with the fluorescence dye bisbenzimide H was utilised to identify nuclei located internally within myofibres. Relevant structures were extracted from the image as single objects and assigned to different object classes using web-based image analysis (MyoScan). Quantitative and morphometric data were analysed, e.g. the number of nuclei per fibre and the minimal Feret’s diameter, in 6-month-old wild-type C57BL/10 mice and mdx mice. Results In the current version of the module “MyoScan”, essential parameters for the histological analysis of muscle sections were implemented, including the minimal Feret’s diameter of the myofibres and the automated calculation of the percentage of internally nucleated myofibres. Morphometric data obtained in the present study were in good agreement with previously reported data in the literature and with data obtained from manual
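
    The minimal Feret’s diameter used above is the smallest caliper width of a fibre cross-section. A minimal sketch of one way to compute it from the boundary points of an already-segmented myofibre (not the MyoScan implementation; names and the angular sampling are illustrative):

    ```python
    import numpy as np

    def min_feret_diameter(boundary_xy: np.ndarray, n_angles: int = 180) -> float:
        """Minimal Feret (caliper) diameter of a 2D object.

        boundary_xy: (N, 2) array of boundary point coordinates (e.g. from a
        segmented myofibre mask). The object is projected onto a set of
        directions; the minimal Feret diameter is the smallest projected width.
        """
        angles = np.linspace(0.0, np.pi, n_angles, endpoint=False)
        directions = np.stack([np.cos(angles), np.sin(angles)], axis=1)  # (A, 2)
        projections = boundary_xy @ directions.T                         # (N, A)
        widths = projections.max(axis=0) - projections.min(axis=0)       # (A,)
        return float(widths.min())

    # Toy example: a 10 x 4 rectangle has a minimal Feret diameter of ~4.
    rect = np.array([[x, y] for x in (0, 10) for y in (0, 4)], dtype=float)
    print(min_feret_diameter(rect))
    ```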

  14. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    Directory of Open Access Journals (Sweden)

    Demir Sumeyra U

    2012-12-01

    Full Text Available Abstract Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tissue perfusion quantitatively might be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple level thresholding and pixel verification techniques. Threshold levels are selected using histogram information from a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results Sublingual microcirculatory videos are recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings are analyzed visually, and the functional capillary density (FCD) values calculated by the algorithm are compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD

  15. Automated Nanofiber Diameter Measurement in SEM Images Using a Robust Image Analysis Method

    Directory of Open Access Journals (Sweden)

    Ertan Öznergiz

    2014-01-01

    Full Text Available Due to their high surface area, porosity, and rigidity, applications of nanofibers and nanosurfaces have developed in recent years. Nanofibers and nanosurfaces are typically produced by the electrospinning method. In the production process, determination of the average fiber diameter is crucial for quality assessment. The average fiber diameter is determined by manually measuring the diameters of randomly selected fibers on scanning electron microscopy (SEM) images. However, as the number of images increases, manual fiber diameter determination becomes a tedious and time-consuming task as well as being sensitive to human errors. Therefore, an automated fiber diameter measurement system is desired. In the literature, this task is achieved by using image analysis algorithms. Typically, these methods first isolate each fiber in the image and measure the diameter of each isolated fiber. Fiber isolation is an error-prone process. In this study, automated calculation of nanofiber diameter is achieved without fiber isolation using image processing and analysis algorithms. Performance of the proposed method was tested on real data. The effectiveness of the proposed method is shown by comparing automatically and manually measured nanofiber diameter values.
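
    The abstract does not spell out the diameter algorithm, so the following is only a common recipe for estimating per-pixel fiber diameters without isolating individual fibers: binarize, skeletonize, and read twice the Euclidean distance to the background along the skeleton. A hedged sketch using scikit-image and SciPy, not the authors' method:

    ```python
    import numpy as np
    from scipy.ndimage import distance_transform_edt
    from skimage.filters import threshold_otsu
    from skimage.morphology import skeletonize

    def fiber_diameters(sem_image: np.ndarray, nm_per_pixel: float) -> np.ndarray:
        """Per-pixel fiber diameter estimates from a grayscale SEM image.

        Binarize with Otsu's threshold, skeletonize the fiber network, and read
        off twice the Euclidean distance to the background along the skeleton.
        No individual fiber isolation is required.
        """
        binary = sem_image > threshold_otsu(sem_image)   # fibers as foreground
        dist = distance_transform_edt(binary)            # distance to background
        skeleton = skeletonize(binary)                    # 1-pixel-wide centrelines
        return 2.0 * dist[skeleton] * nm_per_pixel        # local diameters (nm)

    # mean_diameter_nm = fiber_diameters(img, nm_per_pixel=4.5).mean()
    ```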

  16. Automated Analysis of Human Sperm Number and Concentration (Oligospermia) Using Otsu Threshold Method and Labelling

    Science.gov (United States)

    Susrama, I. G.; Purnama, K. E.; Purnomo, M. H.

    2016-01-01

    Oligospermia is a male fertility issue defined as a low sperm concentration in the ejaculate. Normally the sperm concentration is 20-120 million/ml, while oligospermia patients have a sperm concentration of less than 20 million/ml. A sperm test is done in the fertility laboratory to determine oligospermia by examining fresh sperm according to the 2010 WHO standards [9]. The sperm are viewed under a microscope using a Neubauer improved counting chamber and the number of sperm is counted manually. To automate the counting, this research built an automation system to analyse and count the sperm concentration, called Automated Analysis of Sperm Concentration Counters (A2SC2), using an Otsu threshold segmentation process and morphology. The sperm data used were fresh sperm samples from 10 people, analysed directly in the laboratory. The test results using the A2SC2 method gave an accuracy of 91%. Thus, in this study, A2SC2 can be used to calculate the number and concentration of sperm automatically.
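
    A minimal sketch of the counting idea described above (Otsu threshold, morphological cleanup, connected-component labelling); the thresholding direction, the debris-size cut-off and the conversion to concentration are assumptions, not details taken from the A2SC2 paper:

    ```python
    import numpy as np
    from scipy import ndimage as ndi
    from skimage.filters import threshold_otsu
    from skimage.morphology import remove_small_objects

    def count_sperm(gray_image: np.ndarray, min_area_px: int = 30) -> int:
        """Count sperm objects in a grayscale microscope image.

        Basic pipeline: Otsu threshold, removal of small debris, then
        connected-component labelling. Sperm are assumed darker than the
        background; invert the comparison if not.
        """
        mask = gray_image < threshold_otsu(gray_image)
        mask = remove_small_objects(mask, min_size=min_area_px)
        _, n_objects = ndi.label(mask)
        return n_objects

    # Concentration then follows from the counting-chamber geometry and dilution,
    # e.g. concentration = count / counted_volume_ml (chamber-specific constant).
    ```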

  17. New Fully Automated Method for Segmentation of Breast Lesions on Ultrasound Based on Texture Analysis.

    Science.gov (United States)

    Gómez-Flores, Wilfrido; Ruiz-Ortega, Bedert Abel

    2016-07-01

    The study described here explored a fully automatic segmentation approach based on texture analysis for breast lesions on ultrasound images. The proposed method involves two main stages: (i) In lesion region detection, the original gray-scale image is transformed into a texture domain based on log-Gabor filters. Local texture patterns are then extracted from overlapping lattices and are further classified by a linear discriminant analysis classifier to distinguish between the "normal tissue" and "breast lesion" classes. Next, an incremental method based on the average radial derivative function reveals the region with the highest probability of being a lesion. (ii) In lesion delineation, using the detected region and the pre-processed ultrasound image, an iterative thresholding procedure based on the average radial derivative function is performed to determine the final lesion contour. The experiments are carried out on a data set of 544 breast ultrasound images (including cysts, benign solid masses and malignant lesions) acquired with three distinct ultrasound machines. In terms of the area under the receiver operating characteristic curve, the one-way analysis of variance test (α=0.05) indicates that the proposed approach significantly outperforms two published fully automatic methods (p < 0.05), demonstrating the capability of texture features to accurately segment breast lesions. In addition, the proposed approach can potentially be used for automated computer diagnosis purposes to assist physicians in the detection and classification of breast masses. PMID:27095150

  18. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    Science.gov (United States)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analysis are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of the sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid over-design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.

  19. Automated cloning methods

    International Nuclear Information System (INIS)

    Argonne has developed a series of automated protocols to generate bacterial expression clones by using a robotic system designed to be used in procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37 °C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass those protocols involved in generating purified coding region (PCR)

  20. Semi-automated method to measure pneumonia severity in mice through computed tomography (CT) scan analysis

    Science.gov (United States)

    Johri, Ansh; Schimel, Daniel; Noguchi, Audrey; Hsu, Lewis L.

    2010-03-01

    Imaging is a crucial clinical tool for diagnosis and assessment of pneumonia, but quantitative methods are lacking. Micro-computed tomography (micro CT), designed for lab animals, provides opportunities for non-invasive radiographic endpoints for pneumonia studies. HYPOTHESIS: In vivo micro CT scans of mice with early bacterial pneumonia can be scored quantitatively by semiautomated imaging methods, with good reproducibility and correlation with bacterial dose inoculated, pneumonia survival outcome, and radiologists' scores. METHODS: Healthy mice had intratracheal inoculation of E. coli bacteria (n=24) or saline control (n=11). In vivo micro CT scans were performed 24 hours later with microCAT II (Siemens). Two independent radiologists scored the extent of airspace abnormality, on a scale of 0 (normal) to 24 (completely abnormal). Using the Amira 5.2 software (Mercury Computer Systems), a histogram distribution of voxel counts in the Hounsfield range of -510 to 0 was created and analyzed, and a segmentation procedure was devised. RESULTS: A t-test was performed to determine whether there was a significant difference in the mean voxel value of each mouse among the three experimental groups: Saline Survivors, Pneumonia Survivors, and Pneumonia Non-survivors. The voxel count method was able to statistically distinguish the Saline Survivors from the Pneumonia Survivors and the Saline Survivors from the Pneumonia Non-survivors, but not the Pneumonia Survivors from the Pneumonia Non-survivors. The segmentation method, however, was able to distinguish the two Pneumonia groups. CONCLUSION: We have pilot-tested an evaluation of early pneumonia in mice using micro CT and a semi-automated method for lung segmentation and a scoring system. Statistical analysis indicates that the system is reliable and merits further evaluation.
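
    A minimal sketch of the voxel-count scoring idea (counting lung voxels inside a fixed Hounsfield window and comparing groups with a t-test); the lung mask and the exact score definition are assumptions, not the study's implementation:

    ```python
    import numpy as np
    from scipy import stats

    def airspace_score(ct_volume: np.ndarray, lung_mask: np.ndarray,
                       hu_range=(-510, 0)) -> float:
        """Fraction of lung voxels whose HU value falls in the 'abnormal' window.

        Mirrors the idea of scoring each mouse by counting voxels in a fixed
        Hounsfield range (here -510 to 0 HU) inside a lung mask.
        """
        lung_hu = ct_volume[lung_mask > 0]
        in_window = (lung_hu >= hu_range[0]) & (lung_hu <= hu_range[1])
        return float(in_window.mean())

    # Group comparison, e.g. saline survivors vs. pneumonia survivors:
    # t, p = stats.ttest_ind(scores_saline, scores_pneumonia, equal_var=False)
    ```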

  1. Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy

    International Nuclear Information System (INIS)

    We introduce the automation of the range difference calculation deduced from particle-irradiation induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles, under consideration of the planned dose distribution, resulting in the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to ‘red flag’ problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in
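
    A minimal sketch of the core shift search (minimizing the absolute profile difference over a distal window between two activity depth profiles shifted against each other); the automated distal-window selection and the uncertainty weighting described above are omitted, and all names are illustrative:

    ```python
    import numpy as np

    def most_likely_shift(measured: np.ndarray, reference: np.ndarray,
                          distal: slice, max_shift: int = 30) -> int:
        """Range shift (in voxels) that best aligns two activity depth profiles.

        The reference profile is shifted against the measured one and the mean
        absolute difference over the distal region of interest is minimised.
        """
        shifts = np.arange(-max_shift, max_shift + 1)
        costs = [np.mean(np.abs(measured[distal] - np.roll(reference, s)[distal]))
                 for s in shifts]
        return int(shifts[int(np.argmin(costs))])

    # shift_mm = most_likely_shift(pet_measured, pet_simulated, distal=slice(80, 120)) * voxel_mm
    ```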

  2. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized...

  3. Knee x-ray image analysis method for automated detection of osteoarthritis.

    Science.gov (United States)

    Shamir, Lior; Ling, Shari M; Scott, William W; Bos, Angelo; Orlov, Nikita; Macura, Tomasz J; Eckley, D Mark; Ferrucci, Luigi; Goldberg, Ilya G

    2009-02-01

    We describe a method for automated detection of radiographic osteoarthritis (OA) in knee X-ray images. The detection is based on the Kellgren-Lawrence (KL) classification grades, which correspond to the different stages of OA severity. The classifier was built using manually classified X-rays, representing the first four KL grades (normal, doubtful, minimal, and moderate). Image analysis is performed by first identifying a set of image content descriptors and image transforms that are informative for the detection of OA in the X-rays and assigning weights to these image features using Fisher scores. Then, a simple weighted nearest neighbor rule is used in order to predict the KL grade to which a given test X-ray sample belongs. The dataset used in the experiment contained 350 X-ray images classified manually by their KL grades. Experimental results show that moderate OA (KL grade 3) and minimal OA (KL grade 2) can be differentiated from normal cases with accuracy of 91.5% and 80.4%, respectively. Doubtful OA (KL grade 1) was detected automatically with a much lower accuracy of 57%. The source code developed and used in this study is available for free download at www.openmicroscopy.org. PMID:19342330
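
    A minimal sketch of the classification idea described above: Fisher scores as per-feature weights feeding a simple weighted nearest-neighbour rule. This is a generic illustration, not the exact implementation used in the study:

    ```python
    import numpy as np

    def fisher_scores(X: np.ndarray, y: np.ndarray) -> np.ndarray:
        """One Fisher score per feature: between-class over within-class variance."""
        classes = np.unique(y)
        overall_mean = X.mean(axis=0)
        between = sum((y == c).sum() * (X[y == c].mean(axis=0) - overall_mean) ** 2
                      for c in classes)
        within = sum((y == c).sum() * X[y == c].var(axis=0) for c in classes)
        return between / (within + 1e-12)

    def predict_kl_grade(x: np.ndarray, X_train: np.ndarray, y_train: np.ndarray) -> int:
        """Nearest-neighbour prediction in the Fisher-score-weighted feature space."""
        w = fisher_scores(X_train, y_train)
        dists = np.sqrt(((X_train - x) ** 2 * w).sum(axis=1))  # weighted Euclidean
        return int(y_train[np.argmin(dists)])
    ```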

  4. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    OpenAIRE

    Demir Sumeyra U; Hakimzadeh Roya; Hargraves Rosalyn Hobson; Ward Kevin R; Myer Eric V; Najarian Kayvan

    2012-01-01

    Abstract Background Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated system tool that can extract microvasculature information and monitor changes in tis...

  5. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. ... The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows...

  6. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch;

    1997-01-01

    to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically...

  7. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...

  8. New automated image analysis method for the assessment of Ki-67 labeling index in meningiomas.

    Directory of Open Access Journals (Sweden)

    Wielisław Papierz

    2010-05-01

    Full Text Available Many studies have emphasised the importance of the Ki-67 labeling index (LI) as a proliferation marker in meningiomas. Several authors have confirmed that Ki-67 LI has prognostic significance and correlates with the likelihood of tumour recurrence. These observations have been widely accepted by pathologists, but up till now no standard method for Ki-67 LI assessment has been developed and introduced into diagnostic pathology. In this paper we present a new computerised system for automated Ki-67 LI estimation in meningiomas as an aid for histological grading of meningiomas and a potential standard method of Ki-67 LI assessment. We also discuss the concordance of the Ki-67 LI results obtained by the presented computerized system and an expert pathologist, as well as possible pitfalls and mistakes in the automated counting of immunopositive or negative cells. For the quantitative evaluation of digital images of meningiomas the designed software uses an algorithm based on a mathematical description of cell morphology. This solution acts together with a Support Vector Machine (SVM) used in classification mode for the recognition of the immunoreactivity of cells. The applied sequential thresholding simulated well the human process of cell recognition. The same digital images of randomly selected tumour areas were analysed in parallel by the computer and blindly by two expert pathologists. Ki-67 labeling indices were estimated and the results compared. The mean relative discrepancy between the levels of Ki-67 LI obtained by our system and by the human expert did not exceed 14% in all investigated cases. These preliminary results suggest that the designed software could be a useful tool supporting diagnostic digital pathology. However, more extended studies are needed to confirm this suggestion.

  9. Automated document analysis system

    Science.gov (United States)

    Black, Jeffrey D.; Dietzel, Robert; Hartnett, David

    2002-08-01

    A software application has been developed to aid law enforcement and government intelligence gathering organizations in the translation and analysis of foreign language documents with potential intelligence content. The Automated Document Analysis System (ADAS) provides the capability to search (data or text mine) documents in English and the most commonly encountered foreign languages, including Arabic. Hardcopy documents are scanned by a high-speed scanner and are optical character recognized (OCR). Documents obtained in an electronic format bypass the OCR and are copied directly to a working directory. For translation and analysis, the script and the language of the documents are first determined. If the document is not in English, the document is machine translated to English. The documents are searched for keywords and key features in either the native language or translated English. The user can quickly review the document to determine if it has any intelligence content and whether detailed, verbatim human translation is required. The documents and document content are cataloged for potential future analysis. The system allows non-linguists to evaluate foreign language documents and allows for the quick analysis of a large quantity of documents. All document processing can be performed manually or automatically on a single document or a batch of documents.

  10. Preparatory methods for DNA hydrolysis, cytochemistry, immunocytochemistry and ploidy analysis. Their application to automated and routine diagnostic cytopathology.

    Science.gov (United States)

    Husain, O A; Watts, K C

    1987-06-01

    A review is presented of some methods used to prepare cytologic specimens for analytical and/or automated studies, with the steps of the procedures detailed in appendices. The preparation of the cell monolayers required for optimal automated cell image analysis and classification, e.g., by the Cytoscan 110, is discussed, as is the preparation of poly-L-lysine-coated slides used in the production of monolayered specimens. These monolayers, which can be prepared from a variety of specimens, are also useful for cytochemical and immunocytochemical studies and DNA ploidy analysis. For DNA analysis, a modified gallocyanin chrome alum staining procedure is described as a stoichiometric alternative to the time-consuming Feulgen reaction. The hydrolysis technique required by the latter method is also detailed. The freeze-fracturing technique for the enhancement of monoclonal antibody immunocytochemical staining of detectable antigens is described, along with an indirect immunoalkaline phosphatase staining method. The use of enzyme cytochemical reactions for glucose 6 phosphate dehydrogenase and lysosomal naphthylamidase is also presented. PMID:3620061

  11. Automated quantitative analysis for pneumoconiosis

    Science.gov (United States)

    Kondo, Hiroshi; Zhao, Bin; Mino, Masako

    1998-09-01

    Automated quantitative analysis for pneumoconiosis is presented. In this paper, Japanese standard radiographs of pneumoconiosis are categorized by measuring the area density and the number density of small rounded opacities. Furthermore, the size and shape of the opacities are classified by measuring the equivalent radius of each opacity. The proposed method includes a bi-level unsharp masking filter with a 1D uniform impulse response in order to eliminate undesired parts such as the images of blood vessels and ribs in the chest x-ray photo. Fuzzy contrast enhancement is also introduced in this method for easy and exact detection of small rounded opacities. Many simulation examples show that the proposed method is more reliable than the former method.

  12. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers are part of electric power system equipment whose reliability influences, to a great extent, the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure of a breaker to switch off a short circuit, followed by failure of the reserve unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. This, however, demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, about their failures, testing and repair, as well as advanced computer software and a specific automated information system (AIS). A new AIS, with the AISV logo, was developed at the “Reliability of power equipment” department of AzRDSI of Energy. The main features of AISV are: to provide data base security and accuracy; to carry out systematic control of breaker conformity with operating conditions; to estimate the value of individual reliability and the characteristics of its change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for its realization.

  13. Introducing Powell's Direction Set Method to a Fully Automated Analysis of Eclipsing Binary Stars

    CERN Document Server

    Prsa, A

    2006-01-01

    With recent observational advancements, substantial amounts of photometric and spectroscopic eclipsing binary data have been acquired. As part of an ongoing effort to assemble a reliable pipeline for fully automatic data analysis, we put Powell's direction set method to the test. The method does not depend on numerical derivatives, only on function evaluations, and as such it cannot diverge. Compared to differential corrections (DC) and Nelder & Mead's downhill simplex (NMS) method, Powell's method proves to be more efficient in terms of solution determination and the required number of iterations. However, its application is still not optimal in terms of time cost. Causes for this deficiency are identified and two steps toward the solution are proposed: non-orthogonality of the parameter set should be removed and better initial directions should be determined before the minimization is initiated. Once these setbacks are worked out, Powell's method will probably replace DC and NMS as the default minimizing...
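
    Powell's direction set method needs only function evaluations, which is what makes it attractive here. A minimal sketch of driving it through SciPy on a light-curve cost function; the model function and parameter set are placeholders, not the pipeline described above:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def chi2(params, phase, observed_flux, sigma, synthetic_lightcurve):
        """Chi-square between an observed light curve and a model evaluated at params."""
        model_flux = synthetic_lightcurve(phase, params)
        return np.sum(((observed_flux - model_flux) / sigma) ** 2)

    # Powell's direction-set method: derivative-free, only function evaluations.
    # result = minimize(chi2, x0=initial_params,
    #                   args=(phase, flux, sigma, synthetic_lightcurve),
    #                   method="Powell")
    # best_params = result.x
    ```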

  14. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    International Nuclear Information System (INIS)

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been evaluated for application to cerebral perfusion SPET imaging in ischaemic stroke. It has been shown that useful quantitative estimates, high sensitivity and high specificity may be obtained. Sensitivity and the accuracy of signal quantification were found to be dependent on the operator defined analysis parameters. Recommendations for the values of these parameters have been made. The analysis method developed has been compared with an established method and shown to result in higher specificity for the data and analysis parameter sets tested. In addition, application to a group of ischaemic stroke patient SPET scans has demonstrated its clinical utility. The influence of imaging conditions has been assessed using phantom data acquired with different gamma camera SPET acquisition parameters. A lower limit of five million counts and standardisation of all acquisition parameters has been recommended for the analysis of individual SPET scans. (author)
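
    A minimal sketch of the voxel-by-voxel comparison at the heart of such an analysis: z-scores of an individual scan against a control group, thresholding, and connected-cluster extraction. The Monte Carlo inference, morphological noise reduction and smoothing described above are omitted, and all parameter values are illustrative:

    ```python
    import numpy as np
    from scipy import ndimage as ndi

    def abnormal_clusters(patient: np.ndarray, controls: np.ndarray,
                          z_thresh: float = -2.5, min_voxels: int = 50):
        """Clusters of abnormally low uptake in one scan vs. a control group.

        patient:  3D SPET image, intensity-normalised and spatially registered.
        controls: 4D array (n_controls, z, y, x) of registered control scans.
        Returns a labelled cluster image and the size of each retained cluster.
        """
        mean = controls.mean(axis=0)
        std = controls.std(axis=0, ddof=1) + 1e-9
        z = (patient - mean) / std                      # voxel-wise z-scores
        labels, n = ndi.label(z < z_thresh)             # connected hypo-perfused voxels
        sizes = ndi.sum(np.ones_like(z), labels, index=np.arange(1, n + 1))
        keep = np.isin(labels, 1 + np.flatnonzero(sizes >= min_voxels))
        return labels * keep, sizes[sizes >= min_voxels]
    ```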

  15. NetFCM: A Semi-Automated Web-Based Method for Flow Cytometry Data Analysis

    DEFF Research Database (Denmark)

    Frederiksen, Juliet Wairimu; Buggert, Marcus; Karlsson, Annika C.;

    2014-01-01

    tool both for subset identification as well as for quantification of differences between samples. Additionally, NetFCM can classify and cluster samples based on multidimensional data. We tested the method using a data set of peripheral blood mononuclear cells collected from 23 HIV-infected individuals...... data analysis has become more complex and labor-intensive than previously. We have therefore developed a semi-automatic gating strategy (NetFCM) that uses clustering and principal component analysis (PCA) together with other statistical methods to mimic manual gating approaches. NetFCM is an online......, which were stimulated with overlapping HIV Gag-p55 and CMV-pp65 peptides or medium alone (negative control). NetFCM clustered the virus-specific CD8+ T cells based on IFN and TNF responses into distinct compartments. Additionally, NetFCM was capable of identifying HIV- and CMV-specific responses...

  16. Automated methods for quantitative and qualitative analysis of PTR-TOF mass spectra

    International Nuclear Information System (INIS)

    Statistical analysis of measured signals from counting systems is a common method to increase the accuracy and precision of peak position and peak area. The most common approach to analyzing data gained from counting systems is to fit the data peak by peak using an appropriate probability density function (PDF) such as a Gaussian function. Since a counting system creates histograms, the counted data do not represent data points of the anticipated PDF. Therefore, one should not fit any PDF directly to the histogram data. Here we present a solution to this problem by fitting distributions instead of densities. A simple formula allows correction for Poisson statistics and dead-time effects. The improved peak analysis method is applied to mass spectra obtained from a recently developed proton-transfer-reaction time-of-flight mass spectrometer (PTR-TOF), enhancing the mass accuracy and peak quantification. (author)
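
    A minimal sketch of the "fit distributions, not densities" idea: the expected count in each histogram bin is the integral of the peak shape over the bin (a difference of CDF values), fitted under a Poisson likelihood. The dead-time correction mentioned above is omitted and all names are illustrative:

    ```python
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    def fit_peak(bin_edges: np.ndarray, counts: np.ndarray, p0):
        """Fit a Gaussian peak to histogram data by fitting the distribution.

        Expected counts per bin are bin integrals of the peak shape (difference
        of the Gaussian CDF at the bin edges), not densities at bin centres.
        p0 = (area, mu, sigma) initial guess.
        """
        def neg_log_likelihood(p):
            area, mu, sigma = p
            sigma = abs(sigma) + 1e-12                     # keep the width positive
            expected = area * np.diff(norm.cdf(bin_edges, loc=mu, scale=sigma))
            expected = np.clip(expected, 1e-12, None)
            return np.sum(expected - counts * np.log(expected))  # Poisson NLL (up to a constant)
        return minimize(neg_log_likelihood, x0=np.asarray(p0, float), method="Nelder-Mead")

    # res = fit_peak(edges, counts, p0=(counts.sum(), mass_guess, 0.002))
    # area, centroid, width = res.x
    ```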

  17. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers

    OpenAIRE

    Shu, Jie; Dolman, G. E.; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-01-01

    Background Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Methods Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to the detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To ...

  18. Automated analysis of 3D echocardiography

    NARCIS (Netherlands)

    Stralen, Marijn van

    2009-01-01

    In this thesis we aim at automating the analysis of 3D echocardiography, mainly targeting the functional analysis of the left ventricle. Manual analysis of these data is cumbersome, time-consuming and is associated with inter-observer and inter-institutional variability. Methods for reconstruction o

  19. Longitudinal analysis of the temporal evolution of Acinetobacter baumannii strains in Ohio, USA, by using rapid automated typing methods.

    Directory of Open Access Journals (Sweden)

    Brooke K Decker

    Full Text Available Genotyping methods are essential to understand the transmission dynamics of Acinetobacter baumannii. We examined the representative genotypes of A. baumannii at different time periods in select locations in Ohio, using two rapid automated typing methods: PCR coupled with electrospray ionization mass spectrometry (PCR/ESI-MS), a form of multi-locus sequence typing (MLST), and repetitive-sequence-based PCR (rep-PCR). Our analysis included 122 isolates from 4 referral hospital systems in 2 urban areas of Ohio. These isolates were associated with outbreaks at 3 different time periods (1996, 2000 and 2005-2007). Type assignments of PCR/ESI-MS and rep-PCR were compared to each other and to worldwide (WW) clone types. The discriminatory power of each method was determined using Simpson's index of diversity (DI). We observed that PCR/ESI-MS sequence type (ST) 14, corresponding to WW clone 3, predominated in 1996, whereas ST 12 and 14 co-existed in the intermediate period (2000) and ST 10 and 12, belonging to WW clone 2, predominated more recently in 2007. The shift from WW clone 3 to WW clone 2 was accompanied by an increase in carbapenem resistance. The DI was approximately 0.74 for PCR/ESI-MS, 0.88 for rep-PCR and 0.90 for the combination of both typing methods. We conclude that combining rapid automated typing methods such as PCR/ESI-MS and rep-PCR serves to optimally characterize the regional molecular epidemiology of A. baumannii. Our data also shed light on the changing sequence types over an 11-year period in Northeast Ohio.
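
    The discriminatory power quoted above is Simpson's index of diversity as commonly used for typing methods. A minimal worked example (the isolate counts are made up for illustration):

    ```python
    from collections import Counter

    def simpsons_diversity_index(type_assignments) -> float:
        """Simpson's index of diversity (discriminatory power) of a typing method.

        DI = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)),
        where n_j is the number of isolates assigned to type j and N is the total.
        """
        counts = Counter(type_assignments).values()
        n_total = sum(counts)
        return 1.0 - sum(n * (n - 1) for n in counts) / (n_total * (n_total - 1))

    # Example with three sequence types over ten isolates:
    print(simpsons_diversity_index(["ST14"] * 5 + ["ST12"] * 3 + ["ST10"] * 2))  # ~0.69
    ```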

  20. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyls (PCB) content in soil matrices. A demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  1. Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization

    Science.gov (United States)

    Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)

    2008-01-01

    A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.

  2. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

    Full Text Available Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed gene set matrix analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets related by known biological function or as designated solely by the end-user against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
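
    GSMA itself is not reproduced here; the following is a minimal sketch of the underlying idea, testing whether one gene set is shifted up or down relative to the full distribution of expression changes. The gene names, simulated values, and the choice of a Mann-Whitney test are illustrative assumptions, not the published algorithm.

```python
import numpy as np
from scipy import stats

def gene_set_shift(all_log_ratios, gene_set):
    """Test whether a gene set is shifted up or down relative to the rest of
    the expression changes in one dataset. Returns (median shift, p-value)."""
    gene_set = set(gene_set)
    in_set = np.array([v for g, v in all_log_ratios.items() if g in gene_set])
    rest = np.array([v for g, v in all_log_ratios.items() if g not in gene_set])
    _, p_value = stats.mannwhitneyu(in_set, rest, alternative="two-sided")
    return float(np.median(in_set) - np.median(rest)), float(p_value)

# Hypothetical expression changes (log2 ratios) and a hypothetical gene set:
rng = np.random.default_rng(0)
dataset = {f"gene{i}": float(x) for i, x in enumerate(rng.normal(0.0, 1.0, 5000))}
my_set = [f"gene{i}" for i in range(50)]
for g in my_set:                      # simulate coordinated up-regulation
    dataset[g] += 1.0
print(gene_set_shift(dataset, my_set))
```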

  3. An automated method for nonparametric kinetic analysis of clinical DCE-MRI data: application to glioblastoma treated with bevacizumab.

    Science.gov (United States)

    Ferl, Gregory Z; Xu, Lu; Friesenhahn, Michel; Bernstein, Lisa J; Barboriak, Daniel P; Port, Ruediger E

    2010-05-01

    Here, we describe an automated nonparametric method for evaluating gadolinium-diethylene triamine pentaacetic acid (Gd-DTPA) kinetics, based on dynamic contrast-enhanced-MRI scans of glioblastoma patients taken before and after treatment with bevacizumab; no specific model or equation structure is assumed or used. Tumor and venous blood concentration-time profiles are smoothed, using a robust algorithm that removes artifacts due to patient motion, and then deconvolved, yielding an impulse response function. In addition to smoothing, robustness of the deconvolution operation is assured by excluding data that occur prior to the plasma peak; an exhaustive analysis was performed to demonstrate that exclusion of the prepeak plasma data does not significantly affect results. All analysis steps are executed by a single R script that requires blood and tumor curves as the sole input. Statistical moment analysis of the impulse response function yields the area under the curve (AUC) and mean residence time (MRT). Comparison of deconvolution results to fitted Tofts model parameters suggests that AUC/MRT and AUC of the impulse response function closely approximate fractional clearance from plasma to tissue (K(trans)) and fractional interstitial volume (v(e)). Intervisit variability is shown to be comparable when using the deconvolution method (11% [AUC/MRT] and 13% [AUC]) compared to the Tofts model (14% [K(trans)] and 24% [v(e)]). AUC and AUC/MRT both exhibit a statistically significant decrease (P < 0.005) 1 day after administration of bevacizumab. PMID:20432307
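
    The authors' analysis is implemented as an R script; the sketch below is an independent, simplified illustration in Python of the same two ingredients, matrix deconvolution of a tissue curve by a blood curve followed by statistical-moment estimation of AUC and MRT. The regularization weight, sampling grid, and test curves are assumptions made only for this example.

```python
import numpy as np

def deconvolve_irf(t, c_plasma, c_tissue, lam=1e-2):
    """Estimate the tissue impulse response function (IRF) by Tikhonov-
    regularized matrix deconvolution, assuming uniform time sampling and
    c_tissue = (c_plasma convolved with IRF) * dt."""
    dt = t[1] - t[0]
    n = len(t)
    A = np.zeros((n, n))
    for i in range(n):
        A[i, :i + 1] = c_plasma[i::-1] * dt   # lower-triangular convolution matrix
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ c_tissue)

def irf_moments(t, irf):
    """Area under the curve (AUC) and mean residence time (MRT) of the IRF."""
    dt = t[1] - t[0]
    auc = np.sum(irf) * dt
    mrt = np.sum(t * irf) * dt / auc
    return auc, mrt

# Hypothetical, noise-free test curves on a uniform grid (minutes):
t = np.linspace(0.0, 10.0, 200)
true_irf = 0.2 * np.exp(-0.5 * t)          # assumed mono-exponential IRF
c_p = 5.0 * t * np.exp(-t)                 # assumed plasma (blood) input
c_t = np.convolve(c_p, true_irf)[:t.size] * (t[1] - t[0])
auc, mrt = irf_moments(t, deconvolve_irf(t, c_p, c_t))
print(f"AUC = {auc:.3f}, MRT = {mrt:.2f} min")
```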

  4. Rapid methods and automation in dairy microbiology.

    Science.gov (United States)

    Vasavada, P C

    1993-10-01

    The importance of microbiology to the dairy industry has been demonstrated by recent outbreaks of foodborne illness associated with consumption of milk and dairy products that had been contaminated with pathogenic organisms or toxins. Undesirable microorganisms constitute the primary hazard to safety, quality, and wholesomeness of milk and dairy foods. Consequently, increased emphasis has been placed on the microbiological analysis of milk and dairy products designed to evaluate quality and to ensure safety and regulatory compliance. The focus of dairy microbiology, however, remains largely on conventional methods: plate counts, most probable numbers, and dye reduction tests. These methods are slow, tedious, intensive in their requirements for material and labor, and often not suitable for assessing the quality and shelf-life of perishable dairy foods. With the exception of coliforms, Salmonella, and Staphylococcus aureus, isolation and characterization of various organisms occurring in milk and milk products are seldom a part of the routine microbiological analysis in the dairy industry. Recent emphasis on the programs based on HACCP (Hazard Analysis and Critical Control Points) for total quality management in the dairy industry and increased demand for microbiological surveillance of products, process, and environment have led to increased interest in rapid methods and automation in microbiology. Several methods for rapid detection, isolation, enumeration, and characterization of microorganisms are being adapted by the dairy industry. This presentation reviews rapid methods and automation in microbiology for microbiological analysis of milk and dairy products. PMID:8227634

  5. Feasibility Analysis of Crane Automation

    Institute of Scientific and Technical Information of China (English)

    DONG Ming-xiao; MEI Xue-song; JIANG Ge-dong; ZHANG Gui-qing

    2006-01-01

    This paper summarizes the modeling methods, open-loop control and closed-loop control techniques of various forms of cranes, worldwide, and discusses their feasibilities and limitations in engineering. Then the dynamic behaviors of cranes are analyzed. Finally, we propose applied modeling methods and feasible control techniques and demonstrate the feasibilities of crane automation.

  6. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch;

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  7. Analysis of the disagreement between automated bioluminescence-based and culture methods for detecting significant bacteriuria, with proposals for standardizing evaluations of bacteriuria detection methods.

    OpenAIRE

    Nichols, W. W.; Curtis, G D; Johnston, H H

    1982-01-01

    A fully automated method for detecting significant bacteriuria is described which uses firefly luciferin and luciferase to detect bacterial ATP in urine. The automated method was calibrated and evaluated, using 308 urine specimens, against two reference culture methods. We obtained a specificity of 0.79 and sensitivity of 0.75 using a quantitative pour plate reference test and a specificity of 0.79 and a sensitivity of 0.90 using a semiquantitative standard loop reference test. The majority o...

  8. Automated activation-analysis system

    International Nuclear Information System (INIS)

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  9. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  10. Automated methods of textual content analysis and description of text structures

    CERN Document Server

    Chýla, Roman

    Universal Semantic Language (USL) is a semi-formalized approach for the description of knowledge (a knowledge representation tool). The idea of USL was introduced by Vladimir Smetacek in the system called SEMAN, which was used for keyword extraction tasks in the former Information Centre of the Czechoslovak Republic. However, due to the dissolution of the centre in the early 1990s, the system has been lost. This thesis reintroduces the idea of USL in a new context of quantitative content analysis. First we introduce the historical background and the problems of semantics and knowledge representation, semes, semantic fields, semantic primes and universals. The basic methodology of content analysis studies is illustrated on the example of three content analysis tools and we describe the architecture of a new system. The application was built specifically for USL discovery but it can work also in the context of classical content analysis. It contains Natural Language Processing (NLP) components and employs the algorith...

  11. Automated Methods for Multiplexed Pathogen Detection

    Energy Technology Data Exchange (ETDEWEB)

    Straub, Tim M.; Dockendorff, Brian P.; Quinonez-Diaz, Maria D.; Valdez, Catherine O.; Shutthanandan, Janani I.; Tarasevich, Barbara J.; Grate, Jay W.; Bruckner-Lea, Cindy J.

    2005-09-01

    Detection of pathogenic microorganisms in environmental samples is a difficult process. Concentration of the organisms of interest also co-concentrates inhibitors of many end-point detection methods, notably, nucleic acid methods. In addition, sensitive, highly multiplexed pathogen detection continues to be problematic. The primary function of the BEADS (Biodetection Enabling Analyte Delivery System) platform is the automated concentration and purification of target analytes from interfering substances, often present in these samples, via a renewable surface column. In one version of BEADS, automated immunomagnetic separation (IMS) is used to separate cells from their samples. Captured cells are transferred to a flow-through thermal cycler where PCR, using labeled primers, is performed. PCR products are then detected by hybridization to a DNA suspension array. In another version of BEADS, cell lysis is performed, and community RNA is purified and directly labeled. Multiplexed detection is accomplished by direct hybridization of the RNA to a planar microarray. The integrated IMS/PCR version of BEADS can successfully purify and amplify 10 E. coli O157:H7 cells from river water samples. Multiplexed PCR assays for the simultaneous detection of E. coli O157:H7, Salmonella, and Shigella on bead suspension arrays were demonstrated, with detection of as few as 100 cells of each organism. The RNA version of BEADS is also showing promising results. Automation yields highly purified RNA, suitable for multiplexed detection on microarrays, with microarray detection specificity equivalent to PCR. Both versions of the BEADS platform show great promise for automated pathogen detection from environmental samples. Highly multiplexed pathogen detection using PCR continues to be problematic, but may be required for trace detection in large volume samples. The RNA approach solves the issues of highly multiplexed PCR and provides "live vs. dead

  12. Introducing adapted Nelder & Mead's downhill simplex method to a fully automated analysis of eclipsing binaries

    CERN Document Server

    Prsa, A

    2004-01-01

    Eclipsing binaries are extremely attractive objects because absolute physical parameters (masses, luminosities, radii) of both components may be determined from observations. Since most efforts to extract these parameters were based on dedicated observing programs, existing modeling code is based on interactivity. Gaia will make a revolutionary advance in the sheer number of observed eclipsing binaries and new methods for automatic handling must be introduced and thoroughly tested. This paper focuses on Nelder & Mead's downhill simplex method applied to a synthetically created test binary as it will be observed by Gaia.
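
    Neither the Gaia pipeline nor the authors' code is shown here; the snippet below merely illustrates how a downhill simplex (Nelder-Mead) minimization can fit model parameters to a noisy synthetic curve without derivatives, using a hypothetical Gaussian-dip stand-in for an eclipse light curve.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical stand-in "light curve" model: a single Gaussian dip in flux.
def model(phase, depth, width, centre):
    return 1.0 - depth * np.exp(-0.5 * ((phase - centre) / width) ** 2)

def chi2(params, phase, flux, sigma):
    depth, width, centre = params
    return np.sum(((flux - model(phase, depth, width, centre)) / sigma) ** 2)

rng = np.random.default_rng(1)
phase = np.linspace(-0.5, 0.5, 300)
flux = model(phase, 0.3, 0.05, 0.0) + rng.normal(0.0, 0.01, phase.size)

# Downhill simplex needs no derivatives of the cost function.
result = minimize(chi2, x0=[0.1, 0.1, 0.05], args=(phase, flux, 0.01),
                  method="Nelder-Mead")
print(result.x)   # recovered (depth, width, centre)
```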

  13. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical-method-dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete, readying them for transport. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  14. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  15. Automated Analysis Methods for the Assessment of Bicycle Infrastructure and Demand

    OpenAIRE

    Pashkevich, Anton

    2013-01-01

    The first phase of the research activity addressed the state of the art of cycling infrastructure, bicycle use and evaluation methods. In this part, the candidate studied the "bicycle system" in countries with high bicycle use, in particular the Netherlands. An evaluation was carried out of the questionnaires from the survey conducted within the European project BICY on mobility in general in 13 cities of the participating countries. The questionn...

  16. Introducing adapted Nelder & Mead's downhill simplex method to a fully automated analysis of eclipsing binaries

    OpenAIRE

    Prsa, A.; Zwitter, T.

    2004-01-01

    Eclipsing binaries are extremely attractive objects because absolute physical parameters (masses, luminosities, radii) of both components may be determined from observations. Since most efforts to extract these parameters were based on dedicated observing programs, existing modeling code is based on interactivity. Gaia will make a revolutionary advance in the sheer number of observed eclipsing binaries and new methods for automatic handling must be introduced and thoroughly tested. This paper foc...

  17. Automated Nanofiber Diameter Measurement in SEM Images Using a Robust Image Analysis Method

    OpenAIRE

    2014-01-01

    Due to their high surface area, porosity, and rigidity, applications of nanofibers and nanosurfaces have expanded in recent years. Nanofibers and nanosurfaces are typically produced by the electrospinning method. In the production process, determination of the average fiber diameter is crucial for quality assessment. The average fiber diameter is determined by manually measuring the diameters of randomly selected fibers on scanning electron microscopy (SEM) images. However, as the number of the images inc...

  18. Validation of three viable-cell counting methods: Manual, semi-automated, and automated

    Directory of Open Access Journals (Sweden)

    Daniela Cadena-Herrera

    2015-09-01

    Full Text Available A viable cell count is essential to evaluate the kinetics of cell growth. Since the hemocytometer was first used for counting blood cells, several variants of the methodology have been developed towards reducing the time of analysis and improving accuracy through automation of both sample preparation and counting. The successful implementation of automated techniques relies on the adjustment of cell staining, image display parameters and cell morphology to obtain precision, accuracy and linearity equivalent to those of the hemocytometer. In this study, we validated three trypan blue exclusion-based methods: manual, semi-automated, and fully automated, which were used for the estimation of density and viability of cells employed for the biosynthesis and bioassays of recombinant proteins. Our results showed that the evaluated attributes remained within the same range for the automated methods with respect to the manual method, providing an efficient alternative for analyzing a large number of samples.

  19. Method development in automated mineralogy

    OpenAIRE

    Sandmann, Dirk

    2015-01-01

    The underlying research that resulted in this doctoral dissertation was performed at the Division of Economic Geology and Petrology of the Department of Mineralogy, TU Bergakademie Freiberg between 2011 and 2014. It was the primary aim of this thesis to develop and test novel applications for the technology of ‘Automated Mineralogy’ in the field of economic geology and geometallurgy. A “Mineral Liberation Analyser” (MLA) instrument of FEI Company was used to conduct most analytical studies. T...

  20. Note: An automated image analysis method for high-throughput classification of surface-bound bacterial cell motions.

    Science.gov (United States)

    Shen, Simon; Syal, Karan; Tao, Nongjian; Wang, Shaopeng

    2015-12-01

    We present a Single-Cell Motion Characterization System (SiCMoCS) to automatically extract bacterial cell morphological features from microscope images and use those features to automatically classify cell motion for rod-shaped motile bacterial cells. In some imaging-based studies, bacterial cells need to be attached to the surface for time-lapse observation of cellular processes such as cell membrane-protein interactions and membrane elasticity. These studies often generate large volumes of images. Extracting accurate bacterial cell morphology features from these images is critical for quantitative assessment. Using SiCMoCS, we demonstrated simultaneous and automated motion tracking and classification of hundreds of individual cells in an image sequence of several hundred frames. This is a significant improvement over traditional manual and semi-automated approaches to segmenting bacterial cells based on empirical thresholds, and a first attempt to automatically classify bacterial motion types for motile rod-shaped bacterial cells, which enables rapid and quantitative analysis of various types of bacterial motion. PMID:26724085

  1. Note: An automated image analysis method for high-throughput classification of surface-bound bacterial cell motions

    Science.gov (United States)

    Shen, Simon; Syal, Karan; Tao, Nongjian; Wang, Shaopeng

    2015-12-01

    We present a Single-Cell Motion Characterization System (SiCMoCS) to automatically extract bacterial cell morphological features from microscope images and use those features to automatically classify cell motion for rod-shaped motile bacterial cells. In some imaging-based studies, bacterial cells need to be attached to the surface for time-lapse observation of cellular processes such as cell membrane-protein interactions and membrane elasticity. These studies often generate large volumes of images. Extracting accurate bacterial cell morphology features from these images is critical for quantitative assessment. Using SiCMoCS, we demonstrated simultaneous and automated motion tracking and classification of hundreds of individual cells in an image sequence of several hundred frames. This is a significant improvement over traditional manual and semi-automated approaches to segmenting bacterial cells based on empirical thresholds, and a first attempt to automatically classify bacterial motion types for motile rod-shaped bacterial cells, which enables rapid and quantitative analysis of various types of bacterial motion.

  2. Automation of finite element methods

    CERN Document Server

    Korelc, Jože

    2016-01-01

    New finite elements are needed both in research and in industrial environments for the development of virtual prediction techniques. The design and implementation of novel finite elements for specific purposes is a tedious and time-consuming task, especially for nonlinear formulations. Automating this process can speed it up considerably, since the generation of the final computer code can be accelerated by several orders of magnitude. This book provides the reader with the knowledge needed to employ modern automatic tools like AceGen within solid mechanics in a successful way. It covers the range from the theoretical background and algorithmic treatments to many different applications. The book is written for advanced students in the engineering field and for researchers in educational and industrial environments.

  3. A new automated method for analysis of gated-SPECT images based on a three-dimensional heart shaped model

    DEFF Research Database (Denmark)

    Lomsky, Milan; Richter, Jens; Johansson, Lena; El-Ali, Henrik; Aström, Karl; Ljungberg, Michael; Edenbrandt, Lars; El Ali, Henrik H.

    2005-01-01

    A new automated method for quantification of left ventricular function from gated single-photon emission computed tomography (SPECT) images has been developed. The method for quantification of cardiac function (CAFU) is based on a heart-shaped model and the active shape algorithm. The maximal differences between the CAFU estimations and the true left ventricular volumes of the digital phantoms were 11 ml for the end-diastolic volume (EDV), 3 ml for the end-systolic volume (ESV) and 3% for the ejection fraction (EF). The largest differences were seen in the smallest heart. In the patient group the EDV calculated using QGS and CAFU showed good agreement for large hearts and higher CAFU values compared with QGS for the smaller hearts. In the larger hearts, ESV was much larger for QGS than for CAFU both in the phantom and patient studies. In the smallest hearts there was good...

  4. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against ICP, considered the gold standard for automated rigid registration. Furthermore, the ... point distance. T-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed rank test). The influence of sampling density, sampling quantity, and norms on performance is analyzed using a similar method.

  5. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Directory of Open Access Journals (Sweden)

    Fernanda C Dórea

    Full Text Available BACKGROUND: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. METHODS: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. RESULTS: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed equal interpretability to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. CONCLUSION: The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish
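
    The gap the abstract reports between micro- and macro-averaged F1 scores comes directly from how the two averages weight rare classes. A small illustration follows, using hypothetical labels and scikit-learn's f1_score rather than the study's data.

```python
from sklearn.metrics import f1_score

# Hypothetical labels for a 3-class problem with one rare class ("C"):
y_true = ["A"] * 90 + ["B"] * 9 + ["C"]
y_pred = ["A"] * 90 + ["B"] * 9 + ["B"]        # the rare class is never predicted

# Micro-averaging pools all decisions, so frequent classes dominate;
# macro-averaging weights each class equally, exposing the missed class.
print("F1-micro:", f1_score(y_true, y_pred, average="micro", zero_division=0))
print("F1-macro:", f1_score(y_true, y_pred, average="macro", zero_division=0))
```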

  6. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time-consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify that all the necessary calculations were performed. Such a procedure involves many steps and is very time-consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces the user's task to defining fuel pin types, enrichments, assembly maps and operational parameters, all through a user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  7. APSAS; an Automated Particle Size Analysis System

    Science.gov (United States)

    Poppe, Lawrence J.; Eliason, A.H.; Fredericks, J.J.

    1985-01-01

    The Automated Particle Size Analysis System integrates a settling tube and an electroresistance multichannel particle-size analyzer (Coulter Counter) with a Pro-Comp/gg microcomputer and a Hewlett Packard 2100 MX (HP 2100 MX) minicomputer. This system and its associated software digitize the raw sediment grain-size data, combine the coarse- and fine-fraction data into complete grain-size distributions, perform method-of-moments and inclusive graphic statistics, verbally classify the sediment, generate histogram and cumulative frequency plots, and transfer the results into a data-retrieval system. This system saves time and labor and affords greater reliability, resolution, and reproducibility than conventional methods do.
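
    Method-of-moments grain-size statistics of the kind APSAS reports can be computed from a binned distribution as sketched below; the phi class mid-points and weight percentages are invented for illustration, and the formulas follow the standard moment definitions rather than the system's own code.

```python
import numpy as np

def moment_statistics(phi_midpoints, weight_percent):
    """Method-of-moments grain-size statistics in phi units: mean, sorting
    (standard deviation), skewness and kurtosis from class mid-points and
    weight percentages."""
    m = np.asarray(phi_midpoints, dtype=float)
    f = np.asarray(weight_percent, dtype=float)
    f = f / f.sum()                       # normalize weights to fractions
    mean = np.sum(f * m)
    sd = np.sqrt(np.sum(f * (m - mean) ** 2))
    skew = np.sum(f * (m - mean) ** 3) / sd ** 3
    kurt = np.sum(f * (m - mean) ** 4) / sd ** 4
    return mean, sd, skew, kurt

# Hypothetical grain-size distribution (phi class mid-points, weight %):
phi = [0.5, 1.5, 2.5, 3.5, 4.5]
wt = [5, 20, 40, 25, 10]
print(moment_statistics(phi, wt))
```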

  8. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  9. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN) 6 ] 3–/4– Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6]3-/4- process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.

  10. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

    M. Istrate

    2010-12-01

    Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often required to make informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  11. Automated Pipelines for Spectroscopic Analysis

    CERN Document Server

    Prieto, Carlos Allende

    2016-01-01

    The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on an ever increasing relevance, and the concept is being applied to many more areas, from targeting to analysis. In this paper, I provide a quick overvie...

  12. Statistical Analysis of Filament Features Based on the Hα Solar Images from 1988 to 2013 by Computer Automated Detection Method

    Science.gov (United States)

    Hao, Q.; Fang, C.; Cao, W.; Chen, P. F.

    2015-12-01

    We improve our automated filament detection method, which was proposed in our previous works. It is then applied to process the full-disk Hα data obtained mainly by the Big Bear Solar Observatory from 1988 to 2013, spanning nearly three solar cycles. The butterfly diagrams of the filaments, showing the filament area, spine length, tilt angle, and barb number, are obtained. The variations of these features with the calendar year and the latitude band are analyzed. The drift velocities of the filaments in different latitude bands are calculated and studied. We also investigate the north-south (N-S) asymmetries of the filament numbers in total and in each subclass classified according to the filament area, spine length, and tilt angle. The latitudinal distribution of the filament number is found to be bimodal. About 80% of all the filaments have tilt angles within [0°, 60°]. For the filaments within latitudes lower (higher) than 50°, the northeast (northwest) direction is dominant in the northern hemisphere and the southeast (southwest) direction is dominant in the southern hemisphere. The latitudinal migrations of the filaments experience three stages with declining drift velocities in each of solar cycles 22 and 23, and it seems that the drift velocity is faster in shorter solar cycles. Most filaments in latitudes lower (higher) than 50° migrate toward the equator (polar region). The N-S asymmetry indices indicate that the southern hemisphere is the dominant hemisphere in solar cycle 22 and the northern hemisphere is the dominant one in solar cycle 23.
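
    The N-S asymmetry index is not defined explicitly in the abstract; assuming the commonly used form A = (N - S)/(N + S), a minimal sketch with invented hemispheric counts is:

```python
def ns_asymmetry(n_north, n_south):
    """North-south asymmetry index A = (N - S) / (N + S):
    positive values indicate northern dominance, negative values southern."""
    return (n_north - n_south) / (n_north + n_south)

# Hypothetical yearly filament counts per hemisphere (not the catalogued data):
counts = {1996: (410, 455), 2000: (520, 500), 2007: (610, 540)}
for year, (n, s) in counts.items():
    print(year, f"A = {ns_asymmetry(n, s):+.3f}")
```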

  13. Computer vision methods applied to semi-automated analysis of inter-individual distances of perching birds

    OpenAIRE

    Kumperščak, Borut

    2010-01-01

    A system for the analysis of perching bird flocks based on computer vision technologies and the OpenCV library. Input data are high-resolution photos of perching birds against a uniform background. The number of birds, their sizes, and their positions are computed from these images. First, objects that can reasonably be assumed to be birds are found and segmented out of the background by means of computer vision methods. Analytic measurements are made on these objects, and the results are saved to a database. Gathere...

  14. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse and optimise power distribution systems which, combined with automation, underpin the emerging concept of the "smart grid". This book is supported by theoretical concepts with real-world applications and MATLAB exercises.

  15. A Systematic, Automated Network Planning Method

    DEFF Research Database (Denmark)

    Holm, Jens Åge; Pedersen, Jens Myrup

    2006-01-01

    This paper describes a case study conducted to evaluate the viability of a systematic, automated network planning method. The motivation for developing the network planning method was that many data networks are planned in an ad hoc manner with no assurance of quality of the solution with respect to consistency and long-term characteristics. The developed method gives significant improvements on these parameters. The case study was conducted as a comparison between an existing network where the traffic was known and a proposed network designed by the developed method. It turned out that the proposed network performed better than the existing network with regard to the performance measurements used, which reflected how well the traffic was routed in the networks and the cost of establishing the networks. Challenges that need to be solved before the developed method can be used to design network...

  16. Engineering systems for novel automation methods

    International Nuclear Information System (INIS)

    Modern automation methods for optimal control, state reconstruction, or parameter identification require a discrete dynamic path model. This is established, among other approaches, by time and spatial discretisation of a system of partial differential equations. The digital wave filter principle is particularly suitable for this purpose, since the numeric stability of the derived algorithms can be easily guaranteed, and their robustness against the effects of word-length limitations can be proven. This principle is also particularly attractive in that it can be integrated very well into currently existing engineering systems for instrumentation and control. (orig./CB)

  17. Automated macromolecular crystal detection system and method

    Science.gov (United States)

    Christian, Allen T.; Segelke, Brent; Rupp, Bernard; Toppani, Dominique

    2007-06-05

    An automated macromolecular method and system for detecting crystals in two-dimensional images, such as light microscopy images obtained from an array of crystallization screens. Edges are detected from the images by identifying local maxima of a phase congruency-based function associated with each image. The detected edges are segmented into discrete line segments, which are subsequently geometrically evaluated with respect to each other to identify any crystal-like qualities such as, for example, parallel lines, facing each other, similarity in length, and relative proximity. From this evaluation, a determination is made as to whether crystals are present in each image.

  18. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

    This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  19. A semi-automated method for the detection of seismic anisotropy at depth via receiver function analysis

    Science.gov (United States)

    Licciardi, A.; Piana Agostinetti, N.

    2016-06-01

    Information about seismic anisotropy is embedded in the variation of the amplitude of the Ps pulses as a function of the azimuth, on both the Radial and the Transverse components of teleseismic receiver functions (RF). We develop a semi-automatic method to constrain the presence and the depth of anisotropic layers beneath a single seismic broad-band station. An algorithm is specifically designed to avoid trial and error methods and subjective crustal parametrizations in RF inversions, providing a suitable tool for large-size data set analysis. The algorithm couples together information extracted from a 1-D VS profile and from a harmonic decomposition analysis of the RF data set. This information is used to determine the number of anisotropic layers and their approximate position at depth, which, in turn, can be used to, for example, narrow the search boundaries for layer thickness and S-wave velocity in a subsequent parameter space search. Here, the output of the algorithm is used to invert an RF data set by means of the Neighbourhood Algorithm (NA). To test our methodology, we apply the algorithm to both synthetic and observed data. We make use of synthetic RF with correlated Gaussian noise to investigate the resolution power for multiple and thin (1-3 km) anisotropic layers in the crust. The algorithm successfully identifies the number and position of anisotropic layers at depth prior to the NA inversion step. In the NA inversion, strength of anisotropy and orientation of the symmetry axis are correctly retrieved. Then, the method is applied to field measurements from station BUDO in the Tibetan Plateau. Two consecutive layers of anisotropy are automatically identified with our method in the first 25-30 km of the crust. The data are then inverted with the retrieved parametrization. The direction of the anisotropic axis in the uppermost layer correlates well with the orientation of the major planar structure in the area. The deeper anisotropic layer is associated with
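
    The harmonic decomposition step can be illustrated with a least-squares fit of constant, 1-theta and 2-theta terms to amplitudes sampled over back-azimuth; the sketch below uses synthetic amplitudes and is not the authors' implementation.

```python
import numpy as np

def harmonic_decomposition(azimuth_deg, amplitude):
    """Least-squares fit of constant, 1-theta and 2-theta harmonics to
    receiver-function amplitudes sampled at different back-azimuths."""
    az = np.radians(azimuth_deg)
    G = np.column_stack([np.ones_like(az),
                         np.cos(az), np.sin(az),
                         np.cos(2 * az), np.sin(2 * az)])
    coeffs, *_ = np.linalg.lstsq(G, amplitude, rcond=None)
    return coeffs  # [constant, cos(az), sin(az), cos(2az), sin(2az)] terms

# Hypothetical amplitudes with a 2-theta (anisotropy-like) pattern plus noise:
az = np.arange(0, 360, 15)
noise = np.random.default_rng(2).normal(0.0, 0.005, az.size)
amp = 0.1 + 0.05 * np.cos(2 * np.radians(az - 30)) + noise
print(harmonic_decomposition(az, amp))
```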

  20. Analysis of an automated background correction method for cardiovascular MR phase contrast imaging in children and young adults

    Energy Technology Data Exchange (ETDEWEB)

    Rigsby, Cynthia K.; Hilpipre, Nicholas; Boylan, Emma E.; Popescu, Andrada R.; Deng, Jie [Ann and Robert H. Lurie Children' s Hospital of Chicago, Department of Medical Imaging, Chicago, IL (United States); McNeal, Gary R. [Siemens Medical Solutions USA Inc., Customer Solutions Group, Cardiovascular MR R and D, Chicago, IL (United States); Zhang, Gang [Ann and Robert H. Lurie Children' s Hospital of Chicago Research Center, Biostatistics Research Core, Chicago, IL (United States); Choi, Grace [Ann and Robert H. Lurie Children' s Hospital of Chicago, Department of Pediatrics, Chicago, IL (United States); Greiser, Andreas [Siemens AG Healthcare Sector, Erlangen (Germany)

    2014-03-15

    Phase contrast magnetic resonance imaging (MRI) is a powerful tool for evaluating vessel blood flow. Inherent errors in acquisition, such as phase offset, eddy currents and gradient field effects, can cause significant inaccuracies in flow parameters. These errors can be rectified with the use of background correction software. To evaluate the performance of an automated phase contrast MRI background phase correction method in children and young adults undergoing cardiac MR imaging. We conducted a retrospective review of patients undergoing routine clinical cardiac MRI including phase contrast MRI for flow quantification in the aorta (Ao) and main pulmonary artery (MPA). When phase contrast MRI of the right and left pulmonary arteries was also performed, these data were included. We excluded patients with known shunts and metallic implants causing visible MRI artifact and those with more than mild to moderate aortic or pulmonary stenosis. Phase contrast MRI of the Ao, mid MPA, proximal right pulmonary artery (RPA) and left pulmonary artery (LPA) using 2-D gradient echo Fast Low Angle SHot (FLASH) imaging was acquired during normal respiration with retrospective cardiac gating. Standard phase image reconstruction and the automatic spatially dependent background-phase-corrected reconstruction were performed on each phase contrast MRI dataset. Non-background-corrected and background-phase-corrected net flow, forward flow, regurgitant volume, regurgitant fraction, and vessel cardiac output were recorded for each vessel. We compared standard non-background-corrected and background-phase-corrected mean flow values for the Ao and MPA. The ratio of pulmonary to systemic blood flow (Qp:Qs) was calculated for the standard non-background and background-phase-corrected data and these values were compared to each other and for proximity to 1. In a subset of patients who also underwent phase contrast MRI of the MPA, RPA, and LPA a comparison was made between standard non
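
    The Qp:Qs check described above is a simple ratio of net flows; a minimal sketch with hypothetical flow values (not patient data) is:

```python
def qp_qs(mpa_net_flow_ml_per_min, ao_net_flow_ml_per_min):
    """Pulmonary-to-systemic flow ratio from phase-contrast net flows;
    in the absence of a shunt the ratio should be close to 1."""
    return mpa_net_flow_ml_per_min / ao_net_flow_ml_per_min

# Hypothetical net flows before and after background phase correction:
print("uncorrected Qp:Qs =", round(qp_qs(4300.0, 3800.0), 2))
print("corrected   Qp:Qs =", round(qp_qs(4050.0, 3980.0), 2))
```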

  1. Analysis of an automated background correction method for cardiovascular MR phase contrast imaging in children and young adults

    International Nuclear Information System (INIS)

    Phase contrast magnetic resonance imaging (MRI) is a powerful tool for evaluating vessel blood flow. Inherent errors in acquisition, such as phase offset, eddy currents and gradient field effects, can cause significant inaccuracies in flow parameters. These errors can be rectified with the use of background correction software. To evaluate the performance of an automated phase contrast MRI background phase correction method in children and young adults undergoing cardiac MR imaging. We conducted a retrospective review of patients undergoing routine clinical cardiac MRI including phase contrast MRI for flow quantification in the aorta (Ao) and main pulmonary artery (MPA). When phase contrast MRI of the right and left pulmonary arteries was also performed, these data were included. We excluded patients with known shunts and metallic implants causing visible MRI artifact and those with more than mild to moderate aortic or pulmonary stenosis. Phase contrast MRI of the Ao, mid MPA, proximal right pulmonary artery (RPA) and left pulmonary artery (LPA) using 2-D gradient echo Fast Low Angle SHot (FLASH) imaging was acquired during normal respiration with retrospective cardiac gating. Standard phase image reconstruction and the automatic spatially dependent background-phase-corrected reconstruction were performed on each phase contrast MRI dataset. Non-background-corrected and background-phase-corrected net flow, forward flow, regurgitant volume, regurgitant fraction, and vessel cardiac output were recorded for each vessel. We compared standard non-background-corrected and background-phase-corrected mean flow values for the Ao and MPA. The ratio of pulmonary to systemic blood flow (Qp:Qs) was calculated for the standard non-background and background-phase-corrected data and these values were compared to each other and for proximity to 1. In a subset of patients who also underwent phase contrast MRI of the MPA, RPA, and LPA a comparison was made between standard non

  2. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  3. Optimization-based Method for Automated Road Network Extraction

    Energy Technology Data Exchange (ETDEWEB)

    Xiong, D

    2001-09-18

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper is to review literature on the subject of road extraction and to describe a study of an optimization-based method for automated road network extraction.

  4. Optimization-based Method for Automated Road Network Extraction

    International Nuclear Information System (INIS)

    Automated road information extraction has significant applicability in transportation. It provides a means for creating, maintaining, and updating transportation network databases that are needed for purposes ranging from traffic management to automated vehicle navigation and guidance. This paper is to review literature on the subject of road extraction and to describe a study of an optimization-based method for automated road network extraction

  5. Cooling method with automated seasonal freeze protection

    Energy Technology Data Exchange (ETDEWEB)

    Cambell, Levi; Chu, Richard; David, Milnes; Ellsworth, Jr, Michael; Iyengar, Madhusudan; Simons, Robert; Singh, Prabjit; Zhang, Jing

    2016-05-31

    An automated multi-fluid cooling method is provided for cooling an electronic component(s). The method includes obtaining a coolant loop, and providing a coolant tank, multiple valves, and a controller. The coolant loop is at least partially exposed to outdoor ambient air temperature(s) during normal operation, and the coolant tank includes first and second reservoirs containing first and second fluids, respectively. The first fluid freezes at a lower temperature than the second, the second fluid has superior cooling properties compared with the first, and the two fluids are soluble. The multiple valves are controllable to selectively couple the first or second fluid into the coolant in the coolant loop, wherein the coolant includes at least the second fluid. The controller automatically controls the valves to vary first fluid concentration level in the coolant loop based on historical, current, or anticipated outdoor air ambient temperature(s) for a time of year.
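
    As a rough illustration of the control idea (not the patented logic), one could map an anticipated minimum outdoor temperature to an antifreeze fraction via a lookup table with a safety margin; the table values below are invented for the example.

```python
def target_antifreeze_fraction(forecast_min_temp_c, freeze_margin_c=5.0):
    """Choose a first-fluid (antifreeze) concentration for the coolant loop
    from the anticipated minimum outdoor temperature plus a safety margin.
    Each table entry is (lowest protected temperature in degC, volume fraction);
    the values are illustrative, not taken from the patent."""
    design_temp = forecast_min_temp_c - freeze_margin_c
    protection_table = [(5, 0.0), (-5, 0.2), (-20, 0.35), (-40, 0.5)]
    for protected_down_to, fraction in protection_table:
        if protected_down_to <= design_temp:
            return fraction
    raise ValueError("Forecast below the range covered by the table")

for t in (-30.0, -10.0, 10.0):
    print(f"{t:+.0f} degC forecast -> fraction {target_antifreeze_fraction(t)}")
```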

  6. A Method for Automated Program Code Testing

    Directory of Open Access Journals (Sweden)

    Sigitas DRĄSUTIS

    2010-10-01

    Full Text Available The Internet has encouraged society to convert almost all of its needs to electronic resources such as e-libraries, e-culture, e-entertainment and e-learning, which has become a radical idea for increasing the effectiveness of learning services in most schools, colleges and universities. E-learning cannot be complete without e-testing. However, in many cases e-testing tools are suitable only for traditional/theoretical knowledge testing, covered by such items as questions, quizzes, matching boxes and the like. The article "A Method for Automated Program Code Testing" addresses the lack of such functions in e-testing systems and suggests e-assessment possibilities for students who study computer science, especially programming. The article analyzes a method that allows answers to be entered freely, checks program syntax during testing, and enables automatic checking and evaluation of the written code.
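
    The article's own system is not available here; as a generic illustration of automated program-code testing, the sketch below runs a submitted Python program against input/output test cases. The submission, tests, and timeout are hypothetical, and a real e-testing system would also sandbox the process and report syntax errors separately.

```python
import os
import subprocess
import tempfile

def run_submission(source_code, test_cases):
    """Run a submitted Python program against (stdin, expected stdout) pairs
    and report how many tests pass."""
    passed = 0
    with tempfile.NamedTemporaryFile("w", suffix=".py", delete=False) as f:
        f.write(source_code)
        path = f.name
    try:
        for stdin_text, expected in test_cases:
            proc = subprocess.run(["python3", path], input=stdin_text,
                                  capture_output=True, text=True, timeout=5)
            passed += int(proc.stdout.strip() == expected.strip())
    finally:
        os.unlink(path)
    return passed, len(test_cases)

# Hypothetical student submission and tests:
submission = "a, b = map(int, input().split())\nprint(a + b)\n"
tests = [("2 3", "5"), ("10 -4", "6")]
print(run_submission(submission, tests))
```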

  7. Automated quantification and analysis of mandibular asymmetry

    DEFF Research Database (Denmark)

    Darvann, T. A.; Hermann, N. V.; Larsen, P.; Ólafsdóttir, Hildur; Hansen, I. V.; Hove, H. D.; Christensen, L.; Rueckert, D.; Kreiborg, S.

    We present an automated method of spatially detailed 3D asymmetry quantification in mandibles extracted from CT and apply it to a population of infants with unilateral coronal synostosis (UCS). An atlas-based method employing non-rigid registration of surfaces is used for determining deformation...

  8. An Automated Method for Ozonesonde Calibration: New Insights

    Science.gov (United States)

    Schmidlin, F. J.; Hoegger, Bruno A.; Levrat, Gilbert; Baldwin, Tony

    2008-01-01

    An automated method for preparation of the electrochemical concentration cell (ECC) ozonesonde is presented. Development of a computer-controlled system for preparation and calibration of the ECC is an improvement over the manual preparation method and reduces subjectivity considerably. Preparation measurements in digital form aid analysis of the ECC before release and enhance post-flight data certification. Calibration of ozonesondes over a range of ozone concentrations between 0 mPa and 30 mPa is discussed. This presentation describes the automated system and gives examples of calibrations. The automated system enables comparison of varying potassium iodide (KI) concentrations, which should allow adjustment of earlier ozonesonde data obtained with the different KI concentrations used since 1970, i.e., 2, 1.5, 1, and 0.5 percent. Preliminary results indicate ECC accuracy has a strong dependence on the electrolyte concentration and should not be considered linear with altitude.

  9. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, the analytical analysis, and the data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs.

  10. Automated method for the direct analysis of 8-oxo-guanosine and 8-oxo-2 '-deoxyguanosine in human urine using ultraperformance liquid chromatography and tandem mass spectrometry

    DEFF Research Database (Denmark)

    Henriksen, T.; Hillestrom, P.R.; Poulsen, Henrik Enghusen; Weimann, A.

    2009-01-01

    The potential use of oxidative stress-induced DNA and RNA damage products as biomarkers is an important aspect of biomedical research. There is a need for assays with high specificity and sensitivity that also can be used in molecular epidemiology studies with a large number of subjects. In addition there is a need for assays that can measure more than one product from DNA oxidation. We present a sensitive, precise, and accurate method for quantitative analysis of the oxidized nucleosides 8-oxoGuo and 8-oxodG in human urine. The assay is based on automated sample handling using a BIOMEK 3000 Workstation, and UPLC-ESI(+)-MS/MS analysis. High specificity is evidenced by the use of qualifier ions for both analytes. The quantification limit in urine samples is 1 nM for both analytes. Accuracy and precision were documented, showing average recoveries of 106.2% (8-oxoGuo) and 106.9% (8-oxodG), and...

  11. Automated method for the direct analysis of 8-oxo-guanosine and 8-oxo-2'-deoxyguanosine in human urine using ultraperformance liquid chromatography and tandem mass spectrometry

    DEFF Research Database (Denmark)

    Henriksen, Trine; Hillestrøm, Peter René; Poulsen, Henrik E; Weimann, Allan

    2009-01-01

    The potential use of oxidative stress-induced DNA and RNA damage products as biomarkers is an important aspect of biomedical research. There is a need for assays with high specificity and sensitivity that also can be used in molecular epidemiology studies with a large number of subjects. In addition there is a need for assays that can measure more than one product from DNA oxidation. We present a sensitive, precise, and accurate method for quantitative analysis of the oxidized nucleosides 8-oxoGuo and 8-oxodG in human urine. The assay is based on automated sample handling using a BIOMEK 3000 Workstation, and UPLC-ESI(+)-MS/MS analysis. High specificity is evidenced by the use of qualifier ions for both analytes. The quantification limit in urine samples is 1 nM for both analytes. Accuracy and precision were documented, showing average recoveries of 106.2% (8-oxoGuo) and 106.9% (8-oxodG), and...

  12. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    Science.gov (United States)

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R.; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manual gating, and sample classification to determine if analysis pipelines can identify characteristics that correlate with external variables (e.g., clinical outcome). This analysis presents the results of the first of these challenges. Several methods performed well compared to manual gating or external variables using statistical performance measures, suggesting that automated methods have reached a sufficient level of maturity and accuracy for reliable use in FCM data analysis. PMID:23396282

  13. Automated Technology for Verification and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of...

  14. An approach to automated chromosome analysis

    International Nuclear Information System (INIS)

    The approaches developed with a view to automatic processing of the different stages of chromosome analysis are described in this study, which is divided into three parts. Part 1 relates the study of automated selection of metaphase spreads, which operates a decision process in order to reject all the non-pertinent images and keep the good ones. This approach has been achieved by writing a simulation program that allowed the proper selection algorithms to be established in order to design a kit of electronic logical units. Part 2 deals with the automatic processing of the morphological study of the chromosome complements in a metaphase: the metaphase photographs are processed by an optical-to-digital converter which extracts the image information and writes it out as a digital data set on a magnetic tape. For one metaphase image this data set includes some 200 000 grey values, encoded according to a 16-, 32- or 64-grey-level scale, and is processed by a pattern recognition program isolating the chromosomes and investigating their characteristic features (arm tips, centromere areas), in order to obtain measurements equivalent to the lengths of the four arms. Part 3 studies a program of automated karyotyping by optimized pairing of human chromosomes. The data are derived from direct digitizing of the arm lengths by means of a BENSON digital reader. The program supplies: 1/ a list of the pairs, 2/ a graphic representation of the pairs so constituted according to their respective lengths and centromeric indexes, and 3/ another BENSON graphic drawing according to the author's own representation of the chromosomes, i.e. crosses with orthogonal arms, each branch being the accurate measurement of the corresponding chromosome arm. This conventionalized karyotype indicates on the last line the really abnormal or non-standard images left unpaired by the program, which are of special interest for the biologist. (author)
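
    The optimized pairing of Part 3 can be sketched as follows: each chromosome is summarized by its total length and its centromeric index (short-arm length over total length), and chromosomes are paired greedily by similarity of those two numbers. This is a simplified stand-in for the original pairing procedure, with the greedy cost function and the measurements below as illustrative assumptions.

    import numpy as np

    def centromeric_index(p_arm, q_arm):
        """Short-arm length divided by total chromosome length."""
        return p_arm / (p_arm + q_arm)

    def pair_chromosomes(lengths, indexes):
        """Greedily pair chromosomes by similarity of total length and
        centromeric index; returns index pairs plus any unpaired leftovers."""
        unused = list(np.argsort(lengths)[::-1])           # longest first
        pairs, unpaired = [], []
        while len(unused) > 1:
            i = unused.pop(0)
            costs = [abs(lengths[i] - lengths[j]) / lengths[i]
                     + abs(indexes[i] - indexes[j]) for j in unused]
            j = unused.pop(int(np.argmin(costs)))
            pairs.append((int(i), int(j)))
        unpaired.extend(int(k) for k in unused)
        return pairs, unpaired

    # Hypothetical arm measurements (arbitrary units)
    p = np.array([2.1, 2.0, 3.5, 3.4, 1.0, 1.1])
    q = np.array([4.0, 4.1, 3.6, 3.7, 2.9, 3.0])
    print(pair_chromosomes(p + q, centromeric_index(p, q)))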

  15. Analysis methods for neutron-induced resonances in time-of-flight transmission experiments, and the automation of these methods on the IBM 7094 II computer

    International Nuclear Information System (INIS)

    The analysis of neutron-induced resonances aims to determine the resonance characteristics: the excitation energies, the de-excitation probabilities by gamma radiation emission, by neutron emission or by fission, their spin, their parity... This document describes the methods developed, or adapted, the calculation schemes and the algorithms implemented to carry out such analysis on a computer, from data obtained during time-of-flight experiments on the linear accelerator of Saclay. (A.L.B.)

  16. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings

    Science.gov (United States)

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill

    2014-01-01

    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  17. Principles and methods for automated palynology.

    Science.gov (United States)

    Holt, K A; Bennett, K D

    2014-08-01

    Pollen grains are microscopic so their identification and quantification has, for decades, depended upon human observers using light microscopes: a labour-intensive approach. Modern improvements in computing and imaging hardware and software now bring automation of pollen analyses within reach. In this paper, we provide the first review in over 15 yr of progress towards automation of the part of palynology concerned with counting and classifying pollen, bringing together literature published from a wide spectrum of sources. We consider which attempts offer the most potential for an automated palynology system for universal application across all fields of research concerned with pollen classification and counting. We discuss what is required to make the datasets of these automated systems as acceptable as those produced by human palynologists, and present suggestions for how automation will generate novel approaches to counting and classifying pollen that have hitherto been unthinkable. PMID:25180326

  18. Automated Functional Analysis in Dynamic Medical Imaging

    Czech Academy of Sciences Publication Activity Database

    Tichý, Ondřej

    Prague: Katedra matematiky, FSv ČVUT v Praze, 2012, pp. 19-20. [Applied Mathematics – Rektorys Competition. Prague (CZ), 07.12.2012] Institutional support: RVO:67985556 Keywords : Factor Analysis * Dynamic Sequence * Scintigraphy Subject RIV: BB - Applied Statistics, Operational Research http://library.utia.cas.cz/separaty/2012/AS/tichy-automated functional analysis in dynamic medical imaging.pdf

  19. Automated topology classification method for instantaneous velocity fields

    Energy Technology Data Exchange (ETDEWEB)

    Depardon, S. [Direction de la Recherche et de l' Innovation Automobile, PSA Peugeot Citroen, Velizy-Villacoublay Cedex (France); Laboratoire d' Etudes Aerodynamiques, Teleport 2, 1 Av. Clement Ader, BP 40109, Futuroscope Chasseneuil (France); Lasserre, J.J. [Direction de la Recherche et de l' Innovation Automobile, PSA Peugeot Citroen, Velizy-Villacoublay Cedex (France); Brizzi, L.E.; Boree, J. [Laboratoire d' Etudes Aerodynamiques, Teleport 2, 1 Av. Clement Ader, BP 40109, Futuroscope Chasseneuil (France)

    2007-05-15

    Topological concepts provide highly comprehensible representations of the main features of a flow with a limited number of elements. This paper presents an automated classification method for instantaneous velocity fields based on the analysis of their critical-point distribution and feature flow fields. It uses the fact that topological changes of a velocity field are continuous in time to extract large-scale periodic phenomena from insufficiently time-resolved datasets. This method is applied to two test cases: an analytical flow field and PIV planes acquired downstream of a wall-mounted cube. (orig.)
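
    The critical-point analysis underlying such a classification can be sketched as follows: grid cells where both velocity components change sign are flagged, and the eigenvalues of the local velocity-gradient tensor separate saddles from nodes and foci. The regular-grid layout and the finite-difference Jacobian below are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def classify_critical_points(u, v, dx=1.0, dy=1.0):
        """Locate and classify critical points of a 2-D velocity field (u, v)
        sampled on a regular grid indexed as [row = y, col = x]."""
        dudy, dudx = np.gradient(u, dy, dx)
        dvdy, dvdx = np.gradient(v, dy, dx)
        points = []
        for i in range(u.shape[0] - 1):
            for j in range(u.shape[1] - 1):
                cu, cv = u[i:i + 2, j:j + 2], v[i:i + 2, j:j + 2]
                # both components change sign inside this cell -> critical point nearby
                if cu.min() < 0 < cu.max() and cv.min() < 0 < cv.max():
                    jac = np.array([[dudx[i, j], dudy[i, j]],
                                    [dvdx[i, j], dvdy[i, j]]])
                    eig = np.linalg.eigvals(jac)
                    if np.all(np.isreal(eig)) and eig.real.prod() < 0:
                        kind = "saddle"
                    elif np.all(np.isreal(eig)):
                        kind = "node"
                    else:
                        kind = "focus/centre"   # complex eigenvalues
                    points.append((i, j, kind))
        return points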

  20. Rapid, automated online SPE-LC-QTRAP-MS/MS method for the simultaneous analysis of 14 phthalate metabolites and 5 bisphenol analogues in human urine.

    Science.gov (United States)

    Heffernan, A L; Thompson, K; Eaglesham, G; Vijayasarathy, S; Mueller, J F; Sly, P D; Gomez, M J

    2016-05-01

    Phthalates and bisphenol A (BPA) have received special attention in recent years due to their frequent use in consumer products and potential for adverse effects on human health. BPA is being replaced with a number of alternatives, including bisphenol S, bisphenol B, bisphenol F and bisphenol AF. These bisphenol analogues have similar potential for adverse health effects, but studies on human exposure are limited. Accurate measurement of multiple contaminants is important for estimating exposure. This paper describes a sensitive and automated method for the simultaneous determination of 14 phthalate metabolites, BPA and four bisphenol analogues in urine using online solid phase extraction coupled with high-performance liquid chromatography/tandem mass spectrometry using a hybrid triple-quadrupole linear ion trap mass spectrometer (LC-QTRAP-MS/MS), requiring very little sample volume (50 µL). Quantification was performed under selected reaction monitoring (SRM) mode with negative electrospray ionization. The use of SRM combined with an enhanced product ion scan within the same analysis was examined. Unequivocal identification was provided by the acquisition of three SRM transitions per compound and isotope dilution. The analytical performance of the method was evaluated in synthetic and human urine. Linearity of response over three orders of magnitude was demonstrated for all of the compounds (R² > 0.99), with method detection limits of 0.01-0.5 ng/mL and limits of reporting of 0.07-3.1 ng/mL. Accuracy ranged from 93% to 113% and inter- and intra-day precision were <22%. Finally, the validated method has been successfully applied to a cohort of pregnant women to measure biomarker concentrations of phthalates and bisphenols, with median concentrations ranging from 0.3 ng/mL (bisphenol S) to 18.5 ng/mL (monoethyl phthalate). PMID:26946031

  1. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
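
    The questionnaire-based assessment can be illustrated with a toy sketch: yes/no answers about security controls yield a qualitative vulnerability level per target, which is then combined with a qualitative impact level into a qualitative risk. The three-level scale and the max-of-ranks combination rule below are crude stand-ins for the linguistic algebra mentioned above, not the actual rules of the system.

    # Illustrative qualitative levels; the real system uses its own linguistic scale.
    LEVELS = ["low", "medium", "high"]

    def combine(vulnerability, impact):
        """Combine qualitative vulnerability and impact into a qualitative risk
        using a simple max-of-ranks rule."""
        return LEVELS[max(LEVELS.index(vulnerability), LEVELS.index(impact))]

    def assess(questionnaire, impacts):
        """questionnaire: {target: list of True/False answers to control questions}
        impacts: {target: qualitative impact for the threat under consideration}"""
        risks = {}
        for target, answers in questionnaire.items():
            fraction_missing = answers.count(False) / len(answers)
            vulnerability = LEVELS[min(2, int(fraction_missing * 3))]
            risks[target] = combine(vulnerability, impacts[target])
        return risks

    print(assess({"accounting_host": [True, False, False, True]},
                 {"accounting_host": "high"}))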

  2. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  3. Automating Risk Analysis of Software Design Models

    OpenAIRE

    Maxime Frydman; Guifré Ruiz; Elisa Heymann; Eduardo César; Barton P. Miller

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security e...

  4. Computer automated movement detection for the analysis of behavior

    OpenAIRE

    Ramazani, Roseanna B.; Harish R Krishnan; BERGESON, SUSAN E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtractio...
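
    The digital-subtraction step referred to above is straightforward to sketch: successive frames are subtracted and the fraction of changed pixels serves as a movement score. The sketch below uses Python with Pillow and NumPy rather than the Perl/ImageMagick pipeline of the original method, and the frame file names are placeholders.

    import numpy as np
    from PIL import Image

    def movement_score(frame_a, frame_b, threshold=25):
        """Fraction of pixels whose grey value changed by more than `threshold`
        between two frames -- a simple proxy for locomotor activity."""
        a = np.asarray(Image.open(frame_a).convert("L"), dtype=np.int16)
        b = np.asarray(Image.open(frame_b).convert("L"), dtype=np.int16)
        return float((np.abs(a - b) > threshold).mean())

    # One score per pair of consecutive frames captured at regular intervals
    scores = [movement_score(f"frame_{i:04d}.png", f"frame_{i + 1:04d}.png")
              for i in range(99)]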

  5. ENHANCEMENT OF METHODICAL BACKGROUND FOR AUTOMATION WITH ENTERPRISE SPECIFICATION

    OpenAIRE

    Pisarchuk, O.; Uvarova, V.

    2010-01-01

    The article reviews the key methodical approaches to building up an automated accounting information system with enterprise specification, using the program product «1C:Accounting» as an example. It shows the advantages and disadvantages of each key methodical approach.

  6. Flux-P: Automating Metabolic Flux Analysis

    Directory of Open Access Journals (Sweden)

    Birgitta E. Ebert

    2012-11-01

    Full Text Available Quantitative knowledge of intracellular fluxes in metabolic networks is invaluable for inferring metabolic system behavior and the design principles of biological systems. However, intracellular reaction rates often cannot be calculated directly but have to be estimated; for instance, via 13C-based metabolic flux analysis, a model-based interpretation of stable carbon isotope patterns in intermediates of metabolism. Existing software such as FiatFlux, OpenFLUX or 13CFLUX supports experts in this complex analysis, but requires several steps that have to be carried out manually, hence restricting the use of this software for data interpretation to a rather small number of experiments. In this paper, we present Flux-P as an approach to automate and standardize 13C-based metabolic flux analysis, using the Bio-jETI workflow framework. Exemplarily based on the FiatFlux software, it demonstrates how services can be created that carry out the different analysis steps autonomously and how these can subsequently be assembled into software workflows that perform automated, high-throughput intracellular flux analysis of high quality and reproducibility. Besides significant acceleration and standardization of the data analysis, the agile workflow-based realization supports flexible changes of the analysis workflows on the user level, making it easy to perform custom analyses.

  7. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.

  8. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance. PMID:25136688

  9. Streamlining and automation of radioanalytical methods at a commercial laboratory

    International Nuclear Information System (INIS)

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed

  10. Streamlining and automation of radioanalytical methods at a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, J.T.; Dillard, J.W. [IT Corp., Knoxville, TN (United States)

    1993-12-31

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  11. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be unfeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogenous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to
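
    A proactive trend check of the kind argued for above can be as simple as fitting a linear drift to a telemetry channel and flagging when the extrapolated value would cross a limit within a chosen horizon; the battery-temperature channel, limit and horizon below are hypothetical.

    import numpy as np

    def trend_alert(times, values, limit, horizon):
        """Fit a linear trend to one telemetry channel and flag a future limit
        violation.  times, values: samples; limit: red-line value; horizon: how
        far ahead (same time units as `times`) to extrapolate."""
        slope, intercept = np.polyfit(times, values, 1)
        forecast = slope * (times[-1] + horizon) + intercept
        return forecast >= limit, slope, forecast

    t = np.arange(0.0, 100.0)                                    # hypothetical sample times
    temp = 20.0 + 0.05 * t + np.random.normal(0, 0.2, t.size)    # slowly drifting channel
    alert, slope, forecast = trend_alert(t, temp, limit=30.0, horizon=120.0)
    print(f"drift {slope:.3f} units/step, forecast {forecast:.1f} -> alert={alert}")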

  12. A Novel Automated Method for Analyzing Cylindrical Computed Tomography Data

    Science.gov (United States)

    Roth, D. J.; Burke, E. R.; Rauser, R. W.; Martin, R. E.

    2011-01-01

    A novel software method is presented that is applicable for analyzing cylindrical and partially cylindrical objects inspected using computed tomography. This method involves unwrapping and re-slicing data so that the CT data from the cylindrical object can be viewed as a series of 2-D sheets in the vertical direction in addition to volume rendering and normal plane views provided by traditional CT software. The method is based on interior and exterior surface edge detection and under proper conditions, is FULLY AUTOMATED and requires no input from the user except the correct voxel dimension from the CT scan. The software is available from NASA in 32- and 64-bit versions that can be applied to gigabyte-sized data sets, processing data either in random access memory or primarily on the computer hard drive. Please inquire with the presenting author if further interested. This software differentiates itself in total from other possible re-slicing software solutions due to complete automation and advanced processing and analysis capabilities.
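
    The unwrap-and-re-slice idea can be sketched as a polar resampling of each axial CT slice so that the cylinder wall becomes a flat 2-D sheet. In the sketch below the cylinder axis position and the inner/outer radii are passed in by hand; the NASA tool derives them automatically from interior and exterior surface edge detection.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def unwrap_slice(slice2d, center, r_inner, r_outer, n_theta=720, n_r=64):
        """Resample one axial CT slice from Cartesian to (radius, angle)
        coordinates; stacking the results over all slices yields the 2-D
        'sheets' described above."""
        cy, cx = center
        radii = np.linspace(r_inner, r_outer, n_r)
        thetas = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
        rr, tt = np.meshgrid(radii, thetas, indexing="ij")
        rows = cy + rr * np.sin(tt)
        cols = cx + rr * np.cos(tt)
        # bilinear interpolation at the polar sample locations
        return map_coordinates(slice2d.astype(float), [rows, cols], order=1)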

  13. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.

  14. Comparison of Manual Versus Automated Data Collection Method for an Evidence-Based Nursing Practice Study

    Science.gov (United States)

    Byrne, M.D.; Jordan, T.R.; Welle, T.

    2013-01-01

    Objective The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. Methods A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually-collected data set from the EBP project. Results Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 “false negative” patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error including computational and transcription errors as well as incomplete selection of eligible patients. Conclusion Automated data collection for analysis of nursing-specific phenomenon is potentially superior to manual data collection methods. Creation of automated reports and analysis may require initial up-front investment with collaboration between clinicians, researchers and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare. PMID:23650488

  15. Automated Radiochemical Separation, Analysis, and Sensing

    International Nuclear Information System (INIS)

    Chapter 14 for the 2nd edition of the Handbook of Radioactivity Analysis. The techniques and examples described in this chapter demonstrate that modern fluidic techniques and instrumentation can be used to develop automated radiochemical separation workstations. In many applications, these can be mechanically simple and key parameters can be controlled from software. If desired, many of the fluidic components and solution can be located remotely from the radioactive samples and other hot sample processing zones. There are many issues to address in developing automated radiochemical separation that perform reliably time after time in unattended operation. These are associated primarily with the separation and analytical chemistry aspects of the process. The relevant issues include the selectivity of the separation, decontamination factors, matrix effects, and recoveries from the separation column. In addition, flow rate effects, column lifetimes, carryover from one sample to another, and sample throughput must be considered. Nevertheless, successful approaches for addressing these issues have been developed. Radiochemical analysis is required not only for processing nuclear waste samples in the laboratory, but also for at-site or in situ applications. Monitors for nuclear waste processing operations represent an at-site application where continuous unattended monitoring is required to assure effective process radiochemical separations that produce waste streams that qualify for conversion to stable waste forms. Radionuclide sensors for water monitoring and long term stewardship represent an application where at-site or in situ measurements will be most effective. Automated radiochemical analyzers and sensors have been developed that demonstrate that radiochemical analysis beyond the analytical laboratory is both possible and practical

  16. Automated Parameter Studies Using a Cartesian Method

    Science.gov (United States)

    Murman, Scott M.; Aftosimis, Michael J.; Nemec, Marian

    2004-01-01

    Computational Fluid Dynamics (CFD) is now routinely used to analyze isolated points in a design space by performing steady-state computations at fixed flight conditions (Mach number, angle of attack, sideslip), for a fixed geometric configuration of interest. This "point analysis" provides detailed information about the flowfield, which aids an engineer in understanding, or correcting, a design. A point analysis is typically performed using high fidelity methods at a handful of critical design points, e.g. a cruise or landing configuration, or a sample of points along a flight trajectory.

  17. Automated analysis of Xe-133 pulmonary ventilation (AAPV) in children

    Science.gov (United States)

    Cao, Xinhua; Treves, S. Ted

    2011-03-01

    In this study, an automated analysis of pulmonary ventilation (AAPV) was developed to visualize the ventilation in pediatric lungs using dynamic Xe-133 scintigraphy. AAPV is a software algorithm that converts a dynamic series of Xe-133 images into four functional images: equilibrium, washout halftime, residual, and clearance rate by analyzing pixel-based activity. Compared to conventional methods of calculating global or regional ventilation parameters, AAPV provides a visual representation of pulmonary ventilation functions.
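
    A per-pixel version of the washout half-time image, one of the four functional images listed above, can be sketched with a log-linear fit of the washout phase of each pixel's time-activity curve. The frame times and the washout start index are assumptions for illustration; the actual AAPV pixel model may differ.

    import numpy as np

    def washout_halftime_image(frames, times, washout_start):
        """frames: (T, H, W) array of Xe-133 counts; times: (T,) frame times.
        Returns an (H, W) image of washout half-times (same units as `times`)."""
        wash = frames[washout_start:].astype(float) + 1e-6   # avoid log(0)
        t = times[washout_start:]
        logc = np.log(wash)
        # per-pixel least-squares slope of log(counts) versus time
        t_centered = t - t.mean()
        slope = (t_centered[:, None, None] * (logc - logc.mean(axis=0))).sum(axis=0) \
                / (t_centered ** 2).sum()
        decay = np.clip(-slope, 1e-9, None)                  # keep decay rates positive
        return np.log(2.0) / decay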

  18. Automated analysis of radiation damage on plastic surfaces

    International Nuclear Information System (INIS)

    This work analyzes the damage done by radiation to a polymer characterized by the optical properties of its polished surfaces, its uniformity and its chemical resistance: acrylic, which withstands temperatures up to 150 degrees Celsius and weighs roughly half as much as glass. The objective of this work is the development of a method that analyzes, in an automated way, the surface damage induced by radiation in plastic materials by means of an image analyzer. (Author)

  19. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  20. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Highlights: • We propose an estimation method of the automation rate by taking the advantages of automation as the estimation measures. • We conduct the experiments to examine the validity of the suggested method. • The higher the cognitive automation rate is, the greater the decrease in working time will be. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. Expressions of the inclusion proportion of automation are predictable, as is the ability to express the degree of the enhancement of human performance. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method of the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures of the estimation method of the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and the cognitive automation rate were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus suggest the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which make them an ideal basis for tools targeted at non...

  2. Automated Analysis, Classification, and Display of Waveforms

    Science.gov (United States)

    Kwan, Chiman; Xu, Roger; Mayhew, David; Zhang, Frank; Zide, Alan; Bonggren, Jeff

    2004-01-01

    A computer program partly automates the analysis, classification, and display of waveforms represented by digital samples. In the original application for which the program was developed, the raw waveform data to be analyzed by the program are acquired from space-shuttle auxiliary power units (APUs) at a sampling rate of 100 Hz. The program could also be modified for application to other waveforms -- for example, electrocardiograms. The program begins by performing principal-component analysis (PCA) of 50 normal-mode APU waveforms. Each waveform is segmented. A covariance matrix is formed by use of the segmented waveforms. Three eigenvectors corresponding to three principal components are calculated. To generate features, each waveform is then projected onto the eigenvectors. These features are displayed on a three-dimensional diagram, facilitating the visualization of the trend of APU operations.
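
    The PCA step outlined above reduces each segmented waveform to a small feature vector: a covariance matrix is built from the training waveforms, the leading eigenvectors are kept, and new waveforms are projected onto them. The sketch below assumes equal-length waveform segments stored row-wise; it is a generic illustration, not the flight tool itself.

    import numpy as np

    def fit_pca_basis(training_waveforms, n_components=3):
        """training_waveforms: (n_waveforms, n_samples) array of equal-length
        segments.  Returns the mean waveform and the leading eigenvectors."""
        X = np.asarray(training_waveforms, dtype=float)
        mean = X.mean(axis=0)
        cov = np.cov(X - mean, rowvar=False)
        eigvals, eigvecs = np.linalg.eigh(cov)
        order = np.argsort(eigvals)[::-1][:n_components]
        return mean, eigvecs[:, order]

    def project(waveform, mean, basis):
        """Project one waveform onto the principal components -> feature vector
        (three numbers that can be plotted on the 3-D diagram described above)."""
        return (np.asarray(waveform, dtype=float) - mean) @ basis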

  3. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    Science.gov (United States)

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  4. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand" requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  5. ASteCA - Automated Stellar Cluster Analysis

    CERN Document Server

    Perren, Gabriel I; Piatti, Andrés E

    2014-01-01

    We present ASteCA (Automated Stellar Cluster Analysis), a suite of tools designed to fully automatize the standard tests applied on stellar clusters to determine their basic parameters. The set of functions included in the code make use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing through a statistical estimator its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its unce...

  6. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate high efficiency of both approaches in comparison to known methods.

  7. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Full Text Available Introduction: The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, laboratories currently have modern equipment which provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides concordant and discordant with results obtained using fully automated procedures. Materials and method: From January to July 2013, 1,000 slides for hematological parameters were analyzed. Automated analysis was performed on latest-generation equipment, whose methodology is based on electrical impedance and is able to quantify all the formed elements of the blood in a universe of 22 parameters. The microscopy was performed simultaneously by two experts. Results: The data showed that only 42.70% were concordant, compared with 57.30% discordant. The main findings among the discordant were: changes in red blood cells 43.70% (n = 250), white blood cells 38.46% (n = 220), and platelet counts 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained because they have not been investigated, which may compromise the final diagnosis. Conclusion: It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, causing a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  8. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduced a systematic approach for planning access network infrastructure. The GIS data and a set of algorithms were employed to make the planning process more automatic. The method explains...

  9. Automated kymograph analysis for profiling axonal transport of secretory granules.

    Science.gov (United States)

    Mukherjee, Amit; Jenkins, Brian; Fang, Cheng; Radke, Richard J; Banker, Gary; Roysam, Badrinath

    2011-06-01

    This paper describes an automated method to profile the velocity patterns of small organelles (BDNF granules) being transported along a selected section of axon of a cultured neuron imaged by time-lapse fluorescence microscopy. Instead of directly detecting the granules as in conventional tracking, the proposed method starts by generating a two-dimensional spatio-temporal map (kymograph) of the granule traffic along an axon segment. Temporal sharpening during the kymograph creation helps to highlight granule movements while suppressing clutter due to stationary granules. A voting algorithm defined over orientation distribution functions is used to refine the locations and velocities of the granules. The refined kymograph is analyzed using an algorithm inspired from the minimum set cover framework to generate multiple motion trajectories of granule transport paths. The proposed method is computationally efficient, robust to significant levels of noise and clutter, and can be used to capture and quantify trends in transport patterns quickly and accurately. When evaluated on a collection of image sequences, the proposed method was found to detect granule movement events with 94% recall rate and 82% precision compared to a time-consuming manual analysis. Further, we present a study to evaluate the efficacy of velocity profiling by analyzing the impact of oxidative stress on granule transport in which the fully automated analysis correctly reproduced the biological conclusion generated by manual analysis. PMID:21330183
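
    The kymograph-generation step described above can be sketched by sampling image intensity along a fixed axon path in every frame and stacking the 1-D profiles into a space-time map; subtracting a per-position temporal median is one simple way of suppressing stationary granules. The path coordinates are assumed to come from a separate axon trace, and this is a simplification of the paper's temporal sharpening, not its exact implementation.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def build_kymograph(frames, path_rows, path_cols):
        """frames: (T, H, W) time-lapse stack; path_rows/path_cols: coordinates
        of points along the selected axon segment.  Returns a (T, n_points)
        space-time map with stationary signal suppressed."""
        kymo = np.stack([
            map_coordinates(frame.astype(float), [path_rows, path_cols], order=1)
            for frame in frames
        ])
        # remove the per-position temporal median so moving granules stand out
        return kymo - np.median(kymo, axis=0, keepdims=True)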

  10. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more com

  11. Automating Object-Oriented Software Development Methods

    OpenAIRE

    Tekinerdogan, Bedir; SAEKI, Motoshi; Sunyé, Gerson; Broek, van den, E.; Hruby, Pavel; Frohner, Ákos

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software development methods have been defined. Nevertheless, methods often introduce complexity of their own due to their large number of artifacts, method rules and their complicated processes. We think that au...

  12. Automating Object-Oriented Software Development Methods

    OpenAIRE

    Tekinerdogan, Bedir; SAEKI, Motoshi; Sunyé, Gerson; Broek, van den, E.; Hruby, Pavel

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such, numerous object-oriented software development methods have been defined. Nevertheless, methods often introduce complexity of their own due to their large number of artifacts, method rules and their complicated processes. We think that au...

  13. Automated seismic event location by waveform coherence analysis

    OpenAIRE

    Grigoli, Francesco

    2014-01-01

    Automated location of seismic events is a very important task in microseismic monitoring operations, as well as for local and regional seismic monitoring. Since microseismic records are generally characterised by low signal-to-noise ratio, such methods are required to be noise-robust and sufficiently accurate. Most of the standard automated location routines are based on the automated picking, identification and association of the first arrivals of P and S waves and on the minimization of the re...
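
    A picking-free, stack-based formulation of the kind used in waveform-coherence location can be sketched in a few lines: for every trial source position, each station's characteristic function is shifted by the predicted travel time and the aligned traces are stacked, and the position that stacks most coherently is taken as the event location. The constant-velocity travel times and the envelope-like characteristic functions assumed below are simplifications.

    import numpy as np

    def locate_event(traces, stations, grid, dt, velocity):
        """traces: (n_sta, n_samples) characteristic functions (e.g. envelopes);
        stations, grid: (n, 3) arrays of x, y, z coordinates; dt: sample step."""
        best, best_xyz = -np.inf, None
        for xyz in grid:
            # predicted travel time from the trial source to every station
            shifts = np.linalg.norm(stations - xyz, axis=1) / velocity
            aligned = [np.roll(tr, -int(round(s / dt)))
                       for tr, s in zip(traces, shifts)]
            stack = np.max(np.mean(aligned, axis=0))   # peak of the aligned stack
            if stack > best:
                best, best_xyz = stack, xyz
        return best_xyz, best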

  14. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of the total phosphorus by a Kjeldahl digestion method and an automated colorimetric finish that includes dialysis

    Science.gov (United States)

    Patton, Charles J.; Truitt, Earl P.

    1992-01-01

    A method to determine total phosphorus (TP) in the same digests prepared for total Kjeldahl nitrogen (TKN) determinations is described. The batch, high-temperature (block digester), Hg(II)-catalyzed digestion step is similar to U.S. Geological Survey methods I-2552-85/I-4552-85 and U.S. Environmental Protection Agency method 365.4 except that sample and reagent volumes are halved. Prepared digests are desolvated at 220 degrees Celsius and digested at 370 degrees Celsius in separate block digesters set at these temperatures, rather than in a single, temperature-programmed block digester. This approach is used in the method described here, which permits 40 calibrants, reference waters, and samples to be digested and resolvated in about an hour. Orthophosphate ions originally present in samples, along with those released during the digestion step, are determined colorimetrically at a rate of 90 tests per hour by an automated version of the phosphoantimonylmolybdenum blue procedure. About 100 microliters of digest are required per determination. The upper concentration limit is 2 milligrams per liter (mg/L) with a method detection limit of 0.01 mg/L. Repeatability for a sample containing approximately 1.6 mg/L of TP in a high suspended-solids matrix is 0.7 percent. Between-day precision for the same sample is 5.0 percent. A dialyzer in the air-segmented continuous flow analyzer provides on-line digest cleanup, eliminating particulates that otherwise would interfere in the colorimetric finish. A single-channel analyzer can process the resolvated digests from two pairs of block digesters each hour. Paired t-test analysis of TP concentrations for approximately 1,600 samples determined by the new method (U.S. Geological Survey methods I-2610-91 and I-4610-91) and the old method (U.S. Geological Survey methods I-2600-85 and I-4600-85) revealed positive bias in the former of 0.02 to 0.04 mg/L for surface-water samples, in agreement with previous studies. Concentrations of total

  15. Automation literature: A brief review and analysis

    Science.gov (United States)

    Smith, D.; Dieterly, D. L.

    1980-01-01

    Current thought and research positions which may allow for an improved capability to understand the impact of introducing automation to an existing system are established. The orientation was toward the type of studies which may provide some general insight into automation; specifically, the impact of automation in human performance and the resulting system performance. While an extensive number of articles were reviewed, only those that addressed the issue of automation and human performance were selected to be discussed. The literature is organized along two dimensions: time, Pre-1970, Post-1970; and type of approach, Engineering or Behavioral Science. The conclusions reached are not definitive, but do provide the initial stepping stones in an attempt to begin to bridge the concept of automation in a systematic progression.

  16. An automated method for the layup of fiberglass fabric

    Science.gov (United States)

    Zhu, Siqi

    This dissertation presents an automated composite fabric layup solution based on a new method to deform fiberglass fabric referred to as shifting. A layup system was designed and implemented using a large robotic gantry and custom end-effector for shifting. Layup tests proved that the system can deposit fabric onto two-dimensional and three-dimensional tooling surfaces accurately and repeatedly while avoiding out-of-plane deformation. A process planning method was developed to generate tool paths for the layup system based on a geometric model of the tooling surface. The approach is analogous to Computer Numerical Controlled (CNC) machining, where Numerical Control (NC) code from a Computer-Aided Design (CAD) model is generated to drive the milling machine. Layup experiments utilizing the proposed method were conducted to validate the performance. The results show that the process planning software requires minimal time or human intervention and can generate tool paths leading to accurate composite fabric layups. Fiberglass fabric samples processed with shifting deformation were observed for meso-scale deformation. Tow thinning, bending and spacing was observed and measured. Overall, shifting did not create flaws in amounts that would disqualify the method from use in industry. This suggests that shifting is a viable method for use in automated manufacturing. The work of this dissertation provides a new method for the automated layup of broad width composite fabric that is not possible with any available composite automation systems to date.

  17. Evaluation of an automated method for urinocolture screening

    Directory of Open Access Journals (Sweden)

    Claudia Ballabio

    2010-09-01

    Full Text Available Introduction: Urinary tract infections are one of the most common diseases found in medical practice and are diagnosed with traditional methods of cultivation on plates. In this study we evaluated automated instrumentation for the screening of urine cultures that can provide results quickly and guarantee traceability. The comparison of results obtained with the automated and plate methods is reported. Methods: 316 urine samples, including midstream urine, catheter urine and bag urine, were analyzed by Alfred 60 (Alifax) through light-scattering technology that measures the replication of the bacteria. Simultaneously, the samples were seeded on agar plates (CPS3, Cled agar, MacConkey agar). Results: A total of 316 samples were analyzed by the automated method; 190 were negative, all confirmed by culture, while 126 were found positive. 82 cases were confirmed positive in plate culture, 65 with significant isolation of bacteria and 17 with polymicrobial flora at a significant load. 44 cases were negative in plate culture but positive with the automated method. Conclusions: The absence of false-negative results at low bacterial loads can represent a starting point for introducing an automated method for urinocolture screening.

  18. Automated analysis and annotation of basketball video

    Science.gov (United States)

    Saur, Drew D.; Tan, Yap-Peng; Kulkarni, Sanjeev R.; Ramadge, Peter J.

    1997-01-01

    Automated analysis and annotation of video sequences are important for digital video libraries, content-based video browsing and data mining projects. A successful video annotation system should provide users with useful video content summary in a reasonable processing time. Given the wide variety of video genres available today, automatically extracting meaningful video content for annotation still remains hard using currently available techniques. However, a wide range of video has inherent structure such that some prior knowledge about the video content can be exploited to improve our understanding of the high-level video semantic content. In this paper, we develop tools and techniques for analyzing structured video by using the low-level information available directly from MPEG compressed video. Being able to work directly in the video compressed domain can greatly reduce the processing time and enhance storage efficiency. As a testbed, we have developed a basketball annotation system which combines the low-level information extracted from the MPEG stream with the prior knowledge of basketball video structure to provide high-level content analysis, annotation and browsing for events such as wide-angle and close-up views, fast breaks, steals, potential shots, number of possessions and possession times. We expect our approach can also be extended to structured video in other domains.

  19. An automated and simple method for brain MR image extraction

    OpenAIRE

    Zhu Zixin; Liu Jiafeng; Zhang Haiyan; Li Haiyun

    2011-01-01

    Abstract Background The extraction of brain tissue from magnetic resonance head images, is an important image processing step for the analyses of neuroimage data. The authors have developed an automated and simple brain extraction method using an improved geometric active contour model. Methods The method uses an improved geometric active contour model which can not only solve the boundary leakage problem but also is less sensitive to intensity inhomogeneity. The method defines the initial fu...

  20. Automated Protein Assay Using Flow Injection Analysis

    Science.gov (United States)

    Wolfe, Carrie A. C.; Oates, Matthew R.; Hage, David S.

    1998-08-01

    The technique of flow injection analysis (FIA) is a common instrumental method used in detecting a variety of chemical and biological agents. This paper describes an undergraduate laboratory that uses FIA to perform a bicinchoninic acid (BCA) colorimetric assay for quantitating protein samples. The method requires less than 2 min per sample injection and gives a response over a broad range of protein concentrations. This method can be used in instrumental analysis labs to illustrate the principles and use of FIA, or as a means for introducing students to common methods employed in the analysis of biological agents.

  1. Statistical Analysis of Filament Features Based on the H{\\alpha} Solar Images from 1988 to 2013 by Computer Automated Detection Method

    CERN Document Server

    Hao, Q; Cao, W; Chen, P F

    2015-01-01

    We improve our filament automated detection method which was proposed in our previous works. It is then applied to process the full disk H$\\alpha$ data mainly obtained by Big Bear Solar Observatory (BBSO) from 1988 to 2013, spanning nearly 3 solar cycles. The butterfly diagrams of the filaments, showing the information of the filament area, spine length, tilt angle, and the barb number, are obtained. The variations of these features with the calendar year and the latitude band are analyzed. The drift velocities of the filaments in different latitude bands are calculated and studied. We also investigate the north-south (N-S) asymmetries of the filament numbers in total and in each subclass classified according to the filament area, spine length, and tilt angle. The latitudinal distribution of the filament number is found to be bimodal. About 80% of all the filaments have tilt angles within [0{\\deg}, 60{\\deg}]. For the filaments within latitudes lower (higher) than 50{\\deg} the northeast (northwest) direction i...

  2. Effective Manufacturing Method for Automated Inside Diameter Grinding

    Science.gov (United States)

    Slowinski, Bronislaw; Nadolny, Krzysztof

    This paper presents the essence and results of experimental investigations of a highly efficient automated internal cylindrical grinding method. The essence of this method consists in removing the whole grinding allowance in a single pass of the grinding wheel while preserving the required quality of the surface layer of the workpiece. The grinding wheel used in the developed method has a zonally diversified internal structure and a properly prepared conical chamfer.

  3. Automated quantitative image analysis of nanoparticle assembly

    Science.gov (United States)

    Murthy, Chaitanya R.; Gao, Bo; Tao, Andrea R.; Arya, Gaurav

    2015-05-01

    The ability to characterize higher-order structures formed by nanoparticle (NP) assembly is critical for predicting and engineering the properties of advanced nanocomposite materials. Here we develop quantitative image analysis software to characterize key structural properties of NP clusters from experimental images of nanocomposites. This analysis can be carried out on images captured at intermittent times during assembly to monitor the time evolution of NP clusters in a highly automated manner. The software outputs averages and distributions in the size, radius of gyration, fractal dimension, backbone length, end-to-end distance, anisotropic ratio, and aspect ratio of NP clusters as a function of time along with bootstrapped error bounds for all calculated properties. The polydispersity in the NP building blocks and biases in the sampling of NP clusters are accounted for through the use of probabilistic weights. This software, named Particle Image Characterization Tool (PICT), has been made publicly available and could be an invaluable resource for researchers studying NP assembly. To demonstrate its practical utility, we used PICT to analyze scanning electron microscopy images taken during the assembly of surface-functionalized metal NPs of differing shapes and sizes within a polymer matrix. PICT is used to characterize and analyze the morphology of NP clusters, providing quantitative information that can be used to elucidate the physical mechanisms governing NP assembly.
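
    PICT itself is the published tool; the fragment below is only a hedged illustration of the kind of per-cluster measurement it automates (connected-component labelling followed by size and radius-of-gyration statistics with bootstrapped error bounds), using a synthetic binary image in place of a real segmented SEM micrograph. Everything in it, from the threshold to the property list, is an assumption made for illustration.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(0)
image = rng.random((256, 256)) > 0.995                   # stand-in for a thresholded NP image
image = ndimage.binary_dilation(image, iterations=2)     # grow seeds into small "clusters"

labels, n_clusters = ndimage.label(image)                # connected components = clusters
sizes, radii = [], []
for lbl in range(1, n_clusters + 1):
    ys, xs = np.nonzero(labels == lbl)
    cy, cx = ys.mean(), xs.mean()
    sizes.append(ys.size)                                # cluster size in pixels
    radii.append(np.sqrt(((ys - cy) ** 2 + (xs - cx) ** 2).mean()))  # radius of gyration

sizes, radii = np.array(sizes), np.array(radii)

# Bootstrapped error bound on the mean cluster size, standing in for the error
# bounds the tool reports for every property.
boot = [rng.choice(sizes, sizes.size, replace=True).mean() for _ in range(1000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"{n_clusters} clusters, mean size {sizes.mean():.1f} px "
      f"(95% CI {lo:.1f}-{hi:.1f}), mean Rg {radii.mean():.2f} px")
```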

  4. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel; Frohner, A´ kos

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality. As such numerous object-oriented software develo

  5. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and for supporting quality. As such numerous object-oriented software devel

  6. ECG Artifact Removal from Surface EMG Signal Using an Automated Method Based on Wavelet-ICA.

    Science.gov (United States)

    Abbaspour, Sara; Lindén, Maria; Gholamhosseini, Hamid

    2015-01-01

    This study aims at proposing an efficient method for automated electrocardiography (ECG) artifact removal from surface electromyography (EMG) signals recorded from upper trunk muscles. The wavelet transform is applied to the simulated data set of corrupted surface EMG signals to create a multidimensional signal. Afterward, independent component analysis (ICA) is used to separate ECG artifact components from the original EMG signal. Components that correspond to the ECG artifact are then identified by an automated detection algorithm and are subsequently removed using a conventional high-pass filter. Finally, the results of the proposed method are compared with wavelet transform, ICA, adaptive filter and empirical mode decomposition-ICA methods. The automated artifact removal method proposed in this study successfully removes the ECG artifacts from EMG signals with a signal-to-noise ratio value of 9.38 while keeping the distortion of the original EMG to a minimum. PMID:25980853
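
    A minimal sketch of the wavelet-ICA idea described above is given below. The synthetic signals, wavelet choice, filter order, and the kurtosis-based rule for picking the artifact component are assumptions made for illustration; they are not the authors' exact algorithm.

```python
import numpy as np
import pywt
from scipy.signal import butter, filtfilt
from scipy.stats import kurtosis
from sklearn.decomposition import FastICA

fs, n = 1000.0, 4096                       # assumed sampling rate; length divisible by 2**level
t = np.arange(n) / fs
rng = np.random.default_rng(0)
emg = rng.normal(0.0, 1.0, n) * (1 + 0.5 * np.sin(2 * np.pi * 0.5 * t))      # stand-in EMG
ecg = sum(6.0 * np.exp(-0.5 * ((t - k) / 0.02) ** 2) for k in np.arange(0.5, t[-1], 1.0))
corrupted = emg + ecg                       # surface EMG contaminated by ECG-like pulses

# 1) Stationary wavelet transform: every band keeps the original length, so the bands
#    can be stacked as pseudo-channels of a multidimensional signal.
level = 4
coeffs = pywt.swt(corrupted, "db4", level=level)               # [(cA_i, cD_i), ...]
channels = np.vstack([c for pair in coeffs for c in pair])     # shape (2*level, n)

# 2) ICA on the wavelet channels; the spike-dominated (ECG-like) source is assumed to be
#    the most kurtotic one and is treated as the artifact component.
ica = FastICA(n_components=channels.shape[0], random_state=0, max_iter=1000)
sources = ica.fit_transform(channels.T)                        # shape (n, components)
ecg_idx = int(np.argmax(kurtosis(sources, axis=0)))

# 3) Suppress the artifact component with a conventional high-pass filter, then map back
#    through the inverse ICA and inverse wavelet transforms.
b, a = butter(4, 30.0, btype="highpass", fs=fs)
sources[:, ecg_idx] = filtfilt(b, a, sources[:, ecg_idx])
cleaned_channels = ica.inverse_transform(sources).T
cleaned = pywt.iswt([(cleaned_channels[2 * i], cleaned_channels[2 * i + 1])
                     for i in range(level)], "db4")

print("correlation with the ECG template before/after:",
      round(float(np.corrcoef(corrupted, ecg)[0, 1]), 3),
      round(float(np.corrcoef(cleaned, ecg)[0, 1]), 3))
```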

  7. Automated Traffic Management System and Method

    Science.gov (United States)

    Glass, Brian J. (Inventor); Spirkovska, Liljana (Inventor); McDermott, William J. (Inventor); Reisman, Ronald J. (Inventor); Gibson, James (Inventor); Iverson, David L. (Inventor)

    2000-01-01

    A data management system and method that enables acquisition, integration, and management of real-time data generated at different rates by multiple heterogeneous, incompatible data sources. The system achieves this functionality by using an expert system to fuse data from a variety of airline, airport operations, ramp control, and air traffic control tower sources, to establish and update reference data values for every aircraft surface operation. The system may be configured as a real-time airport surface traffic management system (TMS) that electronically interconnects air traffic control, airline data, and airport operations data to facilitate information sharing and improve taxi queuing. In the TMS operational mode, empirical data shows substantial benefits in ramp operations for airlines, reducing departure taxi times by about one minute per aircraft in operational use, translating to $12 to $15 million per year in savings to airlines at the Atlanta, Georgia airport. The data management system and method may also be used for scheduling the movement of multiple vehicles in other applications, such as marine vessels in harbors and ports, trucks or railroad cars in ports or shipping yards, and railroad cars in switching yards. Finally, the data management system and method may be used for managing containers at a shipping dock, stock on a factory floor or in a warehouse, or as a training tool for improving situational awareness of FAA tower controllers, ramp and airport operators, or commercial airline personnel in airfield surface operations.

  8. Automated mass action model space generation and analysis methods for two-reactant combinatorially complex equilibriums: An analysis of ATP-induced ribonucleotide reductase R1 hexamerization data

    Directory of Open Access Journals (Sweden)

    Radivoyevitch Tomas

    2009-12-01

    /30 > 508/2088 with p -15. Finally, 99 of the 2088 models did not have any terms with ATP/R1 ratios >1.5, but of the top 30, there were 14 such models (14/30 > 99/2088 with p -16, i.e. the existence of R1 hexamers with >3 a-sites occupied by ATP is also not supported by this dataset. Conclusion The analysis presented suggests that three a-sites may not be occupied by ATP in R1 hexamers under the conditions of the data analyzed. If a-sites fill before h-sites, this implies that the dataset analyzed can be explained without the existence of an h-site. Reviewers This article was reviewed by Ossama Kashlan (nominated by Philip Hahnfeldt, Bin Hu (nominated by William Hlavacek and Rainer Sachs.

  9. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    Full Text Available This study investigates automated data accuracy assessment as described in data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry for a 10-year-publication period retrieved from the two bibliometric data sources, Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects truer accuracy scores. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method of bibliographic data accuracy as well as defining the impact of data accuracy on the citation matching process.

  10. Automated drawing of network plots in network meta-analysis.

    Science.gov (United States)

    Rücker, Gerta; Schwarzer, Guido

    2016-03-01

    In systematic reviews based on network meta-analysis, the network structure should be visualized. Network plots often have been drawn by hand using generic graphical software. A typical way of drawing networks, also implemented in statistical software for network meta-analysis, is a circular representation, often with many crossing lines. We use methods from graph theory in order to generate network plots in an automated way. We give a number of requirements for graph drawing and present an algorithm that fits prespecified ideal distances between the nodes representing the treatments. The method was implemented in the function netgraph of the R package netmeta and applied to a number of networks from the literature. We show that graph representations with a small number of crossing lines are often preferable to circular representations. PMID:26060934
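
    The published implementation is the netgraph function of the R package netmeta; the Python fragment below only illustrates the underlying idea of replacing the circular representation with a layout that fits ideal distances between treatment nodes (here via the stress-based Kamada-Kawai layout on a hypothetical comparison network).

```python
import networkx as nx
import matplotlib.pyplot as plt

# Hypothetical treatment network: nodes are treatments, edges are direct comparisons.
edges = [("placebo", "A"), ("placebo", "B"), ("A", "B"),
         ("B", "C"), ("A", "C"), ("C", "D"), ("placebo", "D")]
g = nx.Graph(edges)

circular = nx.circular_layout(g)             # the traditional circular representation
distance_fit = nx.kamada_kawai_layout(g)     # fits ideal graph-theoretic distances

fig, axes = plt.subplots(1, 2, figsize=(8, 4))
for ax, pos, title in [(axes[0], circular, "circular"),
                       (axes[1], distance_fit, "distance-fitting")]:
    nx.draw_networkx(g, pos=pos, ax=ax, node_color="lightgray")
    ax.set_title(title)
    ax.axis("off")
plt.tight_layout()
plt.savefig("network_plots.png")             # compare the number of crossing lines
```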

  11. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  12. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Giacomo Luccichenti

    2013-12-01

    Full Text Available White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with Mild Cognitive Impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of a WMH map and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student’s t tests and similarity was evaluated using a linear regression model and the Dice Similarity Coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t-value = -1.79, DF = 29, p = 0.839 for rater 1; t-value = 1.113, DF = 29, p = 0.2749 for rater 2) and were highly correlated (R² = 0.921, F(1,29) = 155.54, p

  13. Comparison of manual and automated quantification methods of {sup 123}I-ADAM

    Energy Technology Data Exchange (ETDEWEB)

    Kauppinen, T. [Helsinki Univ. Central Hospital (Finland). HUS Helsinki Medical Imaging Center; Helsinki Univ. Central Hospital (Finland). Division of Nuclear Medicine; Koskela, A.; Ahonen, A. [Helsinki Univ. Central Hospital (Finland). Division of Nuclear Medicine; Diemling, M. [Hermes Medical Solutions, Stockholm (Sweden); Keski-Rahkonen, A.; Sihvola, E. [Helsinki Univ. (Finland). Dept. of Public Health; Helsinki Univ. Central Hospital (Finland). Dept. of Psychiatry

    2005-07-01

    {sup 123}I-ADAM is a novel radioligand for imaging of the brain serotonin transporters (SERTs). Traditionally, the analysis of brain receptor studies has been based on observer-dependent manual region of interest definitions and visual interpretation. Our aim was to create a template for automated image registrations and volume of interest (VOI) quantification and to show that an automated quantification method of {sup 123}I-ADAM is more repeatable than the manual method. Patients, methods: A template and a predefined VOI map was created from {sup 123}I-ADAM scans done for healthy volunteers (n=15). Scans of another group of healthy persons (HS, n=12) and patients with bulimia nervosa (BN, n=10) were automatically fitted to the template and specific binding ratios (SBRs) were calculated by using the VOI map. Manual VOI definitions were done for the HS and BN groups by both one and two observers. The repeatability of the automated method was evaluated by using the BN group. Results: For the manual method, the interobserver coefficient of repeatability was 0.61 for the HS group and 1.00 for the BN group. The intra-observer coefficient of repeatability for the BN group was 0.70. For the automated method, the coefficient of repeatability was 0.13 for SBRs in midbrain. Conclusion: An automated quantification gives valuable information in addition to visual interpretation decreasing also the total image handling time and giving clear advantages for research work. An automated method for analysing {sup 123}I-ADAM binding to the brain SERT gives repeatable results for fitting the studies to the template and for calculating SBRs, and could therefore replace manual methods. (orig.)

  14. Comparison of manual and automated quantification methods of 123I-ADAM

    International Nuclear Information System (INIS)

    123I-ADAM is a novel radioligand for imaging of the brain serotonin transporters (SERTs). Traditionally, the analysis of brain receptor studies has been based on observer-dependent manual region of interest definitions and visual interpretation. Our aim was to create a template for automated image registrations and volume of interest (VOI) quantification and to show that an automated quantification method of 123I-ADAM is more repeatable than the manual method. Patients, methods: A template and a predefined VOI map was created from 123I-ADAM scans done for healthy volunteers (n=15). Scans of another group of healthy persons (HS, n=12) and patients with bulimia nervosa (BN, n=10) were automatically fitted to the template and specific binding ratios (SBRs) were calculated by using the VOI map. Manual VOI definitions were done for the HS and BN groups by both one and two observers. The repeatability of the automated method was evaluated by using the BN group. Results: For the manual method, the interobserver coefficient of repeatability was 0.61 for the HS group and 1.00 for the BN group. The intra-observer coefficient of repeatability for the BN group was 0.70. For the automated method, the coefficient of repeatability was 0.13 for SBRs in midbrain. Conclusion: An automated quantification gives valuable information in addition to visual interpretation decreasing also the total image handling time and giving clear advantages for research work. An automated method for analysing 123I-ADAM binding to the brain SERT gives repeatable results for fitting the studies to the template and for calculating SBRs, and could therefore replace manual methods. (orig.)

  15. Components for automated microfluidics sample preparation and analysis

    Science.gov (United States)

    Archer, M.; Erickson, J. S.; Hilliard, L. R.; Howell, P. B., Jr.; Stenger, D. A.; Ligler, F. S.; Lin, B.

    2008-02-01

    The increasing demand for portable devices to detect and identify pathogens represents an interdisciplinary effort between engineering, materials science, and molecular biology. Automation of both sample preparation and analysis is critical for performing multiplexed analyses on real world samples. This paper selects two possible components for such automated portable analyzers: modified silicon structures for use in the isolation of nucleic acids and a sheath flow system suitable for automated microflow cytometry. Any detection platform that relies on the genetic content (RNA and DNA) present in complex matrices requires careful extraction and isolation of the nucleic acids in order to ensure their integrity throughout the process. This sample pre-treatment step is commonly performed using commercially available solid phases along with various molecular biology techniques that require multiple manual steps and dedicated laboratory space. Regardless of the detection scheme, a major challenge in the integration of total analysis systems is the development of platforms compatible with current isolation techniques that will ensure the same quality of nucleic acids. Silicon is an ideal candidate for solid phase separations since it can be tailored structurally and chemically to mimic the conditions used in the laboratory. For analytical purposes, we have developed passive structures that can be used to fully ensheath one flow stream with another. As opposed to traditional flow focusing methods, our sheath flow profile is truly two dimensional, making it an ideal candidate for integration into a microfluidic flow cytometer. Such a microflow cytometer could be used to measure targets captured on either antibody- or DNA-coated beads.

  16. Statistical method for the determination of equivalence of automated test procedures

    OpenAIRE

    Norman Wiggins; Gorko, Mary A.; Jennifer Llewelyn; K. Rick Lung

    2003-01-01

    In the development of test methods for solid dosage forms, manual test procedures for assay and content uniformity often precede the development of automated test procedures. Since the mode of extraction for automated test methods is often slightly different from that of the manual test method, additional validation of an automated test method is usually required. In addition to compliance with validation guidelines, developers of automated test methods are often asked to demonstrate equivale...

  17. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations. They do not capture 3-D data, they are time-consuming and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy to use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%), in contrast to using Osteolytica, which demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from aged and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses over the entirety of the bone volume (as opposed to selected 2-D images), it

  18. Automated Tetrahedral Mesh Generation for CFD Analysis of Aircraft in Conceptual Design

    Science.gov (United States)

    Ordaz, Irian; Li, Wu; Campbell, Richard L.

    2014-01-01

    The paper introduces an automation process of generating a tetrahedral mesh for computational fluid dynamics (CFD) analysis of aircraft configurations in early conceptual design. The method was developed for CFD-based sonic boom analysis of supersonic configurations, but can be applied to aerodynamic analysis of aircraft configurations in any flight regime.

  19. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  20. Automated Steel Cleanliness Analysis Tool (ASCAT)

    International Nuclear Information System (INIS)

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized ''intelligent'' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, are crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment/steel cleanliness; slab, billet

  1. α-Automated Reasoning Method Based on Lattice-Valued Propositional Logic LP(X)

    Institute of Scientific and Technical Information of China (English)

    王伟; 徐扬; 王学芳

    2002-01-01

    This paper is focused on automated reasoning based on classical propositional logic and lattice-valued propositional logic LP(X). A new method of automated reasoning is given, and the soundness and completeness theorems of this method are proved.

  2. FASTER: an unsupervised fully automated sleep staging method for mice

    OpenAIRE

    Sunagawa, Genshiro A; Séi, Hiroyoshi; Shimba, Shigeki; Urade, Yoshihiro; Ueda, Hiroki R.

    2013-01-01

    Identifying the stages of sleep, or sleep staging, is an unavoidable step in sleep research and typically requires visual inspection of electroencephalography (EEG) and electromyography (EMG) data. Currently, scoring is slow, biased and prone to error by humans and thus is the most important bottleneck for large-scale sleep research in animals. We have developed an unsupervised, fully automated sleep staging method for mice that allows less subjective and high-throughput evaluation of sleep. ...

  3. A screened automated structural search with semiempirical methods

    OpenAIRE

    Ota, Yukihiro; Ruiz-Barragan, Sergi; Machida, Masahiko; Shiga, Motoyuki

    2016-01-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational...

  4. Evaluation of feature-based methods for automated network orientation

    OpenAIRE

    Apollonio, F I; Ballabeni, A.; M. Gaiani; F. Remondino

    2014-01-01

    Every day new tools and algorithms for automated image processing and 3D reconstruction purposes become available, giving the possibility to process large networks of unoriented and markerless images, delivering sparse 3D point clouds at reasonable processing time. In this paper we evaluate some feature-based methods used to automatically extract the tie points necessary for calibration and orientation procedures, in order to better understand their performances for 3D reconstruction...

  5. Automated methods for formal proofs in simple arithmetics and algebra

    OpenAIRE

    Chaieb, Amine

    2008-01-01

    In an LCF-like theorem prover, any proof must be produced from a small set of inference rules. The development of automated proof methods in such systems is extremely important. In this thesis we study the following question: How should we integrate a proof procedure in an LCF-like theorem prover, both in general and in the special case of arithmetics? We investigate three integration paradigms and present several proof procedures. These include universal and weak existe...

  6. Automating the Synthetic Field Method:Application to Sextans A

    OpenAIRE

    Holwerda, B. W.; Allen, R. J.; van der Kruit, P. C.

    2002-01-01

    We have automated the "Synthetic Field Method" developed by Gonzalez et al. (1998) and used it to measure the opacity of the ISM in the Local Group dwarf galaxy Sextans A by using the changes in counts of background galaxies seen through the foreground system. The Sextans A results are consistent with the observational relation found by Cuillandr...

  7. Method for Automated Bone Shape Correction within Bone Distraction Procedure

    Science.gov (United States)

    Blynskiy, F. Yu

    2016-01-01

    A method for automated bone shape correction within the bone distraction procedure is presented. High-precision deformation angle measurement is provided by the software for X-ray image processing. A dedicated BDC v.1.0.1 application was designed. The purpose of the BDC is to model the bone geometry in order to calculate the appropriate distraction forces. Control of the correction procedure is realized by the hardware of the distraction system.

  8. A Simple Method for Automated Equilibration Detection in Molecular Simulations.

    Science.gov (United States)

    Chodera, John D

    2016-04-12

    Molecular simulations intended to compute equilibrium properties are often initiated from configurations that are highly atypical of equilibrium samples, a practice which can generate a distinct initial transient in mechanical observables computed from the simulation trajectory. Traditional practice in simulation data analysis recommends this initial portion be discarded to equilibration, but no simple, general, and automated procedure for this process exists. Here, we suggest a conceptually simple automated procedure that does not make strict assumptions about the distribution of the observable of interest in which the equilibration time is chosen to maximize the number of effectively uncorrelated samples in the production timespan used to compute equilibrium averages. We present a simple Python reference implementation of this procedure and demonstrate its utility on typical molecular simulation data. PMID:26771390
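
    The rule described above - pick the equilibration time t0 that maximizes the number of effectively uncorrelated samples N_eff = (T - t0) / g(t0), where g is the statistical inefficiency of the remaining data - can be sketched in a few lines. The code below is an illustrative re-implementation, not the paper's reference implementation.

```python
import numpy as np

def statistical_inefficiency(x):
    """Crude estimate g = 1 + 2 * (sum of the initial positive autocorrelations)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n, var = x.size, np.dot(x, x) / x.size
    if var == 0.0:
        return 1.0
    g = 1.0
    for lag in range(1, n // 2):
        c = np.dot(x[:-lag], x[lag:]) / ((n - lag) * var)
        if c <= 0.0:                       # stop at the first non-positive autocorrelation
            break
        g += 2.0 * c
    return max(g, 1.0)

def detect_equilibration(a_t, n_candidates=50):
    """Return (t0, g, N_eff) for the t0 that maximizes effectively uncorrelated samples."""
    T = len(a_t)
    best = (0, 1.0, 0.0)
    for t0 in np.linspace(0, T - 10, n_candidates, dtype=int):
        g = statistical_inefficiency(a_t[t0:])
        n_eff = (T - t0) / g
        if n_eff > best[2]:
            best = (int(t0), g, n_eff)
    return best

# Synthetic observable with an initial transient followed by equilibrium noise.
rng = np.random.default_rng(1)
signal = np.concatenate([np.linspace(5.0, 0.0, 500), rng.normal(0.0, 1.0, 4500)])
t0, g, n_eff = detect_equilibration(signal)
print(f"discard the first {t0} samples; g = {g:.2f}, N_eff = {n_eff:.0f}")
```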

  9. Automated Analysis of Source Code Patches using Machine Learning Algorithms

    OpenAIRE

    Castro Lechtaler, Antonio; Liporace, Julio César; Cipriano, Marcelo; García, Edith; Maiorano, Ariel; Malvacio, Eduardo; Tapia, Néstor

    2015-01-01

    An updated version of a tool for automated analysis of source code patches and branch differences is presented. The upgrade involves the use of machine learning techniques on source code, comments, and messages. It aims to help analysts, code reviewers, or auditors perform repetitive tasks continuously. The environment designed encourages collaborative work. It systematizes certain tasks pertaining to reviewing or auditing processes. Currently, the scope of the automated test is limited. C...

  10. Morphological observation and analysis using automated image cytometry for the comparison of trypan blue and fluorescence-based viability detection method.

    Science.gov (United States)

    Chan, Leo Li-Ying; Kuksin, Dmitry; Laverty, Daniel J; Saldi, Stephanie; Qiu, Jean

    2015-05-01

    The ability to accurately determine cell viability is essential to performing a well-controlled biological experiment. Typical experiments range from standard cell culturing to advanced cell-based assays that may require cell viability measurement for downstream experiments. The traditional cell viability measurement method has been the trypan blue (TB) exclusion assay. However, since the introduction of fluorescence-based dyes for cell viability measurement using flow or image-based cytometry systems, there have been numerous publications comparing the two detection methods. Although previous studies have shown discrepancies between TB exclusion and fluorescence-based viability measurements, image-based morphological analysis was not performed in order to examine the viability discrepancies. In this work, we compared TB exclusion and fluorescence-based viability detection methods using image cytometry to observe morphological changes due to the effect of TB on dead cells. Imaging results showed that as the viability of a naturally-dying Jurkat cell sample decreased below 70 %, many TB-stained cells began to exhibit non-uniform morphological characteristics. Dead cells with these characteristics may be difficult to count under light microscopy, thus generating an artificially higher viability measurement compared to fluorescence-based method. These morphological observations can potentially explain the differences in viability measurement between the two methods. PMID:24643390

  11. Individual flexor tendon identification within the carpal tunnel: A semi-automated analysis method for serial cross-section magnetic resonance images

    Directory of Open Access Journals (Sweden)

    Nicole M Kunze

    2009-12-01

    Full Text Available Carpal tunnel syndrome is commonly viewed as resulting from chronic mechanical insult of the median nerve by adjacent anatomical structures. Both the median nerve and its surrounding soft tissue structures are well visualized on magnetic resonance (MR) images of the wrist and hand. Addressing nerve damage from impingement of flexor digitorum tendons co-occupying the tunnel is attractive, but to date has been restricted by a lack of means for making individual identifications of the respective tendons. In this image analysis work, we have developed a region-growing method to positively identify each individual digital flexor tendon within the carpal tunnel by tracking it from a more distal MR section where the respective tendon identities are unambiguous. Illustratively, the new method was applied to MRI scans from four different subjects in a variety of hand poses. Conventional shape measures yielded less discriminatory information than did evaluations of individual tendon location and arrangement. This new method of rapid identification of individual tendons will facilitate analysis of tendon/nerve interactions within the tunnel, thereby providing better information about mechanical insult of the median nerve. Keywords: carpal tunnel syndrome, magnetic resonance imaging, region growing, digital flexor tendons
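
    As a generic illustration of the seeded region-growing idea (not the authors' implementation, which works on serial MR cross-sections of the wrist), the sketch below grows a mask from a seed pixel in a toy slice and reports a centroid that could be used to seed the next, more proximal slice.

```python
import numpy as np
from collections import deque

def region_grow(image, seed, tol=0.1):
    """Boolean mask of pixels 4-connected to `seed` whose intensity is within `tol` of it."""
    mask = np.zeros(image.shape, dtype=bool)
    ref = image[seed]
    queue = deque([seed])
    while queue:
        r, c = queue.popleft()
        if mask[r, c]:
            continue
        mask[r, c] = True
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < image.shape[0] and 0 <= cc < image.shape[1]
                    and not mask[rr, cc] and abs(image[rr, cc] - ref) <= tol):
                queue.append((rr, cc))
    return mask

# Toy slice: one bright circular "tendon" on a darker background.
yy, xx = np.mgrid[0:64, 0:64]
img = ((xx - 20) ** 2 + (yy - 30) ** 2 < 36).astype(float)
tendon = region_grow(img, seed=(30, 20), tol=0.5)
centroid = np.argwhere(tendon).mean(axis=0)    # seed candidate for the next slice
print("tendon area:", int(tendon.sum()), "centroid:", centroid)
```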

  12. Automation of the Analysis of Moessbauer Spectra

    International Nuclear Information System (INIS)

    In the present report we propose the automation of least-squares fitting of Moessbauer spectra, the identification of the substance and its crystal structure, and access to the references with the help of a genetic algorithm, fuzzy logic, and an artificial neural network associated with a databank of Moessbauer parameters and references. This system could be useful for specialists and non-specialists, in industry as well as in research laboratories.

  13. Semantic analysis for system level design automation

    OpenAIRE

    Greenwood, Rob

    1992-01-01

    This thesis describes the design and implementation of a system to extract meaning from natural language specifications of digital systems. This research is part of the ASPIN project which has the long-term goal of providing an automated system for digital system synthesis from informal specifications. This work makes several contributions, one being the application of artificial intelligence techniques to specifications writing. Also, the work deals with the subset of the Engl...

  14. Tank Farm Operations Surveillance Automation Analysis

    International Nuclear Information System (INIS)

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices often referred to as Operator Rounds. This document provides the analysis in terms of feasibility to improve the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisitions, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities

  15. Tank Farm Operations Surveillance Automation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    MARQUEZ, D.L.

    2000-12-21

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices often referred to as Operator Rounds. This document provides the analysis in terms of feasibility to improve the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisitions, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities.

  16. Fully automated diabetic retinopathy screening using morphological component analysis.

    Science.gov (United States)

    Imani, Elaheh; Pourreza, Hamid-Reza; Banaee, Touka

    2015-07-01

    Diabetic retinopathy is the major cause of blindness in the world. It has been shown that early diagnosis can play a major role in prevention of visual loss and blindness. This diagnosis can be made through regular screening and timely treatment. Moreover, automation of this process can significantly reduce the work of ophthalmologists and alleviate inter- and intra-observer variability. This paper provides a fully automated diabetic retinopathy screening system with the ability to assess retinal image quality. The novelty of the proposed method lies in the use of the Morphological Component Analysis (MCA) algorithm to discriminate between normal and pathological retinal structures. To this end, first a pre-screening algorithm is used to assess the quality of retinal images. If the quality of the image is not satisfactory, it is examined by an ophthalmologist and must be recaptured if necessary. Otherwise, the image is processed for diabetic retinopathy detection. In this stage, normal and pathological structures of the retinal image are separated by the MCA algorithm. Finally, the normal and abnormal retinal images are distinguished by statistical features of the retinal lesions. Our proposed system achieved 92.01% sensitivity and 95.45% specificity on the Messidor dataset, which is a remarkable result in comparison with previous work. PMID:25863517

  17. A GIS-based automated procedure for landslide susceptibility mapping by the Conditional Analysis method: the Baganza valley case study (Italian Northern Apennines)

    Science.gov (United States)

    Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo

    2006-08-01

    Among the many GIS-based multivariate statistical methods for landslide susceptibility zonation, the so-called “Conditional Analysis method” holds a special place for its conceptual simplicity. In fact, in this method landslide susceptibility is simply expressed as landslide density in correspondence with different combinations of instability-factor classes. To overcome the operational complexity connected to the long, tedious and error-prone sequence of commands required by the procedure, a shell script mainly based on the GRASS GIS was created. The script, starting from a landslide inventory map and a number of factor maps, automatically carries out the whole procedure, resulting in the construction of a map with five landslide susceptibility classes. A validation procedure allows assessment of the reliability of the resulting model, while the simple mean deviation of the density values in the factor-class combinations helps to evaluate the goodness of the landslide density distribution. The procedure was applied to a relatively small basin (167 km²) in the Italian Northern Apennines considering three landslide types, namely rotational slides, flows and complex landslides, for a total of 1,137 landslides, and five factors, namely lithology, slope angle and aspect, elevation and slope/bedding relations. The analysis of the resulting 31 different models obtained by combining the five factors confirms the role of lithology, slope angle and slope/bedding relations in influencing slope stability.
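
    The core of the Conditional Analysis method - landslide density computed for every combination of instability-factor classes and then binned into five susceptibility classes - can be illustrated with a short table-based sketch. The factor names, class counts, and random data below are hypothetical, and the published procedure is a GRASS GIS shell script rather than this code.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_cells = 100_000
cells = pd.DataFrame({
    "lithology":  rng.integers(0, 4, n_cells),   # factor-class rasters, one row per cell
    "slope_cls":  rng.integers(0, 5, n_cells),
    "aspect_cls": rng.integers(0, 8, n_cells),
    "landslide":  rng.random(n_cells) < 0.03,    # landslide inventory mask
})

# Susceptibility of a factor-class combination = landslide density within it.
density = (cells.groupby(["lithology", "slope_cls", "aspect_cls"])["landslide"]
                .mean()
                .rename("density"))

# Map the densities back onto the cells and split them into five susceptibility classes.
cells = cells.join(density, on=["lithology", "slope_cls", "aspect_cls"])
cells["susceptibility"] = pd.qcut(cells["density"].rank(method="first"), 5,
                                  labels=["very low", "low", "medium", "high", "very high"])
print(cells.groupby("susceptibility")["landslide"].mean())   # density rises with the class
```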

  18. A new automated method of e-learner's satisfaction measurement

    Directory of Open Access Journals (Sweden)

    Armands Strazds

    2007-06-01

    Full Text Available This paper presents a new method of measuring a learner’s satisfaction while using electronic learning materials (e-courses, edutainment games, etc.) in virtual non-linear environments. The method is based on the relation between the Discovering and Learning probability distribution curves obtained by collecting and evaluating human-computer interaction data. While being near real-time, this measurement is considered highly unobtrusive and cost-effective because of its automated approach. The first working prototype, EDUSA 1.0, was developed and successfully tested by the Distance Education Studies Centre of Riga Technical University.

  19. A screened automated structural search with semiempirical methods

    Science.gov (United States)

    Ota, Yukihiro; Ruiz-Barragan, Sergi; Machida, Masahiko; Shiga, Motoyuki

    2016-03-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational efficiency is also shown for a GRRM search. The interface program is suitable for the structural search of large molecular systems for which semiempirical methods are applicable.

  20. A screened automated structural search with semiempirical methods

    CERN Document Server

    Ota, Yukihiro; Machida, Masahiko; Shiga, Motoyuki

    2016-01-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational efficiency is also shown for a GRRM search. The interface program is suitable for the structural search of large molecular systems for which semiempirical methods are applicable.

  1. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As the main result, this technique will provide the design engineer with decision tables for fault handling...

  2. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As the main result, this technique will provide the design engineer with decision tables for fault handling...

  3. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
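
    GRESS worked by augmenting a FORTRAN compiler so that derivative code was generated alongside the original model. Purely as an illustration of the derivative-propagation idea (not of GRESS itself), the fragment below attaches forward-mode derivatives to ordinary arithmetic with a small dual-number class, so an unmodified model function yields a sensitivity along with its value.

```python
import math

class Dual:
    """A value carrying its derivative with respect to one chosen input parameter."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val, self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def exp(x):
    return Dual(math.exp(x.val), math.exp(x.val) * x.der)

def model(k, t):
    """Hypothetical model response y(k, t); it never mentions derivatives."""
    return 1.0 + 2.0 * k * exp(k * t)

k = Dual(0.3, 1.0)                 # seed dk/dk = 1 to obtain sensitivities w.r.t. k
y = model(k, Dual(2.0))            # t is held constant (derivative 0)
print(f"y = {y.val:.4f}, dy/dk = {y.der:.4f}")
```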

  4. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
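
    The number of additional closure equations mentioned above equals the cycle rank of the flow diagram, E - N + C for E flows, N components and C connected parts. The snippet below computes it for a hypothetical two-stage cycle; the component names and connectivity are assumptions, not the authors' model.

```python
import networkx as nx

g = nx.MultiGraph()
# Nodes are components, edges are refrigerant flows of an assumed two-stage cycle layout.
g.add_edges_from([
    ("compressor1", "intercooler"), ("intercooler", "compressor2"),
    ("compressor2", "condenser"),   ("condenser", "expansion1"),
    ("expansion1", "intercooler"),  ("intercooler", "expansion2"),
    ("expansion2", "evaporator"),   ("evaporator", "compressor1"),
])

loops = (g.number_of_edges() - g.number_of_nodes()
         + nx.number_connected_components(g))
print(f"independent loops: {loops}")   # = number of extra equations needed to close the system
```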

  5. Feasibility studies of safety assessment methods for programmable automation systems. Final report of the AVV project

    International Nuclear Information System (INIS)

    Feasibility studies of two different groups of methodologies for safety assessment of programmable automation systems have been executed at the Technical Research Centre of Finland (VTT). The studies concerned dynamic testing methods and the fault tree (FT) and failure mode and effects analysis (FMEA) methods. In order to get real experience in the application of these methods, experimental testing of two realistic pilot systems was executed and an FT/FMEA analysis of a programmable safety function was accomplished. The purpose of the studies was not to assess the object systems, but to get experience in the application of the methods and assess their potential and development needs. (46 refs., 21 figs.)

  6. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The prese...

  7. Image analysis and platform development for automated phenotyping in cytomics

    NARCIS (Netherlands)

    Yan, Kuan

    2013-01-01

    This thesis is dedicated to the empirical study of image analysis in HT/HC screening studies. An HT/HC screen often produces extensive amounts of data that cannot be manually analyzed. Thus, an automated image analysis solution is a prerequisite for an objective understanding of the raw image data. Compared to general a

  8. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.

  9. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with ''direct'' and ''adjoint'' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency consideration and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  10. Malaria: the value of the automated depolarization analysis.

    Science.gov (United States)

    Josephine, F P; Nissapatorn, V

    2005-01-01

    This retrospective and descriptive study was carried out in the University of Malaya Medical Center (UMMC) from January to September, 2004. This study aimed to evaluate the diagnostic utility of the Cell-Dyn 4000 hematology analyzer's depolarization analysis and to determine the sensitivity and specificity of this technique in the context of malaria diagnosis. A total of 889 cases presenting with pyrexia of unknown origin or clinically suspected of malaria were examined. Sixteen of these blood samples were found to be positive; 12 for P. vivax, 3 for P. malariae, and 1 for P. falciparum by peripheral blood smear as the standard technique for parasite detection and species identification. Demographic characteristics showed that the majority of patients were male foreign workers in the age range of 20-57 years, with a mean (± SD) age of 35.9 ± 11.4 years. These 16 positive blood samples were also processed by the Cell-Dyn 4000 analyzer in the normal complete blood count (CBC) operational mode. Malaria parasites produce hemozoin, which depolarizes light, and this allows the automated detection of malaria during routine complete blood count analysis with the Abbott Cell-Dyn 4000 instrument. The white blood cell (WBC) differential plots of all malaria positive samples showed abnormal depolarization events in the NEU-EOS and EOS I plots. This was not seen in the negative samples. In 12 patients with P. vivax infection, a cluster pattern in the NEU-EOS and EOS I plots was observed, and appeared color-coded green or black. In 3 patients with P. malariae infection, few random depolarization events in the NEU-EOS and EOS I plots were seen, and appeared color-coded green, black or blue. While in the patient with P. falciparum infection, the sample was color-coded green with a few random purple depolarizing events in the NEU-EOS and EOS I plots. This study confirms that automated depolarization analysis is a highly sensitive and specific method to diagnose whether or not a patient

  11. Method and apparatus for automated, modular, biomass power generation

    Science.gov (United States)

    Diebold, James P.; Lilley, Arthur; Browne, Kingsbury III; Walt, Robb Ray; Duncan, Dustin; Walker, Michael; Steele, John; Fields, Michael; Smith, Trevor

    2011-03-22

    Method and apparatus for generating a low tar, renewable fuel gas from biomass and using it in other energy conversion devices, many of which were designed for use with gaseous and liquid fossil fuels. An automated, downdraft gasifier incorporates extensive air injection into the char bed to maintain the conditions that promote the destruction of residual tars. The resulting fuel gas and entrained char and ash are cooled in a special heat exchanger, and then continuously cleaned in a filter prior to usage in standalone as well as networked power systems.

  12. Comparison of Particulate Mercury Measured with Manual and Automated Methods

    Directory of Open Access Journals (Sweden)

    Rachel Russo

    2011-01-01

    Full Text Available A study was conducted to compare measurements of particulate mercury (HgP) made with the manual filter method and the automated Tekran system. Simultaneous measurements were conducted with the Tekran and Teflon filter methodologies in the marine and coastal continental atmospheres. Overall, the filter HgP values were on average 21% higher than the Tekran HgP, and >85% of the data were outside the ±25% region surrounding the 1:1 line. In some cases the filter values were as much as 3-fold greater, with

  13. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.
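
    The report's formal structure and tool support are not shown in the abstract; the snippet below is only a generic, hypothetical sketch of the tabular step such an analysis automates, enumerating (control action, process-model context) pairs and flagging combinations declared hazardous.

```python
# Sketch (assumed, generic names): enumerate (control action, context) pairs and
# flag those declared hazardous, mimicking the tabular step of an STPA-style analysis.
from itertools import product

control_actions = ["open_valve", "close_valve"]
contexts = [{"pressure": p, "pump": s} for p, s in product(["low", "high"], ["on", "off"])]

def hazardous(action, ctx):
    # example rule (hypothetical): opening the valve at high pressure with the pump on is unsafe
    return action == "open_valve" and ctx["pressure"] == "high" and ctx["pump"] == "on"

unsafe_control_actions = [(a, c) for a, c in product(control_actions, contexts) if hazardous(a, c)]
for action, ctx in unsafe_control_actions:
    print(f"UCA: providing '{action}' when {ctx} may lead to a hazard")
```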

  14. Multielement and automated radiochemical separation procedures for activation analysis

    International Nuclear Information System (INIS)

    In recent years the demand for information about the distribution of elements at trace concentration levels in high purity materials and in biological, environmental and geological specimens has increased greatly. Neutron activation analysis can play an important role in obtaining the required information. Radiochemical separations are required in many of the applications mentioned. A critical review of the progress made over the last 15 years in the development and application of radiochemical separation schemes for multielement activation analysis and in their automation is presented. About 80 radiochemical separation schemes are reviewed. Advantages and disadvantages of the automation of radiochemical separations are critically analysed. The various machines developed are illustrated and technical suggestions for the development of automated machines are given. (author)

  15. An automated method for identifying artifact in ICA of resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Kaushik eBhaganagarapu

    2013-07-01

    Full Text Available An enduring issue with data-driven analysis and filtering methods is the interpretation of results. To assist, we present an automatic method for identification of artifact in independent components (ICs) derived from functional MRI (fMRI). The method was designed with the following features: does not require temporal information about an fMRI paradigm; does not require the user to train the algorithm; requires only the fMRI images (additional acquisition of anatomical imaging not required); is able to identify a high proportion of artifact-related ICs without removing components that are likely to be of neuronal origin; can be applied to resting-state fMRI; is automated, requiring minimal or no human intervention. We applied the method to a MELODIC probabilistic ICA of resting-state functional connectivity data acquired in 50 healthy control subjects, and compared the results to a blinded expert manual classification. The method identified between 26% and 72% of the components as artifact (mean 55%). 0.3% of components identified as artifact were discordant with the manual classification; retrospective examination of these ICs suggested the automated method had correctly identified these as artifact. We have developed an effective automated method which removes a substantial number of unwanted noisy components in ICA analyses of resting-state fMRI data. Source code of our implementation of the method is available.

  16. CRITICAL ASSESSMENT OF AUTOMATED FLOW CYTOMETRY DATA ANALYSIS TECHNIQUES

    OpenAIRE

    Aghaeepour, Nima; Finak, Greg; Hoos, Holger; Mosmann, Tim R; Gottardo, Raphael; Brinkman, Ryan; Scheuermann, Richard H.

    2013-01-01

    Traditional methods for flow cytometry (FCM) data processing rely on subjective manual gating. Recently, several groups have developed computational methods for identifying cell populations in multidimensional FCM data. The Flow Cytometry: Critical Assessment of Population Identification Methods (FlowCAP) challenges were established to compare the performance of these methods on two tasks – mammalian cell population identification to determine if automated algorithms can reproduce expert manu...

  17. Automated defect recognition method based on neighbor layer slice images of ICT

    International Nuclear Information System (INIS)

    The current automated defect recognition of industrial computerized tomography (ICT) slice images is mostly carried out on individual images. Certain false detections occur because some isolated noise cannot be removed without considering the information of neighbor layer images. To solve this problem, a new automated defect recognition method is presented based on a two-step analysis of consecutive slice images. First, all potential defects are segmented using a classic method in each image. Second, real defects and false defects are recognized by matching all potential defects across neighbor layer images, based on the continuity of real defect characteristics and the discontinuity of false defects between neighboring images. The method is verified by experiments and the results prove that real defects can be detected with high probability and false detections can be reduced effectively. (authors)
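
    A minimal sketch of the second step, under assumptions the abstract does not state (candidates represented by their centroids, and a fixed matching radius between layers), might look as follows.

```python
# Sketch: keep only candidate defects that persist across neighbouring ICT slices,
# matching candidates by centroid distance (real flaws are continuous between layers,
# isolated noise is not). The matching radius is an illustrative assumption.
import numpy as np

def match_across_slices(slices, max_shift=3.0):
    """slices: list (per layer) of candidate defect centroids as (row, col) tuples."""
    confirmed = []
    for k in range(1, len(slices) - 1):
        for c in slices[k]:
            prev_ok = any(np.hypot(c[0] - p[0], c[1] - p[1]) <= max_shift for p in slices[k - 1])
            next_ok = any(np.hypot(c[0] - n[0], c[1] - n[1]) <= max_shift for n in slices[k + 1])
            if prev_ok or next_ok:          # continuity with at least one neighbour layer
                confirmed.append((k, c))
    return confirmed

layers = [[(50, 60)], [(51, 60), (120, 30)], [(52, 61)]]   # (120, 30) appears in one layer only
print(match_across_slices(layers))   # the persistent defect is kept, the isolated blob is dropped
```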

  18. Automated analysis for scintigraphic evaluation of gastric emptying using invariant moments.

    Science.gov (United States)

    Abutaleb, A; Delalic, Z J; Ech, R; Siegel, J A

    1989-01-01

    This study introduces a method for automated analysis of the standard solid-meal gastric emptying test. The purpose was to develop a diagnostic tool to characterize abnormalities of solid-phase gastric emptying more reproducibly. The processing of gastric emptying is automated using geometrical moments that are invariant to scaling, rotation, and shift. Twenty subjects were studied. The first step was to obtain images of the stomach using a nuclear gamma camera immediately after the subject had eaten a radio-labeled meal. The second step was to process and analyze the images by a recently developed automated gastric emptying analysis (AGEA) method, which determines the gastric contour and the geometrical properties, including such parameters as area, centroid, orientation, and moments of inertia. Statistical tests showed that some of the moments were sensitive to the patient's gastric status (normal versus abnormal). The difference between the normal and abnormal patients became noticeable approximately 1 h after meal ingestion. PMID:18230536
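
    The exact AGEA feature set is not listed in the abstract; the sketch below computes a few representative shift-, scale- and rotation-invariant moment descriptors (Hu-style) for a binary region, which is the general technique the study names.

```python
# Sketch: compute scale-, shift- and rotation-invariant moments of a binary gastric
# region of interest, in the spirit of the AGEA descriptors (exact feature set assumed).
import numpy as np

def hu_like_invariants(mask):
    """mask: 2-D array, non-zero inside the gastric region."""
    ys, xs = np.nonzero(mask)
    m00 = len(xs)
    cx, cy = xs.mean(), ys.mean()                     # centroid (shift invariance)
    mu20 = ((xs - cx) ** 2).sum()
    mu02 = ((ys - cy) ** 2).sum()
    mu11 = ((xs - cx) * (ys - cy)).sum()
    eta20, eta02, eta11 = mu20 / m00 ** 2, mu02 / m00 ** 2, mu11 / m00 ** 2  # scale-normalised
    phi1 = eta20 + eta02                              # Hu's first invariant
    phi2 = (eta20 - eta02) ** 2 + 4 * eta11 ** 2      # Hu's second invariant
    orientation = 0.5 * np.arctan2(2 * mu11, mu20 - mu02)
    return {"area": m00, "centroid": (cx, cy), "phi1": phi1, "phi2": phi2,
            "orientation": orientation}

mask = np.zeros((64, 64)); mask[20:40, 10:50] = 1     # toy "stomach" blob
print(hu_like_invariants(mask))
```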

  19. Performance Analysis of GAME: A Generic Automated Marking Environment

    Science.gov (United States)

    Blumenstein, Michael; Green, Steve; Fogelman, Shoshana; Nguyen, Ann; Muthukkumarasamy, Vallipuram

    2008-01-01

    This paper describes the Generic Automated Marking Environment (GAME) and provides a detailed analysis of its performance in assessing student programming projects and exercises. GAME has been designed to automatically assess programming assignments written in a variety of languages based on the "structure" of the source code and the correctness…

  20. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  1. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.

  2. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects, such as greater efficiency and fewer human errors, and a negative effect defined as the out-of-the-loop (OOTL) problem. Thus, before introducing automation in the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduced amount of human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested automation rate estimation method. This is expected to yield an appropriate proportion of automation that avoids the OOTL problem while having maximum efficacy.

  3. Automated SEM-EDS GSR Analysis for Turkish Ammunitions

    International Nuclear Information System (INIS)

    In this work, Automated Scanning Electron Microscopy with Energy Dispersive X-ray Spectrometry (SEM-EDS) was used to characterize Turkish ammunition from 7.65 and 9 mm cartridges. All samples were analyzed in a Jeol JSM-5600LV SEM equipped with a BSE detector and a Link ISIS 300 EDS system. A working distance of 20 mm, an accelerating voltage of 20 kV and gunshot residue software were used in all analyses. The automated search resulted in a high number of analyzed particles containing the elements unique to gunshot residue (GSR) (Pb, Ba, Sb). The data obtained on the definition of characteristic GSR particles were concordant with other studies on this topic

  4. Automation of finite element analysis in pressure equipments design. Applications in energy and petrochemistry industries

    International Nuclear Information System (INIS)

    Pressure equipment is used in different petrochemistry and energy industries, such as thermal or nuclear power plants, oil refineries, and chemical plants. These industries have to ensure safety and environmental conditions. The design of pressure equipment has to respect Codes and Standards. For pressure vessels with neighbouring nozzles, or under load cases other than pressure loadings (seismic, concentrated, etc.), finite element simulation remains the only accepted method for verifying equipment design according to Codes and Standards. This paper presents the application of an automated procedure for stress and criteria verification in the energy and petrochemistry industries. This automated procedure makes it possible to ensure analysis quality and to reduce analysis time. (author)

  5. Granulometric profiling of aeolian dust deposits by automated image analysis

    Science.gov (United States)

    Varga, György; Újvári, Gábor; Kovács, János; Jakab, Gergely; Kiss, Klaudia; Szalai, Zoltán

    2016-04-01

    Determination of granulometric parameters is of growing interest in the Earth sciences. Particle size data of sedimentary deposits provide insights into the physicochemical environment of transport, accumulation and post-depositional alterations of sedimentary particles, and are important proxies applied in paleoclimatic reconstructions. This is especially true for aeolian dust deposits with a fairly narrow grain size range as a consequence of the extremely selective nature of wind sediment transport. Therefore, various aspects of aeolian sedimentation (wind strength, distance to source(s), possible secondary source regions and modes of sedimentation and transport) can be reconstructed only from precise grain size data. As terrestrial wind-blown deposits are among the most important archives of past environmental changes, proper explanation of the proxy data is a mandatory issue. Automated imaging provides a unique technique to gather direct information on granulometric characteristics of sedimentary particles. Automatic image analysis with the Malvern Morphologi G3-ID, from which the granulometric data in this study were obtained, is a rarely applied new technique for particle size and shape analyses in sedimentary geology. Size and shape data of several hundred thousand (or even a million) individual particles from 15 loess and paleosoil samples were automatically recorded in this study from the captured high-resolution images. Several size (e.g. circle-equivalent diameter, major axis, length, width, area) and shape parameters (e.g. elongation, circularity, convexity) were calculated by the instrument software. At the same time, the mean light intensity after transmission through each particle is automatically collected by the system as a proxy of the optical properties of the material. Intensity values are dependent on the chemical composition and/or thickness of the particles. The results of the automated imaging were compared to particle size data determined by three different laser diffraction instruments
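
    The descriptors named above can be reproduced from any segmented particle image; a minimal sketch with scikit-image (not the Morphologi instrument software itself, and with an assumed pixel size) is given below.

```python
# Sketch: derive per-particle size and shape descriptors (circle-equivalent diameter,
# circularity, elongation, convexity/solidity) from a segmented binary image.
import numpy as np
from skimage.measure import label, regionprops

def particle_descriptors(binary_image, pixel_size_um=1.0):
    rows = []
    for p in regionprops(label(binary_image)):
        area = p.area * pixel_size_um ** 2
        ced = 2.0 * np.sqrt(area / np.pi)                          # circle-equivalent diameter
        circularity = 4.0 * np.pi * p.area / max(p.perimeter, 1e-9) ** 2
        elongation = 1.0 - (p.minor_axis_length / max(p.major_axis_length, 1e-9))
        rows.append({"ced_um": ced, "circularity": circularity,
                     "elongation": elongation, "solidity": p.solidity})
    return rows
```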

  6. Improving automated 3D reconstruction methods via vision metrology

    Science.gov (United States)

    Toschi, Isabella; Nocerino, Erica; Hess, Mona; Menna, Fabio; Sargeant, Ben; MacDonald, Lindsay; Remondino, Fabio; Robson, Stuart

    2015-05-01

    This paper aims to provide a procedure for improving automated 3D reconstruction methods via vision metrology. The 3D reconstruction problem is generally addressed using two different approaches. On the one hand, vision metrology (VM) systems try to accurately derive 3D coordinates of few sparse object points for industrial measurement and inspection applications; on the other, recent dense image matching (DIM) algorithms are designed to produce dense point clouds for surface representations and analyses. This paper strives to demonstrate a step towards narrowing the gap between traditional VM and DIM approaches. Efforts are therefore intended to (i) test the metric performance of the automated photogrammetric 3D reconstruction procedure, (ii) enhance the accuracy of the final results and (iii) obtain statistical indicators of the quality achieved in the orientation step. VM tools are exploited to integrate their main functionalities (centroid measurement, photogrammetric network adjustment, precision assessment, etc.) into the pipeline of 3D dense reconstruction. Finally, geometric analyses and accuracy evaluations are performed on the raw output of the matching (i.e. the point clouds) by adopting a metrological approach. The latter is based on the use of known geometric shapes and quality parameters derived from VDI/VDE guidelines. Tests are carried out by imaging the calibrated Portable Metric Test Object, designed and built at University College London (UCL), UK. It allows assessment of the performance of the image orientation and matching procedures within a typical industrial scenario, characterised by poor texture and known 3D/2D shapes.

  7. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
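
    The statistical comparison described above can be reproduced in a few lines; the sketch below runs a one-way ANOVA across three hypothetical sets of vendor volume measurements (the values are placeholders, not data from the study).

```python
# Sketch: compare nodule measurements from three automated tools with a one-way ANOVA.
from scipy import stats

vendor_a = [310.0, 415.2, 520.8, 128.4, 990.1]    # nodule volumes, mm^3 (hypothetical)
vendor_b = [305.5, 420.9, 515.2, 130.0, 1001.7]
vendor_c = [298.7, 410.3, 530.6, 127.1, 985.4]

f_stat, p_value = stats.f_oneway(vendor_a, vendor_b, vendor_c)
print(f"ANOVA on volumes: F = {f_stat:.3f}, p = {p_value:.3f}")
# a non-significant p here would support volume as the more reproducible metric
```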

  8. An automated software for analysis of experimental data on decay heat from spent nuclear fuel

    OpenAIRE

    Llerena Herrera, Isbel

    2012-01-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) has developed a method for final disposal of spent nuclear fuel. This technique requires accurate measurement of the residual decay heat of every assembly. For this purpose, depletion codes as well as calorimetric and gamma-ray spectroscopy experimental methods have been developed and evaluated. In this work a prototype analysis tool has been developed to automate the analysis of both calorimetric and gamma-ray spectroscopy measureme...

  9. Sleep-spindle detection: crowdsourcing and evaluating performance of experts, non-experts and automated methods

    DEFF Research Database (Denmark)

    Warby, Simon C.; Wendt, Sabrina Lyngbye; Welinder, Peter;

    2014-01-01

    We crowdsourced spindle identification by human experts and non-experts, and we compared their performance with that of automated detection algorithms in data from middle- to older-aged subjects from the general population. We also refined methods for forming group consensus and evaluating the performance of event detectors in physiological data such as electroencephalographic recordings from polysomnography. Compared to the expert group consensus gold standard, the highest performance was by individual experts and the non-expert group consensus, followed by automated spindle detectors. This analysis showed that crowdsourcing the scoring of sleep data is an efficient method to collect large data sets, even for difficult tasks such as spindle identification. Further refinements to spindle detection algorithms are needed for middle- to older-aged subjects.

  10. Fast reversible single-step method for enhanced band contrast of polyacrylamide gels for automated detection.

    Science.gov (United States)

    Ling, Wei-Li; Lua, Wai-Heng; Gan, Samuel Ken-En

    2015-05-01

    Staining of SDS-PAGE gels is commonly used in protein analysis for many downstream characterization processes. Although staining and destaining protocols can be adjusted, they can be laborious, and faint bands often become false negatives. Similarly, these faint bands hinder the automated software band detection that is necessary for quantitative analyses. To overcome these problems, we describe a single-step, rapid and reversible method to increase band contrast (by up to 500%) in stained gels. Through the use of alcohols, we improved band detection and facilitated gel storage by drying the gels into compact white sheets. This method is suitable for all stained SDS-PAGE gels, including gradient gels, and is shown to improve automated band detection through enhanced band contrast. PMID:25782090

  11. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The presented pipeline deals with i) estimation of the mid-sagittal plane, ii) localisation and registration of the corpus callosum, iii) parameterisation and representation of its contour, and iv) means of standardising the traditional reference area measurements.

  12. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in a experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
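
    The first stage of the framework (locating high particle density in the space-time domain via maximum extremum detection) can be approximated with standard tools; the sketch below finds local maxima in a binned density array with scipy, with the window size and threshold as assumptions.

```python
# Sketch: flag high-density regions in a 2-D (space-time) particle density histogram
# by locating local maxima above a threshold, analogous to the framework's first step.
import numpy as np
from scipy import ndimage

def density_maxima(density, window=5, threshold=None):
    """density: 2-D array of binned particle counts (e.g. longitudinal position vs. time)."""
    threshold = density.mean() + 3 * density.std() if threshold is None else threshold
    local_max = ndimage.maximum_filter(density, size=window) == density
    peaks = np.argwhere(local_max & (density > threshold))
    return peaks                     # array of (row, col) indices of candidate beam locations
```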

  13. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in a experiment that may contain a high quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets

  14. Automated Imaging and Analysis of the Hemagglutination Inhibition Assay.

    Science.gov (United States)

    Nguyen, Michael; Fries, Katherine; Khoury, Rawia; Zheng, Lingyi; Hu, Branda; Hildreth, Stephen W; Parkhill, Robert; Warren, William

    2016-04-01

    The hemagglutination inhibition (HAI) assay quantifies the level of strain-specific influenza virus antibody present in serum and is the standard by which influenza vaccine immunogenicity is measured. The HAI assay endpoint requires real-time monitoring of rapidly evolving red blood cell (RBC) patterns for signs of agglutination at a rate of potentially thousands of patterns per day to meet the throughput needs for clinical testing. This analysis is typically performed manually through visual inspection by highly trained individuals. However, concordant HAI results across different labs are challenging to demonstrate due to analyst bias and variability in analysis methods. To address these issues, we have developed a bench-top, standalone, high-throughput imaging solution that automatically determines the agglutination states of up to 9600 HAI assay wells per hour and assigns HAI titers to 400 samples in a single unattended 30-min run. Images of the tilted plates are acquired as a function of time and analyzed using algorithms that were developed through comprehensive examination of manual classifications. Concordance testing of the imaging system with eight different influenza antigens demonstrates 100% agreement between automated and manual titer determination with a percent difference of ≤3.4% for all cases. PMID:26464422

  15. Morphological observation and analysis using automated image cytometry for the comparison of trypan blue and fluorescence-based viability detection method

    OpenAIRE

    Chan, Leo Li-Ying; Kuksin, Dmitry; Laverty, Daniel J.; Saldi, Stephanie; Qiu, Jean

    2014-01-01

    The ability to accurately determine cell viability is essential to performing a well-controlled biological experiment. Typical experiments range from standard cell culturing to advanced cell-based assays that may require cell viability measurement for downstream experiments. The traditional cell viability measurement method has been the trypan blue (TB) exclusion assay. However, since the introduction of fluorescence-based dyes for cell viability measurement using flow or image-based cytometr...

  16. Automated striatal uptake analysis of 18F-FDOPA PET images applied to Parkinson's disease patients

    International Nuclear Information System (INIS)

    6-[18F]Fluoro-L-DOPA (FDOPA) is a radiopharmaceutical valuable for assessing the presynaptic dopaminergic function when used with positron emission tomography (PET). More specifically, the striatal-to-occipital ratio (SOR) of FDOPA uptake images has been extensively used as a quantitative parameter in these PET studies. Our aim was to develop an easy, automated method capable of performing objective analysis of SOR in FDOPA PET images of Parkinson's disease (PD) patients. Brain images from FDOPA PET studies of 21 patients with PD and 6 healthy subjects were included in our automated striatal analyses. Images of each individual were spatially normalized to an FDOPA template. Subsequently, the image slice with the highest level of basal ganglia activity was chosen among the series of normalized images. The immediately preceding and following slices of the chosen image were then also selected. Finally, the summation of these three images was used to quantify and calculate the SOR values. The results obtained by automated analysis were compared with manual analysis by a trained and experienced image processing technologist. The SOR values obtained from the automated analysis had good agreement and high correlation with manual analysis. The differences in caudate, putamen, and striatum were -0.023, -0.029, and -0.025, respectively; correlation coefficients 0.961, 0.957, and 0.972, respectively. We have successfully developed a method for automated striatal uptake analysis of FDOPA PET images. There was no significant difference between the SOR values obtained from this method and those from manual analysis. Moreover, it is an unbiased, time-saving and cost-effective program that is easy to implement on a personal computer. (author)
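
    The ROI definitions and template are not given in the abstract; the sketch below assumes a spatially normalised image volume and pre-defined boolean masks, and reproduces the described slice selection and SOR calculation.

```python
# Sketch: compute a striatal-to-occipital ratio (SOR) from spatially normalised FDOPA
# images, summing the slice with peak basal-ganglia activity and its two neighbours.
import numpy as np

def compute_sor(volume, striatum_mask, occipital_mask, bg_mask):
    """volume: (slices, rows, cols) array; masks: 2-D boolean arrays in template space."""
    activity = np.array([vol_slice[bg_mask].mean() for vol_slice in volume])
    k = int(np.argmax(activity))                       # slice with highest basal-ganglia uptake
    k = min(max(k, 1), volume.shape[0] - 2)            # keep the 3-slice window inside the volume
    summed = volume[k - 1:k + 2].sum(axis=0)           # chosen slice plus its two neighbours
    return summed[striatum_mask].mean() / summed[occipital_mask].mean()
```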

  17. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  18. Automated Method for Monitoring Water Quality Using Landsat Imagery

    Directory of Open Access Journals (Sweden)

    D. Clay Barrett

    2016-06-01

    Full Text Available Regular monitoring of water quality is increasingly necessary to keep pace with rapid environmental change and protect human health and well-being. Remote sensing has been suggested as a potential solution for monitoring certain water quality parameters without the need for in situ sampling, but universal methods and tools are lacking. While many studies have developed predictive relationships between remotely sensed surface reflectance and water parameters, these relationships are often unique to a particular geographic region and have little applicability in other areas. In order to remotely monitor water quality, these relationships must be developed on a region by region basis. This paper presents an automated method for processing remotely sensed images from Landsat Thematic Mapper (TM) and Enhanced Thematic Mapper Plus (ETM+) and extracting corrected reflectance measurements around known sample locations to allow rapid development of predictive water quality relationships to improve remote monitoring. Using open Python scripting, this study (1) provides an openly accessible and simple method for processing publicly available remote sensing data; and (2) allows determination of relationships between sampled water quality parameters and reflectance values to ultimately allow predictive monitoring. The method is demonstrated through a case study of the Ozark/Ouchita-Appalachian ecoregion in eastern Oklahoma using data collected for the Beneficial Use Monitoring Program (BUMP).
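
    The paper's actual scripts are not included in the record; the following sketch, with placeholder file names and station coordinates assumed to be in the raster's coordinate system, shows the general pattern of sampling corrected reflectance at known locations and fitting a simple predictive relationship.

```python
# Sketch: sample corrected reflectance values at known monitoring stations from a
# processed Landsat band, then pair them with measured water quality values.
import rasterio
import numpy as np

stations = {"site_01": (254310.0, 3965120.0), "site_02": (260875.0, 3958440.0)}  # placeholders

with rasterio.open("LT05_corrected_band3.tif") as src:                # placeholder file name
    reflectance = {name: float(next(src.sample([xy]))[0]) for name, xy in stations.items()}

measured_chla = {"site_01": 12.4, "site_02": 8.9}          # hypothetical in situ samples
x = np.array([reflectance[s] for s in stations])
y = np.array([measured_chla[s] for s in stations])
slope, intercept = np.polyfit(x, y, 1)                     # simple predictive relationship
print(slope, intercept)
```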

  19. Automating with ROBOCOM. An expert system for complex engineering analysis

    International Nuclear Information System (INIS)

    Nuclear engineering analysis is automated with the help of preprocessors and postprocessors. All the analysis and processing steps are recorded in a form that is reportable and replayable. These recordings serve both as documentations and as robots, for they are capable of performing the analyses they document. Since the processors and robots in ROBOCOM interface the users in a way independent of the analysis program being used, it is now possible to unify input modeling for programs with similar functionality. ROBOCOM will eventually evolve into an encyclopedia of how every nuclear engineering analysis is performed

  20. The automation of analysis of technological process effectiveness

    Directory of Open Access Journals (Sweden)

    B. Krupińska

    2007-10-01

    Full Text Available Purpose: Improvement of technological processes by the use of technological efficiency analysis can create the basis of their optimization. Informatization and computerization of an ever wider scope of activity is one of the most important current development trends of an enterprise. Design/methodology/approach: The appointment of indicators makes it possible to evaluate process efficiency, which can constitute an optimization basis for a particular operation. The model of technological efficiency analysis is based on particular efficiency indicators that characterize an operation, taking into account the following criteria: operation – material, operation – machine, operation – human, operation – technological parameters. Findings: From the point of view of quality and correctness of the choice of technology, comprehensive assessment of technological processes makes up the basis of technological efficiency analysis. Results of the technological efficiency analysis of a technological process prove that the chosen model of technological efficiency analysis makes it possible to improve the process continuously through technological analysis, and the application of computer assistance makes it possible to automate the process of efficiency analysis and, finally, the controlled improvement of technological processes. Practical implications: Because of the complexity of technological efficiency analysis, an AEPT computer analysis was created, which yields: operation efficiency indicators with distinguished indicators having minimal acceptable values, efficiency values of the applied samples, and the value of technological process efficiency. Originality/value: The created computer analysis of technological process efficiency (AEPT) makes it possible to automate the process of analysis and optimization.

  1. Histogram analysis with automated extraction of brain-tissue region from whole-brain CT images

    OpenAIRE

    Kondo, Masatoshi; Yamashita, Koji; Yoshiura, Takashi; Hiwatash, Akio; Shirasaka, Takashi; Arimura, Hisao; Nakamura, Yasuhiko; Honda, Hiroshi

    2015-01-01

    We studied whether automated extraction of the brain-tissue region from whole-brain CT images is useful for histogram analysis of the brain-tissue region. We used the CT images of 11 patients. We developed an automatic brain-tissue extraction algorithm. We evaluated the similarity index of this automated extraction method relative to manual extraction, and we compared the mean CT number of all extracted pixels and the kurtosis and skewness of the distribution of CT numbers of all ext...

  2. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, a sample's content and its concentration can be determined; the analysis is known as Quantitative Spectrographic Analysis. The Quantitative Spectrographic Analysis is carried out in 3 steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the data obtained is applied to obtain a graph. It is possible to determine the density of a dark spectral line against the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element against incident light intensity are plotted. Since the sample contains several elements, it is necessary to find a working curve for each one of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. The automatic data acquisition, calculation and reporting of results is done by means of a computer (PC) and a computer program. The signal conditioning circuits have the function of delivering TTL levels (Transistor Transistor Logic) to make the communication between the microphotometer and the computer possible. Data calculation is done using a computer program
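
    The three steps lend themselves to a short numerical sketch; the one below uses least-squares fits with entirely made-up calibration and standard data to illustrate the emulsion curve, the working curve and the final concentration estimate.

```python
# Sketch: least-squares emulsion calibration followed by a working curve; all numbers
# are made-up examples (densities from the microphotometer, intensities/concentrations
# from reference exposures and standards).
import numpy as np

# 1. emulsion calibration: line density vs. log of incident intensity (least-squares fit)
log_intensity = np.log10([1.0, 2.0, 4.0, 8.0, 16.0])
density = np.array([0.12, 0.35, 0.61, 0.88, 1.10])
cal = np.polyfit(log_intensity, density, 1)             # slope and intercept of the emulsion curve

def intensity_from_density(d):
    return 10 ** ((d - cal[1]) / cal[0])

# 2. working curve: known concentrations of the element vs. (calibrated) intensity
conc = np.array([0.01, 0.05, 0.10, 0.50])                # percent, standards
work_int = np.array([3.1, 14.8, 29.5, 148.0])
work = np.polyfit(np.log10(work_int), np.log10(conc), 1)

# 3. analytical result: density of the sample's line -> intensity -> concentration
sample_intensity = intensity_from_density(0.74)
print(10 ** np.polyval(work, np.log10(sample_intensity)))
```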

  3. Automated optics inspection analysis for NIF

    International Nuclear Information System (INIS)

    The National Ignition Facility (NIF) is a high-energy laser facility comprised of 192 beamlines that house thousands of optics. These optics guide, amplify and tightly focus light onto a tiny target for fusion ignition research and high energy density physics experiments. The condition of these optics is key to the economic, efficient and maximally energetic performance of the laser. Our goal, and novel achievement, is to find on the optics any imperfections while they are tens of microns in size, track them through time to see if they grow and if so, remove the optic and repair the single site so the entire optic can then be re-installed for further use on the laser. This paper gives an overview of the image analysis used for detecting, measuring, and tracking sites of interest on an optic while it is installed on the beamline via in situ inspection and after it has been removed for maintenance. In this way, the condition of each optic is monitored throughout the optic's lifetime. This overview paper will summarize key algorithms and technical developments for custom image analysis and processing and highlight recent improvements. (Associated papers will include more details on these issues.) We will also discuss the use of OI Analysis for daily operation of the NIF laser and its extension to inspection of NIF targets.

  4. Automated logic conversion method for plant controller systems

    International Nuclear Information System (INIS)

    An automated method is proposed for logic conversion from functional description diagrams to detailed logic schematics by incorporating expertise knowledge in plant controller systems design. The method uses connection data of function elements in the functional description diagram as input, and synthesizes a detailed logic structure by adding elements to the given connection data incrementally, and to generate detailed logic schematics. In logic synthesis, for building up complex synthesis procedures by combining generally-described knowledge, knowledge is applied by groups. The search order of the groups is given by upper-level knowledge. Furthermore, the knowledge is expressed in terms of two classes of rules; one for generating a hypothesis of individual synthesis operations and the other for considering several hypotheses to determine the connection ordering of elements to be added. In the generation of detailed logic schematics, knowledge is used as rules for deriving various kinds of layout conditions on schematics, and rules for generating two-dimensional coordinates of layout objects. Rules in the latter class use layout conditions to predict intersections among layout objects without their coordinates being fixed. The effectiveness of the method with 150 rules was verified by its experimental application to some logic conversions in a real power plant design. Evaluation of the results showed them to be equivalent to those obtained by well qualified designers. (author)

  5. Automated simultaneous analysis phylogenetics (ASAP): an enabling tool for phylogenomics

    Directory of Open Access Journals (Sweden)

    Lee Ernest K

    2008-02-01

    Full Text Available Abstract Background The availability of sequences from whole genomes to reconstruct the tree of life has the potential to enable the development of phylogenomic hypotheses in ways that have not been possible before. A significant bottleneck in the analysis of genomic-scale views of the tree of life is the time required for manual curation of genomic data into multi-gene phylogenetic matrices. Results To keep pace with the exponentially growing volume of molecular data in the genomic era, we have developed an automated technique, ASAP (Automated Simultaneous Analysis Phylogenetics), to assemble these multi-gene/multi-species matrices and to evaluate the significance of individual genes within the context of a given phylogenetic hypothesis. Conclusion Applications of ASAP may enable scientists to re-evaluate species relationships and to develop new phylogenomic hypotheses based on genome-scale data.

  6. Method and system for assigning a confidence metric for automated determination of optic disc location

    Science.gov (United States)

    Karnowski, Thomas P.; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya; Chaum, Edward

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
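
    The patent abstract leaves the accuracy parameter unspecified; the sketch below assumes it is the disagreement (Euclidean distance) between the two detections and uses a placeholder cut-off, purely to illustrate the confidence assignment logic.

```python
# Sketch: assign a confidence level from two independent optic-disc detections,
# using their disagreement (in pixels) as the accuracy parameter; the cut-off value
# is a placeholder, not the patented method itself.
import math

def confidence_level(coords_a, coords_b, primary_risk_cutoff=25.0):
    """coords_a, coords_b: (x, y) optic-disc locations from two different techniques."""
    accuracy = math.dist(coords_a, coords_b)     # disagreement between the two techniques
    return "high" if accuracy < primary_risk_cutoff else "low"

print(confidence_level((312, 240), (318, 236)))  # small disagreement -> high confidence
```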

  7. RFI detection by automated feature extraction and statistical analysis

    OpenAIRE

    Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA-technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high speed data storage of spectra on time scales of less than a second. The high dynamic range of the device assures constant calibration even during extremely powerful RFI events. The software uses an algorit...

  8. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    This paper describes three successive studies on ageing of protection automation of nuclear power plants. These studies were aimed at developing a methodology for an experience based ageing analysis, and applying it to identify the most critical components from ageing and safety points of view. The analyses resulted also to suggestions for improvement of data collection systems for the purpose of further ageing analyses. (author)

  9. Automated tumor analysis for molecular profiling in lung cancer.

    Science.gov (United States)

    Hamilton, Peter W; Wang, Yinhai; Boyd, Clinton; James, Jacqueline A; Loughrey, Maurice B; Hougton, Joseph P; Boyle, David P; Kelly, Paul; Maxwell, Perry; McCleary, David; Diamond, James; McArt, Darragh G; Tunstall, Jonathon; Bankhead, Peter; Salto-Tellez, Manuel

    2015-09-29

    The discovery and clinical application of molecular biomarkers in solid tumors, increasingly relies on nucleic acid extraction from FFPE tissue sections and subsequent molecular profiling. This in turn requires the pathological review of haematoxylin & eosin (H&E) stained slides, to ensure sample quality, tumor DNA sufficiency by visually estimating the percentage tumor nuclei and tumor annotation for manual macrodissection. In this study on NSCLC, we demonstrate considerable variation in tumor nuclei percentage between pathologists, potentially undermining the precision of NSCLC molecular evaluation and emphasising the need for quantitative tumor evaluation. We subsequently describe the development and validation of a system called TissueMark for automated tumor annotation and percentage tumor nuclei measurement in NSCLC using computerized image analysis. Evaluation of 245 NSCLC slides showed precise automated tumor annotation of cases using Tissuemark, strong concordance with manually drawn boundaries and identical EGFR mutational status, following manual macrodissection from the image analysis generated tumor boundaries. Automated analysis of cell counts for % tumor measurements by Tissuemark showed reduced variability and significant correlation (p tissue samples for molecular profiling in discovery and diagnostics. PMID:26317646

  10. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  11. Fuzzy emotional semantic analysis and automated annotation of scene images.

    Science.gov (United States)

    Cao, Jianfang; Chen, Lichao

    2015-01-01

    With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance. PMID:25838818

  12. Automated eddy current analysis of materials

    Science.gov (United States)

    Workman, Gary L.

    1991-01-01

    The use of eddy current techniques for characterizing flaws in graphite-based filament-wound cylindrical structures is described. A major emphasis was also placed upon incorporating artificial intelligence techniques into the signal analysis portion of the inspection process. Developing an eddy current scanning system using a commercial robot for inspecting graphite structures (and others) was a goal in the overall concept and is essential for the final implementation for the expert systems interpretation. Manual scans, as performed in the preliminary work here, do not provide sufficiently reproducible eddy current signatures to be easily built into a real time expert system. The expert systems approach to eddy current signal analysis requires that a suitable knowledge base exist in which correct decisions as to the nature of a flaw can be performed. A robotic workcell using eddy current transducers for the inspection of carbon filament materials with improved sensitivity was developed. Improved coupling efficiencies achieved with the E-probes and horseshoe probes are exceptional for graphite fibers. The eddy current supervisory system and expert system was partially developed on a MacIvory system. Continued utilization of finite element models for predetermining eddy current signals was shown to be useful in this work, both for understanding how electromagnetic fields interact with graphite fibers, and also for use in determining how to develop the knowledge base. Sufficient data was taken to indicate that the E-probe and the horseshoe probe can be useful eddy current transducers for inspecting graphite fiber components. The lacking component at this time is a large enough probe to have sensitivity in both the far and near field of a thick graphite epoxy component.

  13. Development of methods for DSM and distribution automation planning

    International Nuclear Information System (INIS)

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both level and load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional ways are impossible due to the conflicting interests of the generation, network and supply businesses and increased competition between different actors in the market. Costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, the suppliers are diversifying to different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities

  14. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Kaerkkaeinen, S.; Kekkonen, V. [VTT Energy, Espoo (Finland); Rissanen, P. [Tietosavo Oy (Finland)

    1998-08-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both level and load variation). It includes basic options such as strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities.

  15. Automated identification of mitochondrial regions in complex intracellular space by texture analysis

    Science.gov (United States)

    Pham, Tuan D.

    2014-01-01

    Automated processing and quantification of biological images have been rapidly attracting the attention of researchers in image processing and pattern recognition because computerized image and pattern analyses play critical roles in new biological findings and drug discovery based on modern high-throughput and high-content image screening. This paper presents a study of the automated detection of regions of mitochondria, a subcellular structure of eukaryotic cells, in microscopy images. The automated identification of mitochondria in intracellular space captured by the state-of-the-art combination of focused ion beam and scanning electron microscope imaging, reported here, is the first of its type. Existing methods and a proposed algorithm for texture analysis were tested with real intracellular images. The high rate of correctly detecting the locations of the mitochondria in a complex environment suggests the effectiveness of the proposed approach.
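
    The specific texture algorithm is not described in this record, so the sketch below illustrates only the general principle of texture-based region detection: compute a local texture measure in a sliding window and flag high-texture pixels as organelle candidates. The window size, texture statistic and threshold quantile are all illustrative assumptions.

```python
# Generic texture-based region detector sketched with numpy only; the actual
# algorithm of the study is not reproduced here.
import numpy as np

def local_texture_map(image, win=15):
    """Per-pixel local standard deviation as a simple texture measure."""
    img = image.astype(float)
    pad = win // 2
    padded = np.pad(img, pad, mode="reflect")
    h, w = img.shape
    std_map = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            std_map[i, j] = padded[i:i + win, j:j + win].std()
    return std_map

def candidate_mask(image, win=15, quantile=0.9):
    """Flag pixels whose local texture exceeds a quantile threshold."""
    tex = local_texture_map(image, win)
    return tex > np.quantile(tex, quantile)

# Toy usage on a synthetic image containing one textured patch.
rng = np.random.default_rng(1)
img = np.full((64, 64), 100.0)
img[20:40, 20:40] += rng.normal(0, 20, size=(20, 20))   # "organelle-like" texture
print(candidate_mask(img).sum(), "candidate pixels")
```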

  16. Analysis of Defense Language Institute automated student questionnaire data

    OpenAIRE

    Strycharz, Theodore M.

    1996-01-01

    This thesis explores the dimensionality of the Defense Language Institute's (DLI) primary student feedback tool - the Automated Student Questionnaire (ASQ). In addition, a data set from ASQ 2.0 (the newest version) is analyzed for trends in student satisfaction across the sub-scales of sex, pay grade, and Defense Language Proficiency Test (DLPT) results. The method of principal components is used to derive initial factors. Although an interpretation of those factors seems plausible, these are...

  17. Artificial neural networks for automation of Rutherford backscattering spectroscopy experiments and data analysis

    International Nuclear Information System (INIS)

    We present an algorithm based on artificial neural networks able to determine optimized experimental conditions for Rutherford backscattering measurements of Ge-implanted Si. The algorithm can be implemented for any other element implanted into a lighter substrate, and it is foreseeable that the method developed in this work can be applied to many other systems. The algorithm presented is a push-button black box and does not require any human intervention. It is thus suited for automated control of an experimental setup, given an interface to the relevant hardware. Once the experimental conditions are optimized, the algorithm analyzes the final data obtained and determines the desired parameters. The method is thus also suited for automated analysis of the data. The algorithm presented can be easily extended to other ion beam analysis techniques. Finally, it is suggested how the artificial neural networks required for automated control and analysis of experiments could be automatically generated, which would be suited for automated generation of the required computer code. RBS could thus be done without experimentalists, data analysts, or programmers, with only technicians to keep the machines running

  18. An automated 3D reconstruction method of UAV images

    Science.gov (United States)

    Liu, Jun; Wang, He; Liu, Xiaoyang; Li, Feng; Sun, Guangtong; Song, Ping

    2015-10-01

    In this paper a novel, fully automated 3D reconstruction approach based on low-altitude unmanned aerial vehicle (UAV) images is presented, which does not require previous camera calibration or any other external prior knowledge. Dense 3D point clouds are generated by integrating orderly feature extraction, image matching, structure from motion (SfM) and multi-view stereo (MVS) algorithms, overcoming many of the cost and time limitations of rigorous photogrammetry techniques. An image topology analysis strategy is introduced to speed up large scene reconstruction by taking advantage of the flight-control data acquired by the UAV. The image topology map can significantly reduce the running time of feature matching by limiting the combinations of images. A high-resolution digital surface model of the study area is produced based on the UAV point clouds by constructing a triangular irregular network. Experimental results show that the proposed approach is robust and feasible for automatic 3D reconstruction of low-altitude UAV images and has great potential for the acquisition of spatial information for large-scale mapping, especially for rapid response and precise modelling in disaster emergencies.
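
    The image-topology idea, restricting feature matching to images that were captured close together, can be sketched roughly as follows. The distance threshold and the position format are assumptions; a real implementation would read them from the UAV flight-control log.

```python
# Sketch of an image topology map under stated assumptions: only image pairs
# captured within a given distance of each other are passed to feature matching.
import itertools
import math

def neighbour_pairs(capture_positions, max_dist_m=60.0):
    """capture_positions: {image_id: (easting_m, northing_m)}.
    Returns the candidate image pairs for feature matching."""
    pairs = []
    for (a, pa), (b, pb) in itertools.combinations(capture_positions.items(), 2):
        if math.dist(pa, pb) <= max_dist_m:
            pairs.append((a, b))
    return pairs

positions = {"IMG_001": (0.0, 0.0), "IMG_002": (35.0, 5.0), "IMG_003": (400.0, 10.0)}
print(neighbour_pairs(positions))   # only nearby images are matched
```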

  19. Implicit media frames: Automated analysis of public debate on artificial sweeteners

    CERN Document Server

    Hellsten, Iina; Leydesdorff, Loet

    2010-01-01

    The framing of issues in the mass media plays a crucial role in the public understanding of science and technology. This article contributes to research concerned with diachronic analysis of media frames by making an analytical distinction between implicit and explicit media frames, and by introducing an automated method for analysing diachronic changes of implicit frames. In particular, we apply a semantic maps method to a case study on the newspaper debate about artificial sweeteners, published in The New York Times (NYT) between 1980 and 2006. Our results show that the analysis of semantic changes enables us to filter out the dynamics of implicit frames, and to detect emerging metaphors in public debates. Theoretically, we discuss the relation between implicit frames in public debates and codification of information in scientific discourses, and suggest further avenues for research interested in the automated analysis of frame changes and trends in public debates.

  20. Automated shock detection and analysis algorithm for space weather application

    Science.gov (United States)

    Vorotnikov, Vasiliy S.; Smith, Charles W.; Hu, Qiang; Szabo, Adam; Skoug, Ruth M.; Cohen, Christina M. S.

    2008-03-01

    Space weather applications have grown steadily as real-time data have become increasingly available. Numerous industrial applications have arisen, with safeguarding of the power distribution grids being a particular interest. NASA uses short-term and long-term space weather predictions in its launch facilities. Researchers studying ionospheric, auroral, and magnetospheric disturbances use real-time space weather services to determine launch times. Commercial airlines, communication companies, and the military use space weather measurements to manage their resources and activities. As the effects of solar transients upon the Earth's environment and society grow with the increasing complexity of technology, better tools are needed to monitor and evaluate the characteristics of the incoming disturbances. There is a need for automated shock detection and analysis methods that are applicable to in situ measurements upstream of the Earth. Such tools can provide advance warning of approaching disturbances that have significant space weather impacts. Knowledge of the shock strength and speed can also provide insight into the nature of the approaching solar transient prior to its arrival at the magnetopause. We report on efforts to develop a tool that can find and analyze shocks in interplanetary plasma data without operator intervention. This method runs with sufficient speed to be a practical space weather tool, providing useful shock information within 1 min of the necessary data reaching the ground. The ability to run without human intervention frees space weather operators to perform other vital services. We describe ways of handling upstream data that minimize the frequency of false positive alerts while providing the most complete description of approaching disturbances that is reasonably possible.
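
    The report's actual selection criteria are not given here, but the general idea of an operator-free shock finder can be sketched as a sliding-window jump detector: flag samples where density, speed and field magnitude all increase by more than chosen factors. The window length and ratio thresholds below are illustrative only.

```python
# Simplified sliding-window shock-candidate detector (not the tool's criteria):
# compare medians of each quantity before and after a candidate sample.
import numpy as np

def shock_candidates(density, speed, bmag, half_win=10, min_ratio=(1.2, 1.02, 1.2)):
    idx = []
    for i in range(half_win, len(density) - half_win):
        before = slice(i - half_win, i)
        after = slice(i, i + half_win)
        ratios = (np.median(density[after]) / np.median(density[before]),
                  np.median(speed[after]) / np.median(speed[before]),
                  np.median(bmag[after]) / np.median(bmag[before]))
        if all(r >= m for r, m in zip(ratios, min_ratio)):
            idx.append(i)
    return idx

# Toy usage: a simultaneous step in all three quantities at sample 50.
n = np.r_[np.full(50, 5.0), np.full(50, 8.0)]
v = np.r_[np.full(50, 400.0), np.full(50, 450.0)]
b = np.r_[np.full(50, 5.0), np.full(50, 9.0)]
print(shock_candidates(n, v, b)[:3])
```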

  1. Automation on computer of the partial area method in the analysis of resonances induced by 'S' neutrons 2. with an interference term and extension of the method to the treatment of multi resonances (1963)

    International Nuclear Information System (INIS)

    This report deals with the numerical analysis, on an I.B.M. 7090 computer, of transmission resonances induced by 's' wave neutrons in time-of-flight experiments. The analysis method used is the partial area one. In this second part the interference term is taken into account. Modifications have been made to the programs and subroutines described in the first part in order to determine the resonant transmissions, and the related partial areas, from experimental raw data. The programs and subroutines that estimate the resonance parameters are also thoroughly described. The scope of the partial area method has been extended to cover the case where several resonances have to be treated simultaneously, provided they do not interfere. (authors)

  2. A fully automated plasma protein precipitation sample preparation method for LC-MS/MS bioanalysis.

    Science.gov (United States)

    Ma, Ji; Shi, Jianxia; Le, Hoa; Cho, Robert; Huang, Judy Chi-jou; Miao, Shichang; Wong, Bradley K

    2008-02-01

    This report describes the development and validation of a robust robotic system that fully integrates all peripheral devices needed for the automated preparation of plasma samples by protein precipitation. The liquid handling system consisted of a Tecan Freedom EVO 200 liquid handling platform equipped with an 8-channel liquid handling arm, two robotic plate-handling arms, and two plate shakers. Important additional components integrated into the platform were a robotic temperature-controlled centrifuge, a plate sealer, and a plate seal piercing station. These enabled unattended operation starting from a stock solution of the test compound, a set of test plasma samples and associated reagents. The stock solution of the test compound was used to prepare plasma calibration and quality control samples. Once calibration and quality control samples were prepared, precipitation of plasma proteins was achieved by addition of three volumes of acetonitrile. Integration of the peripheral devices allowed automated sequential completion of the centrifugation, plate sealing, piercing and supernatant transfer steps. The method produced a sealed, injection-ready 96-well plate of plasma extracts. Accuracy and precision of the automated system were satisfactory for the intended use: intra-day and inter-day precision were excellent (C.V.<5%), while intra-day and inter-day accuracies were acceptable (relative error<8%). The flexibility of the platform was sufficient to accommodate pharmacokinetic studies with different numbers of animals and time points. To the best of our knowledge, this represents the first complete automation of the protein precipitation method for plasma sample analysis. PMID:18226589

  3. Detailed interrogation of trypanosome cell biology via differential organelle staining and automated image analysis

    Directory of Open Access Journals (Sweden)

    Wheeler Richard J

    2012-01-01

    Background: Many trypanosomatid protozoa are important human or animal pathogens. The well defined morphology and precisely choreographed division of trypanosomatid cells make morphological analysis a powerful tool for analyzing the effect of mutations, chemical insults and changes between lifecycle stages. High-throughput image analysis of micrographs has the potential to accelerate collection of quantitative morphological data. Trypanosomatid cells have two large DNA-containing organelles, the kinetoplast (mitochondrial DNA) and the nucleus, which provide useful markers for morphometric analysis; however they need to be accurately identified and often lie in close proximity. This presents a technical challenge. Accurate identification and quantitation of the DNA content of these organelles is a central requirement of any automated analysis method. Results: We have developed a technique based on double staining of the DNA with a minor groove-binding stain (4',6-diamidino-2-phenylindole; DAPI) and a base pair-intercalating stain (propidium iodide, PI, or SYBR green), followed by color deconvolution. This allows the identification of kinetoplast and nuclear DNA in the micrograph based on whether the organelle has DNA with a more A-T or G-C rich composition. Following unambiguous identification of the kinetoplasts and nuclei, the resulting images are amenable to quantitative automated analysis of kinetoplast and nucleus number and DNA content. On this foundation we have developed a demonstrative analysis tool capable of automatically measuring kinetoplast and nucleus DNA content, size and position, and cell body shape, length and width. Conclusions: Our approach to DNA staining and automated quantitative analysis of trypanosomatid morphology accelerates analysis of trypanosomatid protozoa. We have validated this approach using Leishmania mexicana, Crithidia fasciculata, and wild-type and mutant Trypanosoma brucei. Automated analysis of T. brucei
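
    The record does not include the authors' deconvolution code, so the following is a generic two-stain linear unmixing sketch: each pixel's RGB value is projected onto two assumed stain colour vectors by least squares. In practice the DAPI and PI/SYBR green vectors would be calibrated from singly stained controls; the vectors below are placeholders.

```python
# Generic two-stain linear unmixing via least squares (numpy only).
import numpy as np

STAIN_A = np.array([0.20, 0.35, 0.90])   # assumed "blue-ish" DAPI-like vector
STAIN_B = np.array([0.85, 0.40, 0.15])   # assumed "red-ish" PI-like vector

def unmix(rgb_image):
    """rgb_image: (H, W, 3) float array. Returns per-pixel contributions of
    the two stains, obtained by projecting onto the stain basis."""
    h, w, _ = rgb_image.shape
    basis = np.stack([STAIN_A, STAIN_B], axis=1)           # (3, 2)
    flat = rgb_image.reshape(-1, 3).T                       # (3, N)
    coeffs, *_ = np.linalg.lstsq(basis, flat, rcond=None)   # (2, N)
    return coeffs[0].reshape(h, w), coeffs[1].reshape(h, w)

# Toy usage: a pixel field dominated by the DAPI-like stain.
img = np.tile(STAIN_A * 0.8, (4, 4, 1))
a_map, b_map = unmix(img)
print(a_map[0, 0], b_map[0, 0])
```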

  4. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. copyright 1999 American Vacuum Society

  5. Highway Electrification And Automation Technologies - Regional Impacts Analysis Project: Phase I: Baseline Scenario Data Analysis

    OpenAIRE

    Scag; Path

    1993-01-01

    The Highway Electrification and Automation Technologies Regional Impacts Analysis Project addresses the transportation-related problems of freeway congestion, air pollution, and dependence on fossil fuels in southern California. This report presents a documentation of the basis for the impacts analysis. It contains sections on data collected, baseline forecast for 2025, and electrification and automation specification scenarios. This report constitutes the final report for Phase I of the proj...

  6. Cost Analysis of an Automated and Manual Cataloging and Book Processing System.

    Science.gov (United States)

    Druschel, Joselyn

    1981-01-01

    Cost analysis of an automated network system and a manual system of cataloging and book processing indicates a 20 percent savings using automation. Per unit costs based on the average monthly automation rate are used for comparison. Higher manual system costs are attributed to staff costs. (RAA)

  7. A novel automated hydrophilic interaction liquid chromatography method using diode-array detector/electrospray ionization tandem mass spectrometry for analysis of sodium risedronate and related degradation products in pharmaceuticals.

    Science.gov (United States)

    Bertolini, Tiziana; Vicentini, Lorenza; Boschetti, Silvia; Andreatta, Paolo; Gatti, Rita

    2014-10-24

    A simple, sensitive and fast hydrophilic interaction liquid chromatography (HILIC) method using an ultraviolet diode-array detector (UV-DAD)/electrospray ionization tandem mass spectrometry was developed for the automated high performance liquid chromatography (HPLC) determination of sodium risedronate (SR) and its degradation products in new pharmaceuticals. The chromatographic separations were performed on an Ascentis Express HILIC 2.7 μm (150 mm × 2.1 mm i.d.) stainless steel column (fused core). The mobile phase consisted of formate buffer solution (pH 3.4; 0.03 M)/acetonitrile 42:58 and 45:55 (v/v) for granules for oral solution and effervescent tablet analysis, respectively, at a flow rate of 0.2 mL/min, with the detection wavelength set at 262 nm. The stability characteristics of SR were evaluated by performing stress test studies. The main degradation product formed under oxidation conditions, corresponding to sodium hydrogen (1-hydroxy-2-(1-oxidopyridin-3-yl)-1-phosphonoethyl)phosphonate, was characterized by high performance liquid chromatography-electrospray ionization tandem mass spectrometry (HPLC-ESI-MS/MS). The validation parameters such as linearity, sensitivity, accuracy, precision and selectivity were found to be highly satisfactory. Linear responses were observed in standard and in fortified placebo solutions. Intra-day precision (relative standard deviation, RSD) was ≤1.1% for peak area and ≤0.2% for retention times (tR), without significant differences between intra- and inter-day data. Recovery studies showed good results for all the examined compounds (from 98.7 to 101.0%) with RSD ranging from 0.6 to 0.7%. The limits of detection (LOD) and quantitation (LOQ) were 1 and 3 ng/mL, respectively. The high stability of standard and sample solutions at room temperature is an undoubted advantage of the method, allowing the simultaneous preparation of many samples and consecutive chromatographic analyses using an autosampler. The developed stability indicating

  8. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (the first 6-8 h post hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means to identify which infants can benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data that are time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
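
    A minimal sketch of the described pipeline, under stated assumptions: compute a discretized CWT by convolving the EEG with Ricker ("Mexican hat") wavelets at several widths, sum the squared coefficients as a power measure, and threshold it. The width range and threshold factor are illustrative, not the values used in the study.

```python
# Discretized CWT spike detector sketched with numpy only.
import numpy as np

def ricker(points, a):
    """Ricker (Mexican hat) wavelet of width a, sampled at `points` locations."""
    t = np.arange(points) - (points - 1) / 2.0
    amp = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
    return amp * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

def cwt_power(signal, widths):
    """Convolve with Ricker wavelets at several widths and sum squared coefficients."""
    power = np.zeros_like(signal, dtype=float)
    for a in widths:
        kernel = ricker(min(10 * int(a), len(signal)), a)
        power += np.convolve(signal, kernel, mode="same") ** 2
    return power

def detect_spikes(eeg, widths=range(1, 16), thresh_factor=5.0):
    power = cwt_power(np.asarray(eeg, dtype=float), widths)
    threshold = power.mean() + thresh_factor * power.std()
    return np.flatnonzero(power > threshold)

# Toy usage: background noise plus one sharp transient.
rng = np.random.default_rng(2)
sig = rng.normal(0, 1, 2000)
sig[1000:1010] += 8.0                     # injected "spike"
print(detect_spikes(sig))
```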

  9. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

    Background: Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large amounts of microarray data are now available in several databases, and new experiments are constantly being performed. In spite of this fact, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results: To address these problems we have developed the automated microarray data analysis (AMDA) software, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps of a full data analysis, including image analysis, quality control, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally, a LaTeX document is dynamically generated depending on the analysis steps performed. The generated report contains comments and analysis results as well as references to several files for deeper investigation. Conclusion: AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded in the Services/Bioinformatics section of the Genopolis http://www.genopolis.it/

  10. Colorimetric determination of nitrate plus nitrite in water by enzymatic reduction, automated discrete analyzer methods

    Science.gov (United States)

    Patton, Charles J.; Kryskalla, Jennifer R.

    2011-01-01

    This report documents work at the U.S. Geological Survey (USGS) National Water Quality Laboratory (NWQL) to validate enzymatic reduction, colorimetric determinative methods for nitrate + nitrite in filtered water by automated discrete analysis. In these standard- and low-level methods (USGS I-2547-11 and I-2548-11), nitrate is reduced to nitrite with nontoxic, soluble nitrate reductase rather than toxic, granular, copperized cadmium used in the longstanding USGS automated continuous-flow analyzer methods I-2545-90 (NWQL laboratory code 1975) and I-2546-91 (NWQL laboratory code 1979). Colorimetric reagents used to determine resulting nitrite in aforementioned enzymatic- and cadmium-reduction methods are identical. The enzyme used in these discrete analyzer methods, designated AtNaR2 by its manufacturer, is produced by recombinant expression of the nitrate reductase gene from wall cress (Arabidopsis thaliana) in the yeast Pichia pastoris. Unlike other commercially available nitrate reductases we evaluated, AtNaR2 maintains high activity at 37°C and is not inhibited by high-phenolic-content humic acids at reaction temperatures in the range of 20°C to 37°C. These previously unrecognized AtNaR2 characteristics are essential for successful performance of discrete analyzer nitrate + nitrite assays (henceforth, DA-AtNaR2) described here.

  11. Automated method of tracing proton tracks in nuclear emulsions

    International Nuclear Information System (INIS)

    The low performance of the manual recognition of proton-recoil tracks in nuclear emulsions has limited its application to energy spectrum measurement of a pulsed neutron source. We developed an automated microscope system to trace proton-recoil tracks in nuclear emulsions. Given a start point on the proton track of interest, the microscope system can automatically trace and record the entire track using an image processing algorithm. Tests indicate that no interaction of the operator is needed in tracing the entire track. This automated microscope greatly reduces the labor of the operator and increases the efficiency of track data collection in nuclear emulsion

  12. Automated method of tracing proton tracks in nuclear emulsions

    Energy Technology Data Exchange (ETDEWEB)

    Ruan, Jin-lu, E-mail: rjl@mail.ustc.edu.cn [Northwest Institute of Nuclear Technology, P.O. box 69-9, Xi’an, Shaanxi 710024 (China); Li, Hong-yun; Song, Ji-wen [Northwest Institute of Nuclear Technology, P.O. box 69-9, Xi’an, Shaanxi 710024 (China); Zhang, Jian-fu, E-mail: zhang_jianfu@163.com [Northwest Institute of Nuclear Technology, P.O. box 69-9, Xi’an, Shaanxi 710024 (China); Academy of Nuclear Science and Technology, Xi’an Jiaotong University, Xi’an, Shaanxi 710049 (China); Chen, Liang; Zhang, Zhong-bing; Liu, Jin-liang; Liu, Lin-yue [Northwest Institute of Nuclear Technology, P.O. box 69-9, Xi’an, Shaanxi 710024 (China)

    2015-07-21

    The low performance of the manual recognition of proton-recoil tracks in nuclear emulsions has limited its application to energy spectrum measurement of a pulsed neutron source. We developed an automated microscope system to trace proton-recoil tracks in nuclear emulsions. Given a start point on the proton track of interest, the microscope system can automatically trace and record the entire track using an image processing algorithm. Tests indicate that no interaction of the operator is needed in tracing the entire track. This automated microscope greatly reduces the labor of the operator and increases the efficiency of track data collection in nuclear emulsion.

  13. Automated differential photometry of TAOS data: preliminary analysis

    CERN Document Server

    Ricci, D; Ayala, C; Ramón-Fox, F G; Michel, R; Navarro, S; Wang, S -Y; Zhang, Z -W; Lehner, M J; Nicastro, L; Reyes-Ruiz, M

    2014-01-01

    A preliminary data analysis of the stellar light curves obtained by the robotic telescopes of the TAOS project is presented. We selected a data run corresponding to one of the stellar fields observed by three of the four TAOS telescopes, and we investigated the common trend in, and the correlation between, the light curves. We propose two ways to remove these trends and show the preliminary results. A project aimed at flagging interesting behaviors, such as stellar variability, and at setting up an automated follow-up with the San Pedro Mártir facilities is underway.
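
    The two detrending approaches are not detailed in this record; one common-mode correction that fits the description is dividing each normalised light curve by the median light curve taken across stars at each epoch, sketched below with synthetic data.

```python
# Illustrative common-trend removal for simultaneous light curves (numpy only).
import numpy as np

def remove_common_trend(flux):
    """flux: (n_stars, n_epochs) array. Returns detrended, normalised flux."""
    norm = flux / np.nanmedian(flux, axis=1, keepdims=True)   # per-star normalisation
    common = np.nanmedian(norm, axis=0)                       # shared trend per epoch
    return norm / common

# Toy usage: 30 constant stars modulated by an airmass-like trend plus noise.
rng = np.random.default_rng(3)
trend = 1.0 + 0.05 * np.sin(np.linspace(0, 6, 200))
flux = rng.normal(1.0, 0.01, (30, 200)) * trend
print(remove_common_trend(flux).std())                        # residual scatter
```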

  14. Analysis and simulation of a torque assist automated manual transmission

    Science.gov (United States)

    Galvagno, E.; Velardocchia, M.; Vigliani, A.

    2011-08-01

    The paper presents the kinematic and dynamic analysis of a power-shift automated manual transmission (AMT) characterised by a wet clutch, called assist clutch (ACL), replacing the fifth gear synchroniser. This torque assist mechanism becomes a torque transfer path during gearshifts, in order to overcome a typical dynamic problem of the AMTs, that is the driving force interruption. The mean power contributions during gearshifts are computed for different engine and ACL interventions, thus allowing to draw considerations useful for developing the control algorithms. The simulation results prove the advantages in terms of gearshift quality and ride comfort of the analysed transmission.

  15. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla;

    2016-01-01

    PURPOSE: The aim of this study was to develop automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic...... agreement between the four pathologists and the VG app was κ=0.71. CONCLUSION: In conclusion, the Visiopharm VG app is able to measure the thickness of a sub-epithelial collagenous band in colon biopsies with an accuracy comparable to the performance of a pathologist and thereby provides a promising

  16. Semi-Automated Mapping for the Reflexion Method

    OpenAIRE

    Christl, Andreas

    2005-01-01

    A significant aspect in applying the Software Reflexion Model analysis is mapping of components found in the source code onto the conceptual components defined in the hypothesized architecture. To date, this mapping is done manually, which requires a lot of work for large software systems. This thesis evaluates if and how cluster analysis can leverage the manual mapping of source code artifacts. For this evaluation, the HuGMe method has been developed, in which assets of existing clusteri...

  17. Automating HIV Drug Resistance Genotyping with RECall, a Freely Accessible Sequence Analysis Tool

    OpenAIRE

    Woods, Conan K.; Chanson J Brumme; Liu, Tommy F; Chui, Celia K. S.; Chu, Anna L.; Wynhoven, Brian; Hall, Tom A.; Trevino, Christina; Shafer, Robert W; Harrigan, P. Richard

    2012-01-01

    Genotypic HIV drug resistance testing is routinely used to guide clinical decisions. While genotyping methods can be standardized, a slow, labor-intensive, and subjective manual sequence interpretation step is required. We therefore performed external validation of our custom software RECall, a fully automated sequence analysis pipeline. HIV-1 drug resistance genotyping was performed on 981 clinical samples at the Stanford Diagnostic Virology Laboratory. Sequencing trace files were first inte...

  18. SCHUBOT: Machine Learning Tools for the Automated Analysis of Schubert’s Lieder

    OpenAIRE

    Nagler, Dylan Jeremy

    2014-01-01

    This paper compares various methods for automated musical analysis, applying machine learning techniques to gain insight into the Lieder (art songs) of composer Franz Schubert (1797-1828). Known as a rule-breaking, individualistic, and adventurous composer, Schubert produced hundreds of emotionally-charged songs that have challenged music theorists to this day. The algorithms presented in this paper analyze the harmonies, melodies, and texts of these songs. This paper begins with an explor...

  19. The objective evaluation of the phase image: a comparison of different automated methods

    International Nuclear Information System (INIS)

    Patients with suspected or proven coronary artery disease were investigated using both X-ray ventriculography and equilibrium gated radionuclide angiography. In order to diagnose regional wall motion abnormalities, the parametric images obtained by Fourier analysis of the radionuclide images were analysed by different automated methods based on measuring the homogeneity of the phase values within the LV ROI. The effects of excluding diastolic frames, smoothing the original data, weighting the phase histogram, using Bacharach's error-corrected phase distribution functions, and using different descriptors of the spread of the phase histograms or distribution functions were tested. Receiver operating characteristic (ROC) curves were plotted for each method. The results show that the diagnostic value of the automated methods depends mainly on the way the histograms or distribution functions are described and, to a lesser extent, on the type of histograms or distribution functions used. The best result is obtained after smoothing, excluding diastolic frames, weighting the phase histogram by the amplitude and describing it by its standard deviation. Nevertheless, this result is not significantly better than the visual method. (author)
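
    The descriptor reported as best performing, the amplitude-weighted standard deviation of the phase values inside the LV ROI, can be written down directly; the toy arrays below are illustrative.

```python
# Amplitude-weighted spread of phase values within a region of interest.
import numpy as np

def weighted_phase_std(phase, amplitude, roi_mask):
    """phase, amplitude: per-pixel Fourier parametric images; roi_mask: boolean LV ROI."""
    p = phase[roi_mask].astype(float)
    w = amplitude[roi_mask].astype(float)
    mean = np.average(p, weights=w)
    var = np.average((p - mean) ** 2, weights=w)
    return np.sqrt(var)

# Toy usage: a homogeneous phase field with a small delayed (akinetic-like) segment.
phase = np.full((64, 64), 140.0)
phase[40:50, 40:50] = 200.0
amp = np.ones_like(phase)
roi = np.ones_like(phase, dtype=bool)
print(weighted_phase_std(phase, amp, roi))
```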

  20. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    Our aim was to investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer. We investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed with PCR. The PCR products were placed into the automated genetic analyser (ABI Prism 310, Perkin Elmer, USA) and were subjected to fluorescent scanning with argon ion laser beams. The genetic analyzer measured the fluorescent signal intensity and determined the product size in terms of base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides, which alters the original normal locus size) in all the patients. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system. With our newly tested system, microsatellite analysis can be done more cheaply, faster, more easily and with higher scientific accuracy.

  1. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino;

    2014-01-01

    assembly to disassembly, from aerospace to food industry, from textile to logistics) are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes. (C...

  2. Semi-automated potentiometric titration method for uranium characterization.

    Science.gov (United States)

    Cristiano, B F G; Delgado, J U; da Silva, J W S; de Barros, P D; de Araújo, R M S; Lopes, R T

    2012-07-01

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. PMID:22154105

  3. Method for semi-automated microscopy of filtration-enriched circulating tumor cells

    OpenAIRE

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Colin R. Lindsay; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-01-01

    Background Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Methods Spiked cell l...

  4. Automated quantitative analysis of ventilation-perfusion lung scintigrams

    International Nuclear Information System (INIS)

    An automated computer analysis of ventilation (Kr-81m) and perfusion (Tc-99m) lung images has been devised that produces a graphical image of the distribution of ventilation and perfusion, and of ventilation-perfusion ratios. The analysis has overcome the following problems: the identification of the midline between two lungs and the lung boundaries, the exclusion of extrapulmonary radioactivity, the superimposition of lung images of different sizes, and the format for presentation of the data. Therefore, lung images of different sizes and shapes may be compared with each other. The analysis has been used to develop normal ranges from 55 volunteers. Comparison of younger and older age groups of men and women show small but significant differences in the distribution of ventilation and perfusion, but no differences in ventilation-perfusion ratios

  5. AUTOMATED DATA ANALYSIS FOR CONSECUTIVE IMAGES FROM DROPLET COMBUSTION EXPERIMENTS

    Directory of Open Access Journals (Sweden)

    Christopher Lee Dembia

    2012-09-01

    A simple automated image analysis algorithm has been developed that processes consecutive frames from high speed, high resolution digital imaging of burning fuel droplets. The droplets burn under conditions that promote spherical symmetry. The algorithm performs the tasks of edge detection of the droplet's boundary using a grayscale intensity threshold, and shape fitting of either a circle or an ellipse to the droplet's boundary. The results are compared to manual measurements of droplet diameters made with commercial software. Results show that it is possible to automate data analysis for consecutive droplet burning images even in the presence of a significant amount of noise from soot formation. An adaptive grayscale intensity threshold provides the ability to extract droplet diameters over the wide range of noise encountered. In instances where soot blocks portions of the droplet, the algorithm manages to provide accurate measurements if a circle fit is used instead of an ellipse fit, as an ellipse can be too accommodating to the disturbance.
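
    A rough sketch of the two stages described, with the adaptive threshold simplified to a fixed quantile: threshold the grayscale image to isolate the droplet, take boundary pixels, and fit a circle by algebraic least squares (the Kasa fit). All parameters are illustrative.

```python
# Threshold-based boundary extraction plus an algebraic least-squares circle fit.
import numpy as np

def fit_circle(xs, ys):
    """Kasa least-squares circle fit: returns (xc, yc, radius)."""
    A = np.column_stack([xs, ys, np.ones_like(xs)])
    b = xs ** 2 + ys ** 2
    d, e, f = np.linalg.lstsq(A, b, rcond=None)[0]
    xc, yc = d / 2.0, e / 2.0
    return xc, yc, np.sqrt(f + xc ** 2 + yc ** 2)

def droplet_diameter(gray, quantile=0.5):
    mask = gray < np.quantile(gray, quantile)    # droplet darker than background
    ys, xs = np.nonzero(mask)
    # crude boundary: left/right extreme pixels of each row of the droplet
    edge = [(x, y) for y in np.unique(ys)
            for x in (xs[ys == y].min(), xs[ys == y].max())]
    ex, ey = np.array(edge, dtype=float).T
    _, _, r = fit_circle(ex, ey)
    return 2.0 * r

# Toy usage: synthetic dark disc of radius 20 px on a bright background.
yy, xx = np.mgrid[0:128, 0:128]
img = np.where((xx - 64) ** 2 + (yy - 64) ** 2 < 20 ** 2, 50.0, 200.0)
print(droplet_diameter(img))                     # ~40 px
```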

  6. An Automated Method to Quantify Radiation Damage in Human Blood Cells

    Energy Technology Data Exchange (ETDEWEB)

    Gordon K. Livingston, Mark S. Jenkins and Akio A. Awa

    2006-07-10

    Cytogenetic analysis of blood lymphocytes is a well established method to assess the absorbed dose in persons exposed to ionizing radiation. Because mature lymphocytes circulate throughout the body, the dose to these cells is believed to represent the average whole body exposure. Cytogenetic methods measure the incidence of structural aberrations in chromosomes as a means to quantify DNA damage which occurs when ionizing radiation interacts with human tissue. Methods to quantify DNA damage at the chromosomal level vary in complexity and tend to be laborious and time consuming. In a mass casualty scenario involving radiological/nuclear materials, the ability to rapidly triage individuals according to radiation dose is critically important. For high-throughput screening for dicentric chromosomes, many of the data collection steps can be optimized with motorized microscopes coupled to automated slide scanning platforms.

  7. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Often times, Process Engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent working on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  8. Automation of Extraction Chromatographic and Ion Exchange Separations for Radiochemical Analysis and Monitoring

    International Nuclear Information System (INIS)

    Radiochemical analysis, complete with the separation of radionuclides of interest from the sample matrix and from other interfering radionuclides, is often an essential step in the determination of the radiochemical composition of a nuclear sample or process stream. Although some radionuclides can be determined nondestructively by gamma spectroscopy, where the gamma rays penetrate significant distances in condensed media and the gamma ray energies are diagnostic for specific radionuclides, other radionuclides that may be of interest emit only alpha or beta particles. For these, samples must be taken for destructive analysis and radiochemical separations are required. For process monitoring purposes, the radiochemical separation and detection methods must be rapid so that the results will be timely. These results could be obtained by laboratory analysis or by radiochemical process analyzers operating on-line or at-site. In either case, there is a need for automated radiochemical analysis methods to provide speed, throughput, safety, and consistent analytical protocols. Classical methods of separation used during the development of nuclear technologies, namely manual precipitations, solvent extractions, and ion exchange, are slow and labor intensive. Fortunately, the convergence of digital instrumentation for preprogrammed fluid manipulation and the development of new separation materials for column-based isolation of radionuclides has enabled the development of automated radiochemical analysis methodology. The primary means for separating radionuclides in solution are liquid-liquid extraction and ion exchange. These processes are well known and have been reviewed in the past.1 Ion exchange is readily employed in column formats. Liquid-liquid extraction can also be implemented on column formats using solvent-impregnated resins as extraction chromatographic materials. The organic liquid extractant is immobilized in the pores of a microporous polymer material. Under

  9. Automated IR determination of petroleum products in water based on sequential injection analysis.

    Science.gov (United States)

    Falkova, Marina; Vakh, Christina; Shishov, Andrey; Zubakina, Ekaterina; Moskvin, Aleksey; Moskvin, Leonid; Bulatov, Andrey

    2016-02-01

    A simple and easily performed automated method for the IR determination of petroleum products (PP) in water using extraction-chromatographic cartridges has been developed. The method comprises two stages: on-site extraction of PP during sampling using extraction-chromatographic cartridges, and subsequent determination of the extracted PP using sequential injection analysis (SIA) with IR detection. The appropriate experimental conditions for extraction of the PP dissolved in water and for the automated SIA procedure were investigated. The calibration plot constructed using the developed procedure was linear in the range of 3-200 μg L(-1). The limit of detection (LOD), calculated from a blank test based on 3σ, was 1 μg L(-1). The sample volume was 1 L. The system throughput was found to be 12 h(-1). PMID:26653498
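
    A sketch of how the reported figures of merit are typically computed, a linear calibration fit and a 3σ-of-blank detection limit; the concentrations, responses and blank replicates below are made-up placeholders, not the authors' data.

```python
# Linear calibration and 3-sigma blank detection limit (illustrative numbers only).
import numpy as np

conc = np.array([3, 10, 25, 50, 100, 200], dtype=float)       # ug/L standards
signal = np.array([0.9, 3.1, 7.4, 15.2, 30.1, 60.5])          # hypothetical IR responses
blank = np.array([0.05, 0.07, 0.04, 0.06, 0.05, 0.08, 0.06])  # hypothetical blank replicates

slope, intercept = np.polyfit(conc, signal, 1)
lod = 3.0 * blank.std(ddof=1) / slope                          # 3-sigma detection limit
print(f"slope={slope:.3f}, intercept={intercept:.3f}, LOD={lod:.2f} ug/L")
```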

  10. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
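
    The core raster step of such an inundation map can be sketched as subtracting ground elevation from the surge water-surface grid and keeping positive depths; the arrays below stand in for co-registered DEM and surge grids, and a real workflow would add hydrologic-connectivity checks and the vulnerability overlays described above.

```python
# Minimal inundation-depth raster operation on co-registered grids (numpy only).
import numpy as np

def inundation_depth(surge_m, elevation_m):
    """Positive (surge - ground) depths are flagged as inundated."""
    depth = surge_m - elevation_m
    return np.where(depth > 0.0, depth, 0.0)

elevation = np.array([[0.5, 1.0, 2.5],
                      [0.2, 1.8, 3.0]])
surge = np.full_like(elevation, 1.5)            # uniform 1.5 m surge surface
depth = inundation_depth(surge, elevation)
print(depth)                                    # nonzero cells are inundated
print("inundated fraction:", (depth > 0).mean())
```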

  11. Automated High-Dimensional Flow Cytometric Data Analysis

    Science.gov (United States)

    Pyne, Saumyadipta; Hu, Xinli; Wang, Kui; Rossin, Elizabeth; Lin, Tsung-I.; Maier, Lisa; Baecher-Allan, Clare; McLachlan, Geoffrey; Tamayo, Pablo; Hafler, David; de Jager, Philip; Mesirov, Jill

    Flow cytometry is widely used for single cell interrogation of surface and intracellular protein expression by measuring fluorescence intensity of fluorophore-conjugated reagents. We focus on the recently developed procedure of Pyne et al. (2009, Proceedings of the National Academy of Sciences USA 106, 8519-8524) for automated high- dimensional flow cytometric analysis called FLAME (FLow analysis with Automated Multivariate Estimation). It introduced novel finite mixture models of heavy-tailed and asymmetric distributions to identify and model cell populations in a flow cytometric sample. This approach robustly addresses the complexities of flow data without the need for transformation or projection to lower dimensions. It also addresses the critical task of matching cell populations across samples that enables downstream analysis. It thus facilitates application of flow cytometry to new biological and clinical problems. To facilitate pipelining with standard bioinformatic applications such as high-dimensional visualization, subject classification or outcome prediction, FLAME has been incorporated with the GenePattern package of the Broad Institute. Thereby analysis of flow data can be approached similarly as other genomic platforms. We also consider some new work that proposes a rigorous and robust solution to the registration problem by a multi-level approach that allows us to model and register cell populations simultaneously across a cohort of high-dimensional flow samples. This new approach is called JCM (Joint Clustering and Matching). It enables direct and rigorous comparisons across different time points or phenotypes in a complex biological study as well as for classification of new patient samples in a more clinical setting.
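
    FLAME fits mixtures of skew-t distributions, which are not reproduced here; as a simplified illustration of mixture-model identification of cell populations, the example below fits a plain Gaussian mixture with scikit-learn to synthetic two-marker data.

```python
# Simplified mixture-model "gating" of synthetic flow cytometry events.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(4)
# Two synthetic cell populations in a two-marker fluorescence space.
pop_a = rng.normal([2.0, 5.0], 0.3, size=(500, 2))
pop_b = rng.normal([6.0, 1.5], 0.5, size=(800, 2))
events = np.vstack([pop_a, pop_b])

gmm = GaussianMixture(n_components=2, covariance_type="full", random_state=0)
labels = gmm.fit_predict(events)
print("population sizes:", np.bincount(labels))
print("component means:\n", gmm.means_)
```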

  12. Automated measurement of estrogen receptor in breast cancer: a comparison of fluorescent and chromogenic methods of measurement.

    Science.gov (United States)

    Zarrella, Elizabeth R; Coulter, Madeline; Welsh, Allison W; Carvajal, Daniel E; Schalper, Kurt A; Harigopal, Malini; L Rimm, David; M Neumeister, Veronique

    2016-09-01

    Whereas FDA-approved methods of assessment of estrogen receptor (ER) are 'fit for purpose', they represent a 30-year-old technology. New quantitative methods, both chromogenic and fluorescent, have been developed and studies have shown that these methods increase the accuracy of assessment of ER. Here, we compare three methods of ER detection and assessment on two retrospective tissue microarray (TMA) cohorts of breast cancer patients: estimates of percent nuclei positive by pathologists and by Aperio's nuclear algorithm (standard chromogenic immunostaining), and immunofluorescence as quantified with the automated quantitative analysis (AQUA) method of quantitative immunofluorescence (QIF). Reproducibility was excellent (R(2)>0.95) between users for both automated analysis methods, and the Aperio and QIF scoring results were also highly correlated, despite the different detection systems. The subjective readings show lower levels of reproducibility and a discontinuous, bimodal distribution of scores not seen by either mechanized method. Kaplan-Meier analysis of 10-year disease-free survival was significant for each method (Pathologist, P=0.0019; Aperio, P=0.0053, AQUA, P=0.0026); however, there were discrepancies in patient classification in 19 out of 233 cases analyzed. Out of these, 11 were visually positive by both chromogenic and fluorescent detection. In 10 cases, the Aperio nuclear algorithm labeled the nuclei as negative; in 1 case, the AQUA score was just under the cutoff for positivity (determined by an Index TMA). In contrast, 8 out of 19 discrepant cases had clear nuclear positivity by fluorescence that was unable to be visualized by chromogenic detection, perhaps because of low positivity masked by the hematoxylin counterstain. These results demonstrate that automated systems enable objective, precise quantification of ER. Furthermore, immunofluorescence detection offers the additional advantage of a signal that cannot be masked by a counterstaining

  13. Automated retinal image analysis for diabetic retinopathy in telemedicine.

    Science.gov (United States)

    Sim, Dawn A; Keane, Pearse A; Tufail, Adnan; Egan, Catherine A; Aiello, Lloyd Paul; Silva, Paolo S

    2015-03-01

    There will be an estimated 552 million persons with diabetes globally by the year 2030. Over half of these individuals will develop diabetic retinopathy, representing a nearly insurmountable burden for providing diabetes eye care. Telemedicine programmes have the capability to distribute quality eye care to virtually any location and address the lack of access to ophthalmic services. In most programmes, there is currently a heavy reliance on specially trained retinal image graders, a resource in short supply worldwide. These factors necessitate an image grading automation process to increase the speed of retinal image evaluation while maintaining accuracy and cost effectiveness. Several automatic retinal image analysis systems designed for use in telemedicine have recently become commercially available. Such systems have the potential to substantially improve the manner by which diabetes eye care is delivered by providing automated real-time evaluation to expedite diagnosis and referral if required. Furthermore, integration with electronic medical records may allow a more accurate prognostication for individual patients and may provide predictive modelling of medical risk factors based on broad population data. PMID:25697773

  14. Semi-automated potentiometric titration method for uranium characterization

    International Nuclear Information System (INIS)

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. - Highlights: ► We developed a semi-automatic version of the potentiometric titration method. ► The method is used for certification and characterization of uranium compounds. ► The traceability of the method was assured by a K2Cr2O7 primary standard. ► The results for the U3O8 reference material analyzed were consistent with the certified value. ► The uncertainty obtained, near 0.01%, is useful for characterization purposes.

  15. Semi-automated potentiometric titration method for uranium characterization

    Energy Technology Data Exchange (ETDEWEB)

    Cristiano, B.F.G., E-mail: barbara@ird.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Delgado, J.U.; Silva, J.W.S. da; Barros, P.D. de; Araujo, R.M.S. de [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear (PEN/COPPE), Universidade Federal do Rio de Janeiro (UFRJ), Ilha do Fundao, PO Box 68509, Rio de Janeiro, 21945-970 RJ (Brazil)

    2012-07-15

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. - Highlights: ► We developed a semi-automatic version of the potentiometric titration method. ► The method is used for certification and characterization of uranium compounds. ► The traceability of the method was assured by a K2Cr2O7 primary standard. ► The results for the U3O8 reference material analyzed were consistent with the certified value. ► The uncertainty obtained, near 0.01%, is useful for characterization purposes.

  16. Software fault tree analysis of an automated control system device written in Ada

    OpenAIRE

    Winter, Mathias William.

    1995-01-01

    Software Fault Tree Analysis (SFTA) is a technique used to analyze software for faults that could lead to hazardous conditions in systems which contain software components. Previous thesis works have developed three Ada-based, semi-automated software analysis tools: the Automated Code Translation Tool (ACm), an Ada statement template generator; the Fault Tree Editor (Fm), a graphical fault tree editor; and the Fault Isolator (Fl), an automated software fault tree isolator. These previous works d...

  17. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  18. Towards the Procedure Automation of Full Stochastic Spectral Based Fatigue Analysis

    Directory of Open Access Journals (Sweden)

    Khurram Shehzad

    2013-05-01

    Full Text Available Fatigue is one of the most significant failure modes for marine structures such as ships and offshore platforms. Among numerous methods for fatigue life estimation, the spectral method is considered the most reliable one due to its ability to cater for different sea states as well as their probabilities of occurrence. However, the spectral based simulation procedure itself is quite complex and numerically intensive owing to various critical technical details. The present research study is focused on the application and automation of the spectral based fatigue analysis procedure for ship structures using ANSYS software with the 3D linear seakeeping code AQWA. ANSYS Parametric Design Language (APDL) macros are created and subsequently implemented to automate the workflow of the simulation process by reducing the time spent on non-value added repetitive activity. A MATLAB program based on the direct calculation procedure of spectral fatigue is developed to calculate total fatigue damage. The automation procedure is employed to predict the fatigue life of a ship structural detail using wave scatter data of the North Atlantic and Worldwide trade. The current work will provide a system for efficient implementation of the stochastic spectral fatigue analysis procedure for ship structures.
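
    As an illustration of the damage calculation such a procedure typically performs per sea state, the sketch below computes the classic narrow-band spectral fatigue damage from a stress response PSD using Miner's rule with an S-N curve N = C·S^(-k). The PSD, S-N parameters and exposure duration are illustrative assumptions, not values or code from the paper.

```python
# Minimal sketch: narrow-band spectral fatigue damage for one sea state.
import numpy as np
from math import gamma

def narrowband_damage(freq_hz, psd, k=3.0, C=1.0e12, duration_s=3600.0):
    """Damage accumulated over duration_s for a stress-range S-N curve N = C * S**(-k)."""
    m0 = np.trapz(psd, freq_hz)                     # zeroth spectral moment (variance)
    m2 = np.trapz((freq_hz ** 2) * psd, freq_hz)    # second spectral moment
    nu0 = np.sqrt(m2 / m0)                          # mean zero up-crossing rate [Hz]
    # Narrow-band Gaussian process: Rayleigh-distributed stress ranges
    return (nu0 * duration_s / C) * (2.0 * np.sqrt(2.0 * m0)) ** k * gamma(1.0 + k / 2.0)

# Example: a single-peaked stress PSD around 0.1 Hz (MPa^2/Hz, illustrative)
f = np.linspace(0.01, 1.0, 500)
psd = 1.0e4 * np.exp(-((f - 0.1) / 0.03) ** 2)
print(narrowband_damage(f, psd))
```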

  19. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, applications of newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave energy enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of those methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  20. Validation of an automated fluorescein method for determining bromide in water

    Science.gov (United States)

    Fishman, M. J.; Schroder, L.J.; Friedman, L.C.

    1985-01-01

    Surface, atmospheric precipitation and deionized water samples were spiked with µg l-1 concentrations of bromide, and the solutions stored in polyethylene and polytetrafluoroethylene bottles. Bromide was determined periodically for 30 days. Automated fluorescein and ion chromatography methods were used to determine bromide in these prepared samples. Analysis of the data by the paired t-test indicates that the two methods are not significantly different at a probability of 95% for samples containing from 0.015 to 0.5 mg l-1 of bromide. The correlation coefficient for the same sets of paired data is 0.9987. Recovery data, except for the surface water samples to which 0.005 mg l-1 of bromide was added, range from 89 to 112%. There appears to be no loss of bromide from solution in either type of container.
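
    The paired-comparison step described above can be reproduced in a few lines; the sketch below tests whether two methods differ significantly on the same set of samples. The bromide values are made-up placeholders, not data from the study.

```python
# Illustrative paired t-test comparing two analytical methods on the same samples.
from scipy import stats

fluorescein = [0.015, 0.050, 0.110, 0.250, 0.480]   # mg/L, hypothetical
ion_chrom   = [0.016, 0.048, 0.112, 0.247, 0.495]   # mg/L, hypothetical

t_stat, p_value = stats.ttest_rel(fluorescein, ion_chrom)
print(f"t = {t_stat:.3f}, p = {p_value:.3f}")        # p > 0.05 -> no significant difference
```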

  1. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    This paper presents the development of an automated generation of a new burnup chain for reactor analysis applications. The JENDL FP Decay Data File 2011 and Fission Yields Data File 2011 were used as the data sources. The nuclides in the new chain are determined by restrictions of the half-life and cumulative yield of fission products or from a given list. Then, decay modes, branching ratios and fission yields are recalculated taking into account intermediate reactions. The new burnup chain is output according to the format for the SRAC code system. Verification was performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference one with 193 fission products used in SRAC. Further development and applications are being planned with the burnup chain code. (author)
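
    A hedged sketch of the selection step described above follows: keeping fission products whose half-life and cumulative yield exceed user thresholds, or that appear on an explicit list. The record layout and cut-off values are assumptions for illustration; the actual code reads the JENDL data files and recalculates branching ratios and yields.

```python
# Minimal sketch of nuclide selection for a reduced burnup chain.
from dataclasses import dataclass

@dataclass
class Nuclide:
    name: str
    half_life_s: float        # seconds
    cumulative_yield: float   # per fission

def select_chain(nuclides, min_half_life_s=3600.0, min_yield=1.0e-4, keep_list=()):
    """Return nuclides passing the half-life/yield cuts or named explicitly."""
    return [n for n in nuclides
            if n.name in keep_list
            or (n.half_life_s >= min_half_life_s and n.cumulative_yield >= min_yield)]

candidates = [Nuclide("Xe-135", 3.29e4, 0.065), Nuclide("I-135", 2.37e4, 0.063),
              Nuclide("Br-87", 55.6, 0.020)]
print([n.name for n in select_chain(candidates, keep_list=("Sm-149",))])
```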

  2. Analysis of Automated Aircraft Conflict Resolution and Weather Avoidance

    Science.gov (United States)

    Love, John F.; Chan, William N.; Lee, Chu Han

    2009-01-01

    This paper describes an analysis of using trajectory-based automation to resolve both aircraft and weather constraints for near-term air traffic management decision making. The auto resolution algorithm developed and tested at NASA-Ames to resolve aircraft to aircraft conflicts has been modified to mitigate convective weather constraints. Modifications include adding information about the size of a gap between weather constraints to the routing solution. Routes that traverse gaps that are smaller than a specific size are not used. An evaluation of the performance of the modified autoresolver to resolve both conflicts with aircraft and weather was performed. Integration with the Center-TRACON Traffic Management System was completed to evaluate the effect of weather routing on schedule delays.

  3. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally-stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  4. Automated Large-Scale Shoreline Variability Analysis From Video

    Science.gov (United States)

    Pearre, N. S.

    2006-12-01

    Land-based video has been used to quantify changes in nearshore conditions for over twenty years. By combining the ability to track rapid, short-term shoreline change and changes associated with longer term or seasonal processes, video has proved to be a cost effective and versatile tool for coastal science. Previous video-based studies of shoreline change have typically examined the position of the shoreline along a small number of cross-shore lines as a proxy for the continuous coast. The goal of this study is twofold: (1) to further develop automated shoreline extraction algorithms for continuous shorelines, and (2) to track the evolution of a nourishment project at Rehoboth Beach, DE that was concluded in June 2005. Seven cameras are situated approximately 30 meters above mean sea level and 70 meters from the shoreline. Time exposure and variance images are captured hourly during daylight and transferred to a local processing computer. After correcting for lens distortion and geo-rectifying to a shore-normal coordinate system, the images are merged to form a composite planform image of 6 km of coast. Automated extraction algorithms establish shoreline and breaker positions throughout a tidal cycle on a daily basis. Short and long term variability in the daily shoreline will be characterized using empirical orthogonal function (EOF) analysis. Periodic sediment volume information will be extracted by incorporating the results of monthly ground-based LIDAR surveys and by correlating the hourly shorelines to the corresponding tide level under conditions with minimal wave activity. The Delaware coast in the area downdrift of the nourishment site is intermittently interrupted by short groins. An Even/Odd analysis of the shoreline response around these groins will be performed. The impact of groins on the sediment volume transport along the coast during periods of accretive and erosive conditions will be discussed. [This work is being supported by DNREC and the
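
    The EOF decomposition mentioned above is a standard operation; the sketch below decomposes a matrix of daily shoreline positions (days by alongshore locations) into spatial modes and temporal amplitudes via SVD. The synthetic data stand in for the extracted shorelines and are not from this study.

```python
# Minimal EOF analysis of shoreline anomalies via singular value decomposition.
import numpy as np

rng = np.random.default_rng(0)
days, alongshore = 120, 300
shoreline = 5.0 * np.sin(np.linspace(0, 2 * np.pi, alongshore)) + rng.normal(0, 1, (days, alongshore))

anomaly = shoreline - shoreline.mean(axis=0)           # remove the time-mean shoreline
U, s, Vt = np.linalg.svd(anomaly, full_matrices=False)

variance_explained = s**2 / np.sum(s**2)
spatial_modes = Vt                                     # EOFs, one per row
temporal_amplitudes = U * s                            # principal components per day
print("first mode explains", f"{variance_explained[0]:.1%}", "of variance")
```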

  5. Methods for Automated and Continuous Commissioning of Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Larry Luskay; Michael Brambley; Srinivas Katipamula

    2003-04-30

    Avoidance of poorly installed HVAC systems is best accomplished at the close of construction by having a building and its systems put ''through their paces'' with a well conducted commissioning process. This research project focused on developing key components to enable the development of tools that will automatically detect and correct equipment operating problems, thus providing continuous and automatic commissioning of the HVAC systems throughout the life of a facility. A study of pervasive operating problems revealed that the following would most benefit from an automated and continuous commissioning process: (1) faulty economizer operation; (2) malfunctioning sensors; (3) malfunctioning valves and dampers, and (4) access to project design data. Methodologies for detecting system operation faults in these areas were developed and validated in ''bare-bones'' forms within standard software such as spreadsheets, databases, statistical or mathematical packages. Demonstrations included flow diagrams and simplified mock-up applications. Techniques to manage data were demonstrated by illustrating how test forms could be populated with original design information and the recommended sequence of operation for equipment systems. Proposed tools would use measured data, design data, and equipment operating parameters to diagnose system problems. Steps for future research are suggested to help move toward practical application of automated commissioning and its high potential to improve equipment availability, increase occupant comfort, and extend the life of system equipment.

  6. Automated Bifurcation Analysis for Nonlinear Elliptic Partial Difference Equations on Graphs

    CERN Document Server

    Neuberger, John M; Swift, James W

    2010-01-01

    We seek solutions $u \in \mathbb{R}^n$ to the semilinear elliptic partial difference equation $-Lu + f_s(u) = 0$, where $L$ is the matrix corresponding to the Laplacian operator on a graph $G$ and $f_s$ is a one-parameter family of nonlinear functions. This article combines the ideas introduced by the authors in two papers: a) {\it Nonlinear Elliptic Partial Difference Equations on Graphs} (J. Experimental Mathematics, 2006), which introduces analytical and numerical techniques for solving such equations, and b) {\it Symmetry and Automated Branch Following for a Semilinear Elliptic PDE on a Fractal Region}, wherein we present some of our recent advances concerning symmetry, bifurcation, and automation. We apply the symmetry analysis found in the SIAM paper to arbitrary graphs in order to obtain better initial guesses for Newton's method, create informative graphics, and be in the underlying variational structure. We use two modified implementations of the gradient Newton-Galerkin algorithm (GNGA, Neuberger and Swift) ...

  7. Automated Spectral Manipulation and Data Analysis for EPR Dosimetry of Teeth

    International Nuclear Information System (INIS)

    A method for automating the spectral manipulation and data analysis procedures for EPR dosimetry of teeth is presented. The method is shown to correlate with conventional spectral peak-to-peak values to within 10 mGy for the reconstructed doses. Paired difference t-test data show that definitive systematic differences exist but that these effects on the reconstructed dose accuracy and precision are negligible for typical applications in tooth enamel EPR dosimetry. The algorithm is written in Hypercard script for Macintosh computers but could be implemented on other platforms (author)

  8. Automation of block assignment planning using a diagram-based scenario modeling method

    Directory of Open Access Journals (Sweden)

    Hwang In Hyuck

    2014-03-01

    Full Text Available Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  9. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    Energy Technology Data Exchange (ETDEWEB)

    Genebes, Caroline, E-mail: genebes.caroline@claudiusregaud.fr [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Filleron, Thomas; Graff, Pierre [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Jonca, Frédéric [Department of Urology, Clinique Ambroise Paré, Toulouse (France); Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard [Department of Urology and Andrology, CHU Rangueil, Toulouse (France); Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France)

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analysis researched independent predictors for bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of dose delivered to 90% of prostate volume (D90) and volume of prostate receiving 100% of prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  10. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    Science.gov (United States)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transductions including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI) play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and a long processing time. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) we proposed can reduce the process time and sample volume because this system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we will present an automated mMAPS including an integrated microfluidic device, automated stage and electrical relay for high-throughput clinical screening. Based on this result, we estimated that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical application to analyze tissue samples in a clinical setting.

  11. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    In this work, a software to automate the post-counting tasks in comparative INAA has been developed that aims to become more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully-automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later on if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
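
    The result-combination step described above relies on standard estimators; the sketch below shows unweighted and uncertainty-weighted means of repeated concentration results plus normalized residuals for outlier checking. The values are placeholders, not laboratory data, and this is not the laboratory's actual code.

```python
# Illustrative combination of replicate concentration results with uncertainties.
import numpy as np

conc = np.array([12.1, 11.8, 12.5, 12.0])     # mg/kg, hypothetical replicate results
unc  = np.array([0.3, 0.2, 0.4, 0.25])        # one standard uncertainty each

unweighted_mean = conc.mean()
weights = 1.0 / unc**2
weighted_mean = np.sum(weights * conc) / np.sum(weights)
weighted_unc = np.sqrt(1.0 / np.sum(weights))

# Normalized residuals flag results inconsistent with the weighted mean (|r| > ~2)
norm_residuals = (conc - weighted_mean) / np.sqrt(unc**2 - weighted_unc**2)
print(unweighted_mean, weighted_mean, weighted_unc, norm_residuals)
```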

  12. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However wireless technologies provide more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth units shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, a significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neurofuzzy traffic analysis algorithm of this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with the comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control type applications, as well as making the use of it more secure.

  13. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-09-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However wireless technologies provide more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth units shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical in both maintaining the integrity of computer systems and increasing the efficient use of this technology in control type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, a significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neurofuzzy traffic analysis algorithm of this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with the comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control type applications, as well as making the use of it more secure.

  14. Evaluation of automated cell disruptor methods for oomycetous and ascomycetous model organisms

    Science.gov (United States)

    Two automated cell disruptor-based methods for RNA extraction; disruption of thawed cells submerged in TRIzol Reagent (method QP), and direct disruption of frozen cells on dry ice (method CP), were optimized for a model oomycete, Phytophthora capsici, and compared with grinding in a mortar and pestl...

  15. Automated crown detection algorithm: an analysis of two tropical Amazonian forests

    Science.gov (United States)

    Palace, M.; Keller, M.; Asner, G.; Hagen, S.; Braswell, B.

    2002-12-01

    Spatial analysis of crowns in high-resolution images can improve the estimate of carbon stocks on regional and local scales, aid in demographic studies on the stand level, begin to analyze tree structural properties at the landscape level, and aid in forestry efforts. Radiative inverse transfer models, gap models, and cohort models may be parameterized with the spatial analysis of crowns and subsequently derived forest structural characteristics. We developed an algorithm to automatically detect tree crowns in two tropical Amazonian forests. IKONOS panchromatic images were used from two Amazonian forests in Para, Brazil: the Tapajos National Forest (3.08° S, 54.94° W) and the Fazenda Cauaxi (3.75° S, 48.37° W). Analysis was conducted on undisturbed forests from both sites. Our method combines local maximum filtering and local minima value finding methods with analysis of extracted transect data from the local maxima. We use a derivative threshold that ends the transect. Once all pixels of a given brightness value are analyzed, an iterative step examines the next lower brightness value. Pixels where crowns have been delineated are taken out of further analysis. Our method allows for overlap of crowns, gaps between crowns, and complex and noisy canopies to be analyzed. A sensitivity analysis was run on the derivative threshold and the minimum local maximum value to seed the transect analysis. Least-squares goodness of fit is conducted to examine parameterization from the sensitivity analysis. The best fit is found with the derivative threshold set at -8. The sensitivity analysis finds that the minimum local maximum is related to the difference between the maximum brightness value and the brightness value with the highest frequency. Mean, minimum and maximum crown widths for field data are (mean 9.0 m +/- 1.6 S.D., min 1.0 m, max 40.7 m) and automated estimation are (mean 11.9 m +/- 5.0 S.D., min 2.0 m, max 34.0 m). The Kolmogorov-Smirnov test for difference between

  16. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Robotic automation is examined as a possible alternative to manual spent nuclear fuel, transport cask and Multi-Purpose Canister (MPC) handling at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times and equipment costs. The economic and radiation dose impacts resulting from this automation are compared to manual handling methods.

  17. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.

  18. Automated target recognition technique for image segmentation and scene analysis

    Science.gov (United States)

    Baumgart, Chris W.; Ciarcia, Christopher A.

    1994-03-01

    Automated target recognition (ATR) software has been designed to perform image segmentation and scene analysis. Specifically, this software was developed as a package for the Army's Minefield and Reconnaissance and Detector (MIRADOR) program. MIRADOR is an on/off road, remote control, multisensor system designed to detect buried and surface- emplaced metallic and nonmetallic antitank mines. The basic requirements for this ATR software were the following: (1) an ability to separate target objects from the background in low signal-noise conditions; (2) an ability to handle a relatively high dynamic range in imaging light levels; (3) the ability to compensate for or remove light source effects such as shadows; and (4) the ability to identify target objects as mines. The image segmentation and target evaluation was performed using an integrated and parallel processing approach. Three basic techniques (texture analysis, edge enhancement, and contrast enhancement) were used collectively to extract all potential mine target shapes from the basic image. Target evaluation was then performed using a combination of size, geometrical, and fractal characteristics, which resulted in a calculated probability for each target shape. Overall results with this algorithm were quite good, though there is a tradeoff between detection confidence and the number of false alarms. This technology also has applications in the areas of hazardous waste site remediation, archaeology, and law enforcement.

  19. a Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  20. Automated microscopical method for the characterization of pyrite in coal

    Energy Technology Data Exchange (ETDEWEB)

    Kuehn, K. W.; Davis, A.

    1979-08-01

    The Rapid Scan system of automated reflectance microscopy is a line scanning device capable of characterizing the pyrite content of a coal. The system combines a sophisticated software base with standard hardware components to provide flexibility in analytical format. As a prepared coal sample is moved automatically beneath the microscope objective, reflectance values from adjacent 2 µm squares are sampled by the photomultiplier tube. These values are stored in computer memory for subsequent calculation of the volumetric proportion of pyrite in the sample. Simultaneously, the Rapid Scan also accumulates a distribution of pyrite chord lengths. In a modification of the Rosiwal linear scanning technique, the number of consecutive readings on a pyrite particle are recorded and stored independently in computer memory. In order to determine the precision of pyrite volume estimates, a suite of coals with varying pyrite content was analyzed at several sample-point densities. The level of precision achieved was found to be a function of sample density and was always superior to the precision of the conventional, visual point counting technique. From these data, sampling plans to achieve a desired level of analytical precision were designed. The ability of the Rapid Scan system to quickly characterize many samples of a feed coal in this manner, makes it a potentially effective tool in helping to optimize preparation facility operation for the removal of pyrite.
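
    To make the two quantities described above concrete, the sketch below computes a Rosiwal-style volumetric proportion estimate and a chord-length list from a stream of 2 µm reflectance readings. The reflectance threshold and data are assumptions for illustration; this is not the Rapid Scan software.

```python
# Illustrative line-scan estimate of pyrite volume fraction and chord lengths.
import numpy as np

def pyrite_stats(reflectance, pyrite_threshold=30.0, step_um=2.0):
    """Return (volume fraction estimate, list of pyrite chord lengths in um)."""
    on_pyrite = reflectance >= pyrite_threshold       # pyrite reflects far more than coal macerals
    volume_fraction = on_pyrite.mean()                # point count ~ areal ~ volumetric proportion
    chords, run = [], 0
    for flag in on_pyrite:
        if flag:
            run += 1
        elif run:
            chords.append(run * step_um)
            run = 0
    if run:
        chords.append(run * step_um)
    return volume_fraction, chords

readings = np.array([5, 6, 35, 40, 38, 7, 6, 33, 5], dtype=float)  # % reflectance, made up
print(pyrite_stats(readings))
```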

  1. Automated calibration methods for robotic multisensor landmine detection

    Science.gov (United States)

    Keranen, Joe G.; Miller, Jonathan; Schultz, Gregory; Topolosky, Zeke

    2007-04-01

    Both force protection and humanitarian demining missions require efficient and reliable detection and discrimination of buried anti-tank and anti-personnel landmines. Widely varying surface and subsurface conditions, mine types and placement, as well as environmental regimes challenge the robustness of the automatic target recognition process. In this paper we present applications created for the U.S. Army Nemesis detection platform. Nemesis is an unmanned rubber-tracked vehicle-based system designed to eradicate a wide variety of anti-tank and anti-personnel landmines for humanitarian demining missions. The detection system integrates advanced ground penetrating synthetic aperture radar (GPSAR) and electromagnetic induction (EMI) arrays, highly accurate global and local positioning, and on-board target detection/classification software on the front loader of a semi-autonomous UGV. An automated procedure is developed to estimate the soil's dielectric constant using surface reflections from the ground penetrating radar. The results have implications not only for calibration of system data acquisition parameters, but also for user awareness and tuning of automatic target recognition detection and discrimination algorithms.
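
    For context, one textbook way to estimate the soil's relative permittivity from a GPR surface reflection (normal incidence from air, amplitude referenced to a perfect reflector such as a metal plate) is sketched below. This relation may differ from the paper's actual calibration procedure, and the amplitudes are illustrative.

```python
# Hedged sketch: relative permittivity from GPR surface reflection amplitude.
def relative_permittivity(surface_amp, metal_plate_amp):
    """Estimate soil relative dielectric constant from the surface reflection coefficient."""
    r = abs(surface_amp / metal_plate_amp)            # reflection coefficient magnitude
    return ((1.0 + r) / (1.0 - r)) ** 2

print(relative_permittivity(surface_amp=0.35, metal_plate_amp=1.0))  # ~4.3, dry-soil range
```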

  2. High-throughput automated dissolution method applicable for a wide dose range of controlled release pellets.

    Science.gov (United States)

    Petruševska, Marija; Horvat, Matej; Peternel, Luka; Kristan, Katja

    2016-07-01

    The aim of the present study was to demonstrate the application of an automated high-throughput (HT) dissolution method as a useful screening tool for characterization of controlled release pellets in the formulation development phase. Five controlled release pellet formulations with drug substances exhibiting high or low solubility were chosen to investigate the correlation of the automated HT dissolution method with the conventional dissolution testing. Overall, excellent correlations (R² > 0.96) between the HT and the conventional dissolution method were obtained. In one case the initial unsatisfactory correlation (R² = 0.84) and poor method agreement (SD = 12.5) were improved by optimizing the HT dissolution method with a design of experiments approach. Here, in comparison to the initial experimental HT dissolution settings, an increased amount of pellets (25% of the capsule filling mass), lower temperature (22 °C) and no shaking resulted in a significantly better correlation (R² = 0.97) and method agreement (SD = 5.3). These results show that such optimization is valuable for the development of HT dissolution methods. In conclusion, the high correlation of dissolution profiles obtained from the conventional and the automated HT dissolution method, combined with low within-sample and measurement system variability, justifies the utilization of the automated HT dissolution method during the development phase of controlled release pellets. PMID:26552838

  3. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment (2010-07-01), § 13.19: Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. (a) The Administrator may...

  4. AUTOMATED SOLID PHASE EXTRACTION GC/MS FOR ANALYSIS OF SEMIVOLATILES IN WATER AND SEDIMENTS

    Science.gov (United States)

    Data is presented on the development of a new automated system combining solid phase extraction (SPE) with GC/MS spectrometry for the single-run analysis of water samples containing a broad range of organic compounds. The system uses commercially available automated in-line sampl...

  5. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander; Jernigan, Terry L; Madsen, Kathrine Skak; Baaré, William F C

    2014-01-01

    inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained with...

  6. Automated Integrated Analog Filter Design Issues

    OpenAIRE

    Karolis Kiela; Romualdas Navickas

    2015-01-01

    An analysis of modern automated integrated analog circuits design methods and their use in integrated filter design is done. Current modern analog circuits automated tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are only suited to gmC and switched current filter topologies. Here, an algorithm for an active RC integrated filter design is proposed, that can be used in automated filter designs. The algorithm is t...

  7. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    Science.gov (United States)

    Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M.

    2012-06-01

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing images in high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. Ridler and Ramesh thresholding algorithms based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.
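
    The Ridler (isodata) clustering threshold highlighted above is a generic, well-known algorithm; a minimal sketch applied to an image array follows. This is not the authors' code, and the synthetic "image" is an assumption for illustration.

```python
# Minimal Ridler/isodata thresholding: place the threshold midway between class means.
import numpy as np

def ridler_threshold(image, tol=1e-3, max_iter=100):
    """Iterate until the threshold sits halfway between foreground and background means."""
    t = image.mean()
    for _ in range(max_iter):
        fg, bg = image[image > t], image[image <= t]
        new_t = 0.5 * (fg.mean() + bg.mean())
        if abs(new_t - t) < tol:
            break
        t = new_t
    return t

img = np.concatenate([np.random.default_rng(1).normal(2, 0.5, 5000),   # background uptake
                      np.random.default_rng(2).normal(10, 1.0, 500)])  # hot sphere
print(ridler_threshold(img))
```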

  8. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    International Nuclear Information System (INIS)

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms, classical in the fields of optical character recognition, tissue engineering or non-destructive testing images in high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information of the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with 12 automatic thresholding algorithms and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. Ridler and Ramesh thresholding algorithms based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools. (paper)

  9. Approach to analysis of single nucleotide polymorphisms by automated constant denaturant capillary electrophoresis

    International Nuclear Information System (INIS)

    Melting gel techniques have proven to be amenable and powerful tools in point mutation and single nucleotide polymorphism (SNP) analysis. With the introduction of commercially available capillary electrophoresis instruments, a partly automated platform for denaturant capillary electrophoresis with potential for routine screening of selected target sequences has been established. The aim of this article is to demonstrate the use of automated constant denaturant capillary electrophoresis (ACDCE) in single nucleotide polymorphism analysis of various target sequences. Optimal analysis conditions for different single nucleotide polymorphisms on ACDCE are evaluated with the Poland algorithm. Laboratory procedures include only PCR and electrophoresis. For direct genotyping of individual SNPs, the samples are analyzed with an internal standard and the alleles are identified by co-migration of sample and standard peaks. In conclusion, SNPs suitable for melting gel analysis based on theoretical thermodynamics were separated by ACDCE under appropriate conditions. With this instrumentation (ABI 310 Genetic Analyzer), 48 samples could be analyzed without any intervention. Several institutions have capillary instrumentation in-house, thus making this SNP analysis method accessible to large groups of researchers without any need for instrument modification

  10. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Development of a software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed a software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after the filtering of digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images which were sorted by image quality ranging from one (best) to three (worst). There were 10 images for each quality. Results: Image recognition strongly depended on image quality. In group one 52 and in group two 51 out of 60 vertebral bodies including the sacrum were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary. Therefore standardized image quality and enlargement of the training data set are required. (orig.)

  11. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method

    Science.gov (United States)

    Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    Due to the effect of climate factors, natural phenomena and human usage, buildings and infrastructures are subject to progressive degradation. The deterioration of these structures has to be monitored in order to avoid hazards for human beings and for the natural environment in their neighborhood. Hence, on the one hand, monitoring such infrastructures is of primary importance. On the other hand, unfortunately, nowadays this monitoring effort is mostly done by expert and skilled personnel, who follow the overall data acquisition, analysis and result reporting process, making the whole monitoring procedure quite expensive for public (and private) agencies. This paper proposes the use of a partially user-assisted procedure in order to reduce the monitoring cost and to make the obtained result less subjective as well. The developed method relies on the use of images acquired with standard cameras by even inexperienced personnel. The deterioration on the infrastructure surface is detected by image segmentation based on a level sets method. The results of the semi-automated analysis procedure are remapped on a 3D model of the infrastructure obtained by means of a terrestrial laser scanning acquisition. The proposed method has been successfully tested on a portion of a road bridge in Perarolo di Cadore (BL), Italy.

  12. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies has been proposed, and the concept of distributed knowledge acquisition with the aim of computerized formation of the most complete and consistent models of problem areas has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed.

  13. Rapid Quantification of Myocardial Fibrosis: A New Macro-Based Automated Analysis

    Directory of Open Access Journals (Sweden)

    Awal M. Hadi

    2010-01-01

    Full Text Available Background: Fibrosis is associated with various cardiac pathologies and dysfunction. Current quantification methods are time-consuming and laborious. We describe a semi-automated quantification technique for myocardial fibrosis and validated this using traditional methods.

  14. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods having an established position in the traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry ranging from business management to the design of spaceships. The popularity and diverse use of the analysis method has led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised on the software level, which has aroused the urge to apply the FMEA methodology to software-based systems as well. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on the dynamic behaviour of the application. These facts set special requirements on the FMEA of software based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for use in the reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  15. Automated absolute activation analysis with californium-252 sources

    Energy Technology Data Exchange (ETDEWEB)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg 252Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from 17O. Detection sensitivities of ≤ 400 ppB for natural uranium and 8 ppB (≤ 0.5 nCi/g) for 239Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppM level.

  16. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    A 100-mg 252Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from 17O. Detection sensitivities of ≤ 400 ppB for natural uranium and 8 ppB (≤ 0.5 nCi/g) for 239Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppM level.

  17. Multimodal microscopy for automated histologic analysis of prostate cancer

    Directory of Open Access Journals (Sweden)

    Sinha Saurabh

    2011-02-01

    Full Text Available Abstract Background Prostate cancer is the single most prevalent cancer in US men, whose gold standard of diagnosis is histologic assessment of biopsies. Manual assessment of stained tissue of all biopsies limits speed and accuracy in clinical practice and research of prostate cancer diagnosis. We sought to develop a fully-automated multimodal microscopy method to distinguish cancerous from non-cancerous tissue samples. Methods We recorded chemical data from an unstained tissue microarray (TMA) using Fourier transform infrared (FT-IR) spectroscopic imaging. Using pattern recognition, we identified epithelial cells without user input. We fused the cell type information with the corresponding stained images commonly used in clinical practice. Extracted morphological features, optimized by a two-stage feature selection method using a minimum-redundancy-maximal-relevance (mRMR) criterion and sequential floating forward selection (SFFS), were applied to classify tissue samples as cancer or non-cancer. Results We achieved high accuracy (area under the ROC curve (AUC) > 0.97) in cross-validations on each of two data sets that were stained under different conditions. When the classifier was trained on one data set and tested on the other data set, an AUC value of ~0.95 was observed. In the absence of IR data, the performance of the same classification system dropped for both data sets and between data sets. Conclusions We were able to achieve very effective fusion of the information from two different images that provide very different types of data with different characteristics. The method is entirely transparent to a user and does not involve any adjustment or decision-making based on spectral data. By combining the IR and optical data, we achieved highly accurate classification.
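
    A minimal sketch (not the authors' code) of the cross-validated AUC evaluation described above, with scikit-learn's SequentialFeatureSelector standing in for the paper's mRMR + SFFS two-stage selection; the data, feature counts and classifier are placeholders:

```python
# Hypothetical sketch: cross-validated AUC for a cancer/non-cancer classifier
# built on extracted morphological features, with greedy forward selection
# standing in for the mRMR + SFFS feature selection described in the abstract.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Placeholder data: rows = tissue samples, columns = morphological features.
X, y = make_classification(n_samples=200, n_features=30, n_informative=8, random_state=0)

base = LogisticRegression(max_iter=1000)
selector = SequentialFeatureSelector(base, n_features_to_select=5, direction="forward", cv=5)
model = make_pipeline(StandardScaler(), selector, base)

auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC: {auc.mean():.3f} +/- {auc.std():.3f}")
```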

  18. An Automated Approach for Slicing Plane Placement in Visual Data Analysis.

    Science.gov (United States)

    Obermaier, Harald; Joy, Kenneth I

    2015-12-01

    Effective display and visual analysis of complex 3D data is a challenging task. Occlusions, overlaps, and projective distortions, as frequently caused by typical 3D rendering techniques, can be major obstacles to unambiguous and robust data analysis. Slicing planes are a ubiquitous tool to resolve several of these issues. They act as simple clipping geometry to provide clear cut-away views of the data. We propose to enhance the visualization and analysis process by providing methods for automatic placement of such slicing planes based on local optimization of gradient vector flow. The resulting slicing planes maximize the total amount of information displayed with respect to a pre-specified importance function. We demonstrate how such automated slicing plane placement is able to support and enrich 3D data visualization and analysis in multiple scenarios, such as volume or surface rendering, and evaluate its performance in several benchmark data sets. PMID:26529461

  19. Automated static image analysis as a novel tool in describing the physical properties of dietary fiber

    Directory of Open Access Journals (Sweden)

    Marcin Andrzej KUREK

    2015-01-01

    Full Text Available Abstract The growing interest in the usage of dietary fiber in food has caused the need to provide precise tools for describing its physical properties. This research examined two dietary fibers, from oats and beets respectively, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The highest water holding capacity (7.00 g water/g solid) was achieved by the smaller sized oat fiber. Conversely, for beet fiber the water holding capacity was highest (4.20 g water/g solid) in the larger particle size. There was evidence of water absorption increasing with a decrease in particle size for the same fiber source. Very strong correlations were found between particle shape parameters, such as fiber length, straightness and width, and hydration properties measured conventionally. The regression analysis provided the opportunity to estimate whether the automated static image analysis method could be an efficient tool in describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model that was verified against conventional WHC measurement results.
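
    As an illustration of the regression step mentioned above, the sketch below fits conventionally measured WHC values against image-derived shape parameters; all numbers are synthetic placeholders, not the study's data:

```python
# Illustrative sketch (synthetic numbers): regress conventionally measured
# water holding capacity (WHC, g water/g solid) on particle shape parameters
# obtained from automated static image analysis.
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns: mean fiber length (um), straightness (-), width (um); hypothetical values.
shape = np.array([
    [310.0, 0.82, 38.0],
    [450.0, 0.76, 52.0],
    [220.0, 0.88, 30.0],
    [520.0, 0.71, 60.0],
    [180.0, 0.90, 25.0],
])
whc = np.array([6.1, 4.9, 6.8, 4.3, 7.0])   # conventional WHC measurements

model = LinearRegression().fit(shape, whc)
print("R^2 on the calibration set:", round(model.score(shape, whc), 3))
print("predicted WHC for a new particle profile:",
      model.predict([[300.0, 0.85, 35.0]])[0])
```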

  20. Can Automated Imaging for Optic Disc and Retinal Nerve Fiber Layer Analysis Aid Glaucoma Detection?

    Science.gov (United States)

    Banister, Katie; Boachie, Charles; Bourne, Rupert; Cook, Jonathan; Burr, Jennifer M.; Ramsay, Craig; Garway-Heath, David; Gray, Joanne; McMeekin, Peter; Hernández, Rodolfo; Azuara-Blanco, Augusto

    2016-01-01

    Purpose To compare the diagnostic performance of automated imaging for glaucoma. Design Prospective, direct comparison study. Participants Adults with suspected glaucoma or ocular hypertension referred to hospital eye services in the United Kingdom. Methods We evaluated 4 automated imaging test algorithms: the Heidelberg Retinal Tomography (HRT; Heidelberg Engineering, Heidelberg, Germany) glaucoma probability score (GPS), the HRT Moorfields regression analysis (MRA), scanning laser polarimetry (GDx enhanced corneal compensation; Glaucoma Diagnostics (GDx), Carl Zeiss Meditec, Dublin, CA) nerve fiber indicator (NFI), and Spectralis optical coherence tomography (OCT; Heidelberg Engineering) retinal nerve fiber layer (RNFL) classification. We defined abnormal tests as an automated classification of outside normal limits for HRT and OCT or NFI ≥ 56 (GDx). We conducted a sensitivity analysis, using borderline abnormal image classifications. The reference standard was clinical diagnosis by a masked glaucoma expert including standardized clinical assessment and automated perimetry. We analyzed 1 eye per patient (the one with more advanced disease). We also evaluated the performance according to severity and using a combination of 2 technologies. Main Outcome Measures Sensitivity and specificity, likelihood ratios, diagnostic odds ratio, and proportion of indeterminate tests. Results We recruited 955 participants, and 943 were included in the analysis. The average age was 60.5 years (standard deviation, 13.8 years); 51.1% were women. Glaucoma was diagnosed in at least 1 eye in 16.8%; 32% of participants had no glaucoma-related findings. The HRT MRA had the highest sensitivity (87.0%; 95% confidence interval [CI], 80.2%–92.1%), but lowest specificity (63.9%; 95% CI, 60.2%–67.4%); GDx had the lowest sensitivity (35.1%; 95% CI, 27.0%–43.8%), but the highest specificity (97.2%; 95% CI, 95.6%–98.3%). The HRT GPS sensitivity was 81.5% (95% CI, 73.9%–87.6%), and
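
    The outcome measures named above can all be derived from a 2x2 table of automated classification against the clinical reference standard. A minimal sketch, with placeholder counts rather than study data:

```python
# Minimal sketch of the reported diagnostic accuracy measures computed from a
# 2x2 table of automated imaging classification vs. the clinical reference
# standard (counts below are placeholders, not study data).
def diagnostic_measures(tp, fp, fn, tn):
    sens = tp / (tp + fn)                 # sensitivity
    spec = tn / (tn + fp)                 # specificity
    lr_pos = sens / (1.0 - spec)          # positive likelihood ratio
    lr_neg = (1.0 - sens) / spec          # negative likelihood ratio
    dor = lr_pos / lr_neg                 # diagnostic odds ratio
    return sens, spec, lr_pos, lr_neg, dor

sens, spec, lrp, lrn, dor = diagnostic_measures(tp=138, fp=283, fn=21, tn=501)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} "
      f"LR+={lrp:.2f} LR-={lrn:.2f} DOR={dor:.1f}")
```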

  1. MORPHY, a program for an automated "atoms in molecules" analysis

    Science.gov (United States)

    Popelier, Paul L. A.

    1996-02-01

    The operating manual for a structured FORTRAN 77 program called MORPHY is presented. This code performs an automated topological analysis of a molecular electron density and its Laplacian. The program is written in a stylistically homogeneous, transparent and modular manner. The input is compact but flexible and allows for multiple jobs in one deck. The output is detailed and has an attractive layout. Critical points in the charge density and its Laplacian can be located in a robust and economic way and are displayed via an external on-line visualisation package. The gradient vector field of the charge density can be traced with great accuracy; planar contour, relief and one-dimensional line plots of many scalar properties can be generated. Non-bonded radii are calculated and analytical expressions for interatomic surfaces are computed (with error estimates) and plotted. MORPHY is interfaced with the AIMPAC suite of programs. The capabilities of the program are illustrated with two test runs and five selected figures.

  2. Parameter Studies, time-dependent simulations and design with automated Cartesian methods

    Science.gov (United States)

    Aftosmis, Michael

    2005-01-01

    Over the past decade, NASA has made a substantial investment in developing adaptive Cartesian grid methods for aerodynamic simulation. Cartesian-based methods played a key role in both the Space Shuttle Accident Investigation and in NASA's return-to-flight activities. The talk will provide an overview of recent technological developments, focusing on the generation of large-scale aerodynamic databases, automated CAD-based design, and time-dependent simulations of bodies in relative motion. Automation, scalability and robustness underlie all of these applications, and research in each of these topics will be presented.

  3. A METHOD OF TASK ALLOCATION AND AUTOMATED NEGOTIATION FOR MULTI ROBOTS

    Institute of Scientific and Technical Information of China (English)

    Ke Wende; Peng Zhiping; Yuan Quande; Hong Bingrong; Chen Ke; Cai Zesu

    2012-01-01

    A method of task allocation and automated negotiation for multiple robots was proposed. First, the principles of task allocation were described based on the real capability of the robots. Second, the model of automated negotiation was constructed, in which Least-Squares Support Vector Regression (LSSVR) was improved to estimate the opponent's negotiation utility and a robust H∞ output feedback controller was employed to optimize the utility performance indicators. Third, the protocol of negotiation and reallocation was proposed to improve the real-time capability and task allocation. Finally, the validity of the method was proved through experiments.

  4. Automated Reasoning and Equation Solving with the Characteristic Set Method

    Institute of Scientific and Technical Information of China (English)

    Wen-Tsun Wu; Xiao-Shan Gao

    2006-01-01

    A brief introduction to the characteristic set method is given for solving algebraic equation systems, and the method is then extended to algebraic difference systems. The method can be used to decompose the zero set of a difference polynomial set in general form into the union of difference polynomial sets in triangular form. Based on the characteristic set method, a decision procedure for the first-order theory over an algebraically closed field and a procedure to prove certain difference identities are proposed.

  5. Software complex AS (automation of spectrometry). Conception of a program system invariant to experiment method variation

    International Nuclear Information System (INIS)

    For spectrometric experiments in which data are buffered in memory, a special structure and operation algorithm for the data registration subsystem, employing the basic software of the spectrometer, has been developed. It accounts for the experimental method in every measurement by specifying the state vector of the registration system in terms of varied parameters, instead of traditionally programming the spectrometer control. This type of registration subsystem program can be used for different spectrometers without modification and does not impose restrictions on how the experiment is realized. A software structure for the experiment automation system has been developed that allows complete on-line mathematical processing of the experimental data, as well as turbo-mode processing that automates the cyclic performance of operations such as editing of the experimental data, computation, and analysis of results. Using the proposed method of accounting for the experimental method to carry out a group of different experiments will reduce the volume of programming while enlarging the realized possibilities. (author)

  6. Evaluation of a content-based retrieval system for blood cell images with automated methods.

    Science.gov (United States)

    Seng, Woo Chaw; Mirisaee, Seyed Hadi

    2011-08-01

    Content-based image retrieval techniques have been extensively studied for the past few years. With the growth of digital medical image databases, the demand for content-based analysis and retrieval tools has been increasing remarkably. The blood cell image is a key diagnostic tool for hematologists. An automated system that can retrieve relevant blood cell images correctly and efficiently would save the effort and time of hematologists. The purpose of this work is to develop such a content-based image retrieval system. Global color histogram and wavelet-based methods are used in the prototype. The system allows users to search by providing a query image and selecting one of four implemented methods. The obtained results demonstrate that the proposed extended query refinement has the potential to capture a user's high-level query and perception subjectivity by dynamically giving better query combinations. Color-based methods performed better than wavelet-based methods with regard to precision, recall rate and retrieval time. Shape and density of blood cells are suggested as measurements for future improvement. The system developed is useful for undergraduate education. PMID:20703533
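
    A rough sketch of the global colour histogram retrieval idea mentioned above, assuming images are loaded with OpenCV; file names and bin counts are hypothetical, and the paper's query refinement step is not shown:

```python
# Sketch of global colour histogram retrieval: rank database images by
# histogram similarity to a query image (paths and parameters are placeholders).
import cv2
import numpy as np

def colour_histogram(path, bins=(8, 8, 8)):
    img = cv2.imread(path)                       # BGR image
    hist = cv2.calcHist([img], [0, 1, 2], None, bins, [0, 256, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()   # normalised global histogram

def retrieve(query_path, database_paths, top_k=5):
    q = colour_histogram(query_path).astype("float32")
    scored = []
    for p in database_paths:
        d = cv2.compareHist(q, colour_histogram(p).astype("float32"),
                            cv2.HISTCMP_CORREL)  # higher = more similar
        scored.append((d, p))
    return sorted(scored, reverse=True)[:top_k]

# Example call (paths are hypothetical):
# print(retrieve("query_cell.png", ["cell_001.png", "cell_002.png"]))
```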

  7. A new method for automated discontinuity trace mapping on rock mass 3D surface model

    Science.gov (United States)

    Li, Xiaojun; Chen, Jianqin; Zhu, Hehua

    2016-04-01

    This paper presents an automated discontinuity trace mapping method on a 3D surface model of rock mass. Feature points of discontinuity traces are first detected using the Normal Tensor Voting Theory, which is robust to noisy point cloud data. Discontinuity traces are then extracted from feature points in four steps: (1) trace feature point grouping, (2) trace segment growth, (3) trace segment connection, and (4) redundant trace segment removal. A sensitivity analysis is conducted to identify optimal values for the parameters used in the proposed method. The optimal triangular mesh element size is between 5 cm and 6 cm; the angle threshold in the trace segment growth step is between 70° and 90°; the angle threshold in the trace segment connection step is between 50° and 70°, and the distance threshold should be at least 15 times the mean triangular mesh element size. The method is applied to the excavation face trace mapping of a drill-and-blast tunnel. The results show that the proposed discontinuity trace mapping method is fast and effective and could be used as a supplement to traditional direct measurement of discontinuity traces.

  8. Automated Counting of Airborne Asbestos Fibers by a High-Throughput Microscopy (HTM) Method

    Directory of Open Access Journals (Sweden)

    Hwataik Han

    2011-07-01

    Full Text Available Inhalation of airborne asbestos causes serious health problems such as lung cancer and malignant mesothelioma. The phase-contrast microscopy (PCM method has been widely used for estimating airborne asbestos concentrations because it does not require complicated processes or high-priced equipment. However, the PCM method is time-consuming and laborious as it is manually performed off-site by an expert. We have developed a high-throughput microscopy (HTM method that can detect fibers distinguishable from other spherical particles in a sample slide by image processing both automatically and quantitatively. A set of parameters for processing and analysis of asbestos fiber images was adjusted for standard asbestos samples with known concentrations. We analyzed sample slides containing airborne asbestos fibers collected at 11 different workplaces following PCM and HTM methods, and found a reasonably good agreement in the asbestos concentration. Image acquisition synchronized with the movement of the robotic sample stages followed by an automated batch processing of a stack of sample images enabled us to count asbestos fibers with greatly reduced time and labors. HTM should be a potential alternative to conventional PCM, moving a step closer to realization of on-site monitoring of asbestos fibers in air.
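
    One step of such a pipeline, distinguishing elongated fibres from roughly spherical particles, could look like the simplified sketch below; the thresholds and the use of contour ellipse fitting are assumptions for illustration, not the paper's exact parameter set:

```python
# Simplified sketch of an HTM-style step: separate elongated fibres from
# spherical particles in a binarised field image by per-contour aspect ratio
# (threshold values are illustrative assumptions, requires OpenCV 4).
import cv2

def count_fibres(gray_image, min_aspect_ratio=3.0, min_length_px=20):
    _, binary = cv2.threshold(gray_image, 0, 255,
                              cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    fibres = 0
    for c in contours:
        if len(c) < 5:                          # fitEllipse needs at least 5 points
            continue
        (_, _), axes, _ = cv2.fitEllipse(c)
        major, minor = max(axes), min(axes)
        if major >= min_length_px and major / max(minor, 1e-6) >= min_aspect_ratio:
            fibres += 1
    return fibres

# Usage (file name is hypothetical):
# img = cv2.imread("field_001.png", cv2.IMREAD_GRAYSCALE)
# print(count_fibres(img))
```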

  9. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods

  10. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver [Technical Univ. of Darmstadt (Germany)]

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle

  11. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    International Nuclear Information System (INIS)

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle

  12. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Full Text Available Abstract Background During the past decade, many software packages have been developed for analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of analysis errors or bugs reported by users. Results We report here the implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusion The dChip automation module is a step toward reproducible research, and it can prompt a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  13. Comparison of Dimension Reduction Methods for Automated Essay Grading

    Science.gov (United States)

    Kakkonen, Tuomo; Myller, Niko; Sutinen, Erkki; Timonen, Jari

    2008-01-01

    Automatic Essay Assessor (AEA) is a system that utilizes information retrieval techniques such as Latent Semantic Analysis (LSA), Probabilistic Latent Semantic Analysis (PLSA), and Latent Dirichlet Allocation (LDA) for automatic essay grading. The system uses learning materials and relatively few teacher-graded essays for calibrating the scoring…
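
    A rough sketch of the LSA-style scoring idea referred to above: project essays into a latent semantic space and score a new essay by its similarity to a small set of teacher-graded calibration essays. The texts, grades and weighting scheme below are placeholders, not the AEA implementation:

```python
# Sketch of LSA-based essay scoring: TF-IDF + truncated SVD to build a latent
# space, then similarity-weighted averaging of calibration grades.
import numpy as np
from sklearn.decomposition import TruncatedSVD
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

graded_essays = ["text of a graded essay ...", "another graded essay ...", "a weak essay ..."]
grades = np.array([5.0, 4.0, 2.0])
new_essay = ["the essay to be scored ..."]

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(graded_essays + new_essay)

svd = TruncatedSVD(n_components=2, random_state=0)   # latent semantic space
Z = svd.fit_transform(X)

sims = cosine_similarity(Z[-1:], Z[:-1])[0]          # similarity to calibration essays
predicted = float(np.average(grades, weights=np.clip(sims, 1e-6, None)))
print("predicted grade:", round(predicted, 2))
```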

  14. Automated Chemical Analysis of Internally Mixed Aerosol Particles Using X-ray Spectromicroscopy at the Carbon K-Edge

    Energy Technology Data Exchange (ETDEWEB)

    Gilles, Mary K; Moffet, R.C.; Henn, T.; Laskin, A.

    2011-01-20

    We have developed an automated data analysis method for atmospheric particles using scanning transmission X-ray microscopy coupled with near edge X-ray fine structure spectroscopy (STXM/NEXAFS). This method is applied to complex internally mixed submicrometer particles containing organic and inorganic material. Several algorithms were developed to exploit NEXAFS spectral features in the energy range from 278 to 320 eV for quantitative mapping of the spatial distribution of elemental carbon, organic carbon, potassium, and noncarbonaceous elements in particles of mixed composition. This energy range encompasses the carbon K-edge and potassium L2 and L3 edges. STXM/NEXAFS maps of different chemical components were complemented with a subsequent analysis using elemental maps obtained by scanning electron microscopy coupled with energy dispersive X-ray analysis (SEM/EDX). We demonstrate the application of the automated mapping algorithms for data analysis and the statistical classification of particles.

  15. Comparison of manual and semi-automated delineation of regions of interest for radioligand PET imaging analysis

    International Nuclear Information System (INIS)

    As imaging centers produce higher resolution research scans, the number of man-hours required to process regional data has become a major concern. Comparison of automated vs. manual methodology has not been reported for functional imaging. We explored validation of using automation to delineate regions of interest on positron emission tomography (PET) scans. The purpose of this study was to ascertain improvements in image processing time and reproducibility of a semi-automated brain region extraction (SABRE) method over manual delineation of regions of interest (ROIs). We compared 2 sets of partial volume corrected serotonin 1a receptor binding potentials (BPs) resulting from manual vs. semi-automated methods. BPs were obtained from subjects meeting consensus criteria for frontotemporal degeneration and from age- and gender-matched healthy controls. Two trained raters provided each set of data to conduct comparisons of inter-rater mean image processing time, rank order of BPs for 9 PET scans, intra- and inter-rater intraclass correlation coefficients (ICC), repeatability coefficients (RC), percentages of the average parameter value (RM%), and effect sizes of either method. SABRE saved approximately 3 hours of processing time per PET subject over manual delineation (p < .001). Quality of the SABRE BP results was preserved relative to the rank order of subjects by manual methods. Intra- and inter-rater ICC were high (>0.8) for both methods. RC and RM% were lower for the manual method across all ROIs, indicating less intra-rater variance across PET subjects' BPs. SABRE demonstrated significant time savings and no significant difference in reproducibility over manual methods, justifying the use of SABRE in serotonin 1a receptor radioligand PET imaging analysis. This implies that semi-automated ROI delineation is a valid methodology for future PET imaging analysis
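
    Two of the agreement statistics reported above can be computed as in the sketch below, assuming the study used the consistency form ICC(3,1) from a two-way ANOVA and the Bland-Altman style repeatability coefficient; the binding potential values are synthetic stand-ins:

```python
# Sketch of ICC(3,1) and the repeatability coefficient (RC) for two raters'
# binding potentials.  The exact ICC form and the use of the residual mean
# square as the within-subject variance are assumptions for illustration.
import numpy as np

rater_a = np.array([1.21, 0.95, 1.40, 0.88, 1.10, 1.32, 0.99, 1.05, 1.18])
rater_b = np.array([1.25, 0.92, 1.37, 0.90, 1.15, 1.28, 1.02, 1.01, 1.20])
data = np.column_stack([rater_a, rater_b])           # subjects x raters
n, k = data.shape

grand = data.mean()
ms_rows = k * np.sum((data.mean(axis=1) - grand) ** 2) / (n - 1)   # between subjects
ss_err = np.sum((data - data.mean(axis=1, keepdims=True)
                 - data.mean(axis=0, keepdims=True) + grand) ** 2)
ms_err = ss_err / ((n - 1) * (k - 1))                # two-way residual mean square

icc_3_1 = (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
rc = 1.96 * np.sqrt(2.0) * np.sqrt(ms_err)           # repeatability coefficient
print(f"ICC(3,1) = {icc_3_1:.3f}, RC = {rc:.3f}")
```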

  16. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    International Nuclear Information System (INIS)

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern

  17. Automated CBED processing: Sample thickness estimation based on analysis of zone-axis CBED pattern

    Energy Technology Data Exchange (ETDEWEB)

    Klinger, M., E-mail: klinger@post.cz; Němec, M.; Polívka, L.; Gärtnerová, V.; Jäger, A.

    2015-03-15

    An automated processing of convergent beam electron diffraction (CBED) patterns is presented. The proposed methods are used in an automated tool for estimating the thickness of transmission electron microscopy (TEM) samples by matching an experimental zone-axis CBED pattern with a series of patterns simulated for known thicknesses. The proposed tool detects CBED disks, localizes a pattern in detected disks and unifies the coordinate system of the experimental pattern with the simulated one. The experimental pattern is then compared disk-by-disk with a series of simulated patterns each corresponding to different known thicknesses. The thickness of the most similar simulated pattern is then taken as the thickness estimate. The tool was tested on [0 1 1] Si, [0 1 0] α-Ti and [0 1 1] α-Ti samples prepared using different techniques. Results of the presented approach were compared with thickness estimates based on analysis of CBED patterns in two beam conditions. The mean difference between these two methods was 4.1% for the FIB-prepared silicon samples, 5.2% for the electro-chemically polished titanium and 7.9% for Ar+ ion-polished titanium. The proposed techniques can also be employed in other established CBED analyses. Apart from the thickness estimation, it can potentially be used to quantify lattice deformation, structure factors, symmetry, defects or extinction distance. - Highlights: • Automated TEM sample thickness estimation using zone-axis CBED is presented. • Computer vision and artificial intelligence are employed in CBED processing. • This approach reduces operator effort, analysis time and increases repeatability. • Individual parts can be employed in other analyses of CBED/diffraction pattern.
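
    The final matching step described in these two records reduces to comparing one aligned experimental pattern against a thickness series of simulated patterns and reporting the best match. A conceptual sketch with placeholder arrays (normalised cross-correlation is assumed as the similarity measure; the paper may use a different metric):

```python
# Conceptual sketch: pick the simulated thickness whose CBED pattern best
# matches the (already aligned) experimental pattern; arrays are placeholders.
import numpy as np

def normalised_correlation(a, b):
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.mean(a * b))

def estimate_thickness(experimental, simulated_series, thicknesses_nm):
    scores = [normalised_correlation(experimental, sim) for sim in simulated_series]
    best = int(np.argmax(scores))
    return thicknesses_nm[best], scores[best]

# Hypothetical usage with random stand-in patterns:
rng = np.random.default_rng(0)
sims = [rng.random((64, 64)) for _ in range(5)]
thicknesses = [50, 75, 100, 125, 150]                 # nm, simulated thickness grid
exp = sims[2] + 0.05 * rng.random((64, 64))           # "experimental" pattern
print(estimate_thickness(exp, sims, thicknesses))
```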

  18. Wine analysis to check quality and authenticity by fully-automated 1H-NMR

    Directory of Open Access Journals (Sweden)

    Spraul Manfred

    2015-01-01

    Full Text Available Fully-automated high resolution 1H-NMR spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis (15–20 min from acquisition to report). The advantage of high resolution 1H-NMR is its absolute reproducibility and transferability from laboratory to laboratory, which is not equaled by any other method currently used in food analysis. NMR reproducibility allows statistical investigations, e.g. for detection of variety, geographical origin and adulterations, where the smallest changes of many ingredients at the same time must be recorded. Reproducibility and transferability of the solutions shown are user-, instrument- and laboratory-independent. Sample preparation, measurement and processing are based on strict standard operation procedures, which are essential for this fully automated solution. The non-targeted approach to the data allows detection of even unknown deviations, if they are visible in the 1H-NMR spectra of e.g. fruit juice, wine or honey. The same data acquired in high-throughput mode are also subjected to quantification of multiple compounds. This 1H-NMR methodology will shortly be introduced, then results on wine will be presented and the advantages of the solutions shown. The method has been proven on juice, honey and wine, where so far unknown frauds could be detected, while at the same time targeted parameters are obtained.
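
    The "statistical investigations" of origin and adulteration mentioned above are typically multivariate analyses of binned spectra. An illustrative sketch with synthetic data (not real NMR spectra, and not the vendor's chemometric workflow) showing a PCA projection of two wine groups:

```python
# Illustrative sketch (synthetic spectra): project binned 1H-NMR spectra with
# PCA and check whether wines of two different origins separate in the scores.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
origin_a = rng.normal(0.0, 1.0, size=(20, 300))        # 20 wines, 300 spectral bins
origin_b = rng.normal(0.3, 1.0, size=(20, 300))        # slight systematic shift
spectra = np.vstack([origin_a, origin_b])

scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(spectra))
labels = ["A"] * 20 + ["B"] * 20
for lab in ("A", "B"):
    pts = scores[[i for i, l in enumerate(labels) if l == lab]]
    print(lab, "mean PC1 score:", round(float(pts[:, 0].mean()), 2))
```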

  19. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This...

  20. Automated Design and Analysis Tool for CEV Structural and TPS Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  1. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for small-angle scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  2. Automated segmentation of chronic stroke lesions using LINDA: Lesion identification with neighborhood data analysis.

    Science.gov (United States)

    Pustina, Dorian; Coslett, H Branch; Turkeltaub, Peter E; Tustison, Nicholas; Schwartz, Myrna F; Avants, Brian

    2016-04-01

    The gold standard for identifying stroke lesions is manual tracing, a method that is known to be observer dependent and time consuming, thus impractical for big data studies. We propose LINDA (Lesion Identification with Neighborhood Data Analysis), an automated segmentation algorithm capable of learning the relationship between existing manual segmentations and a single T1-weighted MRI. A dataset of 60 left hemispheric chronic stroke patients is used to build the method and test it with k-fold and leave-one-out procedures. With respect to manual tracings, predicted lesion maps showed a mean dice overlap of 0.696 ± 0.16, Hausdorff distance of 17.9 ± 9.8 mm, and average displacement of 2.54 ± 1.38 mm. The manual and predicted lesion volumes correlated at r = 0.961. An additional dataset of 45 patients was utilized to test LINDA with independent data, achieving high accuracy rates and confirming its cross-institutional applicability. To investigate the cost of moving from manual tracings to automated segmentation, we performed comparative lesion-to-symptom mapping (LSM) on five behavioral scores. Predicted and manual lesions produced similar neuro-cognitive maps, albeit with some discussed discrepancies. Of note, region-wise LSM was more robust to the prediction error than voxel-wise LSM. Our results show that, while several limitations exist, our current results compete with or exceed the state-of-the-art, producing consistent predictions, very low failure rates, and transferable knowledge between labs. This work also establishes a new viewpoint on evaluating automated methods not only with segmentation accuracy but also with brain-behavior relationships. LINDA is made available online with trained models from over 100 patients. Hum Brain Mapp 37:1405-1421, 2016. © 2016 Wiley Periodicals, Inc. PMID:26756101
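
    The two main overlap metrics reported above, Dice coefficient and Hausdorff distance, can be computed from binary lesion masks as in the sketch below; the masks are small synthetic examples, not patient data:

```python
# Sketch of segmentation evaluation: Dice overlap and Hausdorff distance
# between a predicted and a manually traced binary lesion mask.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(pred, truth):
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    return 2.0 * inter / (pred.sum() + truth.sum())

def hausdorff(pred, truth):
    p = np.argwhere(pred)          # coordinates of mask voxels
    t = np.argwhere(truth)
    return max(directed_hausdorff(p, t)[0], directed_hausdorff(t, p)[0])

pred = np.zeros((20, 20), dtype=bool);  pred[5:12, 5:12] = True
truth = np.zeros((20, 20), dtype=bool); truth[6:13, 6:13] = True
print(f"Dice = {dice(pred, truth):.3f}, Hausdorff = {hausdorff(pred, truth):.2f} voxels")
```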

  3. A COMPARISON OF AUTOMATED AND TRADITIONAL METHODS FOR THE EXTRACTION OF ARSENICALS FROM FISH

    Science.gov (United States)

    An automated extractor employing accelerated solvent extraction (ASE) has been compared with a traditional sonication method of extraction for the extraction of arsenicals from fish tissue. Four different species of fish and a standard reference material, DORM-2, were subjected t...

  4. The Ocular Redness Index: A Novel Automated Method for Measuring Ocular Injection

    OpenAIRE

    Amparo, Francisco; Wang, Haobing; Emami-Naeini, Parisa; Karimian, Parisa; Dana, Reza

    2013-01-01

    In this study, we present and validate an automated method to assess ocular redness in clinical images. This system is based on a continuous, centesimal scale, and employs a computer algorithm that objectively scores redness without need for a trained physician.

  5. Evaluating E-Learning Accessibility by Automated and Student-Centered Methods

    Science.gov (United States)

    Kumar, Kari L.; Owston, Ron

    2016-01-01

    The use of learning technologies is becoming ubiquitous in higher education. As a result, there is a pressing need to develop methods to evaluate their accessibility to ensure that students do not encounter barriers to accessibility while engaging in e-learning. In this study, sample online units were evaluated for accessibility by automated tools…

  6. Automation of a center pivot using the temperature-time-threshold method of irrigation scheduling

    Science.gov (United States)

    A center pivot was completely automated using the temperature-time-threshold (TTT) method of irrigation scheduling. An array of infrared thermometers was mounted on the center pivot and these were used to remotely determine the crop leaf temperature as an indicator of crop water stress. We describ...

  7. Editorial for the Special Issue on Automated Design and Assessment of Heuristic Search Methods

    OpenAIRE

    Ochoa, Gabriela; Preuss, Mike; Bartz-Beielstein, Thomas; Schoenauer, Marc

    2012-01-01

    Heuristic search algorithms have been successfully applied to solve many problems in practice. Their design, however, has increased in complexity as the number of parameters and choices for operators and algorithmic components has expanded. There is clearly a need to provide the final user with automated tools to assist in the tuning, design and assessment of heuristic optimisation methods.

  8. Automation of the method gamma of comparison dosimetry images

    International Nuclear Information System (INIS)

    The objective of this work was the development of the JJGAMMA analysis application, software which enables this task to be performed systematically, minimizing specialist intervention and therefore the variability due to the observer. Both benefits allow image comparison to be done in practice with the required frequency and objectivity. (Author)

  9. Image cytometer method for automated assessment of human spermatozoa concentration

    DEFF Research Database (Denmark)

    Egeberg, D L; Kjaerulff, S; Hansen, C; Petersen, J H; Glensbjerg, M; Skakkebaek, N E; Jørgensen, N; Almstrup, K

    2013-01-01

    In the basic clinical work-up of infertile couples, a semen analysis is mandatory and the sperm concentration is one of the most essential variables to be determined. Sperm concentration is usually assessed by manual counting using a haemocytometer and is hence labour intensive and may be subject...

  10. Automated Production Flow Line Failure Rate Mathematical Analysis with Probability Theory

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-12-01

    Full Text Available Automated lines have been widely used in industry, especially for mass production and product customization. The productivity of an automated line is a crucial indicator of the output and performance of production. Failure or breakdown of stations or mechanisms commonly occurs in automated lines under real conditions due to technological and technical problems, which strongly affects productivity. The failure rates of automated lines are usually not expressed or analysed in mathematical form. This paper presents a mathematical analysis, using probability theory, of the failure conditions in an automated line. The resulting mathematical expression for failure rates can be used to produce and forecast the productivity output accurately.
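
    A minimal sketch of the kind of probabilistic productivity estimate such an analysis produces for a serial line, assuming stations fail independently and each failure adds its mean repair time to the cycle; all numbers are illustrative, not the paper's model:

```python
# Minimal sketch: expected productivity of a serial automated line when each
# station's failures inflate the effective cycle time (illustrative numbers).
def line_productivity(cycle_time_s, failure_rates_per_cycle, mean_repair_times_s):
    # Expected downtime added to each cycle by every station.
    downtime = sum(q * tr for q, tr in zip(failure_rates_per_cycle, mean_repair_times_s))
    effective_cycle = cycle_time_s + downtime
    return 3600.0 / effective_cycle            # parts per hour

rates = [0.01, 0.02, 0.005]                    # failures per cycle for 3 stations
repairs = [120.0, 300.0, 60.0]                 # mean repair time per failure, seconds
print(f"{line_productivity(30.0, rates, repairs):.1f} parts/hour")
```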

  11. Semi-automated Senarmont Method for Measurement of Small Retardation

    CERN Document Server

    Mori, Atsushi

    2014-01-01

    In the usual measurement with the Senarmont method using a conventional polarization microscope, the azimuth angle of the analyzer at extinction of the emerging light is detected by the naked eye. If the intensity of light is measured as a function of the azimuth angle of the analyzer, one can find the direction at which the intensity is minimized more accurately by fitting. However, this procedure requires many more operations than the usual method and is thus time-consuming. To circumvent this problem, a setup was constructed in which the analyzer is rotated under computer control and the intensity is measured as a function of its azimuth angle. As a result, the time required for the measurement has been greatly decreased.
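
    The fitting step described above could look like the sketch below, which locates the extinction angle by least-squares fitting of a Malus-type intensity law; the data are simulated, and the specific model form is an assumption rather than the paper's exact procedure:

```python
# Sketch: fit I(theta) = A*sin^2(theta - theta0) + C to intensity vs. analyzer
# azimuth and read off the extinction angle theta0 (simulated data).
import numpy as np
from scipy.optimize import curve_fit

def model(theta_deg, amplitude, theta0_deg, offset):
    return amplitude * np.sin(np.radians(theta_deg - theta0_deg)) ** 2 + offset

theta = np.arange(0.0, 180.0, 2.0)
true_extinction = 12.3                                   # degrees (simulated)
rng = np.random.default_rng(1)
intensity = model(theta, 1.0, true_extinction, 0.02) + 0.01 * rng.standard_normal(theta.size)

popt, _ = curve_fit(model, theta, intensity, p0=[1.0, 10.0, 0.0])
# In the Senarmont method the retardation (in degrees) is twice the analyzer
# rotation angle at extinction.
print(f"fitted extinction angle: {popt[1]:.2f} deg "
      f"-> retardation = {2 * popt[1]:.2f} deg")
```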

  12. Automated patient and medication payment method for clinical trials

    Directory of Open Access Journals (Sweden)

    Yawn BP

    2013-01-01

    Full Text Available Background: Published reports and studies related to patient compensation for clinical trials focus primarily on the ethical issues related to appropriate amounts to reimburse for patients' time and risk burden. Little has been published regarding the method of payment for patient participation. As clinical trials move into widely dispersed community practices and more complex designs, the method of payment also becomes more complex. Here we review the decision process and payment method selected for a primary care-based randomized clinical trial of asthma management in Black Americans. Methods: The method selected is a credit card system designed specifically for clinical trials that allows both fixed and variable real-time payments. We operationalized the study design by providing each patient with two cards, one for reimbursement for study visits and one for payment of medication costs directly to the pharmacies. Results: Of the 1015 patients enrolled, only two refused use of the ClinCard, requesting cash payments for visits, and only rarely did a weekend or fill-in pharmacist refuse to use the card system for payment directly to the pharmacy. Overall, the system has been well accepted by patients and local study teams. The ClinCard administrative system facilitates the fiscal accounting and medication adherence record-keeping by the central teams. Monthly fees are modest, and all 12 study institutional review boards approved use of the system without concern for patient

  13. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    The INETEC Institute for Nuclear Technology developed a software package called EddyOne which has an option for automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, giving a clear advantage to the INETEC approach. It has to be pointed out that INETEC field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  14. A method for fast automated microscope image stitching.

    Science.gov (United States)

    Yang, Fan; Deng, Zhen-Sheng; Fan, Qiu-Hong

    2013-05-01

    Image stitching is an important technology to produce a panorama or larger image by combining several images with overlapped areas. In many biomedical research applications, image stitching is highly desirable to acquire a panoramic image which represents large areas of certain structures or whole sections, while retaining microscopic resolution. In this study, we develop a fast normal light microscope image stitching algorithm based on feature extraction. First, an algorithm of scale-space reconstruction of speeded-up robust features (SURF) was proposed to extract features from the images to be stitched in a short time and with higher repeatability. Second, the histogram equalization (HE) method was employed to preprocess the images to enhance their contrast for extracting more features. Third, the rough overlapping zones of the preprocessed images were calculated by phase correlation, and the improved SURF was used to extract the image features in the rough overlapping areas. Fourth, the features were matched and the transformation parameters were estimated, then the images were blended seamlessly. Finally, this procedure was applied to stitch normal light microscope images to verify its validity. Our experimental results demonstrate that the improved SURF algorithm is very robust to viewpoint, illumination, blur, rotation and zoom of the images and our method is able to stitch microscope images automatically with high precision and high speed. Also, the method proposed in this paper is applicable to registration and stitching of common images as well as stitching of microscope images in the field of virtual microscopy for the purposes of observing, exchanging, saving, and establishing a database of microscope images. PMID:23465523
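
    A condensed sketch of such a feature-based stitching pipeline is shown below. It substitutes ORB features for the paper's improved SURF (SURF is patented and often unavailable in stock OpenCV builds) and omits the phase-correlation pre-alignment and seamless blending steps; it is illustrative, not the published algorithm:

```python
# Condensed stitching sketch: HE preprocessing, ORB feature matching,
# homography estimation with RANSAC, and a naive overlay instead of blending.
import cv2
import numpy as np

def stitch_pair(img_left, img_right):
    gray_l = cv2.equalizeHist(cv2.cvtColor(img_left, cv2.COLOR_BGR2GRAY))
    gray_r = cv2.equalizeHist(cv2.cvtColor(img_right, cv2.COLOR_BGR2GRAY))

    orb = cv2.ORB_create(4000)
    kp_l, des_l = orb.detectAndCompute(gray_l, None)
    kp_r, des_r = orb.detectAndCompute(gray_r, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des_l, des_r), key=lambda m: m.distance)[:200]

    # Map points in the right image onto the left image's coordinate frame.
    src = np.float32([kp_r[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_l[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    h, w = img_left.shape[:2]
    canvas = cv2.warpPerspective(img_right, H, (w * 2, h))
    canvas[0:h, 0:w] = img_left
    return canvas
```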

  15. Semi-Automated Atlas-based Analysis of Brain Histological Sections

    OpenAIRE

    Kopec, Charles D.; Bowers, Amanda C.; Pai, Shraddha; Brody, Carlos D.

    2010-01-01

    Quantifying the location and/or number of features in a histological section of the brain currently requires one to first, manually register a corresponding section from a tissue atlas onto the experimental section and second, count the features. No automated method exists for the first process (registering), and most automated methods for the second process (feature counting) operate reliably only in a high signal-to-noise regime. To reduce experimenter bias and inconsistencies and increase ...

  16. A weld inspection method including an automated dye penetration process

    International Nuclear Information System (INIS)

    An inspection and crack detection method for weldments on a nuclear reactor vessel cover is presented; it involves spraying a developer product onto the welded surface to be inspected, through a nozzle mounted on a pressure gun. The dye penetration test is driven by a computer which controls the injection pressure, the nozzle travelling speed and the motion of a shutter, whose role is to prevent developer from being sprayed during the nozzle opening and closing transients, in order to achieve a perfectly homogeneous developer layer on the weldment surface.

  17. Automated retinofugal visual pathway reconstruction with multi-shell HARDI and FOD-based analysis.

    Science.gov (United States)

    Kammen, Alexandra; Law, Meng; Tjan, Bosco S; Toga, Arthur W; Shi, Yonggang

    2016-01-15

    Diffusion MRI tractography provides a non-invasive modality to examine the human retinofugal projection, which consists of the optic nerves, optic chiasm, optic tracts, the lateral geniculate nuclei (LGN) and the optic radiations. However, the pathway has several anatomic features that make it particularly challenging to study with tractography, including its location near blood vessels and bone-air interface at the base of the cerebrum, crossing fibers at the chiasm, somewhat-tortuous course around the temporal horn via Meyer's Loop, and multiple closely neighboring fiber bundles. To date, these unique complexities of the visual pathway have impeded the development of a robust and automated reconstruction method using tractography. To overcome these challenges, we develop a novel, fully automated system to reconstruct the retinofugal visual pathway from high-resolution diffusion imaging data. Using multi-shell, high angular resolution diffusion imaging (HARDI) data, we reconstruct precise fiber orientation distributions (FODs) with high order spherical harmonics (SPHARM) to resolve fiber crossings, which allows the tractography algorithm to successfully navigate the complicated anatomy surrounding the retinofugal pathway. We also develop automated algorithms for the identification of ROIs used for fiber bundle reconstruction. In particular, we develop a novel approach to extract the LGN region of interest (ROI) based on intrinsic shape analysis of a fiber bundle computed from a seed region at the optic chiasm to a target at the primary visual cortex. By combining automatically identified ROIs and FOD-based tractography, we obtain a fully automated system to compute the main components of the retinofugal pathway, including the optic tract and the optic radiation. We apply our method to the multi-shell HARDI data of 215 subjects from the Human Connectome Project (HCP). Through comparisons with post-mortem dissection measurements, we demonstrate the retinotopic

  18. Spacecraft Autonomy and Automation: A Comparative Analysis of Strategies for Cost Effective Mission Operations

    Science.gov (United States)

    Wright, Nathaniel, Jr.

    2000-01-01

    Satellite operations have changed drastically over the last 40 years. On October 4, 1957, during the Cold War, the Soviet Union launched the world's first spacecraft into orbit. The Sputnik satellite orbited Earth for three months and catapulted the United States into a race for dominance in space. A year after Sputnik, President Dwight Eisenhower formed the National Aeronautics and Space Administration (NASA). With a team of scientists and engineers, NASA successfully launched Explorer 1, the first US satellite to orbit Earth. During these early years, massive amounts of ground support equipment and large numbers of operators were required to successfully operate spacecraft. Today, budget reductions and technological advances have forced new approaches to spacecraft operations. These approaches require increasingly complex on-board spacecraft systems that enable autonomous operations, resulting in more cost-effective mission operations. NASA's Goddard Space Flight Center, considered world class in satellite development and operations, has developed and operated over 200 satellites during its 40 years of existence. NASA Goddard is adopting several new millennium initiatives that lower operational costs through spacecraft autonomy and automation. This paper examines NASA's approach to spacecraft autonomy and ground system automation through a comparative analysis of satellite missions for the Hubble Space Telescope (HST), Near Earth Asteroid Rendezvous (NEAR), and Solar and Heliospheric Observatory (SOHO), with emphasis on cost reduction methods, risk analysis, and anomalies and strategies employed for mitigating risk.

  19. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is hence utilized for Fault detection and Isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  20. Correlation of the UV-induced mutational spectra and the DNA damage distribution of the human HPRT gene: Automating the analysis

    International Nuclear Information System (INIS)

    Automated DNA sequencers can be readily adapted for various types of sequence-based nucleic acid analysis; more recently, the distribution of UV photoproducts in the E. coli lacI gene was determined using techniques developed for automated fluorescence-based analysis. We have been working to improve this automated approach to damage distribution analysis. Our current method is more rigorous: we have new software that integrates the area under the individual peaks, rather than measuring the height of the curve. In addition, we now employ an internal standard. The analysis can also be partially automated. Detection limits for both major types of UV photoproducts (cyclobutane dimers and pyrimidine (6-4) pyrimidone photoproducts) are reported. The UV-induced damage distribution in the hprt gene is compared to the mutational spectra in human and rodent cells.
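
    The quantification idea described above, integrating peak areas rather than taking peak heights and normalising to an internal standard, can be sketched as follows; the traces are synthetic and no baseline correction is shown:

```python
# Simplified sketch: integrate the area under a damage-signal peak and
# normalise it to an internal-standard peak (synthetic traces, no baseline fit).
import numpy as np

def peak_area(signal, x, lo, hi):
    """Integrate the trace between positions lo and hi."""
    mask = (x >= lo) & (x <= hi)
    return np.trapz(signal[mask], x[mask])

x = np.linspace(0, 100, 2001)
trace = (np.exp(-0.5 * ((x - 30) / 1.5) ** 2)          # damage-site peak
         + 2.0 * np.exp(-0.5 * ((x - 70) / 1.5) ** 2)) # internal-standard peak

damage = peak_area(trace, x, 25, 35)
standard = peak_area(trace, x, 65, 75)
print(f"normalised damage signal: {damage / standard:.3f}")
```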

  1. Automated methods for thorium determination in liquids, solids and aerosols

    International Nuclear Information System (INIS)

    Methodology for determining trace thorium levels in a variety of sample types for compliance purposes was developed. Thorium in filtered water samples is concentrated by ferric hydroxide co-precipitation. Aerosols on glass-fibre, cellulose ester or teflon filters are acid digested and thorium is concentrated by lanthanum fluoride co-precipitation. Chemical separation and measurement are then done on a Technicon AAII-C auto-analyzer via TTA-solvent extraction and colorimetry using the thorium-arsenazo III colour complex. Solid samples are acid digested and thorium is concentrated and separated using lanthanum fluoride co-precipitation followed by anion-exchange chromatography. Measurement is then carried out on the autoanalyzer by direct development of the thorium-arsenazo III colour complex. Chemical yields are determined through the addition of thorium-234 tracer with assay by gamma-ray spectrometry. The sensitivities of the methods for liquids, aerosols and solids are approximately 1 μg/L, 0.5 μg and 0.5 μg/g, respectively. At thorium levels about ten times the detection limits, accuracy and reproducibility are typically ±10 percent for liquids and aerosols and ±15 percent for solid samples

  2. Redevelopment and reliability study of simultaneously uranium and thorium analysis automation control system

    International Nuclear Information System (INIS)

    Full-text: This project is to refurbish the Instrumental Delayed Neutron Activation Analysis System for Simultaneous Determination of Uranium and Thorium, namely PAUTS. PAUTS uses nuclear techniques for the quantitative determination of uranium-235 (U-235) and thorium-232 (Th-232) radionuclide contents in samples. It consists of three main automation components, namely the sample handling control, the data acquisition for neutron counting, and the data handling and analysis program. The automation control technology for this project is based on a personal computer (PC), Ethernet communication support, a programmable automation control (PAC) module CFP 2220, infrared photo sensors and the LabVIEW software package. The analysis sample capsule is placed in a transfer container, or rabbit, and is transferred using the fast pneumatic sample handling system for activation by neutron irradiation in the reactor core. The fission products of both radionuclides decay and emit delayed neutrons, which are counted using the nuclear counting electronics module. Studies on the reliability of the fast pneumatic sample handling using statistical methods show that a 95% confidence level has been reached. Results show that the mean transfer time of the sample from the loader to the reactor core is 3251 ± 210 ms, while the mean transfer time of the sample from the core to the counter chamber is 3264 ± 407 ms. The overall system reliability has been verified through analysis of calibration standard materials with known quantities of uranium and thorium (the IAEA-S17, IAEA-ThO2 and IAEA-S14 methods). At the moment the nuclear counting electronics are based on 4 neutron detectors, and the results were in line with the previous experiment. Results show that the measured U and Th contents average 19.35 ppm and 432.25 ppm respectively, compared with the known sample quantities of 29.0 ppm and 460 ppm. Studies on the effects of pneumatic sample handling on the irradiation time parameter indicated that the previous

  3. RoboSCell: An automated single cell arraying and analysis instrument

    KAUST Repository

    Sakaki, Kelly

    2009-09-09

    Single cell research has the potential to revolutionize experimental methods in biomedical sciences and contribute to clinical practices. Recent studies suggest analysis of single cells reveals novel features of intracellular processes, cell-to-cell interactions and cell structure. The methods of single cell analysis require mechanical resolution and accuracy that is not possible using conventional techniques. Robotic instruments and novel microdevices can achieve higher throughput and repeatability; however, the development of such instrumentation is a formidable task. A void exists in the state-of-the-art for automated analysis of single cells. With the increase in interest in single cell analyses in stem cell and cancer research the ability to facilitate higher throughput and repeatable procedures is necessary. In this paper, a high-throughput, single cell microarray-based robotic instrument, called the RoboSCell, is described. The proposed instrument employs a partially transparent single cell microarray (SCM) integrated with a robotic biomanipulator for in vitro analyses of live single cells trapped at the array sites. Cells, labeled with immunomagnetic particles, are captured at the array sites by channeling magnetic fields through encapsulated permalloy channels in the SCM. The RoboSCell is capable of systematically scanning the captured cells temporarily immobilized at the array sites and using optical methods to repeatedly measure extracellular and intracellular characteristics over time. The instrument's capabilities are demonstrated by arraying human T lymphocytes and measuring the uptake dynamics of calcein acetoxymethylester, all in a fully automated fashion. © 2009 Springer Science+Business Media, LLC.

  4. Automated analysis of alditols by anion-exchange chromatography with photometric and fluorimetric postcolumn derivatization.

    Science.gov (United States)

    Honda, S; Takahashi, M; Shimada, S; Kakehi, K; Ganno, S

    1983-02-01

    Eight alditols were separated in ca. 80 min as their borate complexes by stepwise elution with three borate buffers on a column packed with Hitachi 2633 resin. The alditols in the eluate were derivatized automatically to colored, fluorescent products by applying sequential reactions of periodate oxidation and Hantzsch condensation, and the products were detected either photometrically or fluorimetrically. This automated method allowed simultaneous determination of 20-500 and 20-200 nmol amounts of alditols by photometric and fluorimetric monitoring, respectively. The lower limits of detection were ca. 2 and 0.5 nmol, respectively. The interference by aldoses was slight. Aldoses may also be determined as alditols by direct injection of aqueous solutions to which excess amounts of sodium borohydride have been added. This method was applied with success to urinary alditol assay and to molecular weight determination by end group analysis. PMID:6846817

  5. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life situations.

  6. Long-term live cell imaging and automated 4D analysis of drosophila neuroblast lineages.

    Directory of Open Access Journals (Sweden)

    Catarina C F Homem

    Full Text Available The developing Drosophila brain is a well-studied model system for neurogenesis and stem cell biology. In the Drosophila central brain, around 200 neural stem cells called neuroblasts undergo repeated rounds of asymmetric cell division. These divisions typically generate a larger self-renewing neuroblast and a smaller ganglion mother cell that undergoes one terminal division to create two differentiating neurons. Although single mitotic divisions of neuroblasts can easily be imaged in real time, the lack of long term imaging procedures has limited the use of neuroblast live imaging for lineage analysis. Here we describe a method that allows live imaging of cultured Drosophila neuroblasts over multiple cell cycles for up to 24 hours. We describe a 4D image analysis protocol that can be used to extract cell cycle times and growth rates from the resulting movies in an automated manner. We use it to perform lineage analysis in type II neuroblasts where clonal analysis has indicated the presence of a transit-amplifying population that potentiates the number of neurons. Indeed, our experiments verify type II lineages and provide quantitative parameters for all cell types in those lineages. As defects in type II neuroblast lineages can result in brain tumor formation, our lineage analysis method will allow more detailed and quantitative analysis of tumorigenesis and asymmetric cell division in the Drosophila brain.

  7. Orbit transfer rocket engine technology program: Automated preflight methods concept definition

    Science.gov (United States)

    Erickson, C. M.; Hertzberg, D. W.

    1991-01-01

    The possibility of automating preflight engine checkouts on orbit transfer engines is discussed. The minimum requirements in terms of information and processing necessary to assess the engine's integrity and readiness to perform its mission were first defined. A variety of ways for remotely obtaining that information were generated. The sophistication of these approaches varied from a simple preliminary power up, where the engine is fired up for the first time, to the most advanced approach where the sensor and operational history data system alone indicates engine integrity. The critical issues and benefits of these methods were identified, outlined, and prioritized. The technology readiness of each of these automated preflight methods was then rated on a NASA Office of Exploration scale used for comparing technology options for future mission choices. Finally, estimates were made of the remaining cost to advance the technology for each method to a level where the system validation models have been demonstrated in a simulated environment.

  8. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    OpenAIRE

    Dr. Poonam Radadiya; Dr. Nandita Mehta; Dr. Hansa Goswami; Dr. R.N. Gonsai

    2015-01-01

    The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results an...

  9. Kinetics analysis and automated online screening of aminocarbonylation of aryl halides in flow

    OpenAIRE

    Moore, Jason S.; Smith, Christopher D; Jensen, Klavs F.

    2016-01-01

    Temperature, pressure, gas stoichiometry, and residence time were varied to control the yield and product distribution of the palladium-catalyzed aminocarbonylation of aromatic bromides in both a silicon microreactor and a packed-bed tubular reactor. Automation of the system set points and product sampling enabled facile and repeatable reaction analysis with minimal operator supervision. It was observed that the reaction was divided into two temperature regimes. An automated system was used t...

  10. Automated Source Code Analysis to Identify and Remove Software Security Vulnerabilities: Case Studies on Java Programs

    OpenAIRE

    2013-01-01

    The high-level contribution of this paper is to illustrate the development of generic solution strategies to remove software security vulnerabilities that could be identified using automated tools for source code analysis on software programs (developed in Java). We use the Source Code Analyzer and Audit Workbench automated tools, developed by HP Fortify Inc., for our testing purposes. We present case studies involving a file writer program embedded with features for password validation, and ...

  11. The BUME method: a novel automated chloroform-free 96-well total lipid extraction method for blood plasma

    OpenAIRE

    Löfgren, Lars; Ståhlman, Marcus; Forsberg, Gun-Britt; Saarinen, Sinikka; Nilsson, Ralf; Göran I Hansson

    2012-01-01

    Lipid extraction from biological samples is a critical and often tedious preanalytical step in lipid research. Primarily on the basis of automation criteria, we have developed the BUME method, a novel chloroform-free total lipid extraction method for blood plasma compatible with standard 96-well robots. In only 60 min, 96 samples can be automatically extracted with lipid profiles of commonly analyzed lipid classes almost identically and with absolute recoveries similar or better to what is ob...

  12. Automated analysis of damages for radiation in plastics surfaces; Analisis automatizado de danos por radiacion en superficies plasticas

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, C.; Camacho M, E.; Tavera, L.; Balcazar, M. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1990-02-15

    This work analyzes radiation-induced damage in a polymer characterized by the optical properties of its polished surfaces, by its uniformity and by a chemical resistance better than that of acrylic; it is resistant up to 150 degrees Celsius and weighs approximately half as much as glass. The objective of this work is the development of a method that analyzes, in an automated way, the radiation-induced surface damage in plastic materials by means of an image analyzer. (Author)

  13. Object Type Recognition for Automated Analysis of Protein Subcellular Location

    OpenAIRE

    Zhao, Ting; Velliste, Meel; Boland, Michael V.; Murphy, Robert F.

    2005-01-01

    The new field of location proteomics seeks to provide a comprehensive, objective characterization of the subcellular locations of all proteins expressed in a given cell type. Previous work has demonstrated that automated classifiers can recognize the patterns of all major subcellular organelles and structures in fluorescence microscope images with high accuracy. However, since some proteins may be present in more than one organelle, this paper addresses a more difficult task: recognizing a pa...

  14. Automated forensic extraction of encryption keys using behavioural analysis

    OpenAIRE

    Owen, Gareth

    2012-01-01

    In this paper we describe a technique for automatic algorithm identification and information extraction from unknown binaries. We emulate the binary using PyEmu forcing complete code coverage whilst simultaneously examining its behavior. Our behavior matcher then identifies specific algorithmic behavior and extracts information. We demonstrate the use of this technique for automated extraction of encryption keys from an unseen program with no prior knowledge about its implementation. Our tech...

  15. AutoGate: automating analysis of flow cytometry data

    OpenAIRE

    Meehan, Stephen; Walther, Guenther; Moore, Wayne; Orlova, Darya; Meehan, Connor; Parks, David; Ghosn, Eliver; Philips, Megan; Mitsunaga, Erin; Waters, Jeffrey; Kantor, Aaron; Okamura, Ross; Owumi, Solomon; Yang, Yang; Herzenberg, Leonard A.

    2014-01-01

    Nowadays, one can hardly imagine biology and medicine without flow cytometry to measure CD4 T cell counts in HIV, follow bone marrow transplant patients, characterize leukemias, etc. Similarly, without flow cytometry, there would be a bleak future for stem cell deployment, HIV drug development and full characterization of the cells and cell interactions in the immune system. But while flow instruments have improved markedly, the development of automated tools for processing and analyzing flow...

  16. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    OpenAIRE

    Tianhong Song; Sven Köhler; Bertram Ludäscher; James Hanken; Maureen Kelly; David Lowery; Macklin, James A.; Morris, Paul J.; Morris, Robert A.

    2014-01-01

    Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfal...

  17. 3D Assembly Group Analysis for Cognitive Automation

    OpenAIRE

    Christian Brecher; Thomas Breitbach; Simon Müller; Marcel Ph. Mayer; Barbara Odenthal; Schlick, Christopher M.; Werner Herfs

    2012-01-01

    A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprised of two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This i...

  18. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    Directory of Open Access Journals (Sweden)

    Alfonso Baldi

    2010-03-01

    Full Text Available Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible by the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval systems (CBIR).

  19. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic stations operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
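
    The idea of a percentile-based station noise baseline can be sketched as follows. This is not the PQLX implementation; the array layout and percentile choices are assumptions for illustration only.

      import numpy as np

      def station_noise_baseline(psd_segments, percentiles=(10, 50, 90)):
          """Baseline curves from many power spectral density (PSD) estimates.

          psd_segments : 2-D array (n_segments, n_frequencies), power in dB
          Returns a dict mapping percentile -> baseline curve over frequency.
          """
          psd = np.asarray(psd_segments, dtype=float)
          return {p: np.percentile(psd, p, axis=0) for p in percentiles}

      def out_of_nominal(psd_new, baseline_low, baseline_high):
          """Flag frequency bins where a new PSD leaves the baseline band."""
          return (psd_new < baseline_low) | (psd_new > baseline_high)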

  20. A comparison between automated detection methods of high-frequency oscillations (80–500 Hz) during seizures

    Science.gov (United States)

    Salami, Pariya; Lévesque, Maxime; Gotman, Jean; Avoli, Massimo

    2016-01-01

    High-frequency oscillations (HFOs, ripples: 80–200 Hz, fast ripples: 250–500 Hz) recorded from the epileptic brain are thought to reflect abnormal network-driven activity. They are also better markers of seizure onset zones compared to interictal spikes. There is thus an increasing number of studies analysing HFOs in vitro, in vivo and in the EEG of human patients with refractory epilepsy. However, most of these studies have focused on HFOs during interictal events or at seizure onset, and few have analysed HFOs during seizures. In this study, we are comparing three different automated methods of HFO detection to two methods of visual analysis, during the pre-ictal, ictal and post-ictal periods on multiple channels using the rat pilocarpine model of temporal lobe epilepsy. The first method (method 1) detected HFOs using the average of the normalised period, the second (method 2) detected HFOs using the average of the normalised period in 1 s windows and the third (method 3) detected HFOs using the average of a reference period before seizure onset. Overall, methods 2 and 3 showed higher sensitivity compared to method 1. When dividing the analysed traces in pre-, ictal and post-ictal periods, method 3 showed the highest sensitivity during the ictal period compared to method 1, while method 2 was not significantly different from method 1. These findings suggest that method 3 could be used for automated and reliable detection of HFOs on large data sets containing multiple channels during the ictal period. PMID:22983173
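
    A generic envelope-threshold detector conveys the flavour of such automated HFO methods. The sketch below is not one of the three methods compared in the study; the band, threshold and duration parameters are illustrative assumptions.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def detect_hfos(eeg, fs, band=(80, 200), n_sd=3.0, min_dur_ms=6.0):
          """Threshold-based HFO detection on a single channel (illustrative only)."""
          b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
          envelope = np.abs(hilbert(filtfilt(b, a, eeg)))
          threshold = envelope.mean() + n_sd * envelope.std()
          above = envelope > threshold
          # collect contiguous supra-threshold runs longer than the minimum duration
          events, start = [], None
          for i, flag in enumerate(np.append(above, False)):
              if flag and start is None:
                  start = i
              elif not flag and start is not None:
                  if (i - start) / fs * 1000.0 >= min_dur_ms:
                      events.append((start / fs, i / fs))
                  start = None
          return events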

  1. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    This paper proposes an integrated Markovian and back propagation neural network approach to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to drawbacks of the Markovian model for steady-state reliability computations and of the neural network for the initial training pattern, an integrated approach, called Markov-neural, is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implications are shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
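
    For readers unfamiliar with Markovian reliability assessment, the steady-state availability of a small continuous-time Markov model can be computed as below. This is a generic textbook calculation, not the paper's integrated Markov-neural method; the two-state rates are illustrative.

      import numpy as np

      def steady_state_availability(Q, up_states):
          """Steady-state availability from a generator matrix Q (rows sum to zero)."""
          n = Q.shape[0]
          # Solve pi Q = 0 together with the normalization sum(pi) = 1
          A = np.vstack([Q.T, np.ones(n)])
          b = np.append(np.zeros(n), 1.0)
          pi, *_ = np.linalg.lstsq(A, b, rcond=None)
          return pi[list(up_states)].sum()

      # Two-state example: failure rate lam, repair rate mu (illustrative values)
      lam, mu = 0.01, 0.5
      Q = np.array([[-lam, lam], [mu, -mu]])
      print(steady_state_availability(Q, up_states=[0]))   # ~ mu / (lam + mu)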

  2. Validation of an automated solid-phase extraction method for the analysis of 23 opioids, cocaine, and metabolites in urine with ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Ramírez Fernández, María del Mar; Van Durme, Filip; Wille, Sarah M R; di Fazio, Vincent; Kummer, Natalie; Samyn, Nele

    2014-06-01

    The aim of this work was to automate a sample preparation procedure extracting morphine, hydromorphone, oxymorphone, norcodeine, codeine, dihydrocodeine, oxycodone, 6-monoacetyl-morphine, hydrocodone, ethylmorphine, benzoylecgonine, cocaine, cocaethylene, tramadol, meperidine, pentazocine, fentanyl, norfentanyl, buprenorphine, norbuprenorphine, propoxyphene, methadone and 2-ethylidene-1,5-dimethyl-3,3-diphenylpyrrolidine from urine samples. Samples were extracted by solid-phase extraction (SPE) with cation exchange cartridges using a TECAN Freedom Evo 100 base robotic system, including a hydrolysis step prior to extraction when required. Block modules were carefully selected in order to use the same consumable material as in manual procedures to reduce cost and/or manual sample transfers. Moreover, the present configuration included pressure-monitored pipetting, increasing pipetting accuracy and detecting sampling errors. The compounds were then separated in a chromatographic run of 9 min using a BEH Phenyl analytical column on an ultra-performance liquid chromatography-tandem mass spectrometry system. Optimization of the SPE was performed with different wash conditions and elution solvents. Intra- and inter-day relative standard deviations (RSDs) were within ±15% and bias was within ±15% for most of the compounds. Recovery was >69% (RSD heroin, buprenorphine and methadone, offering fast and reliable results. Automation resulted in improved precision and accuracy, and minimal operator intervention, leading to safer sample handling and less time-consuming procedures. PMID:24790061
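
    The precision and bias figures quoted above correspond to standard validation statistics, sketched here for completeness; the replicate values and the nominal concentration in the example are invented for illustration.

      import statistics

      def rsd_percent(values):
          """Relative standard deviation (%) of replicate measurements."""
          return 100.0 * statistics.stdev(values) / statistics.mean(values)

      def bias_percent(values, nominal):
          """Bias (%) of replicate measurements against a nominal concentration."""
          return 100.0 * (statistics.mean(values) - nominal) / nominal

      # e.g. five replicates of a 100 ng/mL quality-control sample (illustrative)
      qc = [96.2, 101.5, 99.0, 103.8, 97.4]
      print(f"RSD = {rsd_percent(qc):.1f} %, bias = {bias_percent(qc, 100.0):+.1f} %")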

  3. Automated system for high-throughput protein production using the dialysis cell-free method.

    Science.gov (United States)

    Aoki, Masaaki; Matsuda, Takayoshi; Tomo, Yasuko; Miyata, Yukako; Inoue, Makoto; Kigawa, Takanori; Yokoyama, Shigeyuki

    2009-12-01

    High-throughput protein production systems have become an important issue, because protein production is one of the bottleneck steps in large-scale structural and functional analyses of proteins. We have developed a dialysis reactor and a fully automated system for protein production using the dialysis cell-free synthesis method, which we previously established to produce protein samples on a milligram scale in a high-throughput manner. The dialysis reactor was designed to be suitable for an automated system and has six dialysis cups attached to a flat dialysis membrane. The automated system is based on a Tecan Freedom EVO 200 workstation in a three-arm configuration, and is equipped with shaking incubators, a vacuum module, a robotic centrifuge, a plate heat sealer, and a custom-made tilting carrier for collection of reaction solutions from the flat-bottom cups with dialysis membranes. The consecutive process, from the dialysis cell-free protein synthesis to the partial purification by immobilized metal affinity chromatography on a 96-well filtration plate, was performed within ca. 14h, including 8h of cell-free protein synthesis. The proteins were eluted stepwise in a high concentration using EDTA by centrifugation, while the resin in the filtration plate was washed on the vacuum manifold. The system was validated to be able to simultaneously and automatically produce up to 96 proteins in yields of several milligrams with high well-to-well reliability, sufficient for structural and functional analyses of proteins. The protein samples produced by the automated system have been utilized for NMR screening to judge the protein foldedness and for structure determinations using heteronuclear multi-dimensional NMR spectroscopy. The automated high-throughput protein production system represents an important breakthrough in the structural and functional studies of proteins and has already contributed a massive amount of results in the structural genomics project at the

  4. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    OpenAIRE

    Suleimanov, Yury V.; Green, William H.

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using in cooperation single- and double-ended transition-state optimization algorithms - the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of combustion and atmospheric chemistry importance is investigated. The proposed algorithm allowed us to detect without any human intervention not on...

  5. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or large numbers of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as of mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of a possible synergistic effect in the presence of iron.

  6. Automated analysis of image mammogram for breast cancer diagnosis

    Science.gov (United States)

    Nurhasanah, Sampurno, Joko; Faryuni, Irfana Diah; Ivansyah, Okto

    2016-03-01

    Medical imaging helps doctors diagnose and detect diseases inside the body without surgery. A mammogram is a medical image of the inner breast. Diagnosis of breast cancer needs to be done in detail and as soon as possible to determine the next medical treatment. The aim of this work is to increase the objectivity of clinical diagnosis by using fractal analysis. This study applies a fractal method based on 2D Fourier analysis to determine the density of normal and abnormal tissue, and applies a segmentation technique based on the K-Means clustering algorithm to abnormal images to determine the organ boundary and calculate the area of the segmented region. The results show that the fractal method based on 2D Fourier analysis can be used to distinguish between normal and abnormal breasts, and that segmentation with the K-Means clustering algorithm is able to generate the boundaries of normal and abnormal tissue, so the area of the abnormal tissue can be determined.
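
    A one-dimensional K-Means intensity segmentation, of the kind referred to above, can be sketched as follows. This is a generic illustration rather than the authors' implementation; the number of clusters and the brightest-cluster selection step are assumptions.

      import numpy as np

      def kmeans_segment(image, k=3, n_iter=50, seed=0):
          """Segment a grayscale image into k intensity clusters,
          labelled 0 (darkest) to k-1 (brightest)."""
          rng = np.random.default_rng(seed)
          pixels = np.asarray(image, dtype=float).ravel()
          centers = rng.choice(pixels, size=k, replace=False)
          for _ in range(n_iter):
              labels = np.argmin(np.abs(pixels[:, None] - centers[None, :]), axis=1)
              new_centers = np.array([pixels[labels == j].mean() if np.any(labels == j)
                                      else centers[j] for j in range(k)])
              if np.allclose(new_centers, centers):
                  break
              centers = new_centers
          order = np.argsort(centers)          # relabel so label k-1 is the brightest
          relabel = np.empty(k, dtype=int)
          relabel[order] = np.arange(k)
          return relabel[labels].reshape(np.asarray(image).shape)

      # Pixel area of the brightest cluster (a crude stand-in for the segmented region):
      # area_px = np.sum(kmeans_segment(mammogram) == 2)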

  7. Automation of C-terminal sequence analysis of 2D-PAGE separated proteins

    Directory of Open Access Journals (Sweden)

    P.P. Moerman

    2014-06-01

    Full Text Available Experimental assignment of the protein termini remains essential to define the functional protein structure. Here, we report on the improvement of a proteomic C-terminal sequence analysis method. The approach aims to discriminate the C-terminal peptide in a CNBr-digest where Met-Xxx peptide bonds are cleaved, yielding internal peptides ending in a homoserine lactone (hsl) derivative. pH-dependent partial opening of the lactone ring results in the formation of doublets for all internal peptides. C-terminal peptides are distinguished as singlet peaks by MALDI-TOF MS and MS/MS is then used for their identification. We present a fully automated protocol established on a robotic liquid-handling station.

  8. An automated procedure for the analysis of time-resolved Schottky spectra

    Science.gov (United States)

    Bühler, Paul

    2016-05-01

    The unique combination of facilities and instrumentation available at the GSI Helmholtzzentrum für Schwerionenforschung in Darmstadt, Germany allows us to investigate the decay modes of highly charged ions by Schottky Mass Spectrometry. In single-ion decay spectrometry the fate of single ions cruising in the cooler-storage ring ESR can be followed and their exact decay time is determined. For a fast and repeated analysis of such data sets a highly automated procedure has been developed. The method is demonstrated with a measurement of the He-like 142Pm59+ which decays by electron-capture and β+ decay to 142Nd. For the total decay constant we find a value of λ=0.0164±0.0010 s-1 in the rest frame of the ions and the branching ratio λβ+ /λEC = 3.68 ± 0.014.

  9. Mass asymmetry and tricyclic wobble motion assessment using automated launch video analysis

    Institute of Scientific and Technical Information of China (English)

    Ryan DECKER; Joseph DONINI; William GARDNER; Jobin JOHN; Walter KOENIG

    2016-01-01

    This paper describes an approach to identify epicyclic and tricyclic motion during projectile flight caused by mass asymmetries in spin-stabilized projectiles. Flight video was captured following projectile launch of several M110A2E1 155 mm artillery projectiles. These videos were then analyzed using the automated flight video analysis method to attain their initial position and orientation histories. Examination of the pitch and yaw histories clearly indicates that in addition to epicyclic motion’s nutation and precession oscillations, an even faster wobble amplitude is present during each spin revolution, even though some of the amplitudes of the oscillation are smaller than 0.02 degree. The results are compared to a sequence of shots where little appreciable mass asymmetries were present, and only nutation and precession frequencies are predominantly apparent in the motion history results. Magnitudes of the wobble motion are estimated and compared to product of inertia measurements of the asymmetric projectiles.

  10. Results of Automated Retinal Image Analysis for Detection of Diabetic Retinopathy from the Nakuru Study, Kenya

    DEFF Research Database (Denmark)

    Juul Bøgelund Hansen, Morten; Abramoff, M. D.; Folk, J. C.;

    2015-01-01

    Objective Digital retinal imaging is an established method of screening for diabetic retinopathy (DR). It has been established that currently about 1% of the world's blind or visually impaired is due to DR. However, the increasing prevalence of diabetes mellitus and DR is creating an increased workload on those with expertise in grading retinal images. Safe and reliable automated analysis of retinal images may support screening services worldwide. This study aimed to compare the Iowa Detection Program's (IDP) ability to detect diabetic eye diseases (DED) to human grading carried out at Moorfields... gave an AUC of 0.878 (95% CI 0.850-0.905). It showed a negative predictive value of 98%. The IDP missed no vision-threatening retinopathy in any patients and none of the false negative cases met criteria for treatment. Conclusions In this epidemiological sample, the IDP's grading was comparable to that...

  11. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat; Ruebel, O.; Weber, G.; Hamann, B.

    2010-05-21

    scientific data mining is increasingly considered. In plasma simulations, Bagherjeiran et al. presented a comprehensive report on applying graph-based techniques for orbit classification. They used the KAM classifier to label points and components in single and multiple orbits. Love et al. conducted an image space analysis of coherent structures in plasma simulations. They used a number of segmentation and region-growing techniques to isolate regions of interest in orbit plots. Both approaches analyzed particle accelerator data, targeting the system dynamics in terms of particle orbits. However, they did not address particle dynamics as a function of time or inspect the behavior of bunches of particles. Ruebel et al. addressed the visual analysis of massive laser wakefield acceleration (LWFA) simulation data using interactive procedures to query the data. Sophisticated visualization tools were provided to inspect the data manually. Ruebel et al. have integrated these tools into the visualization and analysis system VisIt, in addition to utilizing efficient data management based on HDF5, H5Part, and the index/query tool FastBit. Ruebel et al. also proposed automatic beam path analysis using a suite of methods to classify particles in simulation data and to analyze their temporal evolution. To enable researchers to accurately define particle beams, the method computes a set of measures based on the path of particles relative to the distance of the particles to a beam. To achieve good performance, this framework uses an analysis pipeline designed to quickly reduce the amount of data that needs to be considered in the actual path distance computation. As part of this process, region-growing methods are utilized to detect particle bunches at single time steps. Efficient data reduction is essential to enable automated analysis of large data sets as described in the next section, where data reduction methods are steered to the particular requirements of our clustering analysis

  12. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  13. A method for the automated, reliable retrieval of publication-citation records.

    Directory of Open Access Journals (Sweden)

    Derek Ruths

    Full Text Available BACKGROUND: Publication records and citation indices often are used to evaluate academic performance. For this reason, obtaining or computing them accurately is important. This can be difficult, largely due to a lack of complete knowledge of an individual's publication list and/or lack of time available to manually obtain or construct the publication-citation record. While online publication search engines have somewhat addressed these problems, using raw search results can yield inaccurate estimates of publication-citation records and citation indices. METHODOLOGY: In this paper, we present a new, automated method that produces estimates of an individual's publication-citation record from an individual's name and a set of domain-specific vocabulary that may occur in the individual's publication titles. Because this vocabulary can be harvested directly from a research web page or online (partial) publication list, our method delivers an easy way to obtain estimates of a publication-citation record and the relevant citation indices. Our method works by applying a series of stringent name and content filters to the raw publication search results returned by an online publication search engine. In this paper, our method is run using Google Scholar, but the underlying filters can be easily applied to any existing publication search engine. When compared against a manually constructed data set of individuals and their publication-citation records, our method provides significant improvements over raw search results. The estimated publication-citation records returned by our method have an average sensitivity of 98% and specificity of 72% (in contrast to raw search result specificity of less than 10%). When citation indices are computed using these records, the estimated indices are within 10% of the true value, compared to raw search results, which have overestimates of, on average, 75%. CONCLUSIONS: These results confirm that our method provides
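
    The name-and-vocabulary filtering idea can be illustrated with a short sketch; the record structure and matching rules below are assumptions, not the published filters.

      def filter_records(records, author_name, vocabulary, min_matches=1):
          """Keep raw search results whose authors contain the surname and whose
          title shares at least min_matches words with the domain vocabulary.

          records : list of dicts with 'title' and 'authors' fields (hypothetical
                    structure; adapt to the search engine actually used).
          """
          surname = author_name.split()[-1].lower()
          vocab = {w.lower() for w in vocabulary}
          kept = []
          for rec in records:
              authors = rec.get("authors", "").lower()
              title_words = set(rec.get("title", "").lower().split())
              if surname in authors and len(title_words & vocab) >= min_matches:
                  kept.append(rec)
          return kept

      def h_index(citation_counts):
          """h-index from the citation counts of the filtered records."""
          counts = sorted(citation_counts, reverse=True)
          return sum(1 for i, c in enumerate(counts, start=1) if c >= i)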

  14. Towards an Automated Semiotic Analysis of the Romanian Political Discourse

    Directory of Open Access Journals (Sweden)

    Daniela Gifu

    2013-04-01

    Full Text Available As is known, on the political scene the success of a speech can be measured by the degree to which the speaker is able to change the attitudes, opinions, feelings and political beliefs of the audience. We suggest a range of analysis tools, all belonging to semiotics, from lexical-semantic to syntactic and rhetorical, which, integrated into the exploratory panoply of discursive weapons of a political speaker, could influence the impact of her/his speeches on a sensitive audience. Our approach is based on the assumption that semiotics, as both a methodology and a meta-language, can support a situational analysis of political discourse. Such an analysis assumes establishing the communication situation, in our case the Parliament's vote in favour of suspending the Romanian President, through which we can describe an act of communication. We describe a platform, the Discourse Analysis Tool (DAT), which integrates a range of natural language processing tools with the intent to identify significant characteristics of political discourse. The tool is able to produce comparative diagrams between the speeches of two or more subjects, or to analyse the same subject in different contexts. Only the lexical-semantic methods are operational in the platform today, but our investigation suggests new dimensions touching the syntactic, rhetorical and coherence perspectives.

  15. On-line diagnostic method with automated noise-signature learning

    International Nuclear Information System (INIS)

    In this paper, an advanced diagnostic method is proposed that uses automated pattern recognition for reactor noise. The method enables intensive diagnosis of known anomalies and extensive detection of unknown plant states. It also enables automatic learning of reference noise patterns for an unknown plant state and monitoring of the subsequent state change by regarding the new reference patterns as those for a known plant state. Application results for the method used on artificial noise data produced by a fast breeder reactor noise simulator are presented

  16. Measurement precision and biological variation of cranial arteries using automated analysis of 3 T magnetic resonance angiography

    DEFF Research Database (Denmark)

    Amin, Faisal Mohammad; Lundholm, Elisabet; Hougaard, Anders;

    2014-01-01

    BACKGROUND: Non-invasive magnetic resonance angiography (MRA) has facilitated repeated measurements of human cranial arteries in several headache and migraine studies. To ensure comparability across studies the same automated analysis software has been used, but the intra- and inter-observer, day-to-day and side-to-side variations have not yet been published. We hypothesised that the observer-related, side-to-side, and day-to-day variations would be less than 10%. METHODS: Ten female participants were studied using high-resolution MRA on two study days separated by at least one week. Using the automated LKEB-MRA vessel wall analysis software, arterial circumferences were measured by blinded observers. Each artery was analysed twice by each of the two different observers. The primary endpoints were to determine the intraclass correlation coefficient (ICC) and intra- and inter-observer, the day...

  17. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    This thesis describes the way in which, at IRI, instrumental neutron activation analysis (INAA) has been developed into an automated system for routine analysis. This work is based on 20 publications describing the development of INAA since 1968. (Auth.)

  18. Application of quantum dots as analytical tools in automated chemical analysis: A review

    Energy Technology Data Exchange (ETDEWEB)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L. [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal); Santos, João L.M., E-mail: joaolms@ff.up.pt [REQUIMTE, Laboratory of Applied Chemistry, Department of Chemical Sciences, Faculty of Pharmacy of Porto University, Rua Jorge Viterbo Ferreira, 228, 4050-313 Porto (Portugal)

    2012-07-20

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques, which would allow to take advantage of particular features of the nanocrystals such as the versatile surface chemistry and ligand binding ability, the aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic and even stability impairing reaction conditions, is hitherto very limited. In this review, we provide insights into the analytical potential of quantum dots focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  19. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques, which would allow to take advantage of particular features of the nanocrystals such as the versatile surface chemistry and ligand binding ability, the aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence providing the means for the implementation of renewable chemosensors or even the utilisation of more drastic and even stability impairing reaction conditions, is hitherto very limited. In this review, we provide insights into the analytical potential of quantum dots focusing on prospects of their utilisation in automated flow-based and flow-related approaches and the future outlook of QDs applications in chemical analysis.

  20. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data

    Energy Technology Data Exchange (ETDEWEB)

    Shrestha, Sachin L., E-mail: sachin.shrestha@sydney.edu.au [School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Breen, Andrew J.; Trimby, Patrick [Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia); Proust, Gwénaëlle [School of Civil Engineering, The University of Sydney, NSW 2006 (Australia); Ringer, Simon P.; Cairney, Julie M. [School of Aerospace, Mechanical and Mechatronic Engineering, The University of Sydney, NSW 2006 (Australia); Australian Centre for Microscopy and Microanalysis, The University of Sydney, NSW 2006 (Australia)

    2014-02-01

    The identification and quantification of the different ferrite microconstituents in steels have long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time-consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. - Highlights: • New automated method to identify and quantify ferrite microconstituents in HSLA steels is presented. • Unique characteristics of the ferrite microconstituents are investigated using EBSD. • Characteristics of ferrite microconstituents are exploited to identify the type of ferrite grains within the steel's microstructures. • The identified ferrite grains are linked to their grain size for area fraction calculations.
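
    Criteria-based grain classification of the kind described above might look like the following sketch. The field names and threshold values are placeholders, not the published criteria.

      def classify_ferrite_grain(grain):
          """Assign a ferrite class to one EBSD grain from simple criteria.

          grain : dict with 'aspect_ratio', 'mean_misorientation' (degrees),
                  'boundary_misorientations' (list, degrees) and 'area' -
                  hypothetical fields exported from EBSD post-processing software.
          """
          boundaries = grain["boundary_misorientations"]
          frac_high_angle = sum(1 for m in boundaries if m > 15.0) / max(len(boundaries), 1)
          if grain["mean_misorientation"] > 2.0 and grain["aspect_ratio"] > 3.0:
              return "acicular/bainitic ferrite"
          if frac_high_angle > 0.8 and grain["mean_misorientation"] < 1.0:
              return "polygonal ferrite"
          return "unclassified"

      def area_fractions(grains):
          """Area fraction of each ferrite class, weighting grains by their area."""
          total = sum(g["area"] for g in grains)
          fractions = {}
          for g in grains:
              label = classify_ferrite_grain(g)
              fractions[label] = fractions.get(label, 0.0) + g["area"] / total
          return fractions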

  1. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2012-12-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing their first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 Tb of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize data range, persistence, and stochasticity on each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.
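
    Binary plausibility checks of this kind can be sketched as below. The thresholds are illustrative placeholders rather than NEON-derived parameters, and the step test stands in for the stochasticity check described in the text.

      import numpy as np

      def range_test(x, lo, hi):
          """Flag samples outside a plausible measurement range."""
          x = np.asarray(x, dtype=float)
          return (x < lo) | (x > hi)

      def persistence_test(x, window, min_variation):
          """Flag windows where the signal is implausibly constant (stuck sensor)."""
          x = np.asarray(x, dtype=float)
          flags = np.zeros(len(x), dtype=bool)
          for i in range(0, len(x) - window + 1):
              if np.ptp(x[i:i + window]) < min_variation:
                  flags[i:i + window] = True
          return flags

      def step_test(x, max_step):
          """Flag implausibly large jumps between consecutive samples."""
          x = np.asarray(x, dtype=float)
          flags = np.zeros(len(x), dtype=bool)
          flags[1:] = np.abs(np.diff(x)) > max_step
          return flags

      # Example: an air-temperature stream checked against illustrative thresholds
      # failed = range_test(t, -60, 60) | persistence_test(t, 360, 0.01) | step_test(t, 10)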

  2. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2013-07-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing their first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 Tb of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize the data range and variance of each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed, and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.

  3. Method and System for Protection of Automated Control Systems for “Smart Buildings”

    Directory of Open Access Journals (Sweden)

    Dmitry Mikhaylov

    2013-07-01

    Full Text Available The paper describes a system and method for protection of an automated control system (ACS) against unauthorized devices connected to the ACS via wired or wireless channels, one that substantially obviates the disadvantages of the related art. The protection system monitors the signals spreading in the network, analyzing the performance of the network for malicious code or hidden connections of an attacker. The system is developed specifically for this purpose and can protect industrial control systems more effectively than standard anti-virus programs. Specific anti-virus software installed on a central server of the automated control system protects it from software-based attacks from both internal and external offenders. The system comprises a plurality of bus protection devices of different types, including any of a twisted-pair protection device, a power-line protection device, an On-Board Diagnostics signal protocol protection device, and a wireless protection device.

  4. Conceptual design for comprehensive automation in radiochemical analysis of bioassay samples

    International Nuclear Information System (INIS)

    The Bioassay Laboratory of the Health Physics Division is entrusted with the task of carrying out bioassay monitoring of occupational workers from various plants/divisions of BARC for various radionuclides such as Pu, U, Th, 90Sr, 3H etc. On average, about 1400-1500 analyses are performed on 700-800 urine samples collected annually from radiation workers. The workload has increased by 1.5 to 2.0 times in the recent past and is expected to increase further due to the expanding nuclear programmes of the Department. Therefore, it was planned to carry out automation of various stages of bioassay sample handling, processing and analysis under the XI plan programme. Automation work in the Bioassay Lab is planned to be taken up in three stages, namely: automation in the initial processing of (i) urine samples and (ii) fecal samples, and (iii) automation in the radiochemical analysis of bioassay samples. In the initial phase, automation in the radiochemical analysis of bioassay samples has been taken up.

  5. Object-oriented database design for the contaminant analysis automation project

    International Nuclear Information System (INIS)

    The Contaminant Analysis Automation project's automated soil analysis laboratory uses an Object-Oriented database for storage of runtime and archive information. Data which is generated by the processing of a sample, and is relevant for verification of the specifics of that process, is retained in the database. The database also contains intermediate results which are generated by one step of the process and used for decision making by later steps. The description of this database reveals design considerations of the objects used to model the behavior of the chemical laboratory and its components

  6. Automated analysis of pumping tests; Analise automatizada de testes de bombeamento

    Energy Technology Data Exchange (ETDEWEB)

    Sugahara, Luiz Alberto Nozaki

    1996-01-01

    An automated procedure for the analysis of pumping test data from groundwater wells is described. Computer software was developed to be used under the Windows operating system. The software allows the choice of three mathematical models for representing the aquifer behavior: confined aquifer (Theis model); leaky aquifer (Hantush model); unconfined aquifer (Boulton model). The analysis of pumping test data using the proper aquifer model allows for the determination of model parameters such as transmissivity, storage coefficient, leakage coefficient and delay index. The computer program can be used for the analysis of data obtained from both pumping tests, with one or more pumping rates, and recovery tests. In the multiple-rate case, a desuperposition procedure has been implemented in order to obtain the equivalent aquifer response for the first flow rate, which is used to obtain an initial estimate of the model parameters. Such an initial estimate is required by the non-linear regression analysis method. The solutions to the partial differential equations describing the aquifer behavior were obtained in Laplace space, followed by numerical inversion of the transformed solution using the Stehfest algorithm. The data analysis procedure is based on a non-linear regression method, matching the field data to the theoretical response of a selected aquifer model for a given type of test. A least-squares regression method was implemented using either Gauss-Newton or Levenberg-Marquardt procedures for minimization of an objective function. The computer software can also be applied to multiple-rate test data in order to determine the non-linear well coefficient, allowing for the computation of the well inflow performance curve. (author)
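
    The confined-aquifer (Theis) match described above can be illustrated with a standard non-linear least-squares fit. This is a generic sketch rather than the software described in the record; the pumping rate, observation radius and starting values are assumptions.

      import numpy as np
      from scipy.optimize import curve_fit
      from scipy.special import exp1

      def theis_drawdown(t, T, S, Q=0.01, r=10.0):
          """Theis drawdown s(t) for a confined aquifer under constant pumping.

          t : time since pumping started [s]      T : transmissivity [m^2/s]
          S : storage coefficient [-]             Q : pumping rate [m^3/s]
          r : radial distance to the observation well [m]
          """
          u = r**2 * S / (4.0 * T * t)
          return Q / (4.0 * np.pi * T) * exp1(u)   # W(u) is the exponential integral

      # Levenberg-Marquardt match of field data (illustrative starting values)
      # t_obs, s_obs = load_pumping_test(...)
      # (T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs,
      #                               p0=(1e-3, 1e-4), method="lm")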

  7. Automated data model evaluation

    International Nuclear Information System (INIS)

    The modeling process is an essential phase within information systems development and implementation. This paper presents methods and techniques for the analysis and evaluation of data model correctness. Recent methodologies and development results regarding automation of the process of model correctness analysis, and its relations with ontology tools, are presented. Key words: Database modeling, Data model correctness, Evaluation

  8. Automated method for determination of uranium in kerosene-amine sulphate extracts

    International Nuclear Information System (INIS)

    An automated method is described for the determination of uranium(VI) that has been extracted into a trialkylamine in kerosene or similar diluent from sulphuric acid leach liquor. The method uses the continuous segmented-flow technique and can be set up with the use of commercial components. Discrimination against interference from other ions, especially sulphate, should be adequate for most purposes. 0.5 to 5 g uranium per litre of extract can be determined at a rate of 60 samples per hour. Minor modifications permit extension of this range to lower concentrations. (author)

  9. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit

    2011-01-01

    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which allows the accumulation reaction to be identified and quantified without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  10. 3D Assembly Group Analysis for Cognitive Automation

    Directory of Open Access Journals (Sweden)

    Christian Brecher

    2012-01-01

    Full Text Available A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprised of two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This information is used to determine the current state of the assembly group and is compared to a CAD model for validation purposes. In order to efficiently resolve erroneous situations, the results are interactively accessible to a human expert. The implications for an industrial application are demonstrated by transferring the developed concepts to an assembly scenario for switch-cabinet systems.

  11. Automation of Morphometric Measurements for Planetary Surface Analysis and Cartography

    Science.gov (United States)

    Kokhanov, A. A.; Bystrov, A. Y.; Kreslavsky, M. A.; Matveev, E. V.; Karachevtseva, I. P.

    2016-06-01

    To automate the measurement of morphometric parameters of surface relief, various tools were developed and integrated into GIS. We have created a tool which calculates statistical characteristics of the surface: interquartile ranges of heights and of slopes, as well as second derivatives of height fields as measures of topographic roughness. Other tools were created for morphological studies of craters. One of them allows automatic placement of topographic profiles through the geometric center of a crater. Another tool was developed for the calculation of small crater depths and shape estimation, using the C++ programming language. Additionally, we have prepared a tool for calculating volumes of relief features from DTM rasters. The created software modules and models will be available in a newly developed web-GIS system operating in a distributed cloud environment.
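
    A minimal sketch of two of the roughness measures mentioned above, the interquartile ranges of heights and of slopes, computed over a DTM raster window. The grid spacing, the use of numpy.gradient for slope, and the synthetic height field are assumptions, not the authors' GIS implementation.

```python
# Hedged sketch: interquartile ranges of heights and slopes from a DTM window.
import numpy as np

def roughness_measures(dtm, cell_size=100.0):
    """Return IQR of heights and IQR of slope (degrees) for a DTM window."""
    heights = dtm[np.isfinite(dtm)]
    iqr_height = np.percentile(heights, 75) - np.percentile(heights, 25)

    gy, gx = np.gradient(dtm, cell_size)             # height derivatives per metre
    slope_deg = np.degrees(np.arctan(np.hypot(gx, gy)))
    slope_deg = slope_deg[np.isfinite(slope_deg)]
    iqr_slope = np.percentile(slope_deg, 75) - np.percentile(slope_deg, 25)
    return iqr_height, iqr_slope

# Example on a synthetic 100 x 100 height field (illustrative only)
dtm = np.random.default_rng(1).normal(0.0, 5.0, (100, 100)).cumsum(axis=1)
print(roughness_measures(dtm))
```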

  12. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of ''what-if'' options in building design--a limited search for an optimal solution, or ''optimization''. Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  13. Automated oscillometric determination of the ankle-brachial index: a systematic review and meta-analysis.

    Science.gov (United States)

    Verberk, Willem J; Kollias, Anastasios; Stergiou, George S

    2012-09-01

    Measurement of the ankle-brachial index (ABI) using a Doppler device is widely used to identify subjects with peripheral artery disease (PAD), and those who are at high risk of cardiovascular disease. This paper presents a systematic review (Medline/PubMed, Embase and Cochrane) and meta-analysis of studies assessing the usefulness of automated oscillometric devices for ABI estimation and PAD detection compared with the conventional Doppler method. A total of 25 studies including 4186 subjects were analyzed. A random-effects model analysis showed that the average oscillometric ABI was similar to the Doppler ABI (mean difference ± s.e. 0.020 ± 0.018, P=0.3) but that the absolute differences were significant (0.048 ± 0.009). Simultaneous arm-leg measurements resulted in a smaller difference between the average oscillometric ABI value and the average Doppler ABI value than did sequential measurements (-0.012 ± 0.022 vs. 0.040 ± 0.026, respectively, P<0.01). The average sensitivity and specificity of the oscillometric ABI estimation in PAD diagnosis was 69 ± 6% and 96 ± 1%, respectively (with Doppler ABI taken as the reference). These data suggest that an automated ABI measurement obtained by oscillometric blood pressure monitors is a reliable and practical alternative to the conventional Doppler measurement for the detection of PAD. To increase the sensitivity of the PAD diagnosis based on an oscillometric ABI, a higher threshold of 1.0 might be preferable. PMID:22739420
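
    A very small sketch of the index itself and the screening rule discussed above: the ABI is the ratio of ankle to brachial systolic pressure, screened against the conventional 0.9 cut-off or the higher 1.0 cut-off suggested for oscillometric readings. The pressure values and function names are illustrative.

```python
# Hedged sketch: ankle-brachial index from oscillometric systolic pressures and a
# simple PAD screen. Thresholds follow convention (0.9) and the 1.0 value above.
def ankle_brachial_index(ankle_systolic: float, brachial_systolic: float) -> float:
    return ankle_systolic / brachial_systolic

def pad_suspected(abi: float, threshold: float = 0.9) -> bool:
    return abi < threshold

abi = ankle_brachial_index(ankle_systolic=112.0, brachial_systolic=128.0)
print(f"ABI = {abi:.2f}, PAD suspected (0.9): {pad_suspected(abi)}, "
      f"PAD suspected (1.0): {pad_suspected(abi, 1.0)}")
```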

  14. Automated detection of regions of interest for tissue microarray experiments: an image texture analysis

    International Nuclear Information System (INIS)

    Recent research with tissue microarrays led to rapid progress toward quantifying the expressions of large sets of biomarkers in normal and diseased tissue. However, standard procedures for sampling tissue for molecular profiling have not yet been established. This study presents a high-throughput analysis of texture heterogeneity on breast tissue images for the purpose of identifying regions of interest in the tissue for molecular profiling via tissue microarray technology. Image texture of breast histology slides was described in terms of three parameters: the percentage of area occupied in an image block by chromatin (B), the percentage occupied by stroma-like regions (P), and a statistical heterogeneity index H commonly used in image analysis. Texture parameters were defined and computed for each of the thousands of image blocks in our dataset using both gray-scale and color segmentation. The image blocks were then classified into three categories using the texture feature parameters in a novel statistical learning algorithm. These categories are as follows: image blocks specific to normal breast tissue, blocks specific to cancerous tissue, and those image blocks that are non-specific to normal and disease states. Gray-scale and color segmentation techniques led to identification of the same regions in histology slides as cancer-specific. Moreover, the image blocks identified as cancer-specific belonged to those cell-crowded regions in whole-section image slides that were marked by two pathologists as regions of interest for further histological studies. These results indicate the high efficiency of our automated method for identifying pathologic regions of interest on histology slides. Automation of critical region identification will help minimize the inter-rater variability among different raters (pathologists), as hundreds of tumors that are used to develop an array have typically been evaluated (graded) by different pathologists. The region of interest
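
    A sketch of block-level texture parameters analogous to B, P and H described above. The chromatin/stroma intensity thresholds and the use of Shannon entropy of the gray-level histogram as the heterogeneity index are assumptions made for illustration; they are not the study's exact definitions.

```python
# Hedged sketch: per-block texture features loosely analogous to B, P and H above.
import numpy as np

def block_texture(block, chromatin_thr=80, stroma_thr=180, n_bins=32):
    """block: 2-D uint8 grayscale image block. Returns (B, P, H) estimates."""
    n = block.size
    b = np.count_nonzero(block < chromatin_thr) / n * 100.0   # % chromatin-like (dark) pixels
    p = np.count_nonzero(block > stroma_thr) / n * 100.0      # % stroma-like (bright) pixels
    hist, _ = np.histogram(block, bins=n_bins, range=(0, 256))
    prob = hist / n
    prob = prob[prob > 0]
    h = float(-(prob * np.log2(prob)).sum())                  # heterogeneity stand-in (entropy)
    return b, p, h

# Illustrative random block standing in for one tile of a histology slide
block = np.random.default_rng(2).integers(0, 256, size=(128, 128), dtype=np.uint8)
print(block_texture(block))
```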

  15. Comparison between manual and automated analysis for the quantification of carotid wall by using sonography. A validation study with CT

    International Nuclear Information System (INIS)

    Purpose: The purpose of this paper was to compare manual and automated analysis for the quantification of the carotid wall obtained with sonography, using computed tomography as the validation technique. Material and methods: 21 consecutive patients underwent MDCTA and ultrasound analysis of the carotid arteries (mean age 68 years; age range 59–81 years). The intima–media thickness (IMT) of the 42 carotids was measured with novel, dedicated automated software (called AtheroEdge™, Biomedical Technologies, Denver, CO, USA) and by four observers who manually calculated the IMT. The carotid artery wall thickness (CAWT) was also quantified in the CT datasets. Bland–Altman statistics were employed to measure the agreement between methods. A Student's t-test was used to test the differences between the IMT values of AtheroEdge™ and those of the human experts. The study obtained IRB approval. Results: The correlations between the automated AtheroEdge™ measurements and those of the four human experts were equal to 95.5%, 73.5%, 88.9%, and 81.7%. The IMT coefficient of variation of the human experts was equal to 11.9%. By using a Student's t-test, the differences between the IMT values of AtheroEdge™ and those of the human experts were not found to be statistically significant (p value = 0.02). On comparing AtheroEdge™ (using ultrasound) with CAWT (using CT), the results suggested a very good concordance of 84.96%. Conclusions: Data from this preliminary study indicate that the automated software AtheroEdge™ can analyze the IMT of carotid arteries with precision and that the concordance with CT is optimal.
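
    A small sketch of the Bland–Altman agreement statistics used above: the bias (mean difference) between automated and manual IMT values and the 95% limits of agreement. The sample measurements are invented for illustration.

```python
# Hedged sketch: Bland-Altman agreement between automated and manual IMT values.
import numpy as np

def bland_altman(auto_vals, manual_vals):
    auto_vals, manual_vals = np.asarray(auto_vals), np.asarray(manual_vals)
    diff = auto_vals - manual_vals
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)               # half-width of the 95% limits of agreement
    return bias, bias - loa, bias + loa

auto_imt   = [0.71, 0.83, 0.95, 0.64, 0.77, 0.88]   # mm, illustrative
manual_imt = [0.69, 0.86, 0.92, 0.66, 0.80, 0.85]   # mm, illustrative
bias, lo, hi = bland_altman(auto_imt, manual_imt)
print(f"bias = {bias:.3f} mm, 95% limits of agreement [{lo:.3f}, {hi:.3f}] mm")
```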

  16. Automated reduction and interpretation of multidimensional mass spectra for analysis of complex peptide mixtures

    Science.gov (United States)

    Gambin, Anna; Dutkowski, Janusz; Karczmarski, Jakub; Kluge, Boguslaw; Kowalczyk, Krzysztof; Ostrowski, Jerzy; Poznanski, Jaroslaw; Tiuryn, Jerzy; Bakun, Magda; Dadlez, Michal

    2007-01-01

    Here we develop a fully automated procedure for the analysis of liquid chromatography-mass spectrometry (LC-MS) datasets collected during the analysis of complex peptide mixtures. We present the underlying algorithm and outcomes of several experiments justifying its applicability. The novelty of our approach is to exploit the multidimensional character of the datasets. It is common knowledge that highly complex peptide mixtures can be analyzed by liquid chromatography coupled with mass spectrometry, but we are not aware of any existing automated MS spectra interpretation procedure designed to take into account the multidimensional character of the data. Our work fills this gap by providing an effective algorithm for this task, allowing for automated conversion of raw data to the list of masses of peptides.

  17. Automated Fetal Heart Rate Analysis in Labor: Decelerations and Overshoots

    International Nuclear Information System (INIS)

    Electronic fetal heart rate (FHR) recording is a standard way of monitoring fetal health in labor. Decelerations and accelerations usually indicate fetal distress and normality, respectively. But one type of acceleration may differ, namely an overshoot that may atypically reflect fetal stress. Here we describe a new method for detecting decelerations, accelerations and overshoots as part of a novel system for computerized FHR analysis (OxSyS). There was poor agreement between clinicians when identifying these FHR features visually, which precluded setting a gold standard of interpretation. We therefore introduced 'modified' Sensitivity (SE°) and 'modified' Positive Predictive Value (PPV°) as appropriate performance measures with which the algorithm was optimized. The relation between overshoots and fetal compromise in labor was studied in 15 cases and 15 controls. Overshoots showed promise as an indicator of fetal compromise. Unlike ordinary accelerations, overshoots cannot be considered to be reassuring features of fetal health.

  18. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  19. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-01

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on very high precision but with a comparatively low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method. PMID:25033319

  20. The length of the glaciers in the world – a straightforward method for the automated calculation of glacier center lines

    OpenAIRE

    Machguth, H.; M. Huss

    2014-01-01

    Glacier length is an important measure of glacier geometry, but global glacier inventories mostly lack length data. Only recently have semi-automated approaches to measuring glacier length been developed and applied regionally. Here we present a first global assessment of glacier length using a fully automated method based on glacier surface slope, distance to the glacier margins and a set of trade-off functions. The method is developed for East Greenlan...

  1. Adverse drug events with hyperkalaemia during inpatient stays: evaluation of an automated method for retrospective detection in hospital databases

    OpenAIRE

    Ficheur, Grégoire; Chazard, Emmanuel; Beuscart, Jean-Baptiste; Merlin, Béatrice; Luyckx, Michel; Beuscart, Régis

    2014-01-01

    Background Adverse drug reactions and adverse drug events (ADEs) are major public health issues. Many different prospective tools for the automated detection of ADEs in hospital databases have been developed and evaluated. The objective of the present study was to evaluate an automated method for the retrospective detection of ADEs with hyperkalaemia during inpatient stays. Methods We used a set of complex detection rules to take account of the patient’s clinical and biological context and th...

  2. Toward designing for trust in database automation

    International Nuclear Information System (INIS)

    The connection between an AH for an automated tool and a list of information elements at the three levels of attributional abstraction is then direct, providing a method for satisfying information requirements for appropriate trust in automation. In this paper, we will present our method for developing specific information requirements for an automated tool, based on a formal analysis of that tool and the models presented by Lee and See. We will show an example of the application of the AH to automation, in the domain of relational database automation, and the resulting set of specific information elements for appropriate trust in the automated tool. Finally, we will comment on the applicability of this approach to the domain of nuclear plant instrumentation. (authors)

  3. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the...

  4. Archaeological field survey automation: concurrent multisensor site mapping and automated analysis

    Science.gov (United States)

    Józefowicz, Mateusz; Sokolov, Oleksandr; Meszyński, Sebastian; Siemińska, Dominika; Kołosowski, Przemysław

    2016-04-01

    control the platform from a remote location via satellite, with only a servicing person on the site and the survey team operating from their office, globally. The method is under development. The team contributing to the project also includes: Oleksii Sokolov, Michał Koepke, Krzysztof Rydel, Michał Stypczyński, Maciej Ślęk, Łukasz Zapała, Michał Dąbrowski.

  5. Automation of statistical analysis in the WIPP hazardous waste facility permit for analytical results from characterization

    International Nuclear Information System (INIS)

    One goal of characterizing, processing, and shipping waste to the Waste Isolation Pilot Plant (WIPP) is to make all activities as efficient as possible. Data management and repetitive calculations are a critical part of the process that can be automated, thereby increasing the accuracy and rate at which work is completed and reducing costs. This paper presents the tools developed to automate statistical analysis and other calculations required by the WIPP Hazardous Waste Facility Permit (HWFP). Statistical analyses are performed on the analytical results from gas samples taken from the headspace of waste containers and from solid samples taken from the core of the waste container. The calculations include determining the number of samples, a test of the shape of the distribution of the analytical results, the mean, the standard deviation, the upper 90-percent confidence limit of the mean, and the minimum required Waste Acceptance Plan (WAP) sample size. The input data for these calculations come from the batch data reports for headspace gas analytical results and solids analysis, which must also be obtained and collated for proper use. The most challenging component of the statistical analysis, if performed manually, is the determination of the distribution shape; therefore, the distribution testing is typically performed using a certified software tool. All other calculations can be completed manually, with a spreadsheet, custom-developed software, and/or a certified software tool. Of the options available, manually performing the calculations or using a spreadsheet are the least desirable. These methods rely heavily on the availability of an expert, such as a statistician, to perform the calculation. These methods are also more open to human error, such as transcription or 'cut and paste' errors. A SAS program is in the process of being developed to perform the calculations. Due to the potential size of the data input files and the need to archive the data in an accessible format, the SAS
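
    A sketch of the repetitive calculations listed above: sample count, a crude distribution-shape (normality) check, mean, standard deviation and the one-sided upper 90% confidence limit of the mean. The Shapiro-Wilk test is a stand-in for the permit's prescribed distribution test, and the concentration values are invented.

```python
# Hedged sketch of the WAP-style summary statistics for one set of analytical results.
import numpy as np
from scipy import stats

def summary_with_ucl90(results):
    x = np.asarray(results, dtype=float)
    n = x.size
    mean = x.mean()
    sd = x.std(ddof=1)
    shapiro_p = float(stats.shapiro(x).pvalue)      # distribution-shape check (stand-in)
    t90 = stats.t.ppf(0.90, df=n - 1)               # one-sided 90% t quantile
    ucl90 = mean + t90 * sd / np.sqrt(n)
    return {"n": n, "mean": mean, "std": sd, "shapiro_p": shapiro_p, "ucl90": ucl90}

# Headspace-gas concentrations for one waste container (ppmv, invented values)
print(summary_with_ucl90([12.1, 9.8, 15.4, 11.2, 13.7, 10.9]))
```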

  6. Automated analysis of small animal PET studies through deformable registration to an atlas

    Energy Technology Data Exchange (ETDEWEB)

    Gutierrez, Daniel F. [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Zaidi, Habib [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); University of Groningen, Department of Nuclear Medicine and Molecular Imaging, University Medical Center Groningen, Groningen (Netherlands)

    2012-11-15

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is
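
    A compact sketch of the two registration-accuracy metrics named above, the Dice coefficient and the Hausdorff distance, computed between a registered segmentation and the corresponding atlas label. The toy 2-D masks merely stand in for the organ labels.

```python
# Hedged sketch: Dice coefficient and Hausdorff distance between two binary masks.
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def hausdorff(mask_a, mask_b):
    pts_a = np.argwhere(mask_a)                 # coordinates of the foreground pixels/voxels
    pts_b = np.argwhere(mask_b)
    return max(directed_hausdorff(pts_a, pts_b)[0],
               directed_hausdorff(pts_b, pts_a)[0])

# Toy 2-D example standing in for a registered organ label vs. the atlas label
ref = np.zeros((64, 64), dtype=bool); ref[20:40, 20:40] = True
reg = np.zeros((64, 64), dtype=bool); reg[22:42, 18:38] = True
print(f"Dice = {dice(ref, reg):.3f}, Hausdorff = {hausdorff(ref, reg):.1f} px")
```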

  7. Automated analysis of small animal PET studies through deformable registration to an atlas

    International Nuclear Information System (INIS)

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is

  8. Comparison of automated pre-column and post-column analysis of amino acid oligomers

    Science.gov (United States)

    Chow, J.; Orenberg, J. B.; Nugent, K. D.

    1987-01-01

    It has been shown that various amino acids will polymerize under plausible prebiotic conditions on mineral surfaces, such as clays and soluble salts, to form varying amounts of oligomers (n = 2-6). The investigations of these surface reactions required a quantitative method for the separation and detection of these amino acid oligomers at the picomole level in the presence of nanomole levels of the parent amino acid. In initial high-performance liquid chromatography (HPLC) studies using a classical postcolumn o-phthalaldehyde (OPA) derivatization ion-exchange HPLC procedure with fluorescence detection, problems encountered included lengthy analysis time, inadequate separation and large relative differences in sensitivity for the separated species, expressed as a variable fluorescent yield, which contributed to poor quantitation. We have compared a simple, automated, pre-column OPA derivatization and reversed-phase HPLC method with the classical post-column OPA derivatization and ion-exchange HPLC procedure. A comparison of UV and fluorescent detection of the amino acid oligomers is also presented. The conclusion reached is that the pre-column OPA derivatization, reversed-phase HPLC and UV detection produces enhanced separation, improved sensitivity and faster analysis than post-column OPA derivatization, ion-exchange HPLC and fluorescence detection.

  9. Miniaturized Mass-Spectrometry-Based Analysis System for Fully Automated Examination of Conditioned Cell Culture Media

    NARCIS (Netherlands)

    Weber, E.; Pinkse, M.W.H.; Bener-Aksam, E.; Vellekoop, M.J.; Verhaert, P.D.E.M.

    2012-01-01

    We present a fully automated setup for performing in-line mass spectrometry (MS) analysis of conditioned media in cell cultures, in particular focusing on the peptides therein. The goal is to assess peptides secreted by cells in different culture conditions. The developed system is compatible with M

  10. Scanning probe image wizard: A toolbox for automated scanning probe microscopy data analysis

    Science.gov (United States)

    Stirling, Julian; Woolley, Richard A. J.; Moriarty, Philip

    2013-11-01

    We describe SPIW (scanning probe image wizard), a new image processing toolbox for SPM (scanning probe microscope) images. SPIW can be used to automate many aspects of SPM data analysis, even for images with surface contamination and step edges present. Specialised routines are available for images with atomic or molecular resolution to improve image visualisation and generate statistical data on surface structure.

  11. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    ... costs incurred and amounts collected. Data on costs and corresponding recovery rates for debts of... efforts are likely to exceed recoveries, and assist in evaluating offers in compromise. (b) Consider the...

  12. Development of a novel and automated fluorescent immunoassay for the analysis of beta-lactam antibiotics

    NARCIS (Netherlands)

    Benito-Pena, E.; Moreno-Bondi, M.C.; Orellana, G.; Maquieira, K.; Amerongen, van A.

    2005-01-01

    An automated immunosensor for the rapid and sensitive analysis of penicillin-type β-lactam antibiotics has been developed and optimized. An immunogen was prepared by coupling the common structure of the penicillanic β-lactam antibiotics, i.e., 6-aminopenicillanic acid, to keyhole limpet hemocyanin. Pol

  13. The iFly Tracking System for an Automated Locomotor and Behavioural Analysis of Drosophila melanogaster

    Science.gov (United States)

    Kohlhoff, Kai J.; Jahn, Thomas R.; Lomas, David A.; Dobson, Christopher M.; Crowther, Damian C.; Vendruscolo, Michele

    2016-01-01

    The use of animal models in medical research provides insights into molecular and cellular mechanisms of human disease, and helps identify and test novel therapeutic strategies. Drosophila melanogaster – the common fruit fly – is one of the most established model organisms, as its study can be performed more readily and with far less expense than for other model animal systems, such as mice, fish, or indeed primates. In the case of fruit flies, standard assays are based on the analysis of longevity and basic locomotor functions. Here we present the iFly tracking system, which makes it possible to increase the amount of quantitative information that can be extracted from these studies, and to significantly reduce their duration and costs. The iFly system uses a single camera to simultaneously track the trajectories of up to 20 individual flies with about 100 μm spatial and 33 ms temporal resolution. The statistical analysis of fly movements recorded with such accuracy makes it possible to perform a rapid and fully automated quantitative analysis of locomotor changes in response to a range of different stimuli. We anticipate that the iFly method will reduce very considerably the costs and the duration of the testing of genetic and pharmacological interventions in Drosophila models, including an earlier detection of behavioural changes and a large increase in throughput compared to current longevity and locomotor assays. PMID:21698336

  14. Automated data acquisition and analysis system for inventory verification

    International Nuclear Information System (INIS)

    A real-time system is proposed which would allow the CLO Safeguards Branch to conduct a meaningful inventory verification using a variety of NDA instruments. The overall system would include the NDA instruments, automated data handling equipment, and a vehicle to house and transport the instruments and equipment. For the purpose of the preliminary cost estimate, a specific data handling system and vehicle were required. A Tracor Northern TN-11 data handling system including a PDP-11 minicomputer and a measurement vehicle similar to the Commission's Regulatory Region I van were used. The basic system is currently estimated to cost about $100,000, and future add-ons which would expand the system's capabilities are estimated to cost about $40,000. The concept of using a vehicle in order to permanently rack-mount the data handling equipment offers a number of benefits, such as control of the equipment environment and allowance for improvements, expansion, and flexibility in the system. Justification is also presented for local design and assembly of the overall system. A summary of the demonstration system which illustrates the advantages and feasibility of the overall system is included in this discussion. Two ideas are discussed which are not considered to be viable alternatives to the proposed system: addition of the data handling capabilities to the semiportable ''cart'' and use of a telephone link to a large computer center

  15. Automated Dsm Extraction from Uav Images and Performance Analysis

    Science.gov (United States)

    Rhee, S.; Kim, T.

    2015-08-01

    As technology evolves, unmanned aerial vehicle (UAV) imagery is being used for applications ranging from simple image acquisition to complicated ones such as 3D spatial information extraction. Spatial information is usually provided in the form of a DSM or point cloud. It is important to generate very dense tie points automatically from stereo images. In this paper, we applied a stereo image-based matching technique developed for satellite/aerial images to UAV images, propose processing steps for automated DSM generation, and analyse the feasibility of DSM generation. For DSM generation from UAV images, firstly, exterior orientation parameters (EOPs) for each dataset were adjusted. Secondly, optimum matching pairs were determined. Thirdly, stereo image matching was performed for each pair. The developed matching algorithm is based on grey-level correlation of pixels applied along epipolar lines. Finally, the extracted match results were merged into one result and the final DSM was produced. The generated DSM was compared with a reference DSM from lidar. Overall accuracy was 1.5 m in NMAD. However, several problems have to be solved in the future, including obtaining precise EOPs and handling occlusion and image blurring problems. A more effective interpolation technique also needs to be developed in the future.
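
    A simplified sketch of the matching step described above: normalised grey-level correlation of a template window along an epipolar line, here assumed to be a horizontal scanline of rectified images. The window size, disparity range and toy data are assumptions, not the authors' matcher.

```python
# Hedged sketch: normalised cross-correlation along a (rectified) epipolar line.
import numpy as np

def ncc(a, b):
    a = a - a.mean(); b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else -1.0

def match_along_epipolar(left, right, row, col, win=5, max_disp=40):
    """Find the column in `right` on the same row that best matches a window in `left`."""
    half = win // 2
    tpl = left[row - half:row + half + 1, col - half:col + half + 1]
    best_score, best_col = -1.0, None
    for d in range(max_disp):
        c = col - d
        if c - half < 0:
            break
        cand = right[row - half:row + half + 1, c - half:c + half + 1]
        score = ncc(tpl, cand)
        if score > best_score:
            best_score, best_col = score, c
    return best_col, best_score

# Toy rectified pair with a known 7-pixel horizontal shift
left = np.random.default_rng(3).random((100, 120))
right = np.roll(left, -7, axis=1)
print(match_along_epipolar(left, right, row=50, col=60))   # expect column 53
```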

  16. An Analysis of Intelligent Automation Demands in Taiwanese Firms

    Directory of Open Access Journals (Sweden)

    Ying-Mei Tai

    2016-03-01

    Full Text Available To accurately elucidate the production deployment, process intelligent automation (IA), and production bottlenecks of Taiwanese companies, as well as the IA application status, this research conducted a structured questionnaire survey of the participants in the IA promotion activities arranged by the Industrial Development Bureau, Ministry of Economic Affairs. A total of 35 valid questionnaires were recovered. Research findings indicated that the majority of participants were large-scale enterprises. These enterprises anticipated adding production bases in Taiwan and China to transform and upgrade their operations or strengthen their influence in the domestic market. The degrees of process IA and of production bottlenecks were relatively low, which was associated with a tendency toward small-volume, highly diversified production. The majority of sub-categories of hardware equipment and simulation technologies have reached maturity, and the effective application of these technologies can enhance production efficiency. Intelligent software technologies remain immature and need further development and application. More importantly, they can meet customer values and create new business models, thereby supporting sustainable development.

  17. Automated analysis of sleep-wake state in rats.

    Science.gov (United States)

    Stephenson, Richard; Caron, Aimee M; Cassel, Daniel B; Kostela, Jennifer C

    2009-11-15

    A fully automated computer-based sleep scoring system is described and validated for use in rats. The system was designed to emulate visual sleep scoring by using the same basic features of the electroencephalogram (EEG) and electromyogram (EMG), and a similar set of decision-making rules. State indices are calculated for each 5 s epoch by combining the amplitudes (µV rms) of 6 filtered EEG frequency bands (EEGlo, d.c.–1.5 Hz; delta, 1.5–6 Hz; theta, 6–9 Hz; alpha, 10.5–15 Hz; beta, 22–30 Hz; gamma, 35–45 Hz; ΣEEG = delta + theta + alpha + beta + gamma) and the EMG (10–100 Hz), yielding dimensionless ratios: WAKE-index = (EMG × gamma)/theta; NREM-index = (delta × alpha)/gamma²; REM-index = theta³/(delta × alpha × EMG); artifact-index = [(2 × EEGlo) + beta] × (gamma/ΣEEG). The index values are re-scaled and normalized, thereby dispensing with the need for animal-specific threshold values. The system was validated by direct comparison with visually scored data in 9 rats. Overall, the computer and visual scores were 96% concordant, which is similar to inter-rater agreement in visual scoring. The false-positive error rate was low, and the system performed reliably in studies lasting 5 weeks. The system was implemented and further validated in a study of sleep architecture in 7 rats under a 12:12 h LD cycle. PMID:19703489
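
    The per-epoch indices quoted above translate directly into a few lines of code. The sketch below implements only the raw ratios; the re-scaling, normalization and decision rules of the full system are omitted, and the band amplitudes are illustrative values.

```python
# Sketch of the per-epoch state indices defined above, from filtered band amplitudes (µV rms).
def state_indices(eeg_lo, delta, theta, alpha, beta, gamma, emg):
    sigma_eeg = delta + theta + alpha + beta + gamma
    wake     = (emg * gamma) / theta
    nrem     = (delta * alpha) / gamma ** 2
    rem      = theta ** 3 / (delta * alpha * emg)
    artifact = (2.0 * eeg_lo + beta) * (gamma / sigma_eeg)
    return {"WAKE": wake, "NREM": nrem, "REM": rem, "ARTIFACT": artifact}

# One 5-s epoch with illustrative band amplitudes
print(state_indices(eeg_lo=12.0, delta=55.0, theta=40.0, alpha=18.0,
                    beta=9.0, gamma=6.0, emg=14.0))
```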

  18. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    OpenAIRE

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg; Morling, Niels

    2009-01-01

     We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicates in a flat bottom 96-well microtiter plate (BD Falcon) manually by means of calibrated pipettes. Each pipette of the liquid handler (1 up to 8) dispensed a selected volume (1 to 200µl) of Orange G 8 times into the wells of the microti...

  19. Analysis of natural waters with an automated inductively-coupled plasma spectrometer system

    International Nuclear Information System (INIS)

    A commercial ICP spectrometer system has been automated to provide unattended operation and data collection following initializing commands and loading of the sample changer. Automation is provided by a microcomputer which permits interconnection of a sample changer, a card reader, a high-speed printer terminal, a dual floppy disk drive, and the spectrometer's basic computer. Application of the system to the analysis of natural water samples is described. Accuracy and precision data, both for short and long periods, as determined with standard and reference water samples, are presented. Analytical data presentation formats can be altered with the system. Some aspects of data handling and manipulation external to the system are outlined

  20. Development of an automated processing method to detect still timing of cardiac motion for coronary magnetic resonance angiography

    Science.gov (United States)

    Asou, Hiroya; Ichikawa, Katsuhiro; Imada, Naoyuki; Masuda, Takanori; Satou, Tomoyasu

    2011-03-01

    Whole-heart coronary magnetic resonance angiography (WH-MRA) is a useful noninvasive examination. Its signal acquisition is performed during a very short still period in each cardiac motion cycle, and therefore selection of the appropriate still timing is important for obtaining better image quality. However, since the currently available selection method is a manual one based on visual comparison of cine MRI images at different phases, the selected timings are often incorrect and their reproducibility is not sufficient. We developed an automated selection method to detect the best still timing for WH-MRA and compared the automated method with the conventional manual one. Cine MRI images were used for the analysis. In order to extract high-speed cardiac motion, the phase-direction pixel series at each pixel position in the cine images was processed by high-pass filtering using the Fourier transform. After this process, the cine images at low-speed timings became dark, and the optimal timing could be determined by threshold processing. We acquired ten volunteers' WH-MRA with the manually and automatically selected timings, and visually assessed the image quality of each image on a 5-point scale (1=excellent, 2=very good, 3=good, 4=fair, 5=poor). The mean scores of the manual and automatic methods for the right coronary arteries (RCA), left anterior descending arteries (LAD) and left circumflex arteries (LCX) were 4.2+/-0.38, 4.1+/-0.44, 3.9+/-0.52 and 4.1+/-0.42, 4.1+/-0.24, 3.2+/-0.35, respectively. The scores were improved by our method for the RCA and LCX, and the improvement for the LCX was significant. The proposed automated method can detect the still cardiac phase more accurately than, or equally to, the conventional manual method.
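
    A sketch of the phase-direction high-pass filtering described above: each pixel's time course across cardiac phases is filtered with the FFT so that phases with little motion appear dark, and the darkest frame is taken as the still phase. The cutoff, the "darkest frame" selection and the synthetic data are assumptions for illustration, not the authors' processing chain.

```python
# Hedged sketch: FFT high-pass filtering along the cardiac-phase direction of a cine series.
import numpy as np

def still_phase_from_cine(cine, cutoff_bins=2):
    """cine: array of shape (n_phases, rows, cols). Returns index of the stillest phase."""
    n = cine.shape[0]
    spec = np.fft.fft(cine.astype(float), axis=0)        # FFT along the phase direction
    freqs = np.fft.fftfreq(n)                             # signed temporal frequencies
    spec[np.abs(freqs) < cutoff_bins / n] = 0             # suppress slow (low-speed) components
    motion = np.abs(np.fft.ifft(spec, axis=0))            # remaining high-frequency content
    energy = motion.reshape(n, -1).mean(axis=1)           # mean brightness per filtered frame
    return int(np.argmin(energy))                         # darkest frame = least cardiac motion

# Minimal call on synthetic data (random values only demonstrate the interface)
cine = np.random.default_rng(4).normal(size=(30, 64, 64))
print(still_phase_from_cine(cine))
```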

  1. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N;

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors...... that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit...... either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480...

  2. An automated method for cell-free layer width determination in small arterioles

    International Nuclear Information System (INIS)

    Histogram-based thresholding techniques utilized for cell-free layer width measurement in arteriolar flow may produce an overestimation of the layer width since they do not consider faint shaded regions near the vessel wall as part of the erythrocyte column. To address this problem, we developed a new method for detecting the boundary of the erythrocyte column based on an edge detection algorithm. This automated method (grayscale method) provides local detections of the inner vessel wall as well as the boundary between the cell-free layer and the erythrocyte column without binarization of grayscale images. The cell-free layer width measurements using the grayscale method and existing techniques (minimum method and Otsu's method) were compared with those determined manually in arteriolar flows of the rat cremaster muscle. In the absence of the shaded regions, values obtained by the grayscale method and minimum method were statistically in good agreement with the manual method but not in the case of Otsu's method. When the faint shaded regions were present, the grayscale method appeared to produce more accurate results than the minimum method and Otsu's method. (note)

  3. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    Directory of Open Access Journals (Sweden)

    Dr.Poonam Radadiya

    2015-01-01

    Full Text Available The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found to be abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results and to make a presumptive diagnosis. In this article we have taken 100 samples for a comparative study between RBC histograms obtained by an automated haematology analyzer and peripheral blood smears. This article discusses some morphological features of dimorphism and the ensuing characteristic changes in RBC histograms.

  4. An automated analysis of wide area motion imagery for moving subject detection

    Science.gov (United States)

    Tahmoush, Dave

    2015-05-01

    Automated analysis of wide area motion imagery (WAMI) can significantly reduce the effort required for converting data into reliable decisions. We register consecutive WAMI frames and use false-color frame comparisons to enhance the visual detection of possible subjects in the imagery. The large number of WAMI detections creates the need to prioritize detections for further inspection. We create a priority queue of detections for automated revisit with smaller field-of-view assets, based on the locations of the movers as well as the probability of each detection. This automated queue works within an operator's preset prioritizations but also allows the flexibility to respond dynamically to new events as well as to incorporate additional information into the surveillance tasking.
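
    A minimal sketch of the prioritized revisit queue described above, ordered first by an operator-set priority and then by detection probability. The Detection fields and example values are illustrative, not the system's actual data model.

```python
# Hedged sketch: a priority queue of WAMI mover detections for tasking narrow-field sensors.
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Detection:
    sort_key: tuple = field(init=False, repr=False)
    operator_priority: int    # 0 = highest operator-set priority
    probability: float        # detector confidence, 0..1
    lat: float
    lon: float

    def __post_init__(self):
        # Higher probability is served first within the same operator priority
        self.sort_key = (self.operator_priority, -self.probability)

queue = []
heapq.heappush(queue, Detection(1, 0.62, 34.051, -117.182))
heapq.heappush(queue, Detection(0, 0.40, 34.060, -117.200))   # new high-priority event
heapq.heappush(queue, Detection(1, 0.91, 34.048, -117.190))

next_revisit = heapq.heappop(queue)   # the high-priority event is tasked first
print(next_revisit)
```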

  5. Trend Analysis on the Automation of the Notebook PC Production Process

    Directory of Open Access Journals (Sweden)

    Chin-Ching Yeh

    2012-09-01

    Full Text Available Notebook PCs are among the Taiwanese electronic products that generate the highest production value and market share. According to ITRI IEK statistics, the domestic notebook PC production value in 2011 was about NT$2.3 trillion. Of the roughly 200 million notebook PCs sold in global markets in 2011, Taiwan's notebook PC output accounted for more than 90%, meaning that nine out of every ten notebook PCs in the world are manufactured by Taiwanese companies. For such a large industry in terms of output value and quantity, the degree of automation in production processes is not high. This means either that there is still great room for automating the notebook PC production process, or that the degree of automation of notebook PC production cannot easily be enhanced further. This paper presents an analysis of the situation.

  6. A Novel Method for the Separation of Overlapping Pollen Species for Automated Detection and Classification.

    Science.gov (United States)

    Tello-Mijares, Santiago; Flores, Francisco

    2016-01-01

    The identification of pollen in an automated way will accelerate different tasks and applications of palynology, aiding in, among others, climate change studies, medical allergy calendars, and forensic science. The aim of this paper is to develop a system that automatically captures a hundred microscopic images of pollen and classifies them into the 12 different species from the Lagunera Region, Mexico. The pollen grains often overlap in the microscopic images, which increases the difficulty of automated identification and classification. This paper focuses on a method to segment the overlapping pollen. First, the proposed method segments the overlapping pollen. Second, the method separates the pollen based on the mean shift process (100% segmentation) and erosion by H-minima based on the Fibonacci series. The pollen is then characterized by its shape, color, and texture for training and evaluating the performance of three classification techniques: random forest, multilayer perceptron, and Bayes net. Using the newly developed system, we obtained segmentation results of 100% and classification results above 96.2% recall and 96.1% precision using the multilayer perceptron in twofold cross-validation. PMID:27034710

  7. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    OpenAIRE

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N.; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J.; Morling, Niels

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (Pr...

  8. Automating case reports for the analysis of digital evidence

    OpenAIRE

    Cassidy, Regis H. Friend

    2005-01-01

    The reporting process during computer analysis is critical in the practice of digital forensics. Case reports are used to review the process and results of an investigation and serve multiple purposes. The investigator may refer to these reports to monitor the progress of his analysis throughout the investigation. When acting as an expert witness, the investigator will refer to organized documentation to recall past analysis. A lot of time can elapse between the analysis and the actual testim...

  9. A microchip electrophoresis-mass spectrometric platform with double cell lysis nano-electrodes for automated single cell analysis.

    Science.gov (United States)

    Li, Xiangtang; Zhao, Shulin; Hu, Hankun; Liu, Yi-Ming

    2016-06-17

    Capillary electrophoresis-based single cell analysis has become an essential approach in research at the cellular level. However, automation of single cell analysis has been a challenge due to the difficulty of controlling the number of cells injected and the irreproducibility associated with cell aggregation. Herein we report the development of a new microfluidic platform deploying the double nano-electrode cell lysis technique for automated analysis of single cells with mass spectrometric detection. The proposed microfluidic chip features the integration of a cell-sized high-voltage zone for quick single cell lysis, a microfluidic channel for electrophoretic separation, and a nanoelectrospray emitter for ionization in MS detection. Built upon this platform, a microchip electrophoresis-mass spectrometric method (MCE-MS) has been developed for automated single cell analysis. In the method, cell introduction, cell lysis, and MCE-MS separation are computer controlled and integrated as a cycle into consecutive assays. Analysis of large numbers of individual PC-12 neuronal cells (both intact and exposed to 25 mM KCl) was carried out to determine intracellular levels of dopamine (DA) and glutamic acid (Glu). It was found that DA content in PC-12 cells was higher than Glu content, and both varied from cell to cell. The ratio of intracellular DA to Glu was 4.20±0.8 (n=150). Interestingly, the ratio drastically decreased to 0.38±0.20 (n=150) after the cells were exposed to 25 mM KCl for 8 min, suggesting that the cells released DA promptly and heavily while they released Glu at a much slower pace in response to KCl-induced depolarization. These results indicate that the proposed MCE-MS analytical platform may have great potential in research at the cellular level. PMID:27207575

  10. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.; Michalek, W.; Jahoor, A.; Wenzel, G.; Mohler, V.

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33-b...

  11. Analysis of the main factors determining the degree of automation in NPPs

    International Nuclear Information System (INIS)

    The ''man-machine'' interaction plays an important part in developing such complex installations as Nuclear Power Plants (NPPs). Special attention was focused on this problem after the accidents at the Three Mile Island and Chernobyl NPPs. In spite of the great efforts and resources that have been devoted to this area, formalized methods and approaches for achieving the optimal balance between automation and human actions are still absent. The decisions made depend, to a great extent, on the designers' skill. Such a situation is explained by the integrated nature of the problems and the need to allow for a great number of factors, both quite apparent and indirectly influential. The paper describes the main factors determining the degree of automation in NPPs: specific features of the particular NPP; safety; personnel dose rates; the general situation in the NPP automation area; economic performance; and social consequences. (author). 2 refs, 6 figs

  12. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    International Nuclear Information System (INIS)

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues
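
    A sketch of the kind of radial density profile described above: the normalized spatial density of detected bright features as a function of distance from the nuclear perimeter to its center. Normalizing with a Euclidean distance transform of the nuclear mask is an assumption made for illustration, not the published pipeline.

```python
# Hedged sketch: radial density of bright nuclear features vs. distance from the perimeter.
import numpy as np
from scipy import ndimage

def radial_feature_density(nucleus_mask, feature_coords, n_bins=10):
    """nucleus_mask: 2-D bool; feature_coords: (N, 2) row/col of detected bright features."""
    dist_in = ndimage.distance_transform_edt(nucleus_mask)       # distance to the perimeter
    norm = dist_in / dist_in.max()                               # 0 at edge, 1 at centre
    rows, cols = np.asarray(feature_coords, dtype=int).T
    feature_pos = norm[rows, cols]
    # Density = feature counts per bin divided by the nuclear area available in that bin
    feat_hist, edges = np.histogram(feature_pos, bins=n_bins, range=(0, 1))
    area_hist, _ = np.histogram(norm[nucleus_mask], bins=n_bins, range=(0, 1))
    with np.errstate(divide="ignore", invalid="ignore"):
        density = np.where(area_hist > 0, feat_hist / area_hist, 0.0)
    return edges, density

# Toy circular nucleus with a few bright-feature positions (illustrative only)
yy, xx = np.mgrid[:101, :101]
mask = (yy - 50) ** 2 + (xx - 50) ** 2 <= 40 ** 2
features = [(50, 50), (50, 70), (30, 50), (62, 62)]
print(radial_feature_density(mask, features)[1])
```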

  13. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.

  14. A Novel Vision Localization Method of Automated Micro-Polishing Robot

    Institute of Scientific and Technical Information of China (English)

    Zhao-jun Yang; Fei Chen; Ji Zhao; Xiao-jie Wu

    2009-01-01

    Based on photogrammetry technology, a novel localization method for a micro-polishing robot, which is restricted to a certain working space, is presented in this paper. On the basis of the pinhole camera model, a new mathematical model of vision localization for the automated polishing robot is established. The vision localization is based on the distance constraints of feature points. The method to solve the mathematical model is discussed. According to the characteristics of the gray image, an adaptive method of automatic threshold selection based on connected components is presented. The center coordinate of the feature image point is resolved by a bilinear-interpolation gray-square-weighted algorithm. Finally, the mathematical model of the testing system is verified by a global localization test. The experimental results show that the vision localization system has high precision within the working space.
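
    As a rough illustration of the gray-square-weighted centroid idea mentioned above (a sketch only, not the paper's actual algorithm, which also involves bilinear interpolation), the sub-pixel center of a feature spot in a small image patch can be estimated by weighting each pixel coordinate with the square of its gray value:

```python
import numpy as np

def gray_square_weighted_centroid(patch):
    """Sub-pixel centroid of a bright feature: each pixel weighted by the
    square of its gray value (patch: small 2D array around the spot)."""
    w = patch.astype(float) ** 2
    ys, xs = np.mgrid[0:patch.shape[0], 0:patch.shape[1]]
    return (ys * w).sum() / w.sum(), (xs * w).sum() / w.sum()
```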

  15. An automated segmentation method for three-dimensional carotid ultrasound images

    Science.gov (United States)

    Zahalka, Abir; Fenster, Aaron

    2001-04-01

    We have developed an automated segmentation method for three-dimensional vascular ultrasound images. The method consists of two steps: an automated initial contour identification, followed by application of a geometrically deformable model (GDM). The formation of the initial contours requires the input of a single seed point by the user, and was shown to be insensitive to the placement of the seed within a structure. The GDM minimizes contour energy, providing a smoothed final result. It requires only three simple parameters, all with easily selectable values. The algorithm is fast, performing segmentation on a 336×352×200 volume in 25 s when running on a 100 MHz 9500 Power Macintosh prototype. The segmentation algorithm was tested on stenosed vessel phantoms with known geometry, and the segmentation of the cross-sectional areas was found to be within 3% of the true area. The algorithm was also applied to two sets of patient carotid images, one acquired with a mechanical scanner and the other with a freehand scanning system, with good results on both.

  16. Measurement of the total antioxidant potential in chronic obstructive pulmonary diseases with a novel automated method

    International Nuclear Information System (INIS)

    To determine the oxidative and antioxidative status of plasma in patients with chronic obstructive pulmonary disease (COPD) and to compare these values with healthy smokers and healthy non-smoker control subjects using a recently developed automated measurement method. This study involved 40 COPD patients, 25 healthy smokers, and 25 healthy non-smokers who attended the Chest Disease Outpatient Clinic in Harran University Research Hospital, Turkey, between March 2006 and June 2006. We calculated the total antioxidant potential (TAOP) to determine the antioxidative status of plasma and measured the total peroxide levels to determine its oxidative status. The TAOP of plasma was significantly lower in patients with COPD than in healthy smokers and healthy non-smokers (p<0.001). In contrast, the mean total peroxide level of plasma was significantly higher in COPD patients than in healthy smokers and healthy non-smokers (p<0.001). We found a decrease in TAOP in COPD patients using a simple, rapid and reliable automated colorimetric assay, which may be suitable for use in the routine clinical biochemistry laboratory and considerably facilitates the assessment of this useful clinical parameter. We suggest that this novel method may be used as a routine test to evaluate and follow up the levels of oxidative stress in COPD. (author)

  17. A rapid and automated low resolution NMR method to analyze oil quality in intact oilseeds

    International Nuclear Information System (INIS)

    Oilseeds with modified fatty acid profiles have been the genetic alternative for high-quality vegetable oil for food and biodiesel applications. They can provide stable, functional oils for the food industry without the hydrogenation process that produces trans-fatty acids, which have been linked to cardiovascular disease. High-yield and high-quality oilseeds are also necessary for the success of biodiesel programs, as polyunsaturated or saturated fatty acid oils produce biofuel with undesirable properties. In this paper, a rapid and automated low-resolution NMR method to select intact oilseeds with a modified fatty acid profile is introduced, based on the 1H transverse relaxation time (T2). The T2-weighted NMR signal, obtained by a CPMG pulse sequence and processed by chemometric methods, was able to determine the oil quality in intact seeds by its fatty acid composition, cetane number, iodine value and kinematic viscosity with a correlation coefficient r > 0.9. The automated system has the potential to analyze more than 1000 samples per hour and is a powerful tool to speed up the selection of high-quality oilseeds for food and biodiesel applications.
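
    A hedged sketch of the chemometric step: regressing a quality parameter (for instance, iodine value) on the CPMG echo-decay curves with partial least squares. The file names, array layout and choice of five latent variables are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# decays: CPMG echo-decay curves, one row per intact seed (n_seeds x n_echoes)
# iodine: reference iodine values from wet chemistry (n_seeds,) -- assumed data files
decays = np.load("cpmg_decays.npy")
iodine = np.load("iodine_values.npy")

pls = PLSRegression(n_components=5)                 # latent variables chosen by cross-validation
predicted = cross_val_predict(pls, decays, iodine, cv=10).ravel()
r = np.corrcoef(iodine, predicted)[0, 1]
print(f"cross-validated correlation r = {r:.2f}")
```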

  18. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    the possibility to do population studies on large samples of stars and such population studies demand a consistent analysis. By consistent analysis we understand an analysis that can be performed without the need to make any subjective choices on e.g. mode identification and an analysis where the uncertainties......, radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity...

  19. Automated analysis of background EEG and reactivity during therapeutic hypothermia in comatose patients after cardiac arrest

    OpenAIRE

    Noirhomme, Quentin; Lehembre, Rémy; Lugo, Zulay; Lesenfants, Damien; Luxen, André; Laureys, Steven; Oddo, Mauro; Rossetti, Andrea

    2014-01-01

    Visual analysis of electroencephalography (EEG) background and reactivity during therapeutic hypothermia provides important outcome information, but is time-consuming and not always consistent between reviewers. Automated EEG analysis may help quantify the brain damage. Forty-six comatose patients in therapeutic hypothermia, after cardiac arrest, were included in the study. EEG background was quantified with burst-suppression ratio (BSR) and approximate entropy, both used to monitor anesthesi...
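
    For orientation, the burst-suppression ratio mentioned above is essentially the fraction of the recording spent in suppression. A minimal sketch follows; the amplitude threshold and the minimum suppression duration are assumptions for illustration, not the study's settings.

```python
import numpy as np

def burst_suppression_ratio(eeg_uv, fs, amp_threshold_uv=5.0, min_suppression_s=0.5):
    """Fraction of an EEG segment classified as suppression: amplitude below
    amp_threshold_uv for at least min_suppression_s seconds."""
    suppressed = np.abs(eeg_uv) < amp_threshold_uv
    min_len = int(min_suppression_s * fs)
    bsr_mask = np.zeros_like(suppressed)
    start = None
    for i, flag in enumerate(np.append(suppressed, False)):  # sentinel closes the last run
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_len:
                bsr_mask[start:i] = True                      # keep only sufficiently long runs
            start = None
    return bsr_mask.mean()
```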

  20. AVR Microcontroller-based automated technique for analysis of DC motors

    Science.gov (United States)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using AVR series ATMega32 microcontroller. This card can be interfaced to PC and calculates parameters like motor losses, efficiency and plot characteristics for dc motors. Presently, there are different tests and methods available to evaluate motor parameters, but a single and universal user-friendly automated set-up has been discussed in this paper. It has been accomplished by designing a data acquisition and SCR bridge firing hardware based on AVR ATMega32 microcontroller. This hardware has the capability to drive the phase-controlled rectifiers and acquire real-time values of current, voltage, temperature and speed of motor. Various analyses feasible with the designed hardware are of immense importance for dc motor manufacturers and quality-sensitive users. Authors, through this paper aim to provide details of this AVR-based hardware which can be used for dc motor parameter analysis and also for motor control applications.
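
    The loss and efficiency figures such a card computes follow directly from the acquired electrical and mechanical quantities. A minimal sketch under the assumption that torque and speed measurements are available (function and parameter names are illustrative):

```python
import math

def dc_motor_efficiency(voltage_v, current_a, torque_nm, speed_rpm):
    """Efficiency and total losses from electrical input and mechanical output power."""
    p_in = voltage_v * current_a                          # electrical input power (W)
    p_out = torque_nm * speed_rpm * 2.0 * math.pi / 60.0  # mechanical output power (W)
    return p_out / p_in, p_in - p_out                     # (efficiency, losses in W)

eff, losses = dc_motor_efficiency(220.0, 10.0, 12.0, 1500.0)
print(f"efficiency = {eff:.2%}, losses = {losses:.0f} W")
```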

  1. Experimental saltwater intrusion in coastal aquifers using automated image analysis: Applications to homogeneous aquifers

    Science.gov (United States)

    Robinson, G.; Ahmed, Ashraf A.; Hamill, G. A.

    2016-07-01

    This paper presents the applications of a novel methodology to quantify saltwater intrusion parameters in laboratory-scale experiments. The methodology uses an automated image analysis procedure, minimising manual inputs and the subsequent systematic errors that can be introduced. This allowed the quantification of the width of the mixing zone which is difficult to measure in experimental methods that are based on visual observations. Glass beads of different grain sizes were tested for both steady-state and transient conditions. The transient results showed good correlation between experimental and numerical intrusion rates. The experimental intrusion rates revealed that the saltwater wedge reached a steady state condition sooner while receding than advancing. The hydrodynamics of the experimental mixing zone exhibited similar traits; a greater increase in the width of the mixing zone was observed in the receding saltwater wedge, which indicates faster fluid velocities and higher dispersion. The angle of intrusion analysis revealed the formation of a volume of diluted saltwater at the toe position when the saltwater wedge is prompted to recede. In addition, results of different physical repeats of the experiment produced an average coefficient of variation less than 0.18 of the measured toe length and width of the mixing zone.
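
    As a rough sketch of how such quantities can be extracted from a normalized concentration image (assuming saltwater enters from the left, the bottom row of the array represents the aquifer base, and the contour levels are illustrative choices, not the paper's definitions):

```python
import numpy as np

def toe_and_mixing_width(conc, x_mm):
    """Toe length: furthest inland extent of the 50% concentration contour along
    the aquifer base; mixing width: distance between the 10% and 90% contours."""
    base = conc[-1, :]                                   # bottom row = aquifer base
    toe = x_mm[np.nonzero(base >= 0.5)[0].max()]
    x10 = x_mm[np.argmin(np.abs(base - 0.1))]
    x90 = x_mm[np.argmin(np.abs(base - 0.9))]
    return toe, abs(x10 - x90)
```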

  2. OpenCyto: an open source infrastructure for scalable, robust, reproducible, and automated, end-to-end flow cytometry data analysis.

    OpenAIRE

    Greg Finak; Jacob Frelinger; Wenxin Jiang; Newell, Evan W.; John Ramey; Davis, Mark M.; Kalams, Spyros A.; De Rosa, Stephen C.; Raphael Gottardo

    2014-01-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis fra...

  3. OpenCyto: An Open Source Infrastructure for Scalable, Robust, Reproducible, and Automated, End-to-End Flow Cytometry Data Analysis

    OpenAIRE

    Greg Finak; Jacob Frelinger; Wenxin Jiang; Newell, Evan W; John Ramey; Davis, Mark M.; Kalams, Spyros A.; De Rosa, Stephen C.; Raphael Gottardo

    2014-01-01

    Flow cytometry is used increasingly in clinical research for cancer, immunology and vaccines. Technological advances in cytometry instrumentation are increasing the size and dimensionality of data sets, posing a challenge for traditional data management and analysis. Automated analysis methods, despite a general consensus of their importance to the future of the field, have been slow to gain widespread adoption. Here we present OpenCyto, a new BioConductor infrastructure and data analysis fra...

  4. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.; Friis, Carsten

    2003-01-01

    , statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user....

  5. Quality assurance of automated gamma-ray spectrometric analysis

    International Nuclear Information System (INIS)

    Fully automatic gamma-ray spectrometric analysis procedures perform complete processing of the spectrum without intervention of the operator. In order to maintain the reliability of the final results, the analysis checks the intermediate results automatically. When a disagreement is identified by such a check, the uncertainty of the intermediate results is increased in order to accommodate the disagreement. The increased uncertainty is propagated into the uncertainty of the final results in order to take the disagreement into account. This approach was implemented in Canberra's Genie ESP gamma-ray spectrometry package for examining the results of the peak analysis. In addition to this intermediate check, a posteriori checks of the final results can also be performed by statistical analysis. Such analysis shows whether the results are under statistical control and can discover sources of variability which are not taken into account in the uncertainty budget.

  6. Modelling application for cognitive reliability and error analysis method

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2013-10-01

    Full Text Available The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators are heightened, because it requires problem-solving and decision-making ability. Thus, the human operator is the core of a cognitive process that leads to decisions, influencing the safety of the whole system as a function of their reliability. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  7. A simple viability analysis for unicellular cyanobacteria using a new autofluorescence assay, automated microscopy, and ImageJ

    Directory of Open Access Journals (Sweden)

    Schulze Katja

    2011-11-01

    Full Text Available Abstract Background Currently established methods to identify viable and non-viable cells of cyanobacteria are either time-consuming (e.g. plating) or preparation-intensive (e.g. fluorescent staining). In this paper we present a new and fast viability assay for unicellular cyanobacteria, which uses red chlorophyll fluorescence and an unspecific green autofluorescence for the differentiation of viable and non-viable cells without the need for sample preparation. Results The viability assay for unicellular cyanobacteria using red and green autofluorescence was established and validated for the model organism Synechocystis sp. PCC 6803. Both autofluorescence signals could be observed simultaneously, allowing a direct classification of viable and non-viable cells. The results were confirmed by plating/colony counts, absorption spectra and chlorophyll measurements. The use of an automated fluorescence microscope and a novel ImageJ-based image analysis plugin allows a semi-automated analysis. Conclusions The new method simplifies the process of viability analysis and allows a quick and accurate analysis. Furthermore, the results indicate that a combination of the new assay with absorption spectra or chlorophyll concentration measurements allows the estimation of the vitality of cells.
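
    A minimal sketch of the classification idea (not the published ImageJ plugin): a cell is called viable when its mean red chlorophyll autofluorescence dominates the unspecific green autofluorescence inside the cell mask. Channel names and the simple comparison rule are assumptions for illustration.

```python
import numpy as np

def classify_cell(red_img, green_img, cell_mask):
    """Viable if mean red (chlorophyll) fluorescence exceeds mean green
    autofluorescence inside the boolean cell mask."""
    red = red_img[cell_mask].mean()
    green = green_img[cell_mask].mean()
    return "viable" if red > green else "non-viable"
```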

  8. Automated segmentation of muscle and adipose tissue on CT images for human body composition analysis

    Science.gov (United States)

    Chung, Howard; Cobzas, Dana; Birdsell, Laura; Lieffers, Jessica; Baracos, Vickie

    2009-02-01

    The ability to compute body composition in cancer patients lends itself to determining the specific clinical outcomes associated with fat and lean tissue stores. For example, a wasting syndrome of advanced disease associates with shortened survival. Moreover, certain tissue compartments represent sites for drug distribution and are likely determinants of chemotherapy efficacy and toxicity. CT images are abundant, but these cannot be fully exploited unless there exist practical and fast approaches for tissue quantification. Here we propose a fully automated method for segmenting muscle, visceral and subcutaneous adipose tissues, taking the approach of shape modeling for the analysis of skeletal muscle. Muscle shape is represented using PCA-encoded Free Form Deformations with respect to a mean shape. The shape model is learned from manually segmented images and used in conjunction with a tissue appearance prior. Visceral adipose tissue (VAT) and subcutaneous adipose tissue (SAT) are segmented based on the final deformed muscle shape. In comparing the automatic and manual methods, coefficients of variation (COV) of 1-2% were similar to or smaller than inter- and intra-observer COVs reported for manual segmentation.

  9. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Full Text Available Proteomics for biomarker validation needs high-throughput instrumentation to analyze large sets of clinical samples quantitatively and reproducibly in a minimum time without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate with rapid processing time. The present study focuses on establishing a high-throughput automated online system for proteolytic digestion and desalting of proteins from biological samples quantitatively and qualitatively in a reproducible manner. The present study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and is validated on a real-time sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using IDEALQ software. The present study shows that the online system, capable of handling high-throughput samples in 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increase in concentration, with much better linearity compared to the off-line method. Hence we suggest that inclusion of this online system in the proteomic pipeline will be effective in the quantification of proteins in comparative proteomics, where quantification is crucial.

  10. Fully Automated Sample Preparation for Ultrafast N-Glycosylation Analysis of Antibody Therapeutics.

    Science.gov (United States)

    Szigeti, Marton; Lew, Clarence; Roby, Keith; Guttman, Andras

    2016-04-01

    There is a growing demand in the biopharmaceutical industry for high-throughput, large-scale N-glycosylation profiling of therapeutic antibodies in all phases of product development, but especially during clone selection when hundreds of samples should be analyzed in a short period of time to assure their glycosylation-based biological activity. Our group has recently developed a magnetic bead-based protocol for N-glycosylation analysis of glycoproteins to alleviate the hard-to-automate centrifugation and vacuum-centrifugation steps of the currently used protocols. Glycan release, fluorophore labeling, and cleanup were all optimized, resulting in a process with excellent yield and good repeatability. This article demonstrates the next level of this work by automating all steps of the optimized magnetic bead-based protocol from endoglycosidase digestion, through fluorophore labeling and cleanup with high-throughput sample processing in 96-well plate format, using an automated laboratory workstation. Capillary electrophoresis analysis of the fluorophore-labeled glycans was also optimized for rapid processing to keep pace with the automated sample preparation workflow. Ultrafast N-glycosylation analyses of several commercially relevant antibody therapeutics are also shown and compared to their biosimilar counterparts, addressing the biological significance of the differences. PMID:26429557

  11. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  12. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.;

    2003-01-01

    GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with aspecification of the data. The server performs normalization......, statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user....

  13. Automating methods to improve precision in Monte-Carlo event generation for particle colliders

    Energy Technology Data Exchange (ETDEWEB)

    Gleisberg, Tanju

    2008-07-01

    The subject of this thesis was the development of tools for the automated calculation of exact matrix elements, which are a key to the systematic improvement of precision and confidence in theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson with massless gauge bosons, required for a number of channels for the Higgs boson search at the LHC, and anomalous gauge couplings, parameterizing a number of models beyond the SM. Furthermore, a special treatment to deal with complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented, the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte-Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for the event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from those new developments, improving precision and efficiency. Part II addressed the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree level matrix elements, 2. a number of dipole subtraction terms to remove
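
    For context, the Catani-Seymour dipole subtraction scheme referred to above organizes an NLO cross section so that both integrals are separately finite; schematically (standard textbook form, not a formula specific to this thesis):

```latex
\sigma^{\mathrm{NLO}} \;=\; \int_{m+1} \left[ \mathrm{d}\sigma^{R} - \mathrm{d}\sigma^{A} \right]
\;+\; \int_{m} \left[ \mathrm{d}\sigma^{V} + \int_{1} \mathrm{d}\sigma^{A} \right]
```

    Here dσ^R is the real correction, dσ^V the virtual correction, and dσ^A the sum of dipole subtraction terms that reproduces the soft and collinear singularities of dσ^R pointwise.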

  14. Automating methods to improve precision in Monte-Carlo event generation for particle colliders

    International Nuclear Information System (INIS)

    The subject of this thesis was the development of tools for the automated calculation of exact matrix elements, which are a key to the systematic improvement of precision and confidence in theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson with massless gauge bosons, required for a number of channels for the Higgs boson search at the LHC, and anomalous gauge couplings, parameterizing a number of models beyond the SM. Furthermore, a special treatment to deal with complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented, the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte-Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for the event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from those new developments, improving precision and efficiency. Part II addressed the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree level matrix elements, 2. a number of dipole subtraction terms to remove

  15. Analysis of the thoracic aorta using a semi-automated post processing tool

    Energy Technology Data Exchange (ETDEWEB)

    Entezari, Pegah, E-mail: p-entezari@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Kino, Aya, E-mail: ayakino@gmail.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Honarmand, Amir R., E-mail: arhonarmand@yahoo.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Galizia, Mauricio S., E-mail: maugalizia@yahoo.com.br [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yang, Yan, E-mail: yyang@vitalimages.com [Vital images Inc, Minnetonka, MN (United States); Collins, Jeremy, E-mail: collins@fsm.northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yaghmai, Vahid, E-mail: vyaghmai@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Carr, James C., E-mail: jcarr@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States)

    2013-09-15

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of right pulmonary artery, proximal aortic arch (PROX) immediately proximal to innominate artery, distal aortic arch (DIST) immediately distal to left subclavian artery, and descending aorta (DESC) at the level of diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to intraclass correlation coefficient (ICC) and Bland–Altman plot. The number of cases with manual contouring or center line adjustment for the semi-automated method and also the post-processing time for each method were recorded. Results: The mean difference between semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of annulus (60%), while the contour was adjusted at all levels with highest and lowest number of corrections at the levels of annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is
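
    The agreement statistics quoted above can be reproduced with a few lines. A minimal sketch of the Bland-Altman bias and 95% limits of agreement between two measurement methods (array names are illustrative, not from the study's data):

```python
import numpy as np

def bland_altman(manual_mm, semi_auto_mm):
    """Mean difference (bias) and 95% limits of agreement between two methods."""
    diff = np.asarray(semi_auto_mm, float) - np.asarray(manual_mm, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width
```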

  16. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of right pulmonary artery, proximal aortic arch (PROX) immediately proximal to innominate artery, distal aortic arch (DIST) immediately distal to left subclavian artery, and descending aorta (DESC) at the level of diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to intraclass correlation coefficient (ICC) and Bland–Altman plot. The number of cases with manual contouring or center line adjustment for the semi-automated method and also the post-processing time for each method were recorded. Results: The mean difference between semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of annulus (60%), while the contour was adjusted at all levels with highest and lowest number of corrections at the levels of annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  17. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  18. Infrascope: Full-Spectrum Phonocardiography with Automated Signal Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Using digital signal analysis tools, we will generate a repeatable output from the infrascope and compare it to the output of a traditional electrocardiogram, and...

  19. An automated data processing method dedicated to 3D ultrasonic non destructive testing of composite pieces

    International Nuclear Information System (INIS)

    State-of-the-art non-destructive testing using ultrasound is based on evaluation of C-scan images, which is done mainly visually. The development of the new Sampling Phased Array technique (SPA) by IZFP Fraunhofer provides a fast three-dimensional reconstruction of inner object structures. This new inspection technique is to be complemented with fully or semi-automated evaluation of ultrasonic data, providing maximum support to the operator. In this contribution we present a processing method for SPA ultrasonic data, where the main focus of this paper is on speckle noise reduction. The evaluation method is applied to carbon fibre composite, where it demonstrates robust and successful performance in the recognition of defects.

  20. Automated method and system for the alignment and correlation of images from two different modalities

    Science.gov (United States)

    Giger, Maryellen L.; Chen, Chin-Tu; Armato, Samuel; Doi, Kunio

    1999-10-26

    A method and system for the computerized registration of radionuclide images with radiographic images, including generating image data from radiographic and radionuclide images of the thorax. Techniques include contouring the lung regions in each type of chest image, scaling and registration of the contours based on location of lung apices, and superimposition after appropriate shifting of the images. Specific applications are given for the automated registration of radionuclide lung scans with chest radiographs. The method in the example given yields a system that spatially registers and correlates digitized chest radiographs with V/Q scans in order to correlate V/Q functional information with the greater structural detail of chest radiographs. Final output could be the computer-determined contours from each type of image superimposed on any of the original images, or superimposition of the radionuclide image data, which contains high activity, onto the radiographic chest image.
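
    A minimal sketch of the apex-based shifting step described above (a simplification, not the patented system's implementation): find the lung apex in each modality's binary lung mask and shift the radionuclide image by the difference. Function and parameter names are illustrative.

```python
import numpy as np
from scipy import ndimage

def shift_to_align_apices(radiograph_lungs, radionuclide_lungs, radionuclide_img):
    """Rigidly shift the radionuclide image so the lung apices of the two
    binary lung masks coincide (row/column translation only)."""
    def apex(mask):
        rows, cols = np.nonzero(mask)
        top = rows.min()
        return np.array([top, cols[rows == top].mean()])
    dy, dx = apex(radiograph_lungs) - apex(radionuclide_lungs)
    return ndimage.shift(radionuclide_img.astype(float), (dy, dx), order=1)
```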

  1. Automated ion-selective electrode method for determining fluoride in natural waters

    Science.gov (United States)

    Erdmann, D.E.

    1975-01-01

    An automated fluoride method which uses AutoAnalyzer modules in conjunction with a fluoride ion-selective electrode was evaluated. The results obtained on 38 natural water samples are in excellent agreement with those determined by a similar manual method (average difference = 0.026 mg/l). An average fluoride concentration of 0.496 mg/l was found when several natural water samples were spiked with 0.50 mg/l fluoride. Aluminum is the only significant interfering substance, and it can be easily tolerated if its concentration does not exceed 2 mg/l. Thirty samples were analyzed per hour over a concentration range of 0-2 mg/l.

  2. Photogrammetry-Based Automated Measurements for Tooth Shape and Occlusion Analysis

    Science.gov (United States)

    Knyaz, V. A.; Gaboutchian, A. V.

    2016-06-01

    Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which provide wider prospects in comparison to measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurement has always been regarded as time-consuming research, even more so with the use of new methods due to their wider opportunities. This is where automation becomes essential for further development and implementation of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that was analysed to suggest recommendations for tooth preparation - one of the most critical clinical procedures in prosthetic dentistry - within a comparatively short period of time. The original photogrammetric 3D reconstruction system allows us to generate 3D models of dental arches, reproduce their closure, or occlusion, and to perform a set of standard measurements in automated mode.

  3. Automative Multi Classifier Framework for Medical Image Analysis

    Directory of Open Access Journals (Sweden)

    R. Edbert Rajan

    2015-04-01

    Full Text Available Medical image processing is the technique used to create images of the human body for medical purposes. Nowadays, medical image processing plays a major role and offers challenging solutions for critical stages in the medical field. Several studies have been done in this area to enhance techniques for medical image processing. However, due to the demerits of some advanced technologies, there are still many aspects that need further development. An existing study evaluated the efficacy of medical image analysis with level-set shape along with fractal texture and intensity features to discriminate PF (posterior fossa) tumor from other tissues in brain images. To advance medical image analysis and disease diagnosis, we devise an automotive subjective optimality model for segmentation of images based on different sets of selected features from the unsupervised learning model of extracted features. After segmentation, classification of the images is done. The classification is processed by adapting the multiple classifier framework of previous work, based on the mutual information coefficient of the features selected for the image segmentation procedures. In this study, to enhance the classification strategy, we plan to implement an enhanced multi-classifier framework for the analysis of medical images and disease diagnosis. The performance parameters used for the analysis of the proposed enhanced multi-classifier framework for medical image analysis are multiple-class intensity, image quality and time consumption.

  4. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm x 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  5. Analysis of the automated systems of planning of spatial constructions

    Directory of Open Access Journals (Sweden)

    М.С. Барабаш

    2004-04-01

    Full Text Available The article is devoted to the analysis of existing computer-aided design (SAPR) systems and to the development of new information technologies for design, based on the integration of software packages using a unified information-logical model of the object.

  6. Automated analysis of security requirements through risk-based argumentation

    NARCIS (Netherlands)

    Yu, Yijun; Franqueira, Virginia N.L.; Tun, Thein Tan; Wieringa, Roel J.; Nuseibeh, Bashar

    2015-01-01

    Computer-based systems are increasingly being exposed to evolving security threats, which often reveal new vulnerabilities. A formal analysis of the evolving threats is difficult due to a number of practical considerations such as incomplete knowledge about the design, limited information about atta

  7. Semi-automated recognition of protozoa by image analysis

    OpenAIRE

    A.L. Amaral; Baptiste, C; Pons, M. N.; Nicolau, Ana; Lima, Nelson; Ferreira, E. C.; Mota, M.; H. Vivier

    1999-01-01

    A programme was created to semi-automatically analyse digitised images of protozoa. The Principal Component Analysis technique was used for species identification. After data collection and mathematical treatment, a three-dimensional representation was generated and several protozoa species (Opercularia, Colpidium, Tetrahymena, Prorodon, Glaucoma and Trachelophyllum) could be positively identified.
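
    A minimal sketch of the PCA step (the feature file and descriptor list are hypothetical, not the programme's actual inputs): project morphological descriptors of each digitised organism onto three principal components and inspect the resulting clusters per species.

```python
import numpy as np
from sklearn.decomposition import PCA

# rows: digitised protozoa; columns: hypothetical morphological descriptors
# (length, width, area, perimeter, ...)
features = np.loadtxt("protozoa_features.csv", delimiter=",")
scores = PCA(n_components=3).fit_transform(features)   # 3D representation used for species ID
```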

  8. Automated Frequency Domain Decomposition for Operational Modal Analysis

    DEFF Research Database (Denmark)

    Brincker, Rune; Andersen, Palle; Jacobsen, Niels-Jørgen

    2007-01-01

    The Frequency Domain Decomposition (FDD) technique is known as one of the most user friendly and powerful techniques for operational modal analysis of structures. However, the classical implementation of the technique requires some user interaction. The present paper describes an algorithm for...

  9. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    International Nuclear Information System (INIS)

    scientific data mining is increasingly considered. In plasma simulations, Bagherjeiran et al. presented a comprehensive report on applying graph-based techniques for orbit classification. They used the KAM classifier to label points and components in single and multiple orbits. Love et al. conducted an image space analysis of coherent structures in plasma simulations. They used a number of segmentation and region-growing techniques to isolate regions of interest in orbit plots. Both approaches analyzed particle accelerator data, targeting the system dynamics in terms of particle orbits. However, they did not address particle dynamics as a function of time or inspect the behavior of bunches of particles. Ruebel et al. addressed the visual analysis of massive laser wakefield acceleration (LWFA) simulation data using interactive procedures to query the data. Sophisticated visualization tools were provided to inspect the data manually. Ruebel et al. have integrated these tools into the visualization and analysis system VisIt, in addition to utilizing efficient data management based on HDF5, H5Part, and the index/query tool FastBit. Ruebel et al. proposed automatic beam path analysis using a suite of methods to classify particles in simulation data and to analyze their temporal evolution. To enable researchers to accurately define particle beams, the method computes a set of measures based on the path of particles relative to the distance of the particles to a beam. To achieve good performance, this framework uses an analysis pipeline designed to quickly reduce the amount of data that needs to be considered in the actual path distance computation. As part of this process, region-growing methods are utilized to detect particle bunches at single time steps. Efficient data reduction is essential to enable automated analysis of large data sets as described in the next section, where data reduction methods are steered to the particular requirements of our clustering analysis

  10. Automated Performance Monitoring Data Analysis and Reporting within the Open Source R Environment

    Science.gov (United States)

    Kennel, J.; Tonkin, M. J.; Faught, W.; Lee, A.; Biebesheimer, F.

    2013-12-01

    Environmental scientists encounter quantities of data at a rate that in many cases outpaces our ability to appropriately store, visualize and convey the information. The free software environment, R, provides a framework for efficiently processing, analyzing, depicting and reporting on data from a multitude of formats in the form of traceable and quality-assured data summary reports. Automated data summary reporting leverages document markup languages such as markdown, HTML, or LaTeX using R-scripts capable of completing a variety of simple or sophisticated data processing, analysis and visualization tasks. Automated data summary reports seamlessly integrate analysis into report production with calculation outputs - such as plots, maps and statistics - included alongside report text. Once a site-specific template is set up, including data types, geographic base data and reporting requirements, reports can be (re-)generated trivially as the data evolve. The automated data summary report can be a stand-alone report, or it can be incorporated as an attachment to an interpretive report prepared by a subject-matter expert, thereby providing the technical basis to report on and efficiently evaluate large volumes of data resulting in a concise interpretive report. Hence, the data summary report does not replace the scientist, but relieves them of repetitive data processing tasks, facilitating a greater level of analysis. This is demonstrated using an implementation developed for monthly groundwater data reporting for a multi-constituent contaminated site, highlighting selected analysis techniques that can be easily incorporated in a data summary report.

  11. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on some selected automatic quantitative methods for analysing thermal images. It shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images to be performed. The paper also shows two examples of the use of the proposed image analysis methods for the area of the skin of a human foot and face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.) PMID:26556680

  12. Automated analysis of protein subcellular location in time series images

    OpenAIRE

    Hu, Yanhua; Osuna-Highley, Elvira; Hua, Juchang; Nowicki, Theodore Scott; Stolz, Robert; McKayle, Camille; Murphy, Robert F.

    2010-01-01

    Motivation: Image analysis, machine learning and statistical modeling have become well established for the automatic recognition and comparison of the subcellular locations of proteins in microscope images. By using a comprehensive set of features describing static images, major subcellular patterns can be distinguished with near perfect accuracy. We now extend this work to time series images, which contain both spatial and temporal information. The goal is to use temporal features to improve...

  13. BitTorrent Swarm Analysis through Automation and Enhanced Logging

    OpenAIRE

    Răzvan Deaconescu; Marius Sandu-Popa; Adriana Drăghici; Nicolae Tăpus

    2011-01-01

    Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we...

  14. BitTorrent Swarm Analysis through Automation and Enhanced Logging

    CERN Document Server

    Deaconescu, Răzvan; Drăghici, Adriana; Tăpus, Nicolae

    2011-01-01

    Peer-to-Peer protocols currently form the most heavily used protocol class in the Internet, with BitTorrent, the most popular protocol for content distribution, as its flagship. A high number of studies and investigations have been undertaken to measure, analyse and improve the inner workings of the BitTorrent protocol. Approaches such as tracker message analysis, network probing and packet sniffing have been deployed to understand and enhance BitTorrent's internal behaviour. In this paper we present a novel approach that aims to collect, process and analyse large amounts of local peer information in BitTorrent swarms. We classify the information as periodic status information able to be monitored in real time and as verbose logging information to be used for subsequent analysis. We have designed and implemented a retrieval, storage and presentation infrastructure that enables easy analysis of BitTorrent protocol internals. Our approach can be employed both as a comparison tool, as well as a measurement syste...

  15. Development of a full automation solid phase microextraction method for investigating the partition coefficient of organic pollutant in complex sample.

    Science.gov (United States)

    Jiang, Ruifen; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-08-01

    A fully automated solid phase microextraction (SPME) depletion method was developed to study the partition coefficient of organic compounds between a complex matrix and a water sample. The SPME depletion process was conducted by pre-loading the fiber with a specific amount of organic compounds from a proposed standard gas generation vial, and then desorbing the fiber into the targeted samples. Based on the proposed method, the partition coefficients (Kmatrix) of 4 polyaromatic hydrocarbons (PAHs) between humic acid (HA)/hydroxypropyl-β-cyclodextrin (β-HPCD) and aqueous sample were determined. The results showed that the logKmatrix of the 4 PAHs with HA and β-HPCD ranged from 3.19 to 4.08, and 2.45 to 3.15, respectively. In addition, the logKmatrix values decreased by about 0.12-0.27 log units for different PAHs for every 10°C increase in temperature. The effect of temperature on the partition coefficient followed the van't Hoff relation, and the partition coefficient at any temperature can be predicted from the corresponding plot. Furthermore, the proposed method was applied to real biological fluid analysis. The partition coefficients of 6 PAHs between the complex matrices in fetal bovine serum and water were determined and compared to those obtained from the SPME extraction method. The results demonstrated that the proposed method can be applied to determine the sorption coefficients of hydrophobic compounds between complex matrices and water in a variety of samples. PMID:26118804
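
    The temperature dependence referred to above follows the van't Hoff relation; in its usual form (standard thermodynamics, not a formula specific to this paper):

```latex
\ln K_{\mathrm{matrix}} \;=\; -\frac{\Delta H^{\circ}}{R\,T} + \frac{\Delta S^{\circ}}{R}
```

    A plot of ln Kmatrix against 1/T is therefore linear, and the fitted slope and intercept allow the partition coefficient to be predicted at any temperature within the calibrated range.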

  16. Analysis of zearalenone in cereal and Swine feed samples using an automated flow-through immunosensor.

    Science.gov (United States)

    Urraca, Javier L; Benito-Peña, Elena; Pérez-Conde, Concepción; Moreno-Bondi, María C; Pestka, James J

    2005-05-01

    The development of a sensitive flow-through immunosensor for the analysis of the mycotoxin zearalenone in cereal samples is described. The sensor was completely automated and was based on a direct competitive immunosorbent assay and fluorescence detection. The mycotoxin competes with a horseradish-peroxidase-labeled derivative for the binding sites of a rabbit polyclonal antibody. Controlled pore glass covalently bound to Prot A was used for the oriented immobilization of the antibody-antigen immunocomplexes. The immunosensor shows an IC(50) value of 0.087 ng mL(-1) (RSD = 2.8%, n = 6) and a dynamic range from 0.019 to 0.422 ng mL(-1). The limit of detection (90% of blank signal) of 0.007 ng mL(-1) (RSD = 3.9%, n = 3) is lower than previously published methods. Corn, wheat, and swine feed samples have been analyzed with the device after extraction of the analyte using accelerated solvent extraction (ASE). The immunosensor has been validated using a corn certified reference material and HPLC with fluorescence detection. PMID:15853369
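
    A hedged sketch of how the competitive calibration curve and IC50 behind such figures are typically obtained, using a four-parameter logistic fit. The standard concentrations and signal values below are made-up illustrative numbers, not the paper's data.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(c, top, bottom, ic50, slope):
    """Four-parameter logistic curve for a competitive immunoassay."""
    return bottom + (top - bottom) / (1.0 + (c / ic50) ** slope)

conc = np.array([0.01, 0.03, 0.10, 0.30, 1.00])      # standards, ng/mL (illustrative)
signal = np.array([0.95, 0.85, 0.45, 0.15, 0.05])    # normalized fluorescence (illustrative)
popt, _ = curve_fit(four_pl, conc, signal, p0=[1.0, 0.0, 0.1, 1.0])
print(f"fitted IC50 = {popt[2]:.3f} ng/mL")
```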

  17. Application of automated mass spectrometry deconvolution and identification software for pesticide analysis in surface waters.

    Science.gov (United States)

    Furtula, Vesna; Derksen, George; Colodey, Alan

    2006-01-01

    A new approach to surface water analysis has been investigated in order to enhance the detection of different organic contaminants in Nathan Creek, British Columbia. Water samples from Nathan Creek were prepared by liquid/liquid extraction using dichloromethane (DCM) as an extraction solvent and analyzed by gas chromatography mass spectrometry method in scan mode (GC-MS scan). To increase sensitivity for pesticides detection, acquired scan data were further analyzed by Automated Mass Spectrometry Deconvolution and Identification Software (AMDIS) incorporated into the Agilent Deconvolution Reporting Software (DRS), which also includes mass spectral libraries for 567 pesticides. Extracts were reanalyzed by gas chromatography mass spectrometry single ion monitoring (GC-MS-SIM) to confirm and quantitate detected pesticides. Pesticides: atrazine, dimethoate, diazinone, metalaxyl, myclobutanil, napropamide, oxadiazon, propazine and simazine were detected at three sampling sites on the mainstream of the Nathan Creek. Results of the study are further discussed in terms of detectivity and identification level for each pesticide found. The proposed approach of monitoring pesticides in surface waters enables their detection and identification at trace levels. PMID:17090491

  18. Is interactional dissynchrony a clue to deception? Insights from automated analysis of nonverbal visual cues.

    Science.gov (United States)

    Yu, Xiang; Zhang, Shaoting; Yan, Zhennan; Yang, Fei; Huang, Junzhou; Dunbar, Norah E; Jensen, Matthew L; Burgoon, Judee K; Metaxas, Dimitris N

    2015-03-01

    Detecting deception in interpersonal dialog is challenging since deceivers take advantage of the give-and-take of interaction to adapt to any sign of skepticism in an interlocutor's verbal and nonverbal feedback. Human detection accuracy is poor, often with no better than chance performance. In this investigation, we consider whether automated methods can produce better results and if emphasizing the possible disruption in interactional synchrony can signal whether an interactant is truthful or deceptive. We propose a data-driven and unobtrusive framework using visual cues that consists of face tracking, head movement detection, facial expression recognition, and interactional synchrony estimation. Analyses were conducted on 242 video samples from an experiment in which deceivers and truth-tellers interacted with professional interviewers either face-to-face or through computer mediation. Results revealed that the framework is able to automatically track head movements and expressions of both interlocutors to extract normalized meaningful synchrony features and to learn classification models for deception recognition. Further experiments show that these features reliably capture interactional synchrony and efficiently discriminate deception from truth. PMID:24988600
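
    One simple way to turn the synchrony idea above into a feature is the peak of a windowed, lag-limited cross-correlation between the two interactants' movement signals. This is a minimal sketch only; the window length and maximum lag are assumptions, not the framework's actual settings.

```python
import numpy as np

def windowed_synchrony(sig_a, sig_b, fs, win_s=5.0, max_lag_s=1.0):
    """Peak normalized cross-correlation between two interactants' head-movement
    signals, computed per non-overlapping window (one synchrony feature each)."""
    win, max_lag = int(win_s * fs), int(max_lag_s * fs)
    features = []
    for start in range(0, len(sig_a) - win + 1, win):
        a = sig_a[start:start + win] - np.mean(sig_a[start:start + win])
        b = sig_b[start:start + win] - np.mean(sig_b[start:start + win])
        denom = np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)) or 1.0
        corrs = [np.dot(a[max(0, -lag):win - max(0, lag)],
                        b[max(0, lag):win - max(0, -lag)]) / denom
                 for lag in range(-max_lag, max_lag + 1)]
        features.append(max(corrs))
    return np.array(features)
```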

  19. Automated condition classification of a reciprocating compressor using time frequency analysis and an artificial neural network

    Science.gov (United States)

    Lin, Yih-Hwang; Wu, Hsien-Chang; Wu, Chung-Yung

    2006-12-01

    The purpose of this study is to develop an automated system for condition classification of a reciprocating compressor. Various time-frequency analysis techniques will be examined for decomposition of the vibration signals. Because a time-frequency distribution is a 3D data map, data reduction is indispensable for subsequent analysis. The extraction of the system characteristics using three indices, namely the time index, frequency index, and amplitude index, will be presented and examined for their applicability. The probability neural network is applied for automated condition classification using a combination of the three indices. The study reveals that a proper choice of the index combination and the time-frequency band can provide excellent classification accuracy for the machinery conditions examined in this work.

  20. Automated counting and analysis of etched tracks in CR-39 plastic

    International Nuclear Information System (INIS)

    An image analysis system has been set up which is capable of automated counting and analysis of etched nuclear particle tracks in plastic. The system is composed of an optical microscope, CCD camera, frame grabber, personal computer, monitor, and printer. The frame grabber acquires and displays images at video rate. It has a spatial resolution of 512 x 512 pixels with 8 bits of digitisation corresponding to 256 grey levels. The software has been developed for general image processing and adapted for the present purpose. Comparisons of automated and visual microscope counting of tracks in chemically etched CR-39 detectors are presented with emphasis on results of interest for practical radon measurements or neutron dosimetry, e.g. calibration factors, background track densities and variations in background. (author)
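
    A minimal sketch of the counting step such a system performs (the grey threshold and minimum track area are illustrative assumptions): binarize the greyscale frame and count connected components above a size cut-off.

```python
import numpy as np
from scipy import ndimage

def count_tracks(frame, grey_threshold=100, min_area_px=5):
    """Count etched tracks as dark connected regions in an 8-bit greyscale frame."""
    binary = frame < grey_threshold                        # tracks appear darker than background
    labels, n = ndimage.label(binary)
    areas = ndimage.sum(binary, labels, index=range(1, n + 1))
    return int((np.asarray(areas) >= min_area_px).sum())   # reject specks below the area cut-off
```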