WorldWideScience

Sample records for automated analysis method

  1. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created to document and communicate the structure and state of the business and information technology elements of an enterprise. Once completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and because of the size and complexity of the models, the process is demanding and omissions or miscalculations are likely. This situation has fostered research into automated analysis methods that support analysts in enterprise analysis processes. Reviewing the literature, we found several analysis methods; however, they target specific situations and different metamodels, so some methods may not be applicable to all enterprise models. This paper presents the compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  2. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Background: In this paper, we present and validate a way to automatically measure the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of second-hand smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results: The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison against manual placement of the leading edge shows complete equivalence of automated vs. manual leading-edge definition for cell migration measurement. Conclusion: Our method is indistinguishable from careful manual determination of cell front lines, with the advantages of full automation, objectivity, and speed.

  3. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena already use standard computer programs: ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure the solution validity of equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  4. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    As fracture mechanics material testing evolves, the governing test standards continue to be refined to better reflect the latest understanding of the physics of the fracture processes involved. The traditional format of ASTM fracture testing standards, utilizing equations expressed directly in the text of the standard to assess the experimental result, is self-limiting in the complexity that can be reasonably captured. The use of automated analysis techniques to draw upon a rich, detailed solution database for assessing fracture mechanics tests provides a foundation for a new approach to testing standards that enables routine users to obtain highly reliable assessments of tests involving complex, non-linear fracture behavior. Herein, the case for automating the analysis of tests of surface cracks in tension in the elastic-plastic regime is utilized as an example of how such a database can be generated and implemented for use in the ASTM standards framework. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  5. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan NSP-9900 specular microscope. All images were analyzed by trained graders using Konan CellChek software, employing the fully-automated and semi-automated methods as well as the Center Method. Images with a low cell count (input cell number <100) and/or guttata were analyzed with both the Center and Flex-Center Methods. ECDs were compared, and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in ECD of normal controls <40 yrs old between the fully-automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and the semi-automated method (p=0.025) as compared to the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size compared to the manual method. We therefore discourage reliance upon the fully-automated method alone for specular microscopy analysis, particularly when an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.

  6. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to the individual steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of polychlorinated biphenyls (PCB) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four-channel Soxhlet extractor, a high-volume concentrator, column clean-up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in the U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed

  7. Learning Methods for Dynamic Topic Modeling in Automated Behavior Analysis.

    Science.gov (United States)

    Isupova, Olga; Kuzin, Danil; Mihaylova, Lyudmila

    2017-09-27

    Semisupervised and unsupervised systems provide operators with invaluable support and can tremendously reduce the operators' load. In light of the need to process large volumes of video data and provide autonomous decisions, this paper proposes new learning algorithms for activity analysis in video. The activities and behaviors are described by a dynamic topic model. Two novel learning algorithms based on the expectation-maximization approach and variational Bayes inference are proposed. Theoretical derivations of the posterior estimates of model parameters are given. The designed learning algorithms are compared with the Gibbs sampling inference scheme introduced earlier in the literature. A detailed comparison of the learning algorithms is presented on real video data. We also propose an anomaly localization procedure, elegantly embedded in the topic modeling framework. It is shown that the developed learning algorithms can achieve a 95% success rate. The proposed framework can be applied to a number of areas, including transportation systems, security, and surveillance.

  8. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  9. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    Directory of Open Access Journals (Sweden)

    Demir Sumeyra U

    2012-12-01

    Background: Imaging of the human microcirculation in real time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated tool that can extract microvasculature information and quantitatively monitor changes in tissue perfusion would be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods: The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. The algorithm has two main parts: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple-level thresholding and pixel verification techniques. Threshold levels are selected using histogram information from a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results: Sublingual microcirculatory videos were recorded from anesthetized swine at baseline and during hemorrhage using a hand-held side-stream dark field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings were analyzed visually, and the functional capillary density (FCD) values calculated by the algorithm were compared for both healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD ...
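
    The pixel-by-pixel differencing step described above lends itself to a compact illustration. The following is a minimal Python sketch, not the authors' implementation; the array layout and threshold are assumptions made for illustration.

        import numpy as np

        def active_vessel_mask(frames: np.ndarray, thresh: float) -> np.ndarray:
            """frames: (n_frames, height, width) stabilized grayscale video."""
            # Mean absolute frame-to-frame intensity change per pixel.
            diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
            activity = diffs.mean(axis=0)
            # Pixels that vary over time are candidate vessels with flow.
            return activity > thresh

        # Functional capillary density (FCD) can then be estimated from the
        # length of the skeletonized active-vessel network per unit image area.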

  10. A method for the automated detection of phishing websites through both site characteristics and image analysis

    Science.gov (United States)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs, and if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values, such as the number of images and links. We also capture a screenshot of the rendered page, compute a hash of the image, and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results demonstrating the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot, and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as the resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
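
    The screenshot comparison can be illustrated with a perceptual "average hash" and Hamming distance. This is a hedged sketch of the general technique, not the authors' code; the hash variant and file names are assumptions.

        from PIL import Image

        def average_hash(path: str, size: int = 8) -> int:
            """Downscale to size x size grayscale, then threshold on the mean."""
            img = Image.open(path).convert("L").resize((size, size))
            pixels = list(img.getdata())
            mean = sum(pixels) / len(pixels)
            bits = 0
            for p in pixels:
                bits = (bits << 1) | (1 if p > mean else 0)
            return bits

        def hamming(a: int, b: int) -> int:
            """Number of differing bits between two hashes."""
            return bin(a ^ b).count("1")

        # Small distances suggest visually similar pages (a possible spoof).
        distance = hamming(average_hash("legit_page.png"),
                           average_hash("suspect_page.png"))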

  11. Automated Nanofiber Diameter Measurement in SEM Images Using a Robust Image Analysis Method

    Directory of Open Access Journals (Sweden)

    Ertan Öznergiz

    2014-01-01

    Due to their high surface area, porosity, and rigidity, applications of nanofibers and nanosurfaces have developed in recent years. Nanofibers and nanosurfaces are typically produced by the electrospinning method. In the production process, determination of the average fiber diameter is crucial for quality assessment. Average fiber diameter is determined by manually measuring the diameters of randomly selected fibers on scanning electron microscopy (SEM) images. However, as the number of images increases, manual fiber diameter determination becomes a tedious and time-consuming task, as well as being sensitive to human error. Therefore, an automated fiber diameter measurement system is desired. In the literature, this task is achieved using image analysis algorithms. Typically, these methods first isolate each fiber in the image and measure the diameter of each isolated fiber. Fiber isolation is an error-prone process. In this study, automated calculation of nanofiber diameter is achieved without fiber isolation using image processing and analysis algorithms. Performance of the proposed method was tested on real data. The effectiveness of the proposed method is shown by comparing automatically and manually measured nanofiber diameter values.
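
    One common way to estimate fiber diameters without isolating individual fibers is to skeletonize a binarized image and read the distance transform along the centerlines. The sketch below illustrates that general idea only; it is not the authors' algorithm, and the pixel size is a hypothetical parameter.

        import numpy as np
        from scipy.ndimage import distance_transform_edt
        from skimage.filters import threshold_otsu
        from skimage.morphology import skeletonize

        def fiber_diameters(image: np.ndarray, pixel_size_nm: float) -> np.ndarray:
            """Estimated diameters (nm) sampled along fiber centerlines."""
            binary = image > threshold_otsu(image)   # fibers as foreground
            dist = distance_transform_edt(binary)    # distance to background
            centerlines = skeletonize(binary)        # 1-pixel-wide skeleton
            # Diameter at a centerline pixel is about twice its distance to the edge.
            return 2.0 * dist[centerlines] * pixel_size_nm

        # Average diameter: fiber_diameters(sem_image, pixel_size_nm=4.5).mean()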

  12. MIMoSA: An Automated Method for Intermodal Segmentation Analysis of Multiple Sclerosis Brain Lesions.

    Science.gov (United States)

    Valcarcel, Alessandra M; Linn, Kristin A; Vandekar, Simon N; Satterthwaite, Theodore D; Muschelli, John; Calabresi, Peter A; Pham, Dzung L; Martin, Melissa Lynne; Shinohara, Russell T

    2018-03-08

    Magnetic resonance imaging (MRI) is crucial for in vivo detection and characterization of white matter lesions (WMLs) in multiple sclerosis. While WMLs have been studied for over two decades using MRI, automated segmentation remains challenging. Although the majority of statistical techniques for the automated segmentation of WMLs are based on single imaging modalities, recent advances have used multimodal techniques for identifying WMLs. Complementary modalities emphasize different tissue properties, which help identify interrelated features of lesions. We propose MIMoSA, a Method for Inter-Modal Segmentation Analysis: a fully automatic lesion segmentation algorithm that uses novel covariance features from intermodal coupling regression, in addition to mean structure, to model the probability that a lesion is contained in each voxel. MIMoSA was validated by comparison with both expert manual and other automated segmentation methods in two datasets. The first included 98 subjects imaged at Johns Hopkins Hospital, in which bootstrap cross-validation was used to compare the performance of MIMoSA against OASIS and LesionTOADS, two popular automatic segmentation approaches. For a secondary validation, publicly available data from a segmentation challenge were used for performance benchmarking. In the Johns Hopkins study, MIMoSA yielded an average Sørensen-Dice coefficient (DSC) of 0.57 and a partial AUC of 0.68 calculated with false positive rates up to 1%. This was superior to the performance of OASIS and LesionTOADS. The proposed method also performed competitively in the segmentation challenge dataset. MIMoSA resulted in statistically significant improvements in lesion segmentation performance compared with LesionTOADS and OASIS, and performed competitively in an additional validation study. Copyright © 2018 by the American Society of Neuroimaging.
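
    The Sørensen-Dice coefficient (DSC) reported above is straightforward to compute; a minimal NumPy version follows, with hypothetical boolean masks as inputs.

        import numpy as np

        def dice(pred: np.ndarray, truth: np.ndarray) -> float:
            """DSC = 2 * |A & B| / (|A| + |B|); 1.0 means perfect overlap."""
            pred, truth = pred.astype(bool), truth.astype(bool)
            denom = pred.sum() + truth.sum()
            return 2.0 * np.logical_and(pred, truth).sum() / denom if denom else 1.0

        # dice(automated_mask, manual_mask)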

  13. Selection of Filtration Methods in the Analysis of Motion of Automated Guided Vehicle

    Directory of Open Access Journals (Sweden)

    Dobrzańska Magdalena

    2016-08-01

    In this article, issues related to mapping the route and correcting errors in automated guided vehicle (AGV) movement are discussed. The nature and size of disruptions were determined using runs registered in experimental studies. On the basis of this analysis, a number of numerical runs were generated that reproduce the runs obtainable from real vehicle movement. The resulting data set was used for further research. The aim of this paper was to test selected digital filtering methods on the same data set and determine their effectiveness. The results of the simulation studies are presented in the article, the effectiveness of the various methods is determined, and conclusions are drawn on this basis.
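
    The abstract does not list the filter set compared, so as a stand-in, the sketch below applies one of the simplest candidate digital filters, a centered moving average, to a noisy AGV position trace; the window length is an assumption.

        import numpy as np

        def moving_average(signal: np.ndarray, window: int = 5) -> np.ndarray:
            """Smooth a 1-D position trace with a centered moving average."""
            kernel = np.ones(window) / window
            return np.convolve(signal, kernel, mode="same")

        # x_smooth = moving_average(noisy_x)
        # The residual noisy_x - x_smooth estimates the disruption being filtered.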

  14. A New Method for Automated Identification and Morphometry of Myelinated Fibers Through Light Microscopy Image Analysis.

    Science.gov (United States)

    Novas, Romulo Bourget; Fazan, Valeria Paula Sassoli; Felipe, Joaquim Cezar

    2016-02-01

    Nerve morphometry is known to produce relevant information for the evaluation of several phenomena, such as nerve repair, regeneration, implantation, transplantation, aging, and different human neuropathies. Manual morphometry is laborious, tedious, time-consuming, and subject to many sources of error. Therefore, in this paper, we propose a new method for the automated morphometry of myelinated fibers in cross-section light microscopy images. Images from the recurrent laryngeal nerve of adult rats and the vestibulocochlear nerve of adult guinea pigs were used herein. The proposed pipeline for fiber segmentation is based on competitive clustering and concavity analysis. The segmentation was evaluated by comparing the automatic segmentation with manual segmentation. To further evaluate the method using morphometric features extracted from the segmented images, the distributions of these features were tested for statistically significant differences. The method achieved a high overall sensitivity and very low false-positive rates per image. We detected no statistically significant difference between the distributions of features extracted from the manual and pipeline segmentations. The method presented good overall performance, showing widespread potential in experimental and clinical settings, allowing large-scale image analysis and thus leading to more reliable results.

  15. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    Science.gov (United States)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (noise, vibration, harshness), and durability analyses are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state, with uniform thickness and no residual stresses or strains. In reality, however, stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider this stamping information in CAE analysis to accurately model the behavior of sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid overdesign. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair HyperWorks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data on a full-car frontal crash analysis.

  16. Automated cloning methods

    International Nuclear Information System (INIS)

    Collart, F.

    2001-01-01

    Argonne has developed a series of automated protocols to generate bacterial expression clones using a robotic system designed for procedures associated with molecular biology. The system provides plate storage, temperature control from 4 to 37°C at various locations, and Biomek and Multimek pipetting stations. The automated system consists of a robot that transports sources from the active station on the automation system. Protocols for the automated generation of bacterial expression clones can be grouped into three categories (Figure 1). Fragment generation protocols are initiated on day one of the expression cloning procedure and encompass the protocols involved in generating purified coding region (PCR) ...

  17. Lacunarity analysis: a promising method for the automated assessment of melanocytic naevi and melanoma.

    Directory of Open Access Journals (Sweden)

    Stephen Gilmore

    The early diagnosis of melanoma is critical to achieving reduced mortality and increased survival. Although clinical examination is currently the method of choice for melanocytic lesion assessment, there is growing interest among clinicians regarding the potential diagnostic utility of computerised image analysis. Recognising that significant shortcomings exist in currently available algorithms, we were motivated to investigate the utility of lacunarity, a simple statistical measure previously used in geology and other fields for the analysis of fractal and multi-scaled images, in the automated assessment of melanocytic naevi and melanoma. Digitised dermoscopic images of 111 benign melanocytic naevi, 99 dysplastic naevi and 102 melanomas were obtained over the period 2003 to 2008 and subjected to lacunarity analysis. We found the lacunarity algorithm could accurately distinguish melanoma from benign melanocytic naevi or non-melanoma without introducing many of the limitations associated with other previously reported diagnostic algorithms. Lacunarity analysis suggests an ordering of irregularity in melanocytic lesions, and we suggest that the clinical application of this ordering may have utility in the naked-eye dermoscopic diagnosis of early melanoma.
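
    Lacunarity itself has a compact standard estimator, the gliding-box algorithm: slide a box of side r over a binary image, record the "mass" (pixel count) at each placement, and take the ratio of the second moment to the squared first moment. A minimal NumPy sketch follows (assumes NumPy 1.20+ for sliding_window_view); it illustrates the general measure, not the authors' exact pipeline.

        import numpy as np
        from numpy.lib.stride_tricks import sliding_window_view

        def lacunarity(mask: np.ndarray, box: int) -> float:
            """Gliding-box lacunarity: E[S^2] / E[S]^2 over all box placements."""
            windows = sliding_window_view(mask.astype(np.float64), (box, box))
            masses = windows.sum(axis=(-2, -1)).ravel()  # pixel count per box
            mean = masses.mean()
            return (masses ** 2).mean() / mean ** 2 if mean else float("nan")

        # A curve over several scales characterizes spatial heterogeneity:
        # curve = [lacunarity(lesion_mask, b) for b in (2, 4, 8, 16)]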

  18. Lacunarity analysis: a promising method for the automated assessment of melanocytic naevi and melanoma.

    Science.gov (United States)

    Gilmore, Stephen; Hofmann-Wellenhof, Rainer; Muir, Jim; Soyer, H Peter

    2009-10-13

    The early diagnosis of melanoma is critical to achieving reduced mortality and increased survival. Although clinical examination is currently the method of choice for melanocytic lesion assessment, there is growing interest among clinicians regarding the potential diagnostic utility of computerised image analysis. Recognising that significant shortcomings exist in currently available algorithms, we were motivated to investigate the utility of lacunarity, a simple statistical measure previously used in geology and other fields for the analysis of fractal and multi-scaled images, in the automated assessment of melanocytic naevi and melanoma. Digitised dermoscopic images of 111 benign melanocytic naevi, 99 dysplastic naevi and 102 melanomas were obtained over the period 2003 to 2008 and subjected to lacunarity analysis. We found the lacunarity algorithm could accurately distinguish melanoma from benign melanocytic naevi or non-melanoma without introducing many of the limitations associated with other previously reported diagnostic algorithms. Lacunarity analysis suggests an ordering of irregularity in melanocytic lesions, and we suggest that the clinical application of this ordering may have utility in the naked-eye dermoscopic diagnosis of early melanoma.

  19. Methods for Automating Analysis of Glacier Morphology for Regional Modelling: Centerlines, Extensions, and Elevation Bands

    Science.gov (United States)

    Viger, R. J.; Van Beusekom, A. E.

    2016-12-01

    The treatment of glaciers in modeling requires information about their shape and extent. This presentation discusses new methods and their application in a new glacier-capable variant of the USGS PRMS model, a physically based, spatially distributed, daily time-step model designed to simulate the runoff and evolution of glaciers through time. In addition to developing parameters describing PRMS land surfaces (hydrologic response units, HRUs), several of the analyses and products are likely of interest to the cryospheric science community in general. The first method is a fully automated variation of logic previously presented in the literature for the definition of the glacier centerline. Given that the surface of a glacier might be convex, using traditional topographic analyses based on a DEM to trace a path down the glacier is not reliable. Instead, a path is derived based on a cost function. Although only a single path is presented in our results, the method can easily be modified to delineate a branched network of centerlines for each glacier. The second method extends the glacier terminus downslope by an arbitrary distance, according to local surface topography. This product can be used to explore possible, if unlikely, scenarios under which glacier area grows. More usefully, it can be used to approximate glacier extents from previous years without needing historical imagery. The final method presents an approach for segmenting the glacier into altitude-based HRUs. Successful integration of this information with traditional approaches for discretizing the non-glacierized portions of a basin requires several additional steps. These include synthesizing the glacier centerline network with one developed through traditional DEM analysis, ensuring that flow can be routed under and beyond glaciers to a basin outlet. Results are presented based on analysis of the Copper River Basin, Alaska.
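
    Deriving a centerline from a cost function can be posed as a least-cost path search over a raster. The sketch below uses scikit-image's route_through_array for the search; the cost surface suggested in the comment (penalizing proximity to the glacier margin) is an illustrative assumption, not the authors' cost function.

        import numpy as np
        from skimage.graph import route_through_array

        def centerline(cost: np.ndarray, head: tuple, terminus: tuple):
            """Trace the cheapest pixel path from glacier head to terminus."""
            path, total_cost = route_through_array(
                cost, head, terminus, fully_connected=True, geometric=True)
            return np.array(path), total_cost

        # Example cost surface: cheap in the glacier interior, expensive near
        # the margin, so the path stays central even on a convex surface:
        # cost = 1.0 / (distance_to_margin + 1e-6)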

  20. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

    Scanning probe microscopy (SPM) techniques rely on computer recordings of interactions between the tip of a minute probe and the surface of the small specimen as a function of position; the measurements are used to depict an image of the atomic-scale surface topography on the computer screen. … Mechanical control, recording, and data processing must therefore be automated to a high level of precision and reliability. These general techniques and the apparatus involved have been described extensively. The automated methods of such high-resolution microscopy coordinated with computerized … electrochemical measurements as well as elemental analysis look very promising for elucidating corrosion reaction mechanisms. The study of initial surface reactions at the atomic or submicron level is becoming an important field of research in the understanding of corrosion processes. At present, mainly two …

  1. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    … to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation …

  2. Semi-automated method to measure pneumonia severity in mice through computed tomography (CT) scan analysis

    Science.gov (United States)

    Johri, Ansh; Schimel, Daniel; Noguchi, Audrey; Hsu, Lewis L.

    2010-03-01

    Imaging is a crucial clinical tool for the diagnosis and assessment of pneumonia, but quantitative methods are lacking. Micro-computed tomography (micro CT), designed for lab animals, provides opportunities for non-invasive radiographic endpoints for pneumonia studies. HYPOTHESIS: In vivo micro CT scans of mice with early bacterial pneumonia can be scored quantitatively by semi-automated imaging methods, with good reproducibility and correlation with the bacterial dose inoculated, pneumonia survival outcome, and radiologists' scores. METHODS: Healthy mice had intratracheal inoculation of E. coli bacteria (n=24) or saline control (n=11). In vivo micro CT scans were performed 24 hours later with a microCAT II (Siemens). Two independent radiologists scored the extent of airspace abnormality on a scale of 0 (normal) to 24 (completely abnormal). Using Amira 5.2 software (Mercury Computer Systems), a histogram distribution of voxel counts within the Hounsfield range of -510 to 0 was created and analyzed, and a segmentation procedure was devised. RESULTS: A t-test was performed to determine whether there was a significant difference in the mean voxel value of each mouse among the three experimental groups: Saline Survivors, Pneumonia Survivors, and Pneumonia Non-survivors. The voxel count method was able to statistically distinguish the Saline Survivors from the Pneumonia Survivors and the Saline Survivors from the Pneumonia Non-survivors, but not the Pneumonia Survivors from the Pneumonia Non-survivors. The segmentation method, however, successfully distinguished the two Pneumonia groups. CONCLUSION: We have pilot-tested an evaluation of early pneumonia in mice using micro CT and a semi-automated method for lung segmentation and scoring. Statistical analysis indicates that the system is reliable and merits further evaluation.
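
    The group comparison at the end of this pipeline is a two-sample t-test on per-mouse summary values. A minimal sketch with SciPy follows; the group arrays are hypothetical, and the HU window is the one stated above.

        import numpy as np
        from scipy.stats import ttest_ind

        def abnormal_voxel_count(ct_volume: np.ndarray) -> int:
            """Count voxels in the -510..0 HU window used for airspace abnormality."""
            return int(np.count_nonzero((ct_volume >= -510) & (ct_volume <= 0)))

        # One count per mouse in each group, then an independent-samples t-test:
        # t, p = ttest_ind(counts_saline_survivors, counts_pneumonia_survivors)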

  3. Methods to quantify the velocity dependence of common gait measurements from automated rodent gait analysis devices.

    Science.gov (United States)

    Neckel, Nathan D

    2015-09-30

    Walking slowly is a different biomechanical task than walking quickly, so gait measures taken at different velocities, such as pre- and post-injury, will differ. It is necessary to determine whether differences in gait measures arise from the experimental changes or simply from traveling at different speeds. Instead of limiting this effect, we have developed techniques to embrace the velocity dependence of gait measures. By translating the pawprints into a body coordinate frame, we are able to measure the location of paw placement in addition to the standard gait measures. At higher velocities, rats show greater consistency of steps, place their forelimb initial contact more medially and anteriorly, and place their hindlimb toe-off more medially and posteriorly. Interlimb phasing also becomes more consistent at higher velocities. Following a cervical spinal cord injury, consistency is reduced and the velocity-dependent behaviors are significantly different. Translating the coordinate frame improves the ability to measure changes in base of support following spinal cord injury. Employing a treadmill, or limiting analysis to a narrow velocity window, does address the effects of velocity; however, we feel that measuring across all velocities is more appropriate than dictating that the animals match speeds. Quantifying locomotion with automated gait analysis devices is a great way to evaluate the changes that experimental treatments produce, and these new methods provide a more appropriate way to address the confound of velocity-dependent gait measures. Copyright © 2015 Elsevier B.V. All rights reserved.
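
    The coordinate-frame translation at the heart of the method is a rigid 2-D transform: subtract the body origin and rotate by the heading. The sketch below is a minimal illustration; how the body origin and heading are obtained from a particular gait device is an assumption left to the data source.

        import numpy as np

        def to_body_frame(paw_xy: np.ndarray, body_xy: np.ndarray,
                          heading_rad: float) -> np.ndarray:
            """Rotate world-frame paw positions into a body-aligned frame."""
            c, s = np.cos(-heading_rad), np.sin(-heading_rad)
            rot = np.array([[c, -s],
                            [s,  c]])
            return (paw_xy - body_xy) @ rot.T

        # Medial/lateral and anterior/posterior placement then read directly off
        # the transformed coordinates, independent of travel direction.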

  4. Volumetric analysis of pelvic hematomas after blunt trauma using semi-automated seeded region growing segmentation: a method validation study.

    Science.gov (United States)

    Dreizin, David; Bodanapally, Uttam K; Neerchal, Nagaraj; Tirada, Nikki; Patlas, Michael; Herskovits, Edward

    2016-11-01

    Manually segmented traumatic pelvic hematoma volumes are strongly predictive of active bleeding at conventional angiography, but the method is time-intensive, limiting its clinical applicability. We compared volumetric analysis using semi-automated region growing segmentation to manual segmentation and diameter-based size estimates in patients with pelvic hematomas after blunt pelvic trauma. A 14-patient cohort was selected in an anonymous randomized fashion from a dataset of patients with pelvic binders at MDCT, collected retrospectively as part of a HIPAA-compliant, IRB-approved study from January 2008 to December 2013. To evaluate intermethod differences, one reader (R1) performed three volume measurements using the manual technique and three volume measurements using the semi-automated technique. To evaluate interobserver differences for semi-automated segmentation, a second reader (R2) performed three semi-automated measurements. One-way analysis of variance was used to compare differences in mean volumes. Time effort was also compared. Correlation between the two methods, as well as with two shorthand appraisals (greatest diameter, and the ABC/2 method for estimating ellipsoid volumes), was assessed with Spearman's rho (r). Intraobserver variability was lower for semi-automated than for manual segmentation, with standard deviations ranging between ±5-32 mL and ±17-84 mL, respectively (p = 0.0003). There was no significant difference in mean volumes between the two readers' semi-automated measurements (p = 0.83); however, means were lower for the semi-automated than for the manual technique (manual: mean and SD 309.6 ± 139 mL; R1 semi-auto: 229.6 ± 88.2 mL, p = 0.004; R2 semi-auto: 243.79 ± 99.7 mL, p = 0.021). Despite the differences in means, the correlation between the two methods was very strong and highly significant (r = 0.91). Semi-automated volumetric analysis of traumatic pelvic hematomas is potentially valuable at the point of care.

  5. UriSed 3 and UX-2000 automated urine sediment analyzers vs manual microscopic method: A comparative performance analysis.

    Science.gov (United States)

    Laiwejpithaya, Sathima; Wongkrajang, Preechaya; Reesukumal, Kanit; Bucha, Chonticha; Meepanya, Suriya; Pattanavin, Chanutchaya; Khejonnit, Varanya; Chuntarut, Achara

    2018-02-01

    Fully automated urine analyzers now play an important role in routine urinalysis in most laboratories. The recently introduced UriSed 3 is a new automated digital-imaging urine sediment analyzer with a phase contrast feature. The aim of this study was to compare the performance of the UriSed 3 and UX-2000 automated urine sediment analyzers with each other and with the results of the manual microscopic method. Two hundred seventy-seven (277) leftover samples of fresh urine from our hospital's central laboratory were evaluated by the two automated urine sediment analyzers, UriSed 3 and UX-2000. The results of urine sediment analysis were compared between the two automated analyzers and against the results of the manual microscopic method. Both devices demonstrated excellent agreement for quantitative measurement of red blood cells and white blood cells. UX-2000 had a lower correlation coefficient and demonstrated slightly lower agreement for squamous epithelial cells. Regarding semiquantitative analysis, both machines demonstrated very good concordance, with all applicable rates within one grade difference of each other. UriSed 3 had higher sensitivity for small round cells, while UX-2000 showed greater sensitivity for detecting bacteria and hyaline casts. UriSed 3 demonstrated slightly better specificity, especially in the detection of hyaline and pathological casts. Both instruments had nearly similar performance for red blood cell and white blood cell measurement. UriSed 3 was more reliable for measuring squamous epithelial cells and small round cells, while the UX-2000 was more accurate for detecting bacteria and hyaline casts. © 2017 Wiley Periodicals, Inc.

  6. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...

  7. Automated Gait Analysis Through Hues and Areas (AGATHA): a method to characterize the spatiotemporal pattern of rat gait

    Science.gov (United States)

    Kloefkorn, Heidi E.; Pettengill, Travis R.; Turner, Sara M. F.; Streeter, Kristi A.; Gonzalez-Rothi, Elisa J.; Fuller, David D.; Allen, Kyle D.

    2016-01-01

    While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns. PMID:27554674

  8. Automated Gait Analysis Through Hues and Areas (AGATHA): A Method to Characterize the Spatiotemporal Pattern of Rat Gait.

    Science.gov (United States)

    Kloefkorn, Heidi E; Pettengill, Travis R; Turner, Sara M F; Streeter, Kristi A; Gonzalez-Rothi, Elisa J; Fuller, David D; Allen, Kyle D

    2017-03-01

    While rodent gait analysis can quantify the behavioral consequences of disease, significant methodological differences exist between analysis platforms and little validation has been performed to understand or mitigate these sources of variance. By providing the algorithms used to quantify gait, open-source gait analysis software can be validated and used to explore methodological differences. Our group is introducing, for the first time, a fully-automated, open-source method for the characterization of rodent spatiotemporal gait patterns, termed Automated Gait Analysis Through Hues and Areas (AGATHA). This study describes how AGATHA identifies gait events, validates AGATHA relative to manual digitization methods, and utilizes AGATHA to detect gait compensations in orthopaedic and spinal cord injury models. To validate AGATHA against manual digitization, results from videos of rodent gait, recorded at 1000 frames per second (fps), were compared. To assess one common source of variance (the effects of video frame rate), these 1000 fps videos were re-sampled to mimic several lower fps and compared again. While spatial variables were indistinguishable between AGATHA and manual digitization, low video frame rates resulted in temporal errors for both methods. At frame rates over 125 fps, AGATHA achieved a comparable accuracy and precision to manual digitization for all gait variables. Moreover, AGATHA detected unique gait changes in each injury model. These data demonstrate AGATHA is an accurate and precise platform for the analysis of rodent spatiotemporal gait patterns.

  9. Automated spectrophotometric bicarbonate analysis in duodenal juice compared to the back titration method.

    Science.gov (United States)

    Erchinger, Friedemann; Engjom, Trond; Gudbrandsen, Oddrun Anita; Tjora, Erling; Gilja, Odd H; Dimcevski, Georg

    2016-01-01

    We have recently evaluated a short endoscopic secretin test for exocrine pancreatic function. Bicarbonate concentration in duodenal juice is an important parameter in this test. Measurement of bicarbonate by back titration, the gold standard method, is time-consuming, expensive, and technically difficult, so a simplified method is warranted. We aimed to evaluate an automated spectrophotometric method in samples spanning the effective range of bicarbonate concentrations in duodenal juice. We also evaluated whether freezing the samples before analysis would affect the results. Patients routinely examined with the short endoscopic secretin test, suspected of having decreased pancreatic function for various reasons, were included. Bicarbonate in duodenal juice was quantified by back titration and by automatic spectrophotometry. Both fresh and thawed samples were analysed spectrophotometrically. 177 samples from 71 patients were analysed. The correlation coefficient of all measurements against the back titration gold standard was r = 0.98. This is a major simplification of direct pancreatic function testing and allows a wider distribution of bicarbonate testing in duodenal juice. Extreme bicarbonate concentrations obtained by the autoanalyser method have to be interpreted with caution. Copyright © 2016 IAP and EPC. Published by Elsevier India Pvt Ltd. All rights reserved.

  10. Automated analysis of gastric emptying

    International Nuclear Information System (INIS)

    Abutaleb, A.; Frey, D.; Spicer, K.; Spivey, M.; Buckles, D.

    1986-01-01

    The authors devised a novel method to automate the analysis of nuclear gastric emptying studies. Many previous methods have been used to measure gastric emptying, but they are cumbersome and require continual operator intervention. Two specific problems are patient movement between images and changes in the location of the radioactive material within the stomach. The authors' method can be used with either dual- or single-phase studies. For dual-phase studies, In-111-labeled water and Tc-99m sulfur colloid (SC)-labeled scrambled eggs are used. For single-phase studies, either the liquid- or solid-phase material is used

  11. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides a very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions. The systematic approach inexorably leads to a proliferation of redundant structures that needs to be addressed properly. Global filtering techniques cause a drastic elimination of interesting structures that damages the quality of the analysis. On the other hand, a selection of closed patterns allows for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions …

  12. Spectral neighbor analysis method for automated generation of quantum-accurate interatomic potentials

    International Nuclear Information System (INIS)

    Thompson, A.P.; Swiler, L.P.; Trott, C.R.; Foiles, S.M.; Tucker, G.J.

    2015-01-01

    We present a new interatomic potential for solids and liquids called Spectral Neighbor Analysis Potential (SNAP). The SNAP potential has a very general form and uses machine-learning techniques to reproduce the energies, forces, and stress tensors of a large set of small configurations of atoms, which are obtained using high-accuracy quantum electronic structure (QM) calculations. The local environment of each atom is characterized by a set of bispectrum components of the local neighbor density projected onto a basis of hyperspherical harmonics in four dimensions. The bispectrum components are the same bond-orientational order parameters employed by the GAP potential [1]. The SNAP potential, unlike GAP, assumes a linear relationship between atom energy and bispectrum components. The linear SNAP coefficients are determined using weighted least-squares linear regression against the full QM training set. This allows the SNAP potential to be fit in a robust, automated manner to large QM data sets using many bispectrum components. The calculation of the bispectrum components and the SNAP potential are implemented in the LAMMPS parallel molecular dynamics code. We demonstrate that a previously unnoticed symmetry property can be exploited to reduce the computational cost of the force calculations by more than one order of magnitude. We present results for a SNAP potential for tantalum, showing that it accurately reproduces a range of commonly calculated properties of both the crystalline solid and the liquid phases. In addition, unlike simpler existing potentials, SNAP correctly predicts the energy barrier for screw dislocation migration in BCC tantalum
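
    The fitting step is, as stated, a weighted linear least-squares problem mapping bispectrum components to reference energies. A minimal NumPy sketch follows; the shapes and weights are illustrative, and this is not the LAMMPS implementation.

        import numpy as np

        def fit_snap_coefficients(B: np.ndarray, e_qm: np.ndarray,
                                  w: np.ndarray) -> np.ndarray:
            """Solve argmin_c || sqrt(w) * (B c - e_qm) ||^2.

            B    : (n_configs, n_bispectrum) descriptor matrix
            e_qm : (n_configs,) reference QM energies
            w    : (n_configs,) per-configuration weights
            """
            sw = np.sqrt(w)
            coeffs, *_ = np.linalg.lstsq(B * sw[:, None], e_qm * sw, rcond=None)
            return coeffs

        # Predicted energy for a new configuration: e_new = B_new @ coeffs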

  13. Literature Lab: a method of automated literature interrogation to infer biology from microarray analysis

    Directory of Open Access Journals (Sweden)

    Stegmaier Kimberly

    2007-12-01

    Background: The biomedical literature is a rich source of associative information but too vast for complete manual review. We have developed an automated method of literature interrogation called "Literature Lab" that identifies and ranks associations existing in the literature between gene sets, such as those derived from microarray experiments, and curated sets of key terms (i.e., pathway names, medical subject heading (MeSH) terms, etc.). Results: Literature Lab was developed using differentially expressed gene sets from three previously published cancer experiments and tested on a fourth, novel gene set. When applied to the gene sets from the published data, including an in vitro experiment, an in vivo mouse experiment, and an experiment with human tumor samples, Literature Lab correctly identified known biological processes occurring within each experiment. When applied to a novel set of genes differentially expressed between locally invasive and metastatic prostate cancer, Literature Lab identified a strong association between the pathway term "FOSB" and genes with increased expression in metastatic prostate cancer. Immunohistochemistry subsequently confirmed increased nuclear FOSB staining in metastatic compared to locally invasive prostate cancers. Conclusion: This work demonstrates that Literature Lab can discover key biological processes by identifying meritorious associations between experimentally derived gene sets and key terms within the biomedical literature.
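
    The abstract does not give Literature Lab's ranking statistic, so as a generic stand-in, a 2x2 overlap test (Fisher's exact test) can score a gene set against the genes linked to a key term; the names and counts below are hypothetical.

        from scipy.stats import fisher_exact

        def association_p(gene_set, term_genes, universe_size: int) -> float:
            """p-value for overlap between a gene set and a term's gene list."""
            gene_set, term_genes = set(gene_set), set(term_genes)
            both = len(gene_set & term_genes)
            only_set = len(gene_set) - both
            only_term = len(term_genes) - both
            neither = universe_size - both - only_set - only_term
            _, p = fisher_exact([[both, only_set], [only_term, neither]],
                                alternative="greater")
            return p

        # Ranking terms by ascending p surfaces the most meritorious associations.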

  14. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    … that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions with the automatic verification of three protocols: a secure exam protocol, Google’s Certificate Transparency, and an improved version of Bingo Voting. We find through automated verification that all three protocols satisfy verifiability while only the first two protocols meet accountability.

  15. NetFCM: A Semi-Automated Web-Based Method for Flow Cytometry Data Analysis

    DEFF Research Database (Denmark)

    Frederiksen, Juliet Wairimu; Buggert, Marcus; Karlsson, Annika C.

    2014-01-01

    Multi-parametric flow cytometry (FCM) represents an invaluable instrument to conduct single cell analysis and has significantly increased our understanding of the immune system. However, due to new techniques allowing us to measure an increased number of phenotypes within the immune system, FCM data analysis has become more complex and labor-intensive than previously. We have therefore developed a semi-automatic gating strategy (NetFCM) that uses clustering and principal component analysis (PCA) together with other statistical methods to mimic manual gating approaches. NetFCM is an online … corresponding to those obtained by manual gating strategies. These data demonstrate that NetFCM has the potential to identify relevant T cell populations by mimicking classical FCM data analysis and to reduce the subjectivity and amount of time associated with such analysis. (c) 2014 International Society …

  16. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin using an inductively coupled plasma mass spectrometry (ICP-MS) detector was developed, hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L(-1) ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used since the proposed system allows the use of different sample volumes for preconcentration. Mass linear working ranges were between 0.13 and 50 ng and 0.1 and 50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters as well as a certified reference material SLRS-4 (river water) were satisfactorily analyzed obtaining recoveries between 90 and 110% for both elements. The main features of the LOV-MSFIA-ICP-MS system proposed are the capability to renew solid phase extraction at will in a fully automated way, the remarkable stability of the column which can be reused up to 160 times, and the potential to perform isotopic analysis.

  17. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    International Nuclear Information System (INIS)

    Cluckie, Alice Jane

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been evaluated for application to cerebral perfusion SPET imaging in ischaemic stroke. It has been shown that useful quantitative estimates, high sensitivity and high specificity may be obtained. Sensitivity and the accuracy of signal quantification were found to be dependent on the operator defined analysis parameters. Recommendations for the values of these parameters have been made. The analysis method developed has been compared with an established method and shown to result in higher specificity for the data and analysis parameter sets tested. In addition, application to a group of ischaemic stroke patient SPET scans has demonstrated its clinical utility. The influence of imaging conditions has been assessed using phantom data acquired with different gamma camera SPET acquisition parameters. A lower limit of five million counts and standardisation of all acquisition parameters has been recommended for the analysis of individual SPET scans. (author)
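
    The voxel-by-voxel comparison against a control group can be sketched as a z-map: standardize each voxel of the individual scan by the control mean and standard deviation, then threshold into clusters. The code below illustrates only this first step (the Monte Carlo inference described above is not reproduced), and the threshold is an assumption.

        import numpy as np

        def z_map(scan: np.ndarray, controls: np.ndarray, z_thresh: float = 3.0):
            """controls: (n_subjects, x, y, z) registered, normalized scans."""
            mu = controls.mean(axis=0)
            sd = controls.std(axis=0, ddof=1)
            z = np.zeros_like(scan, dtype=float)
            ok = sd > 0
            z[ok] = (scan[ok] - mu[ok]) / sd[ok]
            # Voxels beyond the threshold are candidate uptake abnormalities.
            return z, np.abs(z) > z_thresh

        # Cluster the boolean map (e.g., with scipy.ndimage.label) before
        # estimating the size and degree of each abnormality.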

  18. Longitudinal analysis of the temporal evolution of Acinetobacter baumannii strains in Ohio, USA, by using rapid automated typing methods.

    Directory of Open Access Journals (Sweden)

    Brooke K Decker

    Full Text Available Genotyping methods are essential to understand the transmission dynamics of Acinetobacter baumannii. We examined the representative genotypes of A. baumannii at different time periods in select locations in Ohio, using two rapid automated typing methods: PCR coupled with electrospray ionization mass spectrometry (PCR/ESI-MS), a form of multi-locus sequence typing (MLST), and repetitive-sequence-based PCR (rep-PCR). Our analysis included 122 isolates from 4 referral hospital systems in 2 urban areas of Ohio. These isolates were associated with outbreaks at 3 different time periods (1996, 2000 and 2005-2007). Type assignments of PCR/ESI-MS and rep-PCR were compared to each other and to worldwide (WW) clone types. The discriminatory power of each method was determined using Simpson's index of diversity (DI). We observed that PCR/ESI-MS sequence type (ST) 14, corresponding to WW clone 3, predominated in 1996, whereas ST 12 and 14 co-existed in the intermediate period (2000), and ST 10 and 12, belonging to WW clone 2, predominated more recently in 2007. The shift from WW clone 3 to WW clone 2 was accompanied by an increase in carbapenem resistance. The DI was approximately 0.74 for PCR/ESI-MS, 0.88 for rep-PCR and 0.90 for the combination of both typing methods. We conclude that combining rapid automated typing methods such as PCR/ESI-MS and rep-PCR serves to optimally characterize the regional molecular epidemiology of A. baumannii. Our data also shed light on the changing sequence types over an 11-year period in Northeast Ohio.
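
    The discriminatory power metric used above, Simpson's index of diversity, has a simple closed form. A minimal sketch (with hypothetical type assignments) follows:

```python
from collections import Counter

def simpsons_di(type_assignments):
    """Simpson's index of diversity:
    DI = 1 - sum(n_j * (n_j - 1)) / (N * (N - 1)),
    where n_j is the number of isolates of type j and N is the total."""
    counts = Counter(type_assignments)
    n = sum(counts.values())
    return 1.0 - sum(c * (c - 1) for c in counts.values()) / (n * (n - 1))

# Hypothetical sequence-type assignments for a set of isolates
print(simpsons_di(["ST14"] * 40 + ["ST12"] * 50 + ["ST10"] * 32))
```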

  19. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers belong to the Electric Power System equipment whose reliability strongly influences the reliability of Power Plants. In particular, breakers determine the structural reliability of the switchgear circuits of Power Stations and network substations. A breaker's failure to clear a short circuit, followed by failure of the backup unit or of the remote protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent as the costs of maintaining and repairing oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. This, however, demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, their failures, testing and repairs, as well as advanced software and a dedicated automated information system (AIS). A new AIS, named AISV, was developed at the "Reliability of power equipment" department of AzRDSI of Energy. The main features of AISV are:
    · to ensure the security and accuracy of the database;
    · to carry out systematic control of breakers' conformity with operating conditions;
    · to estimate the value of individual reliability and the characteristics of its change for a given combination of characteristics;
    · to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for their realization.

  20. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    International Nuclear Information System (INIS)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H.

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. Demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and delivers the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and the coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  1. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

    Full Text Available Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed Gene Set Matrix Analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets related by known biological function, or as designated solely by the end-user, against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
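
    In the spirit of GSMA's group-wise testing of gene sets against whole distributions of expression changes, the sketch below applies a one-sided Mann-Whitney U test to a hypothetical gene set versus the rest of a simulated dataset; this is an illustrative stand-in, not the published GSMA algorithm.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(1)

# Hypothetical log2 fold changes for a whole dataset, keyed by gene name
dataset = {f"gene{i}": fc for i, fc in enumerate(rng.normal(0.0, 1.0, 5000))}
gene_set = {f"gene{i}" for i in range(40)}     # hypothetical gene set
for g in gene_set:                             # simulate coordinated upregulation
    dataset[g] += 1.0

# Test whether the set's changes are shifted relative to the full distribution,
# in the spirit of (but not identical to) the GSMA method described above
in_set = [fc for g, fc in dataset.items() if g in gene_set]
rest = [fc for g, fc in dataset.items() if g not in gene_set]
stat, p = mannwhitneyu(in_set, rest, alternative="greater")
print(f"U = {stat:.0f}, p = {p:.2e}")
```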

  2. Automated detection method for architectural distortion areas on mammograms based on morphological processing and surface analysis

    Science.gov (United States)

    Ichikawa, Tetsuko; Matsubara, Tomoko; Hara, Takeshi; Fujita, Hiroshi; Endo, Tokiko; Iwase, Takuji

    2004-05-01

    Along with masses and microcalcifications, architectural distortion is a very important finding for the early detection of breast cancer on mammograms; such distortions can be classified into three typical types: spiculation, retraction, and distortion. The purpose of this work is to develop an automatic method for detecting areas of architectural distortion with spiculation. Suspect areas are detected by concentration indexes of line structures extracted using mean curvature. Discriminant analysis of nine features is then employed to classify true and false positives. The employed features are the size, the mean pixel value, the mean concentration index, the mean isotropic index, the contrast, and four other features based on the power spectrum. As a result of this work, the accuracy of the classification was 76% and the sensitivity was 80% with 0.9 false positives per image in our database in regard to spiculation. It was concluded that our method was effective in detecting areas of architectural distortion; however, some architectural distortions were not detected accurately because of the size, the density, or the different appearance of the distorted areas.

  3. Database-Centric Method for Automated High-Throughput Deconvolution and Analysis of Kinetic Antibody Screening Data.

    Science.gov (United States)

    Nobrega, R Paul; Brown, Michael; Williams, Cody; Sumner, Chris; Estep, Patricia; Caffry, Isabelle; Yu, Yao; Lynaugh, Heather; Burnina, Irina; Lilov, Asparouh; Desroches, Jordan; Bukowski, John; Sun, Tingwan; Belk, Jonathan P; Johnson, Kirt; Xu, Yingda

    2017-10-01

    The state-of-the-art industrial drug discovery approach is the empirical interrogation of a library of drug candidates against a target molecule. The advantage of high-throughput kinetic measurements over equilibrium assessments is the ability to measure each of the kinetic components of binding affinity. Although high-throughput capabilities have improved with advances in instrument hardware, three bottlenecks in data processing remain: (1) intrinsic molecular properties that lead to poor biophysical quality in vitro are not accounted for in commercially available analysis models, (2) processing data through a user interface is time-consuming and not amenable to parallelized data collection, and (3) a commercial solution that includes historical kinetic data in the analysis of kinetic competition data does not exist. Herein, we describe a generally applicable method for the automated analysis, storage, and retrieval of kinetic binding data. This analysis can deconvolve poor quality data on-the-fly and store and organize historical data in a queryable format for use in future analyses. Such database-centric strategies afford greater insight into the molecular mechanisms of kinetic competition, allowing for the rapid identification of allosteric effectors and the presentation of kinetic competition data in absolute terms of percent bound to antigen on the biosensor.
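
    As an illustration of the kinetic deconvolution such pipelines automate, the sketch below fits a 1:1 Langmuir association-phase model to a synthetic biosensor trace and reports kon, koff, and KD; the model form, concentration, and data are hypothetical and not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

CONC = 50e-9  # analyte concentration in M (hypothetical)

def association(t, kon, koff, rmax):
    """1:1 Langmuir association-phase response on a biosensor."""
    kobs = kon * CONC + koff
    return rmax * (kon * CONC / kobs) * (1.0 - np.exp(-kobs * t))

# Hypothetical noisy sensorgram for one antibody-antigen pair
rng = np.random.default_rng(2)
t = np.linspace(0, 300, 120)  # seconds
y = association(t, 1e5, 1e-3, 80.0) + rng.normal(0, 0.5, t.size)

# Recover the kinetic components of binding affinity from the curve shape
(kon, koff, rmax), _ = curve_fit(association, t, y, p0=[1e5, 1e-3, 100.0])
print(f"kon = {kon:.2e} 1/(M*s), koff = {koff:.2e} 1/s, KD = {koff/kon:.2e} M")
```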

  4. Automated Analysis of Facial Cues from Videos as a Potential Method for Differentiating Stress and Boredom of Players in Games

    Directory of Open Access Journals (Sweden)

    Fernando Bevilacqua

    2018-01-01

    Full Text Available Facial analysis is a promising approach to detect the emotions of players unobtrusively; however, approaches are commonly evaluated in contexts not related to games, or facial cues are derived from models not designed for the analysis of emotions during interactions with games. We present a method for automated analysis of facial cues from videos as a potential tool for detecting stress and boredom of players behaving naturally while playing games. Computer vision is used to automatically and unobtrusively extract 7 facial features aimed at detecting the activity of a set of facial muscles. Features are mainly based on the Euclidean distance of facial landmarks and do not rely on predefined facial expressions, training of a model, or the use of facial standards. An empirical evaluation was conducted on video recordings of an experiment involving games as emotion elicitation sources. Results show statistically significant differences in the values of facial features during boring and stressful periods of gameplay for 5 of the 7 features. We believe our approach is more user-tailored, convenient, and better suited for contexts involving games.
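
    A minimal sketch of a landmark-distance feature of the kind described (a normalized Euclidean distance between facial landmarks) is shown below; the landmark indices, the normalization choice, and the data are hypothetical, not the authors' exact feature definitions.

```python
import numpy as np

def mouth_width_feature(landmarks):
    """Example facial-cue feature in the spirit of the paper: the Euclidean
    distance between the outer mouth corners, normalized by inter-eye distance
    to cancel out face size and camera distance. Indices follow the common
    68-point landmark convention (an assumption, not the paper's definition)."""
    left_mouth, right_mouth = landmarks[48], landmarks[54]
    left_eye, right_eye = landmarks[36], landmarks[45]
    mouth_width = np.linalg.norm(right_mouth - left_mouth)
    face_scale = np.linalg.norm(right_eye - left_eye)
    return mouth_width / face_scale

# Hypothetical 68 (x, y) landmark positions for one video frame
rng = np.random.default_rng(3)
landmarks = rng.uniform(0, 640, size=(68, 2))
print(mouth_width_feature(landmarks))
```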

  5. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch

    1997-01-01

    The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  6. A Simple Method for Automated Solid Phase Extraction of Water Samples for Immunological Analysis of Small Pollutants.

    Science.gov (United States)

    Heub, Sarah; Tscharner, Noe; Kehl, Florian; Dittrich, Petra S; Follonier, Stéphane; Barbe, Laurent

    2016-01-01

    A new method for solid phase extraction (SPE) of environmental water samples is proposed. The developed prototype is cost-efficient and user-friendly, and enables rapid, automated and simple SPE. The pre-concentrated solution is compatible with analysis by immunoassay, with a low organic solvent content. A method is described for the extraction and pre-concentration of the natural hormone 17β-estradiol in 100 ml water samples. Reverse-phase SPE is performed with octadecyl-silica sorbent and elution is done with 200 µl of methanol 50% v/v. The eluent is diluted with deionized water to lower the amount of methanol. After manually preparing the SPE column, the overall procedure is performed automatically within 1 hr. At the end of the process, the estradiol concentration is measured using a commercial enzyme-linked immunosorbent assay (ELISA). 100-fold pre-concentration is achieved and the methanol content is only 10% v/v. Full recoveries of the molecule are achieved with 1 ng/L spiked deionized and synthetic seawater samples.
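
    The stated figures are internally consistent, as the short check below shows (volumes and concentrations taken from the abstract; the 5-fold dilution step is inferred from the two methanol concentrations):

```python
# Worked check of the stated pre-concentration factor
sample_volume_ml = 100.0      # water sample loaded on the SPE column
eluate_volume_ml = 0.2        # 200 µl of 50% v/v methanol
final_methanol = 0.10         # target 10% v/v methanol after dilution

# Diluting the 50% v/v eluate down to 10% v/v implies a 5-fold dilution
final_volume_ml = eluate_volume_ml * (0.50 / final_methanol)   # = 1.0 ml
preconcentration = sample_volume_ml / final_volume_ml
print(preconcentration)  # 100-fold, matching the abstract
```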

  7. Automated methods of textual content analysis and description of text structures

    CERN Document Server

    Chýla, Roman

    Universal Semantic Language (USL) is a semi-formalized approach for the description of knowledge (a knowledge representation tool). The idea of USL was introduced by Vladimir Smetacek in the system called SEMAN, which was used for keyword extraction tasks in the former Information centre of the Czechoslovak Republic. However, due to the dissolution of the centre in the early 1990s, the system was lost. This thesis reintroduces the idea of USL in the new context of quantitative content analysis. First we introduce the historical background and the problems of semantics and knowledge representation, semes, semantic fields, semantic primes and universals. The basic methodology of content analysis studies is illustrated on the example of three content analysis tools, and we describe the architecture of a new system. The application was built specifically for USL discovery, but it can also work in the context of classical content analysis. It contains Natural Language Processing (NLP) components and employs the algorith...

  8. Multistage decision-based heart sound delineation method for automated analysis of heart sounds and murmurs.

    Science.gov (United States)

    Nivitha Varghees, V; Ramachandran, K I

    2015-12-01

    A robust multistage decision-based heart sound delineation (MDHSD) method is presented for automatically determining the boundaries and peaks of heart sounds (S1, S2, S3, and S4), systolic and diastolic murmurs (early, mid, and late), and high-pitched sounds (HPSs) in the phonocardiogram (PCG) signal. The proposed MDHSD method consists of Gaussian kernel-based signal decomposition (GSD) and multistage decision-based delineation (MDBD). The GSD algorithm first removes low-frequency (LF) artefacts and then decomposes the filtered signal into two subsignals: the LF sound part (S1, S2, S3, and S4) and the high-frequency sound part (murmurs and HPSs). The MDBD algorithm consists of absolute envelope extraction, adaptive thresholding, and fiducial point determination. The accuracy and robustness of the proposed method are evaluated using various types of normal and pathological PCG signals. Results show that the method achieves an average sensitivity of 98.22%, positive predictivity of 97.46%, and overall accuracy of 95.78%. The method yields maximum average delineation errors of 4.52 and 4.14 ms for determining the start-point and end-point of sounds. The proposed multistage delineation algorithm is capable of improving the delineation accuracy under time-varying amplitudes of heart sounds and various types of murmurs. The proposed method has significant potential applications in heart sound and murmur classification systems.
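
    A minimal sketch of the envelope-extraction and thresholding idea (not the full MDHSD method) on a synthetic PCG-like signal, assuming a band-pass filter plus Hilbert envelope and a simple mean-based adaptive threshold:

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 2000  # Hz, hypothetical PCG sampling rate
t = np.arange(0, 2.0, 1.0 / fs)

# Synthetic PCG: two "S1/S2-like" bursts per second plus noise (illustration only)
rng = np.random.default_rng(4)
pcg = rng.normal(0, 0.02, t.size)
for onset in (0.1, 0.45, 1.1, 1.45):
    idx = (t > onset) & (t < onset + 0.08)
    pcg[idx] += np.sin(2 * np.pi * 60 * t[idx]) * np.hanning(idx.sum())

# Band-limit to the low-frequency heart sound band, then take the absolute
# Hilbert envelope, as envelope-based delineation schemes do
b, a = butter(4, [25 / (fs / 2), 150 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, pcg)))

# Simple adaptive threshold (a stand-in for the paper's MDBD stage):
# boundaries are where the envelope crosses a multiple of its mean
mask = envelope > 3.0 * envelope.mean()
edges = np.flatnonzero(np.diff(mask.astype(int)))
print("boundary samples:", edges[:8])
```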

  9. Automation of a gamma spectrometric analysis method for naturally occuring radionuclides in different materials (NORM)

    International Nuclear Information System (INIS)

    Marzocchi, Olaf

    2009-06-01

    This work presents an improvement over the standard analysis routine used in the Physikalisches Messlabor to detect gamma peaks in spectra from naturally occurring radioactive materials (NORM). The new routine introduces custom libraries of known gamma peaks in order to ease the work of the software, which can therefore detect more peaks. As a final result, the user performing the analysis has fewer chances of making errors and can also analyse more spectra in the same amount of time. A new software package, with an optimised interface able to further enhance the productivity of the user, is developed and validated. (orig.)
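
    The core idea of matching detected peaks against a custom library of known gamma lines can be sketched in a few lines; the nuclides, line energies, and tolerance below are illustrative choices for NORM spectra, not the laboratory's actual library.

```python
# Minimal sketch of matching detected peak energies against a custom library
# of known gamma lines, within an energy tolerance (tolerance is hypothetical)
LIBRARY_KEV = {
    "K-40": [1460.8],
    "Pb-214": [295.2, 351.9],
    "Bi-214": [609.3, 1120.3, 1764.5],
    "Tl-208": [583.2, 2614.5],
}

def identify(peaks_kev, tolerance_kev=1.5):
    hits = []
    for peak in peaks_kev:
        for nuclide, lines in LIBRARY_KEV.items():
            for line in lines:
                if abs(peak - line) <= tolerance_kev:
                    hits.append((peak, nuclide, line))
    return hits

detected = [352.1, 609.5, 1461.2]   # hypothetical peak positions from a spectrum
for peak, nuclide, line in identify(detected):
    print(f"{peak:.1f} keV -> {nuclide} ({line} keV)")
```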

  10. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against ICP, considered the gold standard for automated rigid registration. Furthermore...
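
    For context, the closed-form alignment step at the heart of the ICP baseline (the Kabsch/Procrustes solution for a rigid transform between matched point sets) can be sketched as follows; this illustrates the gold-standard comparator, not the authors' Gibbs-sampler/ICM method.

```python
import numpy as np

def kabsch(P, Q):
    """Closed-form least-squares rotation R and translation t with R @ P_i + t ~ Q_i,
    the alignment step used inside ICP for matched point sets."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))          # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cq - R @ cp

# Verify on a hypothetical rigidly transformed 3D point cloud
rng = np.random.default_rng(5)
P = rng.normal(size=(100, 3))
angle = 0.3
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
Q = P @ R_true.T + np.array([1.0, -2.0, 0.5])
R, tvec = kabsch(P, Q)
print(np.allclose(P @ R.T + tvec, Q))  # True
```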

  11. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

    This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative environment based on an object-oriented, multidisciplinary, distributed framework using the Adaptive Modeling Language (AML) as a framework and supporting the configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  12. A method to quantify movement activity of groups of animals using automated image analysis

    Science.gov (United States)

    Xu, Jianyu; Yu, Haizhen; Liu, Ying

    2009-07-01

    Most physiological and environmental changes are capable of inducing variations in animal behavior. Behavioral parameters can be measured continuously in situ by a non-invasive and non-contact approach, and have the potential to be used in actual production settings to predict stress conditions. Most vertebrates tend to live in groups, herds, flocks, shoals, bands or packs of conspecific individuals. Under culture conditions, livestock or fish live in groups and interact with each other, so the aggregate behavior of the group should be studied rather than that of individuals. This paper presents a method, based on computer vision, to calculate the movement speed of a group of animals in an enclosure or a tank, expressed as body length speed, which corresponds to group activity. Frame sequences captured at a fixed time interval were subtracted in pairs after image segmentation and identification. By labeling the components caused by object movement in the difference frame, the projected area caused by the movement of every object in the capture interval was calculated; this projected area was divided by the projected area of every object in the later frame to get the body length moving distance of each object, from which the relative body length speed was obtained. The average speed of all objects reflects the activity of the group well. The group activity of a tilapia (Oreochromis niloticus) school exposed to a high (2.65 mg/L) level of unionized ammonia (UIA) was quantified based on these methods. The high UIA condition elicited a marked increase in school activity in the first hour (P<0.05), exhibiting an avoidance reaction (trying to flee from the high UIA condition), which then decreased gradually.
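
    A rough sketch of the frame-differencing idea is given below; it uses a simplified XOR-based area measure as a proxy for the paper's per-object projected-area computation, and all masks and parameters are hypothetical.

```python
import numpy as np

def group_activity(frames, capture_interval_s, object_area_px):
    """Group activity from frame differencing, loosely following the paper's
    idea: the area swept between consecutive frames, normalized by object
    area, gives a body-length displacement per frame (a simplified proxy,
    not the paper's exact per-object computation)."""
    speeds = []
    for prev, curr in zip(frames[:-1], frames[1:]):
        moved = np.logical_xor(prev, curr)                   # pixels changed by movement
        body_lengths = moved.sum() / (2.0 * object_area_px)  # rough displacement estimate
        speeds.append(body_lengths / capture_interval_s)
    return np.mean(speeds)

# Hypothetical binary segmentation masks (objects = True) for three frames
rng = np.random.default_rng(6)
frames = [rng.random((120, 160)) > 0.97 for _ in range(3)]
print(group_activity(frames, capture_interval_s=1.0, object_area_px=50.0))
```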

  13. Statistical colour models: an automated digital image analysis method for quantification of histological biomarkers.

    Science.gov (United States)

    Shu, Jie; Dolman, G E; Duan, Jiang; Qiu, Guoping; Ilyas, Mohammad

    2016-04-27

    Colour is the most important feature used in quantitative immunohistochemistry (IHC) image analysis; IHC is used to provide information relating to aetiology and to confirm malignancy. Statistical modelling is a technique widely used for colour detection in computer vision. We have developed a statistical model of colour detection applicable to the detection of stain colour in digital IHC images. The model was first trained on a large set of colour pixels collected semi-automatically. To speed up the training and detection processes, we removed the luminance (Y) channel of the YCbCr colour space and chose 128 histogram bins, found to be the optimal number. A maximum likelihood classifier is used to automatically classify pixels in digital slides into positively or negatively stained pixels. The model-based tool was developed within ImageJ to quantify targets identified using IHC and histochemistry. The purpose of the evaluation was to compare the computer model with human evaluation. Several large datasets were prepared and obtained from human oesophageal cancer, colon cancer and liver cirrhosis with different colour stains. Experimental results have demonstrated that the model-based tool achieves more accurate results than colour deconvolution and the CMYK model in the detection of brown colour, and is comparable to colour deconvolution in the detection of pink colour. We have also demonstrated that the proposed model has little inter-dataset variation. A robust and effective statistical model is introduced in this paper. The model-based interactive tool in ImageJ, which can create a visual representation of the statistical model and detect a specified colour automatically, is easy to use and available freely at http://rsb.info.nih.gov/ij/plugins/ihc-toolbox/index.html . Testing of the tool by different users showed only minor inter-observer variations in results.
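
    The main ingredients described above (dropping the Y channel of YCbCr, 128-bin histograms, and a maximum-likelihood pixel decision) can be sketched as follows; the BT.601 conversion constants are standard, but the training pixels and class means are hypothetical.

```python
import numpy as np

BINS = 128  # the bin count the paper found optimal

def to_cbcr(pixels_rgb):
    """ITU-R BT.601 conversion, keeping only chrominance: the luminance (Y)
    channel is dropped, as in the paper, for robustness to staining intensity."""
    r, g, b = pixels_rgb[:, 0], pixels_rgb[:, 1], pixels_rgb[:, 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return cb, cr

def cbcr_hist(pixels_rgb):
    """Normalized 2D (Cb, Cr) histogram acting as the class colour model."""
    h, _, _ = np.histogram2d(*to_cbcr(pixels_rgb), bins=BINS,
                             range=[[0, 256], [0, 256]], density=True)
    return h + 1e-9  # avoid zero likelihoods

def classify(pixels_rgb, hist_pos, hist_neg):
    """Maximum-likelihood decision per pixel: positively stained iff
    P(colour | positive) > P(colour | negative)."""
    cb, cr = to_cbcr(pixels_rgb)
    i = np.clip((cb / 256 * BINS).astype(int), 0, BINS - 1)
    j = np.clip((cr / 256 * BINS).astype(int), 0, BINS - 1)
    return hist_pos[i, j] > hist_neg[i, j]

# Hypothetical training pixels for brown (DAB) stain and pale background
rng = np.random.default_rng(7)
brown = rng.normal([120, 70, 40], 15, size=(5000, 3))
background = rng.normal([230, 225, 235], 10, size=(5000, 3))
model = (cbcr_hist(brown), cbcr_hist(background))
print(classify(np.array([[125.0, 72.0, 38.0], [230.0, 226.0, 233.0]]), *model))
```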

  14. An automated activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey will be described. (author)

  15. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  16. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day

  17. A new automated method for analysis of gated-SPECT images based on a three-dimensional heart shaped model

    DEFF Research Database (Denmark)

    Lomsky, Milan; Richter, Jens; Johansson, Lena

    2005-01-01

    A new automated method for quantification of left ventricular function from gated single photon emission computed tomography (SPECT) images has been developed. The method for quantification of cardiac function (CAFU) is based on a heart-shaped model and the active shape algorithm. The model... In the patient group, the EDV calculated using QGS and CAFU showed good agreement for large hearts, with higher CAFU values compared with QGS for the smaller hearts. In the larger hearts, ESV was much larger for QGS than for CAFU in both the phantom and patient studies. In the smallest hearts there was good...

  18. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  19. Automated Marx’s Composite Oscillator Method

    Science.gov (United States)

    Tateno, Hiroto; Taniguchi, Hideaki

    1980-01-01

    Marx’s composite oscillator method has been successfully automated for measuring internal friction and Young’s modulus. The apparatus automatically finds the mechanical resonant frequency of the composite oscillator and keeps the strain amplitude constant, so that internal friction and the modulus can easily be measured and directly recorded on an X-t recorder. This device enables us to observe continuously the time dependence of the pinning and unpinning of dislocations by point defects. The error in strain amplitude is suppressed to within ±0.1% while the internal friction value changes by 2 orders of magnitude. The operation of this system has been analyzed by automatic control theory, and the theoretical results are in good agreement with the experimental results. The analysis and the application of this method are presented here together with some experimental results.

  20. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2005-01-01

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many) instances of a scenario. The tool is based on control flow analysis of the process calculus LySa and is applied to the Bauer, Berson, and Feiertag protocol, where it reveals a previously undocumented problem, which occurs in some scenarios but not in others.

  1. Automation of finite element methods

    CERN Document Server

    Korelc, Jože

    2016-01-01

    New finite elements are needed in research as well as in industrial environments for the development of virtual prediction techniques. The design and implementation of novel finite elements for specific purposes is a tedious and time-consuming task, especially for nonlinear formulations. Automating this process can speed it up considerably, since the generation of the final computer code can be accelerated by several orders of magnitude. This book provides the reader with the knowledge required to employ modern automatic tools like AceGen within solid mechanics in a successful way. It covers the range from theoretical background and algorithmic treatments to many different applications. The book is written for advanced students in the engineering field and for researchers in educational and industrial environments.

  2. Fully automated dissolution and separation methods for inductively coupled plasma atomic emission spectrometry rock analysis. Application to the determination of rare earth elements

    International Nuclear Information System (INIS)

    Govindaraju, K.; Mevelle, G.

    1987-01-01

    In rock analysis laboratories, sample preparation is a serious problem, or even an enormous bottleneck. Because this laboratory is production-oriented, the problem was attacked by progressively automating different steps in rock analysis for major, minor and trace elements. This effort has been considerably eased by the fact that all sample preparation schemes in this laboratory for the past three decades have been based on an initial lithium borate fusion of rock samples, and all analytical methods on multi-element atomic emission spectrometry, with a switch-over from solid analysis by arc/spark excitation to solution analysis by plasma excitation in 1974. The sample preparation steps which have been automated are: weighing of samples and fluxes, lithium borate fusion, dissolution and dilution of fusion products, and ion-exchange separation of difficult trace elements such as the rare earth elements (REE). During 1985 and 1986, these different unit operations were assembled together as peripheral units in the form of a workstation, called LabRobStation. A travelling robot is the master of LabRobStation, with all peripheral units within its reach in a 10 m2 workspace. As an example of a real application, the automated determination of REE, based on more than 8000 samples analysed between 1982 and 1986, is presented. (author)

  3. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Directory of Open Access Journals (Sweden)

    Fernanda C Dórea

    Full Text Available BACKGROUND: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. METHODS: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. RESULTS: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed equal interpretability to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. CONCLUSION: The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish
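
    The gap between F1-micro and F1-macro reported above arises exactly when a classifier fails to learn some classes. A small illustration (with hypothetical labels, not the paper's data):

```python
import numpy as np
from sklearn.metrics import f1_score

rng = np.random.default_rng(8)

# Hypothetical 16-class problem in which the classifier never learned 3 classes,
# mirroring the rule-induction behaviour described in the abstract
y_true = rng.integers(0, 16, size=1000)
y_pred = y_true.copy()
for missing in (13, 14, 15):          # classes absent from the learned model
    y_pred[y_true == missing] = 0     # always misassigned to a common class

# Micro-averaging weights by instance counts; macro-averaging weights each
# class equally, so the three unlearned classes drag the macro score down
print("F1-micro:", f1_score(y_true, y_pred, average="micro", zero_division=0))
print("F1-macro:", f1_score(y_true, y_pred, average="macro", zero_division=0))
```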

  4. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Science.gov (United States)

    Dórea, Fernanda C; Muckle, C Anne; Kelton, David; McClure, J T; McEwen, Beverly J; McNab, W Bruce; Sanchez, Javier; Revie, Crawford W

    2013-01-01

    Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed equal interpretability to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish automated methods to update model rules without user

  5. Automated analysis of complex data

    Science.gov (United States)

    Saintamant, Robert; Cohen, Paul R.

    1994-01-01

    We have examined some of the issues involved in automating exploratory data analysis, in particular the tradeoff between control and opportunism. We have proposed an opportunistic planning solution for this tradeoff, and we have implemented a prototype, Igor, to test the approach. Our experience in developing Igor was surprisingly smooth. In contrast to earlier versions that relied on rule representation, it was straightforward to increment Igor's knowledge base without causing the search space to explode. The planning representation appears to be both general and powerful, with high level strategic knowledge provided by goals and plans, and the hooks for domain-specific knowledge are provided by monitors and focusing heuristics.

  6. A detailed comparison of analysis processes for MCC-IMS data in disease classification-Automated methods can replace manual peak annotations.

    Directory of Open Access Journals (Sweden)

    Salome Horsch

    Full Text Available Disease classification from molecular measurements typically requires an analysis pipeline from raw noisy measurements to final classification results. Multi-capillary column-ion mobility spectrometry (MCC-IMS) is a promising technology for the detection of volatile organic compounds in the air of exhaled breath. From raw measurements, the peak regions representing the compounds have to be identified, quantified, and clustered across different experiments. Currently, several steps of this analysis process require manual intervention of human experts. Our goal is to identify a fully automatic pipeline that yields competitive disease classification results compared to an established but subjective and tedious semi-manual process. We combine a large number of modern methods for peak detection, peak clustering, and multivariate classification into analysis pipelines for raw MCC-IMS data. We evaluate all combinations on three different real datasets in an unbiased cross-validation setting. We determine which specific algorithmic combinations lead to high AUC values in disease classifications across the different medical application scenarios. The best fully automated analysis process achieves even better classification results than the established manual process. The best algorithms for the three analysis steps are (i) SGLTR (Savitzky-Golay Laplace-operator filter thresholding regions) and LM (Local Maxima) for automated peak identification, (ii) EM clustering (Expectation Maximization) and DBSCAN (Density-Based Spatial Clustering of Applications with Noise) for the clustering step, and (iii) RF (Random Forest) for multivariate classification. Thus, automated methods can replace the manual steps in the analysis process to enable an unbiased high-throughput use of the technology.
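
    A compact sketch of steps (ii) and (iii) of such a pipeline, using two of the algorithms named in the abstract (DBSCAN for peak clustering, Random Forest with cross-validated AUC for classification); the peak coordinates, intensities, and labels are simulated, and all parameters are illustrative only.

```python
import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(9)

# Step (ii): cluster hypothetical peak positions (retention time, drift time)
# pooled across measurements, so matching peaks share a cluster id
peaks = np.vstack([rng.normal(c, 0.02, size=(40, 2))
                   for c in [(0.2, 0.5), (0.6, 0.3), (0.8, 0.8)]])
cluster_ids = DBSCAN(eps=0.1, min_samples=5).fit_predict(peaks)
print("clusters found:", len(set(cluster_ids) - {-1}))

# Step (iii): classify subjects from a per-cluster intensity matrix with
# Random Forest, evaluated by cross-validated AUC as in the paper's comparison
X = rng.normal(0, 1, size=(60, 3))   # hypothetical per-cluster intensities
y = rng.integers(0, 2, size=60)      # hypothetical disease labels
X[y == 1] += 0.8                     # inject a class signal for illustration
auc = cross_val_score(RandomForestClassifier(n_estimators=200, random_state=0),
                      X, y, cv=5, scoring="roc_auc").mean()
print(f"cross-validated AUC: {auc:.2f}")
```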

  7. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  8. Automated reasoning-alternative methods

    Directory of Open Access Journals (Sweden)

    Perović Aleksandar

    2004-01-01

    Full Text Available Our main goal is to describe a potential usage of the interpretation method (i.e., the formal representation of one first-order theory in another) together with quantifier elimination procedures developed in the GIS.

  9. A Method of Partly Automated Testing of Software

    Science.gov (United States)

    Lowry, Mike; Visser, Willem; Washington, Rich; Artho, Cyrille; Goldberg, Allen; Havelund, Klaus; Pasareanu, Corina; Khurshid, Sarfraz; Roşu, Grigore

    2007-01-01

    A method of automated testing of software has been developed that provides an alternative to the conventional mostly manual approach for software testing. The method combines (1) automated generation of test cases on the basis of systematic exploration of the input domain of the software to be tested with (2) run-time analysis in which execution traces are monitored, verified against temporal-logic specifications, and analyzed by concurrency-error-detection algorithms. In this new method, the user only needs to provide the temporal logic specifications against which the software will be tested and the abstract description of the input domain.
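
    The run-time analysis component checks execution traces against temporal-logic specifications. A toy monitor for one such property ("every request is eventually followed by a response") gives the flavour; the event names and the property are hypothetical, and the real tooling is far richer.

```python
def monitor_response(trace):
    """Tiny run-time monitor for the temporal property
    'every REQ is eventually followed by a RESP' over a finite execution trace.
    A toy stand-in for the temporal-logic verification stage described above."""
    pending = 0
    for event in trace:
        if event == "REQ":
            pending += 1
        elif event == "RESP" and pending > 0:
            pending -= 1
    return pending == 0   # property holds iff no request is left unanswered

print(monitor_response(["REQ", "RESP", "REQ", "RESP"]))  # True
print(monitor_response(["REQ", "REQ", "RESP"]))          # False
```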

  10. Automated quantification and analysis of mandibular asymmetry

    DEFF Research Database (Denmark)

    Darvann, T. A.; Hermann, N. V.; Larsen, P.

    2010-01-01

    We present an automated method of spatially detailed 3D asymmetry quantification in mandibles extracted from CT and apply it to a population of infants with unilateral coronal synostosis (UCS). An atlas-based method employing non-rigid registration of surfaces is used for determining deformation ...... after mirroring the mandible across the MSP. A principal components analysis of asymmetry characterizes the major types of asymmetry in the population, and successfully separates the asymmetric UCS mandibles from a number of less asymmetric mandibles from a control population....

  11. Autoradiography and automated image analysis

    International Nuclear Information System (INIS)

    Vardy, P.H.; Willard, A.G.

    1982-01-01

    Limitations of automated image analysis and solutions to the problems encountered are discussed. With transmitted light, unstained plastic sections with planar profiles should be used. Stains potentiate the signal, so that the television camera registers grains as falsely larger areas of low light intensity. Unfocussed grains in paraffin sections will not be seen by image analysers due to changes in darkness and size. With incident illumination, the use of crossed polars, oil objectives and an oil-filled light trap continuous with the base of the slide will reduce glare. However, this procedure attenuates the light reflected by silver grains so strongly that detection may be impossible. Autoradiographs should then be photographed and the negative images of silver grains on film analysed automatically using transmitted light.

  12. Cell cycle kinetic analysis of colorectal neoplasms using a new automated immunohistochemistry-based cell cycle detection method.

    Science.gov (United States)

    Tomono, Ayako; Itoh, Tomoo; Yanagita, Emmy; Imagawa, Naoko; Kakeji, Yoshihiro

    2015-01-01

    We have recently developed a new method called immunohistochemistry-based cell cycle detection (iCCD), which allows the determination of cell cycle phases on a cell-by-cell basis. This automated procedure can be performed on tissue sections and involves triple immunostaining for geminin, cdt1, and γH2A.X, nuclear proteins expressed sequentially, with a few overlaps, during the cell cycle. In the current study, we applied this technique to resected specimens of colorectal neoplasms to determine the usefulness of iCCD for the pathological examination of colorectal cancers. We examined 141 cases of colorectal cancer; normal mucosa and adenomas were analyzed as controls. In nonneoplastic mucosa, we observed a regular pattern of distribution of the cells positive for these cell cycle markers. Adenomas showed a slight distortion of this pattern: the geminin-positive cells, indicative of S/G2/M phase, were localized in the upper one-third of the crypts. In neoplastic mucosa, the marker expression pattern was disorganized. Compared with normal mucosa, colorectal neoplasms showed an increased proportion of geminin-positive cells and decreased percentages of cdt1-positive cells (G1 phase). However, we did not find a significant difference in the expression pattern between adenomas and carcinomas. Cellular proportions were correlated with clinicopathological parameters such as microscopic vascular invasion and pT stage. In cases of preoperative adjuvant therapy, the proportion of geminin-positive cells decreased, whereas that of γH2A.X-positive cells (indicative of apoptosis/degeneration) increased significantly. We believe that this novel method can be applied to clinical samples to evaluate cell cycle kinetics and the effects of preoperative adjuvant therapy in colorectal cancers.

  13. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3–/4– Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6]3-/4- process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.
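
    For reference, the Butler-Volmer relation that all of the compared analysis methods assume can be written down directly; the sketch below evaluates the net current at a few overpotentials, with every parameter value chosen purely for illustration.

```python
import numpy as np

def butler_volmer_current(E, E0=0.0, k0=1e-2, alpha=0.5, c=1.0, A=1.0, T=298.15):
    """Butler-Volmer surface kinetics: net current for equal bulk
    concentrations of both redox forms (a simplified, mass-transport-free
    sketch; all parameter values here are hypothetical)."""
    F, R = 96485.33, 8.314
    eta = E - E0   # overpotential relative to the reversible potential
    return F * A * k0 * c * (np.exp((1 - alpha) * F * eta / (R * T))
                             - np.exp(-alpha * F * eta / (R * T)))

for eta_mV in (-50, 0, 50):
    print(eta_mV, "mV ->", f"{butler_volmer_current(eta_mV / 1000.0):.3e} A")
```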

  14. An automated method for accurate vessel segmentation

    Science.gov (United States)

    Yang, Xin; Liu, Chaoyue; Le Minh, Hung; Wang, Zhiwei; Chien, Aichi; Cheng, Kwang-Ting (Tim)

    2017-05-01

    Vessel segmentation is a critical task for various medical applications, such as diagnosis assistance of diabetic retinopathy, quantification of cerebral aneurysm growth, and guiding surgery in neurosurgical procedures. Despite technological advances in image segmentation, existing methods still suffer from low accuracy for vessel segmentation in two challenging yet common scenarios in clinical usage: (1) regions with a low signal-to-noise ratio (SNR), and (2) vessel boundaries disturbed by adjacent non-vessel pixels. In this paper, we present an automated system which can achieve highly accurate vessel segmentation for both 2D and 3D images even under these challenging scenarios. Three key contributions achieved by our system are: (1) a progressive contrast enhancement method to adaptively enhance the contrast of challenging pixels that were otherwise indistinguishable, (2) a boundary refinement method to effectively improve segmentation accuracy at vessel borders based on Canny edge detection, and (3) a content-aware region-of-interest (ROI) adjustment method to automatically determine the locations and sizes of ROIs which contain ambiguous pixels and demand further verification. Extensive evaluation of our method is conducted on both 2D and 3D datasets. On a public 2D retinal dataset (named DRIVE (Staal 2004 IEEE Trans. Med. Imaging 23 501-9)) and our 2D clinical cerebral dataset, our approach achieves superior performance to the state-of-the-art methods including a vesselness based method (Frangi 1998 Int. Conf. on Medical Image Computing and Computer-Assisted Intervention) and an optimally oriented flux (OOF) based method (Law and Chung 2008 European Conf. on Computer Vision). An evaluation on 11 clinical 3D CTA cerebral datasets shows that our method can achieve 94% average accuracy with respect to the manual segmentation reference, which is 23% to 33% better than the five baseline methods (Yushkevich 2006 Neuroimage 31 1116-28; Law and Chung 2008
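
    The Canny-based boundary refinement step can be illustrated on a synthetic vessel-like image; the global contrast stretch below is a simple stand-in for the paper's progressive, adaptive enhancement, and all parameters are hypothetical.

```python
import numpy as np
from skimage import feature

rng = np.random.default_rng(10)

# Hypothetical 2D angiogram patch: a bright vessel-like band on a noisy background
img = rng.normal(0.2, 0.05, (128, 128))
img[:, 60:68] += 0.6

# Contrast enhancement (a simple global stretch standing in for the paper's
# progressive adaptive scheme), then Canny edges for boundary refinement
enhanced = (img - img.min()) / (img.max() - img.min())
edges = feature.canny(enhanced, sigma=2.0)
print("edge pixels found:", int(edges.sum()))
```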

  15. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  16. Developing Automated Methods of Waste Sorting

    Energy Technology Data Exchange (ETDEWEB)

    Shurtliff, Rodney Marvin

    2002-08-01

    The U.S. Department of Energy (DOE) analyzed the complex-wide need for remote and automated technologies as they relate to the treatment and disposal of mixed wastes. This analysis revealed that several DOE sites need the capability to open drums containing waste, visually inspect and sort the contents, and finally repackage the containers that are acceptable at a waste disposal facility such as the Waste Isolation Pilot Plant (WIPP) in New Mexico. Conditioning contaminated waste so that it is compatible with the WIPP criteria for storage is an arduous task, whether the waste is contact-handled (waste having radioactivity levels below 200 mrem/hr) or remote-handled. Currently, WIPP non-compliant items are removed from the waste stream manually, at a rate of about one 55-gallon drum per day. Issues relating to contamination-based health hazards as well as repetitive-motion health hazards are steering industry towards a more user-friendly method of conditioning or sorting waste.

  17. [Automation of chemical analysis in enology].

    Science.gov (United States)

    Dubernet, M

    1978-01-01

    Automated assays have only recently appeared in oenology laboratories. The first research on the automation of routine manual analyses was completed by the I.N.R.A. Station of Dijon during the years 1969-1972. Further research followed, and in 1974 the first automatic analysers appeared in application laboratories. In all cases the continuous flow method was used. The first assays carried out were volatile acidity, residual sugars and total SO2, at a throughput of 30 samples an hour. An original approach for free SO2 was then suggested. At present, about a dozen laboratories in France use these assays. Automation of the ethanol assay, very important in oenology, is very difficult to achieve; a new method using a thermometric analyser is being tested. Research on many other assays, such as tartaric, malic and lactic acids, glucose, fructose and glycerol, has been performed, especially by the I.N.R.A. Station in Narbonne, but these assays are not in routine use and at present no laboratory applies them. The equipment cost and depreciation, the replacement of traditional assays by automated methods, and the level of knowledge required of operators are now well known. The reproducibility and accuracy of continuous flow automatic assays allow laboratories of sufficient size to perform the increasing number of analyses necessary for wine quality control.

  18. Optimization of automation: III. Development of optimization method for determining automation rate in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Kim, Man Cheol; Seong, Poong Hyun

    2016-01-01

    Highlights:
    • We propose an appropriate automation rate that enables the best human performance.
    • We analyze the shortest working time considering Situation Awareness Recovery (SAR).
    • The optimized automation rate is estimated by integrating the automation and ostracism rate estimation methods.
    • The process to derive the optimized automation rate is demonstrated through case studies.
    - Abstract: Automation has been introduced in various industries, including the nuclear field, because it is commonly believed that automation promises greater efficiency, lower workloads, and fewer operator errors through enhancing operator and system performance. However, the excessive introduction of automation has deteriorated operator performance due to side effects of automation referred to as Out-of-the-Loop (OOTL) problems, and this is a critical issue that must be resolved. Thus, in order to determine the optimal level of automation that assures the best human operator performance, a quantitative method for optimizing automation is proposed in this paper. To derive appropriate automation levels, the automation rate and ostracism rate, which quantitatively capture the positive and negative effects of automation, respectively, are integrated. The integration derives the shortest working time through the concept of situation awareness recovery (SAR), under which the automation rate with the shortest working time assures the best human performance. The process to derive the optimized automation rate is demonstrated through an emergency operation scenario-based case study. In this case study, four types of procedures are assumed by redesigning the original emergency operating procedure according to the introduced automation and ostracism levels. Using the

  19. Automated metabolic gas analysis systems: a review.

    Science.gov (United States)

    Macfarlane, D J

    2001-01-01

    The use of automated metabolic gas analysis systems or metabolic measurement carts (MMC) in exercise studies is common throughout the industrialised world. They have become essential tools for diagnosing many hospital patients, especially those with cardiorespiratory disease. Moreover, the measurement of maximal oxygen uptake (VO2max) is routine for many athletes in fitness laboratories and has become a de facto standard in spite of its limitations. The development of metabolic carts has also facilitated the noninvasive determination of the lactate threshold and cardiac output, respiratory gas exchange kinetics, as well as studies of outdoor activities via small portable systems that often use telemetry. Although the fundamental principles behind the measurement of oxygen uptake (VO2) and carbon dioxide production (VCO2) have not changed, the techniques used have, and indeed, some have almost turned through a full circle. Early scientists often employed a manual Douglas bag method together with separate chemical analyses, but the need for faster and more efficient techniques fuelled the development of semi-automated and fully automated systems by private and commercial institutions. Yet recently some scientists are returning to the traditional Douglas bag or Tissot-spirometer methods, or are using less complex automated systems, not only to save capital costs but also to have greater control over the measurement process. Over the last 40 years, a considerable number of automated systems have been developed, with over a dozen commercial manufacturers producing in excess of 20 different automated systems. The validity and reliability of all these different systems is not well known, with relatively few independent studies having been published in this area. For comparative studies to be possible and to facilitate greater consistency of measurements in test-retest or longitudinal studies of individuals, further knowledge about the performance characteristics of these

  20. A Systematic, Automated Network Planning Method

    DEFF Research Database (Denmark)

    Holm, Jens Åge; Pedersen, Jens Myrup

    2006-01-01

    This paper describes a case study conducted to evaluate the viability of a systematic, automated network planning method. The motivation for developing the network planning method was that many data networks are planned in an ad-hoc manner, with no assurance of quality of the solution with respect to consistency and long-term characteristics. The developed method gives significant improvements on these parameters. The case study was conducted as a comparison between an existing network where the traffic was known and a proposed network designed by the developed method. It turned out that the proposed … structures, which are ready to implement in a real-world scenario, are discussed at the end of the paper. These are in the area of ensuring line independence and the complexity of the design rules for the planning method.

  1. Distribution system analysis and automation

    CERN Document Server

    Gers, Juan

    2013-01-01

    A comprehensive guide to techniques that allow engineers to simulate, analyse, and optimise power distribution systems, which, combined with automation, underpin the emerging concept of the "smart grid". The book supports theoretical concepts with real-world applications and MATLAB exercises.

  2. An investigation of automated activation analysis

    International Nuclear Information System (INIS)

    Kuykendall, William E. Jr.; Wainerdi, Richard E.

    1962-01-01

    A study has been made of the possibility of applying computer techniques to the resolution of data from the complex gamma-ray spectra obtained in non-destructive activation analysis. The primary objective has been to use computer data-handling techniques to allow the existing analytical method to be used for rapid, routine, sensitive and economical elemental analyses. The necessary conditions for the satisfactory application of automated activation analysis have been evaluated and a computer programme has been completed which will process the data from samples containing a large number of different elements. To illustrate the speed of the handling sequence, the data from a sample containing four component elements can be processed in a matter of minutes, with the speed of processing limited primarily by the speed of the output printer. (author) [fr

  3. Automated analysis of slitless spectra. II. Quasars

    International Nuclear Information System (INIS)

    Edwards, G.; Beauchemin, M.; Borra, F.

    1988-01-01

    Automated software has been developed to process slitless spectra. The software, described in a previous paper, automatically separates stars from extended objects and quasars from stars. This paper describes the quasar search techniques and discusses the results. The performance of the software is compared and calibrated with a plate taken in a region of SA 57 that has been extensively surveyed by others using a variety of techniques: the proposed automated software performs very well. It is found that an eye search of the same plate is less complete than the automated search: surveys that rely on eye searches suffer from incompleteness beginning at least a magnitude brighter than the plate limit. It is shown how the complete automated analysis of a plate and computer simulations are used to calibrate and understand the characteristics of the present data. 20 references

  4. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

    This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  5. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  6. Development of automated system of heavy water analysis

    International Nuclear Information System (INIS)

    Fedorchenko, O.A.; Novozhilov, V.A.; Trenin, V.D.

    1993-01-01

    Application of traditional methods of qualitative and quantitative control of coolant (moderator) for the analysis of heavy water with high tritium content presents many difficulties and an inevitable accumulation of wastes that many facilities will not accept. This report describes an automated system for heavy water sampling and analysis

  7. Automated Technology for Verificiation and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis, held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure…

  8. The light transmission method of automated track scanning

    International Nuclear Information System (INIS)

    Gold, Raymond; Roberts, J.H.

    2000-01-01

    An empirical method of automated track scanning is described. This new method is based on the measurement of light transmission (LT) through solid-state track recorders (SSTR). Fission fragment tracks in mica SSTR are used to demonstrate the utility of this method. Data analysis reveals that the LT method is equivalent to a point sampling method at the approximately 2% (1σ) uncertainty level of the calibration data. The total (1σ) uncertainty of the LT method decreases with increasing fission density, from approximately 5% at a fission density of 4.0E+06 fissions/cm² down to approximately 2.5% at a fission density of 1.2E+07 fissions/cm². The current stage of development permits only a qualitative comparison of the LT and point sampling methods. Recommendations to refine the LT method are advanced, with emphasis on processing procedures for mica SSTR

  9. Principles and methods for automated palynology.

    Science.gov (United States)

    Holt, K A; Bennett, K D

    2014-08-01

    Pollen grains are microscopic so their identification and quantification has, for decades, depended upon human observers using light microscopes: a labour-intensive approach. Modern improvements in computing and imaging hardware and software now bring automation of pollen analyses within reach. In this paper, we provide the first review in over 15 yr of progress towards automation of the part of palynology concerned with counting and classifying pollen, bringing together literature published from a wide spectrum of sources. We consider which attempts offer the most potential for an automated palynology system for universal application across all fields of research concerned with pollen classification and counting. We discuss what is required to make the datasets of these automated systems as acceptable as those produced by human palynologists, and present suggestions for how automation will generate novel approaches to counting and classifying pollen that have hitherto been unthinkable.

  10. Non-destructive phenotypic analysis of early stage tree seedling growth using an automated stereovision imaging method

    Directory of Open Access Journals (Sweden)

    Antonio Montagnoli

    2016-10-01

    A plant phenotyping approach was applied to evaluate the growth rate of containerized tree seedlings during the precultivation phase following seed germination. A simple and affordable stereo optical system was used to collect stereoscopic RGB images of seedlings at regular intervals of time. Comparative analysis of these images by means of newly developed software enabled us to calculate (a) the increments of seedling height and (b) the percentage greenness of seedling leaves. Comparison of these parameters with destructive biomass measurements showed that the height trait can be used to estimate seedling growth for needle-leaved plant species, whereas the greenness trait can be used for broad-leaved plant species. Despite the need to adjust for plant type, growth stage, and light conditions, this new, cheap, rapid, and sustainable phenotyping approach can be used to study large-scale phenome variations due to genome variability and interaction with environmental factors.

  11. Automation method to identify the geological structure of seabed using spatial statistic analysis of echo sounding data

    Science.gov (United States)

    Kwon, O.; Kim, W.; Kim, J.

    2017-12-01

    Recently, construction of subsea tunnels has increased globally. For the safe construction of a subsea tunnel, identifying geological structures, including faults, at the design and construction stages is critically important. Unlike tunnels on land, it is very difficult to obtain data on geological structures because of the limits of geological surveys at sea. This study is intended to address such difficulties by developing technology to identify the geological structure of the seabed automatically using echo sounding data. When investigating a potential site for a deep subsea tunnel, borehole and geophysical investigations face technical and economic limits. In contrast, echo sounding data are easily obtainable, and their reliability is higher compared to the above approaches. This study is aimed at developing an algorithm that identifies large-scale geological structures of the seabed using a geostatistical approach, and is based on the structural-geology principle that topographic features indicate geological structure. The basic concept of the algorithm is as follows: (1) convert the seabed topography to grid data using echo sounding data, (2) apply a moving window of optimal size to the grid data, (3) estimate the spatial statistics of the grid data in the window area, (4) set a percentile standard for the spatial statistics, (5) display the values satisfying the standard on the map, and (6) visualize the geological structure on the map. The important elements of this study include the optimal size of the moving window, the choice of optimal spatial statistics, and the determination of the optimal percentile standard. To determine these optimal elements, numerous simulations were run. Eventually, a user program based on R was developed using the optimal analysis algorithm. The user program was designed to identify the variations of various spatial statistics, which leads to easy analysis of geological structure depending on the variation of spatial statistics
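
    Steps (1)-(5) of the algorithm above lend themselves to a compact implementation. A minimal Python sketch, assuming the echo sounding data have already been interpolated onto a regular grid and using a local standard deviation as the example spatial statistic (the window size, percentile, and file name are assumptions):

        import numpy as np
        from scipy.ndimage import generic_filter

        depth = np.load("bathymetry_grid.npy")             # (1) gridded seabed depths, hypothetical file
        local_std = generic_filter(depth, np.std, size=9)  # (2)-(3) moving-window spatial statistic
        threshold = np.percentile(local_std, 95)           # (4) percentile standard
        structure_mask = local_std >= threshold            # (5) cells flagged as likely structure
        # (6) structure_mask can now be drawn over the bathymetry map to visualize lineaments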

  12. Automated multi-filtration cleanup with nitrogen-enriched activated carbon material as pesticide multi-residue analysis method in representative crop commodities.

    Science.gov (United States)

    Qin, Yuhong; Zhang, Jingru; Li, Yifan; Wang, Qiuxiao; Wu, Yangliu; Xu, Lanshu; Jin, Xiaojuan; Pan, Canping

    2017-09-15

    An automated multi-filtration cleanup (Auto m-FC) method with a nitrogen-enriched activated carbon material, based on modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts, was developed. It was applied to pesticide multi-residue analysis in six representative crop commodities. The automated device was aimed at improving cleanup efficiency and reducing the manual workload in the cleanup step. By controlling extract volume, flow rate, and Auto m-FC cycles, the device could complete the cleanup process accurately. In this work, nitrogen-enriched activated carbon mixed with alternative sorbents and anhydrous magnesium sulfate (MgSO4) was packed in a column for Auto m-FC, followed by liquid chromatography with tandem mass spectrometric (LC-MS/MS) detection. This newly developed carbon material showed excellent cleanup performance. It was validated by analyzing 23 pesticides in six representative matrices spiked at two concentration levels of 10 and 100 μg/kg. The water addition volume, salts, sorbents, and the Auto m-FC procedure, including the flow rate and the number of Auto m-FC cycles for each matrix, were optimized. Three general Auto m-FC methods were then introduced for high-water-content, high-oil- and high-starch-content, and difficult commodities. Spike recoveries were between 82 and 106%, with 1-14% RSD, for all analytes in the tested matrices. Matrix-matched calibrations were performed, with coefficients of determination over 0.997 between concentration levels of 10 and 1000 μg/kg. The developed method was successfully applied to the determination of pesticide residues in market samples. Copyright © 2017. Published by Elsevier B.V.

  13. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  14. Semi-Automated method of analysis of horizontal histological sections of skin for objective evaluation of fractional devices.

    Science.gov (United States)

    Zelickson, Brian D; Walgrave, Susan E; Al-Arashi, Munir Yahya H; Altshuler, Gregory B; Yaroslavsky, Ilya V; Childs, James J; Cohen, Rich H; Erofeev, Andrei V; Depina, Erminaldo F; Smirnov, Mikhail Z; Kist, David A; Tabatadze, David R

    2009-11-01

    The treatment of skin with fractional devices creates columns of micro-ablation or micro-denaturation depending on the device. Since the geometric profiles of thermal damage depend on the treatment parameters or physical properties of the treated tissue, the size of these columns may vary from a few microns to a few millimeters. For objective evaluation of the damage profiles generated by fractional devices, this report describes an innovative and efficient method of processing and evaluating horizontal sections of skin using a novel software program. Ex vivo porcine skin was treated with the Lux1540/10, Lux1540 Zoom and Lux2940 with 500 optics. Horizontal (radial) sections of biopsies were obtained and processed with H&E and NBTC staining. Digital images of the histologic sections were taken in either transmission or reflection illumination and were processed using the SAFHIR program. NBTC- and H&E-stained horizontal sections of ex vivo skin treated with ablative and non-ablative fractional devices were obtained. Geometric parameters, such as depth, diameter, and width of the coagulated layer (if applicable), and micro-columns of thermal damage, were evaluated using the SAFHIR software. The feasibility of objective comparison of the performance of two different fractional devices was demonstrated. The proposed methodology provides a comprehensive, objective, and efficient approach for the comparison of various fractional devices. Correlation of device settings with the objective dimensions of post-treatment damage profiles serve as a powerful tool for the prediction and modulation of clinical response. Copyright 2009 Wiley-Liss, Inc.

  15. Validation of a semi-automated multi-component method using protein precipitation LC-MS-MS for the analysis of whole blood samples

    DEFF Research Database (Denmark)

    Slots, Tina

    BACKGROUND: Solid phase extraction (SPE) is one of many multi-component methods, but it can be very time-consuming and labour-intensive. Protein precipitation is, on the other hand, a much simpler and faster sample pre-treatment than SPE, and protein precipitation also has the ability to cover a wider range of components. AIM: The aim was to develop a robust semi-automated analytical method for whole blood samples based on a protein precipitation method already used in the lab (Sørensen and Hasselstrøm, 2013). The setup should improve the speed, robustness, and reliability of ante- and post…

  16. Optimal caliper placement: manual vs automated methods.

    Science.gov (United States)

    Yazdi, B; Zanker, P; Wanger, P; Sonek, J; Pintoffl, K; Hoopmann, M; Kagan, K O

    2014-02-01

    To examine the inter- and intra-operator repeatability of manual placement of callipers in the assessment of basic biometric measurements and to compare the results to an automated calliper placement system. Stored ultrasound images of 95 normal fetuses between 19 and 25 weeks' gestation were used. Five operators (two experts, one resident and two students) were asked to measure the BPD, OFD and FL twice, both manually and automatically. For each operator, intra-operator repeatability of the manual and automated measurements was assessed by the within-operator standard deviation. For the assessment of inter-operator repeatability, the mean of the four manual measurements by the two experts was used as the gold standard. The relative bias of the manual measurements of the three non-expert operators and of the operator-independent automated measurement was compared with the gold-standard measurement using means and 95% confidence intervals. In 88.4% of the 95 cases, the automated measurement algorithm was able to obtain appropriate measurements of the BPD, OFD, AC and FL. Within-operator standard deviations of the manual measurements ranged between 0.15 and 1.56, irrespective of the experience of the operator. Using the automated biometric measurement system, there was no difference between the measurements of each operator. As far as inter-operator repeatability is concerned, the difference between the manual measurements of the two students, the resident, and the gold standard was between -0.10 and 2.53 mm. The automated measurements tended to be closer to the gold standard but did not reach statistical significance. In about 90% of the cases, it was possible to obtain basic biometric measurements with an automated system. The use of automated measurements resulted in a significant improvement of the intra-operator but not of the inter-operator repeatability, and measurements were not significantly closer to the gold standard of expert examiners.
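
    The within-operator standard deviation used above has a standard closed form for paired replicates: s_w = sqrt(sum(d_i^2) / 2n), where d_i is the difference between one operator's two repeated measurements on case i. A minimal Python sketch with a hypothetical data file:

        import numpy as np

        reps = np.loadtxt("bpd_operator1.csv", delimiter=",")  # shape (n, 2): two repeats per fetus (hypothetical)
        d = reps[:, 0] - reps[:, 1]                            # paired differences
        within_sd = np.sqrt(np.sum(d ** 2) / (2 * len(d)))     # within-operator SD, mm
        print(f"within-operator SD = {within_sd:.2f} mm")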

  17. Systems Analysis as a Prelude to Library Automation

    Science.gov (United States)

    Carter, Ruth C.

    1973-01-01

    Systems analysis, as a prelude to library automation, is an inevitable commonplace fact of life in libraries. Maturation of library automation and the systems analysis which precedes its implementation is observed in this article. (55 references) (Author/TW)

  18. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is common practice in the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution-handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized, tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini-column containing a Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped-flow detection can be performed to improve detection precision for low-activity samples. The authors' current research activities are focused on expansion of radiochemical applications of the FIA methodology, with an ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as the separation of actinide elements.
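
    Quantification from the transient signal reduces to integrating the detector trace over the eluting zone and applying a calibration factor derived from standards. A minimal Python sketch; the file name, baseline window, and calibration factor are assumptions, not values from the work above:

        import numpy as np

        t, counts = np.loadtxt("sr90_transient.csv", delimiter=",", unpack=True)  # hypothetical trace
        baseline = np.median(counts[:20])                             # pre-peak baseline estimate
        net = np.clip(counts - baseline, 0, None)                     # baseline-corrected signal
        peak_area = np.sum(0.5 * (net[1:] + net[:-1]) * np.diff(t))   # trapezoidal integration
        k = 1.8e-2                                                    # Bq per unit area, from standards (hypothetical)
        print(f"90Sr activity ~ {k * peak_area:.2f} Bq")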

  19. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
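
    The record does not specify the "linguistic algebra", but the idea of combining qualitative scores without converting them to spurious numbers can be illustrated with a lookup table over linguistic levels. A minimal Python sketch; the scale and table are hypothetical stand-ins, not the procedure's actual algebra:

        RISK = {  # risk = f(vulnerability, impact); illustrative table, one entry per linguistic pair
            ("low", "low"): "low",     ("low", "medium"): "low",       ("low", "high"): "medium",
            ("medium", "low"): "low",  ("medium", "medium"): "medium", ("medium", "high"): "high",
            ("high", "low"): "medium", ("high", "medium"): "high",     ("high", "high"): "high",
        }

        def combine(vulnerability: str, impact: str) -> str:
            """Combine qualitative scores for one threat-target pair."""
            return RISK[(vulnerability, impact)]

        print(combine("medium", "high"))  # -> high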

  20. A novel 3D method of locomotor analysis in adult zebrafish: Implications for automated detection of CNS drug-evoked phenotypes.

    Science.gov (United States)

    Stewart, Adam Michael; Grieco, Fabrizio; Tegelenbosch, Ruud A J; Kyzar, Evan J; Nguyen, Michael; Kaluyeva, Aleksandra; Song, Cai; Noldus, Lucas P J J; Kalueff, Allan V

    2015-11-30

    Expanding the spectrum of organisms to model human brain phenotypes is critical for our improved understanding of the pathobiology of neuropsychiatric disorders. Given the clear limitations of existing mammalian models, there is an urgent need for low-cost, high-throughput in-vivo technologies for drug and gene discovery. Here, we introduce a new automated method for generating 3D (X,Y,Z) swim trajectories in adult zebrafish (Danio rerio), to improve their neurophenotyping. Based on the Track3D module of EthoVision XT video tracking software (Noldus Information Technology), this tool enhances the efficient, high-throughput 3D analyses of zebrafish behavioral responses. Applied to adult zebrafish behavior, this 3D method is highly sensitive to various classes of psychotropic drugs, including selected psychostimulant and hallucinogenic agents. Our present method offers a marked advance in the existing 2D and 3D methods of zebrafish behavioral phenotyping, minimizing research time and recording high-resolution, automatically synchronized videos with calculated, high-precision object positioning. Our novel approach brings practical simplicity and 'integrative' capacity to the often complex and error-prone quantification of zebrafish behavioral phenotypes. Illustrating the value of 3D swim path reconstructions for identifying experimentally-evoked phenotypic profiles, this method fosters innovative, ethologically relevant, and fully automated small molecule screens using adult zebrafish. Copyright © 2015 Elsevier B.V. All rights reserved.
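
    Once 3D (X,Y,Z) trajectories are available, standard locomotor endpoints follow directly from the coordinate series. A minimal Python sketch of a few such endpoints; the file name, units, and frame rate are assumptions:

        import numpy as np

        xyz = np.loadtxt("zebrafish_track3d.csv", delimiter=",")  # (n, 3) positions in cm (hypothetical)
        fps = 30.0
        steps = np.linalg.norm(np.diff(xyz, axis=0), axis=1)      # per-frame 3D displacement
        total_distance = steps.sum()                              # swim path length, cm
        mean_velocity = steps.mean() * fps                        # cm/s
        vertical_range = np.ptp(xyz[:, 2])                        # top-to-bottom (Z) exploration
        print(total_distance, mean_velocity, vertical_range)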

  1. Automated Aesthetic Analysis of Photographic Images.

    Science.gov (United States)

    Aydın, Tunç Ozan; Smolic, Aljoscha; Gross, Markus

    2015-01-01

    We present a perceptually calibrated system for automatic aesthetic evaluation of photographic images. Our work builds upon the concepts of no-reference image quality assessment, with the main difference being our focus on rating image aesthetic attributes rather than detecting image distortions. In contrast to recent attempts on highly subjective aesthetic judgment problems, such as binary aesthetic classification and the prediction of an image's overall aesthetics rating, our method aims at providing a reliable objective basis of comparison between aesthetic properties of different photographs. To that end, our system computes perceptually calibrated ratings for a set of fundamental and meaningful aesthetic attributes that together form an "aesthetic signature" of an image. We show that aesthetic signatures can not only be used to improve upon the current state of the art in automatic aesthetic judgment, but also enable interesting new photo editing applications such as automated aesthetic analysis, HDR tone mapping evaluation, and providing aesthetic feedback during multi-scale contrast manipulation.

  2. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  3. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed. An ES-1022 computer and problem-oriented software, "The Description Information Search System," were used for this purpose. The main aspects and sources for forming the system's information fund and the characteristics of the system's information retrieval language are reported, and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search [ru

  4. Automated Program Analysis for Cybersecurity (APAC)

    Science.gov (United States)

    2016-07-14

    Final technical report for the Automated Program Analysis for Cybersecurity (APAC) effort, Five Directions, Inc., July 2016. [Only the report documentation page survives extraction; recoverable fields: contract number FA8750-14-C-0050, program element 61101E, project APAC, author William Arbaugh, performing organization Five Directions, Inc.]

  5. Streamlining and automation of radioanalytical methods at a commercial laboratory

    International Nuclear Information System (INIS)

    Harvey, J.T.; Dillard, J.W.

    1993-01-01

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed

  6. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
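
    The flavor of an identification tree can be conveyed with a toy rule applied to a data-flow-diagram-like model. A minimal Python sketch; the model and the single rule are illustrative assumptions, not AutSEC's actual data structures:

        from dataclasses import dataclass

        @dataclass
        class Flow:
            source: str
            sink: str
            crosses_trust_boundary: bool
            encrypted: bool

        def identify_threats(flows):
            threats = []
            for f in flows:
                # hypothetical identification rule: unencrypted flow across a trust boundary
                if f.crosses_trust_boundary and not f.encrypted:
                    threats.append((f, "tampering / information disclosure"))
            return threats

        flows = [Flow("browser", "api", True, False), Flow("api", "db", False, False)]
        for f, kind in identify_threats(flows):
            print(f"{f.source} -> {f.sink}: {kind}")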

  8. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  9. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe-holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be infeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogeneous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to

  10. Comparison of manual versus automated data collection method for an evidence-based nursing practice study.

    Science.gov (United States)

    Byrne, M D; Jordan, T R; Welle, T

    2013-01-01

    The objective of this study was to investigate and improve the use of automated data collection procedures for nursing research and quality assurance. A descriptive, correlational study analyzed 44 orthopedic surgical patients who were part of an evidence-based practice (EBP) project examining post-operative oxygen therapy at a Midwestern hospital. The automation work attempted to replicate a manually collected data set from the EBP project. Automation was successful in replicating data collection for study data elements that were available in the clinical data repository. The automation procedures identified 32 "false negative" patients who met the inclusion criteria described in the EBP project but were not selected during the manual data collection. Automating data collection for certain data elements, such as oxygen saturation, proved challenging because of workflow and practice variations and the reliance on disparate sources for data abstraction. Automation also revealed instances of human error, including computational and transcription errors as well as incomplete selection of eligible patients. Automated data collection for analysis of nursing-specific phenomena is potentially superior to manual data collection methods. Creation of automated reports and analysis may require an initial up-front investment with collaboration between clinicians, researchers, and information technology specialists who can manage the ambiguities and challenges of research and quality assurance work in healthcare.

  11. A METHOD FOR AUTOMATED ANALYSIS OF 10 ML WATER SAMPLES CONTAINING ACIDIC, BASIC, AND NEUTRAL SEMIVOLATILE COMPOUNDS LISTED IN USEPA METHOD 8270 BY SOLID PHASE EXTRACTION COUPLED IN-LINE TO LARGE VOLUME INJECTION GAS CHROMATOGRAPHY/MASS SPECTROMETRY

    Science.gov (United States)

    Data is presented showing the progress made towards the development of a new automated system combining solid phase extraction (SPE) with gas chromatography/mass spectrometry for the single run analysis of water samples containing a broad range of acid, base and neutral compounds...

  12. Automated analysis of damages for radiation in plastics surfaces

    International Nuclear Information System (INIS)

    Andrade, C.; Camacho M, E.; Tavera, L.; Balcazar, M.

    1990-02-01

    Radiation damage was analyzed in a polymer that, like acrylic, is characterized by the optical properties of polished surfaces, by uniformity, and by chemical resistance; it is resistant up to a temperature of 150 degrees centigrade and weighs approximately half as much as glass. An objective of this work is the development of a method that analyzes, in automated form, the radiation-induced surface damage in plastic materials by means of an image analyzer. (Author)

  13. Automated analysis of brachial ultrasound time series

    Science.gov (United States)

    Liang, Weidong; Browning, Roger L.; Lauer, Ronald M.; Sonka, Milan

    1998-07-01

    Atherosclerosis begins in childhood with the accumulation of lipid in the intima of arteries to form fatty streaks, and advances through adult life, when occlusive vascular disease may result in coronary heart disease, stroke, and peripheral vascular disease. Non-invasive B-mode ultrasound has been found useful in studying risk factors in the symptom-free population. A large amount of data is acquired from continuous imaging of the vessels in a large study population. A high-quality brachial vessel diameter measurement method is necessary so that accurate diameters can be measured consistently in all frames in a sequence and across different observers. Though a human expert has the advantage over automated computer methods in recognizing noise during diameter measurement, manual measurement suffers from inter- and intra-observer variability. It is also time-consuming. An automated measurement method is presented in this paper which utilizes quality assurance approaches to adapt to specific image features and to recognize and minimize noise effects. Experimental results showed the method's potential for clinical usage in epidemiological studies.

  14. Interactive/automated method to count bacterial colonies

    OpenAIRE

    Monteiro, Fernando C.; Ribeiro, J.E.; Martins, Ramiro

    2016-01-01

    The number of colonies in a culture is counted to calculate the concentration of bacteria in the original broth; however, manual counting can be tedious, time-consuming and imprecise. Automation of colony counting has been of increasing interest for many decades, and these methods have been shown to be more consistent than manual counting. Significant limitations of many algorithms used in automated systems are their inability to recognize overlapping colonies as distinct and to count colonie...
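
    The basic counting pipeline such systems automate can be sketched with thresholding plus connected-component labeling. A minimal Python sketch; the file name, threshold rule, and size cutoff are assumptions, and note that this naive version shares the overlapping-colony limitation mentioned above:

        import numpy as np
        from scipy import ndimage

        plate = np.load("plate_gray.npy")                    # grayscale plate image, hypothetical file
        mask = plate > plate.mean() + 2 * plate.std()        # bright colonies on dark agar (assumption)
        mask = ndimage.binary_opening(mask, iterations=2)    # remove specks
        labels, n = ndimage.label(mask)                      # connected components
        sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
        print("colonies:", int(np.sum(sizes > 30)))          # 30-pixel minimum size is an assumed cutoff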

  15. Automated reasoning applications to design analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Given the necessary relationships and definitions of design functions and components, validation of system incarnation (the physical product of design) and sneak function analysis can be achieved via automated reasoners. The relationships and definitions must define the design specification and incarnation functionally. For the design specification, the hierarchical functional representation is based on physics and engineering principles and bounded by design objectives and constraints. The relationships and definitions of the design incarnation are manifested as element functional definitions, state relationship to functions, functional relationship to direction, element connectivity, and functional hierarchical configuration

  16. A fully automated on-line preconcentration and liquid chromatography-tandem mass spectrometry method for the analysis of anti-infectives in wastewaters.

    Science.gov (United States)

    Segura, Pedro A; Gagnon, Christian; Sauvé, Sébastien

    2007-12-05

    We developed and validated a novel on-line preconcentration liquid chromatography-tandem mass spectrometry method for the determination of anti-infectives in wastewaters. The method preconcentrates 1 mL of sample on a load column using a switching-valve technique. The method was optimized with respect to sample load flow rate, volume of the load-column wash, and organic solvent content of the load-column wash. The sample is cleaned using a 30% organic solvent washing step and then gradually eluted to an analytical column for separation. To compensate for matrix effects, quantitation was performed using standard additions. Confirmation of the presence of the detected compounds was done using a second selected reaction monitoring transition. Method intra-day precision was less than 9%, and inter-day precision (%R.S.D.) varied between 2.5 and 23%. Limits of detection for the selected anti-infective compounds ranged from 13 to 61 ng L(-1). All the target anti-infectives were found in the city of Montréal WWTP effluent at concentrations ranging from 71 to 289 ng L(-1). This automated method enables rapid quantitation of these trace contaminants using small sample volumes.
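
    Quantitation by standard additions, as used above, extrapolates a linear fit of response versus spiked concentration back to the x-axis; the magnitude of the x-intercept is the sample concentration. A minimal Python sketch with hypothetical spike levels and peak areas:

        import numpy as np

        added = np.array([0.0, 50.0, 100.0, 200.0])             # ng/L spiked into sample aliquots
        response = np.array([1210.0, 2050.0, 2890.0, 4560.0])   # peak areas (hypothetical readings)
        slope, intercept = np.polyfit(added, response, 1)       # linear fit: response = slope*added + intercept
        c0 = intercept / slope                                   # |x-intercept| = sample concentration, ng/L
        print(f"sample concentration ~ {c0:.0f} ng/L")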

  17. Optimization of automation: I. Estimation method of cognitive automation rates reflecting the effects of automation on human operators in nuclear power plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Kim, Jong Hyun; Seong, Poong Hyun

    2014-01-01

    Highlights: • We propose an estimation method for the automation rate that takes the advantages of automation as the estimation measures. • We conduct experiments to examine the validity of the suggested method. • The higher the cognitive automation rate, the greater the reduction in working time. • The usefulness of the suggested estimation method is proved by statistical analyses. - Abstract: Since automation was introduced in various industrial fields, the concept of the automation rate has been used to indicate the inclusion proportion of automation among all work processes or facilities. The inclusion proportion of automation might be expected to express the degree of enhancement of human performance as well. However, many researchers have found that a high automation rate does not guarantee high performance. Therefore, to reflect the effects of automation on human performance, this paper proposes a new estimation method for the automation rate that considers the effects of automation on human operators in nuclear power plants (NPPs). Automation in NPPs can be divided into two types: system automation and cognitive automation. Some general descriptions and characteristics of each type of automation are provided, and the advantages of automation are investigated. The advantages of each type of automation are used as measures in the estimation method for the automation rate. One advantage was found to be a reduction in the number of tasks, and another was a reduction in human cognitive task loads. The system and cognitive automation rates were proposed as quantitative measures by taking advantage of the aforementioned benefits. To quantify the required human cognitive task loads and thus derive the cognitive automation rate, Conant's information-theory-based model was applied. The validity of the suggested method, especially as regards the cognitive automation rate, was proven by conducting
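
    Under the simplest reading of the two measures above, each rate is the fraction of the manual baseline removed by automation: counted tasks for the system automation rate, and a Conant-style information-theoretic task load for the cognitive automation rate. A minimal Python sketch; the counts and bit values are hypothetical, not the paper's model or data:

        manual_tasks, tasks_left = 40, 25                    # operator tasks before/after automation (hypothetical)
        system_automation_rate = 1 - tasks_left / manual_tasks

        manual_load, load_left = 120.0, 66.0                 # cognitive task loads in bits (hypothetical)
        cognitive_automation_rate = 1 - load_left / manual_load

        print(f"system: {system_automation_rate:.2f}, cognitive: {cognitive_automation_rate:.2f}")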

  18. Automated quantification of budding Saccharomyces cerevisiae using a novel image cytometry method.

    Science.gov (United States)

    Laverty, Daniel J; Kury, Alexandria L; Kuksin, Dmitry; Pirani, Alnoor; Flanagan, Kevin; Chan, Leo Li-Ying

    2013-06-01

    The measurements of concentration, viability, and budding percentages of Saccharomyces cerevisiae are performed on a routine basis in the brewing and biofuel industries. Generation of these parameters is of great importance in a manufacturing setting, where they can aid in the estimation of product quality, quantity, and fermentation time of the manufacturing process. Specifically, budding percentages can be used to estimate the reproduction rate of yeast populations, which directly correlates with metabolism of polysaccharides and bioethanol production, and can be monitored to maximize production of bioethanol during fermentation. The traditional method involves manual counting using a hemacytometer, but this is time-consuming and prone to human error. In this study, we developed a novel automated method for the quantification of yeast budding percentages using Cellometer image cytometry. The automated method utilizes a dual-fluorescent nucleic acid dye to specifically stain live cells for imaging analysis of unique morphological characteristics of budding yeast. In addition, cell cycle analysis is performed as an alternative method for budding analysis. We were able to show comparable yeast budding percentages between manual and automated counting, as well as cell cycle analysis. The automated image cytometry method is used to analyze and characterize corn mash samples directly from fermenters during standard fermentation. Since concentration, viability, and budding percentages can be obtained simultaneously, the automated method can be integrated into the fermentation quality assurance protocol, which may improve the quality and efficiency of beer and bioethanol production processes.
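
    The three routine parameters named above follow directly from per-cell counts once the imaging step has made the live/dead and budding calls. A minimal Python sketch; all counts and the imaged volume are hypothetical:

        live, dead, budding = 498, 22, 161        # per-sample cell counts (hypothetical)
        imaged_volume_ml = 4.0e-4                 # chamber volume captured by the images (assumption)
        total = live + dead
        concentration = total / imaged_volume_ml  # cells/mL
        viability_pct = 100.0 * live / total
        budding_pct = 100.0 * budding / live      # budding measured among live cells
        print(f"{concentration:.2e} cells/mL, {viability_pct:.1f}% viable, {budding_pct:.1f}% budding")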

  19. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  20. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, from the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open-source, interactive tool designed to simplify and greatly accelerate spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving them considerable flexibility to assign, manipulate, interpret, and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, lists of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, control is then handed over to the user, who can choose to accept, decline, or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed, is among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit, or update catalog or experiment entries.

  1. A review on automated pavement distress detection methods

    NARCIS (Netherlands)

    Coenen, Tom B.J.; Golroo, Amir

    2017-01-01

    In recent years, extensive research has been conducted on pavement distress detection. A large part of these studies applied automated methods to capture different distresses. In this paper, a literature review on the distresses and related detection methods is presented. This review also includes

  2. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduced a systematic approach for planning access network infrastructure. The GIS data and a set of algorithms were employed to make the planning process more automatic. The method explains...

  3. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of the total phosphorus by a Kjeldahl digestion method and an automated colorimetric finish that includes dialysis

    Science.gov (United States)

    Patton, Charles J.; Truitt, Earl P.

    1992-01-01

    A method to determine total phosphorus (TP) in the same digests prepared for total Kjeldahl nitrogen (TKN) determinations is described. The batch, high-temperature (block digester), Hg(II)-catalyzed digestion step is similar to U.S. Geological Survey methods I-2552-85/I-4552-85 and U.S. Environmental Protection Agency method 365.4, except that sample and reagent volumes are halved. Prepared digests are desolvated at 220 degrees Celsius and digested at 370 degrees Celsius in separate block digesters set at these temperatures, rather than in a single, temperature-programmed block digester. This approach is used in the method described here, which permits 40 calibrants, reference waters, and samples to be digested and resolvated in about an hour. Orthophosphate ions originally present in samples, along with those released during the digestion step, are determined colorimetrically at a rate of 90 tests per hour by an automated version of the phosphoantimonylmolybdenum blue procedure. About 100 microliters of digest are required per determination. The upper concentration limit is 2 milligrams per liter (mg/L), with a method detection limit of 0.01 mg/L. Repeatability for a sample containing approximately 1.6 mg/L of TP in a high suspended-solids matrix is 0.7 percent. Between-day precision for the same sample is 5.0 percent. A dialyzer in the air-segmented continuous-flow analyzer provides on-line digest cleanup, eliminating particulates that otherwise would interfere with the colorimetric finish. A single-channel analyzer can process the resolvated digests from two pairs of block digesters each hour. Paired t-test analysis of TP concentrations for approximately 1,600 samples determined by the new method (U.S. Geological Survey methods I-2610-91 and I-4610-91) and the old method (U.S. Geological Survey methods I-2600-85 and I-4600-85) revealed a positive bias in the former of 0.02 to 0.04 mg/L for surface-water samples, in agreement with previous studies. Concentrations of total
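
    The automated colorimetric finish implies a linear calibration of absorbance against the phosphorus calibrants, then inversion of that fit for each sample. A minimal Python sketch; the calibrant levels and absorbance readings are hypothetical:

        import numpy as np

        conc = np.array([0.0, 0.25, 0.5, 1.0, 2.0])              # mg/L P calibrants (hypothetical levels)
        absorb = np.array([0.002, 0.101, 0.198, 0.401, 0.803])   # hypothetical absorbance readings
        m, b = np.polyfit(conc, absorb, 1)                       # linear calibration: A = m*C + b
        sample_abs = 0.322
        print(f"TP ~ {(sample_abs - b) / m:.2f} mg/L")           # invert the fit for the sample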

  4. An automated and simple method for brain MR image extraction

    Directory of Open Access Journals (Sweden)

    Zhu Zixin

    2011-09-01

    Background: The extraction of brain tissue from magnetic resonance head images is an important image processing step for the analysis of neuroimage data. The authors have developed an automated and simple brain extraction method using an improved geometric active contour model. Methods: The method uses an improved geometric active contour model which can not only solve the boundary leakage problem but is also less sensitive to intensity inhomogeneity. The method defines the initial function as a binary level set function to improve computational efficiency. The method is applied both to our data and to Internet brain MR data provided by the Internet Brain Segmentation Repository. Results: The results obtained from our method are compared with manual segmentation results using multiple indices. In addition, the method is compared to two popular methods, Brain Extraction Tool and Model-based Level Set. Conclusions: The proposed method can provide automated and accurate brain extraction results with high efficiency.

  5. Automated method of processing video data from track detectors

    Science.gov (United States)

    Aleksandrov, A. B.; Goncharova, L. A.; Davydov, D. A.; Publichenko, P. A.; Roganova, T. M.; Polukhina, N. G.; Feinberg, E. L.

    2007-10-01

    New automated methods significantly simplify and accelerate the processing of data from emulsion detectors. In addition to acceleration, automation of measurements allows large files of experimental data to be processed and sufficient statistics to be accumulated. It also gives impetus to the development of projects for new experiments with large-volume targets and emulsions and large-area solid-state track detectors. In this regard, the problem of increasing the number of scientists with the level of training required to operate automated equipment of this class becomes urgent. Every year, ten Moscow students master the new methods working at the P. N. Lebedev Institute of Physics of the Russian Academy of Sciences with the PAVIKOM fully automated measuring complex [1-3]. Most students now engaged in high-energy physics gain a notion only of outdated manual methods of processing data from track detectors. In 2005, a new practical course on the determination of the energy of neutrons transmitted through a nuclear emulsion was prepared on the basis of the PAVIKOM complex and the physics laboratory course of the Physical Department of Moscow State University. This practical course makes it possible to acquaint students with the initial skills used in automated processing of data from track detectors and can be included in the educational process for students of physical departments.

  6. Comparison of manual and automated pretreatment methods for AMS radiocarbon dating of plant fossils

    Science.gov (United States)

    Bradley, L.A.; Stafford, Thomas W.

    1994-01-01

    A new automated pretreatment system for the preparation of materials submitted for accelerator mass spectrometry (AMS) analysis is less time-consuming and results in a higher sample yield. The new procedure was tested using two groups of plant fossils: one group was pretreated using the traditional method, and the second, using the automated pretreatment apparatus. The time it took to complete the procedure and the amount of sample material remaining were compared. The automated pretreatment apparatus proved to be more than three times faster and, in most cases, produced a higher yield. A darker discoloration of the KOH solutions was observed indicating that the automated system is more thorough in removing humates from the specimen compared to the manual method. -Authors

  7. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters

  8. An automated method for the layup of fiberglass fabric

    Science.gov (United States)

    Zhu, Siqi

    This dissertation presents an automated composite fabric layup solution based on a new method to deform fiberglass fabric referred to as shifting. A layup system was designed and implemented using a large robotic gantry and custom end-effector for shifting. Layup tests proved that the system can deposit fabric onto two-dimensional and three-dimensional tooling surfaces accurately and repeatedly while avoiding out-of-plane deformation. A process planning method was developed to generate tool paths for the layup system based on a geometric model of the tooling surface. The approach is analogous to Computer Numerical Controlled (CNC) machining, where Numerical Control (NC) code from a Computer-Aided Design (CAD) model is generated to drive the milling machine. Layup experiments utilizing the proposed method were conducted to validate the performance. The results show that the process planning software requires minimal time or human intervention and can generate tool paths leading to accurate composite fabric layups. Fiberglass fabric samples processed with shifting deformation were observed for meso-scale deformation. Tow thinning, bending and spacing was observed and measured. Overall, shifting did not create flaws in amounts that would disqualify the method from use in industry. This suggests that shifting is a viable method for use in automated manufacturing. The work of this dissertation provides a new method for the automated layup of broad width composite fabric that is not possible with any available composite automation systems to date.

  9. High-throughput method of dioxin analysis in aqueous samples using consecutive solid phase extraction steps with the new C18 Ultraflow™ pressurized liquid extraction and automated clean-up.

    Science.gov (United States)

    Youn, Yeu-Young; Park, Deok Hie; Lee, Yeon Hwa; Lim, Young Hee; Cho, Hye Sung

    2015-01-01

    A high-throughput analytical method has been developed for the determination of the seventeen 2,3,7,8-substituted congeners of polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/Fs) in aqueous samples. A recently introduced octadecyl (C18) disk for semi-automated solid-phase extraction of PCDD/Fs in water samples with a high level of particulate material has been tested for the analysis of dioxins. This new type of C18 disk is specially designed for the analysis of hexane extractable material (HEM) but had never previously been reported for use in PCDD/F analysis. It allows a higher filtration flow, and therefore the time of analysis is reduced. The solid-phase extraction technique changes the sample matrix from liquid to solid, so that pressurized liquid extraction (PLE) can be used in the pre-treatment. In order to achieve efficient purification, extracts from the PLE are purified using an automated Power-prep system with disposable silica, alumina, and carbon columns. Quantitative analyses of PCDD/Fs were performed by GC-HRMS using multiple-ion detection (MID) mode. The method was successfully applied to the analysis of water samples from the wastewater treatment system of a vinyl chloride monomer plant. The entire procedure is in agreement with EPA Method 1613 recommendations regarding the blank control, MDLs (method detection limits), accuracy, and precision. The high-throughput method not only meets the requirements of international standards, but also shortens the required analysis time from 2 weeks to 3 days. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Automated image analysis of the pathological lung in CT

    NARCIS (Netherlands)

    Sluimer, Ingrid Christine

    2005-01-01

    The general objective of the thesis is automation of the analysis of the pathological lung from CT images. Specifically, we aim for automated detection and classification of abnormalities in the lung parenchyma. We first provide a review of computer analysis techniques applied to CT of the

  11. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Full Text Available Introduction: The most common causes of diagnostic error are related to errors in laboratory tests and in the interpretation of results. In order to reduce them, laboratories currently rely on modern equipment that provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides that are concordant or discordant with results obtained by fully automated procedures. Materials and method: From January to July 2013, 1,000 slides were analyzed for hematological parameters. Automated analysis was performed on last-generation equipment whose methodology is based on electrical impedance and which is able to quantify all the figurative elements of the blood across 22 parameters. Microscopy was performed simultaneously by two expert microscopists. Results: The data showed that only 42.70% of results were concordant, against 57.30% discordant. The main findings among the discordant results were: changes in red blood cells, 43.70% (n = 250); white blood cells, 38.46% (n = 220); and platelet counts, 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained because they were not investigated, which may compromise the final diagnosis. Conclusion: Qualitative microscopic analysis is of fundamental importance and must be performed in parallel with automated analysis in order to obtain reliable results, with a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  12. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is

  13. ASteCA: Automated Stellar Cluster Analysis

    Science.gov (United States)

    Perren, G. I.; Vázquez, R. A.; Piatti, A. E.

    2015-04-01

    We present the Automated Stellar Cluster Analysis package (ASteCA), a suite of tools designed to fully automate the standard tests applied to stellar clusters in order to determine their basic parameters. The set of functions included in the code make use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing, through a statistical estimator, its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field-star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone-fitting process, based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm, is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with their uncertainties. To validate the code we applied it to a large set of over 400 synthetic MASSCLEAN clusters with varying degrees of field star contamination, as well as a smaller set of 20 observed Milky Way open clusters (Berkeley 7, Bochum 11, Czernik 26, Czernik 30, Haffner 11, Haffner 19, NGC 133, NGC 2236, NGC 2264, NGC 2324, NGC 2421, NGC 2627, NGC 6231, NGC 6383, NGC 6705, Ruprecht 1, Tombaugh 1, Trumpler 1, Trumpler 5 and Trumpler 14) studied in the literature. The results show that ASteCA is able to recover cluster parameters with acceptable precision even for clusters affected by substantial field star contamination. ASteCA is written in Python and is made available as an open source code which can be downloaded ready to use from its official site.

  14. Evaluation of full field automated photoelastic analysis based on phase stepping

    Science.gov (United States)

    Haake, S. J.; Wang, Z. F.; Patterson, E. A.

    A full-field automated polariscope designed for photoelastic analysis and based on the method of phase stepping is described. The system is evaluated through the analysis of five different photoelastic models, using both the automated system and manual analysis employing the Tardy compensation method. The models were chosen to provide a range of different fringe patterns, orders, and stress gradients, and were: a disk in diametral compression, a constrained beam subject to a point load, a tensile plate with a central hole, a turbine blade, and a turbine disk slot. The repeatability of the full-field system was found to compare well with point-by-point systems. The worst isochromatic error was approximately 0.007 fringes, and the corresponding isoclinic error was 0.75°. Results from the manual and automated methods showed good agreement. It is concluded that automated photoelastic analysis based on phase-stepping procedures offers a potentially accurate and reliable tool for stress analysts.
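
    As an illustration of how phase-stepped images are demodulated, the sketch below implements one widely used six-step scheme (after Patterson and Wang); which optical arrangement produces each of the six images is an assumption here, and the isochromatic phase it returns is still wrapped.

        # Recover isoclinic angle and wrapped isochromatic phase from six
        # phase-stepped polariscope images (one common six-step scheme).
        import numpy as np

        def demodulate(i1, i2, i3, i4, i5, i6):
            """Each input is a 2-D intensity array from one optical setting."""
            theta = 0.5 * np.arctan2(i5 - i3, i4 - i6)   # isoclinic angle
            num = (i5 - i3) * np.sin(2 * theta) + (i4 - i6) * np.cos(2 * theta)
            delta = np.arctan2(num, i1 - i2)             # wrapped retardation
            return theta, delta  # delta still needs unwrapping to fringe order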

  15. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel; Tekinerdogan, B.; van den Broek, P.M.; Saeki, M.; Hruby, P.; Sunye, G.

    2001-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such numerous object-oriented software

  16. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, B.; Frohner, A´ kos; Saeki, Motoshi; Sunyé, Gerson; van den Broek, P.M.; Hruby, Pavel

    2002-01-01

    Current software projects generally have to deal with producing and managing large and complex software products. It is generally believed that applying software development methods is useful in coping with this complexity and in supporting quality. As such numerous object-oriented software

  17. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more

  18. Automated Test Methods for XML Metadata

    Science.gov (United States)

    2017-12-28

    definition (XSD) format and other standards and conventions. This method should be of interest primarily to parties having tools or applications that ... consume RCC metadata standard documents, and may be of interest to developers of tools or applications that produce RCC metadata standard documents ... instance document and encodings to verify that the rules engines and other tools work together. 1. Initialize the programming environment. 2. Write test

  19. Automated Traffic Management System and Method

    Science.gov (United States)

    Glass, Brian J. (Inventor); Spirkovska, Liljana (Inventor); McDermott, William J. (Inventor); Reisman, Ronald J. (Inventor); Gibson, James (Inventor); Iverson, David L. (Inventor)

    2000-01-01

    A data management system and method that enables acquisition, integration, and management of real-time data generated at different rates by multiple heterogeneous, incompatible data sources. The system achieves this functionality by using an expert system to fuse data from a variety of airline, airport operations, ramp control, and air traffic control tower sources, to establish and update reference data values for every aircraft surface operation. The system may be configured as a real-time airport surface traffic management system (TMS) that electronically interconnects air traffic control, airline data, and airport operations data to facilitate information sharing and improve taxi queuing. In the TMS operational mode, empirical data show substantial benefits in ramp operations for airlines, reducing departure taxi times by about one minute per aircraft in operational use, translating to $12 to $15 million per year in savings to airlines at the Atlanta, Georgia airport. The data management system and method may also be used for scheduling the movement of multiple vehicles in other applications, such as marine vessels in harbors and ports, trucks or railroad cars in ports or shipping yards, and railroad cars in switching yards. Finally, the data management system and method may be used for managing containers at a shipping dock, stock on a factory floor or in a warehouse, or as a training tool for improving situational awareness of FAA tower controllers, ramp and airport operators, or commercial airline personnel in airfield surface operations.

  20. Automated mass action model space generation and analysis methods for two-reactant combinatorially complex equilibriums: An analysis of ATP-induced ribonucleotide reductase R1 hexamerization data

    Directory of Open Access Journals (Sweden)

    Radivoyevitch Tomas

    2009-12-01

    /30 > 508/2088 with p < 10⁻¹⁵. Finally, 99 of the 2088 models did not have any terms with ATP/R1 ratios >1.5, but of the top 30, there were 14 such models (14/30 > 99/2088 with p < 10⁻¹⁶), i.e. the existence of R1 hexamers with >3 a-sites occupied by ATP is also not supported by this dataset. Conclusion The analysis presented suggests that three a-sites may not be occupied by ATP in R1 hexamers under the conditions of the data analyzed. If a-sites fill before h-sites, this implies that the dataset analyzed can be explained without the existence of an h-site. Reviewers This article was reviewed by Ossama Kashlan (nominated by Philip Hahnfeldt), Bin Hu (nominated by William Hlavacek) and Rainer Sachs.

  1. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of the microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of the retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for the determination of the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded with a 3DOCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement, and calculation of central retinal vessel equivalents. The mean vessel recognition rate was 78%, the vessel class designation rate 75%, and reproducibility between 0.78 and 0.91. The mean A/V ratio was 0.84. Application to a healthy normative cohort showed high congruence with previously published manual methods. Processing time per image was one minute. Quantitative geometrical assessment of the retinal vasculature may be performed in a semi-automated manner using dedicated software tools. Yielding reproducible numerical data within a short time, this may add value to merely morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
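
    The paper's vessel recognition stage is image-specific, but the final summary step, reducing the measured vessel widths to central retinal equivalents and an A/V ratio, can be sketched compactly. The sketch below assumes the revised Knudtson pairing formulas (branching coefficients 0.88 for arterioles and 0.95 for venules), which may differ from the authors' exact implementation.

        # Combine the widest arteriole and venule calibres into central
        # retinal equivalents (CRAE, CRVE) and an A/V ratio, assuming the
        # revised Knudtson formulas.
        def central_equivalent(widths, k):
            """Iteratively pair widest with narrowest until one value remains."""
            w = sorted(widths, reverse=True)
            while len(w) > 1:
                paired = [k * (w[i] ** 2 + w[-(i + 1)] ** 2) ** 0.5
                          for i in range(len(w) // 2)]
                if len(w) % 2:                  # carry any unpaired middle width
                    paired.append(w[len(w) // 2])
                w = sorted(paired, reverse=True)
            return w[0]

        def av_ratio(arteriole_widths, venule_widths):
            crae = central_equivalent(arteriole_widths, k=0.88)
            crve = central_equivalent(venule_widths, k=0.95)
            return crae / crve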

  2. Automated back titration method to measure phosphate

    International Nuclear Information System (INIS)

    Comer, J.; Tehrani, M.; Avdeef, A.; Ross, J. Jr.

    1987-01-01

    Phosphate was measured in soda drinks and as an additive in flour by a back titration method in which phosphate was precipitated with lanthanum, and the excess lanthanum was titrated with fluoride. All measurements were performed using the Orion fluoride electrode and the Orion 960 Autochemistry System. In most commercial automatic titrators, the inflection point of the titration curve, calculated from the first derivative of the curve, is used to find the equivalence point of the titration. The inflection technique is compared with a technique based on Gran functions, which uses data collected after the end point and predicts the equivalence point accordingly.
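
    The Gran approach can be summarized in a few lines: transform the electrode potentials measured after the end point so that the excess-titrant branch becomes linear in added volume, then extrapolate that line back to zero. A minimal sketch, assuming a Nernstian fluoride-electrode slope of -59.16 mV/decade:

        import numpy as np

        def gran_equivalence(v_added, e_mv, v0, slope_mv=-59.16):
            """v_added: titrant volumes (mL) past the end point; e_mv: electrode
            potentials (mV); v0: initial solution volume (mL)."""
            v = np.asarray(v_added, dtype=float)
            e = np.asarray(e_mv, dtype=float)
            g = (v0 + v) * 10.0 ** (e / slope_mv)  # proportional to excess F-
            b, a = np.polyfit(v, g, 1)             # fit g = b*v + a
            return -a / b                          # volume where excess F- = 0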

  3. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    Full Text Available This study investigates automated data accuracy assessment as described in the data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of Chemistry over a 10-year publication period, retrieved from two bibliometric data sources, Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard), and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects the true accuracy scores more faithfully. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method for bibliographic data accuracy as well as on defining the impact of data accuracy on the citation matching process.
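
    The gap between per-field and per-record accuracy noted above follows directly from how the scores are accumulated: a record counts as accurate only if every one of its fields matches the gold standard. An illustrative scoring sketch (the field names are hypothetical, not taken from the study):

        FIELDS = ("authors", "title", "source", "year", "volume", "pages")

        def accuracy_scores(records, gold):
            """records, gold: parallel lists of dicts keyed by FIELDS."""
            field_hits = record_hits = 0
            for rec, ref in zip(records, gold):
                matches = [rec[f] == ref[f] for f in FIELDS]
                field_hits += sum(matches)      # per-field tally
                record_hits += all(matches)     # record must be fully correct
            n = len(records)
            return field_hits / (n * len(FIELDS)), record_hits / n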

  4. Automated Linguistic Personality Description and Recognition Methods

    Directory of Open Access Journals (Sweden)

    Danylyuk Illya

    2016-12-01

    Full Text Available Background: The relevance of our research is, above all, theoretically motivated by the extraordinary scientific and practical interest in processing the huge amounts of language data that people generate in everyday professional and personal life through electronic forms of communication (e-mail, SMS, voice, audio and video blogs, social networks, etc.). Purpose: The purpose of the article is to describe the theoretical and practical framework of the project "Communicative-pragmatic and discourse-grammatical lingvopersonology: structuring linguistic identity and computer modeling". Key techniques are described, such as machine learning for language modeling, speech synthesis, and handwriting simulation. Results: Lingvopersonology has developed solid theoretical foundations, methods and tools, and its achievements let us predict that the newest promising trend is the modeling of linguistic identity by means of information technology. We see three aspects of this modeling: (1) modeling the semantic level of linguistic identity, by means of corpus linguistics; (2) formal modeling of the sound level of linguistic identity, with the help of speech synthesis; (3) formal modeling of the graphic level of linguistic identity, with the help of image (handwriting) synthesis. For the first case, we propose to use machine learning techniques and the vector-space (word2vec) algorithm for textual speech modeling. The hybrid CUTE method for personality speech modeling will be applied in the second case. Finally, a neural network trained on images of a person's handwriting can be the instrument for the last case. Discussion: The project "Communicative-pragmatic, discourse, and grammatical lingvopersonology: structuring linguistic identity and computer modeling", which is being implemented by the Department of General and Applied Linguistics and Slavonic Philology, selected as its task the modeling of Yuriy Shevelyov (Sherekh
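
    For the first (semantic-level) aspect, a toy sketch of corpus-based vector-space modeling with a word2vec implementation is shown below; it assumes the gensim library, and the two-sentence corpus is a placeholder, not project data.

        from gensim.models import Word2Vec

        corpus = [
            ["language", "identity", "shows", "in", "word", "choice"],
            ["word", "choice", "reflects", "the", "author"],
        ]
        # Train a small embedding space over the toy corpus.
        model = Word2Vec(sentences=corpus, vector_size=50, window=3,
                         min_count=1, epochs=50)
        # Nearest neighbours in the learned vector space.
        print(model.wv.most_similar("word", topn=3))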

  5. Automated Image Analysis Corrosion Working Group Update: February 1, 2018

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-01

    These are slides for the automated image analysis corrosion working group update. The overall goals were: to automate the detection and quantification of features in images (faster, more accurate); to work out how to do this (obtain data, analyze data); and to focus on Laser Scanning Confocal Microscope (LCM) data (laser intensity, laser height/depth, optical RGB, optical plus laser RGB).

  6. Automated analysis and design of complex structures

    International Nuclear Information System (INIS)

    Wilson, E.L.

    1977-01-01

    The present application of optimum design appears to be restricted to components of the structure rather than the total structural system. Since design normally involves many analyses of the system, any improvement in the efficiency of the basic methods of analysis will allow more complicated systems to be designed by optimum methods. The evaluation of the risk and reliability of a structural system can be extremely important. Reliability studies have been made of many non-structural systems for which the individual components have been extensively tested and the service environment is known. For such systems the reliability studies are valid. For most structural systems, however, the properties of the components can only be estimated, and statistical data associated with the potential loads are often minimal. Also, a potentially critical loading condition may be completely neglected in the study. For these reasons, and because of the problems associated with the reliability of both linear and nonlinear analysis computer programs, it appears premature to place significant value on such studies for complex structures. With these comments as background, the purpose of this paper is to discuss the following: the relationship of analysis to design; new methods of analysis; new or improved finite elements; the effect of minicomputers on structural analysis methods; the use of systems of microprocessors for nonlinear structural analysis; and the role of interactive graphics systems in future analysis and design. This discussion will focus on the impact of new, inexpensive computer hardware on design and analysis methods

  7. Postprocessing algorithm for automated analysis of pelvic intraoperative neuromonitoring signals

    Directory of Open Access Journals (Sweden)

    Wegner Celine

    2016-09-01

    Full Text Available Two-dimensional pelvic intraoperative neuromonitoring (pIONM®) is based on electric stimulation of autonomic nerves under observation of electromyography of the internal anal sphincter (IAS) and manometry of the urinary bladder. The method provides nerve identification and verification of the nerves' functional integrity. pIONM® is currently gaining increased attention at a time when preservation of function is becoming more and more important. Ongoing technical and methodological developments in experimental and clinical settings require further analysis of the obtained signals. This work describes a postprocessing algorithm for pIONM® signals, developed for automated analysis of large amounts of recorded data. The analysis routine includes a graphical representation of the recorded signals in the time and frequency domains, as well as a quantitative evaluation by means of features calculated from the time and frequency domains. The produced plots are automatically summarized in a PowerPoint presentation, and the calculated features are written to a standardized Excel sheet, ready for statistical analysis.
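
    The record does not list the exact features, so the sketch below shows generic time- and frequency-domain features for one recorded trace; the names and feature choices are illustrative, not the authors' set.

        import numpy as np

        def signal_features(x, fs):
            """x: 1-D signal (e.g. an IAS electromyogram), fs: sampling rate in Hz."""
            x = np.asarray(x, dtype=float)
            spectrum = np.abs(np.fft.rfft(x - x.mean()))   # magnitude spectrum
            freqs = np.fft.rfftfreq(x.size, d=1.0 / fs)
            return {
                "mean": x.mean(),
                "rms": np.sqrt(np.mean(x ** 2)),
                "peak_to_peak": x.max() - x.min(),
                "dominant_freq_hz": freqs[spectrum.argmax()],
            }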

  8. Automated and connected vehicle implications and analysis.

    Science.gov (United States)

    2017-05-01

    Automated and connected vehicles (ACV) and, in particular, autonomous vehicles have captured the interest of the public, industry and transportation authorities. ACVs can significantly reduce accidents, fuel consumption, pollution and the costs o...

  9. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Mariangela eIorio

    2013-12-01

    Full Text Available White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery (FLAIR) magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of FLAIR images. Thirty patients with Mild Cognitive Impairment and variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of a WMH map, and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student's t tests, and similarity was evaluated using a linear regression model and the Dice Similarity Coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t-value = -1.79, DF = 29, p = 0.839 for rater 1; t-value = 1.113, DF = 29, p = 0.2749 for rater 2), were highly correlated (R² = 0.921, F(1, 29) = 155.54, p
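
    The Dice Similarity Coefficient used above is simple to state: twice the overlap of the two binary masks divided by the sum of their sizes. A minimal sketch:

        import numpy as np

        def dice(a, b):
            """a, b: binary segmentation masks of equal shape."""
            a = np.asarray(a, dtype=bool)
            b = np.asarray(b, dtype=bool)
            total = a.sum() + b.sum()
            # Both masks empty counts as perfect agreement here.
            return 2.0 * np.logical_and(a, b).sum() / total if total else 1.0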

  10. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amounts of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is to incorporate automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be done manually. Automated data analysis can also potentially perform the data analysis function with significantly smaller analyst staffing levels. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, at both the commercial and the developmental level. A summary of the various commercial and developmental data analysis systems is provided, which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to assist ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case-based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  11. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations. They do not capture 3-D data, they are time-consuming and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy to use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%), in contrast to using Osteolytica, which demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from aged and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses over the entirety of the bone volume (as opposed to selected 2-D images), it

  12. Comparison of manual and automated quantification methods of 123I-ADAM

    International Nuclear Information System (INIS)

    Kauppinen, T.; Keski-Rahkonen, A.; Sihvola, E.; Helsinki Univ. Central Hospital

    2005-01-01

    123I-ADAM is a novel radioligand for imaging of the brain serotonin transporters (SERTs). Traditionally, the analysis of brain receptor studies has been based on observer-dependent manual region-of-interest definitions and visual interpretation. Our aim was to create a template for automated image registration and volume of interest (VOI) quantification, and to show that an automated quantification method for 123I-ADAM is more repeatable than the manual method. Patients, methods: A template and a predefined VOI map were created from 123I-ADAM scans of healthy volunteers (n=15). Scans of another group of healthy persons (HS, n=12) and of patients with bulimia nervosa (BN, n=10) were automatically fitted to the template, and specific binding ratios (SBRs) were calculated by using the VOI map. Manual VOI definitions were done for the HS and BN groups by both one and two observers. The repeatability of the automated method was evaluated using the BN group. Results: For the manual method, the interobserver coefficient of repeatability was 0.61 for the HS group and 1.00 for the BN group. The intra-observer coefficient of repeatability for the BN group was 0.70. For the automated method, the coefficient of repeatability was 0.13 for SBRs in midbrain. Conclusion: Automated quantification gives valuable information in addition to visual interpretation, decreasing the total image handling time and giving clear advantages for research work. An automated method for analysing 123I-ADAM binding to the brain SERT gives repeatable results for fitting the studies to the template and for calculating SBRs, and could therefore replace manual methods. (orig.)
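
    A common definition of the specific binding ratio, computed from a target VOI and a nondisplaceable reference region, is sketched below; the choice of reference region is an assumption here, not taken from the paper.

        import numpy as np

        def specific_binding_ratio(image, target_mask, reference_mask):
            """image: 3-D SPECT volume; masks: boolean VOI arrays.
            reference_mask should cover a region with negligible specific
            binding (e.g. cerebellum, a conventional but assumed choice)."""
            target = image[target_mask].mean()
            reference = image[reference_mask].mean()
            return (target - reference) / reference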

  14. Automated analysis of invadopodia dynamics in live cells

    Directory of Open Access Journals (Sweden)

    Matthew E. Berginski

    2014-07-01

    Full Text Available Multiple cell types form specialized protein complexes that are used by the cell to actively degrade the surrounding extracellular matrix. These structures are called podosomes or invadopodia and collectively referred to as invadosomes. Due to their potential importance in both healthy physiology as well as in pathological conditions such as cancer, the characterization of these structures has been of increasing interest. Following early descriptions of invadopodia, assays were developed which labelled the matrix underneath metastatic cancer cells allowing for the assessment of invadopodia activity in motile cells. However, characterization of invadopodia using these methods has traditionally been done manually with time-consuming and potentially biased quantification methods, limiting the number of experiments and the quantity of data that can be analysed. We have developed a system to automate the segmentation, tracking and quantification of invadopodia in time-lapse fluorescence image sets at both the single invadopodia level and whole cell level. We rigorously tested the ability of the method to detect changes in invadopodia formation and dynamics through the use of well-characterized small molecule inhibitors, with known effects on invadopodia. Our results demonstrate the ability of this analysis method to quantify changes in invadopodia formation from live cell imaging data in a high throughput, automated manner.
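
    A minimal sketch of the segment-then-track idea (not the published pipeline): threshold each frame, label connected puncta, then link objects across frames by mask overlap.

        import numpy as np
        from scipy import ndimage

        def segment(frame, thresh):
            """Label connected above-threshold puncta in one image frame."""
            labels, n = ndimage.label(frame > thresh)
            return labels, n

        def link_by_overlap(labels_prev, labels_curr, n_curr):
            """Map each current object to the previous object it overlaps most."""
            links = {}
            for obj in range(1, n_curr + 1):
                overlap = labels_prev[labels_curr == obj]
                overlap = overlap[overlap > 0]
                links[obj] = np.bincount(overlap).argmax() if overlap.size else None
            return links  # None marks a newly formed invadopodium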

  15. A semi-automated solid-phase extraction liquid chromatography/tandem mass spectrometry method for the analysis of tetrahydrocannabinol and metabolites in whole blood.

    Science.gov (United States)

    Jagerdeo, Eshwar; Schaff, Jason E; Montgomery, Madeline A; LeBeau, Marc A

    2009-09-01

    Marijuana is one of the most commonly abused illicit substances in the USA, making cannabinoids important to detect in clinical and forensic toxicology laboratories. Historically, cannabinoids in biological fluids have been derivatized and analyzed by gas chromatography/mass spectrometry (GC/MS). There has been a gradual shift in many laboratories towards liquid chromatography/mass spectrometry (LC/MS) for this analysis due to its improved sensitivity and reduced sample preparation compared with GC/MS procedures. This paper reports a validated method for the analysis of Δ9-tetrahydrocannabinol (THC) and its two main metabolites, 11-nor-9-carboxy-Δ9-tetrahydrocannabinol (THC-COOH) and 11-hydroxy-Δ9-tetrahydrocannabinol (THC-OH), in whole blood samples. The method has also been validated for cannabinol (CBN) and cannabidiol (CBD), two cannabinoids that were shown not to interfere with the method. This method has been successfully applied to samples both from living people and from deceased individuals obtained during autopsy. This method utilizes online solid-phase extraction (SPE) with LC/MS. Pretreatment of samples involves protein precipitation, sample concentration, ultracentrifugation, and reconstitution. The online SPE procedure was developed using Hysphere C8-EC sorbent. A chromatographic gradient with an XTerra MS C18 column was used for the separation. Four multiple-reaction monitoring (MRM) transitions were monitored for each analyte and internal standard. Linearity generally fell between 2 and 200 ng/mL. The limits of detection (LODs) ranged from 0.5 to 3 ng/mL and the limits of quantitation (LOQs) ranged from 2 to 8 ng/mL. The bias and imprecision were determined using a simple analysis of variance (ANOVA: single factor). The results demonstrate bias of <7% and imprecision of <9% for all components at each quality control level. Published in 2009 by John Wiley & Sons, Ltd.

  16. Automated sizing of large structures by mixed optimization methods

    Science.gov (United States)

    Sobieszczanski, J.; Loendorf, D.

    1973-01-01

    A procedure for automating the sizing of wing-fuselage airframes was developed and implemented in the form of an operational program. The program combines fully stressed design, to determine an overall material distribution, with mass-strength and mathematical programming methods to design structural details while accounting for realistic design constraints. The practicality and efficiency of the procedure are demonstrated for transport aircraft configurations. The methodology is sufficiently general to be applicable to other large and complex structures.
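
    The fully stressed design step mentioned above has a classic resizing rule: scale each member's area by its stress ratio until every member is at (or below) the allowable stress. A minimal sketch, where `analyse` is an assumed callback returning member stresses for a given set of areas:

        import numpy as np

        def fully_stressed_design(areas, analyse, sigma_allow,
                                  a_min=1e-6, n_iter=50):
            """areas: initial member areas; analyse(areas) -> member stresses."""
            areas = np.asarray(areas, dtype=float)
            for _ in range(n_iter):
                sigma = np.abs(analyse(areas))      # member stresses
                ratio = sigma / sigma_allow         # >1 means overstressed: grow
                areas = np.maximum(areas * ratio, a_min)  # keep a minimum gauge
            return areas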

  17. Automation of Methods for the Subjective Measuring of Factors in the Operation of Automated Information Systems by Means of VBA

    Directory of Open Access Journals (Sweden)

    Lyudmila V. Gorbatova

    2015-01-01

    Full Text Available This article describes the process of assessing the effectiveness of the operation of automated information systems in colleges using the method of pairwise comparison, and discusses the numerical representations used with this method. The author lists methods for the subjective measurement of the effectiveness of automated information system operation. The article proposes a way to automate these methods that simplifies the calculations and reduces the time needed to determine the outcome of a specific task. The author provides an algorithm together with the results of the work carried out.

  18. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects of extensive grazing on biodiversity in complex semi-natural habitats is labour-intensive, and there are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect: the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and the amount of information being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600 m. Rainfall is high, and in most areas the soil consists of deep peat (1 m to 3 m) populated by a mix of heather, mosses and sedges. The cameras have been in continuous operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial-intelligence-based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected, and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets of the collected data. By converting digital image data into statistical composite data, it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions for simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and
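
    One simple way to generate the "alert states" described above (an illustration, not the deployed system) is to reduce each image to a colour-histogram feature vector and flag images that deviate strongly from the running statistics of the stream:

        import numpy as np

        def histogram_features(image, bins=16):
            """image: HxWx3 uint8 array; returns a normalised colour histogram."""
            h = np.concatenate([np.histogram(image[..., c], bins=bins,
                                             range=(0, 256))[0] for c in range(3)])
            return h / h.sum()

        def alerts(images, k=3.0):
            """Return a boolean flag per image; True = route to a human."""
            feats = np.array([histogram_features(im) for im in images])
            mean, std = feats.mean(axis=0), feats.std(axis=0) + 1e-9
            z = np.abs((feats - mean) / std).max(axis=1)  # worst-bin z-score
            return z > k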

  19. Rapid method for quantitative analysis of the aroma impact compound, 2-acetyl-1-pyrroline, in fragrant rice using automated headspace gas chromatography.

    Science.gov (United States)

    Sriseadka, Tinakorn; Wongpornchai, Sugunya; Kitsawatpaiboon, Pisan

    2006-10-18

    A rapid method employing static headspace gas chromatography (HS-GC) has been developed and validated for quantitative analysis of the impact aroma compound, 2-acetyl-1-pyrroline (2AP), in grains of fragrant rice. The developed method requires no wet extraction; the rice headspace volatiles are brought directly and automatically to GC analysis. The conditions of the static HS autosampler were optimized to achieve high recovery and sensitivity. The most effective amount of rice sample used was 1 g, which provided 51% recovery and a linear multiple headspace extraction (MHE) plot of the peak area of 2AP. The sensitivity of the method was enhanced by utilizing a megabore fused-silica capillary column in conjunction with a nitrogen-phosphorus detector (NPD). Method validations performed for both static HS-GC-FID and HS-GC-NPD demonstrated linear calibration ranges of 20-10 000 (r² = 0.9997) and 5-8000 (r² = 0.9998) ng of 2AP/g of rice sample, respectively. The limits of detection for the two systems were 20 and 5 ng of 2AP, and the limits of quantitation were 0.30 and 0.01 g of brown rice sample, respectively. Reproducibility, calculated as intraday and interday coefficients of variation, was 3.25% RSD (n = 15) and 3.92% RSD (n = 35), respectively, for SHS-GC-FID, and 1.87% RSD (n = 15) and 2.85% RSD (n = 35), respectively, for SHS-GC-NPD. The method was found to be effective when applied to the evaluation of aroma quality, based on 2AP concentrations, of some fragrant rice samples.
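
    The quantitation itself reduces to reading an unknown off a linear calibration curve. A sketch of that step only, with placeholder numbers rather than the paper's data:

        import numpy as np

        standards_ng = np.array([5, 50, 500, 2000, 8000])      # 2AP spiked (ng)
        peak_areas = np.array([21, 208, 2110, 8390, 33400])    # NPD response

        # Least-squares line through the calibration standards.
        slope, intercept = np.polyfit(standards_ng, peak_areas, 1)

        def ng_2ap_per_g(peak_area, sample_mass_g=1.0):
            """ng of 2AP per gram of rice for one measured peak area."""
            return (peak_area - intercept) / slope / sample_mass_g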

  20. Automated general temperature correction method for dielectric soil moisture sensors

    Science.gov (United States)

    Kapilaratne, R. G. C. Jeewantinie; Lu, Minjiao

    2017-08-01

    An effective temperature correction method for dielectric sensors is important to ensure the accuracy of soil water content (SWC) measurements in local- to regional-scale soil moisture monitoring networks. These networks extensively use highly temperature-sensitive dielectric sensors due to their low cost, ease of use and low power consumption. Yet there is no general temperature correction method for dielectric sensors; instead, sensor- or site-dependent correction algorithms are employed. Such methods become ineffective for soil moisture monitoring networks with different sensor setups and for those that cover diverse climatic conditions and soil types. This study attempted to develop a general temperature correction method for dielectric sensors which can be used regardless of differences in sensor type, climatic conditions and soil type, without rainfall data. In this work an automated general temperature correction method was developed by adapting previously developed temperature correction algorithms based on time domain reflectometry (TDR) measurements to ThetaProbe ML2X, Stevens Hydra Probe II and Decagon Devices EC-TM sensor measurements. The procedure for removing rainy-day effects from the SWC data was automated by incorporating a statistical inference technique into the temperature correction algorithms. The temperature correction method was evaluated using 34 stations from the International Soil Moisture Monitoring Network and another nine stations from a local soil moisture monitoring network in Mongolia. The soil moisture monitoring networks used in this study cover four major climates and six major soil types. Results indicated that the automated temperature correction algorithms developed in this study can successfully eliminate temperature effects from dielectric sensor measurements even without on-site rainfall data. Furthermore, it has been found that the actual daily average of SWC has been changed due to temperature effects of dielectric sensors with a
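
    The general shape of such a correction can be sketched in a few lines (an illustration of the idea, not the paper's exact algorithm): estimate a linear temperature sensitivity from rain-free periods, during which true moisture is assumed roughly constant, then correct readings to a reference temperature.

        import numpy as np

        def fit_sensitivity(swc, temp):
            """Least-squares SWC-vs-temperature slope over rain-free periods."""
            k, _ = np.polyfit(temp, swc, 1)
            return k

        def correct(swc, temp, k, t_ref=25.0):
            """Remove the fitted temperature effect from raw SWC readings."""
            return swc - k * (temp - t_ref)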

  1. Automated haematology analysis to diagnose malaria

    NARCIS (Netherlands)

    Campuzano-Zuluaga, Germán; Hänscheid, Thomas; Grobusch, Martin P.

    2010-01-01

    For more than a decade, flow cytometry-based automated haematology analysers have been studied for malaria diagnosis. Although current haematology analysers are not specifically designed to detect malaria-related abnormalities, most studies have found sensitivities that comply with WHO

  2. Automation of radionuclide analysis in nuclear industry

    International Nuclear Information System (INIS)

    Gostilo, V.; Sokolov, A.; Kuzmenko, V.; Kondratjev, V.

    2009-01-01

    Development results are presented for automated precise HPGe spectrometers and systems for radionuclide analysis in the nuclear industry and in environmental monitoring. An automated HPGe spectrometer for radionuclide monitoring of the coolant in the primary circuit of NPPs is intended for on-line technological monitoring of radionuclide specific activity in liquid and gaseous flows. An automated spectrometer based on a flow-type HPGe detector with a through channel is intended for controlling the uniformity of the distribution of uranium and/or plutonium in fresh fuel elements transferred through the detector, as well as for on-line control of fluid and gas flows with low activity. An automated monitoring system for radionuclide volumetric activity in the outlet channels of NPPs is intended for radionuclide monitoring of water reservoirs in regions of nuclear weapons testing, near nuclear storage facilities, nuclear power plants and other nuclear power facilities. An autonomous HPGe spectrometer for deep-water radionuclide monitoring is applicable for the registration of gamma-emitting radionuclides distributed at water depths of up to 3000 m (radioactive waste storage sites, wrecks of nuclear ships, lost nuclear charges, technological waste releases from the nuclear industry, etc.). (authors)

  3. A new automated method of e-learner's satisfaction measurement

    Directory of Open Access Journals (Sweden)

    Armands Strazds

    2007-06-01

    Full Text Available This paper presents a new method of measuring learner satisfaction with electronic learning materials (e-courses, edutainment games, etc.) in virtual non-linear environments. The method is based on the relation between Discovering and Learning probability distribution curves obtained by collecting and evaluating human-computer interaction data. While operating in near real time, the measurement is considered highly unobtrusive and cost-effective because of its automated approach. The first working prototype, EDUSA 1.0, was developed and successfully tested by the Distance Education Studies Centre of Riga Technical University.

  4. An automated full-symmetry Patterson search method

    International Nuclear Information System (INIS)

    Rius, J.; Miravitlles, C.

    1987-01-01

    A full-symmetry Patterson search method is presented that performs a molecular coarse rotation search in vector space and orientation refinement using the σ function. The oriented molecule is positioned using the fast translation function τ0, which is based on the automated interpretation of τ projections using the sum function. This strategy reduces the number of Patterson-function values to be stored in the rotation search, and the use of the τ0 function minimizes the time required to develop all probable rotation-search solutions. The application of this method to five representative test examples is shown. (orig.)

  5. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCAT™) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized "intelligent" software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steelmaking process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufactures of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  7. Automated haematology analysis to diagnose malaria

    Directory of Open Access Journals (Sweden)

    Grobusch Martin P

    2010-11-01

    Full Text Available Abstract For more than a decade, flow cytometry-based automated haematology analysers have been studied for malaria diagnosis. Although current haematology analysers are not specifically designed to detect malaria-related abnormalities, most studies have found sensitivities that comply with WHO malaria-diagnostic guidelines, i.e. ≥ 95% in samples with > 100 parasites/μl. Establishing a correct and early malaria diagnosis is a prerequisite for adequate treatment and for minimizing adverse outcomes. Expert light microscopy remains the 'gold standard' for malaria diagnosis in most clinical settings. However, it requires an explicit request from clinicians and has variable accuracy. Malaria diagnosis with flow cytometry-based haematology analysers could become an important adjunct diagnostic tool in the routine laboratory work-up of febrile patients in or returning from malaria-endemic regions. Haematology analysers so far studied for malaria diagnosis are the Cell-Dyn®, Coulter® GEN·S and LH 750, and the Sysmex XE-2100® analysers. For Cell-Dyn analysers, abnormal depolarization events mainly in the lobularity/granularity and other scatter-plots, and various reticulocyte abnormalities, have shown overall sensitivities and specificities of 49% to 97% and 61% to 100%, respectively. For the Coulter analysers, a 'malaria factor' using the monocyte and lymphocyte size standard deviations obtained by impedance detection has shown overall sensitivities and specificities of 82% to 98% and 72% to 94%, respectively. For the XE-2100, abnormal patterns in the DIFF, WBC/BASO, and RET-EXT scatter-plots, and pseudoeosinophilia and other abnormal haematological variables have been described, and multivariate diagnostic models have been designed with overall sensitivities and specificities of 86% to 97% and 81% to 98%, respectively. The accuracy for malaria diagnosis may vary according to species, parasite load, immunity and clinical context where the

  8. Automated genome sequence analysis and annotation.

    Science.gov (United States)

    Andrade, M A; Brown, N P; Leroy, C; Hoersch, S; de Daruvar, A; Reich, C; Franchini, A; Tamames, J; Valencia, A; Ouzounis, C; Sander, C

    1999-05-01

    Large-scale genome projects generate a rapidly increasing number of sequences, most of them biochemically uncharacterized. Research in bioinformatics contributes to the development of methods for the computational characterization of these sequences. However, the installation and application of these methods require experience and are time consuming. We present here an automatic system for preliminary functional annotation of protein sequences that has been applied to the analysis of sets of sequences from complete genomes, both to refine overall performance and to make new discoveries comparable to those made by human experts. The GeneQuiz system includes a Web-based browser that allows examination of the evidence leading to an automatic annotation and offers additional information, views of the results, and links to biological databases that complement the automatic analysis. System structure and operating principles concerning the use of multiple sequence databases, underlying sequence analysis tools, lexical analyses of database annotations and decision criteria for functional assignments are detailed. The system makes automatic quality assessments of results based on prior experience with the underlying sequence analysis tools; overall error rates in functional assignment are estimated at 2.5-5% for cases annotated with highest reliability ('clear' cases). Sources of over-interpretation of results are discussed with proposals for improvement. A conservative definition for reporting 'new findings' that takes account of database maturity is presented along with examples of possible kinds of discoveries (new function, family and superfamily) made by the system. System performance in relation to sequence database coverage, database dynamics and database search methods is analysed, demonstrating the inherent advantages of an integrated automatic approach using multiple databases and search methods applied in an objective and repeatable manner. The GeneQuiz system

  9. Feasibility studies of safety assessment methods for programmable automation systems. Final report of the AVV project

    International Nuclear Information System (INIS)

    Haapanen, P.; Maskuniitty, M.; Pulkkinen, U.; Heikkinen, J.; Korhonen, J.; Tuulari, E.

    1995-10-01

    Feasibility studies of two different groups of methodologies for safety assessment of programmable automation systems have been executed at the Technical Research Centre of Finland (VTT). The studies concerned the dynamic testing methods and the fault tree (FT) and failure mode and effects analysis (FMEA) methods. In order to gain real experience in the application of these methods, experimental testing of two realistic pilot systems was executed and an FT/FMEA analysis of a programmable safety function was accomplished. The purpose of the studies was not to assess the object systems, but to gain experience in the application of the methods and to assess their potential and development needs. (46 refs., 21 figs.)
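
    The FT side of such an assessment ultimately reduces to propagating basic-event probabilities through AND/OR gates. The following is a minimal sketch of that calculation, assuming independent basic events and a hypothetical two-channel safety function; it does not reproduce the VTT pilot analyses.

```python
# Minimal fault-tree evaluation: top-event probability from basic-event
# probabilities, assuming all basic events are independent.

def or_gate(probs):
    """P(at least one input event occurs)."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

def and_gate(probs):
    """P(all input events occur)."""
    p_all = 1.0
    for q in probs:
        p_all *= q
    return p_all

# Hypothetical redundant safety function: the trip fails only if both
# channels fail; a channel fails if its sensor OR its software fails.
channel_a = or_gate([1.0e-3, 5.0e-4])   # sensor, software (made-up numbers)
channel_b = or_gate([1.0e-3, 5.0e-4])
top_event = and_gate([channel_a, channel_b])
print(f"P(safety function fails on demand) = {top_event:.2e}")
```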

  10. Power Analysis of an Automated Dynamic Cone Penetrometer

    Science.gov (United States)

    2015-09-01

    ARL-TR-7494 ● SEP 2015 ● US Army Research Laboratory. Power Analysis of an Automated Dynamic Cone Penetrometer, by C Wesley Tipton IV and Donald H Porschet, Sensors and Electron Devices Directorate, ARL.

  11. Automated Image Analysis of Offshore Infrastructure Marine Biofouling

    Directory of Open Access Journals (Sweden)

    Kate Gormley

    2018-01-01

    Full Text Available In the UK, some of the oldest oil and gas installations have been in the water for over 40 years and have considerable colonisation by marine organisms, which may lead to both industry challenges and/or potential biodiversity benefits (e.g., artificial reefs). The project objective was to test the use of an automated image analysis software (CoralNet) on images of marine biofouling from offshore platforms on the UK continental shelf, with the aim of (i) training the software to identify the main marine biofouling organisms on UK platforms; (ii) testing the software performance on 3 platforms under 3 different analysis criteria (methods A–C); (iii) calculating the percentage cover of marine biofouling organisms and (iv) providing recommendations to industry. Following software training with 857 images, and testing of three platforms, results showed that diversity of the three platforms ranged from low (in the central North Sea) to moderate (in the northern North Sea). The two central North Sea platforms were dominated by the plumose anemone Metridium dianthus; and the northern North Sea platform showed less obvious species domination. Three different analysis criteria were created, where the method of selection of points, number of points assessed and confidence level thresholds (CT) varied: (method A) random selection of 20 points with CT of 80%, (method B) stratified random selection of 50 points with CT of 90% and (method C) a grid approach of 100 points with CT of 90%. Performed across the three platforms, the results showed that there were no significant differences across the majority of species and comparison pairs. No significant difference (across all species) was noted between confirmed annotation methods (A, B and C). It was considered that the software performed well for the classification of the main fouling species in the North Sea. Overall, the study showed that the use of automated image analysis software may enable a more efficient and consistent
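
    The three criteria differ only in how annotation points are drawn and which confidence threshold is applied before an annotation is kept. Below is a minimal sketch of that sampling logic, with hypothetical image dimensions and classifier callbacks; none of CoralNet's internals are reproduced here.

```python
# Point-selection schemes corresponding to methods A-C above, plus a
# percent-cover estimate over confidence-filtered annotations.
import random

WIDTH, HEIGHT = 1920, 1080   # hypothetical image size in pixels

def method_a(n=20):
    """Method A: simple random selection of n points."""
    return [(random.randrange(WIDTH), random.randrange(HEIGHT)) for _ in range(n)]

def method_b(n=50, grid=5):
    """Method B: stratified random, equal points drawn from each grid cell."""
    per_cell, pts = n // (grid * grid), []
    for i in range(grid):
        for j in range(grid):
            for _ in range(per_cell):
                pts.append((random.randrange(i * WIDTH // grid, (i + 1) * WIDTH // grid),
                            random.randrange(j * HEIGHT // grid, (j + 1) * HEIGHT // grid)))
    return pts

def method_c(n=100):
    """Method C: regular grid of ~n points."""
    side = round(n ** 0.5)
    return [(int((i + 0.5) * WIDTH / side), int((j + 0.5) * HEIGHT / side))
            for j in range(side) for i in range(side)]

def percent_cover(points, classify, confidence, ct=0.9):
    """Percentage cover per label, keeping only annotations at/above the CT."""
    kept = [classify(p) for p in points if confidence(p) >= ct]
    return {lab: 100.0 * kept.count(lab) / len(kept) for lab in set(kept)} if kept else {}
```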

  12. Automated titration method for use on blended asphalts

    Science.gov (United States)

    Pauli, Adam T [Cheyenne, WY; Robertson, Raymond E [Laramie, WY; Branthaver, Jan F [Chatham, IL; Schabron, John F [Laramie, WY

    2012-08-07

    A system for determining parameters and compatibility of a substance such as an asphalt or other petroleum substance uses titration to determine one or more flocculation occurrences with high accuracy, and is especially applicable to the determination or use of Heithaus parameters and optimal mixing of various asphalt stocks. In a preferred embodiment, automated titration in an oxygen-gas-exclusive system, further using spectrophotometric analysis (2-8) of solution turbidity, is presented. A reversible titration technique enabling in-situ titration measurement of various solution concentrations is also presented.
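
    One simple way to read a flocculation occurrence out of such spectrophotometric data is to locate the turning point of the turbidity-versus-titrant-volume curve. The sketch below does only that, on hypothetical data; an actual Heithaus determination combines several titrations at different dilutions.

```python
# Locate the titrant volume at which turbidity peaks, taken here as the
# flocculation onset (an illustrative simplification).
import numpy as np

def flocculation_onset(volume_ml, turbidity):
    """Return the titrant volume at the (lightly smoothed) turbidity maximum."""
    smoothed = np.convolve(turbidity, np.ones(5) / 5.0, mode="same")
    return volume_ml[int(np.argmax(smoothed))]

volume = np.linspace(0.0, 10.0, 200)   # hypothetical titration sweep
signal = np.exp(-((volume - 6.2) ** 2)) \
         + 0.02 * np.random.default_rng(1).standard_normal(200)
print(f"onset at ~{flocculation_onset(volume, signal):.2f} mL of titrant")
```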

  13. Tank Farm Operations Surveillance Automation Analysis

    International Nuclear Information System (INIS)

    MARQUEZ, D.L.

    2000-01-01

    The Nuclear Operations Project Services identified the need to improve manual tank farm surveillance data collection, review, distribution and storage practices, often referred to as Operator Rounds. This document provides a feasibility analysis of improving the manual data collection methods by using handheld computer units, barcode technology, a database for storage and acquisition, associated software, and operational procedures to increase the efficiency of Operator Rounds associated with surveillance activities

  14. Comparison of Particulate Mercury Measured with Manual and Automated Methods

    Directory of Open Access Journals (Sweden)

    Rachel Russo

    2011-01-01

    Full Text Available A study was conducted to compare measuring particulate mercury (HgP) with the manual filter method and the automated Tekran system. Simultaneous measurements were conducted with the Tekran and Teflon filter methodologies in the marine and coastal continental atmospheres. Overall, the filter HgP values were on average 21% higher than the Tekran HgP, and >85% of the data fell outside the ±25% region surrounding the 1:1 line. In some cases the filter values were as much as 3-fold greater, with
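
    The two summary statistics quoted above (the average relative bias and the share of pairs outside a ±25% band around the 1:1 line) are simple to compute from paired measurements. A sketch with hypothetical values:

```python
# Compare paired manual-filter and Tekran HgP measurements: mean relative
# bias and fraction of pairs falling outside +/-25% of the 1:1 line.
import numpy as np

filter_hgp = np.array([5.1, 7.9, 3.4, 12.2, 6.5])   # pg/m^3, hypothetical
tekran_hgp = np.array([4.0, 6.8, 3.1,  9.0, 5.2])   # pg/m^3, hypothetical

rel_diff = (filter_hgp - tekran_hgp) / tekran_hgp
print(f"mean bias: {rel_diff.mean():+.0%}")
print(f"outside ±25%: {(np.abs(rel_diff) > 0.25).mean():.0%} of pairs")
```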

  15. Systems and Methods for Automated Water Detection Using Visible Sensors

    Science.gov (United States)

    Rankin, Arturo L. (Inventor); Matthies, Larry H. (Inventor); Bellutta, Paolo (Inventor)

    2016-01-01

    Systems and methods are disclosed that include automated machine vision that can utilize images of scenes captured by a 3D imaging system configured to image light within the visible light spectrum to detect water. One embodiment includes autonomously detecting water bodies within a scene including capturing at least one 3D image of a scene using a sensor system configured to detect visible light and to measure distance from points within the scene to the sensor system, and detecting water within the scene using a processor configured to detect regions within each of the at least one 3D images that possess at least one characteristic indicative of the presence of water.

  16. Automated X-ray image analysis for cargo security: Critical review and future promise.

    Science.gov (United States)

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  17. Automated differentiation of computer models for sensitivity analysis

    International Nuclear Information System (INIS)

    Worley, B.A.

    1991-01-01

    Sensitivity analysis of reactor physics computer models is an established discipline after more than twenty years of active development of generalized perturbation theory based on direct and adjoint methods. Many reactor physics models have been enhanced to solve for sensitivities of model results to model data. The calculated sensitivities are usually normalized first derivatives, although some codes are capable of solving for higher-order sensitivities. The purpose of this paper is to report on the development and application of the GRESS system for automating the implementation of the direct and adjoint techniques into existing FORTRAN computer codes. The GRESS system was developed at ORNL to eliminate the costly, manpower-intensive effort required to implement the direct and adjoint techniques into already-existing FORTRAN codes. GRESS has been successfully tested for a number of codes over a wide range of applications and presently operates on VAX machines under both VMS and UNIX operating systems. (author). 9 refs, 1 tab
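
    GRESS works by source-to-source transformation of FORTRAN, but the underlying idea, carrying derivatives along with ordinary values through an unmodified model, can be illustrated compactly with forward-mode dual numbers. The sketch below is an analogue of the concept only, not of the GRESS implementation.

```python
# Forward-mode differentiation with dual numbers: each value carries its
# derivative, so an ordinary code path yields results and sensitivities.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__

def model(k):
    """Toy model y(k); the same code works for floats and Duals."""
    return 3.0 * k * k + 2.0 * k + 1.0

y = model(Dual(2.0, 1.0))   # seed dk/dk = 1
print(y.val, y.der)         # y(2) = 17.0, dy/dk = 6k + 2 = 14.0
```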

  18. An automated method for identifying artifact in ICA of resting-state fMRI

    Directory of Open Access Journals (Sweden)

    Kaushik eBhaganagarapu

    2013-07-01

    Full Text Available An enduring issue with data-driven analysis and filtering methods is the interpretation of results. To assist, we present an automatic method for identification of artifact in independent components (ICs) derived from functional MRI (fMRI). The method was designed with the following features: it does not require temporal information about an fMRI paradigm; it does not require the user to train the algorithm; it requires only the fMRI images (additional acquisition of anatomical imaging is not required); it is able to identify a high proportion of artifact-related ICs without removing components that are likely to be of neuronal origin; it can be applied to resting-state fMRI; and it is automated, requiring minimal or no human intervention. We applied the method to a MELODIC probabilistic ICA of resting-state functional connectivity data acquired in 50 healthy control subjects, and compared the results to a blinded expert manual classification. The method identified between 26% and 72% of the components as artifact (mean 55%). Only 0.3% of components identified as artifact were discordant with the manual classification; retrospective examination of these ICs suggested the automated method had correctly identified these as artifact. We have developed an effective automated method which removes a substantial number of unwanted noisy components in ICA analyses of resting-state fMRI data. Source code of our implementation of the method is available.

  19. Automated modelling of complex refrigeration cycles through topological structure analysis

    International Nuclear Information System (INIS)

    Belman-Flores, J.M.; Riesco-Avila, J.M.; Gallegos-Munoz, A.; Navarro-Esbri, J.; Aceves, S.M.

    2009-01-01

    We have developed a computational method for analysis of refrigeration cycles. The method is well suited for automated analysis of complex refrigeration systems. The refrigerator is specified through a description of flows representing thermodynamic states at system locations; components that modify the thermodynamic state of a flow; and controls that specify flow characteristics at selected points in the diagram. A system of equations is then established for the refrigerator, based on mass, energy and momentum balances for each of the system components. Controls specify the values of certain system variables, thereby reducing the number of unknowns. It is found that the system of equations for the refrigerator may contain a number of redundant or duplicate equations, and therefore further equations are necessary for a full characterization. The number of additional equations is related to the number of loops in the cycle, and this is calculated by a matrix-based topological method. The methodology is demonstrated through an analysis of a two-stage refrigeration cycle.
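
    The number of additional equations equals the number of independent loops in the cycle diagram, i.e. the cycle rank E - V + C of the flow graph (E flows, V components, C connected pieces). A sketch of that count on a hypothetical two-stage cycle with a flash-tank branch:

```python
# Independent loop count of a component/flow graph via union-find:
# cycle rank = edges - nodes + connected components.
def cycle_rank(nodes, edges):
    parent = {n: n for n in nodes}
    def find(n):
        while parent[n] != n:
            parent[n] = parent[parent[n]]   # path halving
            n = parent[n]
        return n
    for a, b in edges:
        parent[find(a)] = find(b)
    components = len({find(n) for n in nodes})
    return len(edges) - len(nodes) + components

# Hypothetical two-stage cycle; the valve-to-flash branch closes a second loop.
nodes = ["comp1", "flash", "comp2", "cond", "valve", "evap"]
edges = [("comp1", "flash"), ("flash", "comp2"), ("comp2", "cond"),
         ("cond", "valve"), ("valve", "evap"), ("evap", "comp1"),
         ("valve", "flash")]
print(cycle_rank(nodes, edges))   # -> 2 independent loops
```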

  20. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested

  1. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases
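
    Of the statistical methods listed, PCA-based data reduction is representative of the pipeline: binned 1D spectra in, low-dimensional scores out for inspection or clustering. A minimal numpy sketch with hypothetical matrix shapes; Automics itself wraps this and the other eight methods behind its interface.

```python
# PCA scores of binned 1D NMR spectra via SVD of the mean-centered matrix.
import numpy as np

def pca_scores(X, n_components=2):
    """Rows = spectra, columns = chemical-shift bins."""
    Xc = X - X.mean(axis=0)                        # mean-center each bin
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T                # scores for plotting

spectra = np.random.default_rng(0).random((40, 200))  # 40 samples, 200 bins
print(pca_scores(spectra).shape)                      # (40, 2)
```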

  2. Semi-automated method for estimating lesion volumes.

    Science.gov (United States)

    Park, Hyun-Joo; Machado, Andre G; Cooperrider, Jessica; Truong-Furmaga, Havan; Johnson, Matthew; Krishna, Vibhuti; Chen, Zhihong; Gale, John T

    2013-02-15

    Accurately measuring the volume of tissue damage in experimental lesion models is crucial to adequately control for the extent and location of the lesion, variables that can dramatically bias the outcome of preclinical studies. Many of the current commonly used techniques for this assessment, such as measuring the lesion volume with primitive software macros and plotting the lesion location manually using atlases, are time-consuming and offer limited precision. Here we present an easy to use semi-automated computational method for determining lesion volume and location, designed to increase precision and reduce the manual labor required. We compared this novel method to currently used methods and demonstrate that this tool is comparable or superior to current techniques in terms of precision and has distinct advantages with respect to user interface, labor intensiveness and quality of data presentation. Copyright © 2012 Elsevier B.V. All rights reserved.

  3. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  4. Automated methods for the summarization of electronic health records.

    Science.gov (United States)

    Pivovarov, Rimma; Elhadad, Noémie

    2015-09-01

    This review examines work on automated summarization of electronic health record (EHR) data and in particular, individual patient record summarization. We organize the published research and highlight methodological challenges in the area of EHR summarization implementation. The target audience for this review includes researchers, designers, and informaticians who are concerned about the problem of information overload in the clinical setting as well as both users and developers of clinical summarization systems. Automated summarization has been a long-studied subject in the fields of natural language processing and human-computer interaction, but the translation of summarization and visualization methods to the complexity of the clinical workflow is slow moving. We assess work in aggregating and visualizing patient information with a particular focus on methods for detecting and removing redundancy, describing temporality, determining salience, accounting for missing data, and taking advantage of encoded clinical knowledge. We identify and discuss open challenges critical to the implementation and use of robust EHR summarization systems. © The Author 2015. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved.

  5. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    During the formulative period of structural analysis by matrix methods, earnest research was directed to automate the force ... (1973) for the analysis of discrete and continuous systems. IFM is a force method of .... (Nagabhushanam & Patnaik 1989) are being developed, which helps the use of efficient solution techniques for ...

  6. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.

  7. Feasibility of estimation of brain volume and 2-deoxy-2-(18)F-fluoro-D-glucose metabolism using a novel automated image analysis method: application in Alzheimer's disease.

    Science.gov (United States)

    Musiek, Erik S; Saboury, Babak; Mishra, Shipra; Chen, Yufen; Reddin, Janet S; Newberg, Andrew B; Udupa, Jayaram K; Detre, John A; Hofheinz, Frank; Torigian, Drew; Alavi, Abass

    2012-01-01

    The development of clinically-applicable quantitative methods for the analysis of brain fluorine-18 fluorodeoxyglucose positron emission tomography ((18)F-FDG-PET) images is a major area of research in many neurologic diseases, particularly Alzheimer's disease (AD). Region of interest visualization, evaluation, and image registration (ROVER) is a novel commercially-available software package which provides automated partial-volume-corrected measures of volume and glucose uptake from (18)F-FDG PET data. We performed a pilot study of ROVER analysis of brain (18)F-FDG PET images for the first time in a small cohort of patients with AD and controls. Brain (18)F-FDG-PET and volumetric magnetic resonance imaging (MRI) were performed on 14 AD patients and 18 age-matched controls. Images were subjected to ROVER analysis, and voxel-based analysis using SPM5. Volumes by ROVER were 35% lower than MRI volumes in AD patients (as hypometabolic regions were excluded in ROVER-derived volume measurement), while average ROVER- and MRI-derived cortical volumes were nearly identical in the control population. ROVER-derived whole brain volumes and whole brain metabolic volumetric products (MVP) were significantly lower in AD and accurately distinguished AD patients from controls (Area Under the Curve (AUC) of Receiver Operator Characteristic (ROC) curves 0.89 and 0.86, respectively). This diagnostic accuracy was similar to voxel-based analyses. Analysis by ROVER of (18)F-FDG-PET images provides a unique index of metabolically-active brain volume, and can accurately distinguish between AD patients and controls as a proof of concept. In conclusion, our findings suggest that ROVER may serve as a useful quantitative adjunct to visual or regional assessment and aid analysis of whole-brain metabolism in AD and other neurologic and psychiatric diseases.
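
    The metabolic volumetric product combines the two ROVER outputs (metabolically active volume and mean uptake), and the reported AUCs summarize how well the resulting scores separate groups. A sketch with hypothetical values; ROVER's partial volume correction is not reproduced here.

```python
# MVP = metabolically active volume x mean uptake, plus a rank-based AUC.

def mvp(volume_ml, mean_suv):
    return volume_ml * mean_suv

def roc_auc(scores_pos, scores_neg):
    """P(random positive scores below random negative), so that lower
    MVP in the patient group gives AUC > 0.5."""
    pairs = [(p < n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg]
    return sum(pairs) / len(pairs)

# Hypothetical (volume mL, mean SUV) pairs for AD patients and controls.
ad  = [mvp(v, s) for v, s in [(620, 4.1), (580, 3.8), (640, 4.4)]]
ctl = [mvp(v, s) for v, s in [(910, 5.6), (870, 5.9), (950, 5.4)]]
print(f"AUC = {roc_auc(ad, ctl):.2f}")
```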

  8. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  9. Automated immunohistochemical method to analyze large areas of the human cortex.

    Science.gov (United States)

    Abbass, Mohamad; Trought, Kathleen; Long, David; Semechko, Anton; Wong, Albert H C

    2018-01-15

    There have been inconsistencies in the histological abnormalities found in the cerebral cortex from patients with schizophrenia, bipolar disorder and major depression. Discrepancies in previously published reports may arise from small sample sizes, inconsistent methodology and biased cell counting. We applied automated quantification of neuron density, neuron size and cortical layer thickness in large regions of the cerebral cortex in psychiatric patients. This method accurately segments DAPI positive cells that are also stained with CUX2 and FEZF2. Cortical layer thickness, neuron density and neuron size were automatically computed for each cortical layer in numerous Brodmann areas. We did not find pronounced cytoarchitectural abnormalities in the anterior cingulate cortex or orbitofrontal cortex in patients with schizophrenia, bipolar disorder or major depressive disorder. There were no significant differences in layer thickness measured in immunohistochemically stained slides compared with traditional Nissl stained slides. Automated cell counts were correlated, reliable and consistent with manual counts, while being much less time-consuming. We demonstrate the validity of using a novel automated analysis approach to post-mortem brain tissue. We were able to analyze large cortical areas and quantify specific cell populations using immunohistochemical markers. Future analyses could benefit from efficient automated analysis. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Low resolution spectroscopic investigation of Am stars using Automated method

    Science.gov (United States)

    Sharma, Kaushal; Joshi, Santosh; Singh, Harinder P.

    2018-04-01

    The automated method of full spectrum fitting gives reliable estimates of stellar atmospheric parameters (Teff, log g and [Fe/H]) for late A, F, G, and early K type stars. Recently, the technique was further improved in the cooler regime and the validity range was extended up to a spectral type of M6-M7 (Teff ~ 2900 K). The present study aims to explore the application of this method to the low-resolution spectra of Am stars, a class of chemically peculiar stars, to examine its robustness for these objects. We use ULySS with the Medium-resolution INT Library of Empirical Spectra (MILES) V2 spectral interpolator for parameter determination. The determined Teff and log g values are found to be in good agreement with those obtained from high-resolution spectroscopy.

  11. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Seung Min; Seong, Poong Hyun; Kim, Jong Hyun

    2014-01-01

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects, such as greater efficiency and fewer human errors, and a negative effect known as out-of-the-loop (OOTL). Thus, before introducing automation into the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, by focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the suggested cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduced amount of human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested automation rate estimation method. It is expected to derive an appropriate inclusion proportion of the automation system, avoiding the OOTL problem and having maximum efficacy at the same time

  12. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whiteman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-11-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of the plasma traces has been processed with this technique
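
    The ''twicing'' step referenced above smooths the series once, then smooths the residuals and adds them back, restoring signal that a single pass flattens. A minimal sketch, with a moving average standing in for whatever smoother is actually used:

```python
# Tukey's twicing: y_hat = smooth(y) + smooth(y - smooth(y)).
import numpy as np

def smooth(y, w=5):
    """Simple moving average; window length w is hypothetical."""
    return np.convolve(y, np.ones(w) / w, mode="same")

def twice(y, w=5):
    """Smooth, then add back the smoothed residuals (despiking pass)."""
    first = smooth(y, w)
    return first + smooth(y - first, w)
```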

  13. Automated computer analysis of plasma-streak traces from SCYLLAC

    International Nuclear Information System (INIS)

    Whitman, R.L.; Jahoda, F.C.; Kruger, R.P.

    1977-01-01

    An automated computer analysis technique that locates and references the approximate centroid of single- or dual-streak traces from the Los Alamos Scientific Laboratory SCYLLAC facility is described. The technique also determines the plasma-trace width over a limited self-adjusting region. The plasma traces are recorded with streak cameras on Polaroid film, then scanned and digitized for processing. The analysis technique uses scene segmentation to separate the plasma trace from a reference fiducial trace. The technique employs two methods of peak detection; one for the plasma trace and one for the fiducial trace. The width is obtained using an edge-detection, or slope, method. Timing data are derived from the intensity modulation of the fiducial trace. To smooth (despike) the output graphs showing the plasma-trace centroid and width, a technique of ''twicing'' developed by Tukey was employed. In addition, an interactive sorting algorithm allows retrieval of the centroid, width, and fiducial data from any test shot plasma for post analysis. As yet, only a limited set of sixteen plasma traces has been processed using this technique

  14. Automation of reactor neutron activation analysis

    International Nuclear Information System (INIS)

    Pavlov, S.S.; Dmitriev, A.Yu.; Frontasyeva, M.V.

    2013-01-01

    The present status of the development of a software package designed for automation of NAA at the IBR-2 reactor of FLNP, JINR, Dubna, is reported. Following decisions adopted at the CRP Meeting in Delft, August 27-31, 2012, the missing tool - a sample changer - will be installed for NAA in compliance with the peculiar features of the radioanalytical laboratory REGATA at the IBR-2 reactor. The details of the design are presented. The software for operation with the sample changer consists of two parts. The first part is a user interface and the second one is a program to control the sample changer. The second part will be developed after installing the tool.

  15. Automated sensitivity analysis using the GRESS language

    International Nuclear Information System (INIS)

    Pin, F.G.; Oblow, E.M.; Wright, R.Q.

    1986-04-01

    An automated procedure for performing large-scale sensitivity studies based on the use of computer calculus is presented. The procedure is embodied in a FORTRAN precompiler called GRESS, which automatically processes computer models and adds derivative-taking capabilities to the normal calculated results. In this report, the GRESS code is described, tested against analytic and numerical test problems, and then applied to a major geohydrological modeling problem. The SWENT nuclear waste repository modeling code is used as the basis for these studies. Results for all problems are discussed in detail. Conclusions are drawn as to the applicability of GRESS in the problems at hand and for more general large-scale modeling sensitivity studies

  16. Automated computation of autonomous spectral submanifolds for nonlinear modal analysis

    Science.gov (United States)

    Ponsioen, Sten; Pedergnana, Tiemo; Haller, George

    2018-04-01

    We discuss an automated computational methodology for computing two-dimensional spectral submanifolds (SSMs) in autonomous nonlinear mechanical systems of arbitrary degrees of freedom. In our algorithm, SSMs, the smoothest nonlinear continuations of modal subspaces of the linearized system, are constructed up to arbitrary orders of accuracy, using the parameterization method. An advantage of this approach is that the construction of the SSMs does not break down when the SSM folds over its underlying spectral subspace. A further advantage is an automated a posteriori error estimation feature that enables a systematic increase in the orders of the SSM computation until the required accuracy is reached. We find that the present algorithm provides a major speed-up, relative to numerical continuation methods, in the computation of backbone curves, especially in higher-dimensional problems. We illustrate the accuracy and speed of the automated SSM algorithm on lower- and higher-dimensional mechanical systems.

  17. Accurate automated apnea analysis in preterm infants.

    Science.gov (United States)

    Vergales, Brooke D; Paget-Brown, Alix O; Lee, Hoshik; Guin, Lauren E; Smoot, Terri J; Rusin, Craig G; Clark, Matthew T; Delos, John B; Fairchild, Karen D; Lake, Douglas E; Moorman, Randall; Kattwinkel, John

    2014-02-01

    In 2006 the apnea of prematurity (AOP) consensus group identified inaccurate counting of apnea episodes as a major barrier to progress in AOP research. We compare nursing records of AOP to events detected by a clinically validated computer algorithm that detects apnea from standard bedside monitors. Waveform, vital sign, and alarm data were collected continuously from all very low-birth-weight infants admitted over a 25-month period, analyzed for central apnea, bradycardia, and desaturation (ABD) events, and compared with nursing documentation collected from charts. Our algorithm defined apnea as a respiratory pause > 10 seconds accompanied by bradycardia and desaturation. Of the 3,019 nurse-recorded events, only 68% had any algorithm-detected ABD event. Of the 5,275 algorithm-detected prolonged apnea events > 30 seconds, only 26% had nurse-recorded documentation within 1 hour. Monitor alarms sounded in only 74% of algorithm-detected prolonged apnea events > 10 seconds. There were 8,190,418 monitor alarms of any description throughout the neonatal intensive care unit during the 747 days analyzed, or one alarm every 2 to 3 minutes per nurse. An automated computer algorithm for continuous ABD quantitation is a far more reliable tool than the medical record to address the important research questions identified by the 2006 AOP consensus group.
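
    The event definition above is conjunctive: a pause counts only when the apnea duration, heart rate, and oxygen saturation criteria all hold at once. A sketch of that screening on aligned 1 Hz vital-sign series; the thresholds and sampling here are hypothetical simplifications of the validated algorithm.

```python
# Flag samples where apnea, bradycardia and desaturation coincide.
import numpy as np

def abd_events(pause_s, heart_rate, spo2,
               apnea_s=10, brady_bpm=100, desat_pct=85):
    """Indices (at 1 Hz) satisfying all three ABD criteria simultaneously.

    pause_s: running length of the current respiratory pause, seconds
    heart_rate: beats per minute; spo2: oxygen saturation, percent
    """
    return np.flatnonzero((pause_s > apnea_s) &
                          (heart_rate < brady_bpm) &
                          (spo2 < desat_pct))
```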

  18. Early detection of pharmacovigilance signals with automated methods based on false discovery rates: a comparative study.

    Science.gov (United States)

    Ahmed, Ismaïl; Thiessard, Frantz; Miremont-Salamé, Ghada; Haramburu, Françoise; Kreft-Jais, Carmen; Bégaud, Bernard; Tubert-Bitter, Pascale

    2012-06-01

    Improving the detection of drug safety signals has led several pharmacovigilance regulatory agencies to incorporate automated quantitative methods into their spontaneous reporting management systems. The three largest worldwide pharmacovigilance databases are routinely screened by the lower bound of the 95% confidence interval of proportional reporting ratio (PRR₀₂.₅), the 2.5% quantile of the Information Component (IC₀₂.₅) or the 5% quantile of the Gamma Poisson Shrinker (GPS₀₅). More recently, Bayesian and non-Bayesian False Discovery Rate (FDR)-based methods were proposed that address the arbitrariness of thresholds and allow for a built-in estimate of the FDR. These methods were also shown through simulation studies to be interesting alternatives to the currently used methods. The objective of this work was twofold. Based on an extensive retrospective study, we compared PRR₀₂.₅, GPS₀₅ and IC₀₂.₅ with two FDR-based methods derived from the Fisher's exact test and the GPS model (GPS(pH0) [posterior probability of the null hypothesis H₀ calculated from the Gamma Poisson Shrinker model]). Secondly, restricting the analysis to GPS(pH0), we aimed to evaluate the added value of using automated signal detection tools compared with 'traditional' methods, i.e. non-automated surveillance operated by pharmacovigilance experts. The analysis was performed sequentially, i.e. every month, and retrospectively on the whole French pharmacovigilance database over the period 1 January 1996-1 July 2002. Evaluation was based on a list of 243 reference signals (RSs) corresponding to investigations launched by the French Pharmacovigilance Technical Committee (PhVTC) during the same period. The comparison of detection methods was made on the basis of the number of RSs detected as well as the time to detection. Results comparing the five automated quantitative methods were in favour of GPS(pH0) in terms of both number of detections of true signals and
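
    The non-Bayesian FDR variant can be sketched as Fisher exact p-values on per-pair 2x2 contingency tables followed by a Benjamini-Hochberg cut-off. The code below illustrates only that skeleton; the GPS-based quantities (GPS₀₅, GPS(pH0)) are not reproduced.

```python
# FDR-controlled disproportionality screening: Fisher's exact test per
# drug-event pair, then the Benjamini-Hochberg step-up rule.
from scipy.stats import fisher_exact

def bh_signals(tables, fdr=0.05):
    """tables: dict name -> 2x2 counts [[a, b], [c, d]] for a drug-event pair
    (a = reports with both drug and event). Returns names flagged at the FDR."""
    pvals = {name: fisher_exact(t, alternative="greater")[1]
             for name, t in tables.items()}
    ranked = sorted(pvals.items(), key=lambda kv: kv[1])
    m, keep = len(ranked), []
    for i, (name, p) in enumerate(ranked, start=1):
        if p <= fdr * i / m:
            keep = ranked[:i]        # BH: largest rank i passing the bound
    return [name for name, _ in keep]

# Hypothetical counts for two drug-event pairs.
print(bh_signals({"drugX-eventY": [[30, 200], [40, 9000]],
                  "drugZ-eventY": [[2, 300], [35, 9000]]}))
```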

  19. Granulometric profiling of aeolian dust deposits by automated image analysis

    Science.gov (United States)

    Varga, György; Újvári, Gábor; Kovács, János; Jakab, Gergely; Kiss, Klaudia; Szalai, Zoltán

    2016-04-01

    Determination of granulometric parameters is of growing interest in the Earth sciences. Particle size data of sedimentary deposits provide insights into the physicochemical environment of transport, accumulation and post-depositional alterations of sedimentary particles, and are important proxies applied in paleoclimatic reconstructions. This is especially true for aeolian dust deposits, whose fairly narrow grain size range is a consequence of the extremely selective nature of wind sediment transport. Therefore, various aspects of aeolian sedimentation (wind strength, distance to source(s), possible secondary source regions and modes of sedimentation and transport) can be reconstructed only from precise grain size data. As terrestrial wind-blown deposits are among the most important archives of past environmental changes, proper explanation of the proxy data is a mandatory issue. Automated imaging provides a unique technique to gather direct information on granulometric characteristics of sedimentary particles. Automated image analysis with the Malvern Morphologi G3-ID is a new and rarely applied technique for particle size and shape analyses in sedimentary geology. Size and shape data of several hundred thousand (or even million) individual particles were automatically recorded in this study from 15 loess and paleosoil samples from the captured high-resolution images. Several size parameters (e.g. circle-equivalent diameter, major axis, length, width, area) and shape parameters (e.g. elongation, circularity, convexity) were calculated by the instrument software. At the same time, the mean light intensity after transmission through each particle is automatically collected by the system as a proxy for the optical properties of the material. Intensity values depend on the chemical composition and/or thickness of the particles. The results of the automated imaging were compared to particle size data determined by three different laser diffraction instruments
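
    The size and shape parameters named above map onto simple geometry: for a particle of projected area A and perimeter P, the circle-equivalent diameter is 2*sqrt(A/pi), and circularity is commonly taken as 4*pi*A/P^2. A sketch of those derivations (the instrument software's exact definitions may differ):

```python
# Circle-equivalent diameter and a common circularity definition.
import math

def ce_diameter(area):
    """Diameter of the circle with the same projected area as the particle."""
    return 2.0 * math.sqrt(area / math.pi)

def circularity(area, perimeter):
    """4*pi*A / P^2: 1.0 for a perfect circle, smaller for irregular grains."""
    return 4.0 * math.pi * area / perimeter ** 2

print(ce_diameter(100.0))          # area in um^2 -> diameter in um
print(circularity(100.0, 40.0))    # dimensionless, ~0.785 here
```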

  20. Advancements in Automated Circuit Grouping for Intellectual Property Trust Analysis

    Science.gov (United States)

    2017-03-20

    Advancements in Automated Circuit Grouping for Intellectual Property Trust Analysis, by James Inge, Matthew Kwiec, Stephen Baka, John Hallman. ...module, a custom on-chip memory module, a custom arithmetic logic unit module, and a custom Ethernet frame check sequence generator module. Though

  1. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla

    2016-01-01

    PURPOSE: The aim of this study was to develop an automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic...

  2. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  3. EPA Method 245.2: Mercury (Automated Cold Vapor Technique)

    Science.gov (United States)

    Method 245.2 describes procedures for preparation and analysis of drinking water samples for analysis of mercury using acid digestion and cold vapor atomic absorption. Samples are prepared using an acid digestion technique.

  4. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
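
    The bookkeeping the Automator replaces is essentially templating: one input file per assembly, generated from tabulated core and pool histories. A toy sketch of that scripting burden follows; the keywords in the template are placeholders, not actual ORIGAMI input syntax.

```python
# Generate one input file per assembly record from a simple template.
from pathlib import Path

TEMPLATE = """assembly={aid}
enrichment={enr}
heavy_metal_kg={hm}
cycles={cycles}
"""

# Hypothetical per-assembly bookkeeping data.
assemblies = [
    {"aid": "A01", "enr": 4.2, "hm": 460.0, "cycles": "C12,C13"},
    {"aid": "A02", "enr": 3.8, "hm": 455.0, "cycles": "C13,C14"},
]

outdir = Path("origami_inputs")
outdir.mkdir(exist_ok=True)
for a in assemblies:
    (outdir / f"{a['aid']}.inp").write_text(TEMPLATE.format(**a))
print(f"wrote {len(assemblies)} input files to {outdir}/")
```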

  5. Development and validation of an automated, microscopy-based method for enumeration of groups of intestinal bacteria

    NARCIS (Netherlands)

    Jansen, GJ; Wildeboer-Veloo, ACM; Tonk, RHJ; Franks, AH; Welling, G

    An automated microscopy-based method using fluorescently labelled 16S rRNA-targeted oligonucleotide probes directed against the predominant groups of intestinal bacteria was developed and validated. The method makes use of the Leica 600HR image analysis system, a Kodak MegaPlus camera model 1.4 and

  6. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.

  7. ESG: extended similarity group method for automated protein function prediction.

    Science.gov (United States)

    Chitale, Meghana; Hawkins, Troy; Park, Changsoon; Kihara, Daisuke

    2009-07-15

    Importance of accurate automatic protein function prediction is ever increasing in the face of a large number of newly sequenced genomes and proteomics data that are awaiting biological interpretation. Conventional methods have focused on annotation transfer based on high sequence similarity, which relies on the concept of homology. However, many cases have been reported in which simple transfer of function from the top hits of a homology search causes erroneous annotation. New methods are required that handle sequence similarity in a more robust way, combining signals from strongly and weakly similar proteins to effectively predict function for unknown proteins with high reliability. We present the extended similarity group (ESG) method, which performs iterative sequence database searches and annotates a query sequence with Gene Ontology terms. Each annotation is assigned a probability based on its relative similarity score with the multiple-level neighbors in the protein similarity graph. We depict how the statistical framework of ESG improves prediction accuracy by iteratively taking into account the neighborhood of the query protein in the sequence similarity space. ESG outperforms conventional PSI-BLAST and the protein function prediction (PFP) algorithm. It is found that the iterative search is effective in capturing multiple domains in a query protein, enabling it to accurately predict several functions which originate from different domains. The ESG web server is available for automated protein function prediction at http://dragon.bio.purdue.edu/ESG/.

  8. Automated logic conversion method for plant controller systems

    International Nuclear Information System (INIS)

    Wada, Yutaka; Kobayashi, Yasuhiro; Miyo, Tsunemasa; Okano, Masato.

    1990-01-01

    An automated method is proposed for logic conversion from functional description diagrams to detailed logic schematics by incorporating expert knowledge of plant controller systems design. The method uses connection data of function elements in the functional description diagram as input, synthesizes a detailed logic structure by adding elements to the given connection data incrementally, and generates detailed logic schematics. In logic synthesis, for building up complex synthesis procedures by combining generally-described knowledge, knowledge is applied in groups. The search order of the groups is given by upper-level knowledge. Furthermore, the knowledge is expressed in terms of two classes of rules; one for generating a hypothesis of individual synthesis operations and the other for considering several hypotheses to determine the connection ordering of elements to be added. In the generation of detailed logic schematics, knowledge is used as rules for deriving various kinds of layout conditions on schematics, and rules for generating two-dimensional coordinates of layout objects. Rules in the latter class use layout conditions to predict intersections among layout objects without their coordinates being fixed. The effectiveness of the method with 150 rules was verified by its experimental application to some logic conversions in a real power plant design. Evaluation of the results showed them to be equivalent to those obtained by well qualified designers. (author)

  9. Sunglass detection method for automation of video surveillance system

    Science.gov (United States)

    Sikandar, Tasriva; Samsudin, Wan Nur Azhani W.; Hawari Ghazali, Kamarul; Mohd, Izzeldin I.; Fazle Rabbi, Mohammad

    2018-04-01

    Wearing sunglasses to hide the face from surveillance cameras is a common activity in criminal incidents. Therefore, sunglass detection from surveillance video has become a demanding issue in the automation of security systems. In this paper we propose an image processing method to detect sunglasses in surveillance images. Specifically, a unique feature using facial height and width has been employed to identify the covered region of the face. The presence of an area covered by sunglasses is evaluated using the facial height-width ratio, and a threshold value of the covered-area percentage is used to classify a glass-wearing face. Two different types of glasses have been considered, i.e. eyeglasses and sunglasses. The results of this study demonstrate that the proposed method is able to detect sunglasses under two different illumination conditions, i.e. room illumination as well as in the presence of sunlight. In addition, due to the multi-level checking of the facial region, this method has 100% accuracy in detecting sunglasses. However, in an exceptional case where fabric surrounding the face has a similar color to skin, the correct detection rate was found to be 93.33% for eyeglasses.
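
    The covered-region test can be sketched as a ratio: occluded pixels in an assumed upper-face band over that band's area, compared against a threshold. The band fraction and the 50% threshold below are hypothetical, not the paper's calibrated values.

```python
# Covered-area fraction of an assumed eye band of the detected face region.

def sunglass_score(face_h, face_w, covered_px):
    """Covered pixels in the upper face band as a fraction of its area."""
    eye_band_area = face_w * (face_h * 0.35)   # assumed upper-face band
    return covered_px / eye_band_area

def wears_sunglass(face_h, face_w, covered_px, threshold=0.5):
    """Classify as sunglass-wearing when the band is mostly occluded."""
    return sunglass_score(face_h, face_w, covered_px) >= threshold

print(wears_sunglass(180, 140, 5500))   # hypothetical face box and pixel count
```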

  10. Comparative analysis of automation of production process with industrial robots in Asia/Australia and Europe

    Directory of Open Access Journals (Sweden)

    I. Karabegović

    2017-01-01

    Full Text Available The term "INDUSTRY 4.0" or "fourth industrial revolution" was first introduced at the Hannover Fair in 2011. It comes from the high-tech strategy of the German Federal Government, which promotes the computerization of automation towards complete smart automation, meaning the introduction of self-configuration, self-diagnosis and problem fixing, and knowledge-based, intelligent decision-making. Any automation, including smart automation, cannot be imagined without industrial robots. Along with the fourth industrial revolution, a "robotic revolution" is taking place in Japan. The robotic revolution refers to the research and development of robotic technology with the aim of using robots in all production processes, and of using robots in real life, in the service of man in daily living. With these facts in mind, we analysed how widely industrial robots are used in production processes on the two continents of Europe and Asia/Australia, and investigated whether industry is ready for the introduction of intelligent automation with the goal of establishing future smart factories. The paper presents the automation of production processes in Europe and Asia/Australia, with predictions for the future.

  11. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram, by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
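
    The first step, locating high particle density in the space-time domain, can be sketched as local-maximum detection on a smoothed 2D histogram; the bin counts, smoothing, and thresholds below are illustrative assumptions rather than the paper's settings:

      import numpy as np
      from scipy import ndimage

      def density_maxima(x, t, bins=256, min_count=50):
          """Find local maxima of particle density in the space-time plane."""
          hist, xedges, tedges = np.histogram2d(x, t, bins=bins)
          smooth = ndimage.gaussian_filter(hist, sigma=2)      # suppress noise
          peaks = smooth == ndimage.maximum_filter(smooth, size=9)
          peaks &= smooth > min_count                          # drop weak extrema
          ix, it = np.nonzero(peaks)
          return xedges[ix], tedges[it]        # approximate peak coordinates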

  12. Automated GPR Rebar Analysis for Robotic Bridge Deck Evaluation.

    Science.gov (United States)

    Kaur, Parneet; Dana, Kristin J; Romero, Francisco A; Gucunski, Nenad

    2016-10-01

    Ground penetrating radar (GPR) is used to evaluate deterioration of reinforced concrete bridge decks based on measuring signal attenuation from embedded rebar. The existing methods for obtaining deterioration maps from GPR data often require manual interaction and offsite processing. In this paper, a novel algorithm is presented for automated rebar detection and analysis. We test the process with comprehensive measurements obtained using a state-of-the-art robotic bridge inspection system equipped with GPR sensors. The algorithm achieves robust performance by integrating machine learning classification using image-based gradient features with robust curve fitting of the rebar hyperbolic signature. The approach avoids edge detection, thresholding, and template matching, which require manual tuning and are known to perform poorly in the presence of noise and outliers. The detected hyperbolic signatures of rebars within the bridge deck are used to generate deterioration maps of the bridge deck. The results of the rebar region detector are compared quantitatively with several methods of image-based classification, and a significant performance advantage is demonstrated. High rates of accuracy are reported on real data that include thousands of individual hyperbolic rebar signatures from three real bridge decks.
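
    The hyperbolic signature of a rebar in a GPR B-scan follows a simple two-way travel-time model that can be fit to picked points. The sketch below uses plain least squares for clarity, whereas the paper stresses robust fitting in the presence of outliers:

      import numpy as np
      from scipy.optimize import curve_fit

      def hyperbola(x, x0, t0, v):
          """Two-way travel time for a point reflector at lateral position x0,
          apex time t0, and effective wave speed v (consistent units)."""
          return np.sqrt(t0 ** 2 + (2.0 * (x - x0) / v) ** 2)

      def fit_rebar(x_picks, t_picks):
          p0 = (np.median(x_picks), float(np.min(t_picks)), 0.1)  # crude start
          popt, _ = curve_fit(hyperbola, x_picks, t_picks, p0=p0)
          return dict(zip(("x0", "t0", "v"), popt))

    Where outliers dominate, a robust loss (e.g., scipy.optimize.least_squares with loss="soft_l1") would be the natural substitute, in line with the paper's emphasis on robust estimation.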

  13. Automated quantitative analysis of coordinated locomotor behaviour in rats.

    Science.gov (United States)

    Tanger, H J; Vanwersch, R A; Wolthuis, O L

    1984-03-01

    Disturbances of motor coordination are usually difficult to quantify. Therefore, a method was developed for the automated quantitative analysis of the movements of the dyed paws of stepping rats, registered by a colour TV camera. The signals from the TV-video system were converted by an electronic interface into voltages proportional to the X- and Y-coordinates of the paws, from which a desktop computer calculated the movements of these paws in time and distance. Application 1 analysed the steps of a rat walking in a hollow rotating wheel. The results showed low variability of the walking pattern; the method was insensitive to low doses of alcohol but suitable for quantifying overt (e.g., neurotoxic) locomotor disturbances or recovery thereof. In application 2, hurdles were placed in a similar hollow wheel and the rats were trained to step from the top of one hurdle to another. Physostigmine-induced disturbances of this acquired complex motor task could be detected at doses far below those that cause overt symptoms.

  14. [Clinical application of automated digital image analysis for morphology review of peripheral blood leukocyte].

    Science.gov (United States)

    Xing, Ying; Yan, Xiaohua; Pu, Chengwei; Shang, Ke; Dong, Ning; Wang, Run; Wang, Jianzhong

    2016-03-01

    To explore the clinical application of automated digital image analysis in leukocyte morphology examination when the review criteria of a hematology analyzer are triggered. The reference range of leukocyte differentiation by automated digital image analysis was established by analyzing 304 healthy blood samples from Peking University First Hospital. Six hundred and ninety-seven blood samples from Peking University First Hospital were randomly collected from November 2013 to April 2014; complete blood counts were performed on a hematology analyzer, and blood smears were made and stained at the same time. Blood smears were examined by the automated digital image analyzer and the results were checked (reclassified) by a staff member with extensive morphology experience. The same smears were examined manually by microscope. The results of manual microscopic differentiation were used as the "gold standard", and the diagnostic efficiency of automated digital image analysis for abnormal specimens was calculated, including sensitivity, specificity and accuracy. The difference in abnormal leukocytes detected by the two methods was analyzed in 30 samples of hematological and infectious diseases. The specificity of identifying white blood cell abnormalities by automated digital image analysis was more than 90% except for monocytes. The sensitivity for neutrophil toxic abnormalities (including Döhle bodies, toxic granulation and vacuolization) was 100%; the sensitivities for blast cells, immature granulocytes and atypical lymphocytes were 91.7%, 60% to 81.5% and 61.5%, respectively. The sensitivity of the leukocyte differential count was 91.8% for neutrophils, 88.5% for lymphocytes, 69.1% for monocytes, 78.9% for eosinophils and 36.3% for basophils. The positive rates of recognizing abnormal cells (blasts, immature granulocytes and atypical lymphocytes) by the manual microscopic method were 46.7%, 53.3% and 10%, respectively. The positive rates of automated digital image analysis were 43.3%, 60% and 10%, respectively. There was no statistic
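
    For reference, the reported sensitivity, specificity and accuracy reduce to simple ratios of confusion-matrix counts against the manual "gold standard"; a generic helper (not the study's software):

      def diagnostic_performance(tp, fp, tn, fn):
          """Diagnostic efficiency from confusion-matrix counts, taking the
          manual microscopic differential as the gold standard."""
          return {
              "sensitivity": tp / (tp + fn),  # abnormal smears correctly flagged
              "specificity": tn / (tn + fp),  # normal smears correctly passed
              "accuracy": (tp + tn) / (tp + fp + tn + fn),
          }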

  15. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation. The presented pipeline deals with i) estimation of the mid-sagittal plane, ii) localisation and registration of the corpus callosum, iii) parameterisation and representation of its contour, and iv) means of standardising the traditional reference area measurements.

  16. Method and system for assigning a confidence metric for automated determination of optic disc location

    Science.gov (United States)

    Karnowski, Thomas P. [Knoxville, TN]; Tobin, Kenneth W., Jr.; Muthusamy Govindasamy, Vijaya Priya [Knoxville, TN]; Chaum, Edward [Memphis, TN]

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value, and a low confidence level can be assigned if the accuracy parameter is greater than the primary risk cut-off value, the primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis, by the automated technique, of a disease having retinal manifestations.
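
    The record leaves the accuracy parameter abstract; one plausible reading (an assumption, not the claimed formula) is the disagreement between the two independently determined coordinate sets:

      import math

      def confidence_level(coords_a, coords_b, risk_cutoff_px=25.0):
          """Assign confidence from the disagreement of two optic-disc detectors.

          coords_a, coords_b: (x, y) locations from two different techniques
          risk_cutoff_px: hypothetical cut-off encoding an acceptable
                          misdiagnosis risk
          """
          accuracy_param = math.dist(coords_a, coords_b)  # large gap -> low trust
          return "high" if accuracy_param < risk_cutoff_px else "low"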

  17. Recent developments in the dissolution and automated analysis of plutonium and uranium for safeguards measurements

    International Nuclear Information System (INIS)

    Jackson, D.D.; Marsh, S.F.; Rein, J.E.; Waterbury, G.R.

    1976-01-01

    The status of a programme to develop assay methods for plutonium and uranium for safeguards purposes is presented. The current effort is directed more towards analyses of scrap-type material, with the end goal of precise automated methods that will also be applicable to product materials. The guiding philosophy for the analysis of scrap-type materials, characterized by heterogeneity and difficult dissolution, is a relatively fast dissolution treatment to achieve 90% or more solubilization of the uranium and plutonium, analysis of the soluble fraction by precise automated methods, and gamma-counting assay of any residue fraction using simple techniques. A Teflon-container metal-shell apparatus provides acid dissolutions of typical fuel-cycle materials at temperatures up to 275 °C and pressures up to 340 atm. Gas-solid reactions at elevated temperatures show promise for separating uranium from refractory materials through the formation of volatile uranium compounds. The condensed compounds are then dissolved in acid for subsequent analysis. An automated spectrophotometer has been placed in operation for the determination of uranium and plutonium. The measurement range is 1 to 14 mg of either element, with a relative standard deviation of 0.5% over most of the range. The throughput rate is 5 min per sample. A second-generation automated instrument, which will use a precise and specific electroanalytical method as its operational basis, is being developed for the determination of plutonium. (author)

  18. Recent developments in the dissolution and automated analysis of plutonium and uranium for safeguards measurements

    International Nuclear Information System (INIS)

    Jackson, D.D.; Marsh, S.F.; Rein, J.E.; Waterbury, G.R.

    1975-01-01

    The status of a program to develop assay methods for plutonium and uranium for safeguards purposes is presented. The current effort is directed more toward analyses of scrap-type material with an end goal of precise automated methods that also will be applicable to product materials. A guiding philosophy for the analysis of scrap-type materials, characterized by heterogeneity and difficult dissolution, is relatively fast dissolution treatment to effect 90 percent or more solubilization of the uranium and plutonium, analysis of the soluble fraction by precise automated methods, and gamma-counting assay of any residue fraction using simple techniques. A Teflon-container metal-shell apparatus provides acid dissolutions of typical fuel-cycle materials at temperatures to 275 °C and pressures to 340 atm. Gas-solid reactions at elevated temperatures separate uranium from refractory materials by the formation of volatile uranium compounds. The condensed compounds then are dissolved in acid for subsequent analysis. An automated spectrophotometer is used for the determination of uranium and plutonium. The measurement range is 1 to 14 mg of either element with a relative standard deviation of 0.5 percent over most of the range. The throughput rate is 5 min per sample. A second-generation automated instrument is being developed for the determination of plutonium. A precise and specific electroanalytical method is used as its operational basis. (auth)

  19. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  20. Development of methods for DSM and distribution automation planning

    International Nuclear Information System (INIS)

    Kaerkkaeinen, S.; Kekkonen, V.; Rissanen, P.

    1998-01-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both its level and its load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities.

  1. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Kaerkkaeinen, S.; Kekkonen, V. [VTT Energy, Espoo (Finland)]; Rissanen, P. [Tietosavo Oy (Finland)]

    1998-08-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both its level and its load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional approaches are impossible due to the conflicting interests of the generation, network and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying into different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities.

  2. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    The rapidly increasing volume of asteroseismic observations of solar-type stars has revealed a need for automated analysis tools. The reason is not only that individual analyses of single stars are rather time-consuming, but more importantly that these large volumes of observations open the possibility of population studies on large samples of stars, and such population studies demand a consistent analysis. By consistent analysis we understand an analysis that can be performed without the need to make subjective choices on, e.g., mode identification, and an analysis where the uncertainties ...

  3. Automated striatal uptake analysis of 18F-FDOPA PET images applied to Parkinson's disease patients

    International Nuclear Information System (INIS)

    Chang Icheng; Lue Kunhan; Hsieh Hungjen; Liu Shuhsin; Kao, Chinhao K.

    2011-01-01

    6-[18F]Fluoro-L-DOPA (FDOPA) is a radiopharmaceutical valuable for assessing presynaptic dopaminergic function when used with positron emission tomography (PET). More specifically, the striatal-to-occipital ratio (SOR) of FDOPA uptake images has been extensively used as a quantitative parameter in these PET studies. Our aim was to develop an easy, automated method capable of performing objective analysis of SOR in FDOPA PET images of Parkinson's disease (PD) patients. Brain images from FDOPA PET studies of 21 patients with PD and 6 healthy subjects were included in our automated striatal analyses. Images of each individual were spatially normalized to an FDOPA template. Subsequently, the image slice with the highest level of basal ganglia activity was chosen among the series of normalized images, together with the immediately preceding and following slices. The summation of these three images was used to quantify and calculate the SOR values. The results obtained by automated analysis were compared with manual analysis by a trained and experienced image processing technologist. The SOR values obtained from the automated analysis showed good agreement and high correlation with manual analysis. The differences in caudate, putamen, and striatum were -0.023, -0.029, and -0.025, respectively; the correlation coefficients were 0.961, 0.957, and 0.972, respectively. We have successfully developed a method for automated striatal uptake analysis of FDOPA PET images. There was no significant difference between the SOR values obtained from this method and manual analysis. It is also an unbiased, time-saving and cost-effective program that is easy to implement on a personal computer. (author)
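
    The SOR computation itself is straightforward once regions and slices are defined; a minimal sketch (region masks are assumed given here, whereas the paper derives them by spatial normalization to an FDOPA template):

      import numpy as np

      def peak_slice_sum(volume, bg_mask):
          """Sum the slice of highest basal-ganglia activity with its two
          neighbours; volume is (slices, y, x), bg_mask a 2D boolean mask."""
          z = int(np.argmax(volume[:, bg_mask].sum(axis=1)))
          lo, hi = max(z - 1, 0), min(z + 2, volume.shape[0])
          return volume[lo:hi].sum(axis=0)

      def striatal_occipital_ratio(summed_img, striatum_mask, occipital_mask):
          """SOR = mean striatal counts / mean occipital (reference) counts."""
          return summed_img[striatum_mask].mean() / summed_img[occipital_mask].mean()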

  4. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  5. Non-destructive automated express method for determining the inclination of chromium-nickel steels to IGC

    International Nuclear Information System (INIS)

    Nazarov, A.A.; Kamenev, Yu.B.; Kuusk, L.V.; Kormin, E.G.; Vasil'ev, A.N.; Sumbaeva, T.E.

    1986-01-01

    Methods for the automated testing of the inclination (susceptibility) of 18-10-type steels to IGC have been developed, and a corresponding automated testing complex (ATS) has been created. Samples of 08Kh18N10T steel received two variants of thermal treatment: 1) 1200 deg (5 h) followed by 600 deg (50 h); 2) 1200 deg (5 h). The methods of non-destructive automated testing of 18-10-type steel inclination to IGC are based on the potentiodynamic reactivation (PR) principle. The automated testing complex that was developed has undergone trial operation and demonstrated highly reliable results and easy operation.

  6. Comparing a Perceptual and an Automated Vision-Based Method for Lie Detection in Younger Children.

    Science.gov (United States)

    Serras Pereira, Mariana; Cozijn, Reinier; Postma, Eric; Shahid, Suleman; Swerts, Marc

    2016-01-01

    The present study investigates how easily it can be detected whether a child is being truthful or not in a game situation, and explores the cue validity of bodily movements for this type of classification. To achieve this, we introduce an innovative methodology: the combination of perception studies (using eye-tracking technology) and automated movement analysis. Film fragments of truthful and deceptive children were shown to human judges, who were asked to decide whether the recorded child was being truthful or not. Results reveal that judges are able to accurately distinguish truthful clips from lying clips in both perception studies. Even though the automated movement analysis for overall and specific body regions did not yield significant differences between the experimental conditions, we did find a positive correlation between the amount of movement in a child and the perception of lies, i.e., the more movement the children exhibited during a clip, the higher the chance that the clip was perceived as a lie. The eye-tracking study revealed that, even when movement occurs in different body regions, judges tend to focus their attention mainly on the face region. This is the first study to compare a perceptual and an automated method for the detection of deceptive behavior in children whose data have been elicited through an ecologically valid paradigm.

  7. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33 ... the accuracy of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping ...

  8. Automated approach to quantitative error analysis

    International Nuclear Information System (INIS)

    Bareiss, E.H.

    1977-04-01

    A method is described for obtaining a quantitative measure of the robustness of a given neutron transport theory code in coarse-network calculations. A code that performs this task automatically and at only nominal cost is described. This code also generates user-oriented benchmark problems which exhibit the analytic behavior at interfaces. 5 figures, 1 table

  9. Alternative validation practice of an automated faulting measurement method.

    Science.gov (United States)

    2010-03-08

    A number of states have adopted profiler-based systems to automatically measure faulting in jointed concrete pavements. However, little published work exists which documents the validation process used for such automated faulting systems. This p...

  10. Automated Acquisition and Analysis of Digital Radiographic Images

    International Nuclear Information System (INIS)

    Poland, R.

    1999-01-01

    Engineers at the Savannah River Technology Center have designed, built, and installed a fully automated, small field-of-view, lens-coupled, digital radiography imaging system. The system is installed in one of the Savannah River Site's production facilities to be used for the evaluation of production components. Custom software routines developed for the system automatically acquire, enhance, and diagnostically evaluate critical geometric features of various components that have been captured radiographically. The resolution of the digital radiograms and the accuracy of the acquired measurements approach 0.001 inches. To date, there has been zero deviation in measurement repeatability. The automated image acquisition methodology will be discussed, unique enhancement algorithms will be explained, and the automated routines for measuring the critical component features will be presented. An additional feature discussed is the independent nature of the modular software components, which allows images to be automatically acquired, processed, and evaluated by the computer in the background while the operator reviews other images on the monitor. System components were also key to achieving the required image resolution; factors such as scintillator selection, x-ray source energy, optical components and layout, and geometric unsharpness are considered in the paper. Finally, the paper examines the numerous quality improvement factors and cost-saving advantages that will be realized at the Savannah River Site due to the implementation of the Automated Pinch Weld Analysis System (APWAS).

  11. Decision-making and problem solving methods in automation technology

    Energy Technology Data Exchange (ETDEWEB)

    Hankins, W.W.; Pennington, J.E.; Barker, L.K.

    1983-05-01

    This report presents a brief review of the state of the art in the automation of decision making and problem solving. The information upon which the report is based was derived from literature searches, visits to university and government laboratories performing basic research in the area, and a 1980 Langley Research Center sponsored conference on the subject. It is the contention of the authors that the technology in this area is being generated by research primarily in the three disciplines of Artificial Intelligence, Control Theory, and Operations Research. Under the assumption that the state of the art in decision making and problem solving is reflected in the problems being solved, specific problems and methods of their solution are often discussed to elucidate particular aspects of the subject. Synopses of the following major topic areas comprise most of the report: (1) detection and recognition; (2) planning and scheduling; (3) learning; (4) theorem proving; (5) distributed systems; (6) knowledge bases; (7) search; (8) heuristics; and (9) evolutionary programming.

  12. A completely automated PIXE analysis system and its applications

    International Nuclear Information System (INIS)

    Li, M.; Sheng, K.; Chin, P.; Chen, Z.; Wang, X.; Chin, J.; Rong, T.; Tan, M.; Xu, Y.

    1981-01-01

    Using the 3.5 MeV proton beam from a cyclotron, a completely automated PIXE analysis system to determine the concentration of trace elements has been set up. The experimental apparatus consists of a scattering chamber with a remotely controlled automatic target changer and a Si(Li) X-ray detector. A minicomputer with a multichannel analyser is employed to record the X-ray spectrum, to acquire data and to perform on-line data processing. By comparing the recorded data with an internal standard and a set of reference X-ray spectra, a method of calculating the trace element concentrations and an on-line processing program have been worked out to obtain the final results in a convenient manner. The system has been applied to determine the concentrations of trace elements in lunar rock, in human serum and in nucleic acids. Experimental results show that the ratio of the concentration of zinc to copper in serum may be used as an important indicator of the state of human health. (orig.)

  13. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMS) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies to gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors significantly reducing time and effort

  14. Automated optics inspection analysis for NIF

    International Nuclear Information System (INIS)

    Kegelmeyer, Laura M.; Clark, Raelyn; Leach, Richard R.; McGuigan, David; Kamm, Victoria Miller; Potter, Daniel; Salmon, J. Thad; Senecal, Joshua; Conder, Alan; Nostrand, Mike; Whitman, Pamela K.

    2012-01-01

    The National Ignition Facility (NIF) is a high-energy laser facility comprised of 192 beamlines that house thousands of optics. These optics guide, amplify and tightly focus light onto a tiny target for fusion ignition research and high energy density physics experiments. The condition of these optics is key to the economic, efficient and maximally energetic performance of the laser. Our goal, and novel achievement, is to find on the optics any imperfections while they are tens of microns in size, track them through time to see if they grow and if so, remove the optic and repair the single site so the entire optic can then be re-installed for further use on the laser. This paper gives an overview of the image analysis used for detecting, measuring, and tracking sites of interest on an optic while it is installed on the beamline via in situ inspection and after it has been removed for maintenance. In this way, the condition of each optic is monitored throughout the optic's lifetime. This overview paper will summarize key algorithms and technical developments for custom image analysis and processing and highlight recent improvements. (Associated papers will include more details on these issues.) We will also discuss the use of OI Analysis for daily operation of the NIF laser and its extension to inspection of NIF targets.

  15. Micro photometer's automation for quantitative spectrograph analysis

    International Nuclear Information System (INIS)

    Gutierrez E, C.Y.A.

    1996-01-01

    A microphotometer is used to increase the sharpness of dark spectral lines. By analyzing these lines, the elements contained in a sample and their concentrations can be determined; this analysis is known as quantitative spectrographic analysis. Quantitative spectrographic analysis is carried out in 3 steps, as follows. 1. Emulsion calibration. This consists of gauging a photographic emulsion to determine the intensity variations in terms of the incident radiation. For the emulsion calibration procedure, a least-squares fit to the obtained data is applied to obtain a calibration graph. It is then possible to determine the density of a dark spectral line as a function of the incident light intensity shown by the microphotometer. 2. Working curves. The values of known concentrations of an element are plotted against incident light intensity. Since the sample contains several elements, it is necessary to find a working curve for each one of them. 3. Analytical results. The calibration curve and working curves are compared and the concentration of the studied element is determined. Automatic data acquisition, calculation, and reporting of results are done by means of a personal computer (PC) and a computer program. Signal-conditioning circuits deliver TTL (transistor-transistor logic) levels to enable communication between the microphotometer and the computer. Data calculation is done using a computer program.
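
    As an illustration of the emulsion-calibration step, a least-squares fit of measured line density against log exposure, inverted to recover relative intensity; the polynomial form is a common choice for such characteristic curves, not necessarily the one used in this work:

      import numpy as np

      def calibrate_emulsion(exposure, density, degree=2):
          """Fit the emulsion characteristic curve by least squares and return
          a function mapping measured density back to relative intensity."""
          log_e = np.log10(exposure)
          coeffs = np.polyfit(log_e, density, degree)
          grid = np.linspace(log_e.min(), log_e.max(), 1000)
          curve = np.polyval(coeffs, grid)
          # numeric inversion; assumes density rises monotonically with exposure
          return lambda d: 10 ** np.interp(d, curve, grid)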

  16. A new, fast and semi-automated size determination method (SASDM) for studying multicellular tumor spheroids

    Science.gov (United States)

    Monazzam, Azita; Razifar, Pasha; Lindhe, Örjan; Josephsson, Raymond; Långström, Bengt; Bergström, Mats

    2005-01-01

    Background Considering the wide use and importance of multicellular tumor spheroids (MTS) in oncology research, determining the size of MTSs by an accurate and fast method is essential. In the present study an effective, fast and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested on MTSs of three different cell lines. Frozen-section autoradiography and Hematoxylin-Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly and time-efficient, to be more precise than the traditional methods, and to be applicable to MTSs of different cell lines. Furthermore, the results of the image analysis showed high correspondence to the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments. PMID:16283948

  17. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Science.gov (United States)

    Cao, Jianfang; Chen, Lichao

    2015-01-01

    With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance. PMID:25838818

  18. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  19. Feasibility and Accuracy of Automated Software for Transthoracic Three-Dimensional Left Ventricular Volume and Function Analysis: Comparisons with Two-Dimensional Echocardiography, Three-Dimensional Transthoracic Manual Method, and Cardiac Magnetic Resonance Imaging.

    Science.gov (United States)

    Tamborini, Gloria; Piazzese, Concetta; Lang, Roberto M; Muratori, Manuela; Chiorino, Elisa; Mapelli, Massimo; Fusini, Laura; Ali, Sarah Ghulam; Gripari, Paola; Pontone, Gianluca; Andreini, Daniele; Pepi, Mauro

    2017-11-01

    Recently, a new automated software package (HeartModel) was developed to obtain three-dimensional (3D) left ventricular (LV) volumes using a model-based algorithm (MBA) with a simple "one-button" system and a user-adjustable slider. The aims of this study were to verify the feasibility and accuracy of the MBA in comparison with other commonly used imaging techniques in a large unselected population, to evaluate possible accuracy improvements from free operator border adjustments or changes to the slider's default position, and to identify differences in method accuracy related to specific pathologies. This prospective study included 200 consecutive patients. LV volumes and ejection fraction were obtained using the MBA and compared with the two-dimensional biplane method, the 3D full-volume (3DFV) modality, and, in 90 of 200 cases, cardiac magnetic resonance (CMR) measurements. To evaluate the optimal position of the slider with respect to the 3DFV and CMR modalities, a set of threefold cross-validation experiments was performed. Optimized and manually corrected LV volumes obtained using the MBA were also tested. Linear correlation and Bland-Altman analysis were used to assess intertechnique agreement. Automatic volumes were feasible in 194 patients (94.5%), with a mean processing time of 29 ± 10 sec. MBA-derived volumes correlated significantly with all evaluated methods, with slight overestimation relative to the two-dimensional biplane method and slight underestimation relative to CMR measurements. Higher correlations were found between MBA and 3DFV measurements, with negligible differences in volumes (slight overestimation) and in LV ejection fraction (slight underestimation). Optimization of the user-adjustable slider position improved the correlation and markedly reduced the bias between the MBA and 3DFV or CMR. The accuracy of MBA volumes was lower in some pathologies owing to incorrect definition of the LV endocardium. The MBA is highly feasible, reproducible, and rapid, and it correlates

  20. Automated regional behavioral analysis for human brain images.

    Science.gov (United States)

    Lancaster, Jack L; Laird, Angela R; Eickhoff, Simon B; Martinez, Michael J; Fox, P Mickle; Fox, Peter T

    2012-01-01

    Behavioral categories of functional imaging experiments along with standardized brain coordinates of associated activations were used to develop a method to automate regional behavioral analysis of human brain images. Behavioral and coordinate data were taken from the BrainMap database (http://www.brainmap.org/), which documents over 20 years of published functional brain imaging studies. A brain region of interest (ROI) for behavioral analysis can be defined in functional images, anatomical images or brain atlases, if images are spatially normalized to MNI or Talairach standards. Results of behavioral analysis are presented for each of BrainMap's 51 behavioral sub-domains spanning five behavioral domains (Action, Cognition, Emotion, Interoception, and Perception). For each behavioral sub-domain the fraction of coordinates falling within the ROI was computed and compared with the fraction expected if coordinates for the behavior were not clustered, i.e., uniformly distributed. When the difference between these fractions is large, behavioral association is indicated. A z-score ≥ 3.0 was used to designate statistically significant behavioral association. The left-right symmetry of ~100K activation foci was evaluated by hemisphere, lobe, and behavioral sub-domain. Results highlighted the classic left-side dominance for language, while asymmetry for most sub-domains (~75%) was not statistically significant. Use scenarios were presented for anatomical ROIs from the Harvard-Oxford cortical (HOC) brain atlas, functional ROIs from statistical parametric maps in a TMS-PET study, a task-based fMRI study, and ROIs from the ten "major representative" functional networks in a previously published resting-state fMRI study. Statistically significant behavioral findings for these use scenarios were consistent with published behaviors for the associated anatomical and functional regions.
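
    The core test, whether the fraction of a behavior's foci inside the ROI exceeds the fraction expected under a uniform spatial distribution, can be phrased as a binomial z-score; a generic sketch (the exact BrainMap formula may differ):

      import math

      def roi_behavior_zscore(n_in_roi, n_total, expected_frac):
          """z-score for the behavioral association of an ROI.

          n_in_roi:      foci of one behavioral sub-domain inside the ROI
          n_total:       all foci of that sub-domain in the database
          expected_frac: expected fraction under a uniform distribution
                         (e.g., ROI volume / total brain volume)
          """
          observed = n_in_roi / n_total
          se = math.sqrt(expected_frac * (1.0 - expected_frac) / n_total)
          return (observed - expected_frac) / se  # z >= 3.0 flagged significant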

  1. Lesion Segmentation in Automated 3D Breast Ultrasound: Volumetric Analysis.

    Science.gov (United States)

    Agarwal, Richa; Diaz, Oliver; Lladó, Xavier; Gubern-Mérida, Albert; Vilanova, Joan C; Martí, Robert

    2018-03-01

    Mammography is the gold standard screening technique for breast cancer, but it has some limitations for women with dense breasts. In such cases, sonography is usually recommended as an additional imaging technique. A traditional sonogram produces a two-dimensional (2D) visualization of the breast and is highly operator dependent. Automated breast ultrasound (ABUS) has also been proposed to produce a full 3D scan of the breast automatically with reduced operator dependency, facilitating double reading and comparison with past exams. When using ABUS, lesion segmentation and tracking changes over time are challenging tasks, as the three-dimensional (3D) nature of the images makes the analysis difficult and tedious for radiologists. The goal of this work is to develop a semi-automatic framework for breast lesion segmentation in ABUS volumes based on the watershed algorithm. The effect of different de-noising methods on segmentation is studied, showing a significant impact ([Formula: see text]) on performance, using a dataset of 28 temporal pairs, i.e., a total of 56 ABUS volumes. Volumetric analysis is also used to evaluate the performance of the developed framework. A mean Dice Similarity Coefficient of [Formula: see text] with a mean False Positive ratio [Formula: see text] has been obtained. The Pearson correlation coefficient between the segmented volumes and the corresponding ground-truth volumes is [Formula: see text] ([Formula: see text]). A similar analysis, performed on 28 temporal (prior and current) pairs, resulted in a good correlation coefficient [Formula: see text] ([Formula: see text]) for prior and [Formula: see text] ([Formula: see text]) for current cases. The developed framework showed promise in helping radiologists assess ABUS lesion volumes, as well as quantify volumetric changes during lesion diagnosis and follow-up.
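
    A minimal seeded-watershed sketch in the spirit of the framework (standard scikit-image/SciPy calls; the de-noising choice and seed placement are illustrative assumptions, not the published configuration):

      import numpy as np
      from scipy import ndimage as ndi
      from skimage.filters import gaussian
      from skimage.segmentation import watershed

      def segment_lesion(volume, lesion_seed, background_seed):
          """Semi-automatic lesion segmentation of a 3D ABUS volume from two
          user-provided seed voxels, given as (z, y, x) tuples."""
          smoothed = gaussian(volume.astype(float), sigma=1.5)  # de-noising step
          gradient = ndi.generic_gradient_magnitude(smoothed, ndi.sobel)
          markers = np.zeros(volume.shape, dtype=np.int32)
          markers[lesion_seed] = 1
          markers[background_seed] = 2
          labels = watershed(gradient, markers)  # flood from the two markers
          return labels == 1                     # boolean lesion mask

    In practice the seeds could come from a single user click plus an automatically placed background marker, which is what would keep such a framework semi-automatic.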

  2. PP025. Urinary dipstick proteinuria testing - Does automated strip analysis offer an advantage over visual testing?

    Science.gov (United States)

    De Silva, D A; Halstead, C; Côté, A-M; Sabr, Y; von Dadelszen, P; Magee, L A

    2012-07-01

    The visual urinary test strip is widely accepted for screening for proteinuria in pregnancy, given the convenience of the method and its low cost. However, test strips are known to lack sensitivity and specificity. The 2010 NICE (National Institute for Health and Clinical Excellence) guidelines for management of pregnancy hypertension recommend the use of an automated test strip reader to confirm proteinuria (http://nice.org.uk/CG107). Superior diagnostic test performance of an automated (vs. visual) method has been proposed based on reduced subjectivity. The objective was to compare the diagnostic test properties of automated vs. visually read urine dipstick testing for detection of a random protein:creatinine ratio (PrCr) of ⩾30 mg/mmol. In this prospective cohort study, consecutive inpatients or outpatients (obstetric medicine and high-risk maternity clinics) were evaluated at a tertiary care facility. Random midstream urine samples (obtained as part of normal clinical care) were split into two aliquots. The first underwent point-of-care testing for proteinuria using both visual (Multistix 10SG, Siemens Healthcare Diagnostics, Inc., Tarrytown NY) and automated (Chemstrip 10A, Roche Diagnostics, Laval QC) test strips, the latter read by an analyser (Urisys 1100®, Roche Diagnostics, Laval QC). The second aliquot was sent to the hospital laboratory for analysis of urinary protein using a pyrocatechol violet molybdate dye-binding method, and urinary creatinine using an enzymatic method, both on an automated analyser (Vitros® 5,1 FS or Vitros® 5600, Ortho-Clinical Diagnostics, Rochester NY); random PrCr ratios were calculated in the laboratory. Dilute samples with low urinary creatinine concentration were excluded from the analysis. Both visual and automated read urinary dipstick testing showed low sensitivity (56.0% and 53.9%, respectively). Positive likelihood ratios (LR+) and 95% CI were 15.0 [5.9,37.9] and 24.6 [7.6,79.6], respectively. Negative LR (LR-) were 0.46 [0

  3. Automated Detection of Salt Marsh Platforms : a Topographic Method

    Science.gov (United States)

    Goodwin, G.; Mudd, S. M.; Clubb, F. J.

    2017-12-01

    Monitoring the topographic evolution of coastal marshes is a crucial step toward improving the management of these valuable landscapes under the pressure of relative sea-level rise and anthropogenic modification. However, determining their geometrically complex boundaries currently relies on spectral vegetation detection methods or requires labour-intensive field surveys and digitisation. We propose a novel method to reproducibly isolate saltmarsh scarps and platforms from a DEM. Field observations and numerical models show that saltmarshes mature into sub-horizontal platforms delineated by sub-vertical scarps; based on this premise, we identify scarps as lines of local maxima on a slope*relief raster, then fill landmasses from the scarps upward, thus isolating mature marsh platforms. Non-dimensional search parameters allow batch processing of data without recalibration. We test our method using lidar-derived DEMs of six saltmarshes in England with varying tidal ranges and geometries, for which topographic platforms were manually isolated from tidal flats. Agreement between manual and automatic segregation exceeds 90% at a resolution of 1 m, with all but one site maintaining this performance at resolutions up to 3.5 m. At a resolution of 1 m, automatically detected platforms are comparable in surface area and elevation distribution to digitised platforms. We also find that our method allows the accurate detection of local block failures three times larger than the DEM resolution. Detailed inspection reveals that although tidal creeks were digitised as part of the marsh platform, automatic detection classifies them as part of the tidal flat, increasing false negatives and the overall platform perimeter. This suggests our method would benefit from combination with existing creek-detection algorithms. Fallen blocks and pioneer zones are inconsistently identified, particularly in macro-tidal marshes, leading to differences between digitisation and the automated method.
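
    The scarp-identification premise (local maxima of slope times relief) can be sketched compactly; the window sizes and relief definition below are illustrative assumptions rather than the published non-dimensional parameterisation:

      import numpy as np
      from scipy import ndimage

      def detect_scarps(dem, cell=1.0, relief_win=15, peak_win=5):
          """Flag likely scarp cells on a DEM as local maxima of slope * relief."""
          gy, gx = np.gradient(dem, cell)
          slope = np.hypot(gx, gy)
          relief = (ndimage.maximum_filter(dem, size=relief_win)
                    - ndimage.minimum_filter(dem, size=relief_win))
          score = slope * relief
          peaks = score == ndimage.maximum_filter(score, size=peak_win)
          return peaks & (score > np.percentile(score, 95))  # keep strongest cells

    The platform itself would then be obtained by flood-filling the landmass upward from the detected scarp lines, as the abstract describes.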

  4. Interpreting complex data by methods of recognition and classification in an automated system of aerogeophysical material processing

    Energy Technology Data Exchange (ETDEWEB)

    Koval', L.A.; Dolgov, S.V.; Liokumovich, G.B.; Ovcharenko, A.V.; Priyezzhev, I.I.

    1984-01-01

    The ASOM-AGS/YeS system for the automated processing of aerogeophysical data is equipped with complex interpretation of multichannel measurements. Algorithms of factor analysis and automatic classification are used, together with an apparatus of a priori specified (selected) decision rules. The areas over which these procedures act can initially be limited to the specified geological information. The capabilities of the method are demonstrated by the results of automated processing of airborne gamma-spectrometric measurements in the region of a known porphyry copper occurrence in Kazakhstan. After processing by the principal-components method, this ore deposit was clearly marked by a composite aureole of independent factors: U (strong increase), Th (noticeable increase), K (decrease).

  5. Experience based ageing analysis of NPP protection automation in Finland

    International Nuclear Information System (INIS)

    Simola, K.

    2000-01-01

    This paper describes three successive studies on the ageing of protection automation in nuclear power plants. These studies were aimed at developing a methodology for experience-based ageing analysis and applying it to identify the components that are most critical from the ageing and safety points of view. The analyses also resulted in suggestions for improving data collection systems for the purpose of further ageing analyses. (author)

  6. Automated result analysis in radiographic testing of NPPs' welded joints

    International Nuclear Information System (INIS)

    Skomorokhov, A.O.; Nakhabov, A.V.; Belousov, P.A.

    2009-01-01

    The article presents the results of developing algorithms for the automated interpretation of images from radiographic inspection of NPP welded joints. The developed algorithms are based on state-of-the-art pattern recognition methods. The paper covers automatic radiographic image segmentation, defect detection, and evaluation of defect parameters. Results of testing the developed algorithms on actual radiographic images of welded joints with significant variation of defect parameters are given [ru

  7. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis ... attacks, and attacks launched by insiders. Finally, the perspectives for the application of the analysis techniques are discussed, thereby coming a small step closer to providing developers with easy-to-use tools for validating the security of networking applications.

  8. Assessment of the relative error in the automation task by sessile drop method

    Directory of Open Access Journals (Sweden)

    T. О. Levitskaya

    2015-11-01

    Full Text Available Assessment of the relative error in the automation of the sessile drop method. Further development of the sessile drop method is directly related to the development of new techniques and specially developed algorithms enabling automatic computer calculation of surface properties. Improving the mathematical apparatus of the sessile drop method, transforming the drop contour equation to a form suitable for computation, automating the drop surface calculation, and analysing the relative errors in the calculated surface tension are relevant tasks that are important for experimental determinations. The relative error of the surface tension measurement, as well as the error caused by the ellipticity of the drop in plan view, was determined for the automated sessile drop task. It should be noted that if the maximum drop diameter (l) is large, or if the ratio of l to the drop height above the equatorial diameter (h) is large, the relative error in the measurement of surface tension by the sessile drop method depends little on the equatorial diameter and the ellipticity of the drop. In this case, the accuracy of the surface tension determination varies from 1.0 to 0.5%. At lower values the ellipticity of the drop begins to affect the relative error of the surface tension (from 1.2 to 0.8%), although in this case the ellipticity is smaller. Therefore, larger drops were used in subsequent experiments. On the basis of this assessment of the relative error in determining the liquid surface tension by the sessile drop method caused by drop ellipticity in plan view, tables have been compiled showing the measurement accuracy required for the drop parameters (h and l) to keep the overall relative error within given limits. Previously, the surface tension was calculated with a relative error in the range of 2-3%.

  9. Data for automated, high-throughput microscopy analysis of intracellular bacterial colonies using spot detection

    DEFF Research Database (Denmark)

    Ernstsen, Christina L; Login, Frédéric H; Jensen, Helene H

    2017-01-01

    Quantification of intracellular bacterial colonies is useful in strategies directed against bacterial attachment, subsequent cellular invasion and intracellular proliferation. An automated, high-throughput microscopy method was established to quantify the number and size of intracellular bacterial colonies. ... of cell nuclei were automatically quantified using a spot-detection tool. The spot-detection output was exported to Excel, where data analysis was performed. In this article, micrographs and spot-detection data are made available to facilitate implementation of the method.

  10. Large-Scale Automated Analysis of Location Patterns in Randomly-Tagged 3T3 Cells

    Science.gov (United States)

    Osuna, Elvira García; Hua, Juchang; Bateman, Nicholas W.; Zhao, Ting; Berget, Peter B.; Murphy, Robert F.

    2010-01-01

    Location proteomics is concerned with the systematic analysis of the subcellular location of proteins. In order to perform high-resolution, high-throughput analysis of all protein location patterns, automated methods are needed. Here we describe the use of such methods on a large collection of images obtained by automated microscopy to perform high-throughput analysis of endogenous proteins randomly-tagged with a fluorescent protein in NIH 3T3 cells. Cluster analysis was performed to identify the statistically significant location patterns in these images. This allowed us to assign a location pattern to each tagged protein without specifying what patterns are possible. To choose the best feature set for this clustering, we have used a novel method that determines which features do not artificially discriminate between control wells on different plates and uses Stepwise Discriminant Analysis (SDA) to determine which features do discriminate as much as possible among the randomly-tagged wells. Combining this feature set with consensus clustering methods resulted in 35 clusters among the first 188 clones we obtained. This approach represents a powerful automated solution to the problem of identifying subcellular locations on a proteome-wide basis for many different cell types. PMID:17285363
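
    The clustering step above groups tagged clones by location pattern without a predefined pattern list. The sketch below illustrates the general consensus-clustering idea under placeholder data (random feature vectors standing in for the image-derived features); it does not reproduce the paper's SDA feature selection.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(0)
        X = rng.normal(size=(188, 20))  # placeholder: one feature vector per tagged clone

        n_clones, k, n_runs = len(X), 35, 50
        consensus = np.zeros((n_clones, n_clones))
        for seed in range(n_runs):
            # Perturb by subsampling features, one simple route to consensus clustering.
            cols = rng.choice(X.shape[1], size=X.shape[1] // 2, replace=False)
            labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X[:, cols])
            consensus += labels[:, None] == labels[None, :]
        consensus /= n_runs  # fraction of runs in which two clones co-cluster

        # Pairs with consensus near 1 share a location pattern robustly.
        print(consensus[:3, :3].round(2))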

  11. An automated image analysis system to measure and count organisms in laboratory microcosms.

    Directory of Open Access Journals (Sweden)

    François Mallard

    Full Text Available 1. Because of recent technological improvements in the performance of computers and digital cameras, the potential of imaging for contributing to the study of communities, populations or individuals in laboratory microcosms has risen enormously. However, its use has remained limited because of difficulties in automating image analysis. 2. We present an accurate and flexible method of image analysis for detecting, counting and measuring moving particles on a fixed but heterogeneous substrate. This method has been specifically designed to follow individuals, or entire populations, in experimental laboratory microcosms, but it can be used in other applications. 3. The method consists of comparing multiple pictures of the same experimental microcosm in order to generate an image of the fixed background. This background is then used to extract, measure and count the moving organisms, leaving out the fixed background and the motionless or dead individuals. 4. We provide different examples (springtails, ants, nematodes, daphnia) to show that this non-intrusive method is efficient at detecting organisms under a wide variety of conditions, even on faintly contrasted and heterogeneous substrates. 5. The repeatability and reliability of this method have been assessed using experimental populations of the Collembola Folsomia candida. 6. We present an ImageJ plugin to automate the analysis of digital pictures of laboratory microcosms. The plugin automates the successive steps of the analysis and recursively analyses multiple sets of images, rapidly producing measurements from a large number of replicated microcosms.
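
    Step 3 above (building a background image from multiple pictures and subtracting it to isolate moving organisms) maps naturally onto a per-pixel median over a frame stack. A minimal sketch, assuming a set of grayscale frames on disk; file names and the intensity threshold are placeholders.

        import numpy as np
        from scipy import ndimage
        from skimage import io

        # frames: grayscale images of the same microcosm (paths are placeholders).
        frames = [io.imread(f"frame_{i:03d}.tif").astype(float) for i in range(20)]
        stack = np.stack(frames)

        # Per-pixel median over time estimates the fixed background: a moving
        # organism rarely occupies the same pixel in most frames.
        background = np.median(stack, axis=0)

        # Foreground = pixels that differ strongly from the background, which
        # automatically excludes motionless or dead individuals.
        threshold = 25.0  # intensity units; an assumption to tune per setup
        moving = np.abs(stack - background) > threshold

        labels, count = ndimage.label(moving[0])  # count organisms in frame 0
        sizes = ndimage.sum(moving[0], labels, range(1, count + 1))
        print(f"{count} moving objects; sizes (px): {sizes}")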

  12. Scoring of radiation-induced micronuclei in cytokinesis-blocked human lymphocytes by automated image analysis

    International Nuclear Information System (INIS)

    Verhaegen, F.; Seuntjens, J.; Thierens, H.

    1994-01-01

    The micronucleus assay in human lymphocytes is, at present, frequently used to assess chromosomal damage caused by ionizing radiation or mutagens. Manual scoring of micronuclei (MN) by trained personnel is very time-consuming, tiring work, and the results depend on subjective interpretation of scoring criteria. More objective scoring can be accomplished only if the test can be automated. Furthermore, an automated system allows scoring of large numbers of cells, thereby increasing the statistical significance of the results. This is of special importance for screening programs for low doses of chromosome-damaging agents. In this paper, the first results of our effort to automate the micronucleus assay with an image-analysis system are presented. The method we used is described in detail, and the results are compared to those of other groups. Our system is able to detect 88% of the binucleated lymphocytes on the slides. The procedure consists of a fully automated localization of binucleated cells and counting of the MN within these cells, followed by a simple and fast manual operation in which the false positives are removed. Preliminary measurements for blood samples irradiated with a dose of 1 Gy X-rays indicate that the automated system can find 89% ± 12% of the micronuclei within the binucleated cells compared to a manual screening. 18 refs., 8 figs., 1 tab.

  13. Unsupervised fully automated inline analysis of global left ventricular function in CINE MR imaging.

    Science.gov (United States)

    Theisen, Daniel; Sandner, Torleif A; Bauner, Kerstin; Hayes, Carmel; Rist, Carsten; Reiser, Maximilian F; Wintersperger, Bernd J

    2009-08-01

    To implement and evaluate the accuracy of unsupervised fully automated inline analysis of global ventricular function and myocardial mass (MM). To compare automated with manual segmentation in patients with cardiac disorders. In 50 patients, cine imaging of the left ventricle was performed with an accelerated retrogated steady state free precession sequence (GRAPPA; R = 2) on a 1.5 Tesla whole body scanner (MAGNETOM Avanto, Siemens Healthcare, Germany). A spatial resolution of 1.4 × 1.9 mm was achieved with a slice thickness of 8 mm and a temporal resolution of 42 milliseconds. Ventricular coverage was based on 9 to 12 short axis slices extending from the annulus of the mitral valve to the apex with 2 mm gaps. Fully automated segmentation and contouring was performed instantaneously after image acquisition. In addition to automated processing, cine data sets were also manually segmented using a semi-automated postprocessing software. Results of both methods were compared with regard to end-diastolic volume (EDV), end-systolic volume (ESV), ejection fraction (EF), and MM. A subgroup analysis was performed in patients with normal (≥55%) and reduced EF (<55%) based on the results of the manual analysis. Thirty-two percent of patients had a reduced left ventricular EF of <55%. Volumetric results of the automated inline analysis for EDV (r = 0.96), ESV (r = 0.95), EF (r = 0.89), and MM (r = 0.96) showed high correlation with the results of manual segmentation (all P < 0.001). Head-to-head comparison did not show significant differences between automated and manual evaluation for EDV (153.6 ± 52.7 mL vs. 149.1 ± 48.3 mL; P = 0.05), ESV (61.6 ± 31.0 mL vs. 64.1 ± 31.7 mL; P = 0.08), and EF (58.0 ± 11.6% vs. 58.6 ± 11.6%; P = 0.5). However, differences were significant for MM (150.0 ± 61.3 g vs. 142.4 ± 59.0 g; P < 0.01). The standard error was 15.6 (EDV), 9.7 (ESV), 5.0 (EF), and 17.1 (mass). The mean time for manual analysis was 15 minutes
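
    The volumetric indices compared above follow directly from the segmented volumes. A minimal sketch of the derived quantities and the reported statistics (Pearson correlation and a paired comparison), using hypothetical per-patient volumes.

        import numpy as np
        from scipy import stats

        # Hypothetical per-patient volumes (mL) from automated and manual segmentation.
        edv_auto = np.array([160.0, 140.0, 190.0, 210.0])
        esv_auto = np.array([65.0, 55.0, 90.0, 100.0])
        edv_man = np.array([155.0, 138.0, 185.0, 205.0])
        esv_man = np.array([67.0, 58.0, 92.0, 103.0])

        ef_auto = 100 * (edv_auto - esv_auto) / edv_auto  # ejection fraction (%)
        ef_man = 100 * (edv_man - esv_man) / edv_man

        r, _ = stats.pearsonr(ef_auto, ef_man)   # method agreement, as reported above
        t, p = stats.ttest_rel(ef_auto, ef_man)  # paired head-to-head comparison
        print(f"r = {r:.2f}, paired t-test p = {p:.2f}")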

  14. Automated modal parameter estimation using correlation analysis and bootstrap sampling

    Science.gov (United States)

    Yaghoubi, Vahid; Vakilzadeh, Majid K.; Abrahamsson, Thomas J. S.

    2018-02-01

    The estimation of modal parameters from a set of noisy measured data is a highly judgmental task, with user expertise playing a significant role in distinguishing between estimated physical and noise modes of a test-piece. Various methods have been developed to automate this procedure. The common approach is to identify models with different orders and cluster similar modes together. However, most proposed methods based on this approach suffer from high-dimensional optimization problems in either the estimation or clustering step. To overcome this problem, this study presents an algorithm for autonomous modal parameter estimation in which the only required optimization is performed in a three-dimensional space. To this end, a subspace-based identification method is employed for the estimation and a non-iterative correlation-based method is used for the clustering. This clustering is at the heart of the paper. The keys to success are correlation metrics that are able to treat the problems of spatial eigenvector aliasing and nonunique eigenvectors of coalescent modes simultaneously. The algorithm commences with the identification of an excessively high-order model from frequency response function test data. The high number of modes of this model provides bases for two subspaces: one for likely physical modes of the tested system and one for its complement, dubbed the subspace of noise modes. By employing the bootstrap resampling technique, several subsets are generated from the same basic dataset, and for each of them a model is identified to form a set of models. Then, by correlation analysis with the two aforementioned subspaces, highly correlated modes of these models which appear repeatedly are clustered together and the noise modes are collected in a so-called Trashbox cluster. Stray noise modes attracted to the mode clusters are trimmed away in a second step by correlation analysis. The final step of the algorithm is a fuzzy c-means clustering procedure applied to
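
    The clustering above hinges on correlation metrics between identified modes. The standard metric of this kind is the Modal Assurance Criterion (MAC); the paper's metrics extend it to handle spatial eigenvector aliasing and coalescent modes, an extension not reproduced in this minimal sketch.

        import numpy as np

        def mac(phi1, phi2):
            """Modal Assurance Criterion between two (possibly complex) mode shapes:
            1.0 for collinear shapes, near 0 for orthogonal ones."""
            num = np.abs(np.vdot(phi1, phi2)) ** 2
            return num / (np.vdot(phi1, phi1).real * np.vdot(phi2, phi2).real)

        phi_a = np.array([1.0, 0.8, 0.3, -0.2])
        phi_b = phi_a + 0.05 * np.random.default_rng(1).normal(size=4)  # noisy repeat
        print(f"MAC = {mac(phi_a, phi_b):.3f}")  # close to 1 => same physical mode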

  15. Automated model-based quantitative analysis of phantoms with spherical inserts in FDG PET scans.

    Science.gov (United States)

    Ulrich, Ethan J; Sunderland, John J; Smith, Brian J; Mohiuddin, Imran; Parkhurst, Jessica; Plichta, Kristin A; Buatti, John M; Beichel, Reinhard R

    2018-01-01

    Quality control plays an increasingly important role in quantitative PET imaging and is typically performed using phantoms. The purpose of this work was to develop and validate a fully automated analysis method for two common PET/CT quality assurance phantoms: the NEMA NU-2 IQ and SNMMI/CTN oncology phantom. The algorithm was designed to only utilize the PET scan to enable the analysis of phantoms with thin-walled inserts. We introduce a model-based method for automated analysis of phantoms with spherical inserts. Models are first constructed for each type of phantom to be analyzed. A robust insert detection algorithm uses the model to locate all inserts inside the phantom. First, candidates for inserts are detected using a scale-space detection approach. Second, candidates are given an initial label using a score-based optimization algorithm. Third, a robust model fitting step aligns the phantom model to the initial labeling and fixes incorrect labels. Finally, the detected insert locations are refined and measurements are taken for each insert and several background regions. In addition, an approach for automated selection of NEMA and CTN phantom models is presented. The method was evaluated on a diverse set of 15 NEMA and 20 CTN phantom PET/CT scans. NEMA phantoms were filled with radioactive tracer solution at 9.7:1 activity ratio over background, and CTN phantoms were filled with 4:1 and 2:1 activity ratio over background. For quantitative evaluation, an independent reference standard was generated by two experts using PET/CT scans of the phantoms. In addition, the automated approach was compared against manual analysis, which represents the current clinical standard approach, of the PET phantom scans by four experts. The automated analysis method successfully detected and measured all inserts in all test phantom scans. It is a deterministic algorithm (zero variability), and the insert detection RMS error (i.e., bias) was 0.97, 1.12, and 1.48 mm for phantom

  16. Study radiolabeling of urea-based PSMA inhibitor with 68-Gallium: Comparative evaluation of automated and non-automated methods

    International Nuclear Information System (INIS)

    Alcarde, Lais Fernanda

    2016-01-01

    The methods for clinical diagnosis of prostate cancer include rectal examination and the measurement of prostate-specific antigen (PSA). However, the PSA level is elevated in about 20 to 30% of cases related to benign pathologies, resulting in false positives and leading patients to unnecessary biopsies. The prostate-specific membrane antigen (PSMA), in contrast, is overexpressed in prostate cancer and found at low levels in healthy organs. This has stimulated the development of small-molecule PSMA inhibitors, which carry imaging agents to the tumor and are not affected by its microvasculature. Recent studies suggest that the HBED-CC chelator intrinsically contributes to the binding of the urea-based PSMA inhibitor peptide (Glu-urea-Lys) to the pharmacophore group. This work describes the optimization of the radiolabeling conditions of PSMA-HBED-CC with 68 Ga, using an automated system (synthesis module) and a non-automated method, seeking to establish an appropriate condition for preparing this new radiopharmaceutical, with emphasis on the labeling yield and radiochemical purity of the product. It also aimed to evaluate the stability of the radiolabeled peptide under transport conditions and to study the biological distribution of the radiopharmaceutical in healthy mice. The study of radiolabeling parameters made it possible to define a non-automated method which resulted in high radiochemical purity (> 95%) without the need for purification of the labeled peptide. The automated method was adapted, using a synthesis module and software already available at IPEN, and also resulted in high synthetic yield (≥ 90%), especially when compared with those described in the literature, with the associated benefit of greater control of the production process in compliance with Good Manufacturing Practices. The study of radiolabeling parameters afforded PSMA-HBED-CC-68 Ga with higher specific activity than observed in published clinical studies (≥ 140.0 GBq/μmol), with

  17. Automated Frequency Domain Decomposition for Operational Modal Analysis

    DEFF Research Database (Denmark)

    Brincker, Rune; Andersen, Palle; Jacobsen, Niels-Jørgen

    2007-01-01

    The Frequency Domain Decomposition (FDD) technique is known as one of the most user friendly and powerful techniques for operational modal analysis of structures. However, the classical implementation of the technique requires some user interaction. The present paper describes an algorithm...... for automated FDD, thus a version of FDD where no user interaction is required. Such algorithm can be used for obtaining a default estimate of modal parameters in commercial software for operational modal analysis - or even more important - it can be used as the modal information engine in a system...
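
    As a reference point for what such an automated engine computes: classical FDD estimates the output cross-spectral density matrix, takes its singular value decomposition at each frequency line, and peak-picks the first singular value. A minimal sketch under placeholder multi-channel data; the window length and peak prominence are assumptions.

        import numpy as np
        from scipy import signal

        fs = 256.0
        rng = np.random.default_rng(0)
        y = rng.normal(size=(4, 60 * int(fs)))  # placeholder: 4-channel response data

        # Cross-spectral density matrix G(f) over all channel pairs.
        n_ch = y.shape[0]
        f, _ = signal.csd(y[0], y[0], fs=fs, nperseg=1024)
        G = np.zeros((len(f), n_ch, n_ch), dtype=complex)
        for i in range(n_ch):
            for j in range(n_ch):
                _, G[:, i, j] = signal.csd(y[i], y[j], fs=fs, nperseg=1024)

        # FDD: first singular value per frequency line; peaks indicate modes, and
        # the corresponding first singular vector approximates the mode shape.
        s1 = np.array([np.linalg.svd(G[k], compute_uv=False)[0] for k in range(len(f))])
        peaks, _ = signal.find_peaks(s1, prominence=s1.max() * 0.1)
        print("candidate modal frequencies (Hz):", f[peaks])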

  18. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Walbran, A.C.; Unsworth, C.P.; Gunn, A.J.; Benett, L.

    2010-01-01

    Full text: Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (first 6-8 h post hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means to identify which infants can benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data that are too time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The performance of the method was found to have good sensitivity and selectivity, thus demonstrating that this method is a simple, robust and potentially effective spike detection algorithm.
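
    A minimal sketch of the CWT-power thresholding idea described above, using PyWavelets' Morlet wavelet on a synthetic trace with one injected spike; the scale range and the 5-sigma threshold are assumptions, not the study's settings.

        import numpy as np
        import pywt

        fs = 256.0
        t = np.arange(0, 10, 1 / fs)
        eeg = np.random.default_rng(2).normal(size=t.size)  # placeholder EEG trace
        eeg[1280:1285] += 8.0                               # injected spike-like event

        # Continuous wavelet transform; the scale range is an assumption to tune.
        scales = np.arange(1, 32)
        coeffs, freqs = pywt.cwt(eeg, scales, "morl", sampling_period=1 / fs)

        power = np.abs(coeffs) ** 2
        profile = power.max(axis=0)                     # max wavelet power per sample
        threshold = profile.mean() + 5 * profile.std()  # simple power threshold
        spikes = np.flatnonzero(profile > threshold)
        print("spike samples:", spikes)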

  19. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    assembly to disassembly, from aerospace to food industry, from textile to logistics) are discussed. Finally, the most recent research is reviewed in order to introduce the new trends in grasping. They provide an outlook on the future of both grippers and robotic hands in automated production processes. (C...

  20. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    Keiji Takeda, Keio University. This research was conducted to develop components for an automated malware analysis framework: malware samples are analyzed by executing the binary program and monitoring its behavior, and the results are used to generate data for malware detection signatures and for developing countermeasures. (Grant FA2386-15-1-4068)

  1. Automated Scoring and Analysis of Micronucleated Human Lymphocytes.

    Science.gov (United States)

    Callisen, Hannes Heinrich

    Physical and chemical mutagens and carcinogens in our environment produce chromosome aberrations in the circulating peripheral blood lymphocytes. The aberrations, in turn, give rise to micronuclei when the lymphocytes proliferate in culture. In order to improve the micronucleus assay as a method for screening human populations for chromosome damage, I have (1) developed a high-resolution optical low-light-level micrometry expert system (HOLMES) to digitize and process microscope images of micronuclei in human peripheral blood lymphocytes, (2) defined a protocol of image processing techniques to objectively and uniquely identify and score micronuclei, and (3) analysed digital images of lymphocytes in order to study methods for (a) verifying the identification of suspect micronuclei, (b) classifying proliferating and non-proliferating lymphocytes, and (c) understanding the mechanisms of micronuclei formation and micronuclei fate during cell division. For the purpose of scoring micronuclei, HOLMES promises to (a) improve counting statistics since a greater number of cells can be scored without operator/microscopist fatigue, (b) provide for a more objective and consistent criterion for the identification of micronuclei than the human observer, and (c) yield quantitative information on nuclear and micronuclear characteristics useful in better understanding the micronucleus life cycle. My results on computer aided identification of micronuclei on microscope slides are gratifying. They demonstrate that automation of the micronucleus assay is feasible. Manual verification of HOLMES' results shows correct extraction of micronuclei from the scene for 70% of the digitized images and correct identification of the micronuclei for 90% of the extracted objects. Moreover, quantitative analysis on digitized images of lymphocytes using HOLMES has revealed several exciting results: (a) micronuclear DNA content may be estimated from simple area measurements, (b) micronuclei seem to

  2. Automated Aqueous Sample Concentration Methods for in situ Astrobiological Instrumentation

    Science.gov (United States)

    Aubrey, A. D.; Grunthaner, F. J.

    2009-12-01

    The era of wet chemical experiments for in situ planetary science investigations is upon us, as evidenced by recent results from the surface of Mars by Phoenix’s microscopy, electrochemistry, and conductivity analyzer, MECA [1]. Studies suggest that traditional thermal volatilization methods for planetary science in situ investigations induce organic degradation during sample processing [2], an effect that is enhanced in the presence of oxidants [3]. Recent developments have trended towards adaptation of non-destructive aqueous extraction and analytical methods for future astrobiological instrumentation. Wet chemical extraction techniques under investigation include subcritical water extraction, SCWE [4], aqueous microwave assisted extraction, MAE, and organic solvent extraction [5]. Similarly, miniaturized analytical space flight instruments under development that require aqueous extracts include microfluidic capillary electrophoresis chips, μCE [6], liquid chromatography-mass spectrometers, LC-MS [7], and life marker chips, LMC [8]. If organics are present on the surface of Mars, they are expected to be present at extremely low concentrations (parts-per-billion), orders of magnitude below the sensitivities of most flight instrument technologies. Therefore, it becomes necessary to develop and integrate concentration mechanisms for in situ sample processing before delivery to analytical flight instrumentation. We present preliminary results of automated solid-phase-extraction (SPE) sample purification and concentration methods for the treatment of highly saline aqueous soil extracts. These methods take advantage of the affinity of low molecular weight organic compounds for natural and synthetic scavenger materials. These interactions allow for the separation of target organic analytes from unfavorable background species (i.e. salts) during inline treatment, and a clever method for selective desorption is utilized to obtain concentrated solutions on the order

  3. Automated network analysis identifies core pathways in glioblastoma.

    Directory of Open Access Journals (Sweden)

    Ethan Cerami

    2010-02-01

    Full Text Available Glioblastoma multiforme (GBM) is the most common and aggressive type of brain tumor in humans and the first cancer with comprehensive genomic profiles mapped by The Cancer Genome Atlas (TCGA) project. A central challenge in large-scale genome projects, such as the TCGA GBM project, is the ability to distinguish cancer-causing "driver" mutations from passively selected "passenger" mutations. In contrast to a purely frequency-based approach to identifying driver mutations in cancer, we propose an automated network-based approach for identifying candidate oncogenic processes and driver genes. The approach is based on the hypothesis that cellular networks contain functional modules, and that tumors target specific modules critical to their growth. Key elements in the approach include combined analysis of sequence mutations and DNA copy number alterations; use of a unified molecular interaction network consisting of both protein-protein interactions and signaling pathways; and identification and statistical assessment of network modules, i.e. cohesive groups of genes of interest with a higher density of interactions within groups than between groups. We confirm and extend the observation that GBM alterations tend to occur within specific functional modules, in spite of considerable patient-to-patient variation, and that two of the largest modules involve signaling via p53, Rb, PI3K and receptor protein kinases. We also identify new candidate drivers in GBM, including AGAP2/CENTG1, a putative oncogene and an activator of the PI3K pathway; and three additional significantly altered modules, including one involved in microtubule organization. To facilitate the application of our network-based approach to additional cancer types, we make the method freely available as part of a software tool called NetBox.
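
    The module-identification step above amounts to finding cohesive communities in an interaction network and asking whether altered genes concentrate in them. A minimal sketch using networkx's greedy modularity communities on a toy graph; the altered-gene set is hypothetical, and the paper's statistical enrichment assessment is only noted in a comment.

        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        # Toy network standing in for a molecular interaction network.
        G = nx.karate_club_graph()
        altered = {0, 1, 2, 32, 33}  # hypothetical altered genes (node ids)

        modules = greedy_modularity_communities(G)
        for i, module in enumerate(modules):
            hits = altered & set(module)
            # A real analysis would test enrichment statistically (e.g., permutation).
            print(f"module {i}: {len(module)} genes, {len(hits)} altered")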

  4. An Automated Method to Quantify Radiation Damage in Human Blood Cells

    Energy Technology Data Exchange (ETDEWEB)

    Gordon K. Livingston, Mark S. Jenkins and Akio A. Awa

    2006-07-10

    Cytogenetic analysis of blood lymphocytes is a well established method to assess the absorbed dose in persons exposed to ionizing radiation. Because mature lymphocytes circulate throughout the body, the dose to these cells is believed to represent the average whole body exposure. Cytogenetic methods measure the incidence of structural aberrations in chromosomes as a means to quantify DNA damage which occurs when ionizing radiation interacts with human tissue. Methods to quantify DNA damage at the chromosomal level vary in complexity and tend to be laborious and time consuming. In a mass casualty scenario involving radiological/nuclear materials, the ability to rapidly triage individuals according to radiation dose is critically important. For high-throughput screening for dicentric chromosomes, many of the data collection steps can be optimized with motorized microscopes coupled to automated slide scanning platforms.

  5. An Automated Bayesian Framework for Integrative Gene Expression Analysis and Predictive Medicine

    OpenAIRE

    Parikh, Neena; Zollanvari, Amin; Alterovitz, Gil

    2012-01-01

    Motivation: This work constructs a closed loop Bayesian Network framework for predictive medicine via integrative analysis of publicly available gene expression findings pertaining to various diseases. Results: An automated pipeline was successfully constructed. Integrative models were made based on gene expression data obtained from GEO experiments relating to four different diseases using Bayesian statistical methods. Many of these models demonstrated a high level of accuracy and predictive...

  6. Evaluation of automated analysis of 15N and total N in plant material and soil

    DEFF Research Database (Denmark)

    Jensen, E.S.

    1991-01-01

    was lower than 0.1%. The CV of repeated analyses of N-15-labelled plant material and soil samples varied between 0.3% and 1.1%. The reproducibility of repeated total N analyses using the automated method was comparable to results obtained with a semi-micro Kjeldahl procedure. However, the automated method...... analysis showed that the recovery of inorganic N in the NH3 trap was lower when the N was diffused from water than from 2 M KCl. The results also indicated that different proportions of the NO3- and the NH4+ in aqueous solution were recovered in the trap after combined diffusion. The method is most suited...

  7. Extended automated separation techniques in destructive neutron activation analysis

    International Nuclear Information System (INIS)

    Tjioe, P.S.; Goeij, J.J.M. de; Houtman, J.P.W.

    1977-01-01

    An automated post-irradiation chemical separation scheme for the analysis of 14 trace elements in biological materials is described. The procedure consists of a destruction with sulfuric acid and hydrogen peroxide, a distillation of the volatile elements with hydrobromic acid and chromatography of both distillate and residue over Dowex 2x8 anion exchanger columns. Accuracy, precision and sensitivity are tested with reference materials (BOWEN's kale, NBS bovine liver, IAEA materials, dried animal whole blood, wheat flour, dried potatoes, powdered milk, oyster homogenate) and on a sample of pooled human blood. Blank values due to trace elements in the quartz irradiation vials are also discussed. (T.G.)

  8. Multi-method automated diagnostics of rotating machines

    Science.gov (United States)

    Kostyukov, A. V.; Boychenko, S. N.; Shchelkanov, A. V.; Burda, E. A.

    2017-08-01

    The automated machinery diagnostics and monitoring systems utilized within petrochemical plants are an integral part of the measures taken to ensure the safety and, as a consequence, the efficiency of these industrial facilities. Such systems are often limited in their functionality due to the specifics of the diagnostic techniques adopted. As the diagnostic techniques applied in each system are limited, and machinery defects can have a different physical nature, it becomes necessary to combine several diagnostics and monitoring systems to control the various machinery components. Such an approach is inconvenient, since it requires additional measures to bring the diagnostic results into a single view of the technical condition of production assets. Here, by a production facility we mean an integrated complex of a process unit, a drive, a power source and lines; a failure of any of these components will cause an outage of the production asset, which is unacceptable. The purpose of the study is to test the combined use of vibration diagnostics and partial discharge techniques within the diagnostic systems of enterprises for automated control of the technical condition of rotating machinery during maintenance and at production facilities. The described solutions make it possible to monitor the condition of the mechanical and electrical components of rotating machines. It is shown that the functionality of the diagnostic systems can be expanded with minimal changes to the technological chains of repair and operation of rotating machinery. Automation of such systems reduces the influence of the human factor on the quality of repair and diagnostics of the machinery.

  9. A pattern-based method to automate mask inspection files

    Science.gov (United States)

    Kamal Baharin, Ezni Aznida Binti; Muhsain, Mohamad Fahmi Bin; Ahmad Ibrahim, Muhamad Asraf Bin; Ahmad Noorhani, Ahmad Nurul Ihsan Bin; Sweis, Jason; Lai, Ya-Chieh; Hurat, Philippe

    2017-03-01

    Mask inspection is a critical step in the mask manufacturing process in order to ensure all dimensions printed are within the needed tolerances. This becomes even more challenging as the device nodes shrink and the complexity of the tapeout increases. Thus, the number of measurement points and their critical dimension (CD) types is increasing to ensure the quality of the mask. In addition to the mask quality, there is a significant amount of manpower needed when the preparation and debugging of this process are not automated. By utilizing a novel pattern search technology with the ability to measure and report match region scan-line (edge) measurements, we can create a flow to find, measure and mark all metrology locations of interest and provide this automated report to the mask shop for inspection. A digital library is created based on the technology product and node, which contains the test patterns to be measured. This paper will discuss how these digital libraries are generated and then utilized. As a time-critical part of the manufacturing process, this can also reduce the data preparation cycle time, minimize the amount of manual/human error in naming and measuring the various locations, reduce the risk of wrong/missing CD locations, and reduce the amount of manpower needed overall. We will also review an example pattern and how the reporting structure to the mask shop can be processed. This entire process can now be fully automated.

  10. ARAM: an automated image analysis software to determine rosetting parameters and parasitaemia in Plasmodium samples.

    Science.gov (United States)

    Kudella, Patrick Wolfgang; Moll, Kirsten; Wahlgren, Mats; Wixforth, Achim; Westerhausen, Christoph

    2016-04-18

    Rosetting is associated with severe malaria and is a primary cause of death in Plasmodium falciparum infections. Detailed understanding of this adhesive phenomenon may enable the development of new therapies interfering with rosette formation. For this, it is crucial to determine parameters such as rosetting and parasitaemia of laboratory strains or patient isolates, a bottleneck in malaria research due to the time-consuming and error-prone manual analysis of specimens. Here, the automated, free, stand-alone analysis software automated rosetting analyzer for micrographs (ARAM) to determine rosetting rate, rosette size distribution as well as parasitaemia with a convenient graphical user interface is presented. Automated rosetting analyzer for micrographs is an executable with two operation modes for automated identification of objects on images. The default mode detects red blood cells and fluorescently labelled parasitized red blood cells by combining an intensity-gradient with a threshold filter. The second mode determines object location and size distribution from a single contrast method. The obtained results are compared with standardized manual analysis. Automated rosetting analyzer for micrographs calculates statistical confidence probabilities for rosetting rate and parasitaemia. Automated rosetting analyzer for micrographs analyses 25 cell objects per second, reliably delivering identical results compared to manual analysis. For the first time, rosette size distribution is determined in a precise and quantitative manner employing ARAM in combination with established inhibition tests. Additionally, ARAM measures the essential observables parasitaemia, rosetting rate and size as well as location of all detected objects and provides confidence intervals for the determined observables. No other existing software solution offers this range of function. The second, non-malaria specific, analysis mode of ARAM offers the functionality to detect arbitrary objects

  11. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    Full Text Available To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed with PCR. The PCR products were placed into the automated genetic analyzer (ABI Prism 310, Perkin Elmer, USA) and were subjected to fluorescent scanning with argon ion laser beams. From the fluorescent signal intensity, the genetic analyzer determined the product size in terms of base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides, which alters the original normal locus size) in all the patients. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system. With our newly tested system, microsatellite analysis can be done more cheaply, faster, more easily and with higher scientific accuracy.

  12. Semi-automated potentiometric titration method for uranium characterization

    Energy Technology Data Exchange (ETDEWEB)

    Cristiano, B.F.G., E-mail: barbara@ird.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Delgado, J.U.; Silva, J.W.S. da; Barros, P.D. de; Araujo, R.M.S. de [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear (PEN/COPPE), Universidade Federal do Rio de Janeiro (UFRJ), Ilha do Fundao, PO Box 68509, Rio de Janeiro, 21945-970 RJ (Brazil)

    2012-07-15

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed at the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. - Highlights: ► We developed a semi-automatic version of the potentiometric titration method. ► The method is used for certification and characterization of uranium compounds. ► The traceability of the method was assured by a K₂Cr₂O₇ primary standard. ► The results of the U₃O₈ reference material analyzed were consistent with the certified value. ► The uncertainty obtained, near 0.01%, is useful for characterization purposes.

  13. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment, and displayed resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA Census tract level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which through a USGS and NOAA partnership, contained ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
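
    The inundation core of such a workflow reduces to comparing a surge height surface against high-resolution topography. A minimal numpy sketch with placeholder arrays standing in for the DEM and the parsed Probabilistic Surge output; the census-tract vulnerability (SoVI) join is noted but omitted.

        import numpy as np

        # Placeholder DEM (elevation, metres) and surge field from the model output.
        rng = np.random.default_rng(3)
        dem = rng.uniform(0, 10, size=(500, 500))     # high-resolution topography
        surge = np.full_like(dem, 3.0)                # parsed surge height (m)

        inundated = surge > dem                       # cells under water
        depth = np.where(inundated, surge - dem, 0.0) # flood depth for mapping

        # Vulnerability would then be joined per census tract (e.g., SoVI) before
        # publishing through the API; that join is omitted here.
        print(f"{inundated.mean():.1%} of cells inundated, max depth {depth.max():.2f} m")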

  14. An automated method for determining the cytoadhesion of Plasmodium falciparum-infected erythrocytes to immobilized cells

    DEFF Research Database (Denmark)

    Hempel, Casper; Boisen, Ida M; Efunshile, Akinwale

    2015-01-01

    an automated high-throughput method for this purpose utilizing the pseudoperoxidase activity of intra-erythrocytic haemoglobin. METHODS: Chinese hamster ovary (CHO) cells were grown to confluence in chamber slides and microtiter plates. Cytoadhesion of co-cultured P. falciparum, selected for binding to CHO...... cells, was quantified by microscopy of Giemsa-stained chamber slides. In the automated assay, binding was quantified spectrophotometrically in microtiter plates after cell lysis using tetramethylbenzidine as peroxidase-catalysed substrate. The relevance of the method for binding studies was assessed...... and Bland-Altman plots. RESULTS: The manual and automated quantification showed strong, positive correlation (r² = 0.959, p ...). The automated assay showed the expected dose-dependent reduction in binding to CHO cells when blocking with soluble...

  15. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.

  16. The Synthesis Method of Automated System of Operational Planning in Low-Space Communication System Messaging

    Directory of Open Access Journals (Sweden)

    Serhii Kovbasiuk

    2017-04-01

    Full Text Available One of the reasons for reduced efficiency in low-speed satellite communication systems based on nanoplatforms is the high degree of centralisation in operational planning. To overcome this problem, a method was developed that distributes the tasks of operational communications planning, minimizing the exchange of information between spatially remote sites while taking into account the computing performance of the software and hardware. The technique is based on the use of methods of structural and parametric synthesis, simulation, and statistical analysis of the results. Its use makes it possible to obtain the optimal structure of the automated system of operational planning for messaging in a low-space communication system, and to evaluate its efficiency for a fixed information load.

  17. AUTOMATED DATA ANALYSIS FOR CONSECUTIVE IMAGES FROM DROPLET COMBUSTION EXPERIMENTS

    Directory of Open Access Journals (Sweden)

    Christopher Lee Dembia

    2012-09-01

    Full Text Available A simple automated image analysis algorithm has been developed that processes consecutive images from high speed, high resolution digital images of burning fuel droplets. The droplets burn under conditions that promote spherical symmetry. The algorithm performs the tasks of edge detection of the droplet’s boundary using a grayscale intensity threshold, and shape fitting either a circle or ellipse to the droplet’s boundary. The results are compared to manual measurements of droplet diameters done with commercial software. Results show that it is possible to automate data analysis for consecutive droplet burning images even in the presence of a significant amount of noise from soot formation. An adaptive grayscale intensity threshold provides the ability to extract droplet diameters for the wide range of noise encountered. In instances where soot blocks portions of the droplet, the algorithm manages to provide accurate measurements if a circle fit is used instead of an ellipse fit, as an ellipse can be too accommodating to the disturbance.
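
    A minimal sketch of the two steps named above (grayscale-threshold edge detection and least-squares circle fitting), assuming a single frame on disk; the file name, threshold level, and the algebraic circle fit are illustrative choices, not the paper's exact algorithm.

        import numpy as np
        from skimage import io, measure

        img = io.imread("droplet_frame.png", as_gray=True)  # path is a placeholder

        # Grayscale intensity threshold (adaptive in the paper; fixed here), with
        # the longest iso-contour taken as the droplet boundary.
        contours = measure.find_contours(img, level=0.5)
        boundary = max(contours, key=len)
        y, x = boundary[:, 0], boundary[:, 1]

        # Algebraic least-squares circle fit: x^2 + y^2 + D*x + E*y + F = 0.
        A = np.column_stack([x, y, np.ones_like(x)])
        b = -(x**2 + y**2)
        (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
        cx, cy = -D / 2, -E / 2
        r = np.sqrt(cx**2 + cy**2 - F)
        print(f"diameter = {2 * r:.1f} px")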

  18. Automated three-dimensional X-ray analysis using a dual-beam FIB

    International Nuclear Information System (INIS)

    Schaffer, Miroslava; Wagner, Julian; Schaffer, Bernhard; Schmied, Mario; Mulders, Hans

    2007-01-01

    We present a fully automated method for three-dimensional (3D) elemental analysis demonstrated using a ceramic sample of chemistry (Ca)MgTiOₓ. The specimen is serially sectioned by a focused ion beam (FIB) microscope, and energy-dispersive X-ray spectrometry (EDXS) is used for elemental analysis of each cross-section created. A 3D elemental model is reconstructed from the stack of two-dimensional (2D) data. This work concentrates on issues arising from process automation, the large sample volume of approximately 17×17×10 μm³, and the insulating nature of the specimen. A new routine for post-acquisition data correction of different drift effects is demonstrated. Furthermore, it is shown that EDXS data may be erroneous for specimens containing voids, and that back-scattered electron images have to be used to correct for these errors.

  19. Theoretical methods in the assessment of vision and automated perimetry.

    Science.gov (United States)

    Jindra, Lawrence F

    2006-01-01

    An analytic understanding of automated perimetry requires an appreciation of the fundamental theories of vision and an understanding of the basic mathematical rudiments of signal processing theory. The theories of vision by Weber, Fechner, and Stevens are evaluated, and the mathematical bases of logarithmic, exponential, and power functions are considered as they relate to various models of visual functioning. Presenting perimetry results as actual, linear stimulus values, not theoretical, non-linear response values, could better allow clinicians to assess and examine the testing data directly to evaluate more correctly and accurately their patients' visual function.
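
    For reference, the three psychophysical relations alluded to above, in their standard textbook forms (the constants k and a and the threshold intensity I₀ are model parameters), written in LaTeX:

        % Weber's law: the just-noticeable difference scales with intensity
        \frac{\Delta I}{I} = k

        % Fechner's law: logarithmic mapping from stimulus I to sensation S
        S = k \, \ln\!\left(\frac{I}{I_0}\right)

        % Stevens' power law: sensation as a power function of the stimulus
        \psi(I) = k \, I^{a}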

  20. A mixed optimization method for automated design of fuselage structures.

    Science.gov (United States)

    Sobieszczanski, J.; Loendorf, D.

    1972-01-01

    A procedure for automating the design of transport aircraft fuselage structures has been developed and implemented in the form of an operational program. The structure is designed in two stages. First, an overall distribution of structural material is obtained by means of optimality criteria to meet strength and displacement constraints. Subsequently, the detailed design of selected rings and panels consisting of skin and stringers is performed by mathematical optimization accounting for a set of realistic design constraints. The practicality and computer efficiency of the procedure are demonstrated on cylindrical and area-ruled large transport fuselages.

  1. StrAuto: automation and parallelization of STRUCTURE analysis.

    Science.gov (United States)

    Chhatre, Vikram E; Emerson, Kevin J

    2017-03-24

    Population structure inference using the software STRUCTURE has become an integral part of population genetic studies covering a broad spectrum of taxa including humans. The ever-expanding size of genetic data sets poses computational challenges for this analysis. Although at least one tool currently implements parallel computing to reduce the computational overload of this analysis, it does not fully automate the use of replicate STRUCTURE analysis runs required for downstream inference of optimal K. There is a pressing need for a tool that can deploy population structure analysis on high performance computing clusters. We present an updated version of the popular Python program StrAuto, to streamline population structure analysis using parallel computing. StrAuto implements a pipeline that combines STRUCTURE analysis with the Evanno ΔK analysis and visualization of results using STRUCTURE HARVESTER. Using benchmarking tests, we demonstrate that StrAuto significantly reduces the computational time needed to perform iterative STRUCTURE analysis by distributing runs over two or more processors. StrAuto is the first tool to integrate STRUCTURE analysis with post-processing using a pipeline approach in addition to implementing parallel computation - a setup ideal for deployment on computing clusters. StrAuto is distributed under the GNU GPL (General Public License) and available to download from http://strauto.popgen.org.
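
    A minimal sketch of the run-distribution idea, assuming the console STRUCTURE binary is on the PATH and parameter files already exist; the exact flags and directory layout StrAuto generates may differ, so treat the command line as illustrative only.

        import os
        import subprocess
        from concurrent.futures import ProcessPoolExecutor
        from itertools import product

        def run_structure(job):
            """One STRUCTURE run for a given K and replicate index (flags assume
            the console binary; StrAuto's generated commands may differ)."""
            k, rep = job
            out = f"results/k{k}_rep{rep}"
            cmd = ["structure", "-m", "mainparams", "-e", "extraparams",
                   "-K", str(k), "-o", out]
            subprocess.run(cmd, check=True)
            return out

        if __name__ == "__main__":
            os.makedirs("results", exist_ok=True)
            jobs = list(product(range(1, 11), range(1, 21)))  # K = 1..10, 20 replicates
            with ProcessPoolExecutor(max_workers=8) as pool:  # distribute over 8 cores
                for finished in pool.map(run_structure, jobs):
                    print("finished", finished)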

  2. A rapid and efficient automated method for the sequential separation of plutonium and radiostrontium in seawater

    International Nuclear Information System (INIS)

    Hyuncheol Kim; Kun Ho Chung; Mee Jang; Mun ja Kang; Geun-Sik Choi

    2015-01-01

    A novel method for the simultaneous separation of Pu and 90 Sr in seawater is proposed that is based on precipitation and extraction chromatography with an automated radionuclide separator. Pu from seawater is co-precipitated with Fe(OH) 2 ; 90 Sr is precipitated as SrCO 3 . The precipitates are then dissolved in HNO 3 and sequentially separated using TEVA and Sr-resin columns on an automated radionuclide separator (MARS, Modular Automated Radionuclide Separator). The yield of Pu and Sr from 1 and 10 L of seawater ranged between 50 and 74 %, and between 77 and 95 %, respectively. (author)

  3. An image processing framework for automated analysis of swimming behavior in tadpoles with vestibular alterations

    Science.gov (United States)

    Zarei, Kasra; Fritzsch, Bernd; Buchholz, James H. J.

    2017-03-01

    Microgravity, as experienced during prolonged space flight, presents a problem for space exploration. Animal models, specifically tadpoles, with altered connections of the vestibular ear allow the examination of the effects of microgravity and can be quantitatively monitored through tadpole swimming behavior. We describe an image analysis framework for performing automated quantification of tadpole swimming behavior. Speckle reducing anisotropic diffusion is used to smooth tadpole image signals by diffusing noise while retaining edges. A narrow band level set approach is used for sharp tracking of the tadpole body. Using the level set method for interface tracking has the inherent advantage of pairing naturally with level-set-based image segmentation (active contouring). Active contour segmentation is followed by two-dimensional skeletonization, which allows the automated quantification of tadpole deflection angles, and subsequently tadpole escape (or C-start) response times. Evaluation of the image analysis methodology was performed by comparing the automated quantifications of deflection angles to manual assessments (obtained using a standard grading scheme), and produced a high correlation (r² = 0.99), indicating high reliability and accuracy of the proposed method. The methods presented form an important element of objective quantification of the escape response of the tadpole vestibular system to mechanical and biochemical manipulations, and can ultimately contribute to a better understanding of the effects of altered gravity perception on humans.

  4. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    Science.gov (United States)

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conducted a large study for basic surgical skill assessment on a dataset that contained video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features: approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform for surgical skills assessment. We report the average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing, achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
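
    A minimal self-contained implementation of the approximate entropy feature named above, following Pincus's standard definition; the embedding dimension m and tolerance r are the usual defaults, and the demo series are synthetic.

        import numpy as np

        def approximate_entropy(x, m=2, r=None):
            """Approximate entropy ApEn(m, r) of a 1-D time series: lower values
            indicate more regular, predictable motion."""
            x = np.asarray(x, dtype=float)
            if r is None:
                r = 0.2 * x.std()  # common default tolerance

            def phi(m):
                # All length-m templates and their pairwise Chebyshev distances.
                n = len(x) - m + 1
                templates = np.array([x[i:i + m] for i in range(n)])
                dist = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
                c = (dist <= r).mean(axis=1)  # fraction of templates within r
                return np.log(c).mean()

            return phi(m) - phi(m + 1)

        sine = np.sin(np.linspace(0, 8 * np.pi, 400))
        noise = np.random.default_rng(4).normal(size=400)
        print(approximate_entropy(sine), approximate_entropy(noise))  # sine << noise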

  5. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, Process Engineers and other team members need to manually investigate several reticle inspection reports to determine if yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent working on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.

  6. Methods for Automated and Continuous Commissioning of Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Larry Luskay; Michael Brambley; Srinivas Katipamula

    2003-04-30

    Avoidance of poorly installed HVAC systems is best accomplished at the close of construction by having a building and its systems put "through their paces" with a well conducted commissioning process. This research project focused on developing key components to enable the development of tools that will automatically detect and correct equipment operating problems, thus providing continuous and automatic commissioning of the HVAC systems throughout the life of a facility. A study of pervasive operating problems revealed that the following would most benefit from an automated and continuous commissioning process: (1) faulty economizer operation; (2) malfunctioning sensors; (3) malfunctioning valves and dampers; and (4) access to project design data. Methodologies for detecting system operation faults in these areas were developed and validated in "bare-bones" forms within standard software such as spreadsheets, databases, statistical or mathematical packages. Demonstrations included flow diagrams and simplified mock-up applications. Techniques to manage data were demonstrated by illustrating how test forms could be populated with original design information and the recommended sequence of operation for equipment systems. Proposed tools would use measured data, design data, and equipment operating parameters to diagnose system problems. Steps for future research are suggested to help move toward practical application of automated commissioning and its high potential to improve equipment availability, increase occupant comfort, and extend the life of system equipment.

  7. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Science.gov (United States)

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose: Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess validity and interrater reliability of automated analysis across various levels of FCD severity. Methods: Retroillumination photographs of 97 FCD-affected corneas were acquired, and total counts of guttae had previously been summated manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. Reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results: A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from a severity level of 1 to 5 in the set of analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). The intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870-0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions: Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
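
    The published workflow runs in ImageJ; as a rough open-source analogue (not the authors' exact pipeline), the sketch below counts guttae-like reflections as local intensity maxima above a noise tolerance using scikit-image. The file name, smoothing scale and tolerance are assumptions.

```python
# Rough analogue of the counting step: background subtraction followed by
# detection of local maxima above a per-image noise tolerance.
import numpy as np
from skimage import color, filters, io
from skimage.feature import peak_local_max

def count_guttae(path, noise_tolerance=0.15, min_distance=3):
    img = io.imread(path)
    gray = color.rgb2gray(img) if img.ndim == 3 else img.astype(float)
    # Remove slowly varying background (uneven illumination).
    background = filters.gaussian(gray, sigma=25)
    flat = gray - background
    # Each gutta reflection is treated as one local maximum.
    peaks = peak_local_max(flat, min_distance=min_distance,
                           threshold_abs=noise_tolerance)
    return len(peaks)

# count_guttae("retroillumination.png")  # hypothetical input file
```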

  8. Automated analysis of prerecorded evoked electromyographic activity from rat muscle.

    Science.gov (United States)

    Basarab-Horwath, I; Dewhurst, D G; Dixon, R; Meehan, A S; Odusanya, S

    1989-03-01

    An automated microprocessor-based data acquisition and analysis system has been developed specifically to quantify electromyographic (EMG) activity induced by the convulsant agent catechol in the anaesthetized rat. The stimulus and EMG response are recorded on magnetic tape. On playback, the stimulus triggers a digital oscilloscope and, via interface circuitry, a BBC B microcomputer. The myoelectric activity is digitized by the oscilloscope before being transferred under computer control via an RS232 link to the microcomputer. This system overcomes the problems of dealing with signals of variable latency and allows quantification of latency, amplitude, area and frequency of occurrence of specific components within the signal. The captured data can be used to generate either single or superimposed high-resolution graphic reproductions of the original waveforms. Although this system has been designed for a specific application, it could easily be modified to allow analysis of any complex waveform.

  9. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, M.; Rosenvinge, F. S.; Spillum, E.

    2015-01-01

    Background: Antibiotics of the beta-lactam group are able to alter the shape of the bacterial cell wall, e.g. filamentation or a spheroplast formation. Early determination of antimicrobial susceptibility may be complicated by filamentation of bacteria, as this can be falsely interpreted as growth in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results: Three E. coli strains displaying different resistant profiles and differences in filamentation kinetics were used to study a novel image analysis algorithm to quantify length of bacteria and bacterial filamentation. A total of 12 beta-lactam antibiotics or beta-lactam-beta-lactamase inhibitor combinations were analyzed...

  10. Automated rice leaf disease detection using color image analysis

    Science.gov (United States)

    Pugoy, Reinald Adrian D. L.; Mariano, Vladimir Y.

    2011-06-01

    In rice-related institutions such as the International Rice Research Institute, assessing the health condition of a rice plant through its leaves, which is usually done as a manual eyeball exercise, is important to come up with good nutrient and disease management strategies. In this paper, an automated system that can detect diseases present in a rice leaf using color image analysis is presented. In the system, the outlier region is first obtained from a rice leaf image to be tested using histogram intersection between the test and healthy rice leaf images. Upon obtaining the outlier, it is then subjected to a threshold-based K-means clustering algorithm to group related regions into clusters. Then, these clusters are subjected to further analysis to finally determine the suspected diseases of the rice leaf.
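
    As a simplified stand-in for the two stages described above (not the authors' exact algorithm), the sketch below computes a histogram intersection between a test leaf and a healthy reference to isolate "outlier" color mass, then clusters the outlier pixels with K-means; the bin count and cluster count are assumptions.

```python
# Simplified two-stage sketch: histogram-intersection outlier detection
# followed by K-means clustering of the outlier pixels.
import numpy as np
from sklearn.cluster import KMeans

def disease_regions(test_img, healthy_img, bins=16, n_clusters=3):
    """test_img, healthy_img: uint8 RGB arrays of leaf pixels."""
    def hist(img):
        h, _ = np.histogramdd(img.reshape(-1, 3), bins=(bins,) * 3,
                              range=[(0, 256)] * 3)
        return h / h.sum()
    h_test, h_healthy = hist(test_img), hist(healthy_img)
    # Histogram intersection = color mass shared with the healthy leaf;
    # whatever the test image has in excess is treated as outlier mass.
    outlier = h_test - np.minimum(h_test, h_healthy)
    # Mark pixels that fall into outlier bins, then cluster them.
    idx = (test_img.reshape(-1, 3) // (256 // bins)).astype(int)
    mask = outlier[idx[:, 0], idx[:, 1], idx[:, 2]] > 0
    pixels = test_img.reshape(-1, 3)[mask]
    if len(pixels) < n_clusters:
        return []          # nothing suspicious found
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(pixels.astype(float))
    return km.cluster_centers_  # candidate lesion colors for further analysis
```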

  11. Automated sensitivity analysis: New tools for modeling complex dynamic systems

    International Nuclear Information System (INIS)

    Pin, F.G.

    1987-01-01

    Sensitivity analysis is an established methodology used by researchers in almost every field to gain essential insight in design and modeling studies and in performance assessments of complex systems. Conventional sensitivity analysis methodologies, however, have not enjoyed the widespread use they deserve considering the wealth of information they can provide, partly because of their prohibitive cost or the large initial analytical investment they require. Automated systems have recently been developed at ORNL to eliminate these drawbacks. Compilers such as GRESS and EXAP now allow automatic and cost-effective calculation of sensitivities in FORTRAN computer codes. In this paper, these and other related tools are described, and their impact and applicability in the general areas of modeling, performance assessment and decision making for radioactive waste isolation problems are discussed.
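
    GRESS and EXAP work by instrumenting FORTRAN source; as a language-neutral illustration of the underlying idea only (not the ORNL tools' mechanism), forward-mode automatic differentiation with dual numbers computes a sensitivity alongside the original result:

```python
# Forward-mode automatic differentiation with dual numbers: every operation
# propagates a derivative together with the value.
class Dual:
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)

    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.value + o.value, self.deriv + o.deriv)

    def __mul__(self, o):
        o = self._lift(o)
        return Dual(self.value * o.value,
                    self.deriv * o.value + self.value * o.deriv)

def model(k):
    # Stand-in for a code whose sensitivity to parameter k is wanted.
    return k * k + k * 3.0 + 2.0

k = Dual(2.0, 1.0)           # seed derivative dk/dk = 1
out = model(k)
print(out.value, out.deriv)  # 12.0 and d(out)/dk = 2k + 3 = 7.0
```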

  12. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    Directory of Open Access Journals (Sweden)

    Jean-Claude Gilhodes

    Full Text Available Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible quantitative histological analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis, we developed automated histological image analysis software that operates on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis has been expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that tissue density indexes gave access to a very accurate and reliable quantification of morphological changes induced by BLM, even for the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between automated analysis and the above standard evaluation methods. This correlation
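
    The abstract's two indexes are straightforward to express; the sketch below computes a mean tile density and a high-density tile frequency over micro-tiles of a density image. The tile size and the "high density" cut-off are assumed values, not those of the study.

```python
# Tissue-density indexes computed over image micro-tiles (illustrative).
import numpy as np

def tissue_density_indexes(density_image, tile=32, high_cutoff=0.6):
    """density_image: 2-D float array in [0, 1], where 1 = fully dense.
    Returns (mean tile density, fraction of high-density tiles)."""
    h, w = density_image.shape
    tiles = np.array([density_image[y:y + tile, x:x + tile].mean()
                      for y in range(0, h - tile + 1, tile)
                      for x in range(0, w - tile + 1, tile)])
    return tiles.mean(), float((tiles > high_cutoff).mean())

# Synthetic section: mostly airy lung parenchyma with a dense patch.
img = np.full((256, 256), 0.2)
img[64:160, 64:160] = 0.8
print(tissue_density_indexes(img))
```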

  13. Automated High-Dimensional Flow Cytometric Data Analysis

    Science.gov (United States)

    Pyne, Saumyadipta; Hu, Xinli; Wang, Kui; Rossin, Elizabeth; Lin, Tsung-I.; Maier, Lisa; Baecher-Allan, Clare; McLachlan, Geoffrey; Tamayo, Pablo; Hafler, David; de Jager, Philip; Mesirov, Jill

    Flow cytometry is widely used for single cell interrogation of surface and intracellular protein expression by measuring fluorescence intensity of fluorophore-conjugated reagents. We focus on the recently developed procedure of Pyne et al. (2009, Proceedings of the National Academy of Sciences USA 106, 8519-8524) for automated high-dimensional flow cytometric analysis called FLAME (FLow analysis with Automated Multivariate Estimation). It introduced novel finite mixture models of heavy-tailed and asymmetric distributions to identify and model cell populations in a flow cytometric sample. This approach robustly addresses the complexities of flow data without the need for transformation or projection to lower dimensions. It also addresses the critical task of matching cell populations across samples, which enables downstream analysis. It thus facilitates application of flow cytometry to new biological and clinical problems. To facilitate pipelining with standard bioinformatic applications such as high-dimensional visualization, subject classification or outcome prediction, FLAME has been incorporated into the GenePattern package of the Broad Institute. Analysis of flow data can thereby be approached similarly to other genomic platforms. We also consider some new work that proposes a rigorous and robust solution to the registration problem by a multi-level approach that allows us to model and register cell populations simultaneously across a cohort of high-dimensional flow samples. This new approach is called JCM (Joint Clustering and Matching). It enables direct and rigorous comparisons across different time points or phenotypes in a complex biological study, as well as classification of new patient samples in a more clinical setting.

  14. Adiposoft: automated software for the analysis of white adipose tissue cellularity in histological sections.

    Science.gov (United States)

    Galarraga, Miguel; Campión, Javier; Muñoz-Barrutia, Arrate; Boqué, Noemí; Moreno, Haritz; Martínez, José Alfredo; Milagro, Fermín; Ortiz-de-Solórzano, Carlos

    2012-12-01

    The accurate estimation of the number and size of cells provides relevant information on the kinetics of growth and the physiological status of a given tissue or organ. Here, we present Adiposoft, a fully automated open-source software for the analysis of white adipose tissue cellularity in histological sections. First, we describe the sequence of image analysis routines implemented by the program. Then, we evaluate our software by comparing it with other adipose tissue quantification methods, namely, with the manual analysis of cells in histological sections (used as gold standard) and with the automated analysis of cells in suspension, the most commonly used method. Our results show significant concordance between Adiposoft and the other two methods. We also demonstrate the ability of the proposed method to distinguish the cellular composition of three different rat fat depots. Moreover, we found high correlation and low disagreement between Adiposoft and the manual delineation of cells. We conclude that Adiposoft provides accurate results while considerably reducing the amount of time and effort required for the analysis.

  15. Ankle-brachial index by automated method and renal function

    Directory of Open Access Journals (Sweden)

    Ricardo Pereira Silva

    2017-05-01

    Full Text Available Background: The ankle-brachial index (ABI) is a non-invasive method used for the diagnosis of peripheral arterial occlusive disease (PAOD). Aims: To determine the clinical features of patients submitted to ABI measurement by an automatic method, and to investigate the association between ABI and renal function. Methods: This was a cross-sectional study performed in a private clinic in the city of Fortaleza (CE, Brazil). For ABI analysis, we utilized an automatic methodology using a Microlife device. Data collection took place from March 2012 to January 2016. During this period, ABI was measured in 375 patients aged >50 years who had a diagnosis of hypertension, diabetes or vascular disease. Results: Of the 375 patients, 18 (4.8%) were categorized as having abnormal ABI and 357 (95.2%) as having normal ABI. Patients with abnormal ABI had a higher mean age than patients with normal ABI. Among patients with normal renal function, only 0.95% showed abnormal ABI; among patients with abnormal renal function, 6% showed abnormal ABI. Conclusions: (1) No differences were observed between the groups regarding gender or the prevalence of hypertension, diabetes, dyslipidaemia or CAD. (2) The group with abnormal ABI had greater impairment of renal function.
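
    The index itself is a simple ratio of systolic pressures, shown below with the conventional 0.9 cut-off for suspected PAOD; the abstract does not state which cut-off the automatic device applies, so treat the threshold as an assumption.

```python
# Ankle-brachial index: ankle systolic pressure over brachial systolic
# pressure, with a conventional abnormality cut-off.

def ankle_brachial_index(ankle_systolic_mmHg, brachial_systolic_mmHg):
    return ankle_systolic_mmHg / brachial_systolic_mmHg

def classify_abi(abi, cutoff=0.9):
    return "abnormal (possible PAOD)" if abi < cutoff else "normal"

abi = ankle_brachial_index(95, 130)
print(round(abi, 2), classify_abi(abi))  # 0.73 abnormal (possible PAOD)
```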

  16. An automated system for whole microscopic image acquisition and analysis.

    Science.gov (United States)

    Bueno, Gloria; Déniz, Oscar; Fernández-Carrobles, María Del Milagro; Vállez, Noelia; Salido, Jesús

    2014-09-01

    The field of anatomic pathology has experienced major changes over the last decade. Virtual microscopy (VM) systems have allowed experts in pathology and other biomedical areas to work in a safer and more collaborative way. VMs are automated systems capable of digitizing microscopic samples that were traditionally examined one by one. The possibility of having digital copies reduces the risk of damaging original samples, and also makes it easier to distribute copies among other pathologists. This article describes the development of an automated high-resolution whole slide imaging (WSI) system tailored to the needs and problems encountered in digital imaging for pathology, from hardware control to the full digitization of samples. The system has been built with an additional monochromatic digital camera alongside the default color camera, and LED transmitted illumination (RGB). Monochrome cameras are the preferred method of acquisition for fluorescence microscopy. The system is able to correctly digitize and assemble large, high-resolution microscope images for both brightfield and fluorescence. The quality of the digital images has been quantified using three metrics based on sharpness, contrast and focus. The system has been tested on 150 tissue samples of brain autopsies, prostate biopsies and lung cytologies, at five magnifications: 2.5×, 10×, 20×, 40×, and 63×. The article is focused on the hardware set-up and the acquisition software, although results of the implemented image processing techniques included in the software and applied to the different tissue samples are also presented. © 2014 Wiley Periodicals, Inc.
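
    The article's three quality metrics are not reproduced in the abstract; as generic stand-ins, the sketch below scores sharpness with the variance of the Laplacian and contrast with an RMS measure, two common proxies in focus-quality assessment.

```python
# Generic sharpness and contrast proxies for digitized microscope fields.
import numpy as np
from scipy import ndimage

def sharpness_score(gray_image):
    """Variance of the Laplacian: higher = sharper."""
    return ndimage.laplace(gray_image.astype(float)).var()

def contrast_score(gray_image):
    """Simple RMS contrast normalized by mean intensity."""
    g = gray_image.astype(float)
    return g.std() / (g.mean() + 1e-9)

# Sanity check on synthetic data: blurring must lower the sharpness score.
rng = np.random.default_rng(0)
field = rng.random((256, 256))
blurred = ndimage.gaussian_filter(field, sigma=2)
print(sharpness_score(field) > sharpness_score(blurred))  # True
```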

  17. Automation of block assignment planning using a diagram-based scenario modeling method

    Science.gov (United States)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  18. Automated MRI Volumetric Analysis in Patients with Rasmussen Syndrome.

    Science.gov (United States)

    Wang, Z I; Krishnan, B; Shattuck, D W; Leahy, R M; Moosa, A N V; Wyllie, E; Burgess, R C; Al-Sharif, N B; Joshi, A A; Alexopoulos, A V; Mosher, J C; Udayasankar, U; Jones, S E

    2016-12-01

    Rasmussen syndrome, also known as Rasmussen encephalitis, is typically associated with volume loss of the affected hemisphere of the brain. Our aim was to apply automated quantitative volumetric MR imaging analyses to patients diagnosed with Rasmussen encephalitis, to determine the predictive value of lobar volumetric measures and to assess regional atrophy differences as well as monitor disease progression by using these measures. Nineteen patients (42 scans) with diagnosed Rasmussen encephalitis were studied. We used 2 control groups: one with 42 age- and sex-matched healthy subjects and the other with 42 epileptic patients without Rasmussen encephalitis with the same disease duration as patients with Rasmussen encephalitis. Volumetric analysis was performed on T1-weighted images by using BrainSuite. Ratios of volumes from the affected hemisphere divided by those from the unaffected hemisphere were used as input to a logistic regression classifier, which was trained to discriminate patients from controls. Using the classifier, we compared the predictive accuracy of all the volumetric measures. These ratios were used to further assess regional atrophy differences and correlate with epilepsy duration. Interhemispheric and frontal lobe ratios had the best prediction accuracy for separating patients with Rasmussen encephalitis from healthy controls and patient controls without Rasmussen encephalitis. The insula showed significantly more atrophy compared with all the other cortical regions. Patients with longitudinal scans showed progressive volume loss in the affected hemisphere. Atrophy of the frontal lobe and insula correlated significantly with epilepsy duration. Automated quantitative volumetric analysis provides accurate separation of patients with Rasmussen encephalitis from healthy controls and epileptic patients without Rasmussen encephalitis, and thus may assist the diagnosis of Rasmussen encephalitis. Volumetric analysis could also be included as part of
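
    The classification step lends itself to a compact sketch: affected-to-unaffected volume ratios go into a logistic regression that separates patients from controls. The data below are synthetic and the two ratio features are placeholders, purely to show the mechanics.

```python
# Logistic regression on hemispheric volume ratios (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 40
# Patients: atrophy pulls the affected/unaffected ratio below 1.
patients = rng.normal(loc=[0.85, 0.80], scale=0.05, size=(n, 2))
# Controls: near-symmetric hemispheres, ratios scatter around 1.
controls = rng.normal(loc=[1.00, 1.00], scale=0.05, size=(n, 2))

X = np.vstack([patients, controls])  # columns: e.g. hemisphere, frontal lobe
y = np.array([1] * n + [0] * n)      # 1 = patient, 0 = control

clf = LogisticRegression().fit(X, y)
print(clf.predict([[0.82, 0.78], [1.01, 0.99]]))  # [1 0]
```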

  19. PCA method for automated detection of mispronounced words

    Science.gov (United States)

    Ge, Zhenhao; Sharma, Sudhendu R.; Smith, Mark J. T.

    2011-06-01

    This paper presents a method for detecting mispronunciations with the aim of improving Computer Assisted Language Learning (CALL) tools used by foreign language learners. The algorithm is based on Principal Component Analysis (PCA). It is hierarchical, with each successive step refining the estimate to classify the test word as being either mispronounced or correct. Preprocessing before detection, such as normalization and time-scale modification, is implemented to guarantee uniformity of the feature vectors input to the detection system. The performance using various features, including spectrograms and Mel-Frequency Cepstral Coefficients (MFCCs), is compared and evaluated. Best results were obtained using MFCCs, achieving up to 99% accuracy in word verification and 93% in native/non-native classification. Compared with Hidden Markov Models (HMMs), which are used pervasively in recognition applications, this particular approach is computationally efficient and effective when training data is limited.
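
    As an illustration of the PCA idea (a simplification, not the paper's full hierarchical scheme), one can fit principal components on feature vectors of correctly pronounced words and flag test vectors with a large reconstruction error; all data and the 95th-percentile threshold below are synthetic assumptions.

```python
# PCA reconstruction-error detector on synthetic feature vectors.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(2)
correct = rng.normal(0.0, 1.0, size=(200, 20))  # stand-in for MFCC vectors

pca = PCA(n_components=5).fit(correct)

def reconstruction_error(x):
    z = pca.inverse_transform(pca.transform(x.reshape(1, -1)))
    return float(np.linalg.norm(x - z))

# Threshold taken from the training distribution (an assumption).
threshold = np.percentile([reconstruction_error(v) for v in correct], 95)

outlier = rng.normal(3.0, 1.0, size=20)           # "mispronounced" vector
print(reconstruction_error(outlier) > threshold)  # True
```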

  20. Development of automated system for real-time LIBS analysis

    Science.gov (United States)

    Mazalan, Elham; Ali, Jalil; Tufail, Kashif; Haider, Zuhaib

    2017-03-01

    Recent developments in Laser Induced Breakdown Spectroscopy (LIBS) instrumentation allow the acquisition of several spectra in a second. The dataset from a typical LIBS experiment can consist of a few thousand spectra. Extracting the useful information from such a dataset is a painstaking and time-consuming process. Most currently available software for spectral data analysis is expensive and used for offline data analysis. LabVIEW software compatible with the spectrometer (in this case an Ocean Optics Maya Pro spectrometer) can be used for data acquisition and real-time analysis. In the present work, a LabVIEW-based automated system for real-time LIBS analysis, integrated with the spectrometer device, is developed. This system is capable of performing real-time analysis based on as-acquired LIBS spectra. Here, we demonstrate LIBS data acquisition and real-time calculation of plasma temperature and electron density. Data plots and variations in spectral intensity in response to laser energy were observed on the LabVIEW monitor interface. Routine laboratory samples of brass and calcined bone were utilized in this experiment. The developed program showed impressive performance in real-time data acquisition and analysis.
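
    The plasma-temperature step mentioned above is commonly done with a Boltzmann plot, where ln(Iλ/(gA)) versus the upper-level energy is linear with slope -1/(k_B·T); the sketch below round-trips synthetic line data at 10 000 K. The line parameters are made-up placeholders, and the paper's LabVIEW implementation is not reproduced here.

```python
# Boltzmann-plot estimate of plasma temperature from emission lines.
import numpy as np

K_B_EV = 8.617333262e-5  # Boltzmann constant in eV/K

def boltzmann_temperature(I, lam_nm, g_upper, A_ul, E_upper_eV):
    y = np.log(np.asarray(I) * np.asarray(lam_nm)
               / (np.asarray(g_upper) * np.asarray(A_ul)))
    slope, _ = np.polyfit(E_upper_eV, y, 1)   # slope = -1 / (k_B * T)
    return -1.0 / (K_B_EV * slope)            # temperature in kelvin

# Synthetic lines generated at T = 10 000 K to verify the round trip.
E = np.array([3.0, 4.0, 5.0, 6.0])            # upper-level energies (eV)
g = np.array([3.0, 5.0, 7.0, 9.0])            # statistical weights
A = np.array([1e7, 2e7, 1.5e7, 3e7])          # transition rates (1/s)
lam = np.array([500.0, 510.0, 520.0, 530.0])  # wavelengths (nm)
I = g * A / lam * np.exp(-E / (K_B_EV * 1.0e4))
print(round(boltzmann_temperature(I, lam, g, A, E)))  # 10000
```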

  1. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample, due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, newer, more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrates sample preparation and analysis to enable on-line, near real-time analysis. Examples of these newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave energy enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration that applies to semiconductor materials will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of these methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  2. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  3. Development of an automated method of detecting stereotyped feeding events in multisensor data from tagged rorqual whales.

    Science.gov (United States)

    Allen, Ann N; Goldbogen, Jeremy A; Friedlaender, Ari S; Calambokidis, John

    2016-10-01

    The introduction of animal-borne, multisensor tags has opened up many opportunities for ecological research, making previously inaccessible species and behaviors observable. The advancement of tag technology and the increasingly widespread use of bio-logging tags are leading to large volumes of sometimes extremely detailed data. With the increasing quantity and duration of tag deployments, a set of tools needs to be developed to facilitate and standardize the analysis of movement sensor data. Here, we developed an observation-based decision tree method to detect feeding events in data from multisensor movement tags attached to fin whales (Balaenoptera physalus). Fin whales exhibit an energetically costly and kinematically complex foraging behavior called lunge feeding, an intermittent ram filtration mechanism. Using this automated system, we identified feeding lunges in 19 fin whales tagged with multisensor tags, during a total of over 100 h of continuously sampled data. Using movement sensor and hydrophone data, the automated lunge detector correctly identified an average of 92.8% of all lunges, with a false-positive rate of 9.5%. The strong performance of our automated feeding detector demonstrates an effective, straightforward method of activity identification in animal-borne movement tag data. Our method employs a detection algorithm that utilizes a hierarchy of simple thresholds based on knowledge of observed features of feeding behavior, a technique that is readily modifiable to fit a variety of species and behaviors. Using automated methods to detect behavioral events in tag records will significantly decrease data analysis time and aid in standardizing analysis methods, crucial objectives with the rapidly increasing quantity and variety of on-animal tag data. Furthermore, our results have implications for next-generation tag design, especially long-term tags that can be outfitted with on-board processing algorithms that automatically detect
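
    The abstract describes the detector as a hierarchy of simple thresholds over tag signals; the sketch below applies that structure to two made-up per-window features. The feature names and threshold values are assumptions, not the published criteria.

```python
# Threshold-hierarchy event detector over summarized tag windows.

def detect_lunges(windows, speed_drop_min=1.5, jerk_min=2.0):
    """windows: iterable of dicts with 'speed_drop' (m/s lost while the
    mouth is open) and 'peak_jerk' (m/s^3). Returns indices of windows
    classified as lunges."""
    lunges = []
    for i, w in enumerate(windows):
        # Stage 1: a lunge requires rapid deceleration from engulfment drag.
        if w["speed_drop"] < speed_drop_min:
            continue
        # Stage 2: ...and a strong jerk signature at mouth opening.
        if w["peak_jerk"] < jerk_min:
            continue
        lunges.append(i)
    return lunges

data = [{"speed_drop": 2.1, "peak_jerk": 3.0},
        {"speed_drop": 0.4, "peak_jerk": 2.5},
        {"speed_drop": 1.8, "peak_jerk": 1.0}]
print(detect_lunges(data))  # [0]
```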

  4. Calibration of a semi-automated segmenting method for quantification of adipose tissue compartments from magnetic resonance images of mice.

    Science.gov (United States)

    Garteiser, Philippe; Doblas, Sabrina; Towner, Rheal A; Griffin, Timothy M

    2013-11-01

    To use an automated water-suppressed magnetic resonance imaging (MRI) method to objectively assess adipose tissue (AT) volumes in the whole body and specific regional body components (subcutaneous, thoracic and peritoneal) of obese and lean mice. Water-suppressed MR images were obtained on a 7T, horizontal-bore MRI system in whole bodies (excluding head) of 26-week-old male C57BL6J mice fed a control (10% kcal fat) or high-fat diet (60% kcal fat) for 20 weeks. Manual (outlined regions) versus automated (Gaussian fitting applied to threshold-weighted images) segmentation procedures were compared for whole body AT and regional AT volumes (i.e., subcutaneous, thoracic, and peritoneal). The automated AT segmentation method was compared to dual-energy X-ray absorptiometry (DXA) analysis. The average AT volumes for whole body and individual compartments correlated well between the manual outlining and the automated methods (R2>0.77, p<0.05). Subcutaneous, peritoneal, and total body AT volumes were increased 2-3 fold, and thoracic AT volume increased more than 5-fold, in diet-induced obese mice versus controls (p<0.05). The MRI- and DXA-based methods were highly correlated (R2=0.94, p<0.0001). Automated AT segmentation of water-suppressed MRI data using a global Gaussian filtering algorithm resulted in a fairly accurate assessment of total and regional AT volumes in a pre-clinical mouse model of obesity. © 2013 Elsevier Inc. All rights reserved.

  5. Automated generation of burnup chain for reactor analysis applications

    International Nuclear Information System (INIS)

    Tran, Viet-Phu; Tran, Hoai-Nam; Yamamoto, Akio; Endo, Tomohiro

    2017-01-01

    This paper presents the development of an automated generation of burnup chains for reactor analysis applications. Algorithms are proposed to reevaluate decay modes, branching ratios and effective fission product (FP) cumulative yields of a given list of important FPs, taking into account intermediate reactions. A new burnup chain is generated using updated data sources taken from the JENDL FP Decay Data File 2011 and Fission Yields Data File 2011. The new burnup chain is output in the format of the SRAC code system. Verification has been performed to evaluate the accuracy of the new burnup chain. The results show that the new burnup chain reproduces well the results of a reference chain with 193 fission products used in SRAC. Burnup calculations using the new burnup chain have also been performed for UO2 and MOX fuel pin cells and compared with the reference chain th2cm6fp193bp6T.

  6. Automated uranium analysis by delayed-neutron counting

    International Nuclear Information System (INIS)

    Kunzendorf, H.; Loevborg, L.; Christiansen, E.M.

    1980-10-01

    Automated uranium analysis by fission-induced delayed-neutron counting is described. A short description is given of the instrumentation, including the transfer system, process control, irradiation and counting sites, and computer operations. Characteristic parameters of the facility (sample preparations, background, and standards) are discussed. A sensitivity of 817 ± 22 counts per 10⁻⁶ g U is found using irradiation, delay, and counting times of 20 s, 5 s, and 10 s, respectively. Precision is generally better than 1% for normal geological samples. The critical level and detection limit for 7.5 g samples are 8 and 16 ppb, respectively. The importance of some physical and elemental interferences is outlined. Dead-time corrections of measured count rates are necessary, and a polynomial expression is used for count rates up to 10⁵. The presence of rare earth elements is regarded as the most important elemental interference. A typical application is given and other areas of application are described. (author)
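
    The report's polynomial dead-time correction coefficients are not given in the abstract; for illustration, the standard non-paralyzable model and its low-rate series expansion (itself a polynomial in the measured rate) look like this, with an assumed 1 µs dead time:

```python
# Dead-time correction: exact non-paralyzable model vs. polynomial expansion.

def true_rate_nonparalyzable(m, tau):
    """m = measured count rate (1/s), tau = system dead time (s)."""
    return m / (1.0 - m * tau)

def true_rate_polynomial(m, tau, order=3):
    """Series m * (1 + m*tau + (m*tau)**2 + ...), valid for m*tau << 1."""
    x = m * tau
    return m * sum(x ** k for k in range(order + 1))

m, tau = 8.0e4, 1.0e-6                   # assumed values for illustration
print(true_rate_nonparalyzable(m, tau))  # ~86956.5 counts/s
print(true_rate_polynomial(m, tau))      # ~86953.0, close to the exact value
```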

  7. Knowledge-based requirements analysis for automating software development

    Science.gov (United States)

    Markosian, Lawrence Z.

    1988-01-01

    We present a new software development paradigm that automates the derivation of implementations from requirements. In this paradigm, informally stated requirements are expressed in a domain-specific requirements specification language. This language is machine-understandable, and requirements expressed in it are captured in a knowledge base. Once the requirements are captured, more detailed specifications and eventually implementations are derived by the system using transformational synthesis. A key characteristic of the process is that the required human intervention is in the form of providing problem- and domain-specific engineering knowledge, not in writing detailed implementations. We describe a prototype system that applies the paradigm in the realm of communication engineering: the prototype automatically generates implementations of buffers following analysis of the requirements on each buffer.

  8. Crowdsourcing and Automated Retinal Image Analysis for Diabetic Retinopathy.

    Science.gov (United States)

    Mudie, Lucy I; Wang, Xueyang; Friedman, David S; Brady, Christopher J

    2017-09-23

    As the number of people with diabetic retinopathy (DR) in the USA is expected to increase threefold by 2050, the need to reduce health care costs associated with screening for this treatable disease is ever present. Crowdsourcing and automated retinal image analysis (ARIA) are two areas where new technology has been applied to reduce costs in screening for DR. This paper reviews the current literature surrounding these new technologies. Crowdsourcing has high sensitivity for normal vs abnormal images; however, when multiple categories for severity of DR are added, specificity is reduced. ARIAs have higher sensitivity and specificity, and some commercial ARIA programs are already in use. Deep learning enhanced ARIAs appear to offer even more improvement in ARIA grading accuracy. The utilization of crowdsourcing and ARIAs may be a key to reducing the time and cost burden of processing images from DR screening.

  9. Research Prototype: Automated Analysis of Scientific and Engineering Semantics

    Science.gov (United States)

    Stewart, Mark E. M.; Follen, Greg (Technical Monitor)

    2001-01-01

    Physical and mathematical formulae and concepts are fundamental elements of scientific and engineering software. These classical equations and methods are time tested, universally accepted, and relatively unambiguous. The existence of this classical ontology suggests an ideal problem for automated comprehension. This problem is further motivated by the pervasive use of scientific code and high code development costs. To investigate code comprehension in this classical knowledge domain, a research prototype has been developed. The prototype incorporates scientific domain knowledge to recognize code properties (including units and physical and mathematical quantities). Also, the procedure implements programming language semantics to propagate these properties through the code. This prototype's ability to elucidate code and detect errors will be demonstrated with state-of-the-art scientific codes.
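
    As a toy illustration of the property-propagation idea (a simplification, not the prototype's actual mechanism), one can attach physical units to values and let arithmetic propagate them, turning unit mismatches into detectable errors:

```python
# Toy unit-tracking: arithmetic propagates units; mismatched addition fails.
class Quantity:
    def __init__(self, value, units):
        self.value = value
        self.units = dict(units)      # e.g. {"m": 1, "s": -1} for m/s

    def _combine(self, other, sign):
        u = dict(self.units)
        for k, v in other.units.items():
            u[k] = u.get(k, 0) + sign * v
        return {k: v for k, v in u.items() if v}

    def __mul__(self, other):
        return Quantity(self.value * other.value, self._combine(other, +1))

    def __truediv__(self, other):
        return Quantity(self.value / other.value, self._combine(other, -1))

    def __add__(self, other):
        if self.units != other.units:  # the "error detection" step
            raise TypeError(f"unit mismatch: {self.units} vs {other.units}")
        return Quantity(self.value + other.value, self.units)

distance = Quantity(100.0, {"m": 1})
time = Quantity(9.58, {"s": 1})
speed = distance / time               # units propagate to {'m': 1, 's': -1}
print(speed.value, speed.units)
# distance + time                     # would raise TypeError: unit mismatch
```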

  10. galaxieEST: addressing EST identity through automated phylogenetic analysis.

    Science.gov (United States)

    Nilsson, R Henrik; Rajashekar, Balaji; Larsson, Karl-Henrik; Ursing, Björn M

    2004-07-05

    Research involving expressed sequence tags (ESTs) is intricately coupled to the existence of large, well-annotated sequence repositories. Comparatively complete and satisfactorily annotated public sequence libraries are, however, available only for a limited range of organisms, rendering the absence of sequences and gene structure information a tangible problem for those working with taxa lacking an EST or genome sequencing project. Paralogous genes belonging to the same gene family but distinguished by derived characteristics are particularly prone to misidentification and erroneous annotation; high but incomplete levels of sequence similarity are typically difficult to interpret and have formed the basis of many unsubstantiated assumptions of orthology. In these cases, a phylogenetic study of the query sequence together with the most similar sequences in the database may be of great value to the identification process. In order to facilitate this laborious procedure, a project to employ automated phylogenetic analysis in the identification of ESTs was initiated. galaxieEST is an open source Perl-CGI script package designed to complement traditional similarity-based identification of EST sequences through employment of automated phylogenetic analysis. It uses a series of BLAST runs as a sieve to retrieve nucleotide and protein sequences for inclusion in neighbour joining and parsimony analyses; the output includes the BLAST output, the results of the phylogenetic analyses, and the corresponding multiple alignments. galaxieEST is available as an on-line web service for identification of fungal ESTs and for download / local installation for use with any organism group at http://galaxie.cgb.ki.se/galaxieEST.html. By addressing sequence relatedness in addition to similarity, galaxieEST provides an integrative view on EST origin and identity, which may prove particularly useful in cases where similarity searches return one or more pertinent, but not full, matches and

  11. Analysis methods of uranium

    International Nuclear Information System (INIS)

    Bekdemir, N.; Acarkan, S.

    1997-01-01

    There are various methods for the determination of uranium. The most often used are spectrophotometric methods (PAR, DBM and Arsenazo III) and potentiometric titration. For uranium contents between 1-300 g U/L, the potentiometric titration method, based on oxidation-reduction reactions, gives reliable results. PAR (4-(2-pyridylazo)resorcinol) is a sensitive reagent for uranium, forming complexes in aqueous solutions; it is a suitable method for determination of uranium at concentrations between 2-400 µg U. In this study, the spectrophotometric and potentiometric analysis methods used in the Nuclear Fuel Department will be discussed in detail, and other methods and their principles will be briefly mentioned.

  12. Differentiating Obstructive from Central and Complex Sleep Apnea Using an Automated Electrocardiogram-Based Method

    Science.gov (United States)

    Thomas, Robert Joseph; Mietus, Joseph E.; Peng, Chung-Kang; Gilmartin, Geoffrey; Daly, Robert W.; Goldberger, Ary L.; Gottlieb, Daniel J.

    2007-01-01

    Study Objectives: Complex sleep apnea is defined as sleep disordered breathing secondary to simultaneous upper airway obstruction and respiratory control dysfunction. The objective of this study was to assess the utility of an electrocardiogram (ECG)-based cardiopulmonary coupling technique to distinguish obstructive from central or complex sleep apnea. Design: Analysis of archived polysomnographic datasets. Setting: A laboratory for computational signal analysis. Interventions: None. Measurements and Results: The PhysioNet Sleep Apnea Database, consisting of 70 polysomnograms including single-lead ECG signals of approximately 8 hours duration, was used to train an ECG-based measure of autonomic and respiratory interactions (cardiopulmonary coupling) to detect periods of apnea and hypopnea, based on the presence of elevated low-frequency coupling (e-LFC). In the PhysioNet BIDMC Congestive Heart Failure Database (ECGs of 15 subjects), a pattern of “narrow spectral band” e-LFC was especially common. The algorithm was then applied to the Sleep Heart Health Study–I dataset, to select the 15 records with the highest amounts of broad and narrow spectral band e-LFC. The latter spectral characteristic seemed to detect not only periods of central apnea, but also obstructive hypopneas with a periodic breathing pattern. Applying the algorithm to 77 sleep laboratory split-night studies showed that the presence of narrow band e-LFC predicted an increased sensitivity to induction of central apneas by positive airway pressure. Conclusions: ECG-based spectral analysis allows automated, operator-independent characterization of probable interactions between respiratory dyscontrol and upper airway anatomical obstruction. The clinical utility of spectrographic phenotyping, especially in predicting failure of positive airway pressure therapy, remains to be more thoroughly tested. Citation: Thomas RJ; Mietus JE; Peng CK; Gilmartin G; Daly RW; Goldberger AL; Gottlieb DJ

  13. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    Title 14 — Aeronautics and Space, Vol. 5 (2010-01-01): Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. Section 1261.413, Aeronautics and Space, NATIONAL AERONAUTICS AND SPACE ADMINISTRATION... § 1261.413 Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. The...

  14. Evaluating Changes in Ocular Redness Using a Novel Automated Method.

    Science.gov (United States)

    Amparo, Francisco; Yin, Jia; Di Zazzo, Antonio; Abud, Tulio; Jurkunas, Ula V; Hamrah, Pedram; Dana, Reza

    2017-07-01

    To evaluate interobserver concordance in measured ocular redness among a group of raters using an objective computer-assisted method (ocular redness index [ORI]) and a group of clinicians using an ordinal comparative scale. We conducted a prospective study to evaluate ocular redness in clinical photographs of 12 patients undergoing pterygium surgery. Photographs were acquired preoperatively, and at 1 week and 1 month postoperatively. One group of clinicians graded conjunctival redness in the photographs using an image-based comparative scale. A second group applied the ORI to measure redness in the same photographs. We evaluated redness change between time points, level of agreement among raters, and assessed redness score differences among observers within each group. Interobserver agreement using the image-based redness scale was 0.458 (P < 0.001). Interobserver agreement with the ORI was 0.997 (P < 0.001). We observed statistically significant differences among clinicians' measurements obtained with the image-based redness scale (P < 0.001). There were no significant differences among measurements obtained with the ORI (P = 0.27). We observed a significant change in redness between baseline and follow-up visits with all scoring methods. Detailed analysis of redness change was performed only in the ORI group due to availability of continuous scores. Our findings suggest that the ORI scores provide higher consistency among raters than ordinal scales, and can discriminate redness changes that clinical observers often can miss. The ORI may be a reliable alternative to measure ocular redness objectively in the clinic and in clinical trials.
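
    The ORI formula is not reproduced in the abstract; the relative-redness measure below is a generic stand-in that shows how a per-pixel redness score might be averaged over a conjunctival region of interest.

```python
# Generic relative-redness score over an RGB region of interest.
import numpy as np

def mean_relative_redness(rgb_roi):
    """rgb_roi: float array (H, W, 3) scaled to [0, 1]. The score grows as
    the red channel dominates green and blue."""
    r, g, b = rgb_roi[..., 0], rgb_roi[..., 1], rgb_roi[..., 2]
    return float((r / (r + g + b + 1e-9)).mean())

# Synthetic check: a reddish patch must outscore a neutral grey one.
reddish = np.full((10, 10, 3), [0.8, 0.3, 0.3])
neutral = np.full((10, 10, 3), [0.5, 0.5, 0.5])
print(mean_relative_redness(reddish) > mean_relative_redness(neutral))  # True
```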

  15. Safety and Capacity Analysis of Automated and Manual Highway Systems

    OpenAIRE

    Carbaugh, Jason; Godbole, Datta N.; Sengupta, Raja

    1999-01-01

    This paper compares safety of automated and manual highway systems with respect to resulting rear-end collision frequency and severity. The results show that automated driving is safer than the most alert manual drivers, at similar speeds and capacities. We also present a detailed safety-capacity tradeoff study for four different Automated Highway System concepts that differ in their information structure and separation policy.

  16. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    Energy Technology Data Exchange (ETDEWEB)

    Genebes, Caroline, E-mail: genebes.caroline@claudiusregaud.fr [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Filleron, Thomas; Graff, Pierre [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Jonca, Frédéric [Department of Urology, Clinique Ambroise Paré, Toulouse (France); Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard [Department of Urology and Andrology, CHU Rangueil, Toulouse (France); Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France)

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analyses were performed to identify independent predictors of bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of dose delivered to 90% of prostate volume (D90) and volume of prostate receiving 100% of prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  17. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    Science.gov (United States)

    Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M.

    2012-06-01

    Tumor volume delineation on positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator-dependent and time-consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms that are classical in the fields of optical character recognition, tissue engineering, and non-destructive testing of high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information about the segmented object or any special calibration of the tomograph, as opposed to usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on a clinical PET/CT and on a small-animal PET scanner, with three different signal-to-background ratios. Images were segmented with the 12 automatic thresholding algorithms and the results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.
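
    Ridler's clustering-based method is the classic ISODATA threshold, which scikit-image implements; the sketch below applies it to a synthetic "hot sphere on warm background" image instead of a fixed 42%-of-maximum cut. The phantom geometry and intensities are made up.

```python
# ISODATA (Ridler) automated threshold on a synthetic PET-like slice.
import numpy as np
from skimage.filters import threshold_isodata

rng = np.random.default_rng(3)
img = rng.normal(1.0, 0.2, size=(64, 64))            # background activity
yy, xx = np.mgrid[:64, :64]
sphere = (yy - 32) ** 2 + (xx - 32) ** 2 < 10 ** 2   # hot object mask
img[sphere] += rng.normal(5.0, 0.5, size=int(sphere.sum()))

t = threshold_isodata(img)          # threshold chosen from the image itself
segmented = img > t
print(abs(int(segmented.sum()) - int(sphere.sum())))  # small segmentation error
```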

  18. Specific Methods of Information Security for Nuclear Materials Control and Accounting Automated Systems

    Directory of Open Access Journals (Sweden)

    Konstantin Vyacheslavovich Ivanov

    2013-02-01

    Full Text Available The paper is devoted to specific methods of information security for nuclear materials control and accounting automated systems, which do not require certification of the OS and DBMS and which allow programs to be modified for client-specific needs without modification of the defenses. The ACCORD-2005 system demonstrates an implementation of this method.

  19. Methods for the analysis of neutron-induced resonances in time-of-flight transmission experiments, and automation of these methods on an IBM 7094 II computer; Methode d'analyse des resonances induites par les neutrons dans les experiences de transmission par temps-de-vol et automatisation de ces methodes sur ordinateur IBM-7094 II

    Energy Technology Data Exchange (ETDEWEB)

    Corge, C

    1967-07-01

    The analysis of neutron-induced resonances aims to determine the resonance characteristics: excitation energies; probabilities of de-excitation by gamma radiation emission, by neutron emission or by fission; spins; parities, and so on. This document describes the methods developed or adapted, the calculation schemes, and the algorithms implemented to perform such analyses on a computer, from data obtained during time-of-flight experiments on the Saclay linear accelerator. (A.L.B.)

  1. Automated analysis of NF-κB nuclear translocation kinetics in high-throughput screening.

    Directory of Open Access Journals (Sweden)

    Zi Di

    Full Text Available Nuclear entry and exit of the NF-κB family of dimeric transcription factors plays an essential role in regulating cellular responses to inflammatory stress. The dynamics of this nuclear translocation can vary significantly within a cell population and may change dramatically, e.g. upon drug exposure. Furthermore, there is significant heterogeneity in individual cell response upon stress signaling. In order to systematically determine factors that define NF-κB translocation dynamics, high-throughput screens that enable the analysis of dynamic NF-κB responses in individual cells in real time are essential. Thus far, only NF-κB downstream signaling responses of whole cell populations at the transcriptional level have been available in high-throughput mode. In this study, we developed a fully automated image analysis method to determine the time course of NF-κB translocation in individual cells, suitable for high-throughput screening in the context of compound screening and functional genomics. Two novel segmentation methods were used for defining the individual nuclear and cytoplasmic regions: watershed masked clustering (WMC) and best-fit ellipse of Voronoi cell (BEVC). The dynamic NF-κB oscillatory response at the single-cell and population level was coupled to automated extraction of 26 analogue translocation parameters, including number of peaks, time to reach each peak, and amplitude of each peak. Our automated image analysis method was validated through a series of statistical tests demonstrating the computational efficiency and accuracy of our algorithm in quantifying NF-κB translocation dynamics. Both pharmacological inhibition of NF-κB and short interfering RNAs targeting the inhibitor of NF-κB, IκBα, demonstrated the ability of our method to identify compounds and genetic players that interfere with the nuclear transition of NF-κB.
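
    Extraction of the peak-based parameters lends itself to a compact sketch: given one cell's nuclear NF-κB intensity trace, scipy's peak finder recovers peak count, peak times and amplitudes. The synthetic damped oscillation, sampling interval, and peak criteria below are assumptions, not the study's settings.

```python
# Peak-parameter extraction from a synthetic NF-kB translocation trace.
import numpy as np
from scipy.signal import find_peaks

t = np.arange(0, 600, 5)    # time in minutes, 5-minute sampling (assumed)
trace = 1 + 0.8 * np.exp(-t / 300) * np.maximum(np.sin(2 * np.pi * t / 100), 0)

peaks, props = find_peaks(trace, height=1.1, distance=5)
print("number of peaks:", len(peaks))
print("time to each peak (min):", t[peaks].tolist())
print("amplitude of each peak:", np.round(props["peak_heights"], 2).tolist())
```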

  2. Melanin Bleaching With Warm Hydrogen Peroxide and Integrated Immunohistochemical Analysis: An Automated Platform.

    Science.gov (United States)

    Liu, Chia-Hsing; Lin, Chih-Hung; Tsai, Min-Jan; Chen, Yu-Hsuan; Yang, Sheau-Fang; Tsai, Kun-Bow

    2018-02-01

    Diagnosing melanocytic lesions is among the most challenging problems in the practice of pathology. The physical masking caused by melanin pigment, and the similarity of its color to commonly used chromogens, often complicate examination of the cytomorphology and immunohistochemical staining results for tumor cells. Melanin bleaching can be very helpful for histopathological diagnosis of heavily pigmented melanocytic lesions. Although various depigmentation methods have been reported, no standardized methods have been developed. This study developed a fully automated platform that incorporates hydrogen peroxide-based melanin depigmentation into an automated immunohistochemical analysis. The utility of the method was tested on 1 cell block of malignant melanoma cells in pleural effusion, 10 ocular melanoma tissue samples, and 10 cutaneous melanoma tissue samples. Our results demonstrated that the proposed method, which can be performed in only 3 hours, effectively preserves cell cytomorphology and immunoreactivity. The method is particularly effective for removing melanin pigment to facilitate histopathological examination of cytomorphology and for obtaining an unmasked tissue section for immunohistochemical analysis.

  3. Semi-automated volumetric analysis of artificial lymph nodes in a phantom study

    International Nuclear Information System (INIS)

    Fabel, M.; Biederer, J.; Jochens, A.; Bornemann, L.; Soza, G.; Heller, M.; Bolte, H.

    2011-01-01

    Purpose: Quantification of tumour burden in oncology requires accurate and reproducible image evaluation. The current standard is one-dimensional measurement (e.g. RECIST) with inherent disadvantages. Volumetric analysis is discussed as an alternative for therapy monitoring of lung and liver metastases. The aim of this study was to investigate the accuracy of semi-automated volumetric analysis of artificial lymph node metastases in a phantom study. Materials and methods: Fifty artificial lymph nodes were produced in a size range from 10 to 55 mm; some of them were enhanced using iodine contrast media. All nodules were placed in an artificial chest phantom (artiCHEST®) within different surrounding tissues. MDCT was performed using different collimations (1-5 mm) at varying reconstruction kernels (B20f, B40f, B60f). Volume and RECIST measurements were performed using Oncology Software (Siemens Healthcare, Forchheim, Germany) and were compared to reference volume and diameter by calculating absolute percentage errors. Results: The software performance allowed a robust volumetric analysis in a phantom setting. Unsatisfactory segmentation results were frequently found for native nodules within surrounding muscle. The absolute percentage error (APE) for volumetric analysis varied between 0.01 and 225%. No significant differences were seen between different reconstruction kernels. The most unsatisfactory segmentation results occurred at higher slice thicknesses (4 and 5 mm). Contrast-enhanced lymph nodes tended to show better segmentation results. Conclusion: The semi-automated 3D volumetric analysis software tool allows a reliable and convenient segmentation of artificial lymph nodes in a phantom setting. Lymph nodes adjacent to tissue of similar density cause segmentation problems. For volumetric analysis of lymph node metastases in clinical routine a slice thickness of ≤3 mm and a medium soft reconstruction kernel (e.g. B40f for Siemens scan systems) may be a suitable

  4. Appearance of granulated cells in blood films stained by automated aqueous versus methanolic Romanowsky methods.

    Science.gov (United States)

    Allison, Robin W; Velguth, Karen E

    2010-03-01

    Romanowsky stains are used routinely by veterinary clinical pathology laboratories for cytologic and blood film evaluations. Automated stainers are available for both aqueous and methanolic Romanowsky stains. Mast cell granules and canine distemper virus inclusions are known to stain differently by these 2 methods, but we have noticed differences in the staining characteristics of other granulated cells. The aim of this study was to investigate and document the variable appearance of basophils and large granular lymphocytes in blood films stained using aqueous and methanolic Romanowsky methods. Cytologic preparations from 1 canine mast cell tumor and blood films from 8 dogs, 1 cat, 1 rabbit, and 1 ostrich were stained using an automated aqueous stain (Aerospray 7120, with and without a predip fixative) and an automated methanolic stain (Hematek). Staining quality and intensity of the cytoplasmic granules in mast cells, basophils, and large granular lymphocytes were evaluated subjectively. Cytoplasmic granules of mast cells, basophils, and large granular lymphocytes stained poorly or not at all with the automated aqueous stain but stained prominently and were readily identified with the automated methanolic stain. Use of the predip fixative with the Aerospray method improved the visibility of basophil granules but not mast cell granules, and had a variable effect on the visibility of granules in large granular lymphocytes. Clinical pathologists should be aware of the staining methodology used on the slides they evaluate to avoid incorrect interpretation of granulated cell populations.

  5. Monitored Retrievable Storage/Multi-Purpose Canister analysis: Simulation and economics of automation

    International Nuclear Information System (INIS)

    Bennett, P.C.; Stringer, J.B.

    1994-01-01

    Robotic automation is examined as a possible alternative to manual handling of spent nuclear fuel, transport casks, and multi-purpose canisters (MPCs) at a Monitored Retrievable Storage (MRS) facility. Automation of key operational aspects of the MRS/MPC system is analyzed to determine equipment requirements, throughput times, and equipment costs. The economic and radiation dose impacts resulting from this automation are compared to manual handling methods.

  6. Development of a software for INAA analysis automation

    International Nuclear Information System (INIS)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B.

    2013-01-01

    In this work, software has been developed to automate the post-counting tasks in comparative INAA; it aims to be more flexible than the available options, integrating with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis and an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
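
    Two of the four statistical tools named above are standard enough to sketch. A minimal, hypothetical implementation of the unweighted and the inverse-variance weighted average (the normalized-residuals and Rajeval checks are omitted):

```python
import numpy as np

def unweighted_average(values):
    """Plain mean and standard error of replicate concentration results."""
    v = np.asarray(values, dtype=float)
    return v.mean(), v.std(ddof=1) / np.sqrt(v.size)

def weighted_average(values, uncertainties):
    """Inverse-variance weighted mean, the usual choice when each result
    carries its own counting uncertainty."""
    v = np.asarray(values, dtype=float)
    w = 1.0 / np.asarray(uncertainties, dtype=float) ** 2
    return (w * v).sum() / w.sum(), np.sqrt(1.0 / w.sum())
```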

  7. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial settings (distributed sensors, machines, mobile process controllers). However, wireless technologies pose more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both for maintaining the integrity of computer systems and for increasing the efficient use of this technology in control applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy algorithm for analyzing Bluetooth traffic, a still largely unexplored territory. Further enhancements of this algorithm are presented along with a comparison against the traditional numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control applications, as well as making its use more secure.
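
    The paper's neuro-fuzzy algorithm is not reproduced here, but the basic building block of any fuzzy traffic classifier, membership functions over a traffic feature such as packet inter-arrival time, can be sketched as follows (the sets and thresholds are invented for illustration):

```python
def triangular(x: float, a: float, b: float, c: float) -> float:
    """Triangular fuzzy membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical fuzzy sets over Bluetooth packet inter-arrival time (ms).
def interarrival_membership(dt_ms: float) -> dict:
    return {
        "bursty": triangular(dt_ms, 0.0, 1.0, 5.0),
        "normal": triangular(dt_ms, 2.0, 10.0, 50.0),
        "idle":   triangular(dt_ms, 30.0, 200.0, 1000.0),
    }
```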

  9. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Dejneko, A.O.

    2011-01-01

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different types has been proposed, and the concept of distributed knowledge acquisition, with the aim of computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed.

  10. Analysis of the light curves of SX Aurigae, by automated Fourier techniques

    International Nuclear Information System (INIS)

    Alkan, H.

    1979-01-01

    The aim of the author's work has been to analyse the light changes of the close eclipsing system SX Aurigae in the frequency domain. This analysis is based on Kopal's (1975) new theory developed for the study of light variations, between minima as well as within eclipses, of close eclipsing binaries whose components are distorted by axial rotation and mutual tidal action. A method for the separation of the photometric proximity and eclipse effects directly from the observed data is also presented. In this method no 'rectification' is needed. The automated method has been tested on the light curves of SX Aurigae. Finally, a comparative discussion is given of the Kopal and Kitamura (1965) methods of light curve analysis. (Auth.)
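
    As a toy illustration of frequency-domain light-curve analysis (not Kopal's formalism itself), the low-order Fourier coefficients of an evenly sampled light curve can be obtained with NumPy:

```python
import numpy as np

# Synthetic, evenly sampled light curve over one orbital cycle.
phase = np.linspace(0.0, 1.0, 200, endpoint=False)
light = 1.0 - 0.3 * np.cos(2 * np.pi * phase) - 0.1 * np.cos(4 * np.pi * phase)

coeffs = np.fft.rfft(light) / light.size
amplitudes = 2.0 * np.abs(coeffs[1:6])   # low-order harmonics carry the
print(amplitudes)                        # proximity and eclipse information
```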

  11. The Ocular Redness Index: a novel automated method for measuring ocular injection.

    Science.gov (United States)

    Amparo, Francisco; Wang, Haobing; Emami-Naeini, Parisa; Karimian, Parisa; Dana, Reza

    2013-07-18

    To develop and validate a novel automated system to assess ocular redness (OR) in clinical images. We developed novel software that quantifies OR in digital images based on a mathematical algorithm using a centesimal continuous scoring scale. Subsequently, we conducted a study to validate the scores obtained with this system by correlating them with those obtained by two physicians using two image-based comparative subjective scales, the Efron and the Validated Bulbar Redness (VBR) grading scales. Additionally, we evaluated the level of clinical agreement between the Ocular Redness Index (ORI) score and the two image-based methods by means of Bland-Altman analysis. Main outcome measures included correlation and level of agreement between the ORI, Efron, and VBR scores. One hundred and two clinical photographs of eyes with OR were evaluated. The ORI scores significantly correlated with the scores obtained by the two clinicians using the Efron (Observer 1, R=0.925, P<0.001; Observer 2, R=0.857, P<0.001) and VBR (Observer 1, R=0.830, P<0.001; Observer 2, R=0.821, P<0.001) scales. The Bland-Altman analysis revealed levels of disagreement of up to 30 and 27 units for the ORI-Efron and ORI-VBR score comparisons, respectively. The ORI provides an objective and continuous scale for evaluating ocular injection in an automated manner, without the need for a trained physician for scoring. The ORI may be used as a new alternative for objective OR evaluation in clinics and in clinical trials.
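
    The Bland-Altman agreement analysis used for validation above is straightforward to reproduce; a minimal sketch of the bias and 95% limits of agreement between two paired scoring methods:

```python
import numpy as np

def bland_altman(scores_a, scores_b):
    """Bias and 95% limits of agreement between two paired score sets."""
    a = np.asarray(scores_a, dtype=float)
    b = np.asarray(scores_b, dtype=float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)
```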

  12. Synchronous Control Method and Realization of Automated Pharmacy Elevator

    Science.gov (United States)

    Liu, Xiang-Quan

    First, a control method for the elevator's synchronous motion is presented, and a synchronous control structure for dual servo motors based on PMAC is implemented. Second, the elevator's synchronous control program is implemented using PMAC's linear interpolation motion mode and a position error compensation method. Finally, the PID parameters of the servo motors are tuned. Experiments prove that the control method has high stability and reliability.

  13. Immunoassay of thyroid peroxidase autoantibodies: diagnostic performance in automated third generation methods. A multicentre evaluation.

    Science.gov (United States)

    D'Aurizio, Federica; Tozzoli, Renato; Villalta, Danilo; Pesce, Giampaola; Bagnasco, Marcello

    2015-02-01

    The use of automated immunometric methods for the detection of anti-thyroid peroxidase antibodies (TPOAb), the main serological marker of autoimmune thyroid diseases (AITD), has expanded in recent years. However, it is not known whether these new automated platforms have improved the diagnostic performance of TPOAb assays. The aim of this study was to evaluate the potential improvement in inter-method agreement of current automated third generation systems, 12 years after a previous study that had assessed the analytical variability between semi-automated second generation methods of TPOAb detection. Eight pools of sera from patients with chronic lymphocytic thyroiditis, exhibiting different TPOAb concentrations, were collected from routine laboratory diagnostics and distributed to seven companies throughout Italy. All automated third generation methods were calibrated against the Medical Research Council (MRC) reference preparation 66/387. The overall mean variability (CV) was 93.6% when results were expressed partly in arbitrary units (U/mL) and partly in international units (IU/mL). Converting all values to IU/mL resulted in a significant decrease of the CV (49.8%). The CV expressed as COM (cut-off concentration multiples) was 64.0%. Agreement of qualitative results was 95.3%, with a pronounced difference in the threshold values proposed by manufacturers (range 3.2-35.0 IU/mL). These findings confirm the improvement in harmonisation between different automated third generation TPOAb assays. Nevertheless, further efforts should be made in the definition of the positive cut-off concentration, to avoid misclassification of AITD patients, as well as towards a new international reference preparation and improved autoantigen purification.
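
    A minimal sketch of the two variability measures reported above, the inter-method coefficient of variation for one serum pool and its expression in cut-off concentration multiples (COM), assuming each method reports a value and its own positivity cut-off:

```python
import numpy as np

def inter_method_cv(results_iu_ml):
    """Coefficient of variation (%) across methods for one serum pool."""
    r = np.asarray(results_iu_ml, dtype=float)
    return 100.0 * r.std(ddof=1) / r.mean()

def cut_off_multiples(results_iu_ml, cut_offs_iu_ml):
    """Express each method's result as a multiple of its own cut-off."""
    return np.asarray(results_iu_ml, float) / np.asarray(cut_offs_iu_ml, float)
```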

  14. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.

  15. Interobserver and Intraobserver Variability in pH-Impedance Analysis between 10 Experts and Automated Analysis

    DEFF Research Database (Denmark)

    Loots, Clara M; van Wijk, Michiel P; Blondeau, Kathleen

    2011-01-01

    OBJECTIVE: To determine interobserver and intraobserver variability in pH-impedance interpretation between experts and accuracy of automated analysis (AA). STUDY DESIGN: Ten pediatric 24-hour pH-impedance tracings were analyzed by 10 observers from 7 world groups and with AA. Detection of gastroe...

  16. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    International Nuclear Information System (INIS)

    Upadhyaya, B.R.; Yan, W.

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach by combining information compression methods and artificial neural networks for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully-connected neural networks, that use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for automation of diagnostics using nondestructive examination methods

  17. Development of an Automated LIBS Analytical Test System Integrated with Component Control and Spectrum Analysis Capabilities

    International Nuclear Information System (INIS)

    Ding Yu; Tian Di; Chen Feipeng; Chen Pengfei; Qiao Shujun; Yang Guang; Li Chunsheng

    2015-01-01

    The present paper proposes an automated Laser-Induced Breakdown Spectroscopy (LIBS) analytical test system, which consists of a LIBS measurement and control platform based on a modular design concept and LIBS qualitative spectrum analysis software developed in C#. The platform provides flexible interfacing and automated control; it is compatible with component models from different manufacturers and is constructed in modularized form for easy expandability. For peak identification, a more robust method with improved stability has been achieved by applying additional smoothing to the calculated slope before peaks are identified. For element identification, an improved main-lines analysis method, which checks all elements at each spectral peak to avoid omission of elements without strong spectral lines, is applied to the tested LIBS samples; this method also increases identification speed. Actual applications have been carried out. According to tests, the analytical test system is compatible with components of various models made by different manufacturers. It can automatically control components to acquire experimental data and conduct filtering, peak identification and qualitative analysis on spectral data. (paper)
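
    A rough sketch of the peak-identification idea described above, smoothing the spectrum before searching for peaks, using standard SciPy tools (window length and prominence are illustrative choices, not the paper's values):

```python
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def identify_peaks(wavelengths, intensities):
    """Smooth a LIBS spectrum, then locate candidate emission peaks."""
    wl = np.asarray(wavelengths, dtype=float)
    smoothed = savgol_filter(np.asarray(intensities, dtype=float),
                             window_length=11, polyorder=3)
    idx, _ = find_peaks(smoothed, prominence=5.0 * np.std(np.diff(smoothed)))
    return wl[idx], smoothed[idx]
```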

  18. GapCoder automates the use of indel characters in phylogenetic analysis.

    Science.gov (United States)

    Young, Nelson D; Healy, John

    2003-02-19

    Several ways of incorporating indels into phylogenetic analysis have been suggested. Simple indel coding has two strengths: (1) biological realism and (2) efficiency of analysis. In the method, each indel with different start and/or end positions is considered to be a separate character. The presence/absence of these indel characters is then added to the data set. We have written a program, GapCoder, to automate this procedure. The program can input PIR format aligned datasets, find the indels and add the indel-based characters. The output is a NEXUS format file, which includes a table showing what region each indel character is based on. If regions are excluded from analysis, this table makes it easy to identify the corresponding indel characters for exclusion. Manual implementation of the simple indel coding method can be very time-consuming, especially in data sets where indels are numerous and/or overlapping. GapCoder automates this method and is therefore particularly useful during procedures where phylogenetic analyses need to be repeated many times, such as when different alignments are being explored or when various taxon or character sets are being explored. GapCoder is currently available for Windows from http://www.home.duq.edu/~youngnd/GapCoder.
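
    A minimal sketch of simple indel coding as described above: every gap run with a distinct start/end position becomes one presence/absence character (this illustrates the method itself, not GapCoder's own code, and it ignores the inapplicable-state rule for gaps nested inside larger gaps):

```python
import re

def simple_indel_coding(aligned_seqs):
    """Return the distinct gap spans and a 0/1 character matrix."""
    spans = sorted({m.span() for s in aligned_seqs
                    for m in re.finditer(r"-+", s)})
    matrix = [["1" if m in {g.span() for g in re.finditer(r"-+", s)} else "0"
               for m in spans] for s in aligned_seqs]
    return spans, matrix

spans, chars = simple_indel_coding(["ACGT--GT", "ACGTTTGT", "AC----GT"])
print(spans)   # [(2, 6), (4, 6)]: one indel character per distinct span
print(chars)   # [['0', '1'], ['0', '0'], ['1', '0']]
```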

  19. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    Science.gov (United States)

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis we developed automated software for histological image analysis performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis is expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that tissue density indexes gave access to a very accurate and reliable quantification of morphological changes induced by BLM even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was produced from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. Tissue density indexes correlated significantly with the standard evaluation methods, supporting automated quantification of fibrosis in mice, which will be very valuable for future preclinical drug explorations.
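
    The two indexes defined above reduce to simple tile statistics; a hypothetical sketch, assuming each micro-tile has already been reduced to a tissue-density value in [0, 1] and a high-density threshold has been chosen:

```python
import numpy as np

def fibrosis_indexes(tile_densities, high_density_threshold=0.5):
    """Mean pulmonary tissue density and high-density tile frequency."""
    d = np.asarray(tile_densities, dtype=float)
    mean_density = d.mean()
    high_density_freq = float((d > high_density_threshold).mean())
    return mean_density, high_density_freq
```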

  20. Space Environment Automated Alerts and Anomaly Analysis Assistant (SEA^5) for NASA

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a comprehensive analysis and dissemination system (Space Environment Automated Alerts & Anomaly Analysis Assistant: SEA5) that will...

  1. A Psycholinguistic Model for Simultaneous Translation, and Proficiency Assessment by Automated Acoustic Analysis of Discourse.

    Science.gov (United States)

    Yaghi, Hussein M.

    Two separate but related issues are addressed: how simultaneous translation (ST) works on a cognitive level and how such translation can be objectively assessed. Both of these issues are discussed in the light of qualitative and quantitative analyses of a large corpus of recordings of ST and shadowing. The proposed ST model utilises knowledge derived from a discourse analysis of the data, many accepted facts in the psychology tradition, and evidence from controlled experiments that are carried out here. This model has three advantages: (i) it is based on analyses of extended spontaneous speech rather than word-, syllable-, or clause -bound stimuli; (ii) it draws equally on linguistic and psychological knowledge; and (iii) it adopts a non-traditional view of language called 'the linguistic construction of reality'. The discourse-based knowledge is also used to develop three computerised systems for the assessment of simultaneous translation: one is a semi-automated system that treats the content of the translation; and two are fully automated, one of which is based on the time structure of the acoustic signals whilst the other is based on their cross-correlation. For each system, several parameters of performance are identified, and they are correlated with assessments rendered by the traditional, subjective, qualitative method. Using signal processing techniques, the acoustic analysis of discourse leads to the conclusion that quality in simultaneous translation can be assessed quantitatively with varying degrees of automation. It identifies as measures of performance (i) three content-based standards; (ii) four time management parameters that reflect the influence of the source on the target language time structure; and (iii) two types of acoustical signal coherence. Proficiency in ST is shown to be directly related to coherence and speech rate but inversely related to omission and delay. High proficiency is associated with a high degree of simultaneity and

  2. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  3. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    Science.gov (United States)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used the fuzzy c-means clustering method with automatic selection of cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were automatically computed. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues to achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
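
    The final step above, thresholding the pre/post-contrast subtraction image at a level tied to the estimated noise, can be sketched roughly as follows (the multiplier k is an assumption, not the authors' value):

```python
import numpy as np

def enhanced_fgt_mask(pre, post, fgt_mask, k=3.0):
    """Flag fibroglandular voxels whose enhancement exceeds k times the
    noise level estimated from non-fibroglandular voxels."""
    subtraction = post.astype(float) - pre.astype(float)
    noise = subtraction[~fgt_mask].std()   # crude background noise estimate
    return fgt_mask & (subtraction > k * noise)
```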

  4. GWATCH: a web platform for automated gene association discovery analysis

    Science.gov (United States)

    2014-01-01

    Background As genome-wide sequence analyses for complex human disease determinants are expanding, it is increasingly necessary to develop strategies to promote discovery and validation of potential disease-gene associations. Findings Here we present a dynamic web-based platform – GWATCH – that automates and facilitates four steps in genetic epidemiological discovery: 1) Rapid gene association search and discovery analysis of large genome-wide datasets; 2) Expanded visual display of gene associations for genome-wide variants (SNPs, indels, CNVs), including Manhattan plots, 2D and 3D snapshots of any gene region, and a dynamic genome browser illustrating gene association chromosomal regions; 3) Real-time validation/replication of candidate or putative genes suggested from other sources, limiting Bonferroni genome-wide association study (GWAS) penalties; 4) Open data release and sharing by eliminating privacy constraints (The National Human Genome Research Institute (NHGRI) Institutional Review Board (IRB), informed consent, The Health Insurance Portability and Accountability Act (HIPAA) of 1996 etc.) on unabridged results, which allows for open access comparative and meta-analysis. Conclusions GWATCH is suitable for both GWAS and whole genome sequence association datasets. We illustrate the utility of GWATCH with three large genome-wide association studies for HIV-AIDS resistance genes screened in large multicenter cohorts; however, association datasets from any study can be uploaded and analyzed by GWATCH. PMID:25374661

  5. Comparison between Manual and Automated Methods for Ki-67 Immunoexpression Quantification in Ameloblastomas

    Directory of Open Access Journals (Sweden)

    Rogelio González-González

    2016-01-01

    Ameloblastoma is a common and unpredictable odontogenic tumor with high relapse rates. Several studies assessing the proliferative capacity of these neoplasms have been published, mainly using the protein Ki-67. Cell counts must be completed to determine the cell proliferation rate. Multiple methods have been developed for this purpose. The most widely used method is the labeling index, which has undergone changes over time to better facilitate cell counting. Here, we compared manual cell counting methods with automated cell counting (ImmunoRatio) to determine the relative effectiveness of these methods. The results suggest that ImmunoRatio, a free software tool, may be highly advantageous and provide results similar to manual cell counting methods when used with the appropriate calibration. However, ImmunoRatio has flaws that may affect the labeling index results. Therefore, this automated cell counting method must be supplemented with manual cell counting methods.
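
    The labeling index at the heart of both counting approaches is a one-line ratio; for clarity (illustrative only):

```python
def labeling_index(positive_nuclei: int, total_nuclei: int) -> float:
    """Ki-67 labeling index: percentage of immunopositive nuclei."""
    return 100.0 * positive_nuclei / total_nuclei

print(labeling_index(187, 1000))   # 18.7
```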

  6. Measurement precision and biological variation of cranial arteries using automated analysis of 3 T magnetic resonance angiography

    DEFF Research Database (Denmark)

    Amin, Faisal Mohammad; Lundholm, Elisabet; Hougaard, Anders

    2014-01-01

    BACKGROUND: Non-invasive magnetic resonance angiography (MRA) has facilitated repeated measurements of human cranial arteries in several headache and migraine studies. To ensure comparability across studies the same automated analysis software has been used, but the intra- and interobserver, day-to-day and side-to-side variations have not yet been published. We hypothesised that the observer-related, side-to-side, and day-to-day variations would be less than 10%. METHODS: Ten female participants were studied using high-resolution MRA on two study days separated by at least one week. Using the automated ... 8% for MMA and ≤3.1% for MCA (within observer); between observers ≤3.4% (MMA) and ≤4.1% (MCA); between days ≤6.0% (MMA) and ≤8.0% (MCA); between sides ≤9.4% (MMA) and ≤6.5% (MCA). CONCLUSION: The present study demonstrates a low (<10%) variation in the automated LKEB-MRA vessel wall analysis...

  7. Automated counting of bacterial colonies by image analysis.

    Science.gov (United States)

    Chiang, Pei-Ju; Tseng, Min-Jen; He, Zong-Sian; Li, Chia-Hsun

    2015-01-01

    Research on microorganisms often involves culturing as a means to determine the survival and proliferation of bacteria. The number of colonies in a culture is counted to calculate the concentration of bacteria in the original broth; however, manual counting can be time-consuming and imprecise. To save time and prevent inconsistencies, this study proposes a fully automated counting system using image processing methods. To accurately estimate the number of viable bacteria in a known volume of suspension, colonies distributed over the whole surface area of a plate, including the central and rim areas of a Petri dish, are taken into account. The performance of the proposed system is compared with verified manual counts, as well as with two freely available counting software programs. Comparisons show that the proposed system is an effective method with excellent accuracy, with a mean absolute percentage error of 3.37%. A user-friendly graphical user interface has also been developed and is freely available for download, providing researchers in biomedicine with a more convenient instrument for the enumeration of bacterial colonies.
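
    A bare-bones version of the counting pipeline described above, thresholding the plate image, labeling connected components, and filtering out sub-colony specks, using SciPy (all parameters are illustrative):

```python
import numpy as np
from scipy import ndimage

def count_colonies(gray_image, threshold=0.5, min_pixels=20):
    """Count bright colonies on a dark plate image with values in [0, 1]."""
    binary = np.asarray(gray_image, dtype=float) > threshold
    labels, n = ndimage.label(binary)
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))   # ignore specks and noise
```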

  8. [Accuracy, precision and speed of parenteral nutrition admixture bags manufacturing: comparison between automated and manual methods].

    Science.gov (United States)

    Zegbeh, H; Pirot, F; Quessada, T; Durand, T; Vételé, F; Rose, A; Bréant, V; Aulagner, G

    2011-01-01

    Parenteral nutrition admixtures (PNA) are manufactured in hospital pharmacies by aseptic transfer (AT) or sterilizing filtration (SF). In the absence of a standard, the development of automated filling systems for PNA manufacturing requires an evaluation against the traditional SF method. The filling accuracy of automated AT and of SF was evaluated by mass and physico-chemical tests under repeatability conditions (identical PNA composition; n=5 bags) and reproducibility conditions (different PNA compositions; n=57 bags). For each manufacturing method, the filling precision and the average time for manufacturing PNA bags were evaluated starting from a PNA of identical composition and volume (n=5 trials). The two manufacturing methods did not show a significant difference in accuracy. For both methods, variability was below the limits generally accepted for mass and physico-chemical tests. However, the manufacturing time for SF was longer (five different binary admixtures in five bags) or shorter (one identical binary admixture in five bags) than the time recorded for automated AT. We show that serial manufacturing of PNA bags of identical composition is faster by SF than by automated AT, whereas automated AT is faster than SF for PNA of variable composition. The choice of manufacturing method should therefore be guided by the nature (i.e., variable composition or not) of the bags to be manufactured.

  9. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained...

  10. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    40 CFR § 13.19 (Protection of Environment, edition revised as of 2010-07-01) — Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. (a) The Administrator may periodically...

  11. Development of Nuclear Power Plant Safety Evaluation Method for the Automation Algorithm Application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Geun; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2016-10-15

    It is commonly believed that replacing human operators with automated systems would guarantee greater efficiency, lower workloads, and fewer human errors. Conventional machine learning techniques are considered incapable of handling complex situations in NPPs. Because of such issues, automation is not actively adopted, although human error probability increases drastically during abnormal situations in NPPs due to information overload, high workload, and the short time available for diagnosis. Recently, new machine learning techniques known as 'deep learning' have been actively applied in many fields, and deep learning-based artificial intelligences (AIs) are showing better performance than conventional AIs. In 2015, the deep Q-network (DQN), one of these deep learning techniques, was developed and applied to train an AI that automatically plays various Atari 2600 games, and this AI surpassed human-level play in many of the games. In 2016, 'AlphaGo', developed by Google DeepMind on the basis of deep learning techniques to play the game of Go (i.e. Baduk), defeated the world Go champion Lee Se-dol with a score of 4:1. Motivated by the effort to reduce human error in NPPs, the ultimate goal of this study is the development of an automation algorithm that can cover various situations in NPPs. As a first part, a quantitative, real-time NPP safety evaluation method is being developed in order to provide the training criteria for the automation algorithm. For that purpose, the early warning score (EWS) concept from the medical field was adopted, and its applicability is investigated in this paper. In practice, full automation (i.e. fully replacing human operators) may require much more time for validation and investigation of side effects after the automation algorithm is developed, so adoption in the form of full automation will take a long time.

  12. Development of Nuclear Power Plant Safety Evaluation Method for the Automation Algorithm Application

    International Nuclear Information System (INIS)

    Kim, Seung Geun; Seong, Poong Hyun

    2016-01-01

    It is commonly believed that replacing human operators with automated systems would guarantee greater efficiency, lower workloads, and fewer human errors. Conventional machine learning techniques are considered incapable of handling complex situations in NPPs. Because of such issues, automation is not actively adopted, although human error probability increases drastically during abnormal situations in NPPs due to information overload, high workload, and the short time available for diagnosis. Recently, new machine learning techniques known as 'deep learning' have been actively applied in many fields, and deep learning-based artificial intelligences (AIs) are showing better performance than conventional AIs. In 2015, the deep Q-network (DQN), one of these deep learning techniques, was developed and applied to train an AI that automatically plays various Atari 2600 games, and this AI surpassed human-level play in many of the games. In 2016, 'AlphaGo', developed by Google DeepMind on the basis of deep learning techniques to play the game of Go (i.e. Baduk), defeated the world Go champion Lee Se-dol with a score of 4:1. Motivated by the effort to reduce human error in NPPs, the ultimate goal of this study is the development of an automation algorithm that can cover various situations in NPPs. As a first part, a quantitative, real-time NPP safety evaluation method is being developed in order to provide the training criteria for the automation algorithm. For that purpose, the early warning score (EWS) concept from the medical field was adopted, and its applicability is investigated in this paper. In practice, full automation (i.e. fully replacing human operators) may require much more time for validation and investigation of side effects after the automation algorithm is developed, so adoption in the form of full automation will take a long time.
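
    The deep Q-network mentioned above approximates a value table with a deep network; the underlying one-step update that DQN builds on can be sketched in tabular form (illustrative only, not the study's algorithm):

```python
import numpy as np

def q_update(Q, s, a, r, s_next, alpha=0.1, gamma=0.99):
    """One-step temporal-difference update of the action value Q[s, a]."""
    target = r + gamma * np.max(Q[s_next])   # bootstrapped return estimate
    Q[s, a] += alpha * (target - Q[s, a])
    return Q
```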

  13. Approach to analysis of single nucleotide polymorphisms by automated constant denaturant capillary electrophoresis

    International Nuclear Information System (INIS)

    Bjoerheim, Jens; Abrahamsen, Torveig Weum; Kristensen, Annette Torgunrud; Gaudernack, Gustav; Ekstroem, Per O.

    2003-01-01

    Melting gel techniques have proven to be amenable and powerful tools in point mutation and single nucleotide polymorphism (SNP) analysis. With the introduction of commercially available capillary electrophoresis instruments, a partly automated platform for denaturant capillary electrophoresis with potential for routine screening of selected target sequences has been established. The aim of this article is to demonstrate the use of automated constant denaturant capillary electrophoresis (ACDCE) in single nucleotide polymorphism analysis of various target sequences. Optimal analysis conditions for different single nucleotide polymorphisms on ACDCE are evaluated with the Poland algorithm. Laboratory procedures include only PCR and electrophoresis. For direct genotyping of individual SNPs, the samples are analyzed with an internal standard and the alleles are identified by co-migration of sample and standard peaks. In conclusion, SNPs suitable for melting gel analysis based on theoretical thermodynamics were separated by ACDCE under appropriate conditions. With this instrumentation (ABI 310 Genetic Analyzer), 48 samples could be analyzed without any intervention. Several institutions have capillary instrumentation in-house, thus making this SNP analysis method accessible to large groups of researchers without any need for instrument modification

  14. Methods for automated semantic definition of manufacturing structures (mBOM) in mechanical engineering companies

    Science.gov (United States)

    Stekolschik, Alexander, Prof.

    2017-10-01

    The bill of materials (BOM), which lists all parts and assemblies of a product, is the core of any mechanical or electronic product. The flexible and integrated management of engineering (Engineering Bill of Materials [eBOM]) and manufacturing (Manufacturing Bill of Materials [mBOM]) structures is the key to the creation of modern products in mechanical engineering companies. This paper presents a method framework for the creation and control of the eBOM and, especially, the mBOM. Requirements arising from the distinction between companies that produce serialized products and those that produce engineered-to-order products are considered in the analysis phase. The main part of the paper describes different approaches to fully or partly automated creation of the mBOM. The first approach is the definition of part selection rules in generic mBOM templates; for partly standardized products, the mBOM can be derived from the eBOM by using this method. Another approach is the simultaneous use of semantic rules, options, and parameters in both structures. The implementation of the method framework (selection of use cases) in a standard product lifecycle management (PLM) system is part of the research.
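
    A toy sketch of the first approach, part-selection rules attached to a generic mBOM template and evaluated against eBOM item attributes (all names and rules are invented for illustration):

```python
# eBOM items with attributes the template rules can test.
ebom_items = [
    {"id": "P-100", "kind": "motor",   "voltage": 230},
    {"id": "P-200", "kind": "motor",   "voltage": 400},
    {"id": "P-300", "kind": "housing", "voltage": None},
]

# Generic mBOM template: one selection rule (predicate) per slot.
mbom_rules = {
    "drive-unit": lambda it: it["kind"] == "motor" and it["voltage"] == 400,
    "enclosure":  lambda it: it["kind"] == "housing",
}

# Deriving the mBOM keeps only the parts whose rules evaluate to true.
mbom = {slot: [it["id"] for it in ebom_items if rule(it)]
        for slot, rule in mbom_rules.items()}
print(mbom)   # {'drive-unit': ['P-200'], 'enclosure': ['P-300']}
```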

  15. Validation of the Automated Method VIENA: An Accurate, Precise, and Robust Measure of Ventricular Enlargement

    NARCIS (Netherlands)

    Vrenken, H.; Vos, E.K.; van der Flier, W.M.; Sluimer, I.C.; Cover, K.S.; Knol, D.L.; Barkhof, F.

    2014-01-01

    Background: In many retrospective studies and large clinical trials, high-resolution, good-contrast 3DT1 images are unavailable, hampering detailed analysis of brain atrophy. Ventricular enlargement then provides a sensitive indirect measure of ongoing central brain atrophy. Validated automated

  16. An Automated Method for Semantic Classification of Regions in Coastal Images

    NARCIS (Netherlands)

    Hoonhout, B.M.; Radermacher, M.; Baart, F.; Van der Maaten, L.J.P.

    2015-01-01

    Large, long-term coastal imagery datasets are nowadays a low-cost source of information for various coastal research disciplines. However, the applicability of many existing algorithms for coastal image analysis is limited for these large datasets due to a lack of automation and robustness.

  17. Structure of an automated educational-methodical complex for technical disciplines

    Directory of Open Access Journals (Sweden)

    Вячеслав Михайлович Дмитриев

    2010-12-01

    This article poses and addresses the problem of automating and informatizing the training of students on the basis of newly introduced organizational forms, collectively known as educational-methodical complexes for a discipline.

  18. Automated static image analysis as a novel tool in describing the physical properties of dietary fiber

    Directory of Open Access Journals (Sweden)

    Marcin Andrzej KUREK

    2015-01-01

    The growing interest in the use of dietary fiber in food has created the need for precise tools to describe its physical properties. This research examined two dietary fibers, from oats and beets respectively, at variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were measured at two points: dry and after water soaking. The highest water holding capacity (7.00 g water/g solid) was achieved by the smaller-sized oat fiber. Conversely, water holding capacity was highest (4.20 g water/g solid) in the larger-sized beet fiber. There was evidence of water absorption increasing with decreasing particle size within the same fiber source. Very strong correlations were found between particle shape parameters, such as fiber length, straightness and width, and the hydration properties measured conventionally. The regression analysis provided the opportunity to assess whether the automated static image analysis method could be an efficient tool for describing the hydration properties of dietary fiber. The application of the method was validated using a mathematical model which was verified against conventional WHC measurement results.
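
    The regression step described above, relating hydration properties to measured shape parameters, is an ordinary least-squares fit; a hypothetical sketch with invented numbers:

```python
import numpy as np

# Rows: fiber samples; columns: length, straightness, width (illustrative).
X = np.array([[1.2, 0.80, 0.15],
              [0.6, 0.90, 0.10],
              [2.0, 0.70, 0.25],
              [0.9, 0.85, 0.12],
              [1.6, 0.75, 0.20]])
whc = np.array([5.1, 7.0, 4.2, 6.3, 4.8])   # g water / g solid

# Least-squares fit of WHC on the shape parameters plus an intercept.
A = np.column_stack([X, np.ones(len(X))])
coef, *_ = np.linalg.lstsq(A, whc, rcond=None)
print(coef)   # slopes for each shape parameter, then the intercept
```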

  19. Application of automated image analysis to coal petrography

    Science.gov (United States)

    Chao, E.C.T.; Minkin, J.A.; Thompson, C.L.

    1982-01-01

    The coal petrologist seeks to determine the petrographic characteristics of organic and inorganic coal constituents and their lateral and vertical variations within a single coal bed or different coal beds of a particular coal field. Definitive descriptions of coal characteristics and coal facies provide the basis for interpretation of depositional environments, diagenetic changes, and burial history and determination of the degree of coalification or metamorphism. Numerous coal core or columnar samples must be studied in detail in order to adequately describe and define coal microlithotypes, lithotypes, and lithologic facies and their variations. The large amount of petrographic information required can be obtained rapidly and quantitatively by use of an automated image-analysis system (AIAS). An AIAS can be used to generate quantitative megascopic and microscopic modal analyses for the lithologic units of an entire columnar section of a coal bed. In our scheme for megascopic analysis, distinctive bands 2 mm or more thick are first demarcated by visual inspection. These bands consist of either nearly pure microlithotypes or lithotypes such as vitrite/vitrain or fusite/fusain, or assemblages of microlithotypes. Megascopic analysis with the aid of the AIAS is next performed to determine volume percentages of vitrite, inertite, minerals, and microlithotype mixtures in bands 0.5 to 2 mm thick. The microlithotype mixtures are analyzed microscopically by use of the AIAS to determine their modal composition in terms of maceral and optically observable mineral components. Megascopic and microscopic data are combined to describe the coal unit quantitatively in terms of (V) for vitrite, (E) for liptite, (I) for inertite or fusite, (M) for mineral components other than iron sulfide, (S) for iron sulfide, and (VEIM) for the composition of the mixed phases Xi (i = 1, 2, etc.) in terms of the maceral groups vitrinite V, exinite E, inertinite I, and optically observable mineral

  20. Automated 3-D echocardiography analysis compared with manual delineations and SPECT MUGA.

    Science.gov (United States)

    Sanchez-Ortiz, Gerardo I; Wright, Gabriel J T; Clarke, Nigel; Declerck, Jérôme; Banning, Adrian P; Noble, J Alison

    2002-09-01

    A major barrier for using 3-D echocardiography for quantitative analysis of heart function in routine clinical practice is the absence of accurate and robust segmentation and tracking methods necessary to make the analysis automatic. In this paper, we present an automated three-dimensional (3-D) echocardiographic acquisition and image-processing methodology for assessment of left ventricular (LV) function. We combine global image information provided by a novel multiscale fuzzy-clustering segmentation algorithm, with local boundaries obtained with phase-based acoustic feature detection. We then use the segmentation results to fit and track the LV endocardial surface using a 3-D continuous transformation. To our knowledge, this is the first report of a completely automated method. The protocol is evaluated in a small clinical case study (nine patients). We compare ejection fractions (EFs) computed with the new approach to those obtained using the standard clinical technique, single-photon emission computed tomography multigated acquisition. Errors on six datasets were found to be within six percentage points. A further two, with poor image quality, improved upon EFs from manually delineated contours, and the last failed due to artifacts in the data. Volume-time curves were derived and the results compared to those from manual segmentation. Improvement over an earlier published version of the method is noted.
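
    The ejection fraction compared above follows directly from the segmented end-diastolic and end-systolic volumes; for reference:

```python
def ejection_fraction(edv_ml: float, esv_ml: float) -> float:
    """LV ejection fraction (%) from end-diastolic/end-systolic volumes."""
    return 100.0 * (edv_ml - esv_ml) / edv_ml

print(ejection_fraction(120.0, 50.0))   # ~58.3
```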

  1. Electrically evoked compound action potentials artefact rejection by independent component analysis: procedure automation.

    Science.gov (United States)

    Akhoun, Idrick; McKay, Colette; El-Deredy, Wael

    2015-01-15

    Independent component analysis (ICA) successfully separated electrically evoked compound action potentials (ECAPs) from the stimulation artefact and noise (ECAP-ICA, Akhoun et al., 2013). This paper shows how to automate the ECAP-ICA artefact cancellation process. Raw ECAPs without artefact rejection were consecutively recorded for each stimulation condition from at least 8 intra-cochlear electrodes. First, amplifier-saturated recordings were discarded, and the data from different stimulus conditions (different current levels) were concatenated temporally. The key aspect of the automation procedure was the sequential deductive source categorisation after ICA was applied with a restriction to 4 sources. The stereotypical aspect of the 4 sources enables their automatic classification as two artefact components, a noise component and the sought ECAP, based on theoretical and empirical considerations. The automatic procedure was tested on 8 cochlear implant (CI) users and one to four stimulus electrodes. The artefact and noise sources were successively identified and discarded, leaving the ECAP as the remaining source. The automated ECAP-ICA procedure successfully extracted the correct ECAPs, compared to the standard clinical forward-masking paradigm, in 22 out of 26 cases. ECAP-ICA does not require extracting the ECAP from a combination of distinct buffers, as is the case with regular methods. It is an alternative that avoids the possible bias of traditional artefact rejection such as alternate-polarity or forward-masking paradigms. The ECAP-ICA procedure bears clinical relevance, for example as the artefact rejection sub-module of automated ECAP-threshold detection techniques, which are common features of CI clinical fitting software.
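
    The core ICA step, unmixing the concatenated multi-electrode recordings into exactly four sources, can be sketched with scikit-learn's FastICA (a stand-in implementation; the paper does not specify this library):

```python
import numpy as np
from sklearn.decomposition import FastICA

def unmix_ecap(recordings):
    """recordings: (n_samples, n_electrodes) concatenated sweeps.
    Returns four estimated sources to be categorised as two artefact
    components, a noise component, and the sought ECAP."""
    ica = FastICA(n_components=4, random_state=0)
    sources = ica.fit_transform(np.asarray(recordings, dtype=float))
    return sources, ica.mixing_
```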

  2. Automated absolute activation analysis with californium-252 sources

    Energy Technology Data Exchange (ETDEWEB)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg 252Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma-ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma-ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma-ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from 17O. Detection sensitivities of ≤400 ppb for natural uranium and 8 ppb (≤0.5 nCi/g) for 239Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level.

  3. Automated absolute activation analysis with californium-252 sources

    International Nuclear Information System (INIS)

    MacMurdo, K.W.; Bowman, W.W.

    1978-09-01

    A 100-mg 252Cf neutron activation analysis facility is used routinely at the Savannah River Laboratory for multielement analysis of many solid and liquid samples. An absolute analysis technique converts counting data directly to elemental concentration without the use of classical comparative standards and flux monitors. With the totally automated pneumatic sample transfer system, cyclic irradiation-decay-count regimes can be pre-selected for up to 40 samples, and samples can be analyzed with the facility unattended. An automatic data control system starts and stops a high-resolution gamma-ray spectrometer and/or a delayed-neutron detector; the system also stores data and controls output modes. Gamma-ray data are reduced by three main programs in the IBM 360/195 computer: the 4096-channel spectrum and pertinent experimental timing, counting, and sample data are stored on magnetic tape; the spectrum is then reduced to a list of significant photopeak energies, integrated areas, and their associated statistical errors; and the third program assigns gamma-ray photopeaks to the appropriate neutron activation product(s) by comparing photopeak energies to tabulated gamma-ray energies. Photopeak areas are then converted to elemental concentration by using experimental timing and sample data, calculated elemental neutron capture rates, absolute detector efficiencies, and absolute spectroscopic decay data. Calculational procedures have been developed so that fissile material can be analyzed by cyclic neutron activation and delayed-neutron counting procedures. These calculations are based on a 6 half-life group model of delayed neutron emission; calculations include corrections for delayed neutron interference from 17O. Detection sensitivities of ≤400 ppb for natural uranium and 8 ppb (≤0.5 nCi/g) for 239Pu were demonstrated with 15-g samples at a throughput of up to 140 per day. Over 40 elements can be detected at the sub-ppm level.

  4. Oak Ridge National Laboratory automated clean chemistry for bulk analysis of environmental swipe samples

    Energy Technology Data Exchange (ETDEWEB)

    Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-11-01

    To shorten the lengthy and costly manual chemical purification procedures, sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment. This addresses a serious need in the nuclear safeguards community to debottleneck the separation of U and Pu in environmental samples—currently performed by overburdened chemists—with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on current COTS equipment that was modified for U/Pu separations utilizing Eichrom™ TEVA and UTEVA resins. Initial verification of individual columns yielded small elution volumes with consistent elution profiles and good recovery. Combined column calibration demonstrated ample separation without cross-contamination of the eluent. Automated packing and unpacking of the built-in columns initially showed >15% deviation in resin loading by weight, which can lead to inconsistent separations. Optimization of the packing and unpacking methods led to a reduction in the variability of the packed resin to less than 5% daily. The reproducibility of the automated system was tested with samples containing 30 ng U and 15 pg Pu, which were separated in a series with alternating reagent blanks. These experiments showed very good washout of both the resin and the sample from the columns as evidenced by low blank values. Analysis of the major and minor isotope ratios for U and Pu provided values well within data quality limits for the International Atomic Energy Agency. Additionally, system process blanks spiked with ²³³U and ²⁴⁴Pu tracers were separated using the automated system after it was moved outside of a clean room and yielded levels equivalent to clean room blanks, confirming that the system can produce high quality results without the need for expensive clean room infrastructure. Comparison of the amount of personnel time necessary for successful manual vs. automated separations ...

  5. Automation of the gamma method for comparison of dosimetry images

    International Nuclear Information System (INIS)

    Moreno Reyes, J. C.; Macias Jaen, J.; Arrans Lara, R.

    2013-01-01

    The objective of this work was the development of the JJGAMMA analysis software, which performs this comparison task systematically, minimizing specialist intervention and therefore the variability due to the observer. Both benefits allow image comparison to be carried out in practice with the required frequency and objectivity. (Author)

  6. Image cytometer method for automated assessment of human spermatozoa concentration

    DEFF Research Database (Denmark)

    Egeberg, D L; Kjaerulff, S; Hansen, C

    2013-01-01

    In the basic clinical work-up of infertile couples, a semen analysis is mandatory and the sperm concentration is one of the most essential variables to be determined. Sperm concentration is usually assessed by manual counting using a haemocytometer and is hence labour intensive and may be subject...

  7. Automated image analysis of microstructure changes in metal alloys

    Science.gov (United States)

    Hoque, Mohammed E.; Ford, Ralph M.; Roth, John T.

    2005-02-01

    The ability to identify and quantify changes in the microstructure of metal alloys is valuable in metal cutting and shaping applications. For example, certain metals, after being cryogenically and electrically treated, have shown large increases in their tool life when used in manufacturing cutting and shaping processes. However, the mechanisms of microstructure changes in alloys under various treatments, which cause them to behave differently, are not yet fully understood. The changes are currently evaluated in a semi-quantitative manner by visual inspection of images of the microstructure. This research applies pattern recognition technology to quantitatively measure the changes in microstructure and to validate the initial assertion of increased tool life under certain treatments. Heterogeneous images of aluminum and tungsten carbide of various categories were analyzed using a process including background correction, adaptive thresholding, edge detection and other algorithms for automated analysis of microstructures. The algorithms are robust across a variety of operating conditions. This research not only facilitates better understanding of the effects of electric and cryogenic treatment of these materials, but also their impact on tooling and metal-cutting processes.
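
    The processing chain described above (background correction, adaptive thresholding, edge detection) can be sketched in a few lines. The following is a minimal illustration using scikit-image, assuming a grayscale micrograph; the parameter values (smoothing sigma, block size, minimum object size) are illustrative choices, not those of the paper:

    ```python
    import numpy as np
    from skimage import feature, filters, morphology
    from skimage.io import imread

    def segment_microstructure(path):
        """Segment features in a grayscale micrograph of a metal alloy."""
        image = imread(path, as_gray=True)

        # Background correction: subtract a heavily smoothed copy to remove
        # uneven illumination, then rescale the result to [0, 1].
        corrected = image - filters.gaussian(image, sigma=50)
        corrected = (corrected - corrected.min()) / (np.ptp(corrected) + 1e-12)

        # Adaptive (local) thresholding copes with any residual shading.
        binary = corrected > filters.threshold_local(corrected, block_size=51)
        binary = morphology.remove_small_objects(binary, min_size=20)

        # Edge detection outlines grain boundaries for later quantification.
        edges = feature.canny(corrected, sigma=2.0)
        return binary, edges
    ```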

  8. Technical and economic viability of automated highway systems : preliminary analysis

    Science.gov (United States)

    1997-01-01

    Technical and economic investigations of automated highway systems (AHS) are addressed. It has generally been accepted that such systems show potential to alleviate urban traffic congestion, so most of the AHS research has been focused instead on tec...

  9. Automated "float" method for determination of densities of molten salts

    DEFF Research Database (Denmark)

    Andreasen, Helge A.; Bjerrum, Niels; Foverskov, Carl Erik

    1977-01-01

    wound with platinum wire, an amplifier, a digital voltmeter, an interface, a paper tape punch, and a recorder. The advantages of the system are its ease of operation compared to other "float" methods, and the possibility of looking at highly colored melts and also melts having a high vapor pressure...

  10. Automating X-ray Fluorescence Analysis for Rapid Astrobiology Surveys.

    Science.gov (United States)

    Thompson, David R; Flannery, David T; Lanka, Ravi; Allwood, Abigail C; Bue, Brian D; Clark, Benton C; Elam, W Timothy; Estlin, Tara A; Hodyss, Robert P; Hurowitz, Joel A; Liu, Yang; Wade, Lawrence A

    2015-11-01

    A new generation of planetary rover instruments, such as PIXL (Planetary Instrument for X-ray Lithochemistry) and SHERLOC (Scanning Habitable Environments with Raman & Luminescence for Organics & Chemicals) selected for the Mars 2020 mission rover payload, aim to map mineralogical and elemental composition in situ at microscopic scales. These instruments will produce large spectral cubes with thousands of channels acquired over thousands of spatial locations, a large potential science yield limited mainly by the time required to acquire a measurement after placement. A secondary bottleneck also faces mission planners after downlink; analysts must interpret the complex data products quickly to inform tactical planning for the next command cycle. This study demonstrates operational approaches to overcome these bottlenecks by specialized early-stage science data processing. Onboard, simple real-time systems can perform a basic compositional assessment, recognizing specific features of interest and optimizing sensor integration time to characterize anomalies. On the ground, statistically motivated visualization can make raw uncalibrated data products more interpretable for tactical decision making. Techniques such as manifold dimensionality reduction can help operators comprehend large databases at a glance, identifying trends and anomalies in data. These onboard and ground-side analyses can complement a quantitative interpretation. We evaluate system performance for the case study of PIXL, an X-ray fluorescence spectrometer. Experiments on three representative samples demonstrate improved methods for onboard and ground-side automation and illustrate new astrobiological science capabilities unavailable in previous planetary instruments. Key words: Dimensionality reduction; Planetary science; Visualization.
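
    As a rough illustration of the ground-side visualization described here, the sketch below projects a spectral cube onto its first three principal components for a false-color quick-look. PCA stands in for the manifold methods mentioned in the abstract, and the cube shape and data are synthetic placeholders:

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    def spectral_quicklook(cube):
        """Project a (rows, cols, channels) spectral cube onto 3 components.

        Each spatial location carries a full spectrum; the first three
        principal components give a false-color map in which compositional
        trends and anomalies stand out at a glance.
        """
        rows, cols, channels = cube.shape
        scores = PCA(n_components=3).fit_transform(cube.reshape(-1, channels))
        # Stretch each component to [0, 1] so it can act as a color channel.
        lo, hi = scores.min(axis=0), scores.max(axis=0)
        return ((scores - lo) / (hi - lo + 1e-12)).reshape(rows, cols, 3)

    # Synthetic stand-in: a 64 x 64 map with 1024 spectral channels.
    quicklook = spectral_quicklook(np.random.rand(64, 64, 1024))
    ```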

  11. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Directory of Open Access Journals (Sweden)

    Phlypo Ronald

    2010-01-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  12. Evaluating Corneal Fluorescein Staining Using a Novel Automated Method.

    Science.gov (United States)

    Amparo, Francisco; Wang, Haobing; Yin, Jia; Marmalidou, Anna; Dana, Reza

    2017-05-01

    To evaluate interobserver concordance in measured corneal fluorescein staining (CFS) using the National Eye Institute/Industry (NEI) grading scale and the Corneal Fluorescein Staining Index (CFSi), a computer-assisted, objective, centesimal scoring system. We conducted a study to evaluate CFS in clinical photographs of patients with corneal epitheliopathy. One group of clinicians graded CFS in the images using the NEI while a second group applied the CFSi. We evaluated the level of interobserver agreement and differences among CFS scores with each method, level of correlation between the two methods, and distribution of cases based on the CFS severity assigned by each method. The level of interobserver agreement was 0.65 (P < 0.001) with the NEI, and 0.99 (P < 0.001) with the CFSi. There were statistically significant differences among clinicians' measurements obtained with the NEI (P < 0.001), but not with the CFSi (P = 0.78). There was a statistically significant correlation between the CFS scores obtained with the two methods (R = 0.72; P < 0.001). The NEI scale allocated the majority of cases (65%) within the higher quartile in the scale's severity (12-15/15). In contrast, the CFSi allocated the majority of cases (61%) within the lower quartile in the scale's severity (0-25/100). The CFSi is easy to implement, provides higher interobserver consistency, and due to its continuous score can discriminate smaller differences in CFS. Reproducibility of the computer-based system is higher and, interestingly, the system allocates cases of epitheliopathy in different severity categories than clinicians do. The CFSi can be an alternative for objective CFS evaluation in the clinic and in clinical trials.

  13. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    International Nuclear Information System (INIS)

    Ruebel, Oliver

    2009-01-01

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle ...

  14. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver [Technical Univ. of Darmstadt (Germany)

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework, MATLAB, and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle ...

  15. Automated chemical analysis of internally mixed aerosol particles using X-ray spectromicroscopy at the carbon K-edge.

    Science.gov (United States)

    Moffet, Ryan C; Henn, Tobias; Laskin, Alexander; Gilles, Mary K

    2010-10-01

    We have developed an automated data analysis method for atmospheric particles using scanning transmission X-ray microscopy coupled with near edge X-ray fine structure spectroscopy (STXM/NEXAFS). This method is applied to complex internally mixed submicrometer particles containing organic and inorganic material. Several algorithms were developed to exploit NEXAFS spectral features in the energy range from 278 to 320 eV for quantitative mapping of the spatial distribution of elemental carbon, organic carbon, potassium, and noncarbonaceous elements in particles of mixed composition. This energy range encompasses the carbon K-edge and potassium L2 and L3 edges. STXM/NEXAFS maps of different chemical components were complemented with a subsequent analysis using elemental maps obtained by scanning electron microscopy coupled with energy dispersive X-ray analysis (SEM/EDX). We demonstrate the application of the automated mapping algorithms for data analysis and the statistical classification of particles.

  16. Automated gamma-H2AX focus scoring method for human lymphocytes after ionizing radiation exposure

    Energy Technology Data Exchange (ETDEWEB)

    Valente, M.; Voisin, P.; Laloi, P.; Roy, L. [Institut de Radioprotection et de Surete Nucleaire (IRSN), DRPH, SRBE, LDB, BP 17, 92262 Fontenay-aux-Roses (France); Roch-Lefevre, S., E-mail: Sandrine.roch-lefevre@irsn.fr [Institut de Radioprotection et de Surete Nucleaire (IRSN), DRPH, SRBE, LDB, BP 17, 92262 Fontenay-aux-Roses (France)

    2011-09-15

    The purpose of this study was to develop a microscopy-based foci quantification protocol for human lymphocytes capable of supplying useful data for radiation sensitivity assays. Human peripheral blood was exposed to gamma-rays and isolated lymphocytes were stained with fluorochrome-coupled anti-gamma-H2AX (histone H2AX phosphorylated on serine 139) antibodies. Microscopy slides were automatically acquired and the resulting images were subjected to 3 focus scoring methods: manual, semi-automated and fully automated. All scoring methods were sufficiently sensitive to detect an irradiation of at least 0.05 Gy with low variation between experiments. For higher doses, both automated approaches tend to detect fewer foci than manual scoring while still yielding a linear correlation (lowest r² > 0.971). Compared with manual scoring on the images, the automated approaches are at least 5 times faster with minimum operator intervention needed. We can conclude that our method is able to obtain the foci score of a blood sample in less than 6 h. In addition to the foci score, the programs used perform several cell and foci measurements of potential biological importance.
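
    A minimal sketch of automated focus counting on a single segmented nucleus, using Laplacian-of-Gaussian blob detection from scikit-image; this is a generic stand-in for the scoring programs used in the study, and the sigma and threshold values are illustrative only:

    ```python
    from skimage.feature import blob_log

    def count_foci(nucleus_image, min_sigma=1.0, max_sigma=4.0, threshold=0.1):
        """Count bright gamma-H2AX foci inside one cropped nucleus image.

        Laplacian-of-Gaussian blob detection returns one (row, col, sigma)
        entry per detected spot, so the focus score is the number of rows.
        """
        blobs = blob_log(nucleus_image, min_sigma=min_sigma,
                         max_sigma=max_sigma, threshold=threshold)
        return len(blobs)
    ```

    Per-cell scores across a slide can then be pooled into a dose-response curve, which is the quantity the scoring comparison above evaluates.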

  17. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.

  18. Automated gamma-H2AX focus scoring method for human lymphocytes after ionizing radiation exposure

    International Nuclear Information System (INIS)

    Valente, M.; Voisin, P.; Laloi, P.; Roy, L.; Roch-Lefevre, S.

    2011-01-01

    The purpose of this study was to develop a microscopy-based foci quantification protocol for human lymphocytes capable of supplying useful data for radiation sensitivity assays. Human peripheral blood was exposed to gamma-rays and isolated lymphocytes were stained with fluorochrome-coupled anti-gamma-H2AX (histone H2AX phosphorylated on serine 139) antibodies. Microscopy slides were automatically acquired and the resulting images were subjected to 3 focus scoring methods: manual, semi-automated and fully automated. All scoring methods were sufficiently sensitive to detect an irradiation of at least 0.05 Gy with low variation between experiments. For higher doses, both automated approaches tend to detect fewer foci than manual scoring while still yielding a linear correlation (lowest r² > 0.971). Compared with manual scoring on the images, the automated approaches are at least 5 times faster with minimum operator intervention needed. We can conclude that our method is able to obtain the foci score of a blood sample in less than 6 h. In addition to the foci score, the programs used perform several cell and foci measurements of potential biological importance.

  19. Automated patient and medication payment method for clinical trials

    Directory of Open Access Journals (Sweden)

    Yawn BP

    2013-01-01

    Barbara P Yawn,¹ Suzanne Madison,¹ Susan Bertram,¹ Wilson D Pace,² Anne Fuhlbrigge,³ Elliot Israel,³ Dawn Littlefield,¹ Margary Kurland,¹ Michael E Wechsler⁴ (¹Olmsted Medical Center, Department of Research, Rochester, MN; ²UCDHSC, Department of Family Medicine, University of Colorado Health Science Center, Aurora, CO; ³Brigham and Women's Hospital, Pulmonary and Critical Care Division, Boston, MA; ⁴National Jewish Medical Center, Division of Pulmonology, Denver, CO, USA). Background: Published reports and studies related to patient compensation for clinical trials focus primarily on the ethical issues related to appropriate amounts to reimburse for patients' time and risk burden. Little has been published regarding the method of payment for patient participation. As clinical trials move into widely dispersed community practices and more complex designs, the method of payment also becomes more complex. Here we review the decision process and payment method selected for a primary care-based randomized clinical trial of asthma management in Black Americans. Methods: The method selected is a credit card system designed specifically for clinical trials that allows both fixed and variable real-time payments. We operationalized the study design by providing each patient with two cards, one for reimbursement for study visits and one for payment of medication costs directly to the pharmacies. Results: Of the 1015 patients enrolled, only two refused use of the ClinCard, requesting cash payments for visits, and only rarely did a weekend or fill-in pharmacist refuse to use the card system for payment directly to the pharmacy. Overall, the system has been well accepted by patients and local study teams. The ClinCard administrative system facilitates the fiscal accounting and medication adherence record-keeping by the central teams. Monthly fees are modest, and all 12 study institutional review boards approved use of the system without concern for patient ...

  20. A Method to Automate Identification of Spiral Arms in Galaxies

    Science.gov (United States)

    Lacey, Christina K.; Mercer, K.

    2014-01-01

    We present our preliminary results in identifying the spiral arms of NGC 6946 using a nearest-neighbors analysis. NGC 6946 is a grand design spiral galaxy with well-defined arms. The spiral arms were previously identified in an Hα image and traced out by Matonick, D. et al., ApJS, 113, 333, (1997) by visual inspection. We want to develop a computer algorithm that will identify the spiral arms automatically. Once the spiral arms have been found digitally, we can use this information to compare the spiral arms with the locations of compact objects such as supernova remnants and perform statistical tests, for example, to determine if the supernova remnants are associated with the spiral arms. We are using the publicly available program PyFITS, a development project of the Science Software Branch at the Space Telescope Science Institute (STScI) that is available for software download from STScI, to perform a computer-based image analysis. We have written python macros that interact with the already written image manipulation and display features of PyFITS to perform the image analysis and implement a nearest-neighbors algorithm to identify and link the centers of the high emission regions from the spiral arm regions. Our code currently identifies the centers of the high emission regions, but more work is needed to link up these sites and draw out the spiral arms. Future work includes improving the code to better identify spiral arms and converting the code to work on the Astropy, a community-developed core Python package for Astronomy (Robitaille, T. P., et al. A&A 558, A33, 2013).
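
    A compact sketch of the ingredients described here, assuming a FITS image with a hypothetical filename; astropy.io.fits is the successor to the standalone PyFITS package, and the brightness percentile and linking radius are illustrative values, not the authors':

    ```python
    import numpy as np
    from astropy.io import fits            # modern home of the PyFITS code
    from scipy.spatial import cKDTree

    # Hypothetical filename; the study used an H-alpha image of NGC 6946.
    image = fits.getdata("ngc6946_halpha.fits")

    # Keep the brightest 1% of pixels as candidate emission-region members.
    ys, xs = np.nonzero(image > np.percentile(image, 99))
    points = np.column_stack([xs, ys]).astype(float)

    # Link mutually close bright points; chains of such links trace out
    # candidate spiral-arm segments for later inspection.
    tree = cKDTree(points)
    pairs = tree.query_pairs(r=15.0)       # linking radius in pixels
    print(f"{len(points)} bright pixels, {len(pairs)} neighbor links")
    ```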

  1. Empirical Analysis and Automated Classification of Security Bug Reports

    Science.gov (United States)

    Tyo, Jacob P.

    2016-01-01

    With the ever expanding amount of sensitive data being placed into computer systems, the need for effective cybersecurity is of utmost importance. However, there is a shortage of detailed empirical studies of security vulnerabilities from which cybersecurity metrics and best practices could be determined. This thesis has two main research goals: (1) to explore the distribution and characteristics of security vulnerabilities based on the information provided in bug tracking systems and (2) to develop data analytics approaches for automatic classification of bug reports as security or non-security related. This work is based on using three NASA datasets as case studies. The empirical analysis showed that the majority of software vulnerabilities belong only to a small number of types. Addressing these types of vulnerabilities will consequently lead to cost efficient improvement of software security. Since this analysis requires labeling of each bug report in the bug tracking system, we explored using machine learning to automate the classification of each bug report as a security or non-security related (two-class classification), as well as each security related bug report as specific security type (multiclass classification). In addition to using supervised machine learning algorithms, a novel unsupervised machine learning approach is proposed. An accuracy of 92%, recall of 96%, precision of 92%, probability of false alarm of 4%, F-Score of 81% and G-Score of 90% were the best results achieved during two-class classification. Furthermore, an accuracy of 80%, recall of 80%, precision of 94%, and F-score of 85% were the best results achieved during multiclass classification.
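
    A toy sketch of the two-class classification task described here, using a TF-IDF representation with logistic regression; the report texts and labels below are invented placeholders, and the thesis's own feature sets and algorithms may differ:

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Toy stand-ins for bug report summaries and security labels; the study
    # itself used three NASA bug-tracking datasets that are not public here.
    reports = [
        "buffer overflow when parsing oversized telemetry packet",
        "typo in user manual section 3",
        "privilege escalation via unchecked command string",
        "progress bar renders incorrectly on resize",
    ]
    labels = [1, 0, 1, 0]  # 1 = security related, 0 = not

    model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression())
    model.fit(reports, labels)
    print(model.predict(["stack overflow caused by malformed input"]))
    ```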

  2. A method for fast automated microscope image stitching.

    Science.gov (United States)

    Yang, Fan; Deng, Zhen-Sheng; Fan, Qiu-Hong

    2013-05-01

    Image stitching is an important technology to produce a panorama or larger image by combining several images with overlapped areas. In many biomedical researches, image stitching is highly desirable to acquire a panoramic image which represents large areas of certain structures or whole sections, while retaining microscopic resolution. In this study, we develop a fast normal light microscope image stitching algorithm based on feature extraction. First, an algorithm of scale-space reconstruction of speeded-up robust features (SURF) was proposed to extract features from the images to be stitched in a short time and with high repeatability. Then, the histogram equalization (HE) method was employed to preprocess the images to enhance their contrast for extracting more features. Thirdly, the rough overlapping zones of the preprocessed images were calculated by phase correlation, and the improved SURF was used to extract the image features in the rough overlapping areas. Fourthly, the features were paired by a matching algorithm and the transformation parameters were estimated, then the images were blended seamlessly. Finally, this procedure was applied to stitch normal light microscope images to verify its validity. Our experimental results demonstrate that the improved SURF algorithm is very robust to viewpoint, illumination, blur, rotation and zoom of the images and our method is able to stitch microscope images automatically with high precision and high speed. Also, the method proposed in this paper is applicable to registration and stitching of common images as well as stitching the microscope images in the field of virtual microscope for the purpose of observing, exchanging, saving, and establishing a database of microscope images. Copyright © 2013 Elsevier Ltd. All rights reserved.
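
    The overall flow (contrast enhancement, feature matching, homography estimation, warping and blending) can be sketched with OpenCV. ORB is used below as a freely available stand-in for the paper's improved SURF, so this is an outline of the general approach rather than the published algorithm; inputs are assumed to be 8-bit grayscale tiles:

    ```python
    import cv2
    import numpy as np

    def stitch_pair(img1, img2):
        """Stitch two overlapping 8-bit grayscale microscope tiles."""
        img1 = cv2.equalizeHist(img1)          # histogram equalization step
        img2 = cv2.equalizeHist(img2)

        # ORB keypoints and descriptors (stand-in for the improved SURF).
        orb = cv2.ORB_create(nfeatures=2000)
        k1, d1 = orb.detectAndCompute(img1, None)
        k2, d2 = orb.detectAndCompute(img2, None)

        # Brute-force matching, keeping the strongest correspondences.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(d1, d2), key=lambda m: m.distance)[:200]

        # Robust homography estimation, then warp img1 onto img2's frame.
        src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
        dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
        H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

        h, w = img2.shape
        warped = cv2.warpPerspective(img1, H, (w * 2, h))
        warped[0:h, 0:w] = np.maximum(warped[0:h, 0:w], img2)  # crude blend
        return warped
    ```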

  3. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  4. Interferences in automated phenol red method for determination of bromide in water

    Science.gov (United States)

    Basel, C.L.; Defreese, J.D.; Whittemore, Donald O.

    1982-01-01

    The phenol red method for the determination of bromide in water has been automated by segmented flow analysis. Samples can be analyzed at a rate of 20 samples/h with a method detection limit, defined as the concentration giving a signal about three times the standard deviation of replicate analyte determinations in reagent water, of 10 μg/L. Samples studied include oil-field brines, halite solution brines, groundwaters contaminated with these brines, and fresh groundwaters. Chloride and bicarbonate cause significant positive interferences at levels as low as 100 mg/L and 50 mg/L, respectively. Ammonia gives a negative interference that is important at levels as low as 0.05 mg/L. An ionic strength buffer is used to suppress a positive ionic strength interference, correction curves are used to compensate for the chloride interference, the bicarbonate interference is minimized by acidification, and the ammonia interference is eliminated by its removal by ion exchange. Reaction product studies are used to suggest a plausible mode of chloride interference. © 1982 American Chemical Society.

  5. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    As RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future ...

  6. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables the analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages different from traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU were analyzed to verify the utility of the proposed method.
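
    A toy sketch of the core idea, building a similarity-weighted patent network from TF-IDF keyword vectors with scikit-learn and NetworkX; the patent texts and the edge threshold are invented placeholders, and the paper's AI-based keyword extraction and weighting are more elaborate:

    ```python
    import networkx as nx
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    # Toy abstracts; real input would come from a patent database query.
    patents = {
        "P1": "carbon nanotube field emission backlight unit",
        "P2": "nanotube cathode for flat panel backlight",
        "P3": "liquid crystal display driver circuit",
    }

    # TF-IDF weights the extracted keywords; cosine similarity between the
    # weighted keyword vectors gives the strength of each patent-patent link.
    X = TfidfVectorizer().fit_transform(patents.values())
    sim = cosine_similarity(X)

    G = nx.Graph()
    ids = list(patents)
    for i in range(len(ids)):
        for j in range(i + 1, len(ids)):
            if sim[i, j] > 0.1:        # illustrative linking threshold
                G.add_edge(ids[i], ids[j], weight=round(float(sim[i, j]), 2))
    print(G.edges(data=True))
    ```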

  7. Automated structural imaging analysis detects premanifest Huntington's disease neurodegeneration within 1 year.

    Science.gov (United States)

    Majid, D S Adnan; Stoffers, Diederick; Sheldon, Sarah; Hamza, Samar; Thompson, Wesley K; Goldstein, Jody; Corey-Bloom, Jody; Aron, Adam R

    2011-07-01

    Intense efforts are underway to evaluate neuroimaging measures as biomarkers for neurodegeneration in premanifest Huntington's disease (preHD). We used a completely automated longitudinal analysis method to compare structural scans in preHD individuals and controls. Using a 1-year longitudinal design, we analyzed T1-weighted structural scans in 35 preHD individuals and 22 age-matched controls. We used the SIENA (Structural Image Evaluation, using Normalization, of Atrophy) software tool to yield overall percentage brain volume change (PBVC) and voxel-level changes in atrophy. We calculated sample sizes for a hypothetical disease-modifying (neuroprotection) study. We found significantly greater yearly atrophy in preHD individuals versus controls (mean PBVC controls, -0.149%; preHD, -0.388%; P = .031, Cohen's d = .617). For a preHD subgroup closest to disease onset, yearly atrophy was more than 3 times that of controls (mean PBVC close-to-onset preHD, -0.510%; P = .019, Cohen's d = .920). This atrophy was evident at the voxel level in periventricular regions, consistent with well-established preHD basal ganglia atrophy. We estimated that a neuroprotection study using SIENA would only need 74 close-to-onset individuals in each arm (treatment vs placebo) to detect a 50% slowing in yearly atrophy with 80% power. Automated whole-brain analysis of structural MRI can reliably detect preHD disease progression in 1 year. These results were attained with a readily available imaging analysis tool, SIENA, which is observer independent, automated, and robust with respect to image quality, slice thickness, and different pulse sequences. This MRI biomarker approach could be used to evaluate neuroprotection in preHD. Copyright © 2011 Movement Disorder Society.
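
    The quoted sample size can be reproduced from the reported effect size with a standard two-sample power calculation. A sketch using SciPy, assuming a two-sided test at alpha = 0.05, 80% power, and the normal approximation:

    ```python
    from scipy.stats import norm

    d_disease = 0.920              # Cohen's d, close-to-onset preHD vs controls
    d_treatment = 0.5 * d_disease  # a 50% slowing halves the group difference

    alpha, power = 0.05, 0.80
    z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
    n_per_arm = 2 * (z / d_treatment) ** 2
    print(round(n_per_arm))        # -> 74, matching the figure quoted above
    ```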

  8. Semi-automated Biopanning of Bacterial Display Libraries for Peptide Affinity Reagent Discovery and Analysis of Resulting Isolates.

    Science.gov (United States)

    Sarkes, Deborah A; Jahnke, Justin P; Stratis-Cullum, Dimitra N

    2017-12-06

    Biopanning bacterial display libraries is a proven technique for peptide affinity reagent discovery for recognition of both biotic and abiotic targets. Peptide affinity reagents can be used for similar applications to antibodies, including sensing and therapeutics, but are more robust and able to perform in more extreme environments. Specific enrichment of peptide capture agents to a protein target of interest is enhanced using semi-automated sorting methods which improve binding and wash steps and therefore decrease the occurrence of false positive binders. A semi-automated sorting method is described herein for use with a commercial automated magnetic-activated cell sorting device with an unconstrained bacterial display sorting library expressing random 15-mer peptides. With slight modifications, these methods are extendable to other automated devices, other sorting libraries, and other organisms. A primary goal of this work is to provide a comprehensive methodology and expound the thought process applied in analyzing and minimizing the resulting pool of candidates. These techniques include analysis of on-cell binding using fluorescence-activated cell sorting (FACS), to assess affinity and specificity during sorting and in comparing individual candidates, and the analysis of peptide sequences to identify trends and consensus sequences for understanding and potentially improving the affinity to and specificity for the target of interest.

  9. Automated segmentation of chronic stroke lesions using LINDA: Lesion identification with neighborhood data analysis.

    Science.gov (United States)

    Pustina, Dorian; Coslett, H Branch; Turkeltaub, Peter E; Tustison, Nicholas; Schwartz, Myrna F; Avants, Brian

    2016-04-01

    The gold standard for identifying stroke lesions is manual tracing, a method that is known to be observer dependent and time consuming, thus impractical for big data studies. We propose LINDA (Lesion Identification with Neighborhood Data Analysis), an automated segmentation algorithm capable of learning the relationship between existing manual segmentations and a single T1-weighted MRI. A dataset of 60 left hemispheric chronic stroke patients is used to build the method and test it with k-fold and leave-one-out procedures. With respect to manual tracings, predicted lesion maps showed a mean dice overlap of 0.696 ± 0.16, Hausdorff distance of 17.9 ± 9.8 mm, and average displacement of 2.54 ± 1.38 mm. The manual and predicted lesion volumes correlated at r = 0.961. An additional dataset of 45 patients was utilized to test LINDA with independent data, achieving high accuracy rates and confirming its cross-institutional applicability. To investigate the cost of moving from manual tracings to automated segmentation, we performed comparative lesion-to-symptom mapping (LSM) on five behavioral scores. Predicted and manual lesions produced similar neuro-cognitive maps, albeit with some discussed discrepancies. Of note, region-wise LSM was more robust to the prediction error than voxel-wise LSM. Our results show that, while several limitations exist, our current results compete with or exceed the state-of-the-art, producing consistent predictions, very low failure rates, and transferable knowledge between labs. This work also establishes a new viewpoint on evaluating automated methods not only with segmentation accuracy but also with brain-behavior relationships. LINDA is made available online with trained models from over 100 patients. © 2016 Wiley Periodicals, Inc.
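
    For reference, the Dice coefficient quoted above measures the overlap between two binary masks; a minimal sketch with NumPy:

    ```python
    import numpy as np

    def dice_overlap(pred, truth):
        """Dice coefficient between two binary lesion masks (1 = lesion)."""
        pred, truth = pred.astype(bool), truth.astype(bool)
        inter = np.logical_and(pred, truth).sum()
        return 2.0 * inter / (pred.sum() + truth.sum())

    # Identical masks give 1.0; disjoint masks give 0.0.
    a = np.array([[0, 1], [1, 1]])
    print(dice_overlap(a, a))  # -> 1.0
    ```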

  10. Automated Analysis of Clinical Flow Cytometry Data: A Chronic Lymphocytic Leukemia Illustration.

    Science.gov (United States)

    Scheuermann, Richard H; Bui, Jack; Wang, Huan-You; Qian, Yu

    2017-12-01

    Flow cytometry is used in cell-based diagnostic evaluation for blood-borne malignancies including leukemia and lymphoma. The current practice for cytometry data analysis relies on manual gating to identify cell subsets in complex mixtures, which is subjective, labor-intensive, and poorly reproducible. This article reviews recent efforts to develop, validate, and disseminate automated computational methods and pipelines for cytometry data analysis that could help overcome the limitations of manual analysis and provide for efficient and data-driven diagnostic applications. It demonstrates the performance of an optimized computational pipeline in a pilot study of chronic lymphocytic leukemia data from the authors' clinical diagnostic laboratory. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Automated methods for thorium determination in liquids, solids and aerosols

    International Nuclear Information System (INIS)

    Robertson, R.; Stuart, J.E.

    1984-01-01

    Methodology for determining trace thorium levels in a variety of sample types for compliance purposes was developed. Thorium in filtered water samples is concentrated by ferric hydroxide co-precipitation. Aerosols on glass-fibre, cellulose ester or teflon filters are acid digested and thorium is concentrated by lanthanum fluoride co-precipitation. Chemical separation and measurement are then done on a Technicon AAII-C auto-analyzer via TTA-solvent extraction and colorimetry using the thorium-arsenazo III colour complex. Solid samples are acid digested and thorium is concentrated and separated using lanthanum fluoride co-precipitation followed by anion-exchange chromatography. Measurement is then carried out on the autoanalyzer by direct development of the thorium-arsenazo III colour complex. Chemical yields are determined through the addition of thorium-234 tracer with assay by gamma-ray spectrometry. The sensitivities of the methods for liquids, aerosols and solids are approximately 1 μg/L, 0.5 μg, and 0.5 μg/g, respectively. At thorium levels about ten times the detection limits, accuracy and reproducibility are typically ±10 percent for liquids and aerosols and ±15 percent for solid samples.

  12. Automated three-dimensional analysis of particle measurements using an optical profilometer and image analysis software.

    Science.gov (United States)

    Bullman, V

    2003-07-01

    The automated collection of topographic images from an optical profilometer coupled with existing image analysis software offers the unique ability to quantify three-dimensional particle morphology. Optional software available with most optical profilers permits automated collection of adjacent topographic images of particles dispersed onto a suitable substrate. Particles are recognized in the image as a set of continuous pixels with grey-level values above the grey level assigned to the substrate, whereas particle height or thickness is represented in the numerical differences between these grey levels. These images are loaded into remote image analysis software where macros automate image processing, and then distinguish particles for feature analysis, including standard two-dimensional measurements (e.g. projected area, length, width, aspect ratios) and third-dimensional measurements (e.g. maximum height, mean height). Feature measurements from each calibrated image are automatically added to cumulative databases and exported to a commercial spreadsheet or statistical program for further data processing and presentation. An example is given that demonstrates the superiority of quantitative three-dimensional measurements by optical profilometry and image analysis in comparison with conventional two-dimensional measurements for the characterization of pharmaceutical powders with plate-like particles.
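
    A minimal sketch of the feature-analysis step, assuming a calibrated 2D height map in which grey level encodes height; it uses scikit-image (0.19+ property names), and thresholding at the substrate level is a simplification of the grey-level assignment described above:

    ```python
    from skimage import measure

    def particle_morphology(height_map, substrate_level=0.0):
        """Per-particle 2D and height statistics from a topographic image.

        height_map: 2D array of calibrated surface heights; pixels above
        substrate_level are treated as particle material.
        """
        labels = measure.label(height_map > substrate_level)
        stats = []
        for region in measure.regionprops(labels, intensity_image=height_map):
            stats.append({
                "projected_area": region.area,          # pixels
                "length": region.axis_major_length,     # pixels
                "width": region.axis_minor_length,      # pixels
                "max_height": region.intensity_max,     # height units
                "mean_height": region.intensity_mean,   # height units
            })
        return stats
    ```

    Each dictionary row corresponds to one detected particle and can be appended to a cumulative database, mirroring the workflow in the abstract.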

  13. Automated Clean Chemistry for Bulk Analysis of Environmental Swipe Samples - FY17 Year End Report

    Energy Technology Data Exchange (ETDEWEB)

    Ticknor, Brian W. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Metzger, Shalina C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); McBay, Eddy H. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hexel, Cole R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Tevepaugh, Kayron N. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bostick, Debra A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2017-11-30

    Sample preparation methods for mass spectrometry are being automated using commercial-off-the-shelf (COTS) equipment to shorten lengthy and costly manual chemical purification procedures. This development addresses a serious need in the International Atomic Energy Agency’s Network of Analytical Laboratories (IAEA NWAL) to increase efficiency in the Bulk Analysis of Environmental Samples for Safeguards program with a method that allows unattended, overnight operation. In collaboration with Elemental Scientific Inc., the prepFAST-MC2 was designed based on COTS equipment. It was modified for uranium/plutonium separations using renewable columns packed with Eichrom TEVA and UTEVA resins, with a chemical separation method based on the Oak Ridge National Laboratory (ORNL) NWAL chemical procedure. The newly designed prepFAST-SR has had several upgrades compared with the original prepFAST-MC2. Both systems are currently installed in the Ultra-Trace Forensics Science Center at ORNL.

  14. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This developed...

  15. Automated Design and Analysis Tool for CEV Structural and TPS Components, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  16. Automated small‐scale protein purification and analysis for accelerated development of protein therapeutics

    Science.gov (United States)

    LeSaout, Xavier; Costioli, Matteo; Jordan, Lynn; Lambert, Jeremy; Beighley, Ross; Provencher, Laurel; McGuire, Kevin; Verlinden, Nico; Barry, Andrew

    2015-01-01

    Small‐scale protein purification presents opportunities for accelerated process development of biotherapeutic molecules. Miniaturization of purification conditions reduces time and allows for parallel processing of samples, thus offering increased statistical significance and greater breadth of variables. The ability of the miniaturized platform to be predictive of larger scale purification schemes is of critical importance. The PerkinElmer JANUS BioTx Pro and Pro‐Plus workstations were developed as intuitive, flexible, and automated devices capable of performing parallel small‐scale analytical protein purification. Preprogrammed methods automate a variety of commercially available ion exchange and affinity chromatography solutions, including miniaturized chromatography columns, resin‐packed pipette tips, and resin‐filled microtiter vacuum filtration plates. Here, we present a comparison of microscale chromatography versus standard fast protein LC (FPLC) methods for process optimization. In this study, we evaluated the capabilities of the JANUS BioTx Pro‐Plus robotic platform for miniaturized chromatographic purification of proteins with the GE ÄKTA Express system. We were able to demonstrate predictive analysis similar to that of larger scale purification platforms, while offering advantages in speed and number of samples processed. This approach is predictive of scale‐up conditions, resulting in shorter biotherapeutic development cycles and less consumed material than traditional FPLC methods, thus reducing time‐to‐market from discovery to manufacturing. PMID:27774045

  17. Spectroscopic Methods of Steroid Analysis

    Science.gov (United States)

    Kasal, Alexander; Budesinsky, Milos; Griffiths, William J.

    Modern chemical laboratories contain equipment capable of measuring many of the physical properties of single chemical compounds and mixtures of compounds, particularly their spectral properties, which can, if interpreted correctly, provide valuable information about both structure (of single compounds) and composition (of mixtures). Over the past 50 years, the authors have witnessed enormous progress in the technical capabilities of this equipment. Automation and speed of analysis have greatly improved the ease of use and the versatility of the technology.

  18. Method of Modeling Questions for Automated Grading of Students’ Responses in E-Learning Systems

    Directory of Open Access Journals (Sweden)

    A. A. Gurchenkov

    2015-01-01

    scoring of algorithmic questions. The module supports both a standalone mode using the test pages and a mode of integration with any LMS. Special attention is given to the fault tolerance of the module, achieved by continuous preservation of the state of algorithmic questions in the database. The developed module architecture enables the integration of any types of input and scoring algorithms into an LMS. An advantage of the suggested approach is the capability to generate a number of question options, which is necessary to form the homework or tests to be given to groups of students. Application. The results can be used to extend the functionality of e-learning systems for the inclusion of complex question types that support automated scoring of student answers. Further studies. Further research based on this method may include the integration of new student-answer input methods and their scoring algorithms, as well as the creation of other algorithms for the system's feedback loop implemented in the answer-scoring function. For example, one of the urgent problems is the generation of hints based on the analysis of students' wrong responses.

  19. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method, and it performed best when applied to a hydrology-based relief model derived from a multiple-direction flow-routing algorithm. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
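
    The first half of the normalized closed contour method can be sketched directly: compute a local relief model, normalize it, and keep closed contours as LSB candidates. The sketch below uses a simple moving-window mean rather than the paper's hydrology-based models, and the window size and contour level are illustrative:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter
    from skimage import measure

    def candidate_lsb_contours(dem, window=101, level=0.5):
        """Closed-contour LSB candidates from a normalized local relief model.

        dem: 2D elevation array. A moving-window mean stands in for the
        regional surface; subtracting it leaves local relief, which is then
        normalized so one contour level can be applied across the scene.
        """
        dem = dem.astype(float)
        relief = dem - uniform_filter(dem, size=window)
        norm = (relief - relief.min()) / (np.ptp(relief) + 1e-12)

        # Contours that close on themselves outline candidate bedforms; the
        # paper then filters candidates with a morphometric ruleset.
        contours = measure.find_contours(norm, level)
        return [c for c in contours if np.allclose(c[0], c[-1])]
    ```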

  20. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    Science.gov (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given adhesively bonded joint finite element model, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.
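
    A minimal sketch of the post-processing these tools perform, with invented spring forces and tributary areas standing in for real finite element output:

    ```python
    import numpy as np

    # Hypothetical spring results extracted from a finite element output file:
    # element id, shear force (N), peel force (N), and tributary area (mm^2).
    spring_id = np.array([101, 102, 103, 104])
    shear_force = np.array([12.0, 35.5, 8.2, 21.7])
    peel_force = np.array([3.1, 9.8, 1.4, 6.6])
    area = np.array([4.0, 4.0, 4.0, 4.0])

    # Convert spring forces to adhesive stresses and sort in descending
    # order, mirroring the post-processing described in the abstract.
    shear_stress = shear_force / area      # MPa
    peel_stress = peel_force / area        # MPa
    for i in np.argsort(shear_stress)[::-1]:
        print(f"element {spring_id[i]}: shear {shear_stress[i]:.2f} MPa, "
              f"peel {peel_stress[i]:.2f} MPa")
    ```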

  1. Orbit transfer rocket engine technology program: Automated preflight methods concept definition

    Science.gov (United States)

    Erickson, C. M.; Hertzberg, D. W.

    1991-01-01

    The possibility of automating preflight engine checkouts on orbit transfer engines is discussed. The minimum requirements in terms of information and processing necessary to assess the engine's integrity and readiness to perform its mission were first defined. A variety of ways for remotely obtaining that information were generated. The sophistication of these approaches varied from a simple preliminary power-up, where the engine is fired up for the first time, to the most advanced approach, where the sensor and operational history data system alone indicates engine integrity. The critical issues and benefits of these methods were identified, outlined, and prioritized. The technology readiness of each of these automated preflight methods was then rated on a NASA Office of Exploration scale used for comparing technology options for future mission choices. Finally, estimates were made of the remaining cost to advance the technology for each method to a level where the system validation models have been demonstrated in a simulated environment.

  2. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  3. Automated Dissolution for Enteric-Coated Aspirin Tablets: A Case Study for Method Transfer to a RoboDis II.

    Science.gov (United States)

    Ibrahim, Sarah A; Martini, Luigi

    2014-08-01

    Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increased trend for automation in dissolution testing, particularly for large pharmaceutical companies to reduce variability and increase personnel efficiency. There is no official guideline for dissolution testing method transfer from a manual, semi-automated, to automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of dissolution method transfer from a manual dissolution tester. This current study provides a systematic outline for the transfer of the manual dissolution testing protocol to an automated dissolution tester. This study further supports that automated dissolution testers compliant with regulatory requirements and similar to manual dissolution testers facilitate method transfer. © 2014 Society for Laboratory Automation and Screening.

  4. Automated analysis of heterogeneous carbon nanostructures by high-resolution electron microscopy and on-line image processing

    Energy Technology Data Exchange (ETDEWEB)

    Toth, P., E-mail: toth.pal@uni-miskolc.hu [Department of Chemical Engineering, University of Utah, 50 S. Central Campus Drive, Salt Lake City, UT 84112-9203 (United States); Farrer, J.K. [Department of Physics and Astronomy, Brigham Young University, N283 ESC, Provo, UT 84602 (United States); Palotas, A.B. [Department of Combustion Technology and Thermal Energy, University of Miskolc, H3515, Miskolc-Egyetemvaros (Hungary); Lighty, J.S.; Eddings, E.G. [Department of Chemical Engineering, University of Utah, 50 S. Central Campus Drive, Salt Lake City, UT 84112-9203 (United States)

    2013-06-15

    High-resolution electron microscopy is an efficient tool for characterizing heterogeneous nanostructures; however, currently the analysis is a laborious and time-consuming manual process. In order to be able to accurately and robustly quantify heterostructures, one must obtain a statistically high number of micrographs showing images of the appropriate sub-structures. The second step of analysis is usually the application of digital image processing techniques in order to extract meaningful structural descriptors from the acquired images. In this paper it will be shown that by applying on-line image processing and basic machine vision algorithms, it is possible to fully automate the image acquisition step; therefore, the number of acquired images in a given time can be increased drastically without the need for additional human labor. The proposed automation technique works by computing fields of structural descriptors in situ and thus outputs sets of the desired structural descriptors in real-time. The merits of the method are demonstrated by using combustion-generated black carbon samples. - Highlights: ► The HRTEM analysis of heterogeneous nanostructures is a tedious manual process. ► Automatic HRTEM image acquisition and analysis can improve data quantity and quality. ► We propose a method based on on-line image analysis for the automation of HRTEM image acquisition. ► The proposed method is demonstrated using HRTEM images of soot particles.

  5. Comparison of Automated Continuous Flow Method With Shake-Flask Method in Determining Partition Coefficients of Bidentate Hydroxypyridinone Ligands

    Directory of Open Access Journals (Sweden)

    Lotfollah Saghaie

    2003-08-01

    Full Text Available The partition coefficients (Kpart) in the octanol/water system of a range of bidentate ligands containing the 3-hydroxypyridin-4-one moiety were determined using the shake-flask and automated continuous flow (filter probe) methods. The shake-flask method was used for extremely hydrophilic or hydrophobic compounds with Kpart values greater than 100 or less than 0.01. For the other ligands, which possess moderate lipophilicity (Kpart values between 0.01 and 100), the filter probe method was used. The partition coefficients of four ligands with moderate lipophilicity were also determined by the shake-flask method in order to check the comparability of the two methods. While the shake-flask method was able to handle extremely hydrophilic and hydrophobic compounds efficiently, the filter probe method was unable to measure such Kpart values. Although determination of the Kpart values of all compounds is possible with the classical shake-flask method, the procedure is time consuming. In contrast, the filter probe method offers many advantages over the traditional shake-flask method in terms of speed, efficiency of separation and degree of automation. The shake-flask method remains the method of choice for determination of partition coefficients of extremely hydrophilic and hydrophobic ligands.

  6. Optical Coherence Tomography in the UK Biobank Study - Rapid Automated Analysis of Retinal Thickness for Large Population-Based Studies.

    Directory of Open Access Journals (Sweden)

    Pearse A Keane

    Full Text Available To describe an approach to the use of optical coherence tomography (OCT) imaging in large, population-based studies, including methods for OCT image acquisition, storage, and the remote, rapid, automated analysis of retinal thickness. In UK Biobank, OCT images were acquired between 2009 and 2010 using a commercially available "spectral domain" OCT device (3D OCT-1000, Topcon). Images were obtained using a raster scan protocol, 6 mm x 6 mm in area, and consisting of 128 B-scans. OCT image sets were stored on UK Biobank servers in a central repository, adjacent to high performance computers. Rapid, automated analysis of retinal thickness was performed using custom image segmentation software developed by the Topcon Advanced Biomedical Imaging Laboratory (TABIL). This software employs dual-scale gradient information to allow for automated segmentation of nine intraretinal boundaries in a rapid fashion. 67,321 participants (134,642 eyes) in UK Biobank underwent OCT imaging of both eyes as part of the ocular module. 134,611 images were successfully processed, with 31 images failing segmentation analysis due to corrupted OCT files or withdrawal of subject consent for UKBB study participation. The average time taken to call up an image from the database and complete segmentation analysis was approximately 120 seconds per data set per login, and analysis of the entire dataset was completed in approximately 28 days. We report an approach to the rapid, automated measurement of retinal thickness from nearly 140,000 OCT image sets from the UK Biobank. In the near future, these measurements will be publicly available for utilization by researchers around the world, and thus for correlation with the wealth of other data collected in UK Biobank. The automated analysis approaches we describe may be of utility for future large population-based epidemiological studies, clinical trials, and screening programs that employ OCT imaging.

  7. Optical Coherence Tomography in the UK Biobank Study - Rapid Automated Analysis of Retinal Thickness for Large Population-Based Studies.

    Science.gov (United States)

    Keane, Pearse A; Grossi, Carlota M; Foster, Paul J; Yang, Qi; Reisman, Charles A; Chan, Kinpui; Peto, Tunde; Thomas, Dhanes; Patel, Praveen J

    2016-01-01

    To describe an approach to the use of optical coherence tomography (OCT) imaging in large, population-based studies, including methods for OCT image acquisition, storage, and the remote, rapid, automated analysis of retinal thickness. In UK Biobank, OCT images were acquired between 2009 and 2010 using a commercially available "spectral domain" OCT device (3D OCT-1000, Topcon). Images were obtained using a raster scan protocol, 6 mm x 6 mm in area, and consisting of 128 B-scans. OCT image sets were stored on UK Biobank servers in a central repository, adjacent to high performance computers. Rapid, automated analysis of retinal thickness was performed using custom image segmentation software developed by the Topcon Advanced Biomedical Imaging Laboratory (TABIL). This software employs dual-scale gradient information to allow for automated segmentation of nine intraretinal boundaries in a rapid fashion. 67,321 participants (134,642 eyes) in UK Biobank underwent OCT imaging of both eyes as part of the ocular module. 134,611 images were successfully processed, with 31 images failing segmentation analysis due to corrupted OCT files or withdrawal of subject consent for UKBB study participation. The average time taken to call up an image from the database and complete segmentation analysis was approximately 120 seconds per data set per login, and analysis of the entire dataset was completed in approximately 28 days. We report an approach to the rapid, automated measurement of retinal thickness from nearly 140,000 OCT image sets from the UK Biobank. In the near future, these measurements will be publicly available for utilization by researchers around the world, and thus for correlation with the wealth of other data collected in UK Biobank. The automated analysis approaches we describe may be of utility for future large population-based epidemiological studies, clinical trials, and screening programs that employ OCT imaging.

  8. CEST ANALYSIS: AUTOMATED CHANGE DETECTION FROM VERY-HIGH-RESOLUTION REMOTE SENSING IMAGES

    Directory of Open Access Journals (Sweden)

    M. Ehlers

    2012-08-01

    Full Text Available A fast detection, visualization and assessment of change in areas of crisis or catastrophes are important requirements for the coordination and planning of help. Through the availability of new satellite and/or airborne sensors with very high spatial resolutions (e.g., WorldView, GeoEye), new remote sensing data are available for a better detection, delineation and visualization of change. For automated change detection, a large number of algorithms has been proposed and developed. From previous studies, however, it is evident that to date no single algorithm has the potential for being a reliable change detector for all possible scenarios. This paper introduces the Combined Edge Segment Texture (CEST) analysis, a decision-tree based cooperative suite of algorithms for automated change detection that is especially designed for the new generation of satellites with very high spatial resolution. The method incorporates frequency-based filtering, texture analysis, and image segmentation techniques. For the frequency analysis, different band-pass filters can be applied to identify the relevant frequency information for change detection. After transforming the multitemporal images via a fast Fourier transform (FFT) and applying the most suitable band-pass filter, different methods are available to extract changed structures: differencing and correlation in the frequency domain and correlation and edge detection in the spatial domain. Best results are obtained using edge extraction. For the texture analysis, different 'Haralick' parameters can be calculated (e.g., energy, correlation, contrast, inverse distance moment), with 'energy' so far providing the most accurate results. These algorithms are combined with a prior segmentation of the image data as well as with morphological operations for a final binary change result. A rule-based combination (CEST) of the change algorithms is applied to calculate the probability of change for a particular location.
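
    As an illustration only (not the authors' implementation), the band-pass filtering and differencing step described above could be sketched as follows; the cut-off frequencies and the k-sigma decision rule are placeholder assumptions.

        import numpy as np

        def bandpass_fft(image, low_cut, high_cut):
            # Keep spatial frequencies between low_cut and high_cut (cycles/pixel)
            F = np.fft.fftshift(np.fft.fft2(image))
            h, w = image.shape
            yy, xx = np.mgrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
            radius = np.hypot(yy / h, xx / w)  # normalized radial frequency
            F[(radius < low_cut) | (radius > high_cut)] = 0.0
            return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

        def change_map(img_t1, img_t2, low_cut=0.02, high_cut=0.25, k=2.0):
            # Difference the filtered images and flag pixels beyond k sigma
            diff = np.abs(bandpass_fft(img_t1, low_cut, high_cut)
                          - bandpass_fft(img_t2, low_cut, high_cut))
            return diff > diff.mean() + k * diff.std()

    In the full CEST suite such a map would be only one input, fused with texture and segmentation evidence through the rule-based decision tree.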

  9. Automated method for measuring the extent of selective logging damage with airborne LiDAR data

    Science.gov (United States)

    Melendy, L.; Hagen, S. C.; Sullivan, F. B.; Pearson, T. R. H.; Walker, S. M.; Ellis, P.; Kustiyo; Sambodo, Ari Katmoko; Roswintiarti, O.; Hanson, M. A.; Klassen, A. W.; Palace, M. W.; Braswell, B. H.; Delgado, G. M.

    2018-05-01

    Selective logging has an impact on the global carbon cycle, as well as on the forest micro-climate and longer-term changes in erosion, soil and nutrient cycling, and fire susceptibility. Our ability to quantify these impacts depends on methods and tools that accurately identify the extent and features of logging activity. LiDAR-based measurement of these features offers significant promise. Here, we present a set of algorithms for the automated detection and mapping of critical features associated with logging - roads/decks, skid trails, and gaps - using commercial airborne LiDAR data as input. The automated algorithm was applied to commercial LiDAR data collected over two logging concessions in Kalimantan, Indonesia in 2014. The algorithm results were compared to measurements of the logging features collected in the field soon after logging was complete. The algorithm-mapped road/deck and skid trail features match closely with features measured in the field, with agreement levels ranging from 69% to 99% when adjusting for GPS location error. The algorithm performed most poorly with gaps, which by their nature are variable due to the unpredictable impact of tree fall, in contrast to the linear and regular features created directly by mechanical means. Overall, the automated algorithm performs well and offers significant promise as a generalizable tool to efficiently and accurately capture the effects of selective logging, including the potential to distinguish reduced-impact logging from conventional logging.
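
    As a hedged sketch of one sub-task, gap detection from a gridded canopy height model could look like the following; the height and minimum-area thresholds are invented for illustration, and the published algorithms for roads/decks and skid trails are necessarily more involved.

        import numpy as np
        from scipy import ndimage

        def detect_gaps(chm, height_thresh=5.0, min_area_px=25):
            # Flag contiguous regions where canopy height (m) drops below
            # height_thresh, discarding specks smaller than min_area_px
            low = chm < height_thresh
            labels, n = ndimage.label(low)
            sizes = ndimage.sum(low, labels, index=np.arange(1, n + 1))
            keep_ids = np.nonzero(sizes >= min_area_px)[0] + 1
            return np.isin(labels, keep_ids)

    Linear features such as skid trails would additionally require an elongation or directionality test, which is omitted here.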

  10. The CAIRN method: automated, reproducible calculation of catchment-averaged denudation rates from cosmogenic nuclide concentrations

    Science.gov (United States)

    Mudd, Simon Marius; Harel, Marie-Alice; Hurst, Martin D.; Grieve, Stuart W. D.; Marrero, Shasta M.

    2016-08-01

    We report a new program for calculating catchment-averaged denudation rates from cosmogenic nuclide concentrations. The method (Catchment-Averaged denudatIon Rates from cosmogenic Nuclides: CAIRN) bundles previously reported production scaling and topographic shielding algorithms and calculates production and shielding on a pixel-by-pixel basis. We explore the effect of sampling frequency across both azimuth (Δθ) and altitude (Δϕ) angles for topographic shielding and show that in high-relief terrain a relatively high sampling frequency is required, with a good balance achieved between accuracy and computational expense at Δθ = 8° and Δϕ = 5°. CAIRN includes both internal and external uncertainty analysis, and is packaged in freely available software in order to facilitate easily reproducible denudation rate estimates. CAIRN not only calculates denudation rates but also automates catchment averaging of shielding and production, and thus can be used to provide reproducible input parameters for the CRONUS family of online calculators.
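
    As a hedged sketch of the per-pixel shielding computation, using the common sin^(m+1) horizon approximation with m = 2.3 (an assumption; CAIRN's exact formulation is not reproduced here):

        import numpy as np

        def shielding_factor(horizon_deg, dtheta_deg=8.0, m=2.3):
            # Horizon elevation angles (degrees) sampled every dtheta_deg of
            # azimuth; dtheta_deg = 8 echoes the sampling the paper recommends
            phi = np.radians(np.asarray(horizon_deg, dtype=float))
            dtheta = np.radians(dtheta_deg)
            return 1.0 - dtheta / (2.0 * np.pi) * np.sum(np.sin(phi) ** (m + 1.0))

        print(shielding_factor(np.zeros(45)))       # flat horizon: S = 1.0
        print(shielding_factor(np.full(45, 20.0)))  # uniform 20 deg horizon: ~0.97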

  11. Instrumental neutron activation analysis - a routine method

    International Nuclear Information System (INIS)

    Bruin, M. de.

    1983-01-01

    This thesis describes the way in which instrumental neutron activation analysis (INAA) has been developed at IRI into an automated system for routine analysis. The basis of this work is 20 publications describing the development of INAA since 1968. (Auth.)

  12. Automated analysis of intima-media thickness: analysis and performance of CARES 3.0.

    Science.gov (United States)

    Saba, Luca; Montisci, Roberto; Famiglietti, Luca; Tallapally, Niranjan; Acharya, U Rajendra; Molinari, Filippo; Sanfilippo, Roberto; Mallarini, Giorgio; Nicolaides, Andrew; Suri, Jasjit S

    2013-07-01

    In recent years, the use of computer-based techniques has been advocated to improve intima-media thickness (IMT) quantification and its reproducibility. The purpose of this study was to test the diagnostic performance of a new automated IMT algorithm, CARES 3.0, which belongs to a patented class of IMT measurement systems called AtheroEdge (AtheroPoint, LLC, Roseville, CA). From 2 different institutions, we analyzed the carotid arteries of 250 patients. The automated CARES 3.0 algorithm was tested against 2 other automated algorithms, 1 semiautomated algorithm, and a reader reference to assess the IMT measurements. Bland-Altman analysis, regression analysis, and the Student t test were performed. CARES 3.0 showed an IMT measurement bias ± SD of -0.022 ± 0.288 mm compared with the expert reader. The average IMT by CARES 3.0 was 0.852 ± 0.248 mm, and that of the reader was 0.872 ± 0.325 mm. In the Bland-Altman plots, the CARES 3.0 IMT measurements showed accurate values, with about 80% of the images having an IMT measurement bias ranging between -50% and +50%. These values were better than those of the previous CARES releases and the semiautomated algorithm. Regression analysis showed that, among all techniques, the best t value was between CARES 3.0 and the reader. We have developed an improved, fully automated technique for carotid IMT measurement on longitudinal ultrasound images. This new version, called CARES 3.0, consists of a new heuristic for lumen-intima and media-adventitia detection, which showed high accuracy and reproducibility for IMT measurement.
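
    The Bland-Altman comparison reported above is simple to reproduce; a minimal sketch with made-up readings (not the study's data):

        import numpy as np

        def bland_altman(method_a, method_b):
            # Bias (mean difference) and 95% limits of agreement
            diff = np.asarray(method_a, float) - np.asarray(method_b, float)
            bias = diff.mean()
            spread = 1.96 * diff.std(ddof=1)
            return bias, (bias - spread, bias + spread)

        imt_algorithm = [0.81, 0.92, 0.77, 1.05, 0.88]  # mm, illustrative only
        imt_reader    = [0.84, 0.90, 0.80, 1.10, 0.86]
        print(bland_altman(imt_algorithm, imt_reader))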

  13. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool that is developed to analyse the structural model of automated systems in order to identify redundant information that is then utilized for fault detection and isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  14. Prajna: adding automated reasoning to the visual- analysis process.

    Science.gov (United States)

    Swing, E

    2010-01-01

    Developers who create applications for knowledge representation must contend with challenges in both the abundance of data and the variety of toolkits, architectures, and standards for representing it. Prajna is a flexible Java toolkit designed to overcome these challenges with an extensible architecture that supports both visualization and automated reasoning.

  15. A semi-automated method for bone age assessment using cervical vertebral maturation.

    Science.gov (United States)

    Baptista, Roberto S; Quaglio, Camila L; Mourad, Laila M E H; Hummel, Anderson D; Caetano, Cesar Augusto C; Ortolani, Cristina Lúcia F; Pisa, Ivan T

    2012-07-01

    To propose a semi-automated method for pattern classification to predict individuals' stage of growth based on morphologic characteristics that are described in the modified cervical vertebral maturation (CVM) method of Baccetti et al. A total of 188 lateral cephalograms were collected, digitized, evaluated manually, and grouped into cervical stages by two expert examiners. Landmarks were located on each image and measured. Three pattern classifiers based on the Naïve Bayes algorithm were built and assessed using a software program. The classifier with the greatest accuracy according to the weighted kappa test was considered best. The best classifier showed a weighted kappa coefficient of 0.861 ± 0.020. If an adjacent estimated pre-stage or post-stage value was taken to be acceptable, the classifier would show a weighted kappa coefficient of 0.992 ± 0.019. Results from this study show that the proposed semi-automated pattern classification method can help orthodontists identify the stage of CVM. However, additional studies are needed before this semi-automated classification method for CVM assessment can be implemented in clinical practice.
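
    The weighted kappa agreement statistic used above can be computed with standard tools; in this sketch the stage labels are invented, and whether the study used linear or quadratic weights is not stated here:

        from sklearn.metrics import cohen_kappa_score

        expert_stage    = [1, 2, 2, 3, 4, 5, 6, 3]  # examiner-assigned CVM stages
        predicted_stage = [1, 2, 3, 3, 4, 5, 5, 3]  # classifier output
        # Weighting penalizes adjacent-stage misses less than distant ones
        print(cohen_kappa_score(expert_stage, predicted_stage, weights="linear"))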

  16. Methods for pattern selection, class-specific feature selection and classification for automated learning.

    Science.gov (United States)

    Roy, Asim; Mackin, Patrick D; Mukhopadhyay, Somnath

    2013-05-01

    This paper presents methods for training pattern (prototype) selection, class-specific feature selection and classification for automated learning. For training pattern selection, we propose a method of sampling that extracts a small number of representative training patterns (prototypes) from the dataset. The idea is to extract a set of prototype training patterns that represents each class region in a classification problem. In class-specific feature selection, we try to find a separate feature set for each class such that it is the best one to separate that class from the other classes. We then build a separate classifier for that class based on its own feature set. The paper also presents a new hypersphere classification algorithm. Hypersphere nets are similar to radial basis function (RBF) nets and belong to the group of kernel function nets. Polynomial time complexity of the methods is proven. Polynomial time complexity of learning algorithms is important to the field of neural networks. Computational results are provided for a number of well-known datasets. None of the parameters of the algorithm were fine-tuned for any of the problems solved, and this supports the idea of automation of learning methods. Automation of learning is crucial to wider deployment of learning technologies. Copyright © 2012 Elsevier Ltd. All rights reserved.
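
    A minimal sketch of the class-specific feature selection idea, substituting an ordinary logistic model for the paper's hypersphere nets (the feature count k and all names here are assumptions):

        import numpy as np
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.linear_model import LogisticRegression

        class ClassSpecificClassifier:
            # One binary model per class, each trained on its own best-k features
            def __init__(self, k=10):
                self.k, self.models = k, {}

            def fit(self, X, y):
                for c in np.unique(y):
                    sel = SelectKBest(f_classif, k=min(self.k, X.shape[1]))
                    Xc = sel.fit_transform(X, y == c)
                    clf = LogisticRegression(max_iter=1000).fit(Xc, y == c)
                    self.models[c] = (sel, clf)
                return self

            def predict(self, X):
                classes = sorted(self.models)
                scores = np.column_stack(
                    [self.models[c][1].predict_proba(
                        self.models[c][0].transform(X))[:, 1] for c in classes])
                return np.asarray(classes)[scores.argmax(axis=1)]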

  17. A method to establish seismic noise baselines for automated station assessment

    Science.gov (United States)

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
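
    As a simplified stand-in for the baseline computation (PQLX itself works with probability density functions of power spectral densities), percentile envelopes of Welch PSDs over many data segments convey the idea:

        import numpy as np
        from scipy.signal import welch

        def noise_baseline(segments, fs, nperseg=4096):
            # segments: iterable of 1-D arrays (e.g., hour-long records), fs in Hz
            psds = []
            for seg in segments:
                f, pxx = welch(seg, fs=fs, nperseg=nperseg)
                psds.append(10.0 * np.log10(pxx[1:]))  # dB, skipping the DC bin
            lo, med, hi = np.percentile(np.vstack(psds), [5, 50, 95], axis=0)
            return f[1:], lo, med, hi

    New data whose PSD strays outside the percentile envelope would then be flagged as out-of-nominal for review.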

  18. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    Nadinic, B.; Vanjak, Z.

    2004-01-01

    The INETEC Institute for Nuclear Technology developed a software package called EddyOne, which has an option for automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: general requirements for automated analysis of bobbin coil eddy current data; main approaches to automated analysis; multi-rule algorithms for data screening; landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); field experience with the EddyOne software; development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); automated analysis software qualification; conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, giving a clear advantage to the INETEC approach. It has to be pointed out that INETEC field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  19. The Development Of Mathematical Model For Automated Fingerprint Identification Systems Analysis

    International Nuclear Information System (INIS)

    Ardisasmita, M. Syamsa

    2001-01-01

    Fingerprints have a strongly oriented and periodic structure composed of dark lines of raised skin (ridges) and clear lines of lowered skin (furrows) that twist to form a distinct pattern. Although the manner in which the ridges flow is distinctive, other characteristics of the fingerprint, called minutiae, are what are most unique to the individual. These features are particular patterns consisting of terminations or bifurcations of the ridges. To assert whether two fingerprints are from the same finger or not, experts detect those minutiae. AFIS (Automated Fingerprint Identification Systems) extract and compare these features to determine a match. The classic methods of fingerprint recognition are not suitable for direct implementation in the form of computer algorithms. The creation of a finger model was, however, necessary for the development of new and better analysis algorithms. This paper presents a new numerical method of fingerprint simulation based on a mathematical model of the arrangement of dermatoglyphics and the creation of minutiae. The paper also describes the design and implementation of an automated fingerprint identification system which operates in two stages: minutiae extraction and minutiae matching

  20. Correlation of the UV-induced mutational spectra and the DNA damage distribution of the human HPRT gene: Automating the analysis

    International Nuclear Information System (INIS)

    Kotturi, G.; Erfle, H.; Koop, B.F.; Boer, J.G. de; Glickman, B.W.

    1994-01-01

    Automated DNA sequencers can be readily adapted for various types of sequence-based nucleic acid analysis; more recently, the distribution of UV photoproducts in the E. coli lacI gene was determined using techniques developed for automated fluorescence-based analysis. We have been working to improve the automated approach to damage distribution analysis. Our current method is more rigorous: we have new software that integrates the area under the individual peaks, rather than measuring the height of the curve. In addition, we now employ an internal standard, and the analysis can be partially automated. Detection limits for both major types of UV photoproducts (cyclobutane dimers and pyrimidine (6-4) pyrimidone photoproducts) are reported. The UV-induced damage distribution in the hprt gene is compared to the mutational spectra in human and rodent cells

  1. Electroencephalographic sleep of healthy children. Part II: Findings using automated delta and REM sleep measurement methods.

    Science.gov (United States)

    Coble, P A; Reynolds, C F; Kupfer, D J; Houck, P

    1987-12-01

    Using earlier developed computer-based measurement methods, delta and REM activity were examined during sleep as a function of age, gender, and time of night in 85 healthy, 6- to 16-year-old children. Chronological age was found to account most strongly for differences in automated delta and REM count measures in this age range. Increasing age was shown to be associated with a significant decline in both automated measures, but the effect was much greater for the delta count measure. The age-related decline in delta wave activity was reflected primarily in a linear decline in 2.0-3.0 Hz delta activity, that is, in the faster end of the delta frequency band. Examination of these measurements in successive NREM and REM sleep periods confirmed that, in children as in adults, delta activity decreases and REM activity increases across the night. Findings are discussed relative to those obtained in the same children using standard measurement methods.

  2. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks shown by the Markovian model for steady-state reliability computations and by the neural network for initial training patterns, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implications, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back-propagation neural network approach to compute reliability. • Markovian-based reliability assessment method. • Managerial implications are shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
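
    A compact sketch of the Markovian half of such a computation (the neural-network integration is not reproduced); the three-state model and its rates are invented for illustration:

        import numpy as np
        from scipy.linalg import expm

        def reliability(Q, p0, operational, t):
            # Probability of being in an operational state at time t for a
            # continuous-time Markov chain with generator matrix Q
            return (p0 @ expm(Q * t))[operational].sum()

        # States: 0 = working, 1 = degraded, 2 = failed (absorbing)
        Q = np.array([[-0.05,  0.04, 0.01],
                      [ 0.00, -0.10, 0.10],
                      [ 0.00,  0.00, 0.00]])
        p0 = np.array([1.0, 0.0, 0.0])
        print(reliability(Q, p0, operational=[0, 1], t=10.0))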

  3. Analysis of new bone, cartilage, and fibrosis tissue in healing murine allografts using whole slide imaging and a new automated histomorphometric algorithm

    OpenAIRE

    Zhang, Longze; Chang, Martin; Beck, Christopher A; Schwarz, Edward M; Boyce, Brendan F

    2016-01-01

    Histomorphometric analysis of histologic sections of normal and diseased bone samples, such as healing allografts and fractures, is widely used in bone research. However, the utility of traditional semi-automated methods is limited because they are labor-intensive and can have high interobserver variability depending upon the parameters being assessed, and primary data cannot be re-analyzed automatically. Automated histomorphometry has long been recognized as a solution for these issues, and ...

  4. RoboSCell: An automated single cell arraying and analysis instrument

    KAUST Repository

    Sakaki, Kelly

    2009-09-09

    Single cell research has the potential to revolutionize experimental methods in biomedical sciences and contribute to clinical practices. Recent studies suggest analysis of single cells reveals novel features of intracellular processes, cell-to-cell interactions and cell structure. The methods of single cell analysis require mechanical resolution and accuracy that is not possible using conventional techniques. Robotic instruments and novel microdevices can achieve higher throughput and repeatability; however, the development of such instrumentation is a formidable task. A void exists in the state-of-the-art for automated analysis of single cells. With the increase in interest in single cell analyses in stem cell and cancer research the ability to facilitate higher throughput and repeatable procedures is necessary. In this paper, a high-throughput, single cell microarray-based robotic instrument, called the RoboSCell, is described. The proposed instrument employs a partially transparent single cell microarray (SCM) integrated with a robotic biomanipulator for in vitro analyses of live single cells trapped at the array sites. Cells, labeled with immunomagnetic particles, are captured at the array sites by channeling magnetic fields through encapsulated permalloy channels in the SCM. The RoboSCell is capable of systematically scanning the captured cells temporarily immobilized at the array sites and using optical methods to repeatedly measure extracellular and intracellular characteristics over time. The instrument's capabilities are demonstrated by arraying human T lymphocytes and measuring the uptake dynamics of calcein acetoxymethylester, all in a fully automated fashion. © 2009 Springer Science+Business Media, LLC.

  5. USING LEARNING VECTOR QUANTIZATION METHOD FOR AUTOMATED IDENTIFICATION OF MYCOBACTERIUM TUBERCULOSIS

    Directory of Open Access Journals (Sweden)

    Endah Purwanti

    2012-01-01

    Full Text Available In this paper, we develop an automated method for the detection of tubercle bacilli in clinical specimens, principally sputum. This investigation is the first attempt to automatically identify TB bacilli in sputum using image processing and learning vector quantization (LVQ) techniques. The evaluation of the LVQ classifier, carried out on a tuberculosis dataset, shows an average accuracy of 91.33%.
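
    As a hedged illustration of the classification stage, a plain LVQ1 training loop (initialization, learning rate and epochs are generic choices, not the paper's):

        import numpy as np

        def train_lvq1(X, y, protos_per_class=2, lr=0.1, epochs=30, seed=0):
            # LVQ1: pull the winning prototype toward same-class samples,
            # push it away otherwise
            rng = np.random.default_rng(seed)
            W, wy = [], []
            for c in np.unique(y):
                idx = rng.choice(np.flatnonzero(y == c), protos_per_class,
                                 replace=False)
                W.append(X[idx]); wy += [c] * protos_per_class
            W, wy = np.vstack(W).astype(float), np.asarray(wy)
            for epoch in range(epochs):
                a = lr * (1.0 - epoch / epochs)  # decaying learning rate
                for i in rng.permutation(len(X)):
                    j = np.argmin(((W - X[i]) ** 2).sum(axis=1))
                    step = a * (X[i] - W[j])
                    W[j] += step if wy[j] == y[i] else -step
            return W, wy

        def predict_lvq(W, wy, X):
            # Label of the nearest prototype
            return wy[((X[:, None, :] - W[None, :, :]) ** 2).sum(-1).argmin(1)]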

  6. Automated Inspection Algorithm for Thick Plate Using Dual Light Switching Lighting Method

    OpenAIRE

    Yong-Ju Jeon; Doo-chul Choi; Jong Pil Yun; Changhyun Park; Homoon Bae; Sang Woo Kim

    2012-01-01

    This paper presents an automated inspection algorithm for a thick plate. Thick plates typically have various types of surface defects, such as scabs, scratches, and roller marks. These defects have individual characteristics including brightness and shape. Therefore, it is not simple to detect all the defects. In order to solve these problems and to detect defects more effectively, we propose a dual light switching lighting method and a defect detection algorithm based on ...

  7. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    Science.gov (United States)

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate the functional significance of a majority of genome sequences and their long-range interactions. As a detailed examination of genome organization and function requires very high quality genome sequence, the objective of this study was to improve the reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools propose an expert mode for a user who can decide on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long insert size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super scaffolds. Finally, a consensus gene annotation was projected on the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes compared to the previous 70%. Unknown sites (N) were reduced from 17.3 to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. Bioinformatics tools developed in this work can be used to improve genome sequence assemblies in

  8. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...

  9. [Snoring analysis methods].

    Science.gov (United States)

    Fiz Fernández, José Antonio; Solà Soler, Jordi; Jané Campos, Raimon

    2011-06-11

    Snoring is a breathing sound that originates during sleep, either nocturnal or diurnal. Snoring may be inspiratory, expiratory, or it may occupy the whole breathing cycle. It is caused by the vibration of different tissues of the upper airway. Many procedures have been used to analyze it, from simple interrogation to standardized questionnaires to more sophisticated acoustic methods developed thanks to the advance of biomedical techniques in recent years. The present work describes the current state of the art of snoring analysis procedures. Copyright © 2010 Elsevier España, S.L. All rights reserved.

  10. Method of signal analysis

    International Nuclear Information System (INIS)

    Berthomier, Charles

    1975-01-01

    A method capable of handling the amplitude and frequency-time laws of a certain kind of geophysical signal is described here. This method is based upon the analytical signal idea of Gabor and Ville, which is constructed either in the time domain by adding an imaginary part to the real signal (the in-quadrature signal), or in the frequency domain by suppressing negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytical signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to results obtained by classical systems using filters, i.e. based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined then leads to an examination of the principle of the method, to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the properties of the analysis system than the usual representation, and which moreover has the advantage of being obtainable practically in real time [fr
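
    The construction described above maps directly onto standard tools; a minimal sketch of the envelope and instantaneous frequency via the analytical signal:

        import numpy as np
        from scipy.signal import hilbert

        def envelope_and_inst_freq(x, fs):
            # z = x + i*H(x); envelope = |z|, instantaneous frequency is the
            # time derivative of the unwrapped phase divided by 2*pi
            z = hilbert(x)
            envelope = np.abs(z)
            inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2.0 * np.pi)
            return envelope, inst_freq

        fs = 1000.0
        t = np.arange(0, 1, 1 / fs)
        env, f_inst = envelope_and_inst_freq(np.sin(2 * np.pi * 5 * t), fs)
        # f_inst hovers near 5 Hz; env is flat for a pure tone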

  11. Quantification of Eosinophilic Granule Protein Deposition in Biopsies of Inflammatory Skin Diseases by Automated Image Analysis of Highly Sensitive Immunostaining

    Directory of Open Access Journals (Sweden)

    Peter Kiehl

    1999-01-01

    Full Text Available Eosinophilic granulocytes are major effector cells in inflammation. Extracellular deposition of toxic eosinophilic granule proteins (EGPs), but not the presence of intact eosinophils, is crucial for their functional effect in situ. As even recent morphometric approaches to quantifying the involvement of eosinophils in inflammation have been based only on cell counting, we developed a new method for the cell‐independent quantification of EGPs by image analysis of immunostaining. Highly sensitive, automated immunohistochemistry was done on paraffin sections of inflammatory skin diseases with 4 different primary antibodies against EGPs. Image analysis of immunostaining was performed by colour translation, linear combination and automated thresholding. Using strictly standardized protocols, the assay was proven to be specific and accurate concerning segmentation in 8916 fields of 520 sections, well reproducible in repeated measurements and reliable over a 16‐week observation period. The method may be valuable for the cell‐independent segmentation of immunostaining in other applications as well.

  12. Long-term live cell imaging and automated 4D analysis of drosophila neuroblast lineages.

    Directory of Open Access Journals (Sweden)

    Catarina C F Homem

    Full Text Available The developing Drosophila brain is a well-studied model system for neurogenesis and stem cell biology. In the Drosophila central brain, around 200 neural stem cells called neuroblasts undergo repeated rounds of asymmetric cell division. These divisions typically generate a larger self-renewing neuroblast and a smaller ganglion mother cell that undergoes one terminal division to create two differentiating neurons. Although single mitotic divisions of neuroblasts can easily be imaged in real time, the lack of long term imaging procedures has limited the use of neuroblast live imaging for lineage analysis. Here we describe a method that allows live imaging of cultured Drosophila neuroblasts over multiple cell cycles for up to 24 hours. We describe a 4D image analysis protocol that can be used to extract cell cycle times and growth rates from the resulting movies in an automated manner. We use it to perform lineage analysis in type II neuroblasts where clonal analysis has indicated the presence of a transit-amplifying population that potentiates the number of neurons. Indeed, our experiments verify type II lineages and provide quantitative parameters for all cell types in those lineages. As defects in type II neuroblast lineages can result in brain tumor formation, our lineage analysis method will allow more detailed and quantitative analysis of tumorigenesis and asymmetric cell division in the Drosophila brain.

  13. A method for the automated, reliable retrieval of publication-citation records.

    Directory of Open Access Journals (Sweden)

    Derek Ruths

    Full Text Available BACKGROUND: Publication records and citation indices often are used to evaluate academic performance. For this reason, obtaining or computing them accurately is important. This can be difficult, largely due to a lack of complete knowledge of an individual's publication list and/or lack of time available to manually obtain or construct the publication-citation record. While online publication search engines have somewhat addressed these problems, using raw search results can yield inaccurate estimates of publication-citation records and citation indices. METHODOLOGY: In this paper, we present a new, automated method that produces estimates of an individual's publication-citation record from an individual's name and a set of domain-specific vocabulary that may occur in the individual's publication titles. Because this vocabulary can be harvested directly from a research web page or online (partial) publication list, our method delivers an easy way to obtain estimates of a publication-citation record and the relevant citation indices. Our method works by applying a series of stringent name and content filters to the raw publication search results returned by an online publication search engine. In this paper, our method is run using Google Scholar, but the underlying filters can be easily applied to any existing publication search engine. When compared against a manually constructed data set of individuals and their publication-citation records, our method provides significant improvements over raw search results. The estimated publication-citation records returned by our method have an average sensitivity of 98% and specificity of 72% (in contrast to a raw search result specificity of less than 10%). When citation indices are computed using these records, the estimated indices are within 10% of the true value, compared to raw search results, which overestimate by, on average, 75%. CONCLUSIONS: These results confirm that our method provides
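
    In the spirit of the filters described (the actual method applies a more stringent series of name and content filters), a toy vocabulary filter might look like this; the record structure is an assumption:

        def filter_records(records, author, vocabulary, min_hits=1):
            # Keep hits whose author list matches and whose title contains at
            # least min_hits domain-vocabulary terms
            vocab = {w.lower() for w in vocabulary}
            kept = []
            for rec in records:  # rec: dict with 'authors' and 'title'
                if author.lower() not in (a.lower() for a in rec["authors"]):
                    continue
                if sum(w in vocab for w in rec["title"].lower().split()) >= min_hits:
                    kept.append(rec)
            return kept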

  14. Comparison of subjective and fully automated methods for measuring mammographic density.

    Science.gov (United States)

    Moshina, Nataliia; Roman, Marta; Sebuødegård, Sofie; Waade, Gunvor G; Ursin, Giske; Hofvind, Solveig

    2018-02-01

    Background Breast radiologists of the Norwegian Breast Cancer Screening Program subjectively classified mammographic density using a three-point scale between 1996 and 2012 and changed to the fourth edition of the BI-RADS classification in 2013. In 2015, an automated volumetric breast density assessment software was installed at two screening units. Purpose To compare volumetric breast density measurements from the automated method with two subjective methods: the three-point scale and the BI-RADS density classification. Material and Methods Information on subjective and automated density assessment was obtained from screening examinations of 3635 women recalled for further assessment due to positive screening mammography between 2007 and 2015. The score of the three-point scale (I = fatty; II = medium dense; III = dense) was available for 2310 women. The BI-RADS density score was provided for 1325 women. Mean volumetric breast density was estimated for each category of the subjective classifications. The automated software assigned volumetric breast density to four categories. The agreement between BI-RADS and volumetric breast density categories was assessed using weighted kappa (kw). Results Mean volumetric breast density was 4.5%, 7.5%, and 13.4% for categories I, II, and III of the three-point scale, respectively, and 4.4%, 7.5%, 9.9%, and 13.9% for the BI-RADS density categories, respectively (P for trend < 0.001 for both subjective classifications). The agreement between BI-RADS and volumetric breast density categories was kw = 0.5 (95% CI = 0.47-0.53; P < 0.001). Conclusion Mean values of volumetric breast density increased with increasing density category of the subjective classifications. The agreement between BI-RADS and volumetric breast density categories was moderate.

  15. SWOT Analysis of Automation for Cash and Accounts Control in Construction

    OpenAIRE

    Mariya Deriy

    2013-01-01

    The possibility of computerizing control over accounting and information systems data in terms of cash and payments in a company's practical activity has been analyzed, provided that the problem of establishing a well-functioning single computer network between the different units of a developing company is solved. The current state of control organization and the possibility of its automation have been examined. SWOT analysis of control automation to identify its strengths and weaknesses, obstac...

  16. Automated analysis of radiation damage on plastic surfaces; Analisis automatizado de danos por radiacion en superficies plasticas

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, C.; Camacho M, E.; Tavera, L.; Balcazar, M. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1990-02-15

    Analysis of damage caused by radiation in a polymer that, like acrylic, is characterized by the optical properties of polished surfaces, uniformity and chemical resistance; it is resistant up to temperatures of 150 degrees centigrade and weighs approximately half as much as glass. An objective of this work is the development of a method that analyzes, in an automated fashion by means of an image analyzer, the surface damage induced by radiation in plastic materials. (Author)

  17. A new automated method for the determination of cross-section limits in ephemeral gullies

    Science.gov (United States)

    Castillo, Carlos; Ángel Campo-Bescós, Miguel; Casalí, Javier; Giménez, Rafael

    2017-04-01

    The assessment of gully erosion relies on the estimation of the soil volume enclosed by cross-section limits. Both 3D and 2D methods require a methodology for the determination of the cross-section limits, which has traditionally been carried out in two ways: a) by visual inspection of the cross-section by an expert operator; b) by the automated identification of thresholds for different geometrical variables, such as elevation, slope or plan curvature, obtained from the cross-section profile. However, for these latter methods the thresholds are typically not of general application because they depend on absolute values valid only for the local gully conditions from which they were derived. In this communication we evaluate an automated method for cross-section delimitation of ephemeral gullies and compare its performance with the visual assessment provided by five scientists experienced in gully erosion assessment, defining gully width, depth and area for a total of 60 ephemeral gully cross-sections obtained from field surveys conducted on agricultural plots in Navarra (Spain). The automated method depends only on the calculation of a simple geometrical measurement, the bank trapezoid area, for every point of each gully bank. This right-angled trapezoid is defined by the elevation of a given point, the minimum elevation and the extremes of the cross-section. The gully limit for each bank is determined by the point in the bank with the maximum trapezoid area. The comparison of the estimates among the different expert operators showed large coefficients of variation (up to 70%) in a number of cross-sections, larger for cross-section width and area and smaller for cross-section depth. The automated method produced results comparable to those obtained by the experts and was the procedure with the highest average correlation with the rest of the methods for the three cross-section parameters. The errors of the automated
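
    The limit-finding step reduces to a few lines of code. Because the trapezoid geometry is only partially specified above, the construction below (parallel sides equal to the heights of the point and of the section extreme above the minimum elevation, width equal to their horizontal separation) is one plausible reading rather than the authors' definitive formula:

        import numpy as np

        def bank_limit_index(x, z):
            # x, z: horizontal position and elevation along one bank, ordered
            # from the cross-section extreme (x[0], z[0]) toward the channel.
            # The candidate gully limit is the point with maximum trapezoid area.
            x, z = np.asarray(x, float), np.asarray(z, float)
            z_min = z.min()
            areas = 0.5 * ((z - z_min) + (z[0] - z_min)) * np.abs(x - x[0])
            return int(np.argmax(areas))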

  18. [Automated fluorescent analysis of STR profiling and sex determination].

    Science.gov (United States)

    Jiang, B; Liang, S; Guo, J

    2000-08-01

    Denaturing PAGE coupled with the ABI377 fluorescent automated DNA sequencer was used to test the performance and reproducibility of automated DNA profiling systems at the vWA31A, TH01, F13A01, FES, TPOX, CSF1PO and Amelogenin loci. The allele designation windows for the 7 genetic markers were established and implemented in the genotype reading software. Alleles differing by just 1 bp in length could easily be discriminated. Furthermore, interpretation guidelines were outlined for the 7 genetic systems by investigating the relative peak areas of heterozygote peaks and relative stutter peak areas in various monoplex systems. Our results indicate that if the ratio between two peaks is equal to or higher than 0.404, a heterozygote can be called; otherwise, a homozygote call should be made.

  19. Alert management for home healthcare based on home automation analysis.

    Science.gov (United States)

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be controlled by offering people autonomy at home by means of information technology. In this paper, we present an original and sensorless alert management solution which performs multimedia and home automation service discrimination and extracts highly regular home activities for use as sensors for alert management. The results on simulation data, based on a real context, allow us to evaluate our approach before application to real data.

  20. Automated handling for SAF batch furnace and chemistry analysis operations

    International Nuclear Information System (INIS)

    Bowen, W.W.; Sherrell, D.L.; Wiemers, M.J.

    1981-01-01

    The Secure Automated Fabrication Program is developing a remotely operated breeder reactor fuel pin fabrication line. The equipment will be installed in the Fuels and Materials Examination Facility being constructed at Hanford, Washington. Production is scheduled to start in mid-1986. The application of small pneumatically operated industrial robots for loading and unloading product into and out of batch furnaces and for distribution and handling of chemistry samples is described

  1. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  2. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2013-07-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing its first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 TB of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize the data range and variance of each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed, and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.
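
    A minimal sketch of such binary plausibility checks (the threshold values here are placeholders; as the paper describes, real thresholds would be derived from historical site data):

        import numpy as np

        def plausibility_flags(x, lo, hi, max_sigma=4.0):
            # Range test against (lo, hi) thresholds plus a deviation test
            # against the variance of the series itself
            x = np.asarray(x, float)
            range_fail = (x < lo) | (x > hi)
            dev_fail = np.abs(x - np.nanmean(x)) > max_sigma * np.nanstd(x)
            return range_fail, dev_fail

        # Thresholds might come from climatological percentiles of past data:
        # lo, hi = np.nanpercentile(history, [0.1, 99.9])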

  3. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data

    International Nuclear Information System (INIS)

    Shrestha, Sachin L.; Breen, Andrew J.; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P.; Cairney, Julie M.

    2014-01-01

    The identification and quantification of the different ferrite microconstituents in steels have long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientations, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. - Highlights: • New automated method to identify and quantify ferrite microconstituents in HSLA steels is presented. • Unique characteristics of the ferrite microconstituents are investigated using EBSD. • Characteristics of ferrite microconstituents are exploited to identify the type of ferrite grains within the steel's microstructure. • The identified ferrite grains are linked to their grain size for area fraction calculations

  4. A semi-automated Raman micro-spectroscopy method for morphological and chemical characterizations of microplastic litter.

    Science.gov (United States)

    Frère, L; Paul-Pont, I; Moreau, J; Soudant, P; Lambert, C; Huvet, A; Rinnert, E

    2016-12-15

    Every step of microplastic analysis (collection, extraction and characterization) is time-consuming, representing an obstacle to the implementation of large-scale monitoring. This study proposes a semi-automated Raman micro-spectroscopy method coupled to static image analysis that allows the screening of a large quantity of microplastics in a time-effective way with minimal machine operator intervention. The method was validated using 103 particles collected at the sea surface spiked with 7 standard plastics, and morphological and chemical characterization of particles was performed on an environmental sample (n=962 particles). The identification rate was 75% and decreased significantly as a function of particle size. Microplastics represented 71% of the identified particles, and significant size differences were observed: polystyrene was mainly found in the 2-5 mm range (59%), polyethylene in the 1-2 mm range (40%) and polypropylene in the 0.335-1 mm range (42%). Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Automated capillary Western dot blot method for the identity of a 15-valent pneumococcal conjugate vaccine.

    Science.gov (United States)

    Hamm, Melissa; Ha, Sha; Rustandi, Richard R

    2015-06-01

    Simple Western is a new technology that allows for the separation, blotting, and detection of proteins similar to a traditional Western except in a capillary format. Traditionally, identity assays for biological products are performed using either an enzyme-linked immunosorbent assay (ELISA) or a manual dot blot Western. Both techniques are usually very tedious, labor-intensive, and complicated for multivalent vaccines, and they can be difficult to transfer to other laboratories. An advantage this capillary Western technique has over the traditional manual dot blot Western method is the speed and the automation of electrophoresis separation, blotting, and detection steps performed in 96 capillaries. This article describes details of the development of an automated identity assay for a 15-valent pneumococcal conjugate vaccine, PCV15-CRM197, using capillary Western technology. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Quantization of polyphenolic compounds in histological sections of grape berries by automated color image analysis

    Science.gov (United States)

    Clement, Alain; Vigouroux, Bertnand

    2003-04-01

    We present new results in applied color image analysis that demonstrate the significant influence of soil on the localization and appearance of polyphenols in grapes. These results have been obtained with a new unsupervised classification algorithm based on hierarchical analysis of color histograms. The process is automated thanks to a software platform we developed specifically for color image analysis and its applications.

  7. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available This article reviews the basic statistical methods used in the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods was developed.

  8. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    Directory of Open Access Journals (Sweden)

    Alfonso Baldi

    2010-03-01

    Full Text Available Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible to the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval (CBIR) systems.

  9. Immunohistochemical Ki-67/KL1 double stains increase accuracy of Ki-67 indices in breast cancer and simplify automated image analysis

    DEFF Research Database (Denmark)

    Nielsen, Patricia S; Bentzer, Nina K; Jensen, Vibeke

    2014-01-01

    This study aims to detect the difference in accuracy and precision between manual indices of single and double stains, to develop an automated quantification of double stains, and to explore the relation between automated indices and tumor characteristics when quantified ... in different regions: hot spots, global tumor areas, and invasive fronts. MATERIALS AND METHODS: Paraffin-embedded, formalin-fixed tissue from 100 consecutive patients with invasive breast cancer was immunohistochemically stained for Ki-67 and Ki-67/KL1. Ki-67 was manually scored in different regions by 2 ... observers and automated image analysis. RESULTS: Indices were predominantly higher for single stains than double stains (P≤0.002), yet the difference between observers was statistically significant (P ...); manual and automated indices ranged from 0 ... by digital image analysis.

  10. Integrated and automated data analysis for neuronal activation studies using positron emission tomography. Methodology and applications

    International Nuclear Information System (INIS)

    Minoshima, Satoshi; Arimizu, Noboru; Koeppe, R.A.; Kuhl, D.E.

    1994-01-01

    A data analysis method was developed for neuronal activation studies using [15O]water positron emission tomography (PET). The method consists of several procedures including intra-subject head motion correction (co-registration), detection of the mid-sagittal plane of the brain, detection of the intercommissural (AC-PC) line, linear scaling and non-linear warping for anatomical standardization, pixel-by-pixel statistical analysis, and data display. All steps are performed in three dimensions and are fully automated. Each step was validated using a brain phantom, computer simulations, and data from human subjects, demonstrating accuracy and reliability of the procedure. The method was applied to human neuronal activation studies using vibratory and visual stimulations. The method detected significant blood flow increases in the primary sensory cortices as well as in other regions such as the secondary sensory cortex and cerebellum. The proposed method should enhance application of PET neuronal activation studies to the investigation of higher-order human brain functions. (author) 38 refs
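
    To make the pixel-by-pixel statistical step concrete, here is a minimal Python sketch that computes a voxel-wise paired t-map from synthetic baseline and activation volumes (assumed already co-registered and anatomically standardized). The array shapes, noise model, and uncorrected threshold are illustrative assumptions, not the authors' exact statistics.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical co-registered, anatomically standardized PET volumes:
# shape (subjects, x, y, z), already normalized for global flow.
baseline   = rng.normal(50.0, 5.0, size=(10, 32, 32, 16))
activation = baseline + rng.normal(0.0, 5.0, size=(10, 32, 32, 16))
activation[:, 10:14, 10:14, 6:9] += 8.0   # simulated focal flow increase

# Paired difference per subject, then a voxel-wise one-sample t statistic.
diff = activation - baseline
n = diff.shape[0]
t = diff.mean(axis=0) / (diff.std(axis=0, ddof=1) / np.sqrt(n))

# Threshold the map (uncorrected; real studies correct for multiple tests).
significant = np.abs(t) > 4.0
print(f"{significant.sum()} voxels exceed |t| = 4.0")
```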

  11. Sleep-spindle detection: crowdsourcing and evaluating performance of experts, non-experts and automated methods

    DEFF Research Database (Denmark)

    Warby, Simon C.; Wendt, Sabrina Lyngbye; Welinder, Peter

    2014-01-01

    The study set out to crowdsource spindle identification by human experts and non-experts, and compared their performance with that of automated detection algorithms in data from middle- to older-aged subjects from the general population. Methods for forming group consensus and evaluating the performance ... were also refined. The findings indicate ... that crowdsourcing the scoring of sleep data is an efficient method to collect large data sets, even for difficult tasks such as spindle identification. Further refinements to spindle detection algorithms are needed for middle- to older-aged subjects.

  12. Automation of C-terminal sequence analysis of 2D-PAGE separated proteins

    Directory of Open Access Journals (Sweden)

    P.P. Moerman

    2014-06-01

    Full Text Available Experimental assignment of the protein termini remains essential to define the functional protein structure. Here, we report on the improvement of a proteomic C-terminal sequence analysis method. The approach aims to discriminate the C-terminal peptide in a CNBr digest, where cleavage of Met-Xxx peptide bonds yields internal peptides ending in a homoserine lactone (hsl) derivative. pH-dependent partial opening of the lactone ring results in the formation of doublets for all internal peptides. C-terminal peptides are distinguished as singlet peaks by MALDI-TOF MS, and MS/MS is then used for their identification. We present a fully automated protocol established on a robotic liquid-handling station.
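
    The singlet-versus-doublet logic described above can be sketched as a simple peak-list scan: any peak lacking a partner offset by the mass of one water (from lactone ring opening) is a C-terminal candidate. The peak masses and tolerance below are invented for illustration.

```python
# Hypothetical centroided MALDI-TOF peak list (m/z values, singly charged).
peaks = [812.41, 830.42, 1204.63, 1222.64, 1533.78]

LACTONE_SHIFT = 18.011   # homoserine lactone ring opening adds one water
TOL = 0.05               # m/z matching tolerance (instrument-dependent)

def is_doublet_member(mz, peak_list):
    """True if mz has a partner peak offset by the lactone mass shift."""
    return any(abs(other - mz - LACTONE_SHIFT) < TOL
               or abs(mz - other - LACTONE_SHIFT) < TOL
               for other in peak_list)

candidates = [mz for mz in peaks if not is_doublet_member(mz, peaks)]
print("C-terminal peptide candidates (singlets):", candidates)
# -> [1533.78]; 812/830 and 1204/1222 pair up as internal-peptide doublets
```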

  13. Results of Automated Retinal Image Analysis for Detection of Diabetic Retinopathy from the Nakuru Study, Kenya

    DEFF Research Database (Denmark)

    Juul Bøgelund Hansen, Morten; Abramoff, M. D.; Folk, J. C.

    2015-01-01

    Objective: Digital retinal imaging is an established method of screening for diabetic retinopathy (DR). It has been established that currently about 1% of the world's blind or visually impaired is due to DR. However, the increasing prevalence of diabetes mellitus and DR is creating an increased ... workload on those with expertise in grading retinal images. Safe and reliable automated analysis of retinal images may support screening services worldwide. This study aimed to compare the Iowa Detection Program's (IDP) ability to detect diabetic eye diseases (DED) to human grading carried out at Moorfields ... The IDP ... gave an AUC of 0.878 (95% CI 0.850-0.905). It showed a negative predictive value of 98%. The IDP missed no vision-threatening retinopathy in any patients, and none of the false negative cases met criteria for treatment. Conclusions: In this epidemiological sample, the IDP's grading was comparable ...

  14. Automated multivariate analysis of comprehensive two-dimensional gas chromatograms of petroleum

    DEFF Research Database (Denmark)

    Skov, Søren Furbo

    Petroleum is an economically and industrially important resource. Crude oil must be refined before use to ensure suitable properties of the product. Among the processes used in this refining are distillation and desulfurization. In order to optimize these processes, it is essential to understand ... them. Comprehensive two-dimensional gas chromatography (GC×GC) is a method for analyzing the volatile parts of a sample. It can separate hundreds or thousands of compounds based on their boiling point, polarity and polarizability. This makes it ideally suited for petroleum analysis. The number ... impossible to find it. For a special class of models, multi-way models, unique solutions often exist, meaning that the underlying phenomena can be found. I have tested this class of models on GC×GC data from petroleum and conclude that more work is needed before they can be automated. I demonstrate how ...

  15. Automated analysis of image mammogram for breast cancer diagnosis

    Science.gov (United States)

    Nurhasanah, Sampurno, Joko; Faryuni, Irfana Diah; Ivansyah, Okto

    2016-03-01

    Medical imaging helps doctors diagnose and detect diseases that attack the inside of the body without surgery. A mammogram is a medical image of the inner breast. Diagnosis of breast cancer needs to be done in detail and as soon as possible to determine the next medical treatment. The aim of this work is to increase the objectivity of clinical diagnosis by using fractal analysis. This study applies a fractal method based on 2D Fourier analysis to determine the density of normal and abnormal tissue, and applies a segmentation technique based on the K-means clustering algorithm to abnormal images to determine the boundary of the organ and calculate the area of the segmented region. The results show that the fractal method based on 2D Fourier analysis can be used to distinguish between normal and abnormal breasts, and that segmentation with the K-means clustering algorithm is able to generate the boundaries of normal and abnormal tissue, so that the area of the abnormal tissue can be determined.
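
    A common way to realize the fractal step is to fit the slope of the radially averaged 2D Fourier power spectrum and convert it to a fractal dimension. The sketch below does this for a synthetic 1/f^beta surface; the fBm relation D = (8 - beta)/2 and all test parameters are assumptions for illustration, not necessarily the exact formulation used in the paper.

```python
import numpy as np

def fractal_dimension_fourier(img):
    """Estimate fractal dimension from the slope of the radially averaged
    2D power spectrum, via the fBm relation D = (8 - beta) / 2 (assumed)."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    h, w = img.shape
    y, x = np.indices((h, w))
    r = np.hypot(x - w // 2, y - h // 2).astype(int)
    # Radial average of the power over each integer-frequency ring.
    radial = np.bincount(r.ravel(), weights=power.ravel()) / np.bincount(r.ravel())
    freqs = np.arange(1, min(h, w) // 2)        # skip DC, stay in band
    slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
    return (8.0 + slope) / 2.0                  # beta = -slope

# Synthesize a 1/f^beta surface with beta = 3 (expected D = 2.5) as a check.
rng = np.random.default_rng(1)
h = w = 128
fy, fx = np.fft.fftfreq(h)[:, None], np.fft.fftfreq(w)[None, :]
radius = np.hypot(fx, fy)
radius[0, 0] = 1.0                              # avoid division by zero at DC
spectrum = (rng.normal(size=(h, w)) + 1j * rng.normal(size=(h, w))) * radius ** -1.5
img = np.fft.ifft2(spectrum).real
print(f"estimated fractal dimension: {fractal_dimension_fourier(img):.2f}")
```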

  16. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-01-01

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples needs to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and is also exhibiting signs of a possible synergistic effect in the presence of iron.

  17. Automated gas bubble imaging at sea floor - a new method of in situ gas flux quantification

    Science.gov (United States)

    Thomanek, K.; Zielinski, O.; Sahling, H.; Bohrmann, G.

    2010-06-01

    Photo-optical systems are common in marine sciences and have been extensively used in coastal and deep-sea research. However, due to technical limitations, photo images in the past had to be processed manually or semi-automatically. Recent advances in technology have rapidly improved image recording, storage and processing capabilities, which are used in a new concept of automated in situ gas quantification by photo-optical detection. The design of an in situ high-speed image acquisition and automated data processing system is reported ("Bubblemeter"). New strategies have been followed with regard to back-light illumination, bubble extraction, automated image processing and data management. This paper presents the design of the novel method, its validation procedures and calibration experiments. The system will be positioned on and recovered from the sea floor using a remotely operated vehicle (ROV). It is able to measure bubble flux rates up to 10 L/min with a maximum error of 33% under worst-case conditions. The Bubblemeter has been successfully deployed at a water depth of 1023 m at the Makran accretionary prism offshore Pakistan during a research expedition with R/V Meteor in November 2007.

  18. A Machine Learning Approach to Automated Gait Analysis for the Noldus CatWalk™ System.

    Science.gov (United States)

    Frohlich, Holger; Claes, Kasper; De Wolf, Catherine; Van Damme, Xavier; Michel, Anne

    2017-08-24

    Gait analysis of animal disease models can provide valuable insights into in vivo compound effects and thus help in preclinical drug development. The purpose of this paper is to establish a computational gait analysis approach for the Noldus CatWalk™ system, in which footprints are automatically captured and stored. We present a, to our knowledge, first machine-learning-based approach for the CatWalk™ system, which comprises step decomposition, definition and extraction of meaningful features, multivariate step sequence alignment, feature selection, and training of different classifiers (Gradient Boosting Machine, Random Forest, Elastic Net). Using animal-wise leave-one-out cross-validation, we demonstrate that with our method we can reliably separate movement patterns of a putative Parkinson's disease (PD) animal model and several control groups. Furthermore, we show that we can predict the time point after and the type of different brain lesions, and can even forecast the brain region where the intervention was applied. We provide an in-depth analysis of the features involved in our classifiers via statistical techniques for model interpretation. A machine learning method for automated analysis of data from the Noldus CatWalk™ system was established. Our work shows the ability of machine learning to discriminate pharmacologically relevant animal groups based on their walking behavior in a multivariate manner. Further interesting aspects of the approach include the ability to learn from past experiments, to improve as more data arrive, and to make predictions for single animals in future studies.
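
    The animal-wise leave-one-out cross-validation described above is straightforward to reproduce with scikit-learn's grouped splitters. The sketch below uses synthetic gait features and toy lesion/control labels (all invented for illustration); it is not the authors' pipeline, only a minimal demonstration of group-wise validation with a gradient boosting classifier.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(2)

# Hypothetical gait feature matrix: one row per run, grouped by animal.
n_runs, n_features = 120, 20
X = rng.normal(size=(n_runs, n_features))
animal_id = np.repeat(np.arange(12), 10)        # 12 animals, 10 runs each
y = (animal_id % 2 == 0).astype(int)            # lesion vs control (toy labels)
X[y == 1] += 0.8                                # injected group difference

# Animal-wise leave-one-out: all runs of one animal form the test fold,
# so the classifier is never tested on an animal it was trained on.
scores = cross_val_score(GradientBoostingClassifier(), X, y,
                         groups=animal_id, cv=LeaveOneGroupOut())
print(f"mean accuracy across held-out animals: {scores.mean():.2f}")
```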

  19. Automated microscopic characterization of metallic ores with image analysis: a key to improve ore processing. I: test of the methodology

    International Nuclear Information System (INIS)

    Berrezueta, E.; Castroviejo, R.

    2007-01-01

    Ore microscopy has traditionally been an important support for controlling ore processing, but the volume of present-day processes is beyond the reach of human operators. Automation is therefore compulsory, but its development through digital image analysis (DIA) is limited by various problems, such as the similarity in reflectance values of some important ores, their anisotropism, and the performance of instruments and methods. The results presented show that automated identification and quantification by DIA are possible through multiband (RGB) determinations with a research-grade 3CCD video camera on a reflected-light microscope. These results were obtained by systematic measurement of selected ores accounting for most of the industrial applications. Polarized light is avoided, so the effects of anisotropism can be neglected. Quality control at various stages and statistical analysis are important, as is the application of complementary criteria (e.g. metallogenetic). The sequential methodology is described and illustrated through practical examples. (Author)

  20. Predicting blood transfusion using automated analysis of pulse oximetry signals and laboratory values.

    Science.gov (United States)

    Shackelford, Stacy; Yang, Shiming; Hu, Peter; Miller, Catriona; Anazodo, Amechi; Galvagno, Samuel; Wang, Yulei; Hartsky, Lauren; Fang, Raymond; Mackenzie, Colin

    2015-10-01

    Identification of hemorrhaging trauma patients and prediction of blood transfusion needs in near real time will expedite care of the critically injured. We hypothesized that automated analysis of pulse oximetry signals in combination with laboratory values and vital signs obtained at the time of triage would predict the need for blood transfusion with accuracy greater than that of triage vital signs or pulse oximetry analysis alone. Continuous pulse oximetry signals were recorded for directly admitted trauma patients with an abnormal prehospital shock index (heart rate [HR] / systolic blood pressure) of 0.62 or greater. Predictions of blood transfusion within 24 hours were compared using DeLong's method for areas under the receiver operating characteristic (AUROC) curves to determine the optimal combination of triage vital signs (prehospital HR + systolic blood pressure), pulse oximetry features (40 waveform features, O2 saturation, HR), and laboratory values (hematocrit, electrolytes, bicarbonate, prothrombin time, international normalization ratio, lactate) in multivariate logistic regression models. We enrolled 1,191 patients; 339 were excluded because of incomplete data; 40 received blood within 3 hours; and 14 received massive transfusion. Triage vital signs predicted need for transfusion within 3 hours (AUROC, 0.59) and massive transfusion (AUROC, 0.70). Pulse oximetry for 15 minutes predicted transfusion more accurately than triage vital signs for both time frames (3-hour AUROC, 0.74; p = 0.004) (massive transfusion AUROC, 0.88; p ...) ... transfusion prediction (3-hour AUROC, 0.84; p ...; massive transfusion AUROC, 0.91; p ...) ... predicted blood transfusion during trauma resuscitation more accurately than triage vital signs or pulse oximetry analysis alone. Results suggest automated calculations from a noninvasive vital sign monitor interfaced with a point-of-care laboratory device may support clinical decisions by recognizing patients with hemorrhage sufficient to need transfusion. Epidemiologic ...

  1. Automated MRI Volumetric Analysis in Epilepsy Patients with Rasmussen’s Syndrome

    Science.gov (United States)

    Wang, Z. Irene; Krishnan, Balu; Shattuck, David W; Leahy, Richard M; Moosa, Ahsan NV; Wyllie, Elaine; Burgess, Richard C; Al-Sharif, Noor B; Joshi, Anand A; Alexopoulos, Andreas V; Mosher, John C; Udayasankar, Unni; Jones, Stephen E

    2016-01-01

    Background and Purpose To apply automated quantitative volumetric MRI analyses to patients diagnosed with Rasmussen’s encephalitis (RE), to determine the predictive value of lobar volumetric measures, and to assess regional atrophy difference and monitor disease progression using these measures. Materials and Methods Nineteen patients (42 scans) with diagnosed RE were studied. Two control groups were used: one with 42 age- and gender-matched normal subjects; the other with 42 non-RE epilepsy patients with the same disease duration as RE patients. Volumetric analysis was performed on T1-weighted images using BrainSuite. Ratios of volumes from the affected hemisphere divided by those from the unaffected hemisphere were used as input to a logistic regression classifier, which was trained to discriminate patients from controls. Using the classifier, we compared the predictive accuracy of all the volumetric measures. These ratios were further used to assess regional atrophy difference and to correlate with epilepsy duration. Results Interhemispheric and frontal lobe ratios had the best prediction accuracy to separate RE patients from normal and non-RE epilepsy controls. The insula showed significantly more atrophy compared to all the other cortical regions. Patients with longitudinal scans showed progressive volume loss of the affected hemisphere. Atrophy of the frontal lobe and insula correlated significantly with epilepsy duration. Conclusions Automated quantitative volumetric analysis provides accurate separation of RE patients from normal controls and non-RE epilepsy patients, and thus may assist diagnosis of RE. Volumetric analysis could also be included as part of followup for RE patients to assess disease progression. PMID:27609620

  2. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies through tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow taking advantage of particular features of the nanocrystals: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulating them in different materials while retaining native luminescence, providing the means for implementing renewable chemosensors or even for using more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  3. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how of VTT in reliability and safety design and analysis techniques has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety continues in a number of research and development projects.

  4. Geena 2, improved automated analysis of MALDI/TOF mass spectra.

    Science.gov (United States)

    Romano, Paolo; Profumo, Aldo; Rocco, Mattia; Mangerini, Rosa; Ferri, Fabio; Facchiano, Angelo

    2016-03-02

    Mass spectrometry (MS) is producing high volumes of data supporting oncological sciences, especially for translational research. Most of the related elaborations can be carried out by combining existing tools at different levels, but little is currently available for the automation of the fundamental steps. For the analysis of MALDI/TOF spectra, a number of pre-processing steps are required, including joining of isotopic abundances for a given molecular species, normalization of signals against an internal standard, background noise removal, averaging multiple spectra from the same sample, and aligning spectra from different samples. In this paper, we present Geena 2, a public software tool for the automated execution of these pre-processing steps for MALDI/TOF spectra. Geena 2 has been developed in a Linux-Apache-MySQL-PHP web development environment, with scripts in PHP and Perl. Input and output are managed as simple formats that can be consumed by any database system and spreadsheet software. Input data may also be stored in a MySQL database. Processing methods are based on original heuristic algorithms which are introduced in the paper. Three simple and intuitive web interfaces are available: the Standard Search Interface, which allows complete control over all parameters; the Bright Search Interface, which leaves the user the possibility to tune parameters for alignment of spectra; and the Quick Search Interface, which limits the number of parameters to a minimum by using default values for the majority of parameters. Geena 2 has been utilized, in conjunction with a statistical analysis tool, in three published experimental works: a proteomic study on the effects of long-term cryopreservation on the low molecular weight fraction of serum proteome, and two retrospective serum proteomic studies, one on the risk of developing breast cancer in patients affected by gross cystic disease of the breast (GCDB) and the other for the identification of a predictor of ...
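
    Two of the pre-processing steps listed above, normalization against an internal standard and averaging of replicate spectra, can be sketched in a few lines. The following Python example uses synthetic Gaussian peaks on a shared m/z axis; the peak positions, the internal standard at m/z 1000, and the tolerance window are assumptions for illustration, not Geena 2's actual heuristics.

```python
import numpy as np

def normalize_to_standard(mz, intensity, standard_mz, tol=0.5):
    """Scale a spectrum so the internal-standard peak has unit height."""
    window = np.abs(mz - standard_mz) < tol
    return intensity / intensity[window].max()

# Two replicate spectra of one sample on a shared m/z axis (synthetic).
mz = np.linspace(800, 4000, 32001)              # 0.1 m/z grid spacing

def peak(center, height):
    return height * np.exp(-0.5 * ((mz - center) / 0.8) ** 2)

rep1 = peak(1000, 5.0) + peak(2500, 2.0) + 0.05   # 0.05 = flat baseline
rep2 = peak(1000, 4.0) + peak(2500, 1.7) + 0.05

# Normalize each replicate against the internal standard at m/z 1000,
# then average the replicates into a single spectrum for the sample.
normalized = [normalize_to_standard(mz, r, standard_mz=1000.0) for r in (rep1, rep2)]
averaged = np.mean(normalized, axis=0)
idx = np.argmin(np.abs(mz - 2500.0))
print(f"relative intensity at m/z 2500: {averaged[idx]:.2f}")
```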

  5. Semi-automated method for brain hematoma and edema quantification using computed tomography.

    Science.gov (United States)

    Bardera, A; Boada, I; Feixas, M; Remollo, S; Blasco, G; Silva, Y; Pedraza, S

    2009-06-01

    In this paper, a semi-automated method for brain hematoma and edema segmentation, and volume measurement using computed tomography imaging is presented. This method combines a region growing approach to segment the hematoma and a level set segmentation technique to segment the edema. The main novelty of this method is the strategy applied to define the propagation function required by the level set approach. To evaluate the method, 18 patients with brain hematoma and edema of different size, shape and location were selected. The obtained results demonstrate that the proposed approach provides objective and reproducible segmentations that are similar to the manually obtained results. Moreover, the processing time of the proposed method is about 4 min compared to the 10 min required for manual segmentation.
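
    A minimal sketch of the region-growing stage (the hematoma step) is shown below, assuming a single seed and a running-mean intensity criterion; the propagation function the authors designed for the level-set edema step is not reproduced here.

```python
import numpy as np
from collections import deque

def region_grow(img, seed, tol):
    """Flood-fill style region growing: add 4-connected neighbors whose
    intensity is within `tol` of the running region mean."""
    h, w = img.shape
    mask = np.zeros((h, w), dtype=bool)
    queue = deque([seed])
    mask[seed] = True
    region_sum, region_n = float(img[seed]), 1
    while queue:
        y, x = queue.popleft()
        for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
            if 0 <= ny < h and 0 <= nx < w and not mask[ny, nx]:
                if abs(img[ny, nx] - region_sum / region_n) <= tol:
                    mask[ny, nx] = True
                    region_sum += img[ny, nx]
                    region_n += 1
                    queue.append((ny, nx))
    return mask

# Synthetic "CT slice": bright blob (hematoma-like) on a darker background.
img = np.full((64, 64), 40.0)
img[20:35, 20:35] = 70.0
mask = region_grow(img, seed=(27, 27), tol=10.0)
print(f"segmented area (pixels): {mask.sum()}")   # -> 225 (15 x 15 blob)
```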

  6. Traitement automatique et apprentissage des langues (Automated Discourse Analysis and Language Teaching).

    Science.gov (United States)

    Garrigues, Mylene

    1992-01-01

    Issues in computerized analysis of language usage are discussed, focusing on the problems encountered as computers, linguistics, and language teaching converge. The tools of automated language and error analysis are outlined and specific problems are illustrated in several types of classroom exercise. (MSE)

  7. Use of computed tomography and automated software for quantitative analysis of the vasculature of patients with pulmonary hypertension

    Energy Technology Data Exchange (ETDEWEB)

    Wada, Danilo Tadao; Pádua, Adriana Ignácio de; Lima Filho, Moyses Oliveira; Marin Neto, José Antonio; Elias Júnior, Jorge; Baddini-Martinez, José; Santos, Marcel Koenigkam, E-mail: danilowada@yahoo.com.br [Universidade de São Paulo (HCFMRP/USP), Ribeirão Preto, SP (Brazil). Faculdade de Medicina. Hospital das Clínicas

    2017-11-15

    Objective: To perform a quantitative analysis of the lung parenchyma and pulmonary vasculature of patients with pulmonary hypertension (PH) on computed tomography angiography (CTA) images, using automated software. Materials and Methods: We retrospectively analyzed the CTA findings and clinical records of 45 patients with PH (17 males and 28 females), in comparison with a control group of 20 healthy individuals (7 males and 13 females); the mean age differed significantly between the two groups (53 ± 14.7 vs. 35 ± 9.6 years; p = 0.0001). Results: The automated analysis showed that, in comparison with the controls, the patients with PH showed lower 10th-percentile values for lung density, higher vascular volumes in the right upper lung lobe, and higher vascular volume ratios between the upper and lower lobes. In our quantitative analysis, we found no differences among the various PH subgroups. We inferred that a difference in the 10th-percentile values indicates areas of hypovolaemia in patients with PH and that a difference in pulmonary vascular volumes indicates redistribution of the pulmonary vasculature and an increase in pulmonary vascular resistance. Conclusion: Automated analysis of pulmonary vessels on CTA images revealed alterations and could represent an objective diagnostic tool for the evaluation of patients with PH. (author)

  8. Development of a semi-automated method for mitral valve modeling with medial axis representation using 3D ultrasound.

    Science.gov (United States)

    Pouch, Alison M; Yushkevich, Paul A; Jackson, Benjamin M; Jassar, Arminder S; Vergnat, Mathieu; Gorman, Joseph H; Gorman, Robert C; Sehgal, Chandra M

    2012-02-01

    Precise 3D modeling of the mitral valve has the potential to improve our understanding of valve morphology, particularly in the setting of mitral regurgitation (MR). Toward this goal, the authors have developed a user-initialized algorithm for reconstructing valve geometry from transesophageal 3D ultrasound (3D US) image data. Semi-automated image analysis was performed on transesophageal 3D US images obtained from 14 subjects with MR ranging from trace to severe. Image analysis of the mitral valve at midsystole had two stages: user-initialized segmentation and 3D deformable modeling with continuous medial representation (cm-rep). Semi-automated segmentation began with user identification of valve location in 2D projection images generated from 3D US data. The mitral leaflets were then automatically segmented in 3D using the level set method. Second, a bileaflet deformable medial model was fitted to the binary valve segmentation by Bayesian optimization. The resulting cm-rep provided a visual reconstruction of the mitral valve, from which localized measurements of valve morphology were automatically derived. The features extracted from the fitted cm-rep included annular area, annular circumference, annular height, intercommissural width, septolateral length, total tenting volume, and percent anterior tenting volume. These measurements were compared to those obtained by expert manual tracing. Regurgitant orifice area (ROA) measurements were compared to qualitative assessments of MR severity. The accuracy of valve shape representation with cm-rep was evaluated in terms of the Dice overlap between the fitted cm-rep and its target segmentation. The morphological features and anatomic ROA derived from semi-automated image analysis were consistent with manual tracing of 3D US image data and with qualitative assessments of MR severity made on clinical radiology. The fitted cm-reps accurately captured valve shape and demonstrated patient-specific differences in valve ...

  9. Network meta-analysis using R: a review of currently available automated packages.

    Directory of Open Access Journals (Sweden)

    Binod Neupane

    Full Text Available Network meta-analysis (NMA), a statistical technique that allows simultaneous comparison of multiple treatments in the same meta-analysis, has become increasingly popular in the medical literature in recent years. The statistical methodology underpinning this technique and the software tools for implementing it are evolving. Both commercial and freely available statistical software packages have been developed to facilitate the statistical computations of NMA, with varying degrees of functionality and ease of use. This paper aims to introduce the reader to three freely available R packages, namely gemtc, pcnetmeta, and netmeta. Each automates the process of performing NMA so that users can perform the analysis with minimal computational effort. We present, compare and contrast the availability and functionality of different important features of NMA in these three packages so that clinical investigators and researchers can determine which R package to use depending on their analysis needs. Four summary tables detailing (i) data input and network plotting, (ii) modeling options, (iii) assumption checking and diagnostic testing, and (iv) inference and reporting tools are provided, along with an analysis of a previously published dataset to illustrate the outputs available from each package. We demonstrate that each of the three packages provides a useful set of tools which, combined, give users nearly all the functionality that might be desired when conducting an NMA.

  10. Web-based automation of green building rating index and life cycle cost analysis

    Science.gov (United States)

    Shahzaib Khan, Jam; Zakaria, Rozana; Aminuddin, Eeydzah; IzieAdiana Abidin, Nur; Sahamir, Shaza Rina; Ahmad, Rosli; Nafis Abas, Darul

    2018-04-01

    The sudden decline in financial markets and the economic meltdown have slowed adoption and lowered investor interest in green-certified buildings because of their higher initial costs. It is therefore essential to attract investors to further development of green buildings through automated tools for construction projects. Historically, however, there has been a dearth of automation in green building rating tools, an essential gap that motivates the development of an automated computerized programming tool. This paper presents proposed research aiming to develop an integrated, web-based automated program that combines a green building rating assessment tool (MyCrest), green technology, and life cycle cost (LCC) analysis. It also identifies the variables of MyCrest and LCC to be integrated and developed into a framework and then transformed into an automated program. A mixed qualitative and quantitative survey methodology is planned to carry the MyCrest-LCC integration to an automated level. In this study, the preliminary literature review provides a better understanding of the integration of Green Building Rating Tools (GBRT) with LCC. The outcome of this research paves the way for future researchers to integrate other efficient tools and parameters that contribute towards green buildings and future agendas.

  11. An Automated Method for Landmark Identification and Finite-Element Modeling of the Lumbar Spine.

    Science.gov (United States)

    Campbell, Julius Quinn; Petrella, Anthony J

    2015-11-01

    The purpose of this study was to develop a method for the automated creation of finite-element models of the lumbar spine. Custom scripts were written to extract bone landmarks of lumbar vertebrae and assemble L1-L5 finite-element models. End-plate borders, ligament attachment points, and facet surfaces were identified. Landmarks were identified so as to maintain mesh correspondence between models for later use in statistical shape modeling. Ninety lumbar vertebrae were processed, creating 18 subject-specific finite-element models. Finite-element model surfaces and ligament attachment points were reproduced within 1e-5 mm of the bone surface, including the critical contact surfaces of the facets. Element quality exceeded specifications in 97% of elements for the 18 models created. The current method is capable of producing subject-specific finite-element models of the lumbar spine with good accuracy, quality, and robustness. The automated methods developed represent an advancement in the state of the art of subject-specific lumbar spine modeling, to a scale not possible with prior manual and semi-automated methods.

  12. Automated quantification and analysis of facial asymmetry in children with arthritis in the temporomandibular joint

    DEFF Research Database (Denmark)

    Darvann, Tron A.; Hermann, Nuno V.; Demant, Sune

    2011-01-01

    We present an automated method of spatially detailed 3D asymmetry quantification of face surfaces obtained in a stereophotogrammetric system, and the method was applied to a population of children with juvenile idiopathic arthritis (JIA) who have involvement of one temporomandibular joint (TMJ) ...

  13. Automated analysis of pumping tests; Analise automatizada de testes de bombeamento

    Energy Technology Data Exchange (ETDEWEB)

    Sugahara, Luiz Alberto Nozaki

    1996-01-01

    An automated procedure for the analysis of pumping test data performed in groundwater wells is described. A computer software package was developed to be used under the Windows operating system. The software allows the choice of 3 mathematical models for representing the aquifer behavior: confined aquifer (Theis model); leaky aquifer (Hantush model); and unconfined aquifer (Boulton model). The analysis of pumping test data using the proper aquifer model allows for the determination of model parameters such as transmissivity, storage coefficient, leakage coefficient and delay index. The computer program can be used for the analysis of data obtained from both pumping tests, with one or more pumping rates, and recovery tests. In the multiple-rate case, a desuperposition procedure has been implemented in order to obtain the equivalent aquifer response for the first flow rate, which is used in obtaining an initial estimate of the model parameters. Such an initial estimate is required in the non-linear regression analysis method. The solutions to the partial differential equations describing the aquifer behavior were obtained in Laplace space, followed by numerical inversion of the transformed solution using the Stehfest algorithm. The data analysis procedure is based on a non-linear regression method, matching the field data to the theoretical response of a selected aquifer model for a given type of test. A least-squares regression method was implemented using either Gauss-Newton or Levenberg-Marquardt procedures for minimization of an objective function. The computer software can also be applied to multiple-rate test data in order to determine the non-linear well coefficient, allowing for the computation of the well inflow performance curve. (author)
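
    For the confined-aquifer (Theis) case, the non-linear regression step can be illustrated directly, since the Theis well function W(u) is the exponential integral E1. The sketch below fits transmissivity T and storage coefficient S to synthetic drawdown data with SciPy's curve_fit (a trust-region least-squares solver, in the same spirit as the Gauss-Newton/Levenberg-Marquardt options described above); the pumping rate, well distance, and noise model are assumed values.

```python
import numpy as np
from scipy.special import exp1
from scipy.optimize import curve_fit

Q = 0.01      # pumping rate, m^3/s (assumed)
r = 30.0      # observation-well distance, m (assumed)

def theis_drawdown(t, T, S):
    """Theis (confined aquifer) solution: s = Q / (4 pi T) * W(u),
    with u = r^2 S / (4 T t) and W(u) the exponential integral E1."""
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)

# Synthetic observed drawdowns for T = 1e-3 m^2/s, S = 1e-4, plus noise.
t_obs = np.logspace(1, 5, 30)                       # 10 s .. ~28 h
rng = np.random.default_rng(3)
s_obs = theis_drawdown(t_obs, 1e-3, 1e-4) * (1 + 0.02 * rng.normal(size=30))

(T_fit, S_fit), _ = curve_fit(theis_drawdown, t_obs, s_obs,
                              p0=(1e-4, 1e-5), bounds=(1e-8, 1.0))
print(f"T = {T_fit:.2e} m^2/s, S = {S_fit:.2e}")
```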

  14. Automated Spectral Analysis, the Virtual Observatory and Computational Grids

    Science.gov (United States)

    Jeffery, C. S.

    The newest generation of telescopes and detectors and the facilities like the Virtual Observatory (VO) are delivering vast volumes of astronomical data and creating increasing demands for their analysis and interpretation. Methods for such analyses rely heavily on computer-generated models of growing sophistication and realism. These pose two problems. First, simulations are carried out at increasingly high spatial and temporal resolution and physical dimension. Second, the dimensionality of parameter-search space continues to grow. Major computational problems include ensuring that parameter-space volumes to be searched are physically interesting and to match observational data efficiently and without overloading the computational infrastructure. For the analysis of highly-evolved hot stars, we have developed a toolkit for the modelling of stellar atmospheres and stellar spectra. We can automatically fit observed flux distributions and/or high-resolution spectra and solve for a wide range of atmospheric parameters for both single and binary stars. The software represents a prototype for generic toolkits that could facilitate data analysis within, for example, the VO. We introduce a proposal to integrate a range of such toolkits within a heterogeneous network (such as the VO) so as to facilitate data analysis. For example, functions will be required to combine new observations with data from established archives. A goal-seeking algorithm will use this data to guide a sequence of theoretical calculations. These simulations may need to retrieve data from other sources, atomic data, pre-computed model atmospheres and so on. Such applications using widely distributed and heterogeneous resources will require the emerging technologies of computational grids.

  15. Evaluation of automated nucleic acid extraction methods for virus detection in a multicenter comparative trial

    DEFF Research Database (Denmark)

    Rasmussen, Thomas Bruun; Uttenthal, Åse; Hakhverdyan, M.

    2009-01-01

    ... between the results obtained for the different automated extraction platforms. In particular, the limit of detection was identical for the 9/12 and 8/12 best-performing robots (using dilutions of BVDV-infected serum and cell culture material, respectively), which was similar to a manual extraction method used ... for comparison. The remaining equipment and protocols used were less sensitive, in an extreme case for serum, by a factor of 1000. There was no evidence for cross-contamination of RNA template in any of the negative samples included in these panels. These results are not intended to replace local optimisation ...

  16. A simple method for validation and verification of pipettes mounted on automated liquid handlers

    OpenAIRE

    Stangegaard, Michael; Hansen, Anders Johannes; Frøslev, Tobias Guldberg; Morling, Niels

    2009-01-01

    We have implemented a simple method for validation and verification of the performance of pipettes mounted on automated liquid handlers, as necessary for laboratories accredited under ISO 17025. An 8-step serial dilution of Orange G was prepared in quadruplicate in a flat-bottom 96-well microtiter plate (BD Falcon) manually, by means of calibrated pipettes. Each pipette of the liquid handler (1 up to 8) dispensed a selected volume (1 to 200 µl) of Orange G 8 times into the wells of the microti...

  17. Deriving pathway maps from automated text analysis using a grammar-based approach.

    Science.gov (United States)

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine a reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.

  18. Methods of nonlinear analysis

    CERN Document Server

    Bellman, Richard Ernest

    1970-01-01

    In this book, we study theoretical and practical aspects of computing methods for the mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques, including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat...

  19. How automated image analysis techniques help scientists in species identification and classification?

    Science.gov (United States)

    Yousef Kalafi, Elham; Town, Christopher; Kaur Dhillon, Sarinder

    2017-09-04

    Identification of taxonomy at a specific level is time-consuming and reliant upon expert ecologists; hence, the demand for automated species identification has increased over the last two decades. Automation of data classification is primarily focussed on images, and incorporating and analysing image data has recently become easier due to developments in computational technology. Research efforts in species identification include processing specimen images, extracting identifying features, and classifying specimens into the correct categories. In this paper, we discuss recent automated species identification systems, categorizing and evaluating their methods. We reviewed and compared the different methods in a step-by-step scheme for automated identification and classification systems of species images. The selection of methods is influenced by many variables, such as the level of classification, the number of training data and the complexity of the images. The aim of this paper is to provide researchers and scientists with an extensive background study on work related to automated species identification, focusing on pattern recognition techniques in building such systems for biodiversity studies.

  20. Performance Analysis of Wireless Networks for Industrial Automation-Process Automation (WIA-PA)

    Science.gov (United States)

    2017-09-01

  1. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design: a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  2. A semi-automated method for measuring femoral shape to derive version and its comparison with existing methods.

    Science.gov (United States)

    Berryman, F; Pynsent, P; McBryde, C

    2014-11-01

    The measurement of femoral version is important in surgical planning of derotational osteotomies particularly for patients with proximal femoral deformity. It is, however, difficult to measure version accurately and differences of 10° to 15° have been found between repeated measurements. The aim of this work was first to develop a method of measuring femoral version angle where the definition of the neck axis is based on the three-dimensional point cloud making up the neck, second to automate many of the processes involved thus reducing the influence of human error and third to ensure the method could run on freely available software suitable for most computer platforms. A CT scan was performed on 44 cadaveric femurs to generate point clouds of the femoral surfaces. The point clouds were then analysed semi-automatically to determine femoral version angle between a neck axis defined by the bone surface points belonging only to the neck and a femoral condylar axis. The results from the neck fitting method were compared against three other methods typically used in the clinic (Murphy, Reikeras and Lee methods). Version angle measured by the new method gave 19.1° ± 7.3° (mean ± standard deviation) for the set of cadaveric femurs, 3.5° lower than the Murphy method and 6.8° and 11.0° higher than the Reikeras and Lee 2D methods respectively. The results demonstrate a method of measuring femoral version angle incorporating a high level of automation running on free software. Copyright © 2014 John Wiley & Sons, Ltd.
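
    A point-cloud-based neck axis of the kind described above can be approximated by the first principal component of the neck surface points. The following sketch computes a version angle as the angle between the neck and condylar axes after projecting both onto the plane perpendicular to the shaft axis; the synthetic point cloud, the PCA axis fit, and the projection convention are illustrative assumptions, not the authors' exact definition.

```python
import numpy as np

def principal_axis(points):
    """First principal component of a 3D point cloud (unit vector)."""
    centered = points - points.mean(axis=0)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[0]

def version_angle(neck_pts, condylar_axis, shaft_axis):
    """Angle between neck and condylar axes after projection onto the
    plane perpendicular to the shaft axis (an 'axial view' convention)."""
    shaft = shaft_axis / np.linalg.norm(shaft_axis)
    def project(v):
        v = v - np.dot(v, shaft) * shaft
        return v / np.linalg.norm(v)
    n, c = project(principal_axis(neck_pts)), project(condylar_axis)
    return np.degrees(np.arccos(np.clip(abs(np.dot(n, c)), -1.0, 1.0)))

# Synthetic neck point cloud tilted ~15 degrees from the condylar axis.
rng = np.random.default_rng(4)
theta = np.radians(15.0)
axis = np.array([np.cos(theta), np.sin(theta), 0.0])
neck_pts = np.outer(rng.uniform(0, 40, 500), axis) + rng.normal(0, 1.0, (500, 3))
angle = version_angle(neck_pts, np.array([1.0, 0, 0]), np.array([0, 0, 1.0]))
print(f"version: {angle:.1f} deg")
```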

  3. SU-E-T-139: Automated Daily EPID Exit Dose Analysis Uncovers Treatment Variations

    Energy Technology Data Exchange (ETDEWEB)

    Olch, A [University of Southern California, Los Angeles, CA (United States)

    2015-06-15

    Purpose: To evaluate a fully automated EPID exit dose system for its ability to detect daily treatment deviations including patient setup, delivery, and anatomy changes. Methods: PerFRACTION (Sun Nuclear Corporation) is software that uses integrated EPID images taken during patient treatment, automatically pulled from the Aria database, and analyzed based on user-defined comparisons. This was used to monitor 20 plans consisting of a total of 859 fields for 18 patients, for a total of 251 fractions. Nine VMAT, 5 IMRT, and 6 3D plans were monitored. Gamma analysis was performed for each field within a plan, comparing the first fraction against each of the other fractions in each treatment course. A 2% dose difference, 1 mm distance-to-agreement, and 10% dose threshold were used. These tight tolerances were chosen to achieve high sensitivity to treatment variations. A field passed if 93% of the pixels had a Gamma of 1 or less. Results: Twenty-nine percent of the fields failed. The average plan passing rate was 92.5%. The average passing rate for 3D plans (84%) was lower than for VMAT and IMRT plans (96.2% on average). When fields failed, an investigation revealed changes in patient anatomy or setup variations, often also leading to variations of transmission through immobilization devices. Conclusion: PerFRACTION is a fully automated system for determining daily changes in dose transmission through the patient that requires no effort other than deploying the imager panel during treatment. A surprising number of fields failed the analysis, attributable to important treatment variations that would otherwise not be appreciated. Further study of inter-fraction treatment variations is possible and warranted. Sun Nuclear Corporation provided a license to the software described.
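
    For readers unfamiliar with the Gamma pass-rate criterion quoted above (2%/1 mm, 10% threshold, 93% of pixels passing), the following simplified Python sketch computes a brute-force global gamma pass rate on synthetic 2D dose maps. It omits interpolation and other refinements of clinical gamma implementations, so it illustrates the metric rather than reproducing PerFRACTION's analysis.

```python
import numpy as np

def gamma_pass_rate(ref, ev, spacing_mm, dd=0.02, dta_mm=1.0, thresh=0.10):
    """Brute-force global 2D gamma (no interpolation): for each reference
    pixel above the dose threshold, search a window of evaluated pixels
    for the minimum combined dose-difference/distance metric."""
    search = int(np.ceil(3 * dta_mm / spacing_mm))
    d_max = ref.max()
    h, w = ref.shape
    passed = total = 0
    for y in range(h):
        for x in range(w):
            if ref[y, x] < thresh * d_max:
                continue                      # below the low-dose threshold
            total += 1
            best = np.inf
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < h and 0 <= xx < w:
                        dose = (ev[yy, xx] - ref[y, x]) / (dd * d_max)
                        dist2 = (spacing_mm ** 2) * (dy * dy + dx * dx) / dta_mm ** 2
                        best = min(best, dose * dose + dist2)
            passed += np.sqrt(best) <= 1.0    # gamma <= 1 means the pixel passes
    return 100.0 * passed / total

rng = np.random.default_rng(5)
ref = np.clip(rng.normal(100.0, 2.0, (40, 40)), 0.0, None)
ev = ref * (1 + 0.005 * rng.normal(size=ref.shape))   # mimic a small delivery change
print(f"gamma pass rate: {gamma_pass_rate(ref, ev, spacing_mm=0.5):.1f}%")
```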

  4. Automating annotation of information-giving for analysis of clinical conversation.

    Science.gov (United States)

    Mayfield, Elijah; Laws, M Barton; Wilson, Ira B; Penstein Rosé, Carolyn

    2014-02-01

    Coding of clinical communication for fine-grained features such as speech acts has produced a substantial literature. However, annotation by humans is laborious and expensive, limiting application of these methods. We aimed to show that through machine learning, computers could code certain categories of speech acts with sufficient reliability to make useful distinctions among clinical encounters. The data were transcripts of 415 routine outpatient visits of HIV patients which had previously been coded for speech acts using the Generalized Medical Interaction Analysis System (GMIAS); 50 had also been coded for larger scale features using the Comprehensive Analysis of the Structure of Encounters System (CASES). We aggregated selected speech acts into information-giving and requesting, then trained the machine to automatically annotate using logistic regression classification. We evaluated reliability by per-speech act accuracy. We used multiple regression to predict patient reports of communication quality from post-visit surveys using the patient and provider information-giving to information-requesting ratio (briefly, information-giving ratio) and patient gender. Automated coding produces moderate reliability with human coding (accuracy 71.2%, κ=0.57), with high correlation between machine and human prediction of the information-giving ratio (r=0.96). The regression significantly predicted four of five patient-reported measures of communication quality (r=0.263-0.344). The information-giving ratio is a useful and intuitive measure for predicting patient perception of provider-patient communication quality. These predictions can be made with automated annotation, which is a practical option for studying large collections of clinical encounters with objectivity, consistency, and low cost, providing greater opportunity for training and reflection for care providers.
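
    The core analytic moves, classifying utterances with logistic regression and summarizing a visit by its information-giving ratio, can be sketched as follows. The utterances, labels, and bag-of-words features below are invented toy data, a simplification rather than the authors' feature representation.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy utterances labeled information-giving (1) vs. requesting (0);
# the real work trained on GMIAS-coded transcripts, not these examples.
utterances = [
    "your viral load is undetectable", "the dose is one tablet daily",
    "this medicine can upset your stomach", "lab results came back normal",
    "how have you been sleeping", "any side effects since last visit",
    "when did the pain start", "do you take it with food",
]
labels = [1, 1, 1, 1, 0, 0, 0, 0]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression())
clf.fit(utterances, labels)

# Annotate a new "visit" and compute the information-giving ratio.
visit = ["your blood pressure looks good", "any headaches lately",
         "keep taking the statin at night"]
pred = clf.predict(visit)
giving, requesting = (pred == 1).sum(), (pred == 0).sum()
print(f"information-giving ratio: {giving / max(requesting, 1):.2f}")
```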

  5. Widely applicable MATLAB routines for automated analysis of saccadic reaction times.

    Science.gov (United States)

    Leppänen, Jukka M; Forssman, Linda; Kaatiala, Jussi; Yrttiaho, Santeri; Wass, Sam

    2015-06-01

    Saccadic reaction time (SRT) is a widely used dependent variable in eye-tracking studies of human cognition and its disorders. SRTs are also frequently measured in studies with special populations, such as infants and young children, who are limited in their ability to follow verbal instructions and remain in a stable position over time. In this article, we describe a library of MATLAB routines (Mathworks, Natick, MA) that are designed to (1) enable completely automated implementation of SRT analysis for multiple data sets and (2) cope with the unique challenges of analyzing SRTs from eye-tracking data collected from poorly cooperating participants. The library includes preprocessing and SRT analysis routines. The preprocessing routines (i.e., moving median filter and interpolation) are designed to remove technical artifacts and missing samples from raw eye-tracking data. The SRTs are detected by a simple algorithm that identifies the last point of gaze in the area of interest, but, critically, the extracted SRTs are further subjected to a number of postanalysis verification checks to exclude values contaminated by artifacts. Example analyses of data from 5- to 11-month-old infants demonstrated that SRTs extracted with the proposed routines were in high agreement with SRTs obtained manually from video records, robust against potential sources of artifact, and exhibited moderate to high test-retest stability. We propose that the present library has wide utility in standardizing and automating SRT-based cognitive testing in various populations. The MATLAB routines are open source and can be downloaded from http://www.uta.fi/med/icl/methods.html
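
    A Python sketch of the analogous logic (the original routines are MATLAB) is shown below: a NaN-tolerant moving median filter for preprocessing, followed by SRT extraction as the last point of gaze inside the area of interest. The 1D gaze signal, AOI bounds, and sampling rate are invented for illustration, and the post-analysis verification checks described above are omitted.

```python
import numpy as np

def moving_median(x, k=5):
    """Moving median filter that tolerates NaNs (missing samples)."""
    pad = k // 2
    padded = np.pad(x, pad, mode="edge")
    return np.array([np.nanmedian(padded[i:i + k]) for i in range(len(x))])

def saccadic_rt(gaze_x, aoi=(0.4, 0.6), fs=300.0):
    """SRT from the last point of gaze inside a (1D) area of interest;
    returns NaN if the trial starts outside the AOI or gaze never leaves."""
    inside = (gaze_x >= aoi[0]) & (gaze_x <= aoi[1])
    if not inside[0] or inside.all():
        return np.nan
    leave = np.argmax(~inside)               # first sample outside the AOI
    return (leave - 1) / fs * 1000.0         # ms since trial onset

fs = 300.0
t = np.arange(0, 0.5, 1 / fs)
gaze = np.where(t < 0.25, 0.5, 0.9)          # gaze shifts targets at 250 ms
gaze[30] = np.nan                            # simulate a dropped sample
gaze = moving_median(gaze)                   # preprocessing: clean artifacts
print(f"SRT: {saccadic_rt(gaze, fs=fs):.0f} ms")   # ~247 ms
```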

  6. Are the new automated methods for bone age estimation advantageous over the manual approaches?

    Science.gov (United States)

    De Sanctis, Vincenzo; Soliman, Ashraf T; Di Maio, Salvatore; Bedair, Said

    2014-12-01

    Bone Age Assessment (BAA) is performed worldwide for the evaluation of endocrine, genetic and chronic diseases, to monitor response to medical therapy and to determine the growth potential of children and adolescents. It is also used for consultation in planning orthopedic procedures, for determination of chronological age for adopted children, for youth sports participation and in forensic settings. The main clinical methods for skeletal bone age estimation are the Greulich and Pyle (GP) and the Tanner and Whitehouse (TW) methods. Seventy-six per cent (76%) of radiologists or pediatricians usually use the method of GP, 20% that of TW and 4% other methods. The advantages of using the TW method, as opposed to the GP method, are that it overcomes the subjectivity problem and results are more reproducible. However, it is complex and time-consuming; for this reason its usage is just about 20% on a worldwide scale. Moreover, there is some evidence that bone age assignments by different physicians can differ significantly. Computerized and Quantitative Ultrasound Technologies (QUS) for assessing skeletal maturity have been developed with the aim of reducing many of the inconsistencies associated with radiographic investigations. In spite of the fact that the volume of automated methods for BAA has increased, the majority of them are still in an early phase of development. QUS is comparable to the GP-based method, but there is not enough established data yet for the healthy population. The authors wish to draw attention to the accuracy, reliability and consistency of BAA and to initiate a debate on manual versus automated approaches to enhance our assessment of skeletal maturation in children and adolescents.

  7. Automated Segmentation of Coronary Arteries Based on Statistical Region Growing and Heuristic Decision Method

    Directory of Open Access Journals (Sweden)

    Yun Tian

    2016-01-01

    Full Text Available The segmentation of coronary arteries is a vital process that helps cardiovascular radiologists detect and quantify stenosis. In this paper, we propose a fully automated coronary artery segmentation method for cardiac CTA volumes. The method is built on statistical region growing together with a heuristic decision step. First, the heart region is extracted using a multi-atlas-based approach. Second, the vessel structures are enhanced via a 3D multiscale line filter. Next, seed points are detected automatically through threshold preprocessing and a subsequent morphological operation. Based on the set of detected seed points, statistics-based region growing is applied. Finally, results are obtained by setting conservative parameters. A heuristic decision method is then used to obtain the desired result automatically, because the region-growing parameters vary between patients and the segmentation requires full automation. The experiments are carried out on a dataset that includes eight-patient multivendor cardiac computed tomography angiography (CTA) volume data. The DICE similarity index, mean distance, and Hausdorff distance metrics are employed to compare the proposed algorithm with two state-of-the-art methods. Experimental results indicate that the proposed algorithm is capable of performing complete, robust, and accurate extraction of coronary arteries.
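
    The enhancement and seed-detection stages can be illustrated with standard tools: scikit-image provides a multiscale Frangi vesselness filter, and SciPy's ndimage covers threshold clean-up and labeling. The 2D synthetic image below stands in for a cardiac CTA slice; the filter scales and the threshold are arbitrary assumptions, and the statistical region-growing and heuristic decision stages are not reproduced.

```python
import numpy as np
from scipy import ndimage
from skimage.filters import frangi

# Synthetic 2D slice: a bright, curved, vessel-like structure.
img = np.zeros((128, 128))
rows = np.arange(20, 108)
img[rows, (64 + 15 * np.sin(rows / 12.0)).astype(int)] = 1.0
img = ndimage.gaussian_filter(img, sigma=1.5)

# Multiscale tubular enhancement (Frangi vesselness) for bright structures.
vesselness = frangi(img, sigmas=range(1, 4), black_ridges=False)

# Automatic seed detection: threshold, then a morphological clean-up.
seeds = vesselness > 0.5 * vesselness.max()
seeds = ndimage.binary_closing(seeds, structure=np.ones((3, 3)))
_, n_seeds = ndimage.label(seeds)
print(f"{n_seeds} seed region(s) detected")
```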

  8. A Novel Method for the Separation of Overlapping Pollen Species for Automated Detection and Classification.

    S