WorldWideScience

Sample records for automated analysis method

  1. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages: (1) allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form; (2) eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards; (3) lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling to achieve sound results; and (4) providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: (1) ASTM C1340/C1340M-10, Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program; (2) ASTM F2815, Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program; and (3) ASTM E2807, Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods used to ensure the solution validity of equations included in test standards. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  2. Method 365.5 Determination of Orthophosphate in Estuarine and Coastal Waters by Automated Colorimetric Analysis

    Science.gov (United States)

    This method provides a procedure for the determination of low-level orthophosphate concentrations normally found in estuarine and/or coastal waters. It is based upon the method of Murphy and Riley [1], adapted for automated segmented flow analysis [2], in which the two reagent solutions ...

  3. A performance analysis system for MEMS using automated imaging methods

    Energy Technology Data Exchange (ETDEWEB)

    LaVigne, G.F.; Miller, S.L.

    1998-08-01

    The ability to make in-situ performance measurements of MEMS operating at high speeds has been demonstrated using a new image analysis system. Significant improvements in performance and reliability have directly resulted from the use of this system.

  4. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  5. Automation of semen analysis using flow cytometer in comparison with manual methods.

    Science.gov (United States)

    Saleh, Mohamed; Fathy, Amal; El-Akras, Atef I; Eyada, Mostafa M; Younes, Soha; El-Gohary, Ahmed M

    2005-01-01

    In order to standardize techniques and limit the effect of human factors on the results of analyses of biological fluids, automation seems to be mandatory. In an attempt to automate semen analysis, the computer-assisted sperm analysis (CASA) system has been developed; however, its use is still limited and its practical application has drawn many criticisms. In a trial to automate semen analysis, this study aimed to evaluate the usefulness of the flow cytometer in the detection of some seminal parameters in comparison with the traditional manual methods. Isolated spermatogenic cells and isolated sperms from semen and EDTA blood of volunteers were analyzed by flow cytometer in order to define their respective regions. Ejaculates of 28 male patients were subjected to routine semen analyses, with leucocyte detection by the peroxidase test and by the monoclonal antibody CD53 using a flow cytometer, after preparation of the patients' semen samples for flow cytometric analysis. A highly significant correlation (r=0.96, p=0.001) was found between absolute neutrophils (pus cells) detected by peroxidase and by flow cytometer using the CD53 monoclonal antibody. A poor correlation (r=0.39, p=0.035) was found between sperm counts assessed by the manual technique and by flow cytometer, and a spurious sperm count of 1.08 million/ml was detected by flow cytometry in azoospermic patients. The flow cytometer could be used for the assessment of pus cells in semen, but it seems unreliable for the assessment of sperm count if gating depends on sperm size and granularity alone.

  6. The Use of Computer Simulation Methods to Reach Data for Economic Analysis of Automated Logistic Systems

    Science.gov (United States)

    Neradilová, Hana; Fedorko, Gabriel

    2016-12-01

    Automated logistic systems are becoming more widely used within enterprise logistics processes. Their main advantage is that they allow increasing the efficiency and reliability of logistics processes. In terms of evaluating their effectiveness, it is necessary to take into account the economic aspect of the entire process. However, many users ignore or underestimate this area, which is a mistake. One of the reasons why the economic aspect is overlooked is the fact that obtaining information for such an analysis is not easy. The aim of this paper is to present the possibilities of computer simulation methods for obtaining data for full-scale economic analysis implementation.

  7. An automated method for analysis of microcirculation videos for accurate assessment of tissue perfusion

    Directory of Open Access Journals (Sweden)

    Demir Sumeyra U

    2012-12-01

    Background: Imaging of the human microcirculation in real-time has the potential to detect injuries and illnesses that disturb the microcirculation at earlier stages and may improve the efficacy of resuscitation. Despite advanced imaging techniques to monitor the microcirculation, there are currently no tools for the near real-time analysis of the videos produced by these imaging systems. An automated tool that can extract microvasculature information and quantitatively monitor changes in tissue perfusion would be invaluable as a diagnostic and therapeutic endpoint for resuscitation. Methods: The experimental algorithm automatically extracts the microvascular network and quantitatively measures changes in the microcirculation. There are two main parts in the algorithm: video processing and vessel segmentation. Microcirculatory videos are first stabilized in a video processing step to remove motion artifacts. In the vessel segmentation process, the microvascular network is extracted using multiple-level thresholding and pixel verification techniques. Threshold levels are selected using histogram information from a set of training video recordings. Pixel-by-pixel differences are calculated throughout the frames to identify active blood vessels and capillaries with flow. Results: Sublingual microcirculatory videos were recorded from anesthetized swine at baseline and during hemorrhage using a hand-held Side-stream Dark Field (SDF) imaging device to track changes in the microvasculature during hemorrhage. Automatically segmented vessels in the recordings were analyzed visually, and the functional capillary density (FCD) values calculated by the algorithm were compared for both the healthy baseline and hemorrhagic conditions. These results were compared to independently made FCD measurements using a well-known semi-automated method. Results of the fully automated algorithm demonstrated a significant decrease of FCD values. Similar, but more variable FCD
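
    The frame-differencing step described in this record lends itself to a compact implementation. The following sketch (numpy only; the function name, threshold value, and FCD proxy are illustrative assumptions, not taken from the paper) flags pixels whose intensity fluctuates across stabilized frames, i.e. vessels with flow:

    ```python
    import numpy as np

    def active_vessel_mask(frames, thresh=10.0):
        """Flag pixels that vary across motion-stabilized grayscale frames.

        frames : ndarray (n_frames, h, w); pixels over perfused vessels
                 flicker as red cells pass, static tissue does not.
        thresh : mean absolute frame-to-frame difference cut-off (a free
                 parameter; the paper derives thresholds from training videos).
        """
        diffs = np.abs(np.diff(frames.astype(np.float32), axis=0))
        activity = diffs.mean(axis=0)      # per-pixel temporal activity map
        return activity > thresh           # True where flow is detected

    # A crude proxy for functional capillary density (FCD) is the fraction
    # of active pixels: fcd_proxy = active_vessel_mask(frames).mean()
    ```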

  8. A method for the automated detection of phishing websites through both site characteristics and image analysis

    Science.gov (United States)

    White, Joshua S.; Matthews, Jeanna N.; Stacy, John L.

    2012-06-01

    Phishing website analysis is largely still a time-consuming manual process of discovering potential phishing sites, verifying if suspicious sites truly are malicious spoofs and, if so, distributing their URLs to the appropriate blacklisting services. Attackers increasingly use sophisticated systems for bringing phishing sites up and down rapidly at new locations, making automated response essential. In this paper, we present a method for rapid, automated detection and analysis of phishing websites. Our method relies on near real-time gathering and analysis of URLs posted on social media sites. We fetch the pages pointed to by each URL and characterize each page with a set of easily computed values such as the number of images and links. We also capture a screen-shot of the rendered page image, compute a hash of the image, and use the Hamming distance between these image hashes as a form of visual comparison. We provide initial results demonstrating the feasibility of our techniques by comparing legitimate sites to known fraudulent versions from Phishtank.com, by actively introducing a series of minor changes to a phishing toolkit captured in a local honeypot, and by performing some initial analysis on a set of over 2.8 million URLs posted to Twitter over 4 days in August 2011. We discuss the issues encountered during our testing, such as the resolvability and legitimacy of URLs posted on Twitter, the data sets used, the characteristics of the phishing sites we discovered, and our plans for future work.
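
    The image-hash comparison is straightforward to prototype. The sketch below implements a common perceptual "average hash" with Pillow; the paper specifies only a hash plus Hamming distance, so the particular hash, the 8x8 size, and the 10-bit cut-off are assumptions for illustration:

    ```python
    from PIL import Image

    def average_hash(path, size=8):
        """64-bit perceptual hash: downscale, grayscale, threshold at the mean."""
        img = Image.open(path).convert("L").resize((size, size))
        pixels = list(img.getdata())
        mean = sum(pixels) / len(pixels)
        bits = 0
        for p in pixels:
            bits = (bits << 1) | int(p > mean)
        return bits

    def hamming(h1, h2):
        """Number of differing bits between two hashes."""
        return bin(h1 ^ h2).count("1")

    # A spoof and its legitimate original typically differ by a few bits,
    # unrelated pages by roughly half of the 64 bits:
    # if hamming(average_hash("suspect.png"), average_hash("bank.png")) < 10: ...
    ```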

  9. Automated Nanofiber Diameter Measurement in SEM Images Using a Robust Image Analysis Method

    Directory of Open Access Journals (Sweden)

    Ertan Öznergiz

    2014-01-01

    Due to their high surface area, porosity, and rigidity, applications of nanofibers and nanosurfaces have developed in recent years. Nanofibers and nanosurfaces are typically produced by the electrospinning method. In the production process, determination of the average fiber diameter is crucial for quality assessment. The average fiber diameter is determined by manually measuring the diameters of randomly selected fibers on scanning electron microscopy (SEM) images. However, as the number of images increases, manual fiber diameter determination becomes a tedious and time-consuming task as well as being sensitive to human error. Therefore, an automated fiber diameter measurement system is desired. In the literature, this task is achieved by using image analysis algorithms. Typically, these methods first isolate each fiber in the image and measure the diameter of each isolated fiber. Fiber isolation is an error-prone process. In this study, automated calculation of nanofiber diameter is achieved without fiber isolation, using image processing and analysis algorithms. The performance of the proposed method was tested on real data. The effectiveness of the proposed method is shown by comparing automatically and manually measured nanofiber diameter values.
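
    One way to measure diameters without isolating individual fibers (the sketch below is an assumption of this editor, not necessarily the authors' algorithm) is to binarize the SEM image, compute the Euclidean distance transform, and sample twice the distance along the fiber skeleton:

    ```python
    import numpy as np
    from scipy import ndimage
    from skimage.filters import threshold_otsu
    from skimage.morphology import skeletonize

    def fiber_diameters(sem_image, nm_per_pixel):
        """Per-pixel diameter samples along fiber centerlines.

        The distance transform gives each fiber pixel its distance to the
        background; on the one-pixel-wide skeleton this is the local fiber
        radius, so twice that value is the local diameter.
        """
        binary = sem_image > threshold_otsu(sem_image)
        dist = ndimage.distance_transform_edt(binary)
        skel = skeletonize(binary)
        return 2.0 * dist[skel] * nm_per_pixel

    # mean_diameter = fiber_diameters(img, nm_per_pixel=4.5).mean()
    ```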

  10. Serum 5'nucleotidase activity in rats: a method for automated analysis and criteria for interpretation.

    Science.gov (United States)

    Carakostas, Michael C.; Power, Richard J.; Banerjee, Asit K.

    1990-01-01

    A manual kit for determining serum 5'nucleotidase (5'NT, EC 3.1.3.5) activity was adapted for use with rat samples on a large discrete clinical chemistry analyzer. The precision of the method was good (within-run C.V. = 2.14%; between-run C.V. = 5.5%). A comparison of the new automated method with a manual and a semi-automated method gave regression statistics of y = 1.18x - 3.66 (Sy·x = 4.54) and y = 0.733x + 1.97 (Sy·x = 1.69), respectively. Temperature conversion factors provided by the kit manufacturer for human samples were determined to be inaccurate for converting results from rat samples. Analysis of components contributing to normal variation in rat serum 5'NT activity showed age and sex to be major factors. Increased serum 5'NT activity was observed in female rats compared to male rats, beginning at about 5 to 6 weeks of age. An analysis of variance of serum 5'NT, alkaline phosphatase, and GGT activities observed over a 9-week period in normal rats suggests several advantages for 5'NT as a predictor of biliary lesions in rats.

  11. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5, based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve these results.

  12. A scanning electron microscope method for automated, quantitative analysis of mineral matter in coal

    Energy Technology Data Exchange (ETDEWEB)

    Creelman, R.A.; Ward, C.R. [R.A. Creelman and Associates, Epping, NSW (Australia)]

    1996-07-01

    Quantitative mineralogical analysis has been carried out on a series of nine coal samples from Australia, South Africa and China using a newly-developed automated image analysis system coupled to a scanning electron microscope. The image analysis system (QEM*SEM) gathers X-ray spectra and backscattered electron data from a number of points on a conventional grain-mount polished section under the SEM, and interprets the data from each point in mineralogical terms. The cumulative data in each case were integrated to provide a volumetric modal analysis of the species present in the coal samples, expressed as percentages of the respective coals' mineral matter. The QEM*SEM results were compared to data obtained from the same samples using other methods of quantitative mineralogical analysis, namely X-ray diffraction of the low-temperature oxygen-plasma ash and normative calculation from the (high-temperature) ash analysis and carbonate CO2 data. Good agreement was obtained from all three methods for quartz in the coals, and also for most of the iron-bearing minerals. The correlation between results from the different methods was less strong, however, for individual clay minerals, or for minerals such as calcite, dolomite and phosphate species that made up only relatively small proportions of the mineral matter. The image analysis approach, using the electron microscope for mineralogical studies, has significant potential as a supplement to optical microscopy in quantitative coal characterisation. 36 refs., 3 figs., 4 tabs.

  13. Selection of Filtration Methods in the Analysis of Motion of Automated Guided Vehicle

    Directory of Open Access Journals (Sweden)

    Dobrzańska Magdalena

    2016-08-01

    In this article, the issues related to mapping the route and correcting errors in automated guided vehicle (AGV) movement are discussed. The nature and size of disruptions were determined using runs registered in experimental studies. On the basis of this analysis, a number of numerical runs were generated to reproduce the runs obtainable in real vehicle movement. The resulting data set was used for further research. The aim of this paper was to test the selected digital filtering methods on the same data set and determine their effectiveness. The results of the simulation studies are presented in the article; the effectiveness of the various methods is determined, and conclusions are drawn on this basis.

  14. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    Science.gov (United States)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness) and durability analyses are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. In reality, however, stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider this stamping information in CAE analysis to accurately model the behavior of sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid over-design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair HyperWorks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data on a full car frontal crash analysis.

  15. Semi-automated method to measure pneumonia severity in mice through computed tomography (CT) scan analysis

    Science.gov (United States)

    Johri, Ansh; Schimel, Daniel; Noguchi, Audrey; Hsu, Lewis L.

    2010-03-01

    Imaging is a crucial clinical tool for the diagnosis and assessment of pneumonia, but quantitative methods are lacking. Micro-computed tomography (micro CT), designed for lab animals, provides opportunities for non-invasive radiographic endpoints for pneumonia studies. HYPOTHESIS: In vivo micro CT scans of mice with early bacterial pneumonia can be scored quantitatively by semiautomated imaging methods, with good reproducibility and correlation with bacterial dose inoculated, pneumonia survival outcome, and radiologists' scores. METHODS: Healthy mice had intratracheal inoculation of E. coli bacteria (n=24) or saline control (n=11). In vivo micro CT scans were performed 24 hours later with a microCAT II (Siemens). Two independent radiologists scored the extent of airspace abnormality on a scale of 0 (normal) to 24 (completely abnormal). Using the Amira 5.2 software (Mercury Computer Systems), a histogram distribution of voxel counts within the Hounsfield range of -510 to 0 was created and analyzed, and a segmentation procedure was devised. RESULTS: A t-test was performed to determine whether there was a significant difference in the mean voxel value of each mouse among the three experimental groups: Saline Survivors, Pneumonia Survivors, and Pneumonia Non-survivors. The voxel count method was able to statistically distinguish the Saline Survivors from the Pneumonia Survivors and from the Pneumonia Non-survivors, but not the Pneumonia Survivors from the Pneumonia Non-survivors. The segmentation method, however, successfully distinguished the two pneumonia groups. CONCLUSION: We have pilot-tested an evaluation of early pneumonia in mice using micro CT and a semi-automated method for lung segmentation and scoring. Statistical analysis indicates that the system is reliable and merits further evaluation.
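
    The voxel-count score reduces to a histogram query over the stated Hounsfield window, followed by a two-sample t-test. A minimal numpy/scipy sketch (function and variable names are illustrative):

    ```python
    import numpy as np
    from scipy import stats

    HU_LO, HU_HI = -510, 0   # airspace-abnormality window used in the study

    def abnormal_voxel_count(ct_volume):
        """Count voxels in the Hounsfield window associated with airspace
        disease (denser than aerated lung, less dense than soft tissue)."""
        v = np.asarray(ct_volume).ravel()
        return int(np.count_nonzero((v >= HU_LO) & (v <= HU_HI)))

    # One count per mouse, grouped as in the paper:
    # t, p = stats.ttest_ind(saline_counts, pneumonia_counts)
    ```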

  16. An Automated High-Throughput Metabolic Stability Assay Using an Integrated High-Resolution Accurate Mass Method and Automated Data Analysis Software

    Science.gov (United States)

    Shah, Pranav; Kerns, Edward; Nguyen, Dac-Trung; Obach, R. Scott; Wang, Amy Q.; Zakharov, Alexey; McKew, John; Simeonov, Anton; Hop, Cornelis E. C. A.

    2016-01-01

    Advancement of in silico tools would be enabled by the availability of data for metabolic reaction rates and intrinsic clearance (CLint) of a diverse compound structure data set by specific metabolic enzymes. Our goal is to measure CLint for a large set of compounds with each major human cytochrome P450 (P450) isozyme. To achieve this goal, it is of utmost importance to develop an automated, robust, sensitive, high-throughput metabolic stability assay that can efficiently handle large compound sets. The substrate depletion method [in vitro half-life (t1/2) method] was chosen to determine CLint. The assay (384-well format) consisted of three parts: 1) a robotic system for incubation and sample cleanup; 2) two different integrated ultraperformance liquid chromatography/mass spectrometry (UPLC/MS) platforms to determine the percentage of parent compound remaining; and 3) an automated data analysis system. The CYP3A4 assay was evaluated using two long-t1/2 compounds, carbamazepine and antipyrine (t1/2 > 30 minutes); one moderate-t1/2 compound, ketoconazole (10 < t1/2 < 30 minutes); and two short-t1/2 compounds, loperamide and buspirone (t1/2 < 10 minutes). Interday and intraday precision and accuracy of the assay were within an acceptable range (∼12%) over the linear range observed. Using this assay, CYP3A4 CLint and t1/2 values were measured for more than 3000 compounds. This high-throughput, automated, and robust assay allows rapid metabolic stability screening of large compound sets and enables advanced computational modeling for individual human P450 isozymes. PMID:27417180
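
    For readers unfamiliar with the substrate-depletion calculation, the sketch below shows the standard in vitro t1/2 arithmetic (a generic formulation; the paper's pipeline wraps this in robotics and UPLC/MS, and the example numbers are invented):

    ```python
    import numpy as np

    def clint_from_depletion(times_min, pct_remaining, incubation_ul, protein_mg):
        """Substrate-depletion (in vitro t1/2) estimate of intrinsic clearance.

        Fit ln(% parent remaining) vs time: the depletion rate constant k is
        the negative slope, t1/2 = ln(2)/k, and
        CLint = k * incubation volume / protein amount  (uL/min/mg protein).
        """
        slope, _ = np.polyfit(np.asarray(times_min, float),
                              np.log(np.asarray(pct_remaining, float)), 1)
        k = -slope                               # 1/min
        t_half = np.log(2) / k                   # min
        clint = k * incubation_ul / protein_mg   # uL/min/mg
        return t_half, clint

    # 50% remaining at 15 min -> t1/2 ~ 15 min (a "moderate" compound here):
    # t_half, clint = clint_from_depletion([0, 5, 15, 30], [100, 79, 50, 25], 200, 0.1)
    ```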

  17. A revised automated proximity and conformity analysis method to compare predicted and observed spatial boundaries of geologic phenomena

    Science.gov (United States)

    Li, Yingkui; Napieralski, Jacob; Harbor, Jon

    2008-12-01

    Quantitative assessment of the level of agreement between model-predicted and field-observed geologic data is crucial to calibrate and validate numerical landscape models. Application of Geographic Information Systems (GIS) provides an opportunity to integrate model and field data and quantify their levels of correspondence. Napieralski et al. [Comparing predicted and observed spatial boundaries of geologic phenomena: Automated Proximity and Conformity Analysis (APCA) applied to ice sheet reconstructions. Computers and Geosciences 32, 124-134] introduced an Automated Proximity and Conformity Analysis (APCA) method to compare model-predicted and field-observed spatial boundaries and used it to quantify the level of correspondence between predicted ice margins from ice sheet models and field observations from end moraines. However, as originally formulated, APCA involves a relatively large amount of user intervention during the analysis and results in an index to quantify the level of correspondence that lacks direct statistical meaning. Here, we propose a revised APCA approach and a more automated and statistically robust way to quantify the level of correspondence between model predictions and field observations. Specifically, the mean and standard deviation of distances between model and field boundaries are used to quantify proximity and conformity, respectively. An illustration of the revised method comparing modeled ice margins of the Fennoscandian Ice Sheet with observed end moraines of the Last Glacial Maximum shows that this approach provides a more automated and statistically robust means to quantify correspondence than the original APCA. The revised approach can be adopted for a wide range of geoscience issues where comparisons of model-predicted and field-observed spatial boundaries are useful, including mass movement and flood extents.
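
    The revised statistics reduce to the mean (proximity) and standard deviation (conformity) of nearest-neighbour distances between the two boundaries. A point-sampling sketch using scipy (the original APCA adds buffering and GIS preprocessing that are omitted here):

    ```python
    import numpy as np
    from scipy.spatial.distance import cdist

    def proximity_conformity(model_xy, field_xy):
        """Summary agreement between two sampled boundaries.

        model_xy, field_xy : (n, 2) arrays of vertices along the
        model-predicted and field-observed boundaries. Each model vertex is
        matched to its nearest field vertex; the mean of these distances
        measures proximity, and their standard deviation measures conformity
        (how parallel the two boundaries run).
        """
        d = cdist(model_xy, field_xy).min(axis=1)
        return d.mean(), d.std()

    # prox, conf = proximity_conformity(modeled_ice_margin, end_moraine_points)
    ```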

  18. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    Measurements of corrosion rates and other parameters connected with corrosion processes are important, first as indicators of the corrosion resistance of metallic materials and second because such measurements are based on general and fundamental physical, chemical, and electrochemical relations. Hence improvements and innovations in methods applied in corrosion research are likely to benefit basic disciplines as well. A method for corrosion measurements can only provide reliable data if the background of the method is fully understood. Failure of a method to give correct data indicates a need to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation ...

  19. Development and application of an automated analysis method for individual cerebral perfusion single photon emission tomography images

    CERN Document Server

    Cluckie, A J

    2001-01-01

    Neurological images may be analysed by performing voxel by voxel comparisons with a group of control subject images. An automated, 3D, voxel-based method has been developed for the analysis of individual single photon emission tomography (SPET) scans. Clusters of voxels are identified that represent regions of abnormal radiopharmaceutical uptake. Morphological operators are applied to reduce noise in the clusters, then quantitative estimates of the size and degree of the radiopharmaceutical uptake abnormalities are derived. Statistical inference has been performed using a Monte Carlo method that has not previously been applied to SPET scans, or for the analysis of individual images. This has been validated for group comparisons of SPET scans and for the analysis of an individual image using comparison with a group. Accurate statistical inference was obtained independent of experimental factors such as degrees of freedom, image smoothing and voxel significance level threshold. The analysis method has been eval...

  20. AUTOMATED TEXT CLUSTERING OF NEWSPAPER AND SCIENTIFIC TEXTS IN BRAZILIAN PORTUGUESE: ANALYSIS AND COMPARISON OF METHODS

    Directory of Open Access Journals (Sweden)

    Alexandre Ribeiro Afonso

    2014-10-01

    This article reports the findings of an empirical study of automated text clustering applied to scientific articles and newspaper texts in Brazilian Portuguese; the objective was to find the most effective computational method able to cluster the input texts into their original groups. The study covered four experiments, each with four procedures: 1. Corpus Selection (a set of texts is selected for clustering); 2. Word Class Selection (nouns, verbs and adjectives are chosen from each text by using specific algorithms); 3. Filtering Algorithms (a set of terms is selected from the results of the previous stage, a semantic weight is inserted for each term, and an index is generated for each text); 4. Clustering Algorithms (the clustering algorithms Simple K-Means, sIB and EM are applied to the indexes). After those procedures, statistical results for clustering correctness and clustering time were collected. The sIB clustering algorithm is the best choice for both the scientific and the newspaper corpus, under the condition that it asks for the number of clusters as input before running (for the newspaper corpus, 68.9% correctness in 1 minute; for the scientific corpus, 77.8% correctness in 1 minute). The EM clustering algorithm additionally guesses the number of clusters without user intervention, but its best case is less than 53% correctness. Considering the experiments carried out, the results of human text classification and automated clustering are distant; it was also observed that the clustering correctness results vary according to the number of input texts and their topics.
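
    The index-then-cluster pipeline is easy to reproduce with standard tooling. The sketch below uses TF-IDF weighting and K-Means from scikit-learn (sIB and EM, as used in the study, are available in Weka rather than scikit-learn, so K-Means stands in here):

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.cluster import KMeans

    def cluster_texts(texts, n_clusters):
        """Index texts by weighted terms, then cluster the indexes."""
        X = TfidfVectorizer(max_features=5000).fit_transform(texts)
        return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)

    # Clustering correctness against the original groups can be scored with,
    # e.g., sklearn.metrics.adjusted_rand_score(true_labels, predicted_labels).
    ```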

  1. Contaminant analysis automation demonstration proposal

    Energy Technology Data Exchange (ETDEWEB)

    Dodson, M.G.; Schur, A.; Heubach, J.G.

    1993-10-01

    The nationwide and global need for environmental restoration and waste remediation (ER&WR) presents significant challenges to the analytical chemistry laboratory. The expansion of ER&WR programs forces an increase in the volume of samples processed and the demand for analysis data. To handle this expanding volume, productivity must be increased. However, this need for significantly increased productivity faces a contaminant analysis process that is costly in time, labor, equipment, and safety protection. Laboratory automation offers a cost-effective approach to meeting current and future contaminant analytical laboratory needs. The proposed demonstration will present a proof-of-concept automated laboratory conducting varied sample preparations. This automated process also highlights a graphical user interface that provides supervisory control and monitoring of the automated process. The demonstration provides affirming answers to the following questions about laboratory automation: Can preparation of contaminants be successfully automated? Can a full-scale working proof-of-concept automated laboratory be developed that is capable of preparing contaminant and hazardous chemical samples? Can the automated processes be seamlessly integrated and controlled? Can the automated laboratory be customized through readily convertible design? Can automated sample preparation concepts be extended to the other phases of the sample analysis process? To fully reap the benefits of automation, four human factors areas should be studied and the outputs used to increase the efficiency of laboratory automation: (1) laboratory configuration, (2) procedures, (3) receptacles and fixtures, and (4) the human-computer interface for the fully automated system and complex laboratory information management systems.

  2. Fully automated (operational) modal analysis

    Science.gov (United States)

    Reynders, Edwin; Houbrechts, Jeroen; De Roeck, Guido

    2012-05-01

    Modal parameter estimation requires a lot of user interaction, especially when parametric system identification methods are used and the modes are selected in a stabilization diagram. In this paper, a fully automated, generally applicable three-stage clustering approach is developed for interpreting such a diagram. It does not require any user-specified parameter or threshold value, and it can be used in an experimental, operational, and combined vibration testing context and with any parametric system identification algorithm. The three stages of the algorithm correspond to the three stages in a manual analysis: setting stabilization thresholds for clearing out the diagram, detecting columns of stable modes, and selecting a representative mode from each column. An extensive validation study illustrates the accuracy and robustness of this automation strategy.
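
    Stage two of such an approach, grouping stable poles into mode columns, can be sketched with hierarchical clustering in scipy. Note this simplified version takes a distance cut-off, whereas the paper's contribution is precisely to avoid user-specified thresholds:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    def cluster_stable_modes(freqs, dampings, tol=0.02):
        """Group stabilization-diagram poles into physical-mode columns.

        freqs, dampings : natural frequencies (Hz) and damping ratios of all
        stable poles, pooled over model orders. Poles closer than tol in the
        normalized (frequency, damping) plane receive the same mode label.
        """
        freqs = np.asarray(freqs, float)
        pts = np.column_stack([freqs / freqs.max(),
                               np.asarray(dampings, float)])
        return fcluster(linkage(pts, method="single"), t=tol,
                        criterion="distance")

    # labels = cluster_stable_modes(f, z); each label is one candidate mode,
    # from which a representative pole is then selected (stage three).
    ```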

  3. Development of automated extraction method of biliary tract from abdominal CT volumes based on local intensity structure analysis

    Science.gov (United States)

    Koga, Kusuto; Hayashi, Yuichiro; Hirose, Tomoaki; Oda, Masahiro; Kitasaka, Takayuki; Igami, Tsuyoshi; Nagino, Masato; Mori, Kensaku

    2014-03-01

    In this paper, we propose an automated biliary tract extraction method for abdominal CT volumes. The biliary tract is the path by which bile is transported from the liver to the duodenum. No method has previously been reported for the automated extraction of the biliary tract from common contrast CT volumes. Our method consists of three steps: (1) extraction of extrahepatic bile duct (EHBD) candidate regions, (2) extraction of intrahepatic bile duct (IHBD) candidate regions, and (3) combination of these candidate regions. The IHBD has linear structures, and the intensities of the IHBD are low in CT volumes. We use a dark linear structure enhancement (DLSE) filter, based on a local intensity structure analysis using the eigenvalues of the Hessian matrix, for the IHBD candidate region extraction. The EHBD region is extracted using a thresholding process and a connected component analysis. In the combination process, we connect the IHBD candidate regions to each EHBD candidate region and select a bile duct region from the connected candidate regions. We applied the proposed method to 22 CT volumes; the average Dice coefficient of the extraction results was 66.7%.
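
    The Hessian-eigenvalue idea behind the DLSE filter is close to the classical vesselness filters available off the shelf. As a stand-in illustration (the paper's DLSE filter is its own design), scikit-image's Frangi filter with black_ridges=True enhances dark tubular structures slice by slice:

    ```python
    import numpy as np
    from skimage.filters import frangi

    def enhance_dark_ducts(ct_slice, sigmas=(1, 2, 3)):
        """Hessian-based enhancement of dark linear structures.

        The eigenvalues of the image Hessian encode local shape: a duct has
        one small eigenvalue (along its axis) and large ones across it.
        black_ridges=True targets low-intensity tubes such as the IHBD.
        """
        return frangi(ct_slice.astype(np.float32), sigmas=sigmas,
                      black_ridges=True)

    # candidates = enhance_dark_ducts(volume[k]) > 0.05   # per-slice threshold
    ```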

  4. A Robust and Fully-Automated Chromatographic Method for the Quantitative Purification of Ca and Sr for Isotopic Analysis

    Science.gov (United States)

    Smith, H. B.; Kim, H.; Romaniello, S. J.; Field, P.; Anbar, A. D.

    2014-12-01

    High-throughput methods for sample purification are required to effectively exploit new opportunities in the study of non-traditional stable isotopes. Many geochemical isotopic studies would benefit from larger data sets, but these are often impractical with manual drip chromatography techniques, which can be time-consuming and demand the attention of skilled laboratory staff. Here we present a new, fully-automated single-column method suitable for the purification of both Ca and Sr for stable and radiogenic isotopic analysis. The method can accommodate a wide variety of sample types, including carbonates, bones, and teeth; silicate rocks and sediments; fresh and marine waters; and biological samples such as blood and urine. Protocols for these isotopic analyses are being developed for use on the new prepFAST-MC™ system from Elemental Scientific (ESI). The system is highly adaptable and processes 24-60 samples per day by reusing a single chromatographic column. Efficient column cleaning between samples and an all-Teflon flow path ensure that sample carryover is maintained at the level of the background laboratory blanks typical for manual drip chromatography. This method is part of a family of new fully-automated chromatographic methods being developed to address many different isotopic systems, including B, Ca, Fe, Cu, Zn, Sr, Cd, Pb, and U. These methods are designed to be rugged and transferable, and to allow the preparation of large, diverse sample sets via a highly repeatable process with minimal effort.

  5. Literature Lab: a method of automated literature interrogation to infer biology from microarray analysis

    Directory of Open Access Journals (Sweden)

    Stegmaier Kimberly

    2007-12-01

    Background: The biomedical literature is a rich source of associative information but too vast for complete manual review. We have developed an automated method of literature interrogation called "Literature Lab" that identifies and ranks associations existing in the literature between gene sets, such as those derived from microarray experiments, and curated sets of key terms (i.e. pathway names, medical subject heading (MeSH) terms, etc.). Results: Literature Lab was developed using differentially expressed gene sets from three previously published cancer experiments and tested on a fourth, novel gene set. When applied to the gene sets from the published data, including an in vitro experiment, an in vivo mouse experiment, and an experiment with human tumor samples, Literature Lab correctly identified known biological processes occurring within each experiment. When applied to a novel set of genes differentially expressed between locally invasive and metastatic prostate cancer, Literature Lab identified a strong association between the pathway term "FOSB" and genes with increased expression in metastatic prostate cancer. Immunohistochemistry subsequently confirmed increased nuclear FOSB staining in metastatic compared to locally invasive prostate cancers. Conclusion: This work demonstrates that Literature Lab can discover key biological processes by identifying meritorious associations between experimentally derived gene sets and key terms within the biomedical literature.

  6. Barcoding T cell calcium response diversity with methods for automated and accurate analysis of cell signals (MAAACS).

    Directory of Open Access Journals (Sweden)

    Audrey Salles

    We introduce a series of experimental procedures enabling sensitive calcium monitoring in T cell populations by confocal video-microscopy. Tracking and post-acquisition analysis was performed using Methods for Automated and Accurate Analysis of Cell Signals (MAAACS), a fully customized program that associates a high-throughput tracking algorithm, an intuitive reconnection routine and a statistical platform to provide, at a glance, the calcium barcode of a population of individual T cells. Combined with a sensitive calcium probe, this method allowed us to unravel the heterogeneity in shape and intensity of the calcium response in T cell populations, and especially in naive T cells, which display intracellular calcium oscillations upon stimulation by antigen presenting cells.

  7. Barcoding T Cell Calcium Response Diversity with Methods for Automated and Accurate Analysis of Cell Signals (MAAACS)

    Science.gov (United States)

    Sergé, Arnauld; Bernard, Anne-Marie; Phélipot, Marie-Claire; Bertaux, Nicolas; Fallet, Mathieu; Grenot, Pierre; Marguet, Didier; He, Hai-Tao; Hamon, Yannick

    2013-01-01

    We introduce a series of experimental procedures enabling sensitive calcium monitoring in T cell populations by confocal video-microscopy. Tracking and post-acquisition analysis was performed using Methods for Automated and Accurate Analysis of Cell Signals (MAAACS), a fully customized program that associates a high throughput tracking algorithm, an intuitive reconnection routine and a statistical platform to provide, at a glance, the calcium barcode of a population of individual T-cells. Combined with a sensitive calcium probe, this method allowed us to unravel the heterogeneity in shape and intensity of the calcium response in T cell populations and especially in naive T cells, which display intracellular calcium oscillations upon stimulation by antigen presenting cells. PMID:24086124

  8. An Improved Method for Measuring Quantitative Resistance to the Wheat Pathogen Zymoseptoria tritici Using High-Throughput Automated Image Analysis.

    Science.gov (United States)

    Stewart, Ethan L; Hagerty, Christina H; Mikaberidze, Alexey; Mundt, Christopher C; Zhong, Ziming; McDonald, Bruce A

    2016-07-01

    Zymoseptoria tritici causes Septoria tritici blotch (STB) on wheat. An improved method of quantifying STB symptoms was developed based on automated analysis of diseased leaf images made using a flatbed scanner. Naturally infected leaves (n = 949) sampled from fungicide-treated field plots comprising 39 wheat cultivars grown in Switzerland and 9 recombinant inbred lines (RIL) grown in Oregon were included in these analyses. Measures of quantitative resistance were percent leaf area covered by lesions, pycnidia size and gray value, and pycnidia density per leaf and lesion. These measures were obtained automatically with a batch-processing macro utilizing the image-processing software ImageJ. All phenotypes in both locations showed a continuous distribution, as expected for a quantitative trait. The trait distributions at both sites were largely overlapping even though the field and host environments were quite different. Cultivars and RILs could be assigned to two or more statistically different groups for each measured phenotype. Traditional visual assessments of field resistance were highly correlated with quantitative resistance measures based on image analysis for the Oregon RILs. These results show that automated image analysis provides a promising tool for assessing quantitative resistance to Z. tritici under field conditions.

  9. Automated sugar analysis

    Directory of Open Access Journals (Sweden)

    Tadeu Alcides MARQUES

    2016-03-01

    Sugarcane monosaccharides are reducing sugars, and classical analytical methodologies (Lane-Eynon, Benedict, complexometric-EDTA, Luff-Schoorl, Musson-Walker, Somogyi-Nelson) are based on reducing copper ions in alkaline solutions. In Brazil, certain factories use Lane-Eynon, others use the equipment referred to as "REDUTEC", and additional factories analyze reducing sugars based on a mathematical model. The objective of this paper is to understand the relationship between variations in millivolts, mass and reducing sugar content during the analysis process, and to generate an automatic model for this process. The work herein uses the equipment referred to as "REDUTEC", a digital balance, a peristaltic pump, a digital camcorder, and math and graphics programs. We conclude that the millivolts, mass and reducing sugar contents exhibit a good mathematical correlation, and the mathematical model generated was benchmarked against low-concentration reducing sugars (<0.3%). Using the model created herein, reducing sugar analyses can be automated using the new equipment.

  10. Automated method for simultaneous lead and strontium isotopic analysis applied to rainwater samples and airborne particulate filters (PM10).

    Science.gov (United States)

    Beltrán, Blanca; Avivar, Jessica; Mola, Montserrat; Ferrer, Laura; Cerdà, Víctor; Leal, Luz O

    2013-09-03

    A new automated, sensitive, and fast system for the simultaneous online isolation and preconcentration of lead and strontium by sorption on a microcolumn packed with Sr-resin, using an inductively coupled plasma mass spectrometry (ICP-MS) detector, was developed by hyphenating lab-on-valve (LOV) and multisyringe flow injection analysis (MSFIA). Pb and Sr are directly retained on the sorbent column and eluted with a solution of 0.05 mol L-1 ammonium oxalate. The detection limits achieved were 0.04 ng for lead and 0.03 ng for strontium. Mass calibration curves were used, since the proposed system allows the use of different sample volumes for preconcentration. The mass linear working ranges were 0.13-50 ng and 0.1-50 ng for lead and strontium, respectively. The repeatability of the method, expressed as RSD, was 2.1% and 2.7% for Pb and Sr, respectively. Environmental samples such as rainwater and airborne particulate (PM10) filters, as well as the certified reference material SLRS-4 (river water), were satisfactorily analyzed, with recoveries between 90 and 110% for both elements. The main features of the proposed LOV-MSFIA-ICP-MS system are the capability to renew the solid phase extraction column at will in a fully automated way, the remarkable stability of the column, which can be reused up to 160 times, and the potential to perform isotopic analysis.

  11. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Breakers are part of Electric Power System equipment whose reliability influences, to a great extent, the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure to switch off a short circuit by a breaker, followed by failure of the reserve unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. But this demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, their failures, testing and repair, advanced software developments based on computer technologies, and a specific automated information system (AIS). A new AIS, named AISV, was developed at the "Reliability of Power Equipment" department of the AzRDSI of Energy. The main features of AISV are: to provide database security and accuracy; to carry out systematic control of breakers' conformity with operating conditions; to estimate individual reliability values and the characteristics of their change for a given combination of characteristics; and to provide the personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving a given problem and advanced methods for their realization.

  12. NetFCM: A Semi-Automated Web-Based Method for Flow Cytometry Data Analysis

    DEFF Research Database (Denmark)

    Frederiksen, Juliet Wairimu; Buggert, Marcus; Karlsson, Annika C.;

    2014-01-01

    Multi-parametric flow cytometry (FCM) represents an invaluable instrument to conduct single cell analysis and has significantly increased our understanding of the immune system. However, due to new techniques allowing us to measure an increased number of phenotypes within the immune system, FCM data analysis has become more complex and labor-intensive than previously. We have therefore developed a semi-automatic gating strategy (NetFCM) that uses clustering and principal component analysis (PCA) together with other statistical methods to mimic manual gating approaches. NetFCM is an online tool both for subset identification as well as for quantification of differences between samples. Additionally, NetFCM can classify and cluster samples based on multidimensional data. We tested the method using a data set of peripheral blood mononuclear cells collected from 23 HIV-infected individuals.

  13. Introducing Powell's Direction Set Method to a Fully Automated Analysis of Eclipsing Binary Stars

    CERN Document Server

    Prsa, A

    2006-01-01

    With recent observational advancements, substantial amounts of photometric and spectroscopic eclipsing binary data have been acquired. As part of an ongoing effort to assemble a reliable pipeline for fully automatic data analysis, we put Powell's direction set method to the test. The method does not depend on numerical derivatives, only on function evaluations, and as such it cannot diverge. Compared to differential corrections (DC) and Nelder & Mead's downhill simplex (NMS) method, Powell's method proves to be more efficient in terms of solution determination and the required number of iterations. However, its application is still not optimal in terms of time cost. Causes for this deficiency are identified and two steps toward a solution are proposed: non-orthogonality of the parameter set should be removed, and better initial directions should be determined before the minimization is initiated. Once these setbacks are worked out, Powell's method will probably replace DC and NMS as the default minimizing...
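
    Powell's method is available directly in SciPy, which makes the derivative-free behaviour easy to demonstrate. The toy below fits a synthetic eclipse-like dip by least squares (the model and data are invented stand-ins for a real light-curve code such as the one the paper wraps):

    ```python
    import numpy as np
    from scipy.optimize import minimize

    t_obs = np.linspace(0.0, 1.0, 50)
    y_obs = 1.0 - 0.3 * np.exp(-((t_obs - 0.5) / 0.05) ** 2)  # synthetic dip

    def cost(p):
        """Sum of squared residuals between data and a two-parameter model."""
        depth, width = p
        y = 1.0 - depth * np.exp(-((t_obs - 0.5) / width) ** 2)
        return float(np.sum((y - y_obs) ** 2))

    # Powell's direction-set method uses only function evaluations (no
    # derivatives), so it cannot diverge on a noisy cost surface.
    res = minimize(cost, x0=[0.1, 0.1], method="Powell")
    print(res.x)   # approximately [0.3, 0.05]
    ```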

  14. Evaluation of automated sample preparation, retention time locked gas chromatography-mass spectrometry and data analysis methods for the metabolomic study of Arabidopsis species.

    Science.gov (United States)

    Gu, Qun; David, Frank; Lynen, Frédéric; Rumpel, Klaus; Dugardeyn, Jasper; Van Der Straeten, Dominique; Xu, Guowang; Sandra, Pat

    2011-05-27

    In this paper, automated sample preparation, retention time locked gas chromatography-mass spectrometry (GC-MS) and data analysis methods for metabolomics studies were evaluated. A miniaturized and automated derivatisation method using sequential oximation and silylation was applied to a polar extract of 4 types (2 types × 2 ages) of Arabidopsis thaliana, a popular model organism often used in plant sciences and genetics. Automation of the derivatisation process offers excellent repeatability, and the time between sample preparation and analysis was short and constant, reducing artifact formation. Retention time locked (RTL) gas chromatography-mass spectrometry was used, resulting in reproducible retention times and GC-MS profiles. Two approaches were used for data analysis: XCMS followed by principal component analysis (approach 1), and AMDIS deconvolution combined with a commercially available program (Mass Profiler Professional) followed by principal component analysis (approach 2). Several features that were up- or down-regulated in the different types were detected.

  15. Automated analysis of 3D echocardiography

    NARCIS (Netherlands)

    Stralen, Marijn van

    2009-01-01

    In this thesis we aim at automating the analysis of 3D echocardiography, mainly targeting the functional analysis of the left ventricle. Manual analysis of these data is cumbersome, time-consuming and is associated with inter-observer and inter-institutional variability. Methods for reconstruction o

  16. E-learning platform for automated testing of electronic circuits using signature analysis method

    Science.gov (United States)

    Gherghina, Cǎtǎlina; Bacivarov, Angelica; Bacivarov, Ioan C.; Petricǎ, Gabriel

    2016-12-01

    The dependability of electronic circuits can be ensured only through the testing of circuit modules, which is done by generating test vectors and applying them to the circuit. Testability should be viewed as a concerted effort to ensure maximum efficiency throughout the product life cycle, from the conception and design stage, through production, to repairs during operation. This paper presents the platform developed by the authors for testability training in electronics in general, and in using the signature analysis method in particular. The platform highlights the two approaches in the field, namely analog and digital circuit signatures. As part of this e-learning platform, a database of signatures of different electronic components has been developed, meant to put into the spotlight different fault-detection techniques and, building on these, self-repairing techniques for systems containing such components. An approach for realizing self-testing circuits based on the MATLAB environment and using the signature analysis method is proposed. The paper analyses the benefits of the signature analysis method and simulates signature analyzer performance based on the use of pseudo-random sequences.
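
    At the core of signature analysis is a linear-feedback shift register that compresses a long test-response stream into a short residue. A minimal single-input signature register in Python (the CRC-16-CCITT polynomial is an illustrative choice; real analyzers fix the polynomial in hardware):

    ```python
    def lfsr_signature(bits, poly=0x1021, width=16):
        """Compress a test-response bit stream into a 16-bit signature.

        Each response bit is XORed with the register's MSB to form the
        feedback, which taps the register at the polynomial positions. A
        faulty circuit almost certainly yields a different residue
        (aliasing probability about 2**-width).
        """
        reg, top, mask = 0, 1 << (width - 1), (1 << width) - 1
        for bit in bits:
            fb = ((reg & top) != 0) ^ bit
            reg = (reg << 1) & mask
            if fb:
                reg ^= poly
        return reg

    # good = lfsr_signature(reference_response_bits)
    # if lfsr_signature(measured_response_bits) != good: flag module as faulty
    ```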

  17. Development testing of the chemical analysis automation polychlorinated biphenyl standard analysis method during surface soils sampling at the David Witherspoon 1630 site

    Energy Technology Data Exchange (ETDEWEB)

    Hunt, M.A.; Klatt, L.N.; Thompson, D.H. [and others]

    1998-02-01

    The Chemical Analysis Automation (CAA) project is developing standardized, software-driven, site-deployable robotic laboratory systems with the objective of lowering the per-sample analysis cost, decreasing sample turnaround time, and minimizing human exposure to hazardous and radioactive materials associated with DOE remediation projects. The first integrated system developed by the CAA project is designed to determine polychlorinated biphenyl (PCB) content in soil matrices. Demonstration and development testing of this system was conducted in conjunction with surface soil characterization activities at the David Witherspoon 1630 Site in Knoxville, Tennessee. The PCB system consists of five hardware standard laboratory modules (SLMs), one software SLM, the task sequence controller (TSC), and the human-computer interface (HCI). Four of the hardware SLMs included a four-channel Soxhlet extractor, a high-volume concentrator, a column cleanup, and a gas chromatograph. These SLMs performed the sample preparation and measurement steps within the total analysis protocol. The fifth hardware module was a robot that transports samples between the SLMs and the required consumable supplies to the SLMs. The software SLM is an automated data interpretation module that receives raw data from the gas chromatograph SLM and analyzes the data to yield the analyte information. The TSC is a software system that provides the scheduling, management of system resources, and coordination of all SLM activities. The HCI is a graphical user interface that presents the automated laboratory to the analyst in terms of the analytical procedures and methods. Human control of the automated laboratory is accomplished via the HCI. Sample information required for processing by the automated laboratory is entered through the HCI. Information related to the sample and the system status is presented to the analyst via graphical icons.

  18. A method for the automated processing and analysis of images of ULVWF-platelet strings.

    Science.gov (United States)

    Reeve, Scott R; Abbitt, Katherine B; Cruise, Thomas D; Hose, D Rodney; Lawford, Patricia V

    2013-01-01

    We present a method for identifying and analysing unusually large von Willebrand factor (ULVWF)-platelet strings in noisy low-quality images. The method requires relatively inexpensive, non-specialist equipment and allows multiple users to be employed in the capture of images. Images are subsequently enhanced and analysed, using custom-written software to perform the processing tasks. The formation and properties of ULVWF-platelet strings released in in vitro flow-based assays have recently become a popular research area. Endothelial cells are incorporated into a flow chamber, chemically stimulated to induce ULVWF release and perfused with isolated platelets which are able to bind to the ULVWF to form strings. The numbers and lengths of the strings released are related to characteristics of the flow. ULVWF-platelet strings are routinely identified by eye from video recordings captured during experiments and analysed manually using basic NIH image software to determine the number of strings and their lengths. This is a laborious, time-consuming task and a single experiment, often consisting of data from four to six dishes of endothelial cells, can take 2 or more days to analyse. The method described here allows analysis of the strings to provide data such as the number and length of strings, number of platelets per string and the distance between each platelet to be found. The software reduces analysis time, and more importantly removes user subjectivity, producing highly reproducible results with an error of less than 2% when compared with detailed manual analysis.

  19. Computer-implemented system and method for automated and highly accurate plaque analysis, reporting, and visualization

    Science.gov (United States)

    Kemp, James Herbert (Inventor); Talukder, Ashit (Inventor); Lambert, James (Inventor); Lam, Raymond (Inventor)

    2008-01-01

    A computer-implemented system and method of intra-oral analysis for measuring plaque removal is disclosed. The system includes hardware for real-time image acquisition and software to store the acquired images on a patient-by-patient basis. The system implements algorithms to segment teeth of interest from surrounding gum, and uses a real-time image-based morphing procedure to automatically overlay a grid onto each segmented tooth. Pattern recognition methods are used to classify plaque from surrounding gum and enamel, while ignoring glare effects due to the reflection of camera light and ambient light from enamel regions. The system integrates these components into a single software suite with an easy-to-use graphical user interface (GUI) that allows users to do an end-to-end run of a patient record, including tooth segmentation of all teeth, grid morphing of each segmented tooth, and plaque classification of each tooth image.

  20. GSMA: Gene Set Matrix Analysis, An Automated Method for Rapid Hypothesis Testing of Gene Expression Data

    Directory of Open Access Journals (Sweden)

    Chris Cheadle

    2007-01-01

Full Text Available Background: Microarray technology has become highly valuable for identifying complex global changes in gene expression patterns. The assignment of functional information to these complex patterns remains a challenging task in effectively interpreting data and correlating results from across experiments, projects and laboratories. Methods which allow the rapid and robust evaluation of multiple functional hypotheses increase the power of individual researchers to data mine gene expression data more efficiently. Results: We have developed gene set matrix analysis (GSMA) as a useful method for the rapid testing of group-wise up- or downregulation of gene expression simultaneously for multiple lists of genes (gene sets) against entire distributions of gene expression changes (datasets) for single or multiple experiments. The utility of GSMA lies in its flexibility to rapidly poll gene sets related by known biological function, or as designated solely by the end-user, against large numbers of datasets simultaneously. Conclusions: GSMA provides a simple and straightforward method for hypothesis testing in which genes are tested by groups across multiple datasets for patterns of expression enrichment.
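
    To make the "gene set against a dataset distribution" idea concrete, the sketch below scores one gene set against one dataset of expression changes using a simple z-statistic of the set mean versus the dataset background. The statistic is an illustrative choice, not necessarily the one implemented in GSMA; gene names and values are synthetic.

```python
# Sketch of one cell of a gene-set-by-dataset matrix: test whether a
# gene set's mean log-ratio differs from the dataset background.  The
# z-score statistic is an illustrative choice, not GSMA's own.
import numpy as np

def set_vs_dataset_z(changes, gene_set):
    """changes: dict gene -> log-ratio for one dataset."""
    background = np.array(list(changes.values()))
    in_set = np.array([changes[g] for g in gene_set if g in changes])
    se = background.std(ddof=1) / np.sqrt(len(in_set))
    return (in_set.mean() - background.mean()) / se

rng = np.random.default_rng(0)
dataset = {f"gene{i}": v for i, v in enumerate(rng.normal(0, 1, 5000))}
up_set = [f"gene{i}" for i in range(50)]
for g in up_set:                      # simulate a coordinately up set
    dataset[g] += 1.0
print(f"z = {set_vs_dataset_z(dataset, up_set):.1f}")
```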

  1. A new automated method for analysis of gated-SPECT images based on a three-dimensional heart shaped model

    DEFF Research Database (Denmark)

    Lomsky, Milan; Richter, Jens; Johansson, Lena

    2005-01-01

    A new automated method for quantification of left ventricular function from gated-single photon emission computed tomography (SPECT) images has been developed. The method for quantification of cardiac function (CAFU) is based on a heart shaped model and the active shape algorithm. The model...

  2. Automated analysis of protein NMR assignments using methods from artificial intelligence.

    Science.gov (United States)

    Zimmerman, D E; Kulikowski, C A; Huang, Y; Feng, W; Tashiro, M; Shimotakahara, S; Chien, C; Powers, R; Montelione, G T

    1997-06-20

    An expert system for determining resonance assignments from NMR spectra of proteins is described. Given the amino acid sequence, a two-dimensional 15N-1H heteronuclear correlation spectrum and seven to eight three-dimensional triple-resonance NMR spectra for seven proteins, AUTOASSIGN obtained an average of 98% of sequence-specific spin-system assignments with an error rate of less than 0.5%. Execution times on a Sparc 10 workstation varied from 16 seconds for smaller proteins with simple spectra to one to nine minutes for medium size proteins exhibiting numerous extra spin systems attributed to conformational isomerization. AUTOASSIGN combines symbolic constraint satisfaction methods with a domain-specific knowledge base to exploit the logical structure of the sequential assignment problem, the specific features of the various NMR experiments, and the expected chemical shift frequencies of different amino acids. The current implementation specializes in the analysis of data derived from the most sensitive of the currently available triple-resonance experiments. Potential extensions of the system for analysis of additional types of protein NMR data are also discussed.

  3. Triangulation methods for automated docking

    Science.gov (United States)

    Bales, John W.

    1996-01-01

An automated docking system must have a reliable method for determining range and orientation of the passive (target) vehicle with respect to the active vehicle. This method must also provide accurate information on the rates of change of range to and orientation of the passive vehicle. The method must be accurate within required tolerances and capable of operating in real time. The method being developed at Marshall Space Flight Center employs a single TV camera, a laser illumination system and a target consisting, in its minimal configuration, of three retro-reflectors. Two of the retro-reflectors are mounted flush to the same surface, with the third retro-reflector mounted to a post fixed midway between the other two and jutting at a right angle from the surface. For redundancy, two additional retro-reflectors are mounted on the surface on a line at right angles to the line containing the first two retro-reflectors, and equally spaced on either side of the post. The target vehicle will contain a large target for initial acquisition and several smaller targets for close range.
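
    A minimal sketch of how range and attitude can fall out of this target geometry under a pinhole-camera model: range follows from the apparent separation of the two surface reflectors, and the parallax offset of the post-mounted reflector from their midpoint gives pitch and yaw. The focal length, baseline, post height, and small-angle treatment below are illustrative assumptions, not the MSFC implementation.

```python
# Simplified pinhole-geometry sketch for the minimal three-reflector
# target: two surface reflectors a known baseline apart, plus one on a
# post midway between them.  Focal length (pixels), baseline and post
# height are illustrative assumptions.
import numpy as np

F_PX = 1500.0   # assumed focal length, pixels
BASE = 1.0      # assumed separation of the two surface reflectors, m
POST = 0.25     # assumed post height, m

def range_and_attitude(p_left, p_right, p_post):
    """Image points (x, y) in pixels -> (range m, pitch rad, yaw rad)."""
    sep_px = np.hypot(*np.subtract(p_right, p_left))
    rng = F_PX * BASE / sep_px                 # similar triangles
    mid = (np.asarray(p_left) + np.asarray(p_right)) / 2.0
    dx, dy = np.subtract(p_post, mid)          # post-tip parallax offset
    scale = POST * F_PX / rng                  # px offset for sin(tilt) = 1
    pitch = np.arcsin(np.clip(dy / scale, -1.0, 1.0))
    yaw = np.arcsin(np.clip(dx / scale, -1.0, 1.0))
    return rng, pitch, yaw

print(range_and_attitude((900, 500), (1100, 500), (1002, 510)))
```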

  4. Automated detection method for architectural distortion areas on mammograms based on morphological processing and surface analysis

    Science.gov (United States)

    Ichikawa, Tetsuko; Matsubara, Tomoko; Hara, Takeshi; Fujita, Hiroshi; Endo, Tokiko; Iwase, Takuji

    2004-05-01

As well as mass and microcalcification, architectural distortion is a very important finding for the early detection of breast cancer via mammograms, and such distortions can be classified into three typical types: spiculation, retraction, and distortion. The purpose of this work is to develop an automatic method for detecting areas of architectural distortion with spiculation. The suspect areas are detected by concentration indexes of line-structures extracted by using mean curvature. After that, discriminant analysis of nine features is employed for the classification of true and false positives. The employed features are the size, the mean pixel value, the mean concentration index, the mean isotropic index, the contrast, and four other features based on the power spectrum. As a result of this work, the accuracy of the classification was 76% and the sensitivity was 80% with 0.9 false positives per image in our database in regard to spiculation. It was concluded that our method was effective in detecting the area of architectural distortion; however, some architectural distortions were not detected accurately because of the size, the density, or the different appearance of the distorted areas.

  5. Automated Methods Of Corrosion Measurements

    DEFF Research Database (Denmark)

    Bech-Nielsen, Gregers; Andersen, Jens Enevold Thaulov; Reeve, John Ch

    1997-01-01

The chapter describes the following automated measurements: Corrosion Measurements by Titration, Imaging Corrosion by Scanning Probe Microscopy, Critical Pitting Temperature and Application of the Electrochemical Hydrogen Permeation Cell.

  6. Feasibility Analysis of Crane Automation

    Institute of Scientific and Technical Information of China (English)

    DONG Ming-xiao; MEI Xue-song; JIANG Ge-dong; ZHANG Gui-qing

    2006-01-01

This paper summarizes the modeling methods and the open-loop and closed-loop control techniques of various forms of cranes worldwide, and discusses their feasibility and limitations in engineering. Then the dynamic behaviors of cranes are analyzed. Finally, we propose applicable modeling methods and feasible control techniques and demonstrate the feasibility of crane automation.

  7. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2005-01-01

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  8. A novel, rapid and automated conductometric method to evaluate surfactant-cells interactions by means of critical micellar concentration analysis.

    Science.gov (United States)

    Tiecco, Matteo; Corte, Laura; Roscini, Luca; Colabella, Claudia; Germani, Raimondo; Cardinali, Gianluigi

    2014-07-25

Conductometry is widely used to determine the critical micellar concentration and the micellar aggregate surface properties of amphiphiles. Current conductivity experiments on surfactant solutions are typically carried out by manual pipetting, yielding some tens of reading points within a couple of hours. In order to study the properties of surfactant-cell interactions, each amphiphile must be tested in different conditions against several types of cells. This calls for complex experimental designs, making the application of current methods seriously time-consuming, especially because long experiments risk introducing alterations of the cells independently of the surfactant action. In this paper we present a novel, accurate and rapid automated procedure to obtain conductometric curves with several hundred reading points within tens of minutes. The method was validated with surfactant solutions alone and in combination with Saccharomyces cerevisiae cells. An easy-to-use R script calculates conductometric parameters and their statistical significance, with a graphic interface to visualize data and results. The validations showed that the procedure works in the same manner with surfactants alone or in combination with cells, yielding around 1000 reading points within 20 min and with high accuracy, as determined by regression analysis.
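
    The conventional conductometric CMC estimate, which this automated procedure makes practical at high point densities, fits straight lines to the pre- and post-micellar branches of the conductivity-concentration curve and takes their intersection. The sketch below is not the authors' R script: it scans for the breakpoint that minimises total squared residuals, and the synthetic curve is illustrative.

```python
# Sketch of a conductometric CMC estimate: fit straight lines to the
# pre- and post-micellar branches of conductivity vs. concentration and
# intersect them, choosing the breakpoint that minimises total SSE.
import numpy as np

def cmc_from_curve(conc, kappa, min_pts=5):
    best = None
    for i in range(min_pts, len(conc) - min_pts):
        p1 = np.polyfit(conc[:i], kappa[:i], 1)
        p2 = np.polyfit(conc[i:], kappa[i:], 1)
        sse = (np.sum((np.polyval(p1, conc[:i]) - kappa[:i]) ** 2) +
               np.sum((np.polyval(p2, conc[i:]) - kappa[i:]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, p1, p2)
    _, (a1, b1), (a2, b2) = best
    return (b2 - b1) / (a1 - a2)          # intersection of the two fits

conc = np.linspace(0.1, 20, 400)          # mM, illustrative
kappa = np.where(conc < 8, 10 * conc, 80 + 4 * (conc - 8))
kappa += np.random.default_rng(1).normal(0, 0.5, conc.size)
print(f"estimated CMC ~ {cmc_from_curve(conc, kappa):.1f} mM")
```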

  9. Automated methods of textual content analysis and description of text structures

    CERN Document Server

    Chýla, Roman

Universal Semantic Language (USL) is a semi-formalized approach for the description of knowledge (a knowledge representation tool). The idea of USL was introduced by Vladimir Smetacek in the system called SEMAN, which was used for keyword extraction tasks in the former Information centre of the Czechoslovak Republic. However, due to the dissolution of the centre in the early 1990s, the system has been lost. This thesis reintroduces the idea of USL in a new context of quantitative content analysis. First we introduce the historical background and the problems of semantics and knowledge representation, semes, semantic fields, semantic primes and universals. The basic methodology of content analysis studies is illustrated on the example of three content analysis tools and we describe the architecture of a new system. The application was built specifically for USL discovery but it can work also in the context of classical content analysis. It contains Natural Language Processing (NLP) components and employs the algorith...

  10. VID-R and SCAN: Tools and Methods for the Automated Analysis of Visual Records.

    Science.gov (United States)

    Ekman, Paul; And Others

    The VID-R (Visual Information Display and Retrieval) system that enables computer-aided analysis of visual records is composed of a film-to-television chain, two videotape recorders with complete remote control of functions, a video-disc recorder, three high-resolution television monitors, a teletype, a PDP-8, a video and audio interface, three…

  11. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  12. Description and recognition of regular and distorted secondary structures in proteins using the automated protein structure analysis method.

    Science.gov (United States)

    Ranganathan, Sushilee; Izotov, Dmitry; Kraka, Elfi; Cremer, Dieter

    2009-08-01

The Automated Protein Structure Analysis (APSA) method, which describes the protein backbone as a smooth line in three-dimensional space and characterizes it by curvature kappa and torsion tau as a function of arc length s, was applied on 77 proteins to determine all secondary structural units via specific kappa(s) and tau(s) patterns. A total of 533 alpha-helices and 644 beta-strands were recognized by APSA, whereas DSSP gives 536 and 651 units, respectively. Kinks and distortions were quantified and the boundaries (entry and exit) of secondary structures were classified. Similarity between proteins can be easily quantified using APSA, as was demonstrated for the roll architecture of the proteins ubiquitin and spinach ferredoxin. A twenty-by-twenty comparison of all alpha domains showed that the curvature-torsion patterns generated by APSA provide an accurate and meaningful similarity measurement for secondary, super secondary, and tertiary protein structure. APSA is shown to accurately reflect the conformation of the backbone, effectively reducing three-dimensional structure information to two-dimensional representations that are easy to interpret and understand.
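
    The underlying differential geometry is standard: for a space curve r, kappa = |r' x r''| / |r'|^3 and tau = (r' x r'') . r''' / |r' x r''|^2. The sketch below evaluates both by finite differences on a discretised helix, whose analytic values are known; APSA itself works on a smoothed protein backbone, so this is only a minimal illustration.

```python
# Curvature/torsion of a discretised space curve by finite differences:
#   kappa = |r' x r''| / |r'|^3,  tau = (r' x r'') . r''' / |r' x r''|^2
# The helix test curve is illustrative; APSA smooths the backbone first.
import numpy as np

def curvature_torsion(r):
    d1 = np.gradient(r, axis=0)
    d2 = np.gradient(d1, axis=0)
    d3 = np.gradient(d2, axis=0)
    c = np.cross(d1, d2)
    cn = np.linalg.norm(c, axis=1)
    kappa = cn / np.linalg.norm(d1, axis=1) ** 3
    tau = np.einsum("ij,ij->i", c, d3) / cn ** 2
    return kappa, tau

t = np.linspace(0, 4 * np.pi, 400)
helix = np.column_stack([np.cos(t), np.sin(t), 0.3 * t])
kappa, tau = curvature_torsion(helix)
# analytic values: kappa = 1/(1 + 0.3^2) ~ 0.917, tau = 0.3/(1 + 0.3^2) ~ 0.275
print(kappa[200], tau[200])
```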

  13. Integrated Design Engineering Analysis (IDEA) Environment Automated Generation of Structured CFD Grids using Topology Methods

    Science.gov (United States)

    Kamhawi, Hilmi N.

    2012-01-01

This report documents the work performed from March 2010 to March 2012. The Integrated Design and Engineering Analysis (IDEA) environment is a collaborative, object-oriented, multidisciplinary, distributed environment built on the Adaptive Modeling Language (AML), supporting configuration design and parametric CFD grid generation. This report will focus on describing the work in the area of parametric CFD grid generation using novel concepts for defining the interaction between the mesh topology and the geometry in such a way as to separate the mesh topology from the geometric topology while maintaining the link between the mesh topology and the actual geometry.

  14. Automated Sentiment Analysis

    Science.gov (United States)

    2009-06-01

Sentiment Analysis? Deep philosophical questions could be raised about the nature of sentiment. It is not exactly an emotion – one can choose to ... and syntactic analysis easier. It also forestalls misunderstanding; sentences likely to be misclassified (because of unusual style, sarcasm, etc.) ... has no emotional significance. We focus on supervised learning for this prototype; though we can alter our program to perform unsupervised learning.

  15. A method to quantify movement activity of groups of animals using automated image analysis

    Science.gov (United States)

    Xu, Jianyu; Yu, Haizhen; Liu, Ying

    2009-07-01

Most physiological and environmental changes are capable of inducing variations in animal behavior. Behavioral parameters can be measured continuously in situ by a non-invasive and non-contact approach, and have the potential to be used in actual production settings to predict stress conditions. Most vertebrates tend to live in groups, herds, flocks, shoals, bands or packs of conspecific individuals. Under culture conditions, livestock or fish are kept in groups and interact with each other, so the aggregate behavior of the group should be studied rather than that of individuals. This paper presents a method to calculate the movement speed of a group of animals in an enclosure or a tank, denoted by body-length speed, which corresponds to group activity, using computer vision techniques. Frame sequences captured at a set time interval were subtracted in pairs after image segmentation and identification. By labeling components caused by object movement in the difference frame, the projected area caused by the movement of every object in the capture interval was calculated; this projected area was divided by the projected area of the object in the later frame to get the body-length moving distance of each object, and further the relative body-length speed. The average speed of all objects responds well to the activity of the group. The group activity of a tilapia (Oreochromis niloticus) school exposed to a high (2.65 mg/L) level of unionized ammonia (UIA) was quantified based on these methods. The high UIA condition elicited a marked increase in school activity in the first hour (P<0.05), exhibiting an avoidance reaction (trying to flee from the high UIA condition), after which activity decreased gradually.
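
    A minimal sketch of the frame-differencing idea, assuming simple thresholding for segmentation: pixels that changed between two frames, counted within each object and divided by the object's projected area, give body lengths moved per frame interval. The threshold and toy frames are illustrative, not the authors' implementation.

```python
# Frame-differencing sketch of the body-length-speed idea: changed
# pixels ("moved" projected area) counted within each object, divided
# by the object's projected area in the later frame, per unit time.
import numpy as np
from scipy import ndimage

def body_length_speed(frame_a, frame_b, dt, thresh=0.2):
    moved = np.abs(frame_b - frame_a) > thresh    # changed pixels
    objects = frame_b > thresh                    # segmented objects
    labels, n = ndimage.label(objects)
    if n == 0:
        return 0.0
    idx = np.arange(1, n + 1)
    moved_area = np.asarray(ndimage.sum(moved, labels, idx))
    object_area = np.asarray(ndimage.sum(objects, labels, idx))
    per_object = moved_area / object_area         # body lengths moved
    return per_object.mean() / dt                 # body lengths per second

a = np.zeros((64, 64)); a[10:20, 10:30] = 1.0     # one "fish"
b = np.zeros((64, 64)); b[10:20, 14:34] = 1.0     # moved 4 px right
print(body_length_speed(a, b, dt=1.0))            # ~0.2 body lengths/s
```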

  16. Automation of the proximate analysis of coals

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    A study is reported of the feasibility of using a multi-jointed general-purpose robot for the automated analysis of moisture, volatile matter, ash and total post-combustion sulfur in coal and coke. The results obtained with an automated system are compared with those of conventional manual methods. The design of the robot hand and the safety measures provided are now both fully satisfactory, and the analytic values obtained exhibit little scatter. It is concluded that the use of this robot system results in a better working environment and in considerable labour saving. Applications to other tasks are under development.

  17. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

... of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of "surface-level" monodic voices, longer-term relations between non-adjacent notes related to deeper structures, and by tracking motives on the resulting syntagmatic network. These principles are integrated into a computational framework, the MiningSuite, developed in Matlab. ... for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions ...

  18. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection.

    Directory of Open Access Journals (Sweden)

    Priya Choudhry

Full Text Available Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays.

  19. Automation of finite element methods

    CERN Document Server

    Korelc, Jože

    2016-01-01

New finite elements are needed both in research and in industrial environments for the development of virtual prediction techniques. The design and implementation of novel finite elements for specific purposes is a tedious and time-consuming task, especially for nonlinear formulations. Automating this process can speed it up considerably, since the generation of the final computer code can be accelerated by several orders of magnitude. This book provides the reader with the knowledge required to employ modern automatic tools like AceGen within solid mechanics in a successful way. It covers the range from the theoretical background and algorithmic treatments to many different applications. The book is written for advanced students in the engineering field and for researchers in educational and industrial environments.

  20. Automated analysis of complex data

    Science.gov (United States)

    Saintamant, Robert; Cohen, Paul R.

    1994-01-01

    We have examined some of the issues involved in automating exploratory data analysis, in particular the tradeoff between control and opportunism. We have proposed an opportunistic planning solution for this tradeoff, and we have implemented a prototype, Igor, to test the approach. Our experience in developing Igor was surprisingly smooth. In contrast to earlier versions that relied on rule representation, it was straightforward to increment Igor's knowledge base without causing the search space to explode. The planning representation appears to be both general and powerful, with high level strategic knowledge provided by goals and plans, and the hooks for domain-specific knowledge are provided by monitors and focusing heuristics.

  1. Automated pipelines for spectroscopic analysis

    Science.gov (United States)

    Allende Prieto, C.

    2016-09-01

The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on ever-increasing relevance, and the concept is being applied to many more areas, from targeting to analysis. In this paper, I provide a quick overview of recent, ongoing, and upcoming spectroscopic surveys, and the strategies adopted in their automated analysis pipelines.

  2. Exploratory analysis of methods for automated classification of laboratory test orders into syndromic groups in veterinary medicine.

    Directory of Open Access Journals (Sweden)

    Fernanda C Dórea

Full Text Available BACKGROUND: Recent focus on earlier detection of pathogen introduction in human and animal populations has led to the development of surveillance systems based on automated monitoring of health data. Real- or near real-time monitoring of pre-diagnostic data requires automated classification of records into syndromes--syndromic surveillance--using algorithms that incorporate medical knowledge in a reliable and efficient way, while remaining comprehensible to end users. METHODS: This paper describes the application of two machine learning methods (Naïve Bayes and Decision Trees) and rule-based methods to extract syndromic information from laboratory test requests submitted to a veterinary diagnostic laboratory. RESULTS: High performance (F1-macro = 0.9995) was achieved through the use of a rule-based syndrome classifier, based on rule induction followed by manual modification during the construction phase, which also resulted in clear interpretability of the resulting classification process. An unmodified rule induction algorithm achieved an F1-micro score of 0.979, though this fell to 0.677 when performance for individual classes was averaged in an unweighted manner (F1-macro), due to the fact that the algorithm failed to learn 3 of the 16 classes from the training set. Decision Trees showed equal interpretability to the rule-based approaches, but achieved an F1-micro score of 0.923 (falling to 0.311 when classes are given equal weight). A Naïve Bayes classifier learned all classes and achieved high performance (F1-micro = 0.994 and F1-macro = 0.955); however, the classification process is not transparent to the domain experts. CONCLUSION: The use of a manually customised rule set allowed for the development of a system for classification of laboratory tests into syndromic groups with very high performance, and high interpretability by the domain experts. Further research is required to develop internal validation rules in order to establish
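
    For readers unfamiliar with the baseline, the sketch below shows a Naïve Bayes text classifier of the kind compared in the paper, scored with the same micro- and macro-averaged F1. The toy submissions and syndrome labels are invented for illustration; the study's data, features and tuning are not reproduced here.

```python
# Sketch of a Naive Bayes baseline for classifying free-text laboratory
# submissions into syndromic groups, scored with micro-/macro-averaged
# F1 as in the paper.  Records and labels are toy data.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import f1_score

train = ["bovine abortion fetal tissue culture", "respiratory swab PCR",
         "fecal parasitology flotation", "abortion serology brucella",
         "nasal swab respiratory panel", "fecal egg count"]
y_train = ["repro", "resp", "GI", "repro", "resp", "GI"]
test = ["fetal tissue abortion workup", "respiratory PCR panel"]
y_test = ["repro", "resp"]

vec = CountVectorizer().fit(train)
clf = MultinomialNB().fit(vec.transform(train), y_train)
pred = clf.predict(vec.transform(test))
print("F1-micro:", f1_score(y_test, pred, average="micro"))
print("F1-macro:", f1_score(y_test, pred, average="macro"))
```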

  3. Power analysis in flexible automation

    Science.gov (United States)

    Titus, Nathan A.

    1992-12-01

The performance of an automation or robotic device can be measured in terms of its power efficiency. Screw theory is used to mathematically define the task instantaneously with two screws. The task wrench defines the effect of the device on its environment, and the task twist describes the motion of the device. The tasks can be separated into three task types: kinetic, manipulative, and reactive. Efficiency metrics are developed for each task type. The output power is strictly a function of the task screws, while device input power is shown to be a function of the task, the device Jacobian, and the actuator type. Expressions for input power are developed for two common types of actuators, DC servomotors and hydraulic actuators. Simple examples are used to illustrate how power analysis can be used for task/workspace planning, actuator selection, device configuration design, and redundancy resolution.
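
    The core quantity is easy to state: instantaneous output power is the reciprocal product of the task wrench (force, moment) and the task twist (angular, linear velocity), P = F·v + M·ω. A minimal numeric sketch follows, with illustrative values and an assumed component ordering.

```python
# Sketch of the screw-theory power computation: instantaneous power is
# the reciprocal product of the task wrench (force F, moment M) and the
# task twist (angular velocity w, linear velocity v):  P = F.v + M.w
# Component ordering and values are illustrative assumptions.
import numpy as np

def reciprocal_product(wrench, twist):
    F, M = wrench[:3], wrench[3:]      # force (N), moment (N m)
    w, v = twist[:3], twist[3:]        # angular (rad/s), linear (m/s)
    return F @ v + M @ w               # instantaneous power, W

wrench = np.array([10.0, 0.0, 0.0,  0.0, 0.0, 2.0])
twist  = np.array([0.0, 0.0, 0.5,   0.2, 0.0, 0.0])
print(reciprocal_product(wrench, twist), "W")   # 10*0.2 + 2*0.5 = 3.0
```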

  4. Automated image analysis techniques for cardiovascular magnetic resonance imaging

    NARCIS (Netherlands)

    Geest, Robertus Jacobus van der

    2011-01-01

    The introductory chapter provides an overview of various aspects related to quantitative analysis of cardiovascular MR (CMR) imaging studies. Subsequently, the thesis describes several automated methods for quantitative assessment of left ventricular function from CMR imaging studies. Several novel

  5. Automated liquid chromatography-tandem mass spectrometry method for the analysis of firocoxib in urine and plasma from horse and dog.

    Science.gov (United States)

    Letendre, Laura; Kvaternick, Valerie; Tecle, Berhane; Fischer, James

    2007-06-15

A rugged, sensitive and efficient liquid chromatography-tandem mass spectrometry method was developed and validated for the quantitative analysis of firocoxib in urine from 5 to 3000 ng/mL and in plasma from 1 to 3000 ng/mL. The method requires 200 microL of either plasma or urine and includes sample preparation in 96-well solid phase extraction (SPE) plates using a BIOMEK 2000 Laboratory Automated Workstation. Chromatographic separation of firocoxib from matrix interferences was achieved using isocratic reversed phase chromatography on a PHENOMENEX LUNA Phenyl-Hexyl column. The mobile phase was 45% acetonitrile and 55% of a 2 mM ammonium formate buffer. The method was accurate (88-107%) and precise (CV < ...). Recoveries of >93% were achieved and ionization efficiencies (due to matrix effects) were >72%. Extensive stability and ruggedness testing was also performed; therefore, the method can be used for pharmacokinetic studies as well as drug monitoring and screening. The data presented here is the first LC-MS/MS method for the quantitation of firocoxib in plasma (LLOQ of 1 ng/mL), a 25-fold improvement in sensitivity over the HPLC-UV method, and the first quantitative method for firocoxib in urine (LLOQ of 5 ng/mL). Additionally, the sample preparation process has been automated to improve efficiency.

  6. Automated Methods of Corrosion Measurements

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov

    1997-01-01

... electrochemical measurements as well as elemental analysis look very promising for elucidating corrosion reaction mechanisms. The study of initial surface reactions at the atomic or submicron level is becoming an important field of research in the understanding of corrosion processes. At present, mainly two scanning microscope techniques are employed for investigating corrosion processes, usually in situ: in situ scanning tunneling microscopy (in situ STM) and in situ scanning force microscopy (in situ AFM). It is these techniques to which attention is directed here.

  7. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  8. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3-/4- Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6]3-/4- process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.

  9. NEW TECHNIQUES USED IN AUTOMATED TEXT ANALYSIS

    Directory of Open Access Journals (Sweden)

M. Istrate

    2010-12-01

Full Text Available Automated analysis of natural language texts is one of the most important knowledge discovery tasks for any organization. According to Gartner Group, almost 90% of the knowledge available at an organization today is dispersed throughout piles of documents buried within unstructured text. Analyzing huge volumes of textual information is often a prerequisite for making informed and correct business decisions. Traditional analysis methods based on statistics fail to help in processing unstructured texts, and society is in search of new technologies for text analysis. There exist a variety of approaches to the analysis of natural language texts, but most of them do not provide results that could be successfully applied in practice. This article concentrates on recent ideas and practical implementations in this area.

  10. Automated Pipelines for Spectroscopic Analysis

    CERN Document Server

    Prieto, Carlos Allende

    2016-01-01

The Gaia mission will have a profound impact on our understanding of the structure and dynamics of the Milky Way. Gaia is providing an exhaustive census of stellar parallaxes, proper motions, positions, colors and radial velocities, but also leaves some glaring holes in an otherwise complete data set. The radial velocities measured with the on-board high-resolution spectrograph will only reach some 10% of the full sample of stars with astrometry and photometry from the mission, and detailed chemical information will be obtained for less than 1%. Teams all over the world are organizing large-scale projects to provide complementary radial velocities and chemistry, since this can now be done very efficiently from the ground thanks to large and mid-size telescopes with a wide field-of-view and multi-object spectrographs. As a result, automated data processing is taking on ever-increasing relevance, and the concept is being applied to many more areas, from targeting to analysis. In this paper, I provide a quick overvie...

  11. Predictions for rapid methods and automation in food microbiology.

    Science.gov (United States)

    Fung, Daniel Y C

    2002-01-01

    A discussion is presented on the present status of rapid methods and automation in microbiology. Predictions are also presented for development in the following areas: viable cell counts; real-time monitoring of hygiene; polymerase chain reaction, ribotyping, and genetic tests in food laboratories; automated enzyme-linked immunosorbent assay and immunotests; rapid dipstick technology; biosensors for Hazard Analysis Critical Control Point programs; instant detection of target pathogens by computer-generated matrix; effective separation and concentration for rapid identification of target cells; microbiological alert systems in food packages; and rapid alert kits for detecting pathogens at home.

  12. Automation and robotics for genetic analysis.

    Science.gov (United States)

    Smith, J H; Madan, D; Salhaney, J; Engelstein, M

    2001-05-01

    This guide to laboratory robotics covers a wide variety of methods amenable to automation including mapping, genotyping, barcoding and data handling, template preparation, reaction setup, colony and plaque picking, and more.

  13. A Systematic, Automated Network Planning Method

    DEFF Research Database (Denmark)

    Holm, Jens Åge; Pedersen, Jens Myrup

    2006-01-01

This paper describes a case study conducted to evaluate the viability of a systematic, automated network planning method. The motivation for developing the network planning method was that many data networks are planned in an ad hoc manner, with no assurance of quality of the solution with respect to consistency and long-term characteristics. The developed method gives significant improvements on these parameters. The case study was conducted as a comparison between an existing network where the traffic was known and a proposed network designed by the developed method. It turned out that the proposed network performed better than the existing network with regard to the performance measurements used, which reflected how well the traffic was routed in the networks and the cost of establishing the networks. Challenges that need to be solved before the developed method can be used to design network...

  14. Sensitivity Analysis of Automated Ice Edge Detection

    Science.gov (United States)

    Moen, Mari-Ann N.; Isaksem, Hugo; Debien, Annekatrien

    2016-08-01

The importance of highly detailed and time-sensitive ice charts has increased with the increasing interest in the Arctic for oil and gas, tourism, and shipping. Manual ice charts are prepared by the national ice services of several Arctic countries. Methods are also being developed to automate this task. Kongsberg Satellite Services uses a method that detects ice edges within 15 minutes after image acquisition. This paper describes a sensitivity analysis of the ice edge, assessing which ice concentration class from the manual ice charts it can be compared to. The ice edge is derived using the Ice Tracking from SAR Images (ITSARI) algorithm. RADARSAT-2 images of February 2011 are used, both for the manual ice charts and the automatic ice edges. The results show that the KSAT ice edge lies within ice concentration classes with very low ice concentration or open water.

  15. Management issues in automated audit analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jackson, K.A.; Hochberg, J.G.; Wilhelmy, S.K.; McClary, J.F.; Christoph, G.G.

    1994-03-01

This paper discusses management issues associated with the design and implementation of an automated audit analysis system that we use to detect security events. It gives the viewpoint of a team directly responsible for developing and managing such a system. We use Los Alamos National Laboratory's Network Anomaly Detection and Intrusion Reporter (NADIR) as a case in point. We examine issues encountered at Los Alamos, detail our solutions to them, and where appropriate suggest general solutions. After providing an introduction to NADIR, we explore four general management issues: cost-benefit questions, privacy considerations, legal issues, and system integrity. Our experiences are of general interest both to security professionals and to anyone who may wish to implement a similar system. While NADIR investigates security events, the methods used and the management issues are potentially applicable to a broad range of complex systems. These include those used to audit credit card transactions, medical care payments, and procurement systems.

  16. Automated 96-well solid phase extraction and hydrophilic interaction liquid chromatography-tandem mass spectrometric method for the analysis of cetirizine (ZYRTEC) in human plasma--with emphasis on method ruggedness.

    Science.gov (United States)

    Song, Qi; Junga, Heiko; Tang, Yong; Li, Austin C; Addison, Tom; McCort-Tipton, Melanie; Beato, Brian; Naidong, Weng

    2005-01-05

A high-throughput bioanalytical method based on automated sample transfer, automated solid phase extraction, and hydrophilic interaction liquid chromatography-tandem mass spectrometry (HILIC-MS/MS) analysis has been developed for the determination of cetirizine, a selective H(1)-receptor antagonist. Deuterated cetirizine (cetirizine-d(8)) was synthesized as described and was used as the internal standard. Samples were transferred into 96-well plates using an automated sample handling system. Automated solid phase extraction was carried out using a 96-channel programmable liquid-handling workstation. A solid phase extraction 96-well plate on polymer sorbent (Strata X) was used to extract the analyte. The extracted samples were injected onto a Betasil silica column (50 mm x 3 mm, 5 microm) using a mobile phase of acetonitrile-water-acetic acid-trifluoroacetic acid (93:7:1:0.025, v/v/v/v) at a flow rate of 0.5 ml/min. The chromatographic run time is 2.0 min per injection, with retention times of cetirizine and cetirizine-d(8) both at 1.1 min. The system consisted of a Shimadzu HPLC system and a PE Sciex API 3000 or API 4000 tandem mass spectrometer with (+) ESI. The method has been validated over the concentration range of 1.00-1000 ng/ml cetirizine in human plasma, based on a 0.10-ml sample size. The inter-day precision and accuracy of the quality control (QC) samples demonstrated <3.0% relative standard deviation (R.S.D.) and <6.0% relative error (RE). Stability of cetirizine in stock solution, in plasma, and in reconstitution solution was established. The absolute extraction recovery was 85.8%, 84.5%, and 88.0% at 3, 40, and 800 ng/ml, respectively. The recovery for the internal standard was 84.1%. No adverse matrix effects were noticed for this assay. The automation of the sample preparation steps not only increased the analysis throughput, but also increased method ruggedness. The use of a stable isotope-labeled internal standard further improved the method ruggedness.
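
    The isotope-dilution quantitation scheme behind such methods is compact: calibrate the analyte/internal-standard peak-area ratio against concentration, then back-calculate unknowns from their measured ratios. A minimal sketch with invented peak areas follows; real assays typically add weighted regression (e.g., 1/x^2), which is omitted here for brevity.

```python
# Sketch of isotope-dilution quantitation as used with a deuterated
# internal standard: linear calibration of the analyte/IS peak-area
# ratio vs. concentration, then back-calculation.  Areas are invented.
import numpy as np

cal_conc = np.array([1, 5, 25, 100, 400, 1000.0])   # ng/mL
analyte_area = np.array([980, 4890, 24400, 99100, 395000, 1.002e6])
is_area = np.full(6, 5.0e5)                          # internal standard

ratio = analyte_area / is_area
slope, intercept = np.polyfit(cal_conc, ratio, 1)

def back_calc(a_area, i_area):
    return ((a_area / i_area) - intercept) / slope   # ng/mL

print(f"unknown ~ {back_calc(3.1e5, 4.9e5):.0f} ng/mL")
```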

  17. Analysis of an automated background correction method for cardiovascular MR phase contrast imaging in children and young adults

    Energy Technology Data Exchange (ETDEWEB)

    Rigsby, Cynthia K.; Hilpipre, Nicholas; Boylan, Emma E.; Popescu, Andrada R.; Deng, Jie [Ann and Robert H. Lurie Children' s Hospital of Chicago, Department of Medical Imaging, Chicago, IL (United States); McNeal, Gary R. [Siemens Medical Solutions USA Inc., Customer Solutions Group, Cardiovascular MR R and D, Chicago, IL (United States); Zhang, Gang [Ann and Robert H. Lurie Children' s Hospital of Chicago Research Center, Biostatistics Research Core, Chicago, IL (United States); Choi, Grace [Ann and Robert H. Lurie Children' s Hospital of Chicago, Department of Pediatrics, Chicago, IL (United States); Greiser, Andreas [Siemens AG Healthcare Sector, Erlangen (Germany)

    2014-03-15

Phase contrast magnetic resonance imaging (MRI) is a powerful tool for evaluating vessel blood flow. Inherent errors in acquisition, such as phase offset, eddy currents and gradient field effects, can cause significant inaccuracies in flow parameters. These errors can be rectified with the use of background correction software. To evaluate the performance of an automated phase contrast MRI background phase correction method in children and young adults undergoing cardiac MR imaging. We conducted a retrospective review of patients undergoing routine clinical cardiac MRI including phase contrast MRI for flow quantification in the aorta (Ao) and main pulmonary artery (MPA). When phase contrast MRI of the right and left pulmonary arteries was also performed, these data were included. We excluded patients with known shunts and metallic implants causing visible MRI artifact and those with more than mild to moderate aortic or pulmonary stenosis. Phase contrast MRI of the Ao, mid MPA, proximal right pulmonary artery (RPA) and left pulmonary artery (LPA) using 2-D gradient echo Fast Low Angle SHot (FLASH) imaging was acquired during normal respiration with retrospective cardiac gating. Standard phase image reconstruction and the automatic spatially dependent background-phase-corrected reconstruction were performed on each phase contrast MRI dataset. Non-background-corrected and background-phase-corrected net flow, forward flow, regurgitant volume, regurgitant fraction, and vessel cardiac output were recorded for each vessel. We compared standard non-background-corrected and background-phase-corrected mean flow values for the Ao and MPA. The ratio of pulmonary to systemic blood flow (Qp:Qs) was calculated for the standard non-background and background-phase-corrected data, and these values were compared to each other and assessed for proximity to 1. In a subset of patients who also underwent phase contrast MRI of the MPA, RPA and LPA, a comparison was made between standard non...

  18. Cooling method with automated seasonal freeze protection

    Energy Technology Data Exchange (ETDEWEB)

    Cambell, Levi; Chu, Richard; David, Milnes; Ellsworth, Jr, Michael; Iyengar, Madhusudan; Simons, Robert; Singh, Prabjit; Zhang, Jing

    2016-05-31

    An automated multi-fluid cooling method is provided for cooling an electronic component(s). The method includes obtaining a coolant loop, and providing a coolant tank, multiple valves, and a controller. The coolant loop is at least partially exposed to outdoor ambient air temperature(s) during normal operation, and the coolant tank includes first and second reservoirs containing first and second fluids, respectively. The first fluid freezes at a lower temperature than the second, the second fluid has superior cooling properties compared with the first, and the two fluids are soluble. The multiple valves are controllable to selectively couple the first or second fluid into the coolant in the coolant loop, wherein the coolant includes at least the second fluid. The controller automatically controls the valves to vary first fluid concentration level in the coolant loop based on historical, current, or anticipated outdoor air ambient temperature(s) for a time of year.

  19. A high-performance, safer and semi-automated approach for the δ18O analysis of diatom silica and new methods for removing exchangeable oxygen.

    Science.gov (United States)

    Chapligin, B; Meyer, H; Friedrichsen, H; Marent, A; Sohns, E; Hubberten, H-W

    2010-09-15

The determination of the oxygen isotope composition of diatom silica in sediment cores is important for paleoclimate reconstruction, especially in non-carbonate sediments, where no other bioindicators such as ostracods and foraminifera are available. Since most currently available analytical techniques are time-consuming and labour-intensive, we have developed a new, safer, faster and semi-automated online approach for measuring oxygen isotopes in biogenic silica. Improvements include software that controls the measurement procedures and a video camera that remotely records the reaction of the samples under BrF5 with a CO2 laser. Maximum safety is guaranteed as the laser-fluorination unit is arranged under a fume hood in a separate room from the operator. A new routine has been developed for removing the exchangeable hydrous components within biogenic silica using ramp degassing. The sample plate is heated up to 1100 degrees C and cooled down to 400 degrees C in approximately 7 h under a flow of He gas (the inert Gas Flow Dehydration method--iGFD) before isotope analysis. Two quartz and two biogenic silica samples (approximately 1.5 mg) of known isotope composition were tested. The isotopic compositions were reproducible within an acceptable error; quartz samples gave a mean standard deviation of <0.15 per thousand (1 sigma) and biogenic silica <0.25 per thousand (1 sigma) for samples down to approximately 0.3 mg. The semi-automated fluorination line is the fastest method available at present and enables a throughput of 74 samples/week.

  20. A semi-automated method for the detection of seismic anisotropy at depth via receiver function analysis

    Science.gov (United States)

    Licciardi, A.; Piana Agostinetti, N.

    2016-06-01

Information about seismic anisotropy is embedded in the variation of the amplitude of the Ps pulses as a function of the azimuth, on both the Radial and the Transverse components of teleseismic receiver functions (RF). We develop a semi-automatic method to constrain the presence and the depth of anisotropic layers beneath a single seismic broad-band station. An algorithm is specifically designed to avoid trial and error methods and subjective crustal parametrizations in RF inversions, providing a suitable tool for large-size data set analysis. The algorithm couples together information extracted from a 1-D VS profile and from a harmonic decomposition analysis of the RF data set. This information is used to determine the number of anisotropic layers and their approximate position at depth, which, in turn, can be used to, for example, narrow the search boundaries for layer thickness and S-wave velocity in a subsequent parameter space search. Here, the output of the algorithm is used to invert an RF data set by means of the Neighbourhood Algorithm (NA). To test our methodology, we apply the algorithm to both synthetic and observed data. We make use of synthetic RF with correlated Gaussian noise to investigate the resolution power for multiple and thin (1-3 km) anisotropic layers in the crust. The algorithm successfully identifies the number and position of anisotropic layers at depth prior to the NA inversion step. In the NA inversion, the strength of anisotropy and the orientation of the symmetry axis are correctly retrieved. Then, the method is applied to field measurements from station BUDO in the Tibetan Plateau. Two consecutive layers of anisotropy are automatically identified with our method in the first 25-30 km of the crust. The data are then inverted with the retrieved parametrization. The direction of the anisotropic axis in the uppermost layer correlates well with the orientation of the major planar structure in the area. The deeper anisotropic layer is associated with
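
    The harmonic decomposition step can be pictured as a small least-squares problem at each time sample: model RF amplitude versus event backazimuth as a constant plus cos/sin terms in phi and 2*phi, and inspect the energy on the k = 1, 2 terms. The sketch below recovers a synthetic 2*phi signal; it illustrates the idea and is not the authors' code.

```python
# Sketch of harmonic decomposition at one RF time sample: fit
#   A(phi) = a0 + a1 cos(phi) + b1 sin(phi) + a2 cos(2 phi) + b2 sin(2 phi)
# by least squares; energy on the k = 1, 2 terms flags anisotropy or
# dip.  The synthetic amplitudes are illustrative.
import numpy as np

rng = np.random.default_rng(2)
phi = np.deg2rad(rng.uniform(0, 360, 60))        # event backazimuths
amp = 0.2 + 0.1 * np.cos(2 * phi) + rng.normal(0, 0.02, phi.size)

G = np.column_stack([np.ones_like(phi),
                     np.cos(phi), np.sin(phi),
                     np.cos(2 * phi), np.sin(2 * phi)])
coef, *_ = np.linalg.lstsq(G, amp, rcond=None)
labels = ["a0", "a1", "b1", "a2", "b2"]
print(dict(zip(labels, np.round(coef, 3))))      # a2 ~ 0.1 recovered
```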

  1. Proximate analysis by automated thermogravimetry

    Energy Technology Data Exchange (ETDEWEB)

    Elder, J.P.

    1983-05-01

    A study has been made of the use of the Perkin-Elmer thermogravimetric instrument TGS-2, under the control of the System 4 microprocessor for the automatic proximate analysis of solid fossil fuels and related matter. The programs developed are simple to operate, and do not require detailed temperature calibration of the instrumental system. They have been tested with coals of varying rank, biomass samples and Devonian oil shales all of which were of special importance to the State of Kentucky. Precise, accurate data conforming to ASTM specifications were obtained. The simplicity of the technique suggests that it may complement the classical ASTM method and could be used when this latter procedure cannot be employed. However, its adoption as a standardized method must await the development of statistical data resulting from interlaboratory testing on a variety of fossil fuels. (9 refs.)

  2. Automated Technology for Verificiation and Analysis

    DEFF Research Database (Denmark)

This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure...

  3. Development of automated conjunctival hyperemia analysis software.

    Science.gov (United States)

    Sumi, Tamaki; Yoneda, Tsuyoshi; Fukuda, Ken; Hoshikawa, Yasuhiro; Kobayashi, Masahiko; Yanagi, Masahide; Kiuchi, Yoshiaki; Yasumitsu-Lovell, Kahoko; Fukushima, Atsuki

    2013-11-01

    Conjunctival hyperemia is observed in a variety of ocular inflammatory conditions. The evaluation of hyperemia is indispensable for the treatment of patients with ocular inflammation. However, the major methods currently available for evaluation are based on nonquantitative and subjective methods. Therefore, we developed novel software to evaluate bulbar hyperemia quantitatively and objectively. First, we investigated whether the histamine-induced hyperemia of guinea pigs could be quantified by image analysis. Bulbar conjunctival images were taken by means of a digital camera, followed by the binarization of the images and the selection of regions of interest (ROIs) for evaluation. The ROIs were evaluated by counting the number of absolute pixel values. Pixel values peaked significantly 1 minute after histamine challenge was performed and were still increased after 5 minutes. Second, we applied the same method to antigen (ovalbumin)-induced hyperemia of sensitized guinea pigs, acquiring similar results except for the substantial upregulation in the first 5 minutes after challenge. Finally, we analyzed human bulbar hyperemia using the new software we developed especially for human usage. The new software allows the automatic calculation of pixel values once the ROIs have been selected. In our clinical trials, the percentage of blood vessel coverage of ROIs was significantly higher in the images of hyperemia caused by allergic conjunctival diseases and hyperemia induced by Bimatoprost, compared with those of healthy volunteers. We propose that this newly developed automated hyperemia analysis software will be an objective clinical tool for the evaluation of ocular hyperemia.
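
    A minimal sketch of the pixel-counting idea, assuming a simple redness criterion rather than the authors' algorithm: binarise an RGB region of interest into vessel and non-vessel pixels and report percent coverage. The threshold and the synthetic ROI are illustrative.

```python
# Sketch of a pixel-based hyperemia reading: classify pixels whose
# red/green ratio exceeds a threshold as vessel, then report percent
# vessel coverage of the ROI.  Criterion and threshold are assumptions.
import numpy as np

def vessel_coverage(roi_rgb, redness_thresh=1.25):
    r = roi_rgb[..., 0].astype(float)
    g = roi_rgb[..., 1].astype(float) + 1e-6   # avoid divide-by-zero
    vessel = (r / g) > redness_thresh          # crude redness index
    return 100.0 * vessel.mean()

rng = np.random.default_rng(3)
roi = rng.integers(150, 180, size=(100, 100, 3)).astype(np.uint8)
roi[40:45, :, 0] = 240                          # a red "vessel" band
roi[40:45, :, 1:] = 90
print(f"{vessel_coverage(roi):.1f}% of ROI classified as vessel")
```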

  4. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings

    Science.gov (United States)

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill

    2014-01-01

    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  5. Automated Loads Analysis System (ATLAS)

    Science.gov (United States)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for the payload math models on a specific Shuttle Transport System (STS) flight using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling of payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  6. Assessment of the relative error in sessile drop method automation task

    OpenAIRE

    Levitskaya T.О.

    2015-01-01

Assessment of the relative error in the sessile drop method automation. Further development of the sessile drop method is directly related to the development of new techniques and specially developed algorithms enabling automatic computer calculation of surface properties. Improvements cover the mathematical apparatus of the sessile drop method, transformation of the drop contour equation to a form suitable for computation, automation of the drop surface calculation method, and analysis of the relative errors in the calculation...

  7. Rapid, automated online SPE-LC-QTRAP-MS/MS method for the simultaneous analysis of 14 phthalate metabolites and 5 bisphenol analogues in human urine.

    Science.gov (United States)

    Heffernan, A L; Thompson, K; Eaglesham, G; Vijayasarathy, S; Mueller, J F; Sly, P D; Gomez, M J

    2016-05-01

Phthalates and bisphenol A (BPA) have received special attention in recent years due to their frequent use in consumer products and potential for adverse effects on human health. BPA is being replaced with a number of alternatives, including bisphenol S, bisphenol B, bisphenol F and bisphenol AF. These bisphenol analogues have similar potential for adverse health effects, but studies on human exposure are limited. Accurate measurement of multiple contaminants is important for estimating exposure. This paper describes a sensitive and automated method for the simultaneous determination of 14 phthalate metabolites, BPA and four bisphenol analogues in urine using online solid phase extraction coupled with high-performance liquid chromatography/tandem mass spectrometry using a hybrid triple-quadrupole linear ion trap mass spectrometer (LC-QTRAP-MS/MS), requiring very little sample volume (50 µL). Quantification was performed under selected reaction monitoring (SRM) mode with negative electrospray ionization. The use of SRM combined with an enhanced product ion scan within the same analysis was examined. Unequivocal identification was provided by the acquisition of three SRM transitions per compound and isotope dilution. The analytical performance of the method was evaluated in synthetic and human urine. Linearity of response over three orders of magnitude was demonstrated for all of the compounds (R(2)>0.99), with method detection limits of 0.01-0.5 ng/mL and limits of reporting of 0.07-3.1 ng/mL. Accuracy ranged from 93% to 113%, and inter- and intra-day precision were < ... bisphenols, with median concentrations ranging from 0.3 ng/mL (bisphenol S) to 18.5 ng/mL (monoethyl phthalate).

  8. ENHANCEMENT OF METHODICAL BACKGROUND FOR AUTOMATION WITH ENTERPRISE SPECIFICATION

    OpenAIRE

    Pisarchuk, O.; Uvarova, V.

    2010-01-01

    The article covers the key methodological approaches to building an automated accounting information system tailored to the specifics of an enterprise, using the program product «1C:Accounting» as an example. It shows the advantages and disadvantages of each of the key methodological approaches.

  9. A Modular Approach for Automating Video Analysis

    OpenAIRE

    Nadarajan, Gayathri; Renouf, Arnaud

    2007-01-01

    International audience; Automating the steps involved in video processing has yet to be tackled with much success by vision developers and knowledge engineers. This is due to the difficulty in formulating vision problems and their solutions in a generalised manner. In this collaborative work, we introduce a modular approach that utilises ontologies to capture the goals, domain description and capabilities for performing video analysis. This modularisation is tested on real-world videos from an...

  10. Techniques for Automated Performance Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Marcus, Ryan C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-09-02

    The performance of a particular HPC code depends on a multitude of variables, including compiler selection, optimization flags, OpenMP pool size, file system load, memory usage, MPI configuration, etc. As a result of this complexity, current predictive models have limited applicability, especially at scale. We present a formulation of scientific codes, nodes, and clusters that reduces complex performance analysis to well-known mathematical techniques. Building accurate predictive models and enhancing our understanding of scientific codes at scale is an important step towards exascale computing.

  11. Failure modes and effects analysis automation

    Science.gov (United States)

    Kamhieh, Cynthia H.; Cutts, Dannie E.; Purves, R. Byron

    1988-01-01

    A failure modes and effects analysis (FMEA) assistant was implemented as a knowledge based system and will be used during design of the Space Station to aid engineers in performing the complex task of tracking failures throughout the entire design effort. The three major directions in which automation was pursued were the clerical components of the FMEA process, the knowledge acquisition aspects of FMEA, and the failure propagation/analysis portions of the FMEA task. The system is accessible to design, safety, and reliability engineers at single-user workstations and, although not designed to replace conventional FMEA, it is expected to decrease the time required to perform the analysis by many man-years.

  12. US Environmental Protection Agency Method 314.1, an automated sample preconcentration/matrix elimination suppressed conductivity method for the analysis of trace levels (0.50 microg/L) of perchlorate in drinking water.

    Science.gov (United States)

    Wagner, Herbert P; Pepich, B V; Pohl, C; Later, D; Joyce, R; Srinivasan, K; Thomas, D; Woodruff, A; Deborba, B; Munch, D J

    2006-06-16

    Since 1997 there has been increasing interest in the development of analytical methods for the analysis of perchlorate. The US Environmental Protection Agency (EPA) Method 314.0, which was used during the first Unregulated Contaminant Monitoring Regulation (UCMR) cycle, supports a method reporting limit (MRL) of 4.0 microg/L. The non-selective nature of conductivity detection, combined with very high ionic strength matrices, can create conditions that make the determination of perchlorate difficult. The objective of this work was to develop an automated, suppressed conductivity method with improved sensitivity for use in the second UCMR cycle. The new method, EPA Method 314.1, uses a 35 mm x 4 mm cryptand concentrator column in the sample loop position to concentrate perchlorate from a 2 mL sample volume, which is subsequently rinsed with 10 mM NaOH to remove interfering anions. The cryptand concentrator column is combined with a primary AS16 analytical column and a confirmation AS20 analytical column. Unique characteristics of the cryptand column allow perchlorate to be desorbed from the cryptand trap and refocused on the head of the guard column for subsequent separation and analysis. EPA Method 314.1 has a perchlorate lowest concentration minimum reporting level (LCMRL) of 0.13 microg/L in both drinking water and laboratory synthetic sample matrices (LSSM) containing up to 1,000 microg/L each of chloride, bicarbonate and sulfate.

  13. Flux-P: Automating Metabolic Flux Analysis

    Directory of Open Access Journals (Sweden)

    Birgitta E. Ebert

    2012-11-01

    Full Text Available Quantitative knowledge of intracellular fluxes in metabolic networks is invaluable for inferring metabolic system behavior and the design principles of biological systems. However, intracellular reaction rates often cannot be measured directly and instead have to be estimated; for instance, via 13C-based metabolic flux analysis, a model-based interpretation of stable carbon isotope patterns in intermediates of metabolism. Existing software such as FiatFlux, OpenFLUX or 13CFLUX supports experts in this complex analysis, but requires several steps that have to be carried out manually, hence restricting the use of this software for data interpretation to a rather small number of experiments. In this paper, we present Flux-P as an approach to automate and standardize 13C-based metabolic flux analysis, using the Bio-jETI workflow framework. Exemplarily based on the FiatFlux software, it demonstrates how services can be created that carry out the different analysis steps autonomously and how these can subsequently be assembled into software workflows that perform automated, high-throughput intracellular flux analysis of high quality and reproducibility. Besides significant acceleration and standardization of the data analysis, the agile workflow-based realization supports flexible changes of the analysis workflows on the user level, making it easy to perform custom analyses.
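
    Flux-P itself is built on the Bio-jETI workflow framework around FiatFlux services; the sketch below only illustrates the underlying pattern — wrapping each analysis step as a function and chaining the steps into an automated batch workflow — using placeholder step names and results.

```python
from typing import Callable, Dict, List

Step = Callable[[Dict], Dict]

def load_ms_data(ctx: Dict) -> Dict:
    ctx["raw"] = f"spectra from {ctx['sample']}"      # placeholder I/O
    return ctx

def fit_flux_ratios(ctx: Dict) -> Dict:
    ctx["ratios"] = {"EMP/PPP": 0.8}                   # placeholder result
    return ctx

def estimate_net_fluxes(ctx: Dict) -> Dict:
    ctx["fluxes"] = {"glucose_uptake": 10.0}           # placeholder result
    return ctx

def run_workflow(sample: str, steps: List[Step]) -> Dict:
    """Run each analysis step in order, passing a shared context dict."""
    ctx: Dict = {"sample": sample}
    for step in steps:
        ctx = step(ctx)
    return ctx

# High-throughput batch analysis: the same workflow applied to many datasets.
for sample in ["wt_rep1", "wt_rep2", "mutant_rep1"]:
    result = run_workflow(sample, [load_ms_data, fit_flux_ratios, estimate_net_fluxes])
    print(sample, result["fluxes"])
```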

  14. Streamlining and automation of radioanalytical methods at a commercial laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, J.T.; Dillard, J.W. [IT Corp., Knoxville, TN (United States)

    1993-12-31

    Through the careful planning and design of laboratory facilities and incorporation of modern instrumentation and robotics systems, properly trained and competent laboratory associates can efficiently and safely handle radioactive and mixed waste samples. This paper addresses the potential improvements radiochemistry and mixed waste laboratories can achieve utilizing robotics for automated sample analysis. Several examples of automated systems for sample preparation and analysis will be discussed.

  15. Automated Cache Performance Analysis And Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Mohror, Kathryn [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-23

    While there is no lack of performance counter tools for coarse-grained measurement of cache activity, there is a critical lack of tools for relating data layout to cache behavior to application performance. Generally, any nontrivial optimizations are either not done at all, or are done "by hand", requiring significant time and expertise. To the best of our knowledge no tool available to users measures the latency of memory reference instructions for particular addresses and makes this information available to users in an easy-to-use and intuitive way. In this project, we worked to enable the Open|SpeedShop performance analysis tool to gather memory reference latency information for specific instructions and memory addresses, and to gather and display this information in an easy-to-use and intuitive way to aid performance analysts in identifying problematic data structures in their codes. This tool was primarily designed for use in the supercomputer domain as well as grid, cluster, cloud-based parallel e-commerce, and engineering systems and middleware. Ultimately, we envision a tool to automate optimization of application cache layout and utilization in the Open|SpeedShop performance analysis tool. To commercialize this software, we worked to develop core capabilities for gathering enhanced memory usage performance data from applications and create and apply novel methods for automatic data structure layout optimizations, tailoring the overall approach to support existing supercomputer and cluster programming models and constraints. In this Phase I project, we focused on infrastructure necessary to gather performance data and present it in an intuitive way to users. With the advent of enhanced Precise Event-Based Sampling (PEBS) counters on recent Intel processor architectures and equivalent technology on AMD processors, we are now in a position to access memory reference information for particular addresses. Prior to the introduction of PEBS counters...

  16. Automating Risk Analysis of Software Design Models

    Directory of Open Access Journals (Sweden)

    Maxime Frydman

    2014-01-01

    Full Text Available The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to automating threat modeling, which reduces the need for the costly human expertise that risk analysis requires in secure development methodologies. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling, two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
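
    The paper's identification trees are described here only at a high level, so the following is a toy sketch of the general idea — walking a tree of predicates over data-flow-diagram elements and collecting the threats that apply — with hypothetical element kinds and threat labels, not AutSEC's actual rule set.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Optional

@dataclass
class DFDElement:
    name: str
    kind: str                      # e.g. "process", "data_store", "data_flow"
    crosses_trust_boundary: bool = False

@dataclass
class IdentificationNode:
    """One node of a (hypothetical) identification tree: a predicate plus
    the threat reported when the predicate holds."""
    predicate: Callable[[DFDElement], bool]
    threat: Optional[str] = None
    children: List["IdentificationNode"] = field(default_factory=list)

def identify(node: IdentificationNode, elem: DFDElement, found: List[str]) -> None:
    # Descend only while predicates keep matching; report threats on the way.
    if not node.predicate(elem):
        return
    if node.threat:
        found.append(f"{elem.name}: {node.threat}")
    for child in node.children:
        identify(child, elem, found)

tree = IdentificationNode(
    predicate=lambda e: e.kind == "data_flow",
    children=[IdentificationNode(
        predicate=lambda e: e.crosses_trust_boundary,
        threat="tampering/eavesdropping on untrusted channel")])

threats: List[str] = []
for elem in [DFDElement("credentials", "data_flow", crosses_trust_boundary=True),
             DFDElement("user db", "data_store")]:
    identify(tree, elem, threats)
print(threats)
```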

  17. Automated Methods in Chiral Perturbation Theory on the Lattice

    CERN Document Server

    Borasoy, B; Krebs, H; Lewis, R; Borasoy, Bugra; Hippel, Georg M. von; Krebs, Hermann; Lewis, Randy

    2005-01-01

    We present a method to automatically derive the Feynman rules for mesonic chiral perturbation theory with a lattice regulator. The Feynman rules can be output both in a human-readable format and in a form suitable for an automated numerical evaluation of lattice Feynman diagrams. The automated method significantly simplifies working with improved or extended actions. Some applications to the study of finite-volume effects will be presented.

  18. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.
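
    The proposed extended regular-sequence gridding is specific to the paper; as a generic stand-in, automatic gridding is often demonstrated with projection profiles, where gaps between spot rows and columns appear as valleys in the row/column intensity sums. A minimal sketch on a synthetic spot array:

```python
import numpy as np

def grid_lines(image: np.ndarray, axis: int, n_cells: int) -> np.ndarray:
    """Estimate grid boundaries from a projection profile: spot rows/columns
    show up as peaks, gaps between them as valleys."""
    profile = image.sum(axis=axis).astype(float)
    # Valleys below a fraction of the mean are candidate gaps between spots.
    gaps = np.where(profile < 0.3 * profile.mean())[0]
    # Collapse runs of consecutive gap pixels to single boundary positions.
    runs = np.split(gaps, np.where(np.diff(gaps) > 1)[0] + 1)
    boundaries = [run.mean() for run in runs if run.size > 0]
    return np.array(boundaries[:n_cells + 1])

# Synthetic 4x4 spot array for demonstration.
img = np.zeros((80, 80))
for cy in range(10, 80, 20):
    for cx in range(10, 80, 20):
        img[cy - 3:cy + 3, cx - 3:cx + 3] = 255.0
print("row boundaries:", grid_lines(img, axis=1, n_cells=4))
print("col boundaries:", grid_lines(img, axis=0, n_cells=4))
```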

  19. Multispectral tissue analysis and classification towards enabling automated robotic surgery

    Science.gov (United States)

    Triana, Brian; Cha, Jaepyeong; Shademan, Azad; Krieger, Axel; Kang, Jin U.; Kim, Peter C. W.

    2014-02-01

    Accurate optical characterization of different tissue types is an important tool for potentially guiding surgeons and enabling automated robotic surgery. Multispectral imaging and analysis have been used in the literature to detect spectral variations in tissue reflectance that may be visible to the naked eye. Using this technique, hidden structures can be visualized and analyzed for effective tissue classification. Here, we investigated the feasibility of automated tissue classification using multispectral tissue analysis. Broadband reflectance spectra (200-1050 nm) were collected from nine different ex vivo porcine tissues types using an optical fiber-probe based spectrometer system. We created a mathematical model to train and distinguish different tissue types based upon analysis of the observed spectra using total principal component regression (TPCR). Compared to other reported methods, our technique is computationally inexpensive and suitable for real-time implementation. Each of the 92 spectra was cross-referenced against the nine tissue types. Preliminary results show a mean detection rate of 91.3%, with detection rates of 100% and 70.0% (inner and outer kidney), 100% and 100% (inner and outer liver), 100% (outer stomach), and 90.9%, 100%, 70.0%, 85.7% (four different inner stomach areas, respectively). We conclude that automated tissue differentiation using our multispectral tissue analysis method is feasible in multiple ex vivo tissue specimens. Although measurements were performed using ex vivo tissues, these results suggest that real-time, in vivo tissue identification during surgery may be possible.
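
    As a rough illustration of classification by principal component regression, the sketch below projects spectra onto leading principal components and regresses the scores onto one-hot class indicators. This is plain PCR on synthetic data; the authors' total principal component regression (TPCR) and their measured spectra may differ.

```python
import numpy as np

def fit_pcr_classifier(X, labels, n_components=5):
    """Principal-component regression onto one-hot class indicators:
    project spectra onto leading PCs, then solve a least-squares map
    from PC scores to class-indicator columns."""
    classes = sorted(set(labels))
    Y = np.array([[1.0 if l == c else 0.0 for c in classes] for l in labels])
    mu = X.mean(axis=0)
    Xc = X - mu
    # Principal components from the SVD of the centered spectra.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components].T                      # loadings
    scores = Xc @ P
    B, *_ = np.linalg.lstsq(scores, Y, rcond=None)
    return mu, P, B, classes

def predict(X, mu, P, B, classes):
    scores = (X - mu) @ P
    return [classes[i] for i in np.argmax(scores @ B, axis=1)]

# Hypothetical data: 20 reflectance spectra (100 wavelength bins), 2 tissues.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 100))
X[:10] += np.linspace(0, 1, 100)                 # tissue A spectral trend
labels = ["A"] * 10 + ["B"] * 10
model = fit_pcr_classifier(X, labels, n_components=3)
print(predict(X, *model))
```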

  20. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of antimony by automated-hydride atomic absorption spectrophotometry

    Science.gov (United States)

    Brown, G.E.; McLain, B.J.

    1994-01-01

    The analysis of natural-water samples for antimony by automated-hydride atomic absorption spectrophotometry is described. Samples are prepared for analysis by addition of potassium and hydrochloric acid followed by an autoclave digestion. After the digestion, potassium iodide and sodium borohydride are added automatically. Antimony hydride (stibine) gas is generated, then swept into a heated quartz cell for determination of antimony by atomic absorption spectrophotometry. Precision and accuracy data are presented. Results obtained on standard reference water samples agree with means established by interlaboratory studies. Spike recoveries for actual samples range from 90 to 114 percent. Replicate analyses of water samples of varying matrices give relative standard deviations from 3 to 10 percent.

  1. Computational botany methods for automated species identification

    CERN Document Server

    Remagnino, Paolo; Wilkin, Paul; Cope, James; Kirkup, Don

    2017-01-01

    This book discusses innovative methods for mining information from images of plants, especially leaves, and highlights the diagnostic features that can be implemented in fully automatic systems for identifying plant species. Adopting a multidisciplinary approach, it explores the problem of plant species identification, covering both the concepts of taxonomy and morphology. It then provides an overview of morphometrics, including the historical background and the main steps in the morphometric analysis of leaves together with a number of applications. The core of the book focuses on novel diagnostic methods for plant species identification developed from a computer scientist’s perspective. It then concludes with a chapter on the characterization of botanists' visions, which highlights important cognitive aspects that can be implemented in a computer system to more accurately replicate the human expert’s fixation process. The book not only represents an authoritative guide to advanced computational tools fo...

  2. An Automated Solar Synoptic Analysis Software System

    Science.gov (United States)

    Hong, S.; Lee, S.; Oh, S.; Kim, J.; Lee, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.

    2012-12-01

    We have developed an automated software system for identifying solar active regions, filament channels, and coronal holes, which are three major solar sources of space weather. Space weather forecasters at the NOAA Space Weather Prediction Center produce solar synoptic drawings on a daily basis to predict solar activities, i.e., solar flares, filament eruptions, high-speed solar wind streams, and co-rotating interaction regions, as well as their possible effects on the Earth. In an attempt to emulate this process in a fully automated and consistent way, we developed a software system named ASSA (Automated Solar Synoptic Analysis). When identifying solar active regions, ASSA uses high-resolution SDO HMI intensitygram and magnetogram images as inputs and provides the McIntosh classification and Mt. Wilson magnetic classification of each active region by applying appropriate image processing techniques such as thresholding, morphology extraction, and region growing. At the same time, it also extracts morphological and physical properties of active regions in a quantitative way for the short-term prediction of flares and CMEs. When identifying filament channels and coronal holes, images of the Global H-alpha Network and SDO AIA 193 are used for morphological identification, together with SDO HMI magnetograms for quantitative verification. The output of ASSA is routinely checked and validated against NOAA's daily SRS (Solar Region Summary) and UCOHO (URSIgram code for coronal hole information). A couple of preliminary scientific results are presented using available outputs. ASSA will be deployed at the Korean Space Weather Center and serve its customers in operational status by the end of 2012.
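
    The thresholding-and-labeling stage of such a pipeline can be sketched with scipy.ndimage: threshold the magnetogram, label connected components, and keep blobs above a minimum size as candidate regions. Threshold and size values below are hypothetical, and no McIntosh or Mt. Wilson classification is attempted.

```python
import numpy as np
from scipy import ndimage

def detect_regions(magnetogram: np.ndarray, threshold: float, min_pixels: int):
    """Threshold |B|, label connected components, and keep blobs large
    enough to be candidate active regions; returns ids, centroids, areas."""
    mask = np.abs(magnetogram) > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
    keep = [i + 1 for i, s in enumerate(sizes) if s >= min_pixels]
    centroids = ndimage.center_of_mass(mask, labels, keep)
    return list(zip(keep, centroids, [sizes[k - 1] for k in keep]))

# Synthetic magnetogram with one bipolar "active region".
img = np.zeros((100, 100))
img[40:50, 30:40] = 800.0     # positive-polarity patch
img[40:50, 45:55] = -750.0    # negative-polarity patch
for label_id, (cy, cx), area in detect_regions(img, threshold=500.0, min_pixels=20):
    print(f"region {label_id}: centroid=({cy:.1f},{cx:.1f}) area={area:.0f}px")
```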

  3. Automated morphological analysis approach for classifying colorectal microscopic images

    Science.gov (United States)

    Marghani, Khaled A.; Dlay, Satnam S.; Sharif, Bayan S.; Sims, Andrew J.

    2003-10-01

    Automated medical image diagnosis using quantitative measurements is extremely helpful for cancer prognosis, allowing a high degree of accuracy and thus reliable decisions. In this paper, six morphological features based on texture analysis were studied in order to categorize normal and cancerous colon mucosa. They were derived after a series of pre-processing steps to generate a set of different shape measurements. Based on shape and size, six features known as Euler Number, Equivalent Diameter, Solidity, Extent, Elongation, and Shape Factor AR were extracted. Mathematical morphology is used first to remove background noise from segmented images and then to obtain different morphological measures describing the shape, size, and texture of colon glands. The proposed automated system was tested by classifying 102 microscopic samples of colorectal tissue, consisting of 44 normal colon mucosa and 58 cancerous samples. The results were first statistically evaluated using the one-way ANOVA method in order to examine the significance of each extracted feature. Significant features were then selected in order to classify the dataset into two categories. Finally, using two discrimination methods, a linear method and k-means clustering, important classification factors were estimated. In brief, this study demonstrates that abnormalities in low-power tissue morphology can be distinguished using quantitative image analysis. This investigation shows the potential of an automated vision system in histopathology. Furthermore, it has the advantage of being objective, and more importantly a valuable diagnostic decision support tool.
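
    Several of the named descriptors map directly onto scikit-image region properties, and the clustering step onto scikit-learn. The sketch below computes similar features on a synthetic mask and clusters them with k-means; Elongation is approximated here by the major/minor axis ratio, an assumption rather than the paper's exact definition.

```python
import numpy as np
from skimage import measure
from sklearn.cluster import KMeans

def shape_features(binary_mask: np.ndarray) -> np.ndarray:
    """Per-object shape descriptors similar in spirit to those in the paper:
    Euler number, equivalent diameter, solidity, extent, elongation."""
    rows = []
    for r in measure.regionprops(measure.label(binary_mask)):
        elongation = r.major_axis_length / max(r.minor_axis_length, 1e-9)
        rows.append([r.euler_number, r.equivalent_diameter,
                     r.solidity, r.extent, elongation])
    return np.array(rows)

# Synthetic mask with two round "glands" and one elongated structure.
mask = np.zeros((120, 120), dtype=bool)
yy, xx = np.ogrid[:120, :120]
mask |= (yy - 30) ** 2 + (xx - 30) ** 2 < 15 ** 2
mask |= (yy - 80) ** 2 + (xx - 80) ** 2 < 12 ** 2
mask[55:60, 10:70] = True
feats = shape_features(mask)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(feats)
print(labels)
```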

  4. Automated quantification of budding Saccharomyces cerevisiae using a novel image cytometry method.

    Science.gov (United States)

    Laverty, Daniel J; Kury, Alexandria L; Kuksin, Dmitry; Pirani, Alnoor; Flanagan, Kevin; Chan, Leo Li-Ying

    2013-06-01

    The measurements of concentration, viability, and budding percentages of Saccharomyces cerevisiae are performed on a routine basis in the brewing and biofuel industries. Generation of these parameters is of great importance in a manufacturing setting, where they can aid in the estimation of product quality, quantity, and fermentation time of the manufacturing process. Specifically, budding percentages can be used to estimate the reproduction rate of yeast populations, which directly correlates with metabolism of polysaccharides and bioethanol production, and can be monitored to maximize production of bioethanol during fermentation. The traditional method involves manual counting using a hemacytometer, but this is time-consuming and prone to human error. In this study, we developed a novel automated method for the quantification of yeast budding percentages using Cellometer image cytometry. The automated method utilizes a dual-fluorescent nucleic acid dye to specifically stain live cells for imaging analysis of unique morphological characteristics of budding yeast. In addition, cell cycle analysis is performed as an alternative method for budding analysis. We were able to show comparable yeast budding percentages between manual and automated counting, as well as cell cycle analysis. The automated image cytometry method is used to analyze and characterize corn mash samples directly from fermenters during standard fermentation. Since concentration, viability, and budding percentages can be obtained simultaneously, the automated method can be integrated into the fermentation quality assurance protocol, which may improve the quality and efficiency of beer and bioethanol production processes.

  6. Development and validation of an automated extraction method (accelerated solvent extraction) and a reverse-phase HPLC analysis method for assay of ivermectin in a meat-based chewable formulation.

    Science.gov (United States)

    Abend, Andreas M; Chung, Le; McCollum, David G; Wuelfing, W Peter

    2003-04-10

    A new method for monitoring ivermectin content in HEARTGARD CHEWABLES has been developed and validated. The method consists of the automated extraction of ivermectin from the meat-based formulation under conditions of elevated temperature and pressure (accelerated solvent extraction, ASE) and determination of the active by reverse-phase high-performance liquid chromatography (HPLC). The method resolves both active species of ivermectin (components H(2)B(1a) and H(2)B(1b)) from the formulation matrix.

  7. Automated Scanning Electron Microscopy Analysis of Sampled Aerosol

    DEFF Research Database (Denmark)

    Bluhme, Anders Brostrøm; Kling, Kirsten; Mølhave, Kristian

    ...development of an automated software-based analysis of aerosols using Scanning Electron Microscopy (SEM) and Scanning Transmission Electron Microscopy (STEM) coupled with Energy-Dispersive X-ray Spectroscopy (EDS). The automated analysis will be capable of providing both detailed physical and chemical single...

  8. ASteCA - Automated Stellar Cluster Analysis

    CERN Document Server

    Perren, Gabriel I; Piatti, Andrés E

    2014-01-01

    We present ASteCA (Automated Stellar Cluster Analysis), a suite of tools designed to fully automatize the standard tests applied to stellar clusters in order to determine their basic parameters. The set of functions included in the code makes use of positional and photometric data to obtain precise and objective values for a given cluster's center coordinates, radius, luminosity function and integrated color magnitude, as well as characterizing, through a statistical estimator, its probability of being a true physical cluster rather than a random overdensity of field stars. ASteCA incorporates a Bayesian field star decontamination algorithm capable of assigning membership probabilities using photometric data alone. An isochrone fitting process based on the generation of synthetic clusters from theoretical isochrones and selection of the best fit through a genetic algorithm is also present, which allows ASteCA to provide accurate estimates for a cluster's metallicity, age, extinction and distance values along with its unce...

  9. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    Science.gov (United States)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failures. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on an Adaptive Neuro-Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate high efficiency of both approaches in comparison to known methods.

  10. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduced a systematic approach for planning access network infrastructure. The GIS data and a set of algorithms were employed to make the planning process more automatic. The method explains...

  11. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Full Text Available Introduction: The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, the laboratory currently has modern equipment which provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides concordant and discordant with results obtained using fully automated procedures. Materials and method: From January to July 2013, 1,000 hematological parameter slides were analyzed. Automated analysis was performed on last-generation equipment, whose methodology is based on electrical impedance and which is able to quantify all the figurative elements of the blood across 22 parameters. The microscopy was performed by two experts in microscopy simultaneously. Results: The data showed that only 42.70% were concordant, compared with 57.30% discordant. The main findings among the discordant were: changes in red blood cells, 43.70% (n = 250); white blood cells, 38.46% (n = 220); and platelet count, 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained because they have not been investigated, which may compromise the final diagnosis. Conclusion: It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, causing a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  12. An automated method for the layup of fiberglass fabric

    Science.gov (United States)

    Zhu, Siqi

    This dissertation presents an automated composite fabric layup solution based on a new method to deform fiberglass fabric referred to as shifting. A layup system was designed and implemented using a large robotic gantry and custom end-effector for shifting. Layup tests proved that the system can deposit fabric onto two-dimensional and three-dimensional tooling surfaces accurately and repeatedly while avoiding out-of-plane deformation. A process planning method was developed to generate tool paths for the layup system based on a geometric model of the tooling surface. The approach is analogous to Computer Numerical Controlled (CNC) machining, where Numerical Control (NC) code from a Computer-Aided Design (CAD) model is generated to drive the milling machine. Layup experiments utilizing the proposed method were conducted to validate the performance. The results show that the process planning software requires minimal time or human intervention and can generate tool paths leading to accurate composite fabric layups. Fiberglass fabric samples processed with shifting deformation were observed for meso-scale deformation. Tow thinning, bending and spacing was observed and measured. Overall, shifting did not create flaws in amounts that would disqualify the method from use in industry. This suggests that shifting is a viable method for use in automated manufacturing. The work of this dissertation provides a new method for the automated layup of broad width composite fabric that is not possible with any available composite automation systems to date.

  13. Evaluation of an automated method for urinocolture screening

    Directory of Open Access Journals (Sweden)

    Claudia Ballabio

    2010-09-01

    Full Text Available Introduction: Urinary tract infections are one of the most common diseases found in medical practice and are diagnosed with traditional methods of cultivation on plates. In this study we evaluated automated instrumentation for the screening of urine cultures that can provide results quickly and guarantee traceability. The comparison of results obtained with the automatic and plate methods is reported. Methods: 316 urine samples, including midstream urine, catheter urine and bag urine, were analyzed by the Alfred 60 (Alifax) through light-scattering technology that measures the replication of the bacteria. Simultaneously, the samples were plated on agar (CPS3, Cled agar, MacConkey agar). Results: Of the 316 samples analyzed by the automated method, 190 were negative, all confirmed by culture, while 126 were found positive. 82 cases were confirmed positive in plate culture, 65 with significant isolation of bacteria and 17 with polymicrobial flora at a significant load. 44 cases were negative in plate culture but positive with the automated method. Conclusions: The absence of false-negative results at low bacterial loads can represent a starting point for introducing an automated method for urine culture screening.

  14. Ecological Automation Design, Extending Work Domain Analysis

    NARCIS (Netherlands)

    Amelink, M.H.J.

    2010-01-01

    In high–risk domains like aviation, medicine and nuclear power plant control, automation has enabled new capabilities, increased the economy of operation and has greatly contributed to safety. However, automation increases the number of couplings in a system, which can inadvertently lead to more com

  15. Methods of analysis by the U.S. Geological Survey National Water Quality Laboratory; determination of the total phosphorus by a Kjeldahl digestion method and an automated colorimetric finish that includes dialysis

    Science.gov (United States)

    Patton, Charles J.; Truitt, Earl P.

    1992-01-01

    A method to determine total phosphorus (TP) in the same digests prepared for total Kjeldahl nitrogen (TKN) determinations is described. The batch, high-temperature (block digester), Hg(II)-catalyzed digestion step is similar to U.S. Geological Survey methods I-2552-85/I-4552-85 and U.S. Environmental Protection Agency method 365.4, except that sample and reagent volumes are halved. Prepared digests are desolvated at 220 degrees Celsius and digested at 370 degrees Celsius in separate block digesters set at these temperatures, rather than in a single, temperature-programmed block digester. This approach is used in the method described here, which permits 40 calibrants, reference waters, and samples to be digested and resolvated in about an hour. Orthophosphate ions originally present in samples, along with those released during the digestion step, are determined colorimetrically at a rate of 90 tests per hour by an automated version of the phosphoantimonylmolybdenum blue procedure. About 100 microliters of digest are required per determination. The upper concentration limit is 2 milligrams per liter (mg/L) with a method detection limit of 0.01 mg/L. Repeatability for a sample containing approximately 1.6 mg/L of TP in a high suspended-solids matrix is 0.7 percent. Between-day precision for the same sample is 5.0 percent. A dialyzer in the air-segmented continuous-flow analyzer provides on-line digest cleanup, eliminating particulates that otherwise would interfere in the colorimetric finish. A single-channel analyzer can process the resolvated digests from two pairs of block digesters each hour. Paired t-test analysis of TP concentrations for approximately 1,600 samples determined by the new method (U.S. Geological Survey methods I-2610-91 and I-4610-91) and the old method (U.S. Geological Survey methods I-2600-85 and I-4600-85) revealed a positive bias in the former of 0.02 to 0.04 mg/L for surface-water samples, in agreement with previous studies. Concentrations of total...
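
    The concluding paired t-test comparison of the two methods is a standard computation; a minimal sketch with hypothetical paired TP results:

```python
import numpy as np
from scipy import stats

# Hypothetical paired total-phosphorus results (mg/L): the same samples
# determined by the old and the new method.
old_method = np.array([0.12, 0.45, 1.60, 0.80, 0.33, 2.00])
new_method = np.array([0.15, 0.47, 1.63, 0.84, 0.35, 2.03])

t_stat, p_value = stats.ttest_rel(new_method, old_method)
bias = np.mean(new_method - old_method)
print(f"mean bias = {bias:.3f} mg/L, t = {t_stat:.2f}, p = {p_value:.4f}")
```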

  16. Automated Model Fit Method for Diesel Engine Control Development

    NARCIS (Netherlands)

    Seykens, X.; Willems, F.P.T.; Kuijpers, B.; Rietjens, C.

    2014-01-01

    This paper presents an automated fit for a control-oriented physics-based diesel engine combustion model. This method is based on the combination of a dedicated measurement procedure and structured approach to fit the required combustion model parameters. Only a data set is required that is consider

  17. Automated analysis and annotation of basketball video

    Science.gov (United States)

    Saur, Drew D.; Tan, Yap-Peng; Kulkarni, Sanjeev R.; Ramadge, Peter J.

    1997-01-01

    Automated analysis and annotation of video sequences are important for digital video libraries, content-based video browsing and data mining projects. A successful video annotation system should provide users with a useful video content summary in a reasonable processing time. Given the wide variety of video genres available today, automatically extracting meaningful video content for annotation still remains hard using currently available techniques. However, a wide range of video has inherent structure such that some prior knowledge about the video content can be exploited to improve our understanding of the high-level video semantic content. In this paper, we develop tools and techniques for analyzing structured video by using the low-level information available directly from MPEG compressed video. Being able to work directly in the video compressed domain can greatly reduce the processing time and enhance storage efficiency. As a testbed, we have developed a basketball annotation system which combines the low-level information extracted from the MPEG stream with prior knowledge of basketball video structure to provide high-level content analysis, annotation and browsing for events such as wide-angle and close-up views, fast breaks, steals, potential shots, number of possessions and possession times. We expect our approach can also be extended to structured video in other domains.
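
    A basic building block of such annotation systems is shot-boundary detection from frame-to-frame histogram differences. The sketch below runs on synthetic uncompressed frames; the paper's approach works on features taken directly from the MPEG compressed domain, which this does not reproduce.

```python
import numpy as np

def shot_boundaries(frames, bins=32, z_thresh=3.0):
    """Flag frames whose intensity-histogram difference from the previous
    frame is an outlier; large jumps typically mark cuts between views."""
    hists = [np.histogram(f, bins=bins, range=(0, 255), density=True)[0]
             for f in frames]
    diffs = np.array([np.abs(hists[i] - hists[i - 1]).sum()
                      for i in range(1, len(hists))])
    z = (diffs - diffs.mean()) / (diffs.std() + 1e-12)
    return np.where(z > z_thresh)[0] + 1   # first frames of new shots

# Synthetic "video": 60 dark frames, then a cut to 60 bright frames.
rng = np.random.default_rng(1)
frames = [rng.normal(60, 5, (48, 64)).clip(0, 255) for _ in range(60)]
frames += [rng.normal(180, 5, (48, 64)).clip(0, 255) for _ in range(60)]
print("cut detected at frame:", shot_boundaries(frames))
```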

  18. Statistical Analysis of Filament Features Based on the H{\\alpha} Solar Images from 1988 to 2013 by Computer Automated Detection Method

    CERN Document Server

    Hao, Q; Cao, W; Chen, P F

    2015-01-01

    We improve our filament automated detection method which was proposed in our previous works. It is then applied to process the full disk H$\\alpha$ data mainly obtained by Big Bear Solar Observatory (BBSO) from 1988 to 2013, spanning nearly 3 solar cycles. The butterfly diagrams of the filaments, showing the information of the filament area, spine length, tilt angle, and the barb number, are obtained. The variations of these features with the calendar year and the latitude band are analyzed. The drift velocities of the filaments in different latitude bands are calculated and studied. We also investigate the north-south (N-S) asymmetries of the filament numbers in total and in each subclass classified according to the filament area, spine length, and tilt angle. The latitudinal distribution of the filament number is found to be bimodal. About 80% of all the filaments have tilt angles within [0{\\deg}, 60{\\deg}]. For the filaments within latitudes lower (higher) than 50{\\deg} the northeast (northwest) direction i...

  19. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel

    2001-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software devel

  20. Automating Object-Oriented Software Development Methods

    NARCIS (Netherlands)

    Tekinerdogan, Bedir; Saeki, Motoshi; Sunyé, Gerson; Broek, van den Pim; Hruby, Pavel; Frohner, A´ kos

    2002-01-01

    Current software projects have generally to deal with producing and managing large and complex software products. It is generally believed that applying software development methods are useful in coping with this complexity and for supporting quality. As such numerous object-oriented software develo

  1. Automated Traffic Management System and Method

    Science.gov (United States)

    Glass, Brian J. (Inventor); Spirkovska, Liljana (Inventor); McDermott, William J. (Inventor); Reisman, Ronald J. (Inventor); Gibson, James (Inventor); Iverson, David L. (Inventor)

    2000-01-01

    A data management system and method that enables acquisition, integration, and management of real-time data generated at different rates, by multiple heterogeneous incompatible data sources. The system achieves this functionality by using an expert system to fuse data from a variety of airline, airport operations, ramp control, and air traffic control tower sources, to establish and update reference data values for every aircraft surface operation. The system may be configured as a real-time airport surface traffic management system (TMS) that electronically interconnects air traffic control, airline data, and airport operations data to facilitate information sharing and improve taxi queuing. In the TMS operational mode, empirical data shows substantial benefits in ramp operations for airlines, reducing departure taxi times by about one minute per aircraft in operational use, translating as $12 to $15 million per year savings to airlines at the Atlanta, Georgia airport. The data management system and method may also be used for scheduling the movement of multiple vehicles in other applications, such as marine vessels in harbors and ports, trucks or railroad cars in ports or shipping yards, and railroad cars in switching yards. Finally, the data management system and method may be used for managing containers at a shipping dock, stock on a factory floor or in a warehouse, or as a training tool for improving situational awareness of FAA tower controllers, ramp and airport operators, or commercial airline personnel in airfield surface operations.

  2. Postprocessing algorithm for automated analysis of pelvic intraoperative neuromonitoring signals

    Directory of Open Access Journals (Sweden)

    Wegner Celine

    2016-09-01

    Full Text Available Two-dimensional pelvic intraoperative neuromonitoring (pIONM®) is based on electric stimulation of autonomic nerves under observation of electromyography of the internal anal sphincter (IAS) and manometry of the urinary bladder. The method provides nerve identification and verification of its functional integrity. Currently pIONM® is gaining increased attention in times where preservation of function is becoming more and more important. Ongoing technical and methodological developments in experimental and clinical settings require further analysis of the obtained signals. This work describes a postprocessing algorithm for pIONM® signals, developed for automated analysis of large amounts of recorded data. The analysis routine includes a graphical representation of the recorded signals in the time and frequency domains, as well as a quantitative evaluation by means of features calculated from the time and frequency domains. The produced plots are summarized automatically in a PowerPoint presentation. The calculated features are filled into a standardized Excel sheet, ready for statistical analysis.
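
    The quantitative part of such a routine reduces to computing time- and frequency-domain features per trace. A minimal sketch on a synthetic trace; the feature names are illustrative choices, not the pIONM® specification, and the PowerPoint/Excel export steps are omitted.

```python
import numpy as np

def trace_features(signal: np.ndarray, fs: float) -> dict:
    """Time- and frequency-domain summary features of one recorded trace
    (the selection of features is an assumption, not the pIONM spec)."""
    centered = signal - signal.mean()
    spectrum = np.abs(np.fft.rfft(centered)) ** 2
    freqs = np.fft.rfftfreq(len(centered), d=1.0 / fs)
    return {
        "rms": float(np.sqrt(np.mean(centered ** 2))),
        "peak_to_peak": float(signal.max() - signal.min()),
        "dominant_freq_hz": float(freqs[np.argmax(spectrum[1:]) + 1]),
        "spectral_centroid_hz": float((freqs * spectrum).sum() / spectrum.sum()),
    }

# Synthetic EMG-like trace: 25 Hz burst plus noise, sampled at 1 kHz.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
trace = np.sin(2 * np.pi * 25 * t) + 0.2 * np.random.default_rng(2).normal(size=t.size)
for name, value in trace_features(trace, fs).items():
    print(f"{name}: {value:.2f}")
```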

  3. Testing an Automated Accuracy Assessment Method on Bibliographic Data

    Directory of Open Access Journals (Sweden)

    Marlies Olensky

    2014-12-01

    Full Text Available This study investigates automated data accuracy assessment as described in the data quality literature for its suitability to assess bibliographic data. The data samples comprise the publications of two Nobel Prize winners in the field of chemistry for a 10-year publication period, retrieved from the two bibliometric data sources Web of Science and Scopus. The bibliographic records are assessed against the original publication (gold standard), and an automatic assessment method is compared to a manual one. The results show that the manual assessment method reflects truer accuracy scores. The automated assessment method would need to be extended by additional rules that reflect specific characteristics of bibliographic data. Both data sources had higher accuracy scores per field than accumulated per record. This study contributes to the research on finding a standardized assessment method for bibliographic data accuracy as well as on defining the impact of data accuracy on the citation matching process.
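
    The field-level versus record-level distinction reported above is easy to make concrete: a record scores on each field separately, but a single wrong field fails the whole record. A small sketch with hypothetical fields and values:

```python
# Hypothetical bibliographic record compared against a gold standard
# built from the original publication.
gold = {"title": "On X", "year": "2011", "volume": "7", "pages": "1-10"}
record = {"title": "On X", "year": "2011", "volume": "7", "pages": "1--10"}

def field_accuracy(rec: dict, gold: dict) -> dict:
    # 1.0 where the field matches the gold standard exactly, else 0.0.
    return {f: 1.0 if rec.get(f) == v else 0.0 for f, v in gold.items()}

per_field = field_accuracy(record, gold)
per_record = 1.0 if all(v == 1.0 for v in per_field.values()) else 0.0
print("per-field scores:", per_field)                 # 3 of 4 fields match
print("accumulated per-record score:", per_record)    # one error fails the record
```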

  4. White matter hyperintensities segmentation: a new semi-automated method.

    Science.gov (United States)

    Iorio, Mariangela; Spalletta, Gianfranco; Chiapponi, Chiara; Luccichenti, Giacomo; Cacciari, Claudia; Orfei, Maria D; Caltagirone, Carlo; Piras, Fabrizio

    2013-01-01

    White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with mild cognitive impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of a WMH map, and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student's t-tests and similarity was evaluated using a linear regression model and the Dice similarity coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t-value = -1.79, DF = 29, p = 0.839 for rater 1; t-value = 1.113, DF = 29, p = 0.2749 for rater 2), were highly correlated [R² = 0.921, F(1,29) = 155.54, p < 0.0001 for rater 1; R² = 0.935, F(1,29) = 402.709, p < 0.0001 for rater 2] and showed a very strong spatial similarity (mean DSC = 0.78 for rater 1 and 0.77 for rater 2). In conclusion, our semi-automated method to measure the load of WMH is highly reliable and could represent a good tool that could be easily implemented in routine neuroimaging analyses to map clinical consequences of WMH.
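
    The histogram-based thresholding step and the DSC evaluation can be sketched compactly. The mean + n·SD cutoff below is an assumed stand-in for the authors' histogram segmentation, and the image is synthetic; the DSC formula itself (2|A∩B|/(|A|+|B|)) is standard.

```python
import numpy as np

def threshold_from_histogram(flair, brain_mask, n_sd=3.0):
    """Flag voxels in the bright tail of the within-brain intensity
    histogram as probable WMH (mean + n_sd * SD cutoff is an assumption)."""
    vals = flair[brain_mask]
    cutoff = vals.mean() + n_sd * vals.std()
    return brain_mask & (flair > cutoff)

def dice(a, b):
    """Dice similarity coefficient: 2|A & B| / (|A| + |B|)."""
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

# Synthetic slice: uniform "brain" with one hyperintense lesion.
rng = np.random.default_rng(3)
flair = rng.normal(100.0, 10.0, (128, 128))
brain = np.ones((128, 128), dtype=bool)
flair[40:50, 40:50] += 80.0                       # the lesion
auto = threshold_from_histogram(flair, brain)
manual = np.zeros_like(brain)
manual[40:50, 40:50] = True                       # "manual" reference
print(f"DSC = {dice(auto, manual):.2f}")
```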

  5. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Mariangela eIorio

    2013-12-01

    Full Text Available White matter hyperintensities (WMH are brain areas of increased signal on T2-weighted or fluid attenuated inverse recovery magnetic resonance imaging (MRI scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with Mild Cognitive Impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of WMH map and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student’s t tests and similarity was evaluated using linear regression model and Dice Similarity Coefficient (DSC. The volumes of the manual and semi-automated segmentations did not statistically differ (t-value= -1.79, DF=29, p= 0.839 for rater 1; t-value= 1.113, DF=29, p= 0.2749 for rater 2, were highly correlated (R²= 0.921, F (1,29 =155,54, p

  6. An automated method for fibrin clot permeability assessment.

    Science.gov (United States)

    Ząbczyk, Michał; Piłat, Adam; Awsiuk, Magdalena; Undas, Anetta

    2015-01-01

    The fibrin clot permeability coefficient (Ks) is a useful measure of the porosity of the fibrin network, which is determined by a number of genetic and environmental factors. Currently available methods to evaluate Ks are time-consuming, require constant supervision and provide only one parameter. We present an automated method in which drops are weighed individually, buffer is dosed by the pump and well-defined clot washing is controlled by the software. The presence of a straight-line association between drop mass and dripping time allows the measurement time to be halved. In 40 healthy individuals, Ks, the number of drops required to reach the plateau (DTP), the time to achieve the plateau (TTP) and the DTP/TTP ratio (DTR) were calculated. There was a positive association between Ks and [...] (r = 0.69, P < [...]) and a negative association with [...] (r = -0.55, P < [...]); [...] correlated with Ks (r = 0.70, P < 0.0001 for the manual method and r = 0.76, P < 0.0001 for the automated method), fibrinogen (r = -0.58, P < 0.0001) and C-reactive protein (CRP) (r = -0.47, P < 0.01). The automated method might be a suitable tool for research and clinical use and may offer additional parameters describing fibrin clot structure.
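
    Ks is conventionally obtained from Darcy's law, Ks = Q·L·η/(t·A·Δp), where Q is the buffer volume percolating through a clot of length L and cross-section A in time t under pressure drop Δp. A minimal sketch computing Ks from individually weighed drops; all values are hypothetical.

```python
import numpy as np

# Hypothetical measurement: buffer drops percolating through a fibrin clot,
# each drop weighed individually, with its dripping time recorded.
drop_mass_g = np.array([0.021, 0.020, 0.022, 0.021])   # mass per drop
drip_time_s = np.array([55.0, 52.0, 57.0, 54.0])       # time per drop

clot_length_cm = 1.3        # L
clot_area_cm2 = 0.0491      # A, tube cross-section
viscosity_poise = 0.01      # eta, buffer approximated as water
pressure_dyn_cm2 = 1160.0   # delta-p from the hydrostatic head
density_g_cm3 = 1.0         # buffer density, to convert mass to volume

# Darcy's law: Ks = Q * L * eta / (t * A * delta_p).
volume_cm3 = drop_mass_g.sum() / density_g_cm3
total_time_s = drip_time_s.sum()
ks = volume_cm3 * clot_length_cm * viscosity_poise / (
    total_time_s * clot_area_cm2 * pressure_dyn_cm2)
print(f"Ks = {ks:.2e} cm^2")
```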

  7. Automated drawing of network plots in network meta-analysis.

    Science.gov (United States)

    Rücker, Gerta; Schwarzer, Guido

    2016-03-01

    In systematic reviews based on network meta-analysis, the network structure should be visualized. Network plots often have been drawn by hand using generic graphical software. A typical way of drawing networks, also implemented in statistical software for network meta-analysis, is a circular representation, often with many crossing lines. We use methods from graph theory in order to generate network plots in an automated way. We give a number of requirements for graph drawing and present an algorithm that fits prespecified ideal distances between the nodes representing the treatments. The method was implemented in the function netgraph of the R package netmeta and applied to a number of networks from the literature. We show that graph representations with a small number of crossing lines are often preferable to circular representations.
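
    A closely related facility exists in Python: networkx's Kamada-Kawai layout minimizes the stress between realized and prespecified inter-node distances. The sketch below is an analogue of the idea on a toy treatment network, not the netgraph implementation from the R package netmeta.

```python
import networkx as nx

# Toy treatment network: nodes are treatments, edge weights count the
# direct-comparison studies (values are illustrative).
G = nx.Graph()
G.add_weighted_edges_from([("placebo", "A", 4), ("placebo", "B", 2),
                           ("A", "B", 1), ("B", "C", 3)])

# Prespecified ideal inter-node distances: here, shortest-path lengths,
# which the Kamada-Kawai stress minimization tries to reproduce in 2D.
dist = dict(nx.shortest_path_length(G))
pos = nx.kamada_kawai_layout(G, dist=dist)
for node, (x, y) in pos.items():
    print(f"{node}: ({x:+.2f}, {y:+.2f})")
```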

  8. Automated eigensystem realisation algorithm for operational modal analysis

    Science.gov (United States)

    Zhang, Guowen; Ma, Jinghua; Chen, Zhuo; Wang, Ruirong

    2014-07-01

    The eigensystem realisation algorithm (ERA) is one of the most popular methods in civil engineering applications for estimating modal parameters. Three issues have been addressed in the paper: spurious mode elimination, estimating the energy relationship between different modes, and automatic analysis of the stabilisation diagram. On spurious mode elimination, a new criterion, modal similarity index (MSI) is proposed to measure the reliability of the modes obtained by ERA. On estimating the energy relationship between different modes, the mode energy level (MEL) was introduced to measure the energy contribution of each mode, which can be used to indicate the dominant mode. On automatic analysis of the stabilisation diagram, an automation of the mode selection process based on a hierarchical clustering algorithm was developed. An experimental example of the parameter estimation for the Chaotianmen bridge model in Chongqing, China, is presented to demonstrate the efficacy of the proposed method.
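
    The clustering step of an automated stabilization-diagram analysis can be sketched with scipy's hierarchical clustering: poles estimated at many model orders are grouped by (frequency, damping), and well-populated clusters are taken as physical modes. Pole values, scaling, and the cluster-size rule below are hypothetical, and the paper's MSI/MEL criteria are not implemented.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

# Hypothetical pole estimates pooled over several model orders:
# columns are natural frequency (Hz) and damping ratio.
poles = np.array([[2.01, 0.011], [2.00, 0.012], [1.99, 0.010],   # physical mode
                  [5.52, 0.020], [5.49, 0.021], [5.50, 0.019],   # physical mode
                  [3.70, 0.160], [8.90, 0.005]])                 # scattered/spurious

# Normalize so frequency and damping contribute comparably, then cluster.
scaled = (poles - poles.mean(axis=0)) / poles.std(axis=0)
clusters = fcluster(linkage(scaled, method="average"), t=1.0, criterion="distance")

for c in np.unique(clusters):
    members = poles[clusters == c]
    tag = "stable" if len(members) >= 3 else "spurious"
    print(f"cluster {c} ({tag}): f = {members[:, 0].mean():.2f} Hz, "
          f"n = {len(members)}")
```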

  9. Automated mass action model space generation and analysis methods for two-reactant combinatorially complex equilibriums: An analysis of ATP-induced ribonucleotide reductase R1 hexamerization data

    Directory of Open Access Journals (Sweden)

    Radivoyevitch Tomas

    2009-12-01

    [...]/30 > 508/2088, with p < 10^-15. Finally, 99 of the 2088 models did not have any terms with ATP/R1 ratios >1.5, but of the top 30 there were 14 such models (14/30 > 99/2088, with p < 10^-16), i.e. the existence of R1 hexamers with >3 a-sites occupied by ATP is also not supported by this dataset. Conclusion: The analysis presented suggests that three a-sites may not be occupied by ATP in R1 hexamers under the conditions of the data analyzed. If a-sites fill before h-sites, this implies that the dataset analyzed can be explained without the existence of an h-site. Reviewers: This article was reviewed by Ossama Kashlan (nominated by Philip Hahnfeldt), Bin Hu (nominated by William Hlavacek) and Rainer Sachs.

  10. Comparison of manual and automated quantification methods of {sup 123}I-ADAM

    Energy Technology Data Exchange (ETDEWEB)

    Kauppinen, T. [Helsinki Univ. Central Hospital (Finland). HUS Helsinki Medical Imaging Center; Helsinki Univ. Central Hospital (Finland). Division of Nuclear Medicine; Koskela, A.; Ahonen, A. [Helsinki Univ. Central Hospital (Finland). Division of Nuclear Medicine; Diemling, M. [Hermes Medical Solutions, Stockholm (Sweden); Keski-Rahkonen, A.; Sihvola, E. [Helsinki Univ. (Finland). Dept. of Public Health; Helsinki Univ. Central Hospital (Finland). Dept. of Psychiatry

    2005-07-01

    {sup 123}I-ADAM is a novel radioligand for imaging of the brain serotonin transporters (SERTs). Traditionally, the analysis of brain receptor studies has been based on observer-dependent manual region of interest definitions and visual interpretation. Our aim was to create a template for automated image registrations and volume of interest (VOI) quantification and to show that an automated quantification method of {sup 123}I-ADAM is more repeatable than the manual method. Patients, methods: A template and a predefined VOI map was created from {sup 123}I-ADAM scans done for healthy volunteers (n=15). Scans of another group of healthy persons (HS, n=12) and patients with bulimia nervosa (BN, n=10) were automatically fitted to the template and specific binding ratios (SBRs) were calculated by using the VOI map. Manual VOI definitions were done for the HS and BN groups by both one and two observers. The repeatability of the automated method was evaluated by using the BN group. Results: For the manual method, the interobserver coefficient of repeatability was 0.61 for the HS group and 1.00 for the BN group. The intra-observer coefficient of repeatability for the BN group was 0.70. For the automated method, the coefficient of repeatability was 0.13 for SBRs in midbrain. Conclusion: An automated quantification gives valuable information in addition to visual interpretation decreasing also the total image handling time and giving clear advantages for research work. An automated method for analysing {sup 123}I-ADAM binding to the brain SERT gives repeatable results for fitting the studies to the template and for calculating SBRs, and could therefore replace manual methods. (orig.)

  11. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6-month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and...

  12. An automated dynamic water vapor permeation test method

    Science.gov (United States)

    Gibson, Phillip; Kendrick, Cyrus; Rivin, Donald; Charmchii, Majid; Sicuranza, Linda

    1995-05-01

    This report describes an automated apparatus developed to measure the transport of water vapor through materials under a variety of conditions. The apparatus is more convenient to use than the traditional test methods for textiles and clothing materials, and allows one to use a wider variety of test conditions to investigate the concentration-dependent and nonlinear transport behavior of many of the semipermeable membrane laminates which are now available. The dynamic moisture permeation cell (DMPC) has been automated to permit multiple setpoint testing under computer control, and to facilitate investigation of transient phenomena. Results generated with the DMPC are in agreement with and of comparable accuracy to those from the ISO 11092 (sweating guarded hot plate) method of measuring water vapor permeability.

  13. α-Automated Reasoning Method Based on Lattice-Valued Propositional Logic LP(X)

    Institute of Scientific and Technical Information of China (English)

    王伟; 徐扬; 王学芳

    2002-01-01

    This paper is focused on automated reasoning based on classical propositional logic and lattice-valued propositional logic LP(X). A new method of automated reasoning is given, and the soundness and completeness theorems of this method are proved.

  14. When Phase Contrast Fails: ChainTracer and NucTracer, Two ImageJ Methods for Semi-Automated Single Cell Analysis Using Membrane or DNA Staining.

    Science.gov (United States)

    Syvertsson, Simon; Vischer, Norbert O E; Gao, Yongqiang; Hamoen, Leendert W

    2016-01-01

    Within bacterial populations, genetically identical cells often behave differently. Single-cell measurement methods are required to observe this heterogeneity. Flow cytometry and fluorescence light microscopy are the primary methods to do this. However, flow cytometry requires reasonably strong fluorescence signals and is impractical when bacteria grow in cell chains. Therefore fluorescence light microscopy is often used to measure population heterogeneity in bacteria. Automatic microscopy image analysis programs typically use phase contrast images to identify cells. However, many bacteria divide by forming a cross-wall that is not detectable by phase contrast. We have developed 'ChainTracer', a method based on the ImageJ plugin ObjectJ. It can automatically identify individual cells stained by fluorescent membrane dyes, and measure fluorescence intensity, chain length, cell length, and cell diameter. As a complementary analysis method we developed 'NucTracer', which uses DAPI stained nucleoids as a proxy for single cells. The latter method is especially useful when dealing with crowded images. The methods were tested with Bacillus subtilis and Lactococcus lactis cells expressing a GFP-reporter. In conclusion, ChainTracer and NucTracer are useful single cell measurement methods when bacterial cells are difficult to distinguish with phase contrast.
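
    The NucTracer idea — nucleoids as proxies for single cells — translates readily outside ImageJ. Below is a minimal Python sketch of that approach using scikit-image (our illustration, not the ObjectJ plugin itself): threshold the DAPI channel, label the nucleoids, and read out the GFP reporter intensity per labelled object.

    ```python
    # Minimal sketch of the NucTracer idea (not the ObjectJ plugin itself):
    # segment DAPI-stained nucleoids and use each as a proxy for one cell
    # when reading out a GFP reporter. Assumes two registered 2-D images.
    import numpy as np
    from skimage.filters import threshold_otsu
    from skimage.measure import label, regionprops

    def per_cell_gfp(dapi, gfp, min_area=5):
        """Mean GFP intensity per DAPI-segmented nucleoid."""
        mask = dapi > threshold_otsu(dapi)       # global threshold on DAPI channel
        labels = label(mask)                     # one label per connected nucleoid
        return [r.mean_intensity
                for r in regionprops(labels, intensity_image=gfp)
                if r.area >= min_area]           # discard tiny noise objects

    # Synthetic demonstration: one bright nucleoid in a random background
    rng = np.random.default_rng(0)
    dapi = rng.random((64, 64)); dapi[10:20, 10:20] += 2.0
    gfp = rng.random((64, 64))
    print(per_cell_gfp(dapi, gfp))
    ```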

  15. Trends in biomedical informatics: automated topic analysis of JAMIA articles.

    Science.gov (United States)

    Han, Dong; Wang, Shuang; Jiang, Chao; Jiang, Xiaoqian; Kim, Hyeon-Eui; Sun, Jimeng; Ohno-Machado, Lucila

    2015-11-01

    Biomedical Informatics is a growing interdisciplinary field in which research topics and citation trends have been evolving rapidly in recent years. To analyze these data in a fast, reproducible manner, automation of certain processes is needed. JAMIA is a "generalist" journal for biomedical informatics. Its articles reflect the wide range of topics in informatics. In this study, we retrieved Medical Subject Headings (MeSH) terms and citations of JAMIA articles published between 2009 and 2014. We used tensors (i.e., multidimensional arrays) to represent the interaction among topics, time, and citations, and applied tensor decomposition to automate the analysis. The trends represented by tensors were then carefully interpreted, and the results were compared with previous findings based on manual topic analysis. A list of the most cited JAMIA articles, their topics, and publication trends over recent years is presented. The analyses confirmed previous studies and showed that, from 2012 to 2014, the number of articles related to the MeSH terms Methods, Organization & Administration, and Algorithms increased significantly in both number of publications and citations. Citation trends varied widely by topic, with Natural Language Processing having a large number of citations in particular years, and Medical Record Systems, Computerized remaining a very popular topic in all years.
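
    As a sketch of the tensor approach (synthetic data, not the study's actual MeSH/citation counts), the following factorizes a small topic × year × citation-bin tensor with CP/PARAFAC via the tensorly library; each column of the year factor then traces one latent topic-citation trend.

    ```python
    # Sketch of the tensor approach on synthetic counts: factorize a
    # (topic x year x citation-bin) tensor with CP/PARAFAC using tensorly.
    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(1)
    counts = rng.poisson(3.0, size=(20, 6, 5)).astype(float)  # 20 topics, 6 years, 5 bins
    X = tl.tensor(counts)

    weights, factors = parafac(X, rank=3, n_iter_max=200)
    topic_f, year_f, cite_f = factors
    # Each column of year_f is one latent trend over the 6 years; the matching
    # column of topic_f shows which topics load on that trend.
    print(topic_f.shape, year_f.shape, cite_f.shape)   # (20, 3) (6, 3) (5, 3)
    ```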

  16. Automated Steel Cleanliness Analysis Tool (ASCAT)

    Energy Technology Data Exchange (ETDEWEB)

    Gary Casuccio (RJ Lee Group); Michael Potter (RJ Lee Group); Fred Schwerer (RJ Lee Group); Dr. Richard J. Fruehan (Carnegie Mellon University); Dr. Scott Story (US Steel)

    2005-12-30

    The objective of this study was to develop the Automated Steel Cleanliness Analysis Tool (ASCATTM) to permit steelmakers to evaluate the quality of the steel through the analysis of individual inclusions. By characterizing individual inclusions, determinations can be made as to the cleanliness of the steel. Understanding the complicating effects of inclusions in the steelmaking process and on the resulting properties of steel allows the steel producer to increase throughput, better control the process, reduce remelts, and improve the quality of the product. The ASCAT (Figure 1) is a steel-smart inclusion analysis tool developed around a customized next-generation computer controlled scanning electron microscopy (NG-CCSEM) hardware platform that permits acquisition of inclusion size and composition data at a rate never before possible in SEM-based instruments. With built-in customized 'intelligent' software, the inclusion data is automatically sorted into clusters representing different inclusion types to define the characteristics of a particular heat (Figure 2). The ASCAT represents an innovative new tool for the collection of statistically meaningful data on inclusions, and provides a means of understanding the complicated effects of inclusions in the steel making process and on the resulting properties of steel. Research conducted by RJLG with AISI (American Iron and Steel Institute) and SMA (Steel Manufacturers of America) members indicates that the ASCAT has application in high-grade bar, sheet, plate, tin products, pipes, SBQ, tire cord, welding rod, and specialty steels and alloys where control of inclusions, whether natural or engineered, is crucial to their specification for a given end-use. Example applications include castability of calcium treated steel; interstitial free (IF) degasser grade slag conditioning practice; tundish clogging and erosion minimization; degasser circulation and optimization; quality assessment

  17. Osteolytica: An automated image analysis software package that rapidly measures cancer-induced osteolytic lesions in in vivo models with greater reproducibility compared to other commonly used methods.

    Science.gov (United States)

    Evans, H R; Karmakharm, T; Lawson, M A; Walker, R E; Harris, W; Fellows, C; Huggins, I D; Richmond, P; Chantry, A D

    2016-02-01

    Methods currently used to analyse osteolytic lesions caused by malignancies such as multiple myeloma and metastatic breast cancer vary from basic 2-D X-ray analysis to 2-D images of micro-CT datasets analysed with non-specialised image software such as ImageJ. However, these methods have significant limitations. They do not capture 3-D data, they are time-consuming and they often suffer from inter-user variability. We therefore sought to develop a rapid and reproducible method to analyse 3-D osteolytic lesions in mice with cancer-induced bone disease. To this end, we have developed Osteolytica, an image analysis software method featuring an easy to use, step-by-step interface to measure lytic bone lesions. Osteolytica utilises novel graphics card acceleration (parallel computing) and 3-D rendering to provide rapid reconstruction and analysis of osteolytic lesions. To evaluate the use of Osteolytica we analysed tibial micro-CT datasets from murine models of cancer-induced bone disease and compared the results to those obtained using a standard ImageJ analysis method. Firstly, to assess inter-user variability we deployed four independent researchers to analyse tibial datasets from the U266-NSG murine model of myeloma. Using ImageJ, inter-user variability between the bones was substantial (±19.6%), in contrast to using Osteolytica, which demonstrated minimal variability (±0.5%). Secondly, tibial datasets from U266-bearing NSG mice or BALB/c mice injected with the metastatic breast cancer cell line 4T1 were compared to tibial datasets from aged and sex-matched non-tumour control mice. Analyses by both Osteolytica and ImageJ showed significant increases in bone lesion area in tumour-bearing mice compared to control mice. These results confirm that Osteolytica performs as well as the current 2-D ImageJ osteolytic lesion analysis method. However, Osteolytica is advantageous in that it analyses over the entirety of the bone volume (as opposed to selected 2-D images), it

  18. A screened automated structural search with semiempirical methods

    CERN Document Server

    Ota, Yukihiro; Machida, Masahiko; Shiga, Motoyuki

    2016-01-01

    We developed an interface program between a program suite for an automated search of chemical reaction pathways, GRRM, and a program package of semiempirical methods, MOPAC. A two-step structural search is proposed as an application of this interface program. A screening test is first performed by semiempirical calculations. Subsequently, a reoptimization procedure is done by ab initio or density functional calculations. We apply this approach to ion adsorption on cellulose. The computational efficiency is also shown for a GRRM search. The interface program is suitable for the structural search of large molecular systems for which semiempirical methods are applicable.

  19. Comparative Analysis of Manual and Automated AFEES

    Science.gov (United States)

    1976-05-14

    system and in-house technical studies of AFEES-related issues. The design of the system was performed by Computer Sciences Corporation (CSC) ... "two day" blood pressure and pulse and/or answer any questions that some liaison would have concerning a previously physicalled applicant. This ... from the Consumption/Usage Listings as submitted by Computer Sciences Corporation and estimates for automated applicant forms from Central

  20. Feasibility studies of safety assessment methods for programmable automation systems. Final report of the AVV project

    Energy Technology Data Exchange (ETDEWEB)

    Haapanen, P.; Maskuniitty, M.; Pulkkinen, U. [VTT Automation, Espoo (Finland); Heikkinen, J.; Korhonen, J.; Tuulari, E. [VTT Electronics, Espoo (Finland)

    1995-10-01

    Feasibility studies of two different groups of methodologies for safety assessment of programmable automation systems have been executed at the Technical Research Centre of Finland (VTT). The studies concerned dynamic testing methods and the fault tree (FT) and failure mode and effects analysis (FMEA) methods. In order to gain real experience in the application of these methods, experimental testing of two realistic pilot systems was executed and a FT/FMEA analysis of a programmable safety function was accomplished. The purpose of the studies was not to assess the object systems, but to gain experience in the application of the methods and to assess their potentials and development needs. (46 refs., 21 figs.).

  1. Grasping devices and methods in automated production processes

    DEFF Research Database (Denmark)

    Fantoni, Gualtiero; Santochi, Marco; Dini, Gino

    2014-01-01

    In automated production processes grasping devices and methods play a crucial role in the handling of many parts, components and products. This keynote paper starts with a classification of grasping phases, describes how different principles are adopted at different scales in different applications...... and continues explaining different releasing strategies and principles. Then the paper classifies the numerous sensors used to monitor the effectiveness of grasping (part presence, exchanged force, stick-slip transitions, etc.). Later the grasping and releasing problems in different fields (from mechanical...

  2. Method and automated apparatus for detecting coliform organisms

    Science.gov (United States)

    Dill, W. P.; Taylor, R. E.; Jeffers, E. L. (Inventor)

    1980-01-01

    Method and automated apparatus are disclosed for determining the time of detection of metabolically produced hydrogen by coliform bacteria cultured in an electroanalytical cell from the time the cell is inoculated with the bacteria. The detection time data provide bacteria concentration values. The apparatus is sequenced and controlled by a digital computer to discharge a spent sample, clean and sterilize the culture cell, provide a bacteria nutrient into the cell, control the temperature of the nutrient, inoculate the nutrient with a bacteria sample, measure the electrical potential difference produced by the cell, and measure the time of detection from inoculation.

  3. Comparison of Particulate Mercury Measured with Manual and Automated Methods

    Directory of Open Access Journals (Sweden)

    Rachel Russo

    2011-01-01

    Full Text Available A study was conducted to compare measurements of particulate mercury (HgP) made with the manual filter method and with the automated Tekran system. Simultaneous measurements were conducted with the Tekran and Teflon filter methodologies in the marine and coastal continental atmospheres. Overall, the filter HgP values were on average 21% higher than the Tekran HgP, and >85% of the data were outside of the ±25% region surrounding the 1:1 line. In some cases the filter values were as much as 3-fold greater, with

  4. Automated SEM Modal Analysis Applied to the Diogenites

    Science.gov (United States)

    Bowman, L. E.; Spilde, M. N.; Papike, James J.

    1996-01-01

    Analysis of volume proportions of minerals, or modal analysis, is routinely accomplished by point counting on an optical microscope, but the process, particularly on brecciated samples such as the diogenite meteorites, is tedious and prone to error by misidentification of very small fragments, which may make up a significant volume of the sample. Precise volume percentage data can be gathered on a scanning electron microscope (SEM) utilizing digital imaging and an energy dispersive spectrometer (EDS). This form of automated phase analysis reduces error, and at the same time provides more information than could be gathered using simple point counting alone, such as particle morphology statistics and chemical analyses. We have previously studied major, minor, and trace-element chemistry of orthopyroxene from a suite of diogenites. This abstract describes the method applied to determine the modes on this same suite of meteorites and the results of that research. The modal abundances thus determined add additional information on the petrogenesis of the diogenites. In addition, low-abundance phases such as spinels were located for further analysis by this method.

  5. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.

  6. Image analysis and platform development for automated phenotyping in cytomics

    NARCIS (Netherlands)

    Yan, Kuan

    2013-01-01

    This thesis is dedicated to the empirical study of image analysis in HT/HC screen studies. Often a HT/HC screening produces extensive amounts of data that cannot be manually analyzed. Thus, an automated image analysis solution is a prerequisite for an objective understanding of the raw image data. Compared to general a

  7. Automated quantitative gait analysis in animal models of movement disorders

    Directory of Open Access Journals (Sweden)

    Vandeputte Caroline

    2010-08-01

    Full Text Available Abstract Background Accurate and reproducible behavioral tests in animal models are of major importance in the development and evaluation of new therapies for central nervous system disease. In this study we investigated for the first time gait parameters of rat models for Parkinson's disease (PD), Huntington's disease (HD) and stroke using the CatWalk method, a novel automated gait analysis test. Static and dynamic gait parameters were measured in all animal models, and these data were compared to readouts of established behavioral tests, such as the cylinder test in the PD and stroke rats and the rotarod test for the HD group. Results Hemiparkinsonian rats were generated by unilateral injection of the neurotoxin 6-hydroxydopamine in the striatum or in the medial forebrain bundle. For Huntington's disease, a transgenic rat model expressing a truncated huntingtin fragment with multiple CAG repeats was used. Thirdly, a stroke model was generated by a photothrombotically induced infarct in the right sensorimotor cortex. We found that multiple gait parameters were significantly altered in all three disease models compared to their respective controls. Behavioural deficits could be efficiently measured using the cylinder test in the PD and stroke animals, and in the case of the PD model, the deficits in gait essentially confirmed results obtained by the cylinder test. However, in the HD model and the stroke model the CatWalk analysis proved more sensitive than the rotarod test and also added new and more detailed information on specific gait parameters. Conclusion The automated quantitative gait analysis test may be a useful tool to study both motor impairment and recovery associated with various neurological motor disorders.

  8. Automated computational aberration correction method for broadband interferometric imaging techniques.

    Science.gov (United States)

    Pande, Paritosh; Liu, Yuan-Zhi; South, Fredrick A; Boppart, Stephen A

    2016-07-15

    Numerical correction of optical aberrations provides an inexpensive and simpler alternative to the traditionally used hardware-based adaptive optics techniques. In this Letter, we present an automated computational aberration correction method for broadband interferometric imaging techniques. In the proposed method, the process of aberration correction is modeled as a filtering operation on the aberrant image using a phase filter in the Fourier domain. The phase filter is expressed as a linear combination of Zernike polynomials with unknown coefficients, which are estimated through an iterative optimization scheme based on maximizing an image sharpness metric. The method is validated on both simulated data and experimental data obtained from a tissue phantom, an ex vivo tissue sample, and an in vivo photoreceptor layer of the human retina.
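
    A stripped-down numerical sketch of this scheme follows (our construction, with only defocus and astigmatism standing in for the full Zernike basis): the image spectrum is multiplied by exp(i·phase) in the Fourier domain, and the coefficients are found by maximizing a normalized sharpness metric.

    ```python
    # Toy version of phase-filter aberration correction: two Zernike-like
    # terms (defocus, astigmatism) stand in for the full basis, and the
    # coefficients are chosen to maximize a sharpness metric. Square images.
    import numpy as np
    from scipy.optimize import minimize

    def basis(n):
        y, x = np.mgrid[-1:1:complex(0, n), -1:1:complex(0, n)]
        return np.stack([2*(x**2 + y**2) - 1, x**2 - y**2])  # defocus, astigmatism

    def apply_phase(img, coeffs):
        phase = np.tensordot(coeffs, basis(img.shape[0]), axes=1)
        F = np.fft.fftshift(np.fft.fft2(img))
        return np.abs(np.fft.ifft2(np.fft.ifftshift(F * np.exp(1j * phase))))

    def neg_sharpness(coeffs, img):
        c = apply_phase(img, coeffs)
        return -np.sum(c**4) / np.sum(c**2)**2   # normalized sharpness metric

    img = np.random.default_rng(2).random((64, 64))
    res = minimize(neg_sharpness, x0=np.zeros(2), args=(img,), method="Nelder-Mead")
    print("estimated correction coefficients:", res.x)
    ```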

  9. Extending and automating a Systems-Theoretic hazard analysis for requirements generation and analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, John (Massachusetts Institute of Technology)

    2012-05-01

    Systems Theoretic Process Analysis (STPA) is a powerful new hazard analysis method designed to go beyond traditional safety techniques - such as Fault Tree Analysis (FTA) - that overlook important causes of accidents like flawed requirements, dysfunctional component interactions, and software errors. While proving to be very effective on real systems, no formal structure has been defined for STPA and its application has been ad-hoc with no rigorous procedures or model-based design tools. This report defines a formal mathematical structure underlying STPA and describes a procedure for systematically performing an STPA analysis based on that structure. A method for using the results of the hazard analysis to generate formal safety-critical, model-based system and software requirements is also presented. Techniques to automate both the analysis and the requirements generation are introduced, as well as a method to detect conflicts between the safety and other functional model-based requirements during early development of the system.

  10. Development of An Optimization Method for Determining Automation Rate in Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Kim, Jong Hyun [KEPCO, Ulsan (Korea, Republic of)

    2014-08-15

    Since automation was introduced in various industrial fields, it has been known that automation provides positive effects, such as greater efficiency and fewer human errors, as well as a negative effect known as out-of-the-loop (OOTL). Thus, before introducing automation in the nuclear field, the positive and negative effects of automation on human operators should be estimated. In this paper, by focusing on CPS, an optimization method to find an appropriate proportion of automation is suggested by integrating the proposed cognitive automation rate and the concept of the level of ostracism. The cognitive automation rate estimation method was suggested to express the reduced amount of human cognitive load, and the level of ostracism was suggested to express the difficulty in obtaining information from the automation system and the increased uncertainty of human operators' diagnoses. The maximum proportion of automation that maintains a high level of attention for monitoring the situation is derived by an experiment, and the automation rate is estimated by the suggested automation rate estimation method. This approach is expected to yield an appropriate proportion of automation that avoids the OOTL problem while retaining maximum efficacy.

  11. Formal Methods for Automated Diagnosis of Autosub 6000

    Science.gov (United States)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.

  12. An Exploration of Inexpensive, Automated Methods for Measuring River Stage

    Science.gov (United States)

    Kruger, A.; Niemeier, J. J.; Krajewski, W. F.; Ceynar, D.; Wagner, G. D.

    2009-12-01

    River stage measurements are of fundamental importance in many hydrologic and environmental applications. Even though the U.S. Geological Survey (USGS) operates a large network of stream gauging stations on larger rivers, river stage is unavailable in many locations, especially on smaller rivers and streams. We have been exploring a variety of relatively inexpensive techniques for automated river stage measurements, targeting medium and smaller rivers. These techniques include ultrasonic water-surface height measurement, submerged pressure transducers that incorporate a unique bladder system, and a new radio-based technique. The latter method uses inexpensive radios that operate in the unlicensed industrial, scientific, and medical (ISM) band, and uses the phase delay of radio waves from a submerged radio to infer river stage. The technique holds promise as an inexpensive/disposable sensor that can find application in rapid-response settings such as flooding. In this work, we present and discuss the instruments along with comparisons of the different methods.
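
    The abstract gives no algorithmic detail, but the physics behind a phase-delay gauge can be sketched: radio waves travel roughly sqrt(εr) ≈ 9 times slower in fresh water (εr ≈ 80), so the phase accumulated over the submerged path encodes depth. A toy calculation under those assumptions (the constants, frequency, and measured phase below are ours, and a real design must also resolve the 2π phase ambiguity):

    ```python
    # Back-of-envelope sketch of the physics only, not the authors' method:
    # phase accumulated over the submerged path maps to depth via the
    # (much shorter) wavelength of the radio wave in water.
    import math

    C = 3.0e8        # speed of light in vacuum, m/s
    EPS_R = 80.0     # assumed relative permittivity of fresh water

    def depth_from_phase(phase_rad, freq_hz):
        wavelength_in_water = C / (freq_hz * math.sqrt(EPS_R))   # metres
        return phase_rad / (2 * math.pi) * wavelength_in_water

    print(depth_from_phase(4.2, 915e6))  # 915 MHz ISM band -> depth in metres
    ```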

  13. Added value of a mandible movement automated analysis in the screening of obstructive sleep apnea.

    Science.gov (United States)

    Maury, Gisele; Cambron, Laurent; Jamart, Jacques; Marchand, Eric; Senny, Frédéric; Poirrier, Robert

    2013-02-01

    In-laboratory polysomnography is the 'gold standard' for diagnosing obstructive sleep apnea syndrome, but is time consuming and costly, with long waiting lists in many sleep laboratories. Therefore, the search for alternative methods to detect respiratory events is growing. In this prospective study, we compared attended polysomnography with two other methods, with or without mandible movement automated analysis provided by a distance-meter and added to airflow and oxygen saturation analysis for the detection of respiratory events. The mandible movement automated analysis allows for the detection of salient mandible movement, which is a surrogate for arousal. All parameters were recorded simultaneously in 570 consecutive patients (M/F: 381/189; age: 50±14 years; body mass index: 29±7 kg m⁻²) visiting a sleep laboratory. The most frequent main diagnoses were: obstructive sleep apnea (344; 60%); insomnia/anxiety/depression (75; 13%); and upper airway resistance syndrome (25; 4%). The correlation between polysomnography and the method with mandible movement automated analysis was excellent (r: 0.95; P<0.001). Accuracy characteristics of the methods showed a statistical improvement in sensitivity and negative predictive value with the addition of mandible movement automated analysis. This was true for different diagnostic thresholds of obstructive sleep severity, with an excellent efficiency for moderate to severe index (apnea-hypopnea index ≥15 h⁻¹). A Bland & Altman plot corroborated the analysis. The addition of mandible movement automated analysis significantly improves the respiratory index calculation accuracy compared with an airflow and oxygen saturation analysis. This is an attractive method for the screening of obstructive sleep apnea syndrome, increasing the ability to detect hypopnea thanks to the salient mandible movement as a marker of arousals.
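
    The agreement statistics used here are easy to reproduce. A minimal sketch of a Bland-Altman comparison on synthetic apnea-hypopnea indices (the variable names and noise level below are illustrative, not the study's data):

    ```python
    # Minimal Bland-Altman sketch: bias and 95% limits of agreement between
    # two respiratory indices, on synthetic values.
    import numpy as np

    def bland_altman(a, b):
        diff = a - b
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    rng = np.random.default_rng(3)
    ahi_psg = rng.gamma(2.0, 10.0, size=100)              # polysomnography index
    ahi_mma = ahi_psg + rng.normal(0.0, 2.0, size=100)    # index with MM analysis
    bias, limits = bland_altman(ahi_psg, ahi_mma)
    print(f"bias={bias:.2f}, limits of agreement={limits}")
    ```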

  14. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
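
    The bookkeeping burden is easy to picture: below is a sketch of the fan-out step, generating one input file per assembly from a tabular site description. The template only gestures at ORIGAMI syntax; the field names and file format here are hypothetical, not the actual SCALE input language.

    ```python
    # Sketch of the per-assembly fan-out the Automator replaces. The template
    # and column names are hypothetical illustrations, not real ORIGAMI syntax.
    import csv
    from pathlib import Path

    TEMPLATE = """=origami
    title={asm_id}
    enrichment={enrich}
    cycle_power={power}
    cycle_days={days}
    end
    """

    def write_inputs(table_csv, outdir):
        out = Path(outdir)
        out.mkdir(exist_ok=True)
        n = 0
        with open(table_csv, newline="") as fh:
            for row in csv.DictReader(fh):      # columns: asm_id,enrich,power,days
                (out / f"{row['asm_id']}.inp").write_text(TEMPLATE.format(**row))
                n += 1
        return n

    # Demo with two assemblies:
    Path("assemblies.csv").write_text(
        "asm_id,enrich,power,days\nA01,4.2,38.5,365\nA02,3.6,36.0,410\n")
    print(write_inputs("assemblies.csv", "origami_inputs"), "inputs written")
    ```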

  15. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  16. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Kanstrup, Anne-Marie Fiehn; Kristensson, Martin; Engel, Ulla

    2016-01-01

    PURPOSE: The aim of this study was to develop automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic...

  17. Automated pollen identification using microscopic imaging and texture analysis.

    Science.gov (United States)

    Marcos, J Víctor; Nava, Rodrigo; Cristóbal, Gabriel; Redondo, Rafael; Escalante-Ramírez, Boris; Bueno, Gloria; Déniz, Óscar; González-Porto, Amelia; Pardo, Cristina; Chung, François; Rodríguez, Tomás

    2015-01-01

    Pollen identification is required in different scenarios such as prevention of allergic reactions, climate analysis or apiculture. However, it is a time-consuming task since experts are required to recognize each pollen grain through the microscope. In this study, we performed an exhaustive assessment on the utility of texture analysis for automated characterisation of pollen samples. A database composed of 1800 brightfield microscopy images of pollen grains from 15 different taxa was used for this purpose. A pattern recognition-based methodology was adopted to perform pollen classification. Four different methods were evaluated for texture feature extraction from the pollen image: Haralick's gray-level co-occurrence matrices (GLCM), log-Gabor filters (LGF), local binary patterns (LBP) and discrete Tchebichef moments (DTM). Fisher's discriminant analysis and k-nearest neighbour were subsequently applied to perform dimensionality reduction and multivariate classification, respectively. Our results reveal that LGF and DTM, which are based on the spectral properties of the image, outperformed GLCM and LBP in the proposed classification problem. Furthermore, we found that the combination of all the texture features resulted in the highest performance, yielding an accuracy of 95%. Therefore, thorough texture characterisation could be considered in further implementations of automatic pollen recognition systems based on image processing techniques.
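
    One branch of such a pipeline is straightforward to reproduce. A hedged sketch of GLCM texture features plus k-NN classification with scikit-image and scikit-learn follows (random stand-in images and three dummy taxa; newer scikit-image spells the functions graycomatrix/graycoprops):

    ```python
    # Sketch of one pipeline branch: Haralick-style GLCM texture features
    # from grayscale images, classified with k-nearest neighbours.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops
    from sklearn.neighbors import KNeighborsClassifier

    def glcm_features(img8):
        g = graycomatrix(img8, distances=[1, 2], angles=[0, np.pi / 2],
                         levels=256, symmetric=True, normed=True)
        return np.hstack([graycoprops(g, p).ravel()
                          for p in ("contrast", "homogeneity",
                                    "energy", "correlation")])

    rng = np.random.default_rng(4)
    X = np.array([glcm_features(rng.integers(0, 256, (32, 32), dtype=np.uint8))
                  for _ in range(30)])
    y = rng.integers(0, 3, 30)                     # 3 dummy taxa
    clf = KNeighborsClassifier(n_neighbors=3).fit(X, y)
    print(clf.score(X, y))
    ```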

  18. When Phase Contrast Fails: ChainTracer and NucTracer, Two ImageJ Methods for Semi-Automated Single Cell Analysis Using Membrane or DNA Staining

    NARCIS (Netherlands)

    Syvertsson, S.; Vischer, N.O.E.; Gao, Y.; Hamoen, L.W.

    2016-01-01

    Within bacterial populations, genetically identical cells often behave differently. Single-cell measurement methods are required to observe this heterogeneity. Flow cytometry and fluorescence light microscopy are the primary methods to do this. However, flow cytometry requires reasonably strong fluo

  19. Automated analysis for lifecycle assembly processes

    Energy Technology Data Exchange (ETDEWEB)

    Calton, T.L.; Brown, R.G.; Peters, R.R.

    1998-05-01

    Many manufacturing companies today expend more effort on upgrade and disposal projects than on clean-slate design, and this trend is expected to become more prevalent in coming years. However, commercial CAD tools are better suited to initial product design than to the product's full life cycle. Computer-aided analysis, optimization, and visualization of life cycle assembly processes based on the product CAD data can help ensure accuracy and reduce effort expended in planning these processes for existing products, as well as provide design-for-lifecycle analysis for new designs. To be effective, computer-aided assembly planning systems must allow users to express the plan selection criteria that apply to their companies and products as well as to the life cycles of their products. Designing products for easy assembly and disassembly during their entire life cycles, for purposes including service, field repair, upgrade, and disposal, is a process that involves many disciplines. In addition, finding the best solution often involves considering the design as a whole and considering its intended life cycle. Different goals and constraints (compared to initial assembly) require one to re-visit the significant fundamental assumptions and methods that underlie current assembly planning techniques. Previous work in this area has been limited to either academic studies of issues in assembly planning or applied studies of life cycle assembly processes, which give no attention to automatic planning. It is believed that merging these two areas will result in a much greater ability to design for, optimize, and analyze life cycle assembly processes.

  20. Granulometric profiling of aeolian dust deposits by automated image analysis

    Science.gov (United States)

    Varga, György; Újvári, Gábor; Kovács, János; Jakab, Gergely; Kiss, Klaudia; Szalai, Zoltán

    2016-04-01

    Determination of granulometric parameters is of growing interest in the Earth sciences. Particle size data of sedimentary deposits provide insights into the physicochemical environment of transport, accumulation and post-depositional alterations of sedimentary particles, and are important proxies applied in paleoclimatic reconstructions. It is especially true for aeolian dust deposits with a fairly narrow grain size range as a consequence of the extremely selective nature of wind sediment transport. Therefore, various aspects of aeolian sedimentation (wind strength, distance to source(s), possible secondary source regions and modes of sedimentation and transport) can be reconstructed only from precise grain size data. As terrestrial wind-blown deposits are among the most important archives of past environmental changes, proper explanation of the proxy data is a mandatory issue. Automated imaging provides a unique technique to gather direct information on granulometric characteristics of sedimentary particles. Granulometric data obtained from automatic image analysis of Malvern Morphologi G3-ID is a rarely applied new technique for particle size and shape analyses in sedimentary geology. Size and shape data of several hundred thousand (or even million) individual particles were automatically recorded in this study from 15 loess and paleosoil samples from the captured high-resolution images. Several size (e.g. circle-equivalent diameter, major axis, length, width, area) and shape parameters (e.g. elongation, circularity, convexity) were calculated by the instrument software. At the same time, the mean light intensity after transmission through each particle is automatically collected by the system as a proxy of optical properties of the material. Intensity values are dependent on chemical composition and/or thickness of the particles. The results of the automated imaging were compared to particle size data determined by three different laser diffraction instruments
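
    Equivalent measurements can be approximated in open-source tools. Below is a sketch computing a few of the named size and shape parameters per particle with scikit-image regionprops (definitions simplified relative to the Morphologi software; "convexity" here is regionprops solidity, and attribute names follow scikit-image's classic API):

    ```python
    # Per-particle size and shape parameters from a binary particle image,
    # as a simplified stand-in for the Morphologi measurements named above.
    import numpy as np
    from skimage.measure import label, regionprops

    def particle_shapes(mask):
        rows = []
        for r in regionprops(label(mask)):
            if r.perimeter == 0:
                continue
            rows.append({
                "ced": r.equivalent_diameter,      # circle-equivalent diameter
                "elongation": r.minor_axis_length / max(r.major_axis_length, 1e-9),
                "circularity": 4 * np.pi * r.area / r.perimeter**2,
                "convexity": r.area / r.convex_area,   # solidity, strictly speaking
            })
        return rows

    mask = np.zeros((40, 40), bool); mask[5:15, 5:25] = True  # one rectangle
    print(particle_shapes(mask))
    ```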

  1. Volumetric measurements of pulmonary nodules: variability in automated analysis tools

    Science.gov (United States)

    Juluru, Krishna; Kim, Woojin; Boonn, William; King, Tara; Siddiqui, Khan; Siegel, Eliot

    2007-03-01

    Over the past decade, several computerized tools have been developed for detection of lung nodules and for providing volumetric analysis. Incidentally detected lung nodules have traditionally been followed over time by measurements of their axial dimensions on CT scans to ensure stability or document progression. A recently published article by the Fleischner Society offers guidelines on the management of incidentally detected nodules based on size criteria. For this reason, differences in measurements obtained by automated tools from various vendors may have significant implications on management, yet the degree of variability in these measurements is not well understood. The goal of this study is to quantify the differences in nodule maximum diameter and volume among different automated analysis software. Using a dataset of lung scans obtained with both "ultra-low" and conventional doses, we identified a subset of nodules in each of five size-based categories. Using automated analysis tools provided by three different vendors, we obtained size and volumetric measurements on these nodules, and compared these data using descriptive as well as ANOVA and t-test analysis. Results showed significant differences in nodule maximum diameter measurements among the various automated lung nodule analysis tools but no significant differences in nodule volume measurements. These data suggest that when using automated commercial software, volume measurements may be a more reliable marker of tumor progression than maximum diameter. The data also suggest that volumetric nodule measurements may be relatively reproducible among various commercial workstations, in contrast to the variability documented when performing human mark-ups, as is seen in the LIDC (lung imaging database consortium) study.
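
    The reported statistics are simple to emulate. A sketch of a one-way ANOVA across three hypothetical tools measuring the same nodules (synthetic volumes; a repeated-measures design would be stricter for paired measurements of the same nodules):

    ```python
    # One-way ANOVA across three simulated tools' volume measurements.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    true_vol = rng.uniform(50, 500, 25)                  # mm^3, 25 nodules
    tools = [true_vol + rng.normal(0, 10, 25) for _ in range(3)]

    f, p = stats.f_oneway(*tools)
    print(f"F={f:.2f}, p={p:.3f}")   # p > 0.05 -> no significant tool effect
    ```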

  2. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    Science.gov (United States)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A.G.; Sellergren, Börje; Reubsaet, Léon

    2017-01-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting. PMID:28303910

  3. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    Science.gov (United States)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon

    2017-03-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. Particularly low sample volumes were permitted using the automated extraction within a method which was time-efficient, thereby demonstrating the potential of such a strategy in a clinical setting.

  4. Automated liquid operation method for microfluidic heterogeneous immunoassay.

    Science.gov (United States)

    Yi, Hui; Pan, Jian-Zhang; Shi, Xiao-Tong; Fang, Qun

    2013-02-15

    In this work, an automated liquid operation method for multistep heterogeneous immunoassay toward point-of-care testing (POCT) was proposed. A miniaturized peristaltic pump was developed to control the flow direction, flow time and flow rate in the microliter range according to a program. The peristaltic pump has the advantages of simple structure, small size, low cost, and ease of construction and use. By coupling the peristaltic pump with an antibody-coated capillary and a reagent-preloaded cartridge, the complicated liquid handling operations for heterogeneous immunoassay, including sample metering and introduction, multistep reagent introduction and rinsing, could be triggered by a single action and accomplished automatically in 12 min. The analytical performance of the present immunoassay system was demonstrated in the measurement of human IgG with fluorescence detection. A detection limit of 0.68 μg/mL IgG and a dynamic range of 2-300 μg/mL were obtained.

  5. On Automating and Standardising Corpus Callosum Analysis in Brain MRI

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Skoglund, Karl

    2005-01-01

    Corpus callosum analysis is influenced by many factors. The effort in controlling these has previously been incomplete and scattered. This paper sketches a complete pipeline for automated corpus callosum analysis from magnetic resonance images, with focus on measurement standardisation....... The presented pipeline deals with i) estimation of the mid-sagittal plane, ii) localisation and registration of the corpus callosum, iii) parameterisation and representation of its contour, and iv) means of standardising the traditional reference area measurements....

  6. Fully automated apparatus for the proximate analysis of coals

    Energy Technology Data Exchange (ETDEWEB)

    Fukumoto, K.; Ishibashi, Y.; Ishii, T.; Maeda, K.; Ogawa, A.; Gotoh, K.

    1985-01-01

    The authors report the development of fully-automated equipment for the proximate analysis of coals, a development undertaken with the twin aims of labour-saving and developing robot applications technology. This system comprises a balance, electric furnaces, a sulfur analyzer, etc., arranged concentrically around a multi-jointed robot which automatically performs all the necessary operations, such as sampling and weighing the materials for analysis, and inserting and removing them from the furnaces. 2 references.

  7. A fully automated multicapillary electrophoresis device for DNA analysis.

    Science.gov (United States)

    Behr, S; Mätzig, M; Levin, A; Eickhoff, H; Heller, C

    1999-06-01

    We describe the construction and performance of a fully automated multicapillary electrophoresis system for the analysis of fluorescently labeled biomolecules. A special detection system allows the simultaneous spectral analysis of all 96 capillaries. The main features are true parallel detection without any moving parts, high robustness, and full compatibility with existing protocols. The device can process up to 40 microtiter plates (96- and 384-well) without human intervention, which means up to 15,000 samples before it has to be reloaded.

  8. Towards unsupervised analysis of second-order chromatographic data: automated selection of number of components in multivariate curve-resolution methods.

    Science.gov (United States)

    Vivó-Truyols, G; Torres-Lapasió, J R; García-Alvarez-Coque, M C; Schoenmakers, P J

    2007-07-27

    A method to apply multivariate curve resolution in an unattended manner is presented. The algorithm is suitable for performing deconvolution of two-way data (e.g. retrieving the individual elution profiles and spectra of co-eluting compounds from signals obtained from a chromatograph equipped with multiple-channel detection: LC-DAD or GC-MS). The method is especially adequate for achieving the advantages of deconvolution approaches when huge amounts of data are present and manual application of multivariate techniques is too time-consuming. The philosophy of the algorithm is to mimic the reactions of an expert user when applying the orthogonal projection approach--multivariate curve-resolution techniques. Basically, the method establishes a way to check the number of significant components in the data matrix. The performance of the method was superior to the Malinowski F-test. The algorithm was tested with HPLC-DAD signals.
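
    The pivotal step is deciding how many components are significant. A simplified singular-value heuristic on a simulated two-component LC-DAD matrix illustrates the idea (the authors' expert-mimicking rules are more elaborate than this fixed threshold):

    ```python
    # Simplified significant-component count for a two-way data matrix D:
    # count singular values well above a noise-derived floor.
    import numpy as np

    def n_components(D, noise_sd):
        s = np.linalg.svd(D, compute_uv=False)
        floor = noise_sd * np.sqrt(max(D.shape))   # rough noise singular-value scale
        return int(np.sum(s > 10 * floor))

    # Two overlapped Gaussian elution profiles x two random spectra, plus noise:
    t = np.linspace(0, 1, 200)[:, None]
    C = np.hstack([np.exp(-(t - 0.45)**2 / 0.005), np.exp(-(t - 0.55)**2 / 0.005)])
    S = np.abs(np.random.default_rng(6).normal(size=(2, 60)))
    D = C @ S + np.random.default_rng(7).normal(0, 0.01, (200, 60))
    print(n_components(D, 0.01))    # expected: 2
    ```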

  9. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates a large dataset that requires time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
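
    The first stage — locating high particle density in space-time — can be sketched with a 2-D histogram and a local-maximum filter (toy data; the paper's minimum-spanning-tree pruning and fuzzy clustering are omitted here):

    ```python
    # Toy density-maxima detection in (x, t) particle data via a 2-D
    # histogram and a local-maximum filter.
    import numpy as np
    from scipy.ndimage import maximum_filter

    def density_maxima(x, t, bins=64, min_count=20):
        h, xe, te = np.histogram2d(x, t, bins=bins)
        is_peak = (h == maximum_filter(h, size=5)) & (h >= min_count)
        ix, it = np.nonzero(is_peak)
        return list(zip(xe[ix], te[it]))   # lower-left corner of each peak bin

    rng = np.random.default_rng(8)
    # One dense "beam" cluster plus uniform background particles:
    x = np.concatenate([rng.normal(0, 0.1, 5000), rng.uniform(-3, 3, 2000)])
    t = np.concatenate([rng.normal(1, 0.1, 5000), rng.uniform(0, 2, 2000)])
    print(density_maxima(x, t))
    ```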

  10. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    The rapidly increasing volume of asteroseismic observations on solar-type stars has revealed a need for automated analysis tools. The reason for this is not only that individual analyses of single stars are rather time consuming, but more importantly that these large volumes of observations open...... the possibility to do population studies on large samples of stars and such population studies demand a consistent analysis. By consistent analysis we understand an analysis that can be performed without the need to make any subjective choices on e.g. mode identification and an analysis where the uncertainties...

  11. Fully Automated Operational Modal Analysis using multi-stage clustering

    Science.gov (United States)

    Neu, Eugen; Janser, Frank; Khatibi, Akbar A.; Orifici, Adrian C.

    2017-02-01

    Interest in robust automatic modal parameter extraction techniques has increased significantly in recent years, together with the rising demand for continuous health monitoring of critical infrastructure like bridges, buildings and wind turbine blades. In this study a novel, multi-stage clustering approach for Automated Operational Modal Analysis (AOMA) is introduced. In contrast to existing approaches, the procedure works without any user-provided thresholds, is applicable within large system order ranges, can be used with very small sensor numbers and does not place any limitations on the damping ratio or the complexity of the system under investigation. The approach works with any parametric system identification algorithm that uses the system order n as its sole parameter. Here a data-driven Stochastic Subspace Identification (SSI) method is used. Measurements from a wind tunnel investigation with a composite cantilever equipped with Fiber Bragg Grating Sensors (FBGSs) and piezoelectric sensors are used to assess the performance of the algorithm with a highly damped structure under low signal-to-noise-ratio conditions. The proposed method was able to identify all physical system modes in the investigated frequency range from over 1000 individual datasets using FBGSs under challenging signal-to-noise-ratio conditions and, under better signal conditions, from only two sensors.
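
    The clustering stage can be illustrated in miniature: pole estimates (frequency, damping) pooled over many model orders are grouped hierarchically, and dense clusters are taken as physical modes. The sketch below uses arbitrary thresholds, whereas the paper's contribution is precisely to derive them without user input:

    ```python
    # Toy single-stage clustering of simulated SSI pole estimates: two
    # physical modes plus spurious poles; dense clusters = modes.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage

    rng = np.random.default_rng(9)
    freqs = np.concatenate([rng.normal(5.0, 0.02, 40),
                            rng.normal(12.0, 0.05, 40),
                            rng.uniform(0, 20, 20)])      # Hz
    damps = np.concatenate([rng.normal(0.02, 0.002, 40),
                            rng.normal(0.04, 0.004, 40),
                            rng.uniform(0, 0.1, 20)])     # damping ratio
    X = np.column_stack([freqs / freqs.std(), damps / damps.std()])

    labels = fcluster(linkage(X, method="single"), t=0.3, criterion="distance")
    sizes = np.bincount(labels)
    modes = [int(c) for c in np.unique(labels) if sizes[c] >= 30]
    print("dense clusters (physical modes):", modes)
    ```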

  12. Towards Automated Design, Analysis and Optimization of Declarative Curation Workflows

    Directory of Open Access Journals (Sweden)

    Tianhong Song

    2014-10-01

    Full Text Available Data curation is increasingly important. Our previous work on a Kepler curation package has demonstrated advantages that come from automating data curation pipelines by using workflow systems. However, manually designed curation workflows can be error-prone and inefficient due to a lack of user understanding of the workflow system, misuse of actors, or human error. Correcting problematic workflows is often very time-consuming. A more proactive workflow system can help users avoid such pitfalls. For example, static analysis before execution can be used to detect the potential problems in a workflow and help the user to improve workflow design. In this paper, we propose a declarative workflow approach that supports semi-automated workflow design, analysis and optimization. We show how the workflow design engine helps users to construct data curation workflows, how the workflow analysis engine detects different design problems of workflows and how workflows can be optimized by exploiting parallelism.

  13. Sleep-spindle detection: crowdsourcing and evaluating performance of experts, non-experts and automated methods

    DEFF Research Database (Denmark)

    Warby, Simon C.; Wendt, Sabrina Lyngbye; Welinder, Peter;

    2014-01-01

    Sleep spindles are discrete, intermittent patterns of brain activity observed in human electroencephalographic data. Increasingly, these oscillations are of biological and clinical interest because of their role in development, learning and neurological disorders. We used an Internet interface...... to crowdsource spindle identification by human experts and non-experts, and we compared their performance with that of automated detection algorithms in data from middle- to older-aged subjects from the general population. We also refined methods for forming group consensus and evaluating the performance...... of event detectors in physiological data such as electroencephalographic recordings from polysomnography. Compared to the expert group consensus gold standard, the highest performance was by individual experts and the non-expert group consensus, followed by automated spindle detectors. This analysis showed...
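
    Evaluating a detector against the group-consensus gold standard reduces to interval matching. A minimal sketch scoring predicted spindle events by overlap and reporting F1 (our simplification of the by-event matching used in such studies):

    ```python
    # Score predicted spindle intervals against a consensus gold standard
    # by intersection-over-union matching, then report F1.
    def f1_by_overlap(pred, gold, min_iou=0.2):
        def iou(a, b):
            inter = max(0.0, min(a[1], b[1]) - max(a[0], b[0]))
            union = (a[1] - a[0]) + (b[1] - b[0]) - inter
            return inter / union if union else 0.0
        tp = sum(any(iou(p, g) >= min_iou for g in gold) for p in pred)
        prec = tp / len(pred) if pred else 0.0
        rec = sum(any(iou(g, p) >= min_iou for p in pred) for g in gold) / len(gold)
        return 2 * prec * rec / (prec + rec) if prec + rec else 0.0

    gold = [(10.2, 11.0), (54.3, 55.1), (80.0, 80.9)]   # start/end in seconds
    pred = [(10.3, 11.1), (70.0, 70.6)]
    print(f1_by_overlap(pred, gold))    # 0.4 for this toy example
    ```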

  14. Method and system for assigning a confidence metric for automated determination of optic disc location

    Science.gov (United States)

    Karnowski, Thomas P [Knoxville, TN; Tobin, Jr., Kenneth W.; Muthusamy Govindasamy, Vijaya Priya [Knoxville, TN; Chaum, Edward [Memphis, TN

    2012-07-10

    A method for assigning a confidence metric for automated determination of optic disc location that includes analyzing a retinal image and determining at least two sets of coordinates locating an optic disc in the retinal image. The sets of coordinates can be determined using first and second image analysis techniques that are different from one another. An accuracy parameter can be calculated and compared to a primary risk cut-off value. A high confidence level can be assigned to the retinal image if the accuracy parameter is less than the primary risk cut-off value and a low confidence level can be assigned to the retinal image if the accuracy parameter is greater than the primary risk cut-off value. The primary risk cut-off value being selected to represent an acceptable risk of misdiagnosis of a disease having retinal manifestations by the automated technique.
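
    The decision rule itself is compact. A sketch with Euclidean disagreement between the two localizations standing in for the accuracy parameter (the patent's actual parameter and cut-off derivation are not specified here, so both are assumptions):

    ```python
    # Two independent optic-disc localizations, an accuracy parameter
    # (here simply their Euclidean distance), and a risk cut-off
    # separating high from low confidence.
    import math

    def confidence(loc_a, loc_b, risk_cutoff_px):
        accuracy = math.dist(loc_a, loc_b)   # disagreement between the two methods
        return "high" if accuracy < risk_cutoff_px else "low"

    print(confidence((412, 256), (418, 260), risk_cutoff_px=25.0))  # -> high
    print(confidence((412, 256), (512, 400), risk_cutoff_px=25.0))  # -> low
    ```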

  15. Development of methods for DSM and distribution automation planning

    Energy Technology Data Exchange (ETDEWEB)

    Kaerkkaeinen, S.; Kekkonen, V. [VTT Energy, Espoo (Finland); Rissanen, P. [Tietosavo Oy (Finland)

    1998-08-01

    Demand-Side Management (DSM) is usually a utility (or sometimes governmental) activity designed to influence the energy demand of customers (both the level and the load variation). It includes basic options like strategic conservation or load growth, peak clipping, load shifting and fuel switching. Typical ways to realize DSM are direct load control, innovative tariffs, different types of campaigns, etc. The restructuring of utilities in Finland and increased competition in the electricity market have had a dramatic influence on DSM. Traditional ways are impossible due to the conflicting interests of the generation, network and supply businesses and the increased competition between different actors in the market. The costs and benefits of DSM are divided among different companies, and different types of utilities are interested only in those activities which are beneficial to them. On the other hand, due to the increased competition, suppliers are diversifying to different types of products, and an increasing number of customer services partly based on DSM are available. The aim of this project was to develop and assess methods for DSM and distribution automation planning from the utility point of view. The methods were also applied to case studies at utilities.

  16. Comparing a Perceptual and an Automated Vision-Based Method for Lie Detection in Younger Children

    Science.gov (United States)

    Serras Pereira, Mariana; Cozijn, Reinier; Postma, Eric; Shahid, Suleman; Swerts, Marc

    2016-01-01

    The present study investigates how easily it can be detected whether a child is being truthful or not in a game situation, and it explores the cue validity of bodily movements for such type of classification. To achieve this, we introduce an innovative methodology – the combination of perception studies (in which eye-tracking technology is being used) and automated movement analysis. Film fragments from truthful and deceptive children were shown to human judges who were given the task to decide whether the recorded child was being truthful or not. Results reveal that judges are able to accurately distinguish truthful clips from lying clips in both perception studies. Even though the automated movement analysis for overall and specific body regions did not yield significant results between the experimental conditions, we did find a positive correlation between the amount of movement in a child and the perception of lies, i.e., the more movement the children exhibited during a clip, the higher the chance that the clip was perceived as a lie. The eye-tracking study revealed that, even when there is movement happening in different body regions, judges tend to focus their attention mainly on the face region. This is the first study that compares a perceptual and an automated method for the detection of deceptive behavior in children whose data have been elicited through an ecologically valid paradigm. PMID:28018271

  17. Implicit media frames: automated analysis of public debate on artificial sweeteners.

    Science.gov (United States)

    Hellsten, Iina; Dawson, James; Leydesdorff, Loet

    2010-09-01

    The framing of issues in the mass media plays a crucial role in the public understanding of science and technology. This article contributes to research concerned with the analysis of media frames over time by making an analytical distinction between implicit and explicit media frames, and by introducing an automated method for the analysis of implicit frames. In particular, we apply a semantic maps method to a case study on the newspaper debate about artificial sweeteners, published in the New York Times between 1980 and 2006. Our results show that the analysis of semantic changes enables us to filter out the dynamics of implicit frames, and to detect emerging metaphors in public debates. Theoretically, we discuss the relation between implicit frames in public debates and the codification of meaning and information in scientific discourses, and suggest further avenues for research interested in the automated analysis of frame changes and trends in public debates.

  18. Flux-P: Automating Metabolic Flux Analysis

    OpenAIRE

    Ebert, Birgitta E.; Lamprecht, Anna-Lena; Steffen, Bernhard; Blank, Lars M.

    2012-01-01

    Quantitative knowledge of intracellular fluxes in metabolic networks is invaluable for inferring metabolic system behavior and the design principles of biological systems. However, intracellular reaction rates often cannot be calculated directly but have to be estimated; for instance, via 13C-based metabolic flux analysis, a model-based interpretation of stable carbon isotope patterns in intermediates of metabolism. Existing software such as FiatFlux, OpenFLUX or 13CFLUX supports experts in ...

  19. A Mixed Approach Of Automated ECG Analysis

    Science.gov (United States)

    De, A. K.; Das, J.; Majumder, D. Dutta

    1982-11-01

    ECG is a non-invasive and risk-free technique for collecting data about the functional state of the heart. All existing data-processing techniques can be classified into two basically different approaches -- the first- and second-generation ECG computer programs. Not the opposition but the symbiosis of these two approaches will lead to systems with the highest accuracy. In this paper we describe a mixed approach which shows higher accuracy with a smaller amount of computational work. Key words: Primary features, Patients' parameter matrix, Screening, Logical comparison technique, Multivariate statistical analysis, Mixed approach.

  20. Fuzzy Emotional Semantic Analysis and Automated Annotation of Scene Images

    Directory of Open Access Journals (Sweden)

    Jianfang Cao

    2015-01-01

    Full Text Available With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  1. Fuzzy emotional semantic analysis and automated annotation of scene images.

    Science.gov (United States)

    Cao, Jianfang; Chen, Lichao

    2015-01-01

    With the advances in electronic and imaging techniques, the production of digital images has rapidly increased, and the extraction and automated annotation of emotional semantics implied by images have become issues that must be urgently addressed. To better simulate human subjectivity and ambiguity for understanding scene images, the current study proposes an emotional semantic annotation method for scene images based on fuzzy set theory. A fuzzy membership degree was calculated to describe the emotional degree of a scene image and was implemented using the Adaboost algorithm and a back-propagation (BP) neural network. The automated annotation method was trained and tested using scene images from the SUN Database. The annotation results were then compared with those based on artificial annotation. Our method showed an annotation accuracy rate of 91.2% for basic emotional values and 82.4% after extended emotional values were added, which correspond to increases of 5.5% and 8.9%, respectively, compared with the results from using a single BP neural network algorithm. Furthermore, the retrieval accuracy rate based on our method reached approximately 89%. This study attempts to lay a solid foundation for the automated emotional semantic annotation of more types of images and therefore is of practical significance.

  2. Automated identification of mitochondrial regions in complex intracellular space by texture analysis

    Science.gov (United States)

    Pham, Tuan D.

    2014-01-01

    Automated processing and quantification of biological images have been attracting increasing attention from researchers in image processing and pattern recognition, because computerized image and pattern analyses play critical roles in new biological findings and in drug discovery based on modern high-throughput and high-content image screening. This paper presents a study of the automated detection of regions of mitochondria, a subcellular structure of eukaryotic cells, in microscopy images. The automated identification of mitochondria in intracellular space captured by the state-of-the-art combination of focused ion beam and scanning electron microscope imaging reported here is the first of its type. Existing methods and a proposed algorithm for texture analysis were tested with real intracellular images. The high correct detection rate for locating mitochondria in a complex environment suggests the effectiveness of the proposed study.

  3. Automated face analysis by feature point tracking has high concurrent validity with manual FACS coding.

    Science.gov (United States)

    Cohn, J F; Zlochower, A J; Lien, J; Kanade, T

    1999-01-01

    The face is a rich source of information about human behavior. Available methods for coding facial displays, however, are human-observer dependent, labor intensive, and difficult to standardize. To enable rigorous and efficient quantitative measurement of facial displays, we have developed an automated method of facial display analysis. In this report, we compare the results of this automated system with those of manual FACS (Facial Action Coding System, Ekman & Friesen, 1978a) coding. One hundred university students were videotaped while performing a series of facial displays. The image sequences were coded from videotape by certified FACS coders. Fifteen action units and action unit combinations that occurred a minimum of 25 times were selected for automated analysis. Facial features were automatically tracked in digitized image sequences using a hierarchical algorithm for estimating optical flow. The measurements were normalized for variation in position, orientation, and scale. The image sequences were randomly divided into a training set and a cross-validation set, and discriminant function analyses were conducted on the feature point measurements. In the training set, average agreement with manual FACS coding was 92% or higher for action units in the brow, eye, and mouth regions. In the cross-validation set, average agreement was 91%, 88%, and 81% for action units in the brow, eye, and mouth regions, respectively. Automated face analysis by feature point tracking demonstrated high concurrent validity with manual FACS coding.

  4. Crowdsourcing scoring of immunohistochemistry images: Evaluating Performance of the Crowd and an Automated Computational Method

    Science.gov (United States)

    Irshad, Humayun; Oh, Eun-Yeong; Schmolze, Daniel; Quintana, Liza M.; Collins, Laura; Tamimi, Rulla M.; Beck, Andrew H.

    2017-01-01

    The assessment of protein expression in immunohistochemistry (IHC) images provides important diagnostic, prognostic and predictive information for guiding cancer diagnosis and therapy. Manual scoring of IHC images represents a logistical challenge, as the process is labor intensive and time consuming. Over the last decade, computational methods have been developed to enable the application of quantitative methods for the analysis and interpretation of protein expression in IHC images. These methods have not yet replaced manual scoring for the assessment of IHC in the majority of diagnostic laboratories and in many large-scale research studies. An alternative approach is crowdsourcing the quantification of IHC images to an undefined crowd. The aim of this study is to quantify IHC images for labeling of ER status with two different crowdsourcing approaches, image-labeling and nuclei-labeling, and to compare their performance with automated methods. Crowdsourcing-derived scores obtained greater concordance with the pathologist interpretations for both image-labeling and nuclei-labeling tasks (83% and 87%, respectively), compared to the concordance achieved by the automated method (81%) on 5,338 TMA images from 1,853 breast cancer patients. This analysis shows that crowdsourcing the scoring of protein expression in IHC images is a promising new approach for large-scale cancer molecular pathology studies. PMID:28230179

  5. Automation of Classical QEEG Trending Methods for Early Detection of Delayed Cerebral Ischemia: More Work to Do.

    Science.gov (United States)

    Wickering, Ellis; Gaspard, Nicolas; Zafar, Sahar; Moura, Valdery J; Biswal, Siddharth; Bechek, Sophia; OʼConnor, Kathryn; Rosenthal, Eric S; Westover, M Brandon

    2016-06-01

    The purpose of this study is to evaluate automated implementations of continuous EEG monitoring-based detection of delayed cerebral ischemia based on methods used in classical retrospective studies. We studied 95 patients with either Fisher 3 or Hunt Hess 4 to 5 aneurysmal subarachnoid hemorrhage who were admitted to the Neurosciences ICU and underwent continuous EEG monitoring. We implemented several variations of two classical algorithms for automated detection of delayed cerebral ischemia based on decreases in alpha-delta ratio and relative alpha variability. Of 95 patients, 43 (45%) developed delayed cerebral ischemia. Our automated implementation of the classical alpha-delta ratio-based trending method resulted in a sensitivity and specificity (Se,Sp) of (80,27)%, compared with the values of (100,76)% reported in the classic study using similar methods in a nonautomated fashion. Our automated implementation of the classical relative alpha variability-based trending method yielded (Se,Sp) values of (65,43)%, compared with (100,46)% reported in the classic study using nonautomated analysis. Our findings suggest that improved methods to detect decreases in alpha-delta ratio and relative alpha variability are needed before an automated EEG-based early delayed cerebral ischemia detection system is ready for clinical use.
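    For readers unfamiliar with the alpha-delta ratio (ADR), the sketch below shows one common way such a trend can be computed from continuous EEG; the band edges and the 50% alarm threshold are generic choices, not the study's exact settings.

```python
# Minimal ADR-trend sketch; band edges and alarm threshold are common
# conventions, not the study's parameters.
import numpy as np
from scipy.signal import welch

def alpha_delta_ratio(eeg, fs):
    """Return alpha (8-13 Hz) to delta (1-4 Hz) power ratio for one epoch."""
    f, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))
    alpha = np.trapz(psd[(f >= 8) & (f <= 13)], f[(f >= 8) & (f <= 13)])
    delta = np.trapz(psd[(f >= 1) & (f <= 4)], f[(f >= 1) & (f <= 4)])
    return alpha / delta

def adr_alarm(epochs, fs, baseline_epochs=6, drop=0.5):
    """Flag epochs whose ADR falls more than `drop` below the baseline mean."""
    adr = np.array([alpha_delta_ratio(e, fs) for e in epochs])
    baseline = adr[:baseline_epochs].mean()
    return adr < (1.0 - drop) * baseline
```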

  6. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    It has for a long time been a challenge to build secure networking systems. One way to counter this problem is to provide developers of software applications for networking systems with easy-to-use tools that can check security properties before the applications ever reach the market. These tools ... will both help raise the general level of awareness of the problems and prevent the most basic flaws from occurring. This thesis contributes to the development of such tools. Networking systems typically try to attain secure communication by applying standard cryptographic techniques. In this thesis ... attacks, and attacks launched by insiders. Finally, the perspectives for the application of the analysis techniques are discussed, thereby coming a small step closer to providing developers with easy-to-use tools for validating the security of networking applications....

  7. Background Defect Density Reduction Using Automated Defect Inspection And Analysis

    Science.gov (United States)

    Weirauch, Steven C.

    1988-01-01

    Yield maintenance and improvement is a major area of concern in any integrated circuit manufacturing operation. A major aspect of this concern is controlling and reducing defect density. Obviously, large defect excursions must be immediately addressed in order to maintain yield levels. However, to enhance yields, the subtle defect mechanisms must be reduced or eliminated as well. In-line process control inspections are effective for detecting large variations in the defect density on a real-time basis. Examples of in-line inspection strategies include after-develop or after-etch inspections. They are usually effective for detecting when a particular process segment has gone out of control. However, when a process is running normally, there exists a background defect density that is generally not resolved by in-line process control inspections. The inspection strategies that are frequently used to monitor the background defect density are offline inspections. Offline inspections are used to identify the magnitude and characteristics of the background defect density. These inspections sample larger areas of product wafers than the in-line inspections to allow identification of the defect-generating mechanisms that normally occur in the process. They are used to construct a database over a period of time so that trends may be studied. This information enables engineering efforts to be focused on the mechanisms that have the greatest impact on device yield. Once trouble spots in the process are identified, the database supplies the information needed to isolate and solve them. The key aspect of the entire program is to utilize a reliable data-gathering mechanism coupled with a flexible information processing system. This paper describes one method of reducing the background defect density using automated wafer inspection and analysis. The tools used in this evaluation were the KLA 2020 Wafer Inspector, the KLA Utility Terminal (KLAUT), and a new software package developed

  8. A new, fast and semi-automated size determination method (SASDM) for studying multicellular tumor spheroids

    Directory of Open Access Journals (Sweden)

    Lindhe Örjan

    2005-11-01

    Full Text Available Abstract Background Considering the extent and importance of using Multicellular Tumor Spheroids (MTS) in oncology research, size determination of MTSs by an accurate and fast method is essential. In the present study an effective, fast and semi-automated method, SASDM, was developed to determine the size of MTSs. The method was applied and tested in MTSs of three different cell-lines. Frozen-section autoradiography and Hematoxylin and Eosin (H&E) staining were used for further confirmation. Results SASDM was shown to be effective, user-friendly, and time-efficient, to be more precise than the traditional methods, and to be applicable to MTSs of different cell-lines. Furthermore, the results of image analysis showed high correspondence to the results of autoradiography and staining. Conclusion The combination of assessment of metabolic condition and image analysis in MTSs provides a good model to evaluate the effect of various anti-cancer treatments.

  9. An automated image analysis system to measure and count organisms in laboratory microcosms.

    Directory of Open Access Journals (Sweden)

    François Mallard

    Full Text Available 1. Because of recent technological improvements in the performance of computers and digital cameras, the potential use of imaging for contributing to the study of communities, populations or individuals in laboratory microcosms has risen enormously. However, its use has remained limited owing to difficulties in the automation of image analysis. 2. We present an accurate and flexible method of image analysis for detecting, counting and measuring moving particles on a fixed but heterogeneous substrate. This method has been specifically designed to follow individuals, or entire populations, in experimental laboratory microcosms, but it can be used in other applications. 3. The method consists in comparing multiple pictures of the same experimental microcosm in order to generate an image of the fixed background. This background is then used to extract, measure and count the moving organisms, leaving out the fixed background and the motionless or dead individuals. 4. We provide different examples (springtails, ants, nematodes, daphnia) to show that this non-intrusive method is efficient at detecting organisms under a wide variety of conditions, even on faintly contrasted and heterogeneous substrates. 5. The repeatability and reliability of this method have been assessed using experimental populations of the Collembola Folsomia candida. 6. We present an ImageJ plugin to automate the analysis of digital pictures of laboratory microcosms. The plugin automates the successive steps of the analysis and recursively analyses multiple sets of images, rapidly producing measurements from a large number of replicated microcosms.
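    The core of the described method — estimating the fixed background from multiple frames and extracting moving organisms as deviations from it — can be sketched as follows; the median background, threshold, and minimum object size are assumptions for illustration, not the plugin's actual parameters.

```python
# Hedged sketch of multi-frame background estimation and moving-object
# counting; parameter values are illustrative.
import numpy as np
from scipy import ndimage

def count_moving_organisms(frames, threshold=25.0, min_size=20):
    """frames: array (n_frames, h, w) of grayscale images of one microcosm."""
    frames = frames.astype(float)
    background = np.median(frames, axis=0)                # fixed substrate estimate
    moving = np.abs(frames[-1] - background) > threshold  # deviations = movers
    labels, n = ndimage.label(moving)                     # connected components
    sizes = ndimage.sum(moving, labels, range(1, n + 1))
    keep = sizes >= min_size                              # reject noise specks
    return int(keep.sum()), sizes[keep]                   # count and pixel areas
```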

  10. An Innovative Requirements Solution: Combining Six Sigma KJ Language Data Analysis with Automated Content Analysis

    Science.gov (United States)

    2009-03-01


  11. Implicit media frames: Automated analysis of public debate on artificial sweeteners

    CERN Document Server

    Hellsten, Iina; Leydesdorff, Loet

    2010-01-01

    The framing of issues in the mass media plays a crucial role in the public understanding of science and technology. This article contributes to research concerned with diachronic analysis of media frames by making an analytical distinction between implicit and explicit media frames, and by introducing an automated method for analysing diachronic changes of implicit frames. In particular, we apply a semantic maps method to a case study on the newspaper debate about artificial sweeteners, published in The New York Times (NYT) between 1980 and 2006. Our results show that the analysis of semantic changes enables us to filter out the dynamics of implicit frames, and to detect emerging metaphors in public debates. Theoretically, we discuss the relation between implicit frames in public debates and codification of information in scientific discourses, and suggest further avenues for research interested in the automated analysis of frame changes and trends in public debates.

  12. Detailed interrogation of trypanosome cell biology via differential organelle staining and automated image analysis

    Directory of Open Access Journals (Sweden)

    Wheeler Richard J

    2012-01-01

    Full Text Available Abstract Background Many trypanosomatid protozoa are important human or animal pathogens. The well-defined morphology and precisely choreographed division of trypanosomatid cells makes morphological analysis a powerful tool for analyzing the effect of mutations, chemical insults and changes between lifecycle stages. High-throughput image analysis of micrographs has the potential to accelerate collection of quantitative morphological data. Trypanosomatid cells have two large DNA-containing organelles, the kinetoplast (mitochondrial DNA) and the nucleus, which provide useful markers for morphometric analysis; however, they need to be accurately identified and often lie in close proximity. This presents a technical challenge. Accurate identification and quantitation of the DNA content of these organelles is a central requirement of any automated analysis method. Results We have developed a technique based on double staining of the DNA with a minor groove-binding stain (4',6-diamidino-2-phenylindole; DAPI) and a base-pair-intercalating stain (propidium iodide, PI, or SYBR green) and color deconvolution. This allows the identification of kinetoplast and nuclear DNA in the micrograph based on whether the organelle has DNA with a more A-T or G-C rich composition. Following unambiguous identification of the kinetoplasts and nuclei the resulting images are amenable to quantitative automated analysis of kinetoplast and nucleus number and DNA content. On this foundation we have developed a demonstrative analysis tool capable of measuring kinetoplast and nucleus DNA content, size and position and cell body shape, length and width automatically. Conclusions Our approach to DNA staining and automated quantitative analysis of trypanosomatid morphology accelerated analysis of trypanosomatid protozoa. We have validated this approach using Leishmania mexicana, Crithidia fasciculata and wild-type and mutant Trypanosoma brucei. Automated analysis of T. brucei
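    A hedged sketch of the organelle-classification idea: once the two stains are separated (e.g. by color deconvolution), each labeled DNA-containing object can be classified by its DAPI/PI intensity ratio, since A-T rich kinetoplast DNA binds the minor-groove stain more strongly. The function and threshold below are illustrative, not taken from the paper.

```python
# Illustrative DAPI/PI ratio classification of DNA-containing organelles;
# the ratio cutoff is an assumption, not the paper's value.
import numpy as np
from scipy import ndimage

def classify_organelles(dapi, pi, mask, ratio_cutoff=1.5):
    """dapi, pi: deconvolved stain images; mask: binary DNA mask."""
    labels, n = ndimage.label(mask)
    kinds = {}
    for i in range(1, n + 1):
        region = labels == i
        ratio = dapi[region].mean() / max(pi[region].mean(), 1e-9)
        kinds[i] = "kinetoplast" if ratio > ratio_cutoff else "nucleus"
    return kinds
```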

  13. Monitoring method for automated CD-SEM recipes

    Science.gov (United States)

    Maeda, Tatsuya; Iwama, Satoru; Nishihara, Makoto; Berger, Daniel; Berger, Andrew; Ueda, Kazuhiro; Kenichi, Takenouchi; Iizumi, Takashi

    2005-05-01

    A prototype of a digital video storage system (CD-watcher) has been developed and attached to a Hitachi S-9380 CD-SEM. The storage system has several modes that are selectable depending on the phenomenon of interest. The system can store video images of duration from a few seconds to a few weeks depending on resolution, sampling rate, and hard disc drive capacity. The system was used to analyze apparent focusing problems that occurred during the execution of automated recipes. Intermittent focusing problems had been an issue on a particular tool for a period of approximately three months. A review of saved images initially suggested that the problem was due to auto focus. Two days after installation, the CD-watcher system was able to record the errors, making it possible to determine the root cause by checking the stored video files. After analysis of the stored video files, it was apparent that the problem consisted of three types of errors. The ability to record and store video files reduced the time needed to isolate the problem and prevented incorrect diagnosis. The system was also used to explain a complex phenomenon that occurred during the observation of a particular layer. Because it is sometimes difficult to accurately describe, and to have others easily understand, certain phenomena in a written report, the video storage system can be used in place of manual annotation. In this report, we describe the CD-watcher system, test results after installing the system on a Hitachi S-9380 CD-SEM, and potential applications of the system.

  14. Colorimetric determination of nitrate plus nitrite in water by enzymatic reduction, automated discrete analyzer methods

    Science.gov (United States)

    Patton, Charles J.; Kryskalla, Jennifer R.

    2011-01-01

    This report documents work at the U.S. Geological Survey (USGS) National Water Quality Laboratory (NWQL) to validate enzymatic reduction, colorimetric determinative methods for nitrate + nitrite in filtered water by automated discrete analysis. In these standard- and low-level methods (USGS I-2547-11 and I-2548-11), nitrate is reduced to nitrite with nontoxic, soluble nitrate reductase rather than toxic, granular, copperized cadmium used in the longstanding USGS automated continuous-flow analyzer methods I-2545-90 (NWQL laboratory code 1975) and I-2546-91 (NWQL laboratory code 1979). Colorimetric reagents used to determine resulting nitrite in aforementioned enzymatic- and cadmium-reduction methods are identical. The enzyme used in these discrete analyzer methods, designated AtNaR2 by its manufacturer, is produced by recombinant expression of the nitrate reductase gene from wall cress (Arabidopsis thaliana) in the yeast Pichia pastoris. Unlike other commercially available nitrate reductases we evaluated, AtNaR2 maintains high activity at 37°C and is not inhibited by high-phenolic-content humic acids at reaction temperatures in the range of 20°C to 37°C. These previously unrecognized AtNaR2 characteristics are essential for successful performance of discrete analyzer nitrate + nitrite assays (henceforth, DA-AtNaR2) described here.

  15. A novel automated hydrophilic interaction liquid chromatography method using diode-array detector/electrospray ionization tandem mass spectrometry for analysis of sodium risedronate and related degradation products in pharmaceuticals.

    Science.gov (United States)

    Bertolini, Tiziana; Vicentini, Lorenza; Boschetti, Silvia; Andreatta, Paolo; Gatti, Rita

    2014-10-24

    A simple, sensitive and fast hydrophilic interaction liquid chromatography (HILIC) method using an ultraviolet diode-array detector (UV-DAD) and electrospray ionization tandem mass spectrometry was developed for the automated high performance liquid chromatography (HPLC) determination of sodium risedronate (SR) and its degradation products in new pharmaceuticals. The chromatographic separations were performed on an Ascentis Express HILIC 2.7 μm (150 mm × 2.1 mm, i.d.) stainless steel column (fused core). The mobile phase consisted of formate buffer solution (pH 3.4; 0.03 M)/acetonitrile 42:58 and 45:55 (v/v) for granules for oral solution and effervescent tablet analysis, respectively, at a flow-rate of 0.2 mL/min, setting the wavelength at 262 nm. Stability characteristics of SR were evaluated by performing stress test studies. The main degradation product formed under oxidation conditions, corresponding to sodium hydrogen (1-hydroxy-2-(1-oxidopyridin-3-yl)-1-phosphonoethyl)phosphonate, was characterized by high performance liquid chromatography-electrospray ionization tandem mass spectrometry (HPLC-ESI-MS/MS). The validation parameters such as linearity, sensitivity, accuracy, precision and selectivity were found to be highly satisfactory. Linear responses were observed in standard and in fortified placebo solutions. Intra-day precision (relative standard deviation, RSD) was ≤1.1% for peak area and ≤0.2% for retention times (tR), without significant differences between intra- and inter-day data. Recovery studies showed good results for all the examined compounds (from 98.7 to 101.0%) with RSD ranging from 0.6 to 0.7%. The limits of detection (LOD) and quantitation (LOQ) were 1 and 3 ng/mL, respectively. The high stability of standard and sample solutions at room temperature is a distinct advantage of the method, allowing the simultaneous preparation of many samples and consecutive chromatographic analyses using an autosampler. The developed stability-indicating

  16. Comparison of a quantitative microtiter method, a quantitative automated method, and the plate-count method for determining microbial complement resistance.

    Science.gov (United States)

    Lee, M D; Wooley, R E; Brown, J; Spears, K R; Nolan, L K; Shotts, E B

    1991-01-01

    A quantitative microtiter method for determining the degree of complement resistance or sensitivity of microorganisms is described. The microtiter method is compared with a quantitative automated system and the standard plate-count technique. Data were accumulated from 30 avian Escherichia coli isolates incubated at 35 C with either chicken plasma or heat-inactivated chicken plasma. Analysis of data generated by the automated system and plate-count techniques resulted in a classification of the microorganisms into three groups: those sensitive to the action of complement; those of intermediate sensitivity to the action of complement; and those resistant to the action of complement. Although the three methods studied did not agree absolutely, there were statistically significant correlations among them.

  17. Automated monitoring of activated sludge using image analysis

    OpenAIRE

    Motta, Maurício da; Pons, M. N.; Roche, N.; Amaral, A. L.; Ferreira, E. C.; Alves, M. M.; Mota, M.; Vivier, H.

    2000-01-01

    An automated procedure for the characterisation by image analysis of the morphology of activated sludge has been used to monitor the biomass in wastewater treatment plants in a systematic manner. Over a period of one year, variations, mainly in the fractal dimension of flocs and in the amount of filamentous bacteria, could be related to rain events affecting the plant influent flow rate and composition.
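    The floc fractal dimension mentioned here is typically estimated by box counting; a minimal sketch on a binary floc mask (the paper's exact morphological pipeline is not reproduced) is:

```python
# Standard box-counting fractal dimension sketch for a binary floc mask;
# box sizes are illustrative and the input is assumed non-empty.
import numpy as np

def box_counting_dimension(binary_img):
    """Estimate the fractal dimension of a binary (floc) mask."""
    sizes = [2, 4, 8, 16, 32, 64]
    counts = []
    h, w = binary_img.shape
    for s in sizes:
        # Count s-by-s boxes containing at least one foreground pixel.
        trimmed = binary_img[:h - h % s, :w - w % s]
        boxes = trimmed.reshape((h - h % s) // s, s, (w - w % s) // s, s)
        counts.append(boxes.any(axis=(1, 3)).sum())
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope  # D: negative slope of log N(s) versus log s
```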

  18. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    Full Text Available To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens, and was used for microsatellite analysis. After the primers were fluorescently labeled, amplification of the DNA was performed with PCR. The PCR products were placed into the automated genetic analyzer (ABI Prism 310, Perkin Elmer, USA) and were subjected to fluorescent scanning with argon-ion laser beams. The fluorescent signal intensity measured by the genetic analyzer indicated the product size in terms of base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides, which alters the original normal locus size) in all the patients. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend in the search for non-invasive methods to detect bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system. With our newly tested system, microsatellite analysis can be done more cheaply, faster and more easily, and with higher scientific accuracy.

  19. Reproducibility of In Vivo Corneal Confocal Microscopy Using an Automated Analysis Program for Detection of Diabetic Sensorimotor Polyneuropathy.

    Directory of Open Access Journals (Sweden)

    Ilia Ostrovski

    Full Text Available In vivo corneal confocal microscopy (IVCCM) is a validated, non-invasive test for diabetic sensorimotor polyneuropathy (DSP) detection, but its utility is limited by the image analysis time and expertise required. We aimed to determine the inter- and intra-observer reproducibility of a novel automated analysis program compared to manual analysis. In a cross-sectional diagnostic study, 20 non-diabetes controls (mean age 41.4±17.3 y, HbA1c 5.5±0.4%) and 26 participants with type 1 diabetes (42.8±16.9 y, 8.0±1.9%) underwent two separate IVCCM examinations by one observer and a third by an independent observer. Along with nerve density and branch density, corneal nerve fibre length (CNFL) was obtained by manual analysis (CNFL-Manual), a protocol in which images were manually selected for automated analysis (CNFL-SemiAutomated), and one in which selection and analysis were performed electronically (CNFL-FullyAutomated). Reproducibility of each protocol was determined using intraclass correlation coefficients (ICC) and, as a secondary objective, the method of Bland and Altman was used to explore agreement between protocols. Mean CNFL-Manual was 16.7±4.0 and 13.9±4.2 mm/mm2 for non-diabetes controls and diabetes participants, while CNFL-SemiAutomated was 10.2±3.3 and 8.6±3.0 mm/mm2 and CNFL-FullyAutomated was 12.5±2.8 and 10.9±2.9 mm/mm2. Inter-observer ICC and 95% confidence intervals (95%CI) were 0.73 (0.56, 0.84), 0.75 (0.59, 0.85), and 0.78 (0.63, 0.87), respectively (p = NS for all comparisons). Intra-observer ICC and 95%CI were 0.72 (0.55, 0.83), 0.74 (0.57, 0.85), and 0.84 (0.73, 0.91), respectively (p<0.05 for CNFL-FullyAutomated compared to others). The other IVCCM parameters had substantially lower ICC compared to those for CNFL. CNFL-SemiAutomated and CNFL-FullyAutomated underestimated CNFL-Manual by a mean and 95%CI of 35.1 (-4.5, 67.5)% and 21.0 (-21.6, 46.1)%, respectively. Despite an apparent measurement (underestimation) bias in comparison to the manual strategy of image
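    The Bland-Altman agreement analysis used as the secondary objective reduces to the mean difference (bias) and the 95% limits of agreement; a minimal sketch with made-up CNFL values (mm/mm2; the study's data are not reproduced) is:

```python
# Minimal Bland-Altman sketch; CNFL values are invented for illustration.
import numpy as np

manual = np.array([16.1, 14.8, 17.3, 12.9, 15.5])    # CNFL, manual analysis
automated = np.array([12.4, 11.0, 13.9, 10.1, 12.2])  # CNFL, automated analysis

diff = manual - automated
bias = diff.mean()                       # mean difference (bias)
loa = 1.96 * diff.std(ddof=1)            # half-width of 95% limits of agreement
print(f"bias = {bias:.2f} mm/mm2, LoA = [{bias - loa:.2f}, {bias + loa:.2f}]")
```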

  20. An Automated Method to Quantify Radiation Damage in Human Blood Cells

    Energy Technology Data Exchange (ETDEWEB)

    Livingston, Gordon K.; Jenkins, Mark S.; Awa, Akio A.

    2006-07-10

    Cytogenetic analysis of blood lymphocytes is a well established method to assess the absorbed dose in persons exposed to ionizing radiation. Because mature lymphocytes circulate throughout the body, the dose to these cells is believed to represent the average whole body exposure. Cytogenetic methods measure the incidence of structural aberrations in chromosomes as a means to quantify DNA damage which occurs when ionizing radiation interacts with human tissue. Methods to quantify DNA damage at the chromosomal level vary in complexity and tend to be laborious and time consuming. In a mass casualty scenario involving radiological/nuclear materials, the ability to rapidly triage individuals according to radiation dose is critically important. For high-throughput screening for dicentric chromosomes, many of the data collection steps can be optimized with motorized microscopes coupled to automated slide scanning platforms.

  1. Analysis of automated highway system risks and uncertainties. Volume 5

    Energy Technology Data Exchange (ETDEWEB)

    Sicherman, A.

    1994-10-01

    This volume describes a risk analysis performed to help identify important Automated Highway System (AHS) deployment uncertainties and quantify their effect on costs and benefits for a range of AHS deployment scenarios. The analysis identified a suite of key factors affecting vehicle and roadway costs, capacities and market penetrations for alternative AHS deployment scenarios. A systematic protocol was utilized for obtaining expert judgments of key factor uncertainties in the form of subjective probability percentile assessments. Based on these assessments, probability distributions on vehicle and roadway costs, capacity and market penetration were developed for the different scenarios. The cost/benefit risk methodology and analysis provide insights by showing how uncertainties in key factors translate into uncertainties in summary cost/benefit indices.

  2. Automated Digital Analysis Of Holographic Interferograms Of Pure Translations

    Science.gov (United States)

    Choudry, A.; Frankena, H. J.; van Beek, J. W.

    1983-10-01

    Holographic interferometry is a versatile technique for non-tactile measurement of changes in a wide variety of physical variables such as temperature, strain, position, etc. It has great potential for becoming an important metrologic technique in industrial applications. For holographic interferometry to become more attractive for industrial practice, the problem of quantitatively analyzing the patterns, and thereby eliciting reliable values of the relevant parameters, has to be addressed. In an attempt to calibrate the technique of holographic interferometry and ascertain the reliability of the subsequent digital analysis, we have chosen precisely known translations as a basis. Holographic interferograms taken from these are analysed manually and by digital techniques specially developed for such patterns. The results are promising enough to indicate the feasibility of automated digital analysis for determining translations within an acceptable accuracy. Some details of the evaluation techniques, along with a brief discussion of the preliminary results, are presented.

  3. Development of design automation codes using software engineering methods

    Energy Technology Data Exchange (ETDEWEB)

    Smith, R.J. II

    1976-10-31

    The Electrical Engineering Department of the Lawrence Livermore Laboratory (LLL) has recently formed a Design Automation (DA) Group responsible for development of new DA capabilities at the Laboratory. This paper briefly discusses the environment in which the software is being produced, and methodologies employed by the development team. The discussion of software engineering approaches should be of interest to small groups producing relatively large complex software systems. (auth)

  4. Engineering methods and tools for cyber–physical automation systems

    OpenAIRE

    Ahmad, Bilal; Vera, Daniel; Harrison, Robert

    2016-01-01

    Much has been published about potential benefits of the adoption of cyber–physical systems (CPSs) in manufacturing industry. However, less has been said about how such automation systems might be effectively configured and supported through their lifecycles and how application modeling, visualization, and reuse of such systems might be best achieved. It is vitally important to be able to incorporate support for engineering best practice while at the same time exploiting the potential that CPS...

  5. Automating Flood Hazard Mapping Methods for Near Real-time Storm Surge Inundation and Vulnerability Assessment

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Gallagher, D.

    2015-12-01

    Storm surge has enough destructive power to damage buildings and infrastructure, erode beaches, and threaten human life across large geographic areas, hence posing the greatest threat of all the hurricane hazards. The United States Gulf of Mexico has proven vulnerable to hurricanes, as it has been hit by some of the most destructive hurricanes on record. With projected rises in sea level and increases in hurricane activity, there is a need to better understand the associated risks for disaster mitigation, preparedness, and response. GIS has become a critical tool in enhancing disaster planning, risk assessment, and emergency response by communicating spatial information through a multi-layer approach. However, there is a need for a near real-time method of identifying areas with a high risk of being impacted by storm surge. Research was conducted alongside Baron, a private industry weather enterprise, to facilitate automated modeling and visualization of storm surge inundation and vulnerability on a near real-time basis. This research successfully automated current flood hazard mapping techniques using a GIS framework written in a Python programming environment and displayed the resulting data through an Application Program Interface (API). Data used for this methodology included high resolution topography, NOAA Probabilistic Surge model outputs parsed from Rich Site Summary (RSS) feeds, and the NOAA census tract-level Social Vulnerability Index (SoVI). The development process required extensive data processing and management to provide high resolution visualizations of potential flooding and population vulnerability in a timely manner. The accuracy of the developed methodology was assessed using Hurricane Isaac as a case study, which, through a USGS and NOAA partnership, provided ample data for statistical analysis. This research successfully created a fully automated, near real-time method for mapping high resolution storm surge inundation and vulnerability for the
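    As a simplified illustration of the kind of GIS step such a Python framework automates, a "bathtub" inundation mask flags cells whose elevation lies below the forecast surge height; real workflows add hydrologic connectivity, datum corrections, and vulnerability weighting. The sketch below is an assumption-level illustration, not the authors' code.

```python
# Simplified "bathtub" inundation sketch; the DEM values and surge height
# are invented for illustration.
import numpy as np

def inundation_mask(dem, surge_height_m):
    """dem: 2-D array of ground elevations (m); returns a boolean flood mask."""
    return dem < surge_height_m

dem = np.array([[0.5, 1.2, 3.0],
                [0.8, 2.5, 4.1],
                [1.9, 3.3, 5.0]])
print(inundation_mask(dem, surge_height_m=2.0))
```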

  6. Single-cell bacteria growth monitoring by automated DEP-facilitated image analysis.

    Science.gov (United States)

    Peitz, Ingmar; van Leeuwen, Rien

    2010-11-07

    Growth monitoring is the method of choice in many assays measuring the presence or properties of pathogens, e.g. in diagnostics and food quality. Established methods, relying on culturing large numbers of bacteria, are rather time-consuming, while in healthcare, time is often crucial. Several new approaches have been published, mostly aiming at assaying growth or other properties of a small number of bacteria. However, no method so far readily achieves single-cell resolution with a convenient and easy-to-handle setup that offers the possibility for automation and high throughput. We demonstrate these benefits in this study by employing dielectrophoretic capturing of bacteria in microfluidic electrode structures, optical detection, and automated bacteria identification and counting with image analysis algorithms. For a proof-of-principle experiment we chose an antibiotic susceptibility test with Escherichia coli and polymyxin B. Growth monitoring is demonstrated on single cells, and the impact of the antibiotic on the growth rate is shown. The minimum inhibitory concentration as a standard diagnostic parameter is derived from a dose-response plot. This report is the basis for further integration of image analysis code into device control. Ultimately, an automated and parallelized setup may be created, using an optical microscanner and many of the electrode structures simultaneously. Sufficient data for a sound statistical evaluation and a confirmation of the initial findings can then be generated in a single experiment.
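    Deriving an inhibitory-concentration parameter from a dose-response plot usually amounts to fitting a Hill-type curve to growth rate versus antibiotic concentration; a sketch with illustrative data (not the study's measurements) is:

```python
# Hill-curve fit to a dose-response relation; data and starting values are
# invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def hill(c, top, ic50, n):
    """Relative growth rate as a function of antibiotic concentration c."""
    return top / (1.0 + (c / ic50) ** n)

conc = np.array([0.0625, 0.125, 0.25, 0.5, 1.0, 2.0])    # ug/mL
growth = np.array([0.95, 0.90, 0.70, 0.35, 0.10, 0.02])  # relative growth rate

popt, _ = curve_fit(hill, conc, growth, p0=[1.0, 0.4, 2.0])
print(f"IC50 ~ {popt[1]:.2f} ug/mL")  # inhibitory-concentration estimate
```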

  7. Semi-automated potentiometric titration method for uranium characterization

    Energy Technology Data Exchange (ETDEWEB)

    Cristiano, B.F.G., E-mail: barbara@ird.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Delgado, J.U.; Silva, J.W.S. da; Barros, P.D. de; Araujo, R.M.S. de [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear (PEN/COPPE), Universidade Federal do Rio de Janeiro (UFRJ), Ilha do Fundao, PO Box 68509, Rio de Janeiro, 21945-970 RJ (Brazil)

    2012-07-15

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed at the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. - Highlights: • We developed a semi-automatic version of the potentiometric titration method. • The method is used for certification and characterization of uranium compounds. • The traceability of the method was assured by a K₂Cr₂O₇ primary standard. • The results of the U₃O₈ reference material analyzed were consistent with the certified value. • The uncertainty obtained, near 0.01%, is useful for characterization purposes.
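    The endpoint detection that a semi-automated potentiometric titration relies on is commonly taken at the maximum of the first derivative dE/dV of the potential-volume curve; the sketch below uses illustrative data, not the IRD system's control software.

```python
# First-derivative endpoint detection on a potential-volume curve; the
# titration data are invented for illustration.
import numpy as np

volume = np.array([9.0, 9.2, 9.4, 9.6, 9.8, 10.0, 10.2, 10.4])  # mL titrant
potential = np.array([305, 312, 324, 351, 470, 560, 578, 586])  # mV

dE_dV = np.gradient(potential.astype(float), volume)  # numerical derivative
v_eq = volume[np.argmax(dE_dV)]                       # steepest point
print(f"equivalence point near {v_eq:.1f} mL")
```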

  8. AutoGate: automating analysis of flow cytometry data.

    Science.gov (United States)

    Meehan, Stephen; Walther, Guenther; Moore, Wayne; Orlova, Darya; Meehan, Connor; Parks, David; Ghosn, Eliver; Philips, Megan; Mitsunaga, Erin; Waters, Jeffrey; Kantor, Aaron; Okamura, Ross; Owumi, Solomon; Yang, Yang; Herzenberg, Leonard A; Herzenberg, Leonore A

    2014-05-01

    Nowadays, one can hardly imagine biology and medicine without flow cytometry to measure CD4 T cell counts in HIV, follow bone marrow transplant patients, characterize leukemias, etc. Similarly, without flow cytometry, there would be a bleak future for stem cell deployment, HIV drug development and full characterization of the cells and cell interactions in the immune system. But while flow instruments have improved markedly, the development of automated tools for processing and analyzing flow data has lagged sorely behind. To address this deficit, we have developed automated flow analysis software technology, provisionally named AutoComp and AutoGate. AutoComp acquires sample and reagent labels from users or flow data files, and uses this information to complete the flow data compensation task. AutoGate replaces the manual subsetting capabilities provided by current analysis packages with newly defined statistical algorithms that automatically and accurately detect, display and delineate subsets in well-labeled and well-recognized formats (histograms, contour and dot plots). Users guide analyses by successively specifying axes (flow parameters) for data subset displays and selecting statistically defined subsets to be used for the next analysis round. Ultimately, this process generates analysis "trees" that can be applied to automatically guide analyses for similar samples. The first AutoComp/AutoGate version is currently in the hands of a small group of users at Stanford, Emory and NIH. When this "early adopter" phase is complete, the authors expect to distribute the software free of charge to .edu, .org and .gov users.

  9. Automated High-Dimensional Flow Cytometric Data Analysis

    Science.gov (United States)

    Pyne, Saumyadipta; Hu, Xinli; Wang, Kui; Rossin, Elizabeth; Lin, Tsung-I.; Maier, Lisa; Baecher-Allan, Clare; McLachlan, Geoffrey; Tamayo, Pablo; Hafler, David; de Jager, Philip; Mesirov, Jill

    Flow cytometry is widely used for single cell interrogation of surface and intracellular protein expression by measuring fluorescence intensity of fluorophore-conjugated reagents. We focus on the recently developed procedure of Pyne et al. (2009, Proceedings of the National Academy of Sciences USA 106, 8519-8524) for automated high- dimensional flow cytometric analysis called FLAME (FLow analysis with Automated Multivariate Estimation). It introduced novel finite mixture models of heavy-tailed and asymmetric distributions to identify and model cell populations in a flow cytometric sample. This approach robustly addresses the complexities of flow data without the need for transformation or projection to lower dimensions. It also addresses the critical task of matching cell populations across samples that enables downstream analysis. It thus facilitates application of flow cytometry to new biological and clinical problems. To facilitate pipelining with standard bioinformatic applications such as high-dimensional visualization, subject classification or outcome prediction, FLAME has been incorporated with the GenePattern package of the Broad Institute. Thereby analysis of flow data can be approached similarly as other genomic platforms. We also consider some new work that proposes a rigorous and robust solution to the registration problem by a multi-level approach that allows us to model and register cell populations simultaneously across a cohort of high-dimensional flow samples. This new approach is called JCM (Joint Clustering and Matching). It enables direct and rigorous comparisons across different time points or phenotypes in a complex biological study as well as for classification of new patient samples in a more clinical setting.
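    As a simplified stand-in for the mixture-model step described here: FLAME fits mixtures of skew-t distributions, but a plain Gaussian mixture on synthetic two-population data already illustrates automated population detection in flow data.

```python
# Gaussian mixture as a simplified stand-in for FLAME's skew-t mixtures;
# the two synthetic populations are invented for illustration.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
pop1 = rng.normal([2.0, 3.0], 0.3, size=(500, 2))  # synthetic population 1
pop2 = rng.normal([5.0, 1.0], 0.4, size=(300, 2))  # synthetic population 2
events = np.vstack([pop1, pop2])                   # two fluorescence channels

gmm = GaussianMixture(n_components=2, random_state=0).fit(events)
labels = gmm.predict(events)                       # automated "gating"
print(np.bincount(labels))                         # events per detected population
```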

  10. Protocol for Data Collection and Analysis Applied to Automated Facial Expression Analysis Technology and Temporal Analysis for Sensory Evaluation.

    Science.gov (United States)

    Crist, Courtney A; Duncan, Susan E; Gallagher, Daniel L

    2016-08-26

    We demonstrate a method for capturing emotional response to beverages and liquefied foods in a sensory evaluation laboratory using automated facial expression analysis (AFEA) software. Additionally, we demonstrate a method for extracting relevant emotional data output and plotting the emotional response of a population over a specified time frame. By time-pairing each participant's treatment response to a control stimulus (baseline), the overall emotional response over time and across multiple participants can be quantified. AFEA is a prospective analytical tool for assessing unbiased response to food and beverages. At present, most research has mainly focused on beverages. Methodologies and analyses have not yet been standardized for the application of AFEA to beverages and foods, and a consistent standard methodology is needed. Optimizing video capture procedures and the resulting video quality aids in a successful collection of emotional response to foods. Furthermore, the methodology of data analysis is novel for extracting the pertinent data relevant to the emotional response. The combination of video capture optimization and data analysis will aid in standardizing the protocol for automated facial expression analysis and the interpretation of emotional response data.

  11. A new automated spectral feature extraction method and its application in spectral classification and defective spectra recovery

    Science.gov (United States)

    Wang, Ke; Guo, Ping; Luo, A.-Li

    2017-03-01

    Spectral feature extraction is a crucial procedure in automated spectral analysis. This procedure starts from the spectral data and produces informative and non-redundant features, facilitating the subsequent automated processing and analysis with machine-learning and data-mining techniques. In this paper, we present a new automated feature extraction method for astronomical spectra, with application in spectral classification and defective spectra recovery. The basic idea of our approach is to train a deep neural network to extract features of spectra with different levels of abstraction in different layers. The deep neural network is trained with a fast layer-wise learning algorithm in an analytical way without any iterative optimization procedure. We evaluate the performance of the proposed scheme on real-world spectral data. The results demonstrate that our method is superior regarding its comprehensive performance, and the computational cost is significantly lower than that for other methods. The proposed method can be regarded as a new valid alternative general-purpose feature extraction method for various tasks in spectral data analysis.
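    The analytic, non-iterative layer-wise training described here can be sketched in the spirit of extreme-learning-machine-style networks: fixed random weights per layer, a nonlinearity, and a closed-form least-squares readout. This is a hedged analogy, not the authors' exact architecture.

```python
# ELM-style sketch of analytic layer-wise feature extraction; dimensions
# and data are stand-ins, not the paper's spectra.
import numpy as np

def random_layer(X, n_hidden, seed):
    """Fixed random projection + nonlinearity; no iterative training."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    return np.tanh(X @ W + b)

def fit_readout(H, Y, reg=1e-3):
    """Closed-form ridge regression: beta = (H'H + reg*I)^-1 H'Y."""
    A = H.T @ H + reg * np.eye(H.shape[1])
    return np.linalg.solve(A, H.T @ Y)

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 200))          # stand-in spectra (samples x bins)
Y = rng.integers(0, 2, size=(100, 1))    # stand-in class labels
H1 = random_layer(X, 64, seed=2)         # first-level features
H2 = random_layer(H1, 32, seed=3)        # higher-level abstraction
beta = fit_readout(H2, Y.astype(float))  # analytic classifier on the features
```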

  12. Adiposoft: automated software for the analysis of white adipose tissue cellularity in histological sections.

    Science.gov (United States)

    Galarraga, Miguel; Campión, Javier; Muñoz-Barrutia, Arrate; Boqué, Noemí; Moreno, Haritz; Martínez, José Alfredo; Milagro, Fermín; Ortiz-de-Solórzano, Carlos

    2012-12-01

    The accurate estimation of the number and size of cells provides relevant information on the kinetics of growth and the physiological status of a given tissue or organ. Here, we present Adiposoft, a fully automated open-source software for the analysis of white adipose tissue cellularity in histological sections. First, we describe the sequence of image analysis routines implemented by the program. Then, we evaluate our software by comparing it with other adipose tissue quantification methods, namely, with the manual analysis of cells in histological sections (used as gold standard) and with the automated analysis of cells in suspension, the most commonly used method. Our results show significant concordance between Adiposoft and the other two methods. We also demonstrate the ability of the proposed method to distinguish the cellular composition of three different rat fat depots. Moreover, we found high correlation and low disagreement between Adiposoft and the manual delineation of cells. We conclude that Adiposoft provides accurate results while considerably reducing the amount of time and effort required for the analysis.

  13. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for both automated mixing, exposure control on a beamline and automated data reduction and prelim......This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for both automated mixing, exposure control on a beamline and automated data reduction...... and preliminary analysis is presented. Three mixing systems that have been the corner stones of the development process are presented including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  14. Automated methods for accurate determination of the critical velocity of packed bed chromatography.

    Science.gov (United States)

    Chang, Yu-Chih; Gerontas, Spyridon; Titchener-Hooker, Nigel J

    2012-01-01

    Knowing the critical velocity (ucrit) of a chromatography column is an important part of process development as it allows the optimization of chromatographic flow conditions. The conventional flow step method for determining ucrit is prone to error as it depends heavily on human judgment. In this study, two automated methods for determining ucrit have been developed: the automatic flow step (AFS) method and the automatic pressure step (APS) method. In the AFS method, the column pressure drop is monitored upon application of automated incremental increases in flow velocity, whereas in the APS method the flow velocity is monitored upon application of automated incremental increases in pressure drop. The APS method emerged as the one with the higher levels of accuracy, efficiency and ease of application having the greater potential to assist defining the best operational parameters of a chromatography column.
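    Both automated methods ultimately analyze the pressure-flow curve, which is linear (Darcy-like) for a stable packed bed; ucrit can be taken where the measured pressure drop deviates from the initial linear fit by more than a tolerance. The sketch below uses illustrative data and thresholds, not the paper's protocol.

```python
# Deviation-from-linearity detection of the critical velocity; data points
# and the 0.05 MPa tolerance are invented for illustration.
import numpy as np

u = np.array([50, 100, 150, 200, 250, 300, 350, 400])            # cm/h
dp = np.array([0.10, 0.20, 0.31, 0.42, 0.56, 0.75, 1.05, 1.60])  # MPa

slope, icpt = np.polyfit(u[:4], dp[:4], 1)  # fit the low-flow linear region
deviation = dp - (slope * u + icpt)         # excess pressure over the fit
u_crit = u[np.argmax(deviation > 0.05)]     # first point past the tolerance
print(f"u_crit ~ {u_crit} cm/h")
```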

  15. Automated alignment method for coherence-controlled holographic microscope

    Science.gov (United States)

    Dostal, Zbynek; Slaby, Tomas; Kvasnica, Lukas; Lostak, Martin; Krizova, Aneta; Chmelik, Radim

    2015-11-01

    A coherence-controlled holographic microscope (CCHM) was developed particularly for quantitative phase imaging and measurement of live cell dynamics, which is the proper subject of digital holographic microscopy (DHM). CCHM in low-coherence mode extends DHM in the study of living cells. However, this advantage is offset by the system's tendency to easily become misaligned, which is a serious hindrance to the desired performance. Therefore, it became clear that the introduction of a self-correcting system was inevitable. Accordingly, we had to devise a theory of suitable control and design an automated alignment system for CCHM. The modulus of the reconstructed holographic signal was identified as a significant variable for guiding the alignment procedures. From this, we derived the original basic realignment three-dimensional algorithm, which encompasses a unique set of procedures for automated alignment that contains processes for initial and advanced alignment as well as long-term maintenance of microscope tuning. All of these procedures were applied to a functioning microscope and the tested processes were successfully validated. Finally, in such a way, CCHM is enabled to substantially contribute to the study of biology, particularly of cancer cells in vitro.

  16. Methods for Automated and Continuous Commissioning of Building Systems

    Energy Technology Data Exchange (ETDEWEB)

    Larry Luskay; Michael Brambley; Srinivas Katipamula

    2003-04-30

    Avoidance of poorly installed HVAC systems is best accomplished at the close of construction by having a building and its systems put "through their paces" with a well conducted commissioning process. This research project focused on developing key components to enable tools that will automatically detect and correct equipment operating problems, thus providing continuous and automatic commissioning of the HVAC systems throughout the life of a facility. A study of pervasive operating problems revealed that the following would most benefit from an automated and continuous commissioning process: (1) faulty economizer operation; (2) malfunctioning sensors; (3) malfunctioning valves and dampers; and (4) access to project design data. Methodologies for detecting system operation faults in these areas were developed and validated in bare-bones form within standard software such as spreadsheets, databases, and statistical or mathematical packages. Demonstrations included flow diagrams and simplified mock-up applications. Techniques to manage data were demonstrated by illustrating how test forms could be populated with original design information and the recommended sequence of operation for equipment systems. The proposed tools would use measured data, design data, and equipment operating parameters to diagnose system problems. Steps for future research are suggested to move toward practical application of automated commissioning and its high potential to improve equipment availability, increase occupant comfort, and extend the life of system equipment.

  17. Towards the Procedure Automation of Full Stochastic Spectral Based Fatigue Analysis

    Directory of Open Access Journals (Sweden)

    Khurram Shehzad

    2013-05-01

    Full Text Available Fatigue is one of the most significant failure modes for marine structures such as ships and offshore platforms. Among the numerous methods for fatigue life estimation, the spectral method is considered the most reliable due to its ability to cater for different sea states as well as their probabilities of occurrence. However, the spectral-based simulation procedure is itself quite complex and numerically intensive owing to various critical technical details. The present research study focuses on the application and automation of the spectral-based fatigue analysis procedure for ship structures using ANSYS software with the 3D linear seakeeping code AQWA. ANSYS Parametric Design Language (APDL) macros are created and subsequently implemented to automate the workflow of the simulation process by reducing the time spent on non-value-added repetitive activity. A MATLAB program based on the direct calculation procedure of spectral fatigue is developed to calculate total fatigue damage. The automation procedure is employed to predict the fatigue life of a ship structural detail using wave scatter data for the North Atlantic and worldwide trade. The current work provides a system for efficient implementation of the stochastic spectral fatigue analysis procedure for ship structures.
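
    The damage summation at the core of such a procedure can be illustrated compactly. The sketch below is written in Python rather than the paper's MATLAB, for consistency with the other examples in this collection; it evaluates the standard closed-form narrow-band approximation per scatter-diagram cell and sums the probability-weighted contributions. The S-N parameters and spectral moments are invented placeholders, and the paper's direct calculation procedure may differ in detail.

```python
import numpy as np
from scipy.special import gamma

def narrowband_damage(m0, m2, T, K, m):
    """Closed-form narrow-band fatigue damage for one sea state.

    m0, m2 : zeroth and second spectral moments of the stress response
    T      : exposure time in the sea state [s]
    K, m   : S-N curve parameters (N = K * S**-m, with S the stress range)
    """
    nu0 = np.sqrt(m2 / m0) / (2.0 * np.pi)        # zero up-crossing rate [Hz]
    return nu0 * T / K * (2.0 * np.sqrt(2.0 * m0))**m * gamma(1.0 + m / 2.0)

def total_damage(cells, years, K, m):
    """Weight each scatter-diagram cell by its probability of occurrence."""
    seconds = years * 365.25 * 24 * 3600
    return sum(p * narrowband_damage(m0, m2, seconds, K, m)
               for p, m0, m2 in cells)

# cells: (probability, m0 [MPa^2], m2 [MPa^2/s^2]) per (Hs, Tz) combination
cells = [(0.6, 25.0, 4.0), (0.3, 100.0, 9.0), (0.1, 400.0, 25.0)]
D = total_damage(cells, years=25.0, K=1.0e12, m=3.0)
print("25-year damage:", D, "-> fatigue life:", 25.0 / D, "years")
```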

  18. RootGraph: a graphic optimization tool for automated image analysis of plant roots.

    Science.gov (United States)

    Cai, Jinhai; Zeng, Zhanghui; Connor, Jason N; Huang, Chun Yuan; Melino, Vanessa; Kumar, Pankaj; Miklavcic, Stanley J

    2015-11-01

    This paper outlines a numerical scheme for accurate, detailed, and high-throughput image analysis of plant roots. In contrast to existing root image analysis tools that focus on root system-average traits, a novel, fully automated and robust approach for the detailed characterization of root traits, based on a graph optimization process is presented. The scheme, firstly, distinguishes primary roots from lateral roots and, secondly, quantifies a broad spectrum of root traits for each identified primary and lateral root. Thirdly, it associates lateral roots and their properties with the specific primary root from which the laterals emerge. The performance of this approach was evaluated through comparisons with other automated and semi-automated software solutions as well as against results based on manual measurements. The comparisons and subsequent application of the algorithm to an array of experimental data demonstrate that this method outperforms existing methods in terms of accuracy, robustness, and the ability to process root images under high-throughput conditions.

  19. Automation of block assignment planning using a diagram-based scenario modeling method

    Science.gov (United States)

    Hwang, In Hyuck; Kim, Youngmin; Lee, Dong Kun; Shin, Jong Gye

    2014-03-01

    Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  20. Automation of block assignment planning using a diagram-based scenario modeling method

    Directory of Open Access Journals (Sweden)

    Hwang In Hyuck

    2014-03-01

    Full Text Available Most shipbuilding scheduling research so far has focused on the load level on the dock plan. This is because the dock is the least extendable resource in shipyards, and its overloading is difficult to resolve. However, once dock scheduling is completed, making a plan that makes the best use of the rest of the resources in the shipyard to minimize any additional cost is also important. Block assignment planning is one of the midterm planning tasks; it assigns a block to the facility (factory/shop or surface plate) that will actually manufacture the block according to the block characteristics and current situation of the facility. It is one of the most heavily loaded midterm planning tasks and is carried out manually by experienced workers. In this study, a method of representing the block assignment rules using a diagram was suggested through analysis of the existing manual process. A block allocation program was developed which automated the block assignment process according to the rules represented by the diagram. The planning scenario was validated through a case study that compared the manual assignment and two automated block assignment results.

  1. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. 1261.413 Section 1261.413 Aeronautics and Space NATIONAL...) § 1261.413 Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults....

  2. Automated DNA extraction of single dog hairs without roots for mitochondrial DNA analysis.

    Science.gov (United States)

    Bekaert, Bram; Larmuseau, Maarten H D; Vanhove, Maarten P M; Opdekamp, Anouschka; Decorte, Ronny

    2012-03-01

    Dogs are intensely integrated into human social life and their shed hairs can play a major role in forensic investigations. The overall aim of this study was to validate a semi-automated extraction method for mitochondrial DNA analysis of telogenic dog hairs. Extracted DNA was amplified with a 95% success rate from 43 samples using two new experimental designs in which the mitochondrial control region was amplified as a single large (± 1260 bp) amplicon or as two individual amplicons (HV1 and HV2; ± 650 and 350 bp) with tailed primers. The results prove that the extraction of dog hair mitochondrial DNA can easily be automated to provide sufficient DNA yield for the amplification of a forensically useful long mitochondrial DNA fragment or, alternatively, two short fragments with minimal loss of sequence in the case of degraded samples.

  3. Automated Large-Scale Shoreline Variability Analysis From Video

    Science.gov (United States)

    Pearre, N. S.

    2006-12-01

    Land-based video has been used to quantify changes in nearshore conditions for over twenty years. By combining the ability to track rapid, short-term shoreline change with changes associated with longer-term or seasonal processes, video has proved to be a cost-effective and versatile tool for coastal science. Previous video-based studies of shoreline change have typically examined the position of the shoreline along a small number of cross-shore lines as a proxy for the continuous coast. The goal of this study is twofold: (1) to further develop automated shoreline extraction algorithms for continuous shorelines, and (2) to track the evolution of a nourishment project at Rehoboth Beach, DE that was concluded in June 2005. Seven cameras are situated approximately 30 meters above mean sea level and 70 meters from the shoreline. Time exposure and variance images are captured hourly during daylight and transferred to a local processing computer. After correcting for lens distortion and geo-rectifying to a shore-normal coordinate system, the images are merged to form a composite planform image of 6 km of coast. Automated extraction algorithms establish shoreline and breaker positions throughout a tidal cycle on a daily basis. Short- and long-term variability in the daily shoreline will be characterized using empirical orthogonal function (EOF) analysis. Periodic sediment volume information will be extracted by incorporating the results of monthly ground-based LIDAR surveys and by correlating the hourly shorelines to the corresponding tide level under conditions with minimal wave activity. The Delaware coast in the area downdrift of the nourishment site is intermittently interrupted by short groins. An even/odd analysis of the shoreline response around these groins will be performed. The impact of groins on the sediment volume transport along the coast during periods of accretive and erosive conditions will be discussed. [This work is being supported by DNREC and the
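
    The EOF analysis mentioned above amounts to a singular value decomposition of the demeaned space-time shoreline matrix. A minimal sketch follows, with synthetic data standing in for the extracted daily shorelines; the mode count, coastline length and forcing terms are assumptions for illustration only.

```python
import numpy as np

def shoreline_eofs(y, n_modes=3):
    """EOF decomposition of a shoreline position matrix.

    y : array of shape (n_days, n_alongshore) of daily shoreline positions
    Returns spatial modes, temporal amplitudes and fraction of variance.
    """
    anomaly = y - y.mean(axis=0)                  # remove time-mean shoreline
    u, s, vt = np.linalg.svd(anomaly, full_matrices=False)
    variance = s**2 / np.sum(s**2)
    modes = vt[:n_modes]                          # alongshore patterns
    amplitudes = u[:, :n_modes] * s[:n_modes]     # daily mode amplitudes
    return modes, amplitudes, variance[:n_modes]

# Synthetic example: decaying nourishment bump plus seasonal rotation
days = np.arange(365)[:, None]
x = np.linspace(0, 6000, 240)[None, :]            # 6 km of coast
y = (10 * np.exp(-days / 200.0)                   # eroding nourishment signal
     + 5 * np.sin(2 * np.pi * days / 365) * (x / 6000 - 0.5))
modes, amps, var = shoreline_eofs(y + np.random.randn(365, 240))
print("variance explained by first 3 modes:", np.round(var, 3))
```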

  4. galaxieEST: addressing EST identity through automated phylogenetic analysis

    Directory of Open Access Journals (Sweden)

    Larsson Karl-Henrik

    2004-07-01

    Full Text Available Abstract Background Research involving expressed sequence tags (ESTs is intricately coupled to the existence of large, well-annotated sequence repositories. Comparatively complete and satisfactory annotated public sequence libraries are, however, available only for a limited range of organisms, rendering the absence of sequences and gene structure information a tangible problem for those working with taxa lacking an EST or genome sequencing project. Paralogous genes belonging to the same gene family but distinguished by derived characteristics are particularly prone to misidentification and erroneous annotation; high but incomplete levels of sequence similarity are typically difficult to interpret and have formed the basis of many unsubstantiated assumptions of orthology. In these cases, a phylogenetic study of the query sequence together with the most similar sequences in the database may be of great value to the identification process. In order to facilitate this laborious procedure, a project to employ automated phylogenetic analysis in the identification of ESTs was initiated. Results galaxieEST is an open source Perl-CGI script package designed to complement traditional similarity-based identification of EST sequences through employment of automated phylogenetic analysis. It uses a series of BLAST runs as a sieve to retrieve nucleotide and protein sequences for inclusion in neighbour joining and parsimony analyses; the output includes the BLAST output, the results of the phylogenetic analyses, and the corresponding multiple alignments. galaxieEST is available as an on-line web service for identification of fungal ESTs and for download / local installation for use with any organism group at http://galaxie.cgb.ki.se/galaxieEST.html. Conclusions By addressing sequence relatedness in addition to similarity, galaxieEST provides an integrative view on EST origin and identity, which may prove particularly useful in cases where similarity searches

  5. A fully automated method for quantifying and localizing white matter hyperintensities on MR images.

    Science.gov (United States)

    Wu, Minjie; Rosano, Caterina; Butters, Meryl; Whyte, Ellen; Nable, Megan; Crooks, Ryan; Meltzer, Carolyn C; Reynolds, Charles F; Aizenstein, Howard J

    2006-12-01

    White matter hyperintensities (WMH), commonly found on T2-weighted FLAIR brain MR images in the elderly, are associated with a number of neuropsychiatric disorders, including vascular dementia, Alzheimer's disease, and late-life depression. Previous MRI studies of WMHs have primarily relied on subjective and global (i.e., full-brain) ratings of WMH grade. In the current study we implement and validate an automated method for quantifying and localizing WMHs. We adapt a fuzzy-connectedness algorithm to automate the segmentation of WMHs and use a demons-based image registration to automate the anatomic localization of the WMHs using the Johns Hopkins University White Matter Atlas. The method is validated using brain MR images acquired from eleven elderly subjects with late-onset late-life depression (LLD) and eight elderly controls. This dataset was chosen because LLD subjects are known to have significant WMH burden. The volumes of WMH identified by our automated method are compared with the accepted gold standard (manual ratings), and a significant correlation between the two is found. Consistent with a previously published study of WMHs in late-life depression [Progress in Neuro-Psychopharmacology and Biological Psychiatry 27(3), 539-544], we found a significantly greater WMH burden in the LLD subjects versus the controls for both the manual and automated methods. The effect size was greater for the automated method, suggesting that it is a more specific measure. Additionally, we describe the anatomic localization of the WMHs in LLD subjects as well as in the control subjects, and detect the regions of interest (ROIs) specific to the WMH burden of LLD patients. Given the emergence of large neuroimaging databases, techniques such as that described here will allow for a better understanding of the relationship between WMHs and neuropsychiatric disorders.

  6. Semi-automated Method for Failed Eruptions Search in SDO Data Base: Methodology and First Results

    Science.gov (United States)

    Mrozek, T.; Gronkiewicz, D.; Kołomański, S.; Chmielewska, E.; Chruślińska, M.

    It is well known that not all solar flares are connected with eruptions followed by coronal mass ejections (CMEs). Even the strongest X-class flares may not be accompanied by eruptions, or may be accompanied by failed eruptions. Several mechanisms responsible for confining eruptions have been proposed. Present SDO/AIA observations give a chance for a deep statistical analysis of the properties of an active region that may confine an eruption. Therefore, we developed an automated method which can recognize moving structures and confined eruptions in AIA images. We present the algorithm and its performance for the period 1 April 2012 - 1 July 2012. The algorithm found more than 600 dynamic events, more than 30% of which are failed eruptions. The developed algorithm is very effective and opens the way to a substantial enlargement of the failed-eruption database.

  7. Conventional Versus Automated Implantation of Loose Seeds in Prostate Brachytherapy: Analysis of Dosimetric and Clinical Results

    Energy Technology Data Exchange (ETDEWEB)

    Genebes, Caroline, E-mail: genebes.caroline@claudiusregaud.fr [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Filleron, Thomas; Graff, Pierre [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France); Jonca, Frédéric [Department of Urology, Clinique Ambroise Paré, Toulouse (France); Huyghe, Eric; Thoulouzan, Matthieu; Soulie, Michel; Malavaud, Bernard [Department of Urology and Andrology, CHU Rangueil, Toulouse (France); Aziza, Richard; Brun, Thomas; Delannes, Martine; Bachaud, Jean-Marc [Radiation Oncology Department, Institut Claudius Regaud, Toulouse (France)

    2013-11-15

    Purpose: To review the clinical outcome of I-125 permanent prostate brachytherapy (PPB) for low-risk and intermediate-risk prostate cancer and to compare 2 techniques of loose-seed implantation. Methods and Materials: 574 consecutive patients underwent I-125 PPB for low-risk and intermediate-risk prostate cancer between 2000 and 2008. Two successive techniques were used: conventional implantation from 2000 to 2004 and automated implantation (Nucletron, FIRST system) from 2004 to 2008. Dosimetric and biochemical recurrence-free (bNED) survival results were reported and compared for the 2 techniques. Univariate and multivariate analyses were used to identify independent predictors of bNED survival. Results: 419 (73%) and 155 (27%) patients with low-risk and intermediate-risk disease, respectively, were treated (median follow-up time, 69.3 months). The 60-month bNED survival rates were 95.2% and 85.7%, respectively, for patients with low-risk and intermediate-risk disease (P=.04). In univariate analysis, patients treated with automated implantation had worse bNED survival rates than did those treated with conventional implantation (P<.0001). By day 30, patients treated with automated implantation showed lower values of dose delivered to 90% of prostate volume (D90) and volume of prostate receiving 100% of prescribed dose (V100). In multivariate analysis, implantation technique, Gleason score, and V100 on day 30 were independent predictors of recurrence-free status. Grade 3 urethritis and urinary incontinence were observed in 2.6% and 1.6% of the cohort, respectively, with no significant differences between the 2 techniques. No grade 3 proctitis was observed. Conclusion: Satisfactory 60-month bNED survival rates (93.1%) and acceptable toxicity (grade 3 urethritis <3%) were achieved by loose-seed implantation. Automated implantation was associated with worse dosimetric and bNED survival outcomes.

  8. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier.

  9. PCA method for automated detection of mispronounced words

    Science.gov (United States)

    Ge, Zhenhao; Sharma, Sudhendu R.; Smith, Mark J. T.

    2011-06-01

    This paper presents a method for detecting mispronunciations with the aim of improving Computer Assisted Language Learning (CALL) tools used by foreign language learners. The algorithm is based on Principal Component Analysis (PCA). It is hierarchical, with each successive step refining the estimate to classify the test word as either mispronounced or correct. Preprocessing before detection, such as normalization and time-scale modification, is implemented to guarantee uniformity of the feature vectors input to the detection system. The performance using various features, including spectrograms and Mel-Frequency Cepstral Coefficients (MFCCs), is compared and evaluated. Best results were obtained using MFCCs, achieving up to 99% accuracy in word verification and 93% in native/non-native classification. Compared with Hidden Markov Models (HMMs), which are used pervasively in recognition applications, this approach is computationally efficient and effective when training data is limited.
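
    One common way to apply PCA to this kind of verification, sketched below under assumed details rather than as a reproduction of the paper's hierarchy, is to build a subspace from feature vectors of correctly pronounced words and score a test word by its reconstruction error. The feature dimensions and the decision threshold here are placeholders to be calibrated on held-out data.

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit PCA on rows of X (feature vectors of correctly pronounced words)."""
    mean = X.mean(axis=0)
    _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, vt[:n_components]                # mean and principal axes

def reconstruction_error(x, mean, components):
    """Distance between a test vector and its projection onto the subspace
    of correct pronunciations; a large error suggests mispronunciation."""
    centered = x - mean
    projected = components.T @ (components @ centered)
    return np.linalg.norm(centered - projected)

# Hypothetical use: rows are fixed-length MFCC feature vectors per word,
# time-normalized beforehand so all vectors share the same dimension.
rng = np.random.default_rng(0)
train = rng.normal(size=(200, 39 * 40))           # 200 correct utterances
mean, comps = fit_pca(train, n_components=20)
test = rng.normal(size=39 * 40)
THRESHOLD = 60.0                                  # placeholder; calibrate it
score = reconstruction_error(test, mean, comps)
print("mispronounced" if score > THRESHOLD else "correct", round(score, 1))
```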

  10. Comparison of automated method and improved AOAC Kjeldahl method for determination of protein in meat and meat products.

    Science.gov (United States)

    McGill, D L

    1981-01-01

    The Kjel-Foss automated method for protein determination in meat and meat products was compared with the improved AOAC Kjeldahl method. Meat samples were separated into 3 categories based on fat content and analyzed in duplicate by both methods. No significant difference was found in a paired comparison of the 2 methods in each of the 3 meat categories, using Student's t-test at the 99% confidence level. A number of additional meat samples analyzed 6-9 times by the automated method showed an overall average range of 0.55% protein and an average standard deviation of 0.20. The Kjel-Foss automated method was applicable to total protein determination in a wide variety of meat and meat products.

  11. A time-series method for automated measurement of changes in mitotic and interphase duration from time-lapse movies.

    Directory of Open Access Journals (Sweden)

    Frederic D Sigoillot

    Full Text Available BACKGROUND: Automated time-lapse microscopy can visualize proliferation of large numbers of individual cells, enabling accurate measurement of the frequency of cell division and the duration of interphase and mitosis. However, extraction of quantitative information by manual inspection of time-lapse movies is too time-consuming to be useful for analysis of large experiments. METHODOLOGY/PRINCIPAL FINDINGS: Here we present an automated time-series approach that can measure changes in the duration of mitosis and interphase in individual cells expressing fluorescent histone 2B. The approach requires analysis of only 2 features, nuclear area and average intensity. Compared to supervised learning approaches, this method reduces processing time and does not require generation of training data sets. We demonstrate that this method is as sensitive as manual analysis in identifying small changes in interphase or mitotic duration induced by drug or siRNA treatment. CONCLUSIONS/SIGNIFICANCE: This approach should facilitate automated analysis of high-throughput time-lapse data sets to identify small molecules or gene products that influence timing of cell division.
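
    A minimal sketch of the two-feature idea: chromosome condensation at mitosis shrinks the segmented nuclear area while concentrating the H2B signal, so a joint threshold on both features yields a per-frame mitotic flag whose run lengths are the mitotic durations. The fractional thresholds below are assumptions, not the paper's calibrated values.

```python
import numpy as np

def mitotic_mask(area, intensity, area_frac=0.6, int_frac=1.3):
    """Flag mitotic frames: condensation shrinks the nuclear area while
    raising the mean H2B fluorescence intensity (thresholds are assumed)."""
    baseline_a = np.median(area)
    baseline_i = np.median(intensity)
    return (area < area_frac * baseline_a) & (intensity > int_frac * baseline_i)

def durations(mask, dt_minutes):
    """Lengths of consecutive runs of True (mitosis) in a boolean trace."""
    padded = np.concatenate(([False], mask, [False]))
    edges = np.flatnonzero(np.diff(padded.astype(int)))
    starts, ends = edges[::2], edges[1::2]
    return (ends - starts) * dt_minutes

# Synthetic single-cell trace sampled every 6 minutes
area = np.full(240, 100.0)
inten = np.full(240, 50.0)
area[100:108] = 40.0
inten[100:108] = 80.0                              # one ~48-minute mitosis
print("mitotic durations [min]:", durations(mitotic_mask(area, inten), 6))
```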

  12. Development of automated high throughput single molecular microfluidic detection platform for signal transduction analysis

    Science.gov (United States)

    Huang, Po-Jung; Baghbani Kordmahale, Sina; Chou, Chao-Kai; Yamaguchi, Hirohito; Hung, Mien-Chie; Kameoka, Jun

    2016-03-01

    Signal transduction events, including multiple protein post-translational modifications (PTM), protein-protein interactions (PPI), and protein-nucleic acid interactions (PNI), play critical roles in cell proliferation and differentiation that are directly related to cancer biology. Traditional methods, like mass spectrometry, immunoprecipitation, fluorescence resonance energy transfer, and fluorescence correlation spectroscopy, require a large amount of sample and long processing times. The "microchannel for multiple-parameter analysis of proteins in single-complex" (mMAPS) approach we proposed can reduce the process time and sample volume because the system is composed of microfluidic channels, fluorescence microscopy, and computerized data analysis. In this paper, we present an automated mMAPS including an integrated microfluidic device, automated stage, and electrical relay for high-throughput clinical screening. Based on this result, we estimate that this automated detection system will be able to screen approximately 150 patient samples in a 24-hour period, providing a practical means to analyze tissue samples in a clinical setting.

  13. Scanner-based image quality measurement system for automated analysis of EP output

    Science.gov (United States)

    Kipman, Yair; Mehta, Prashant; Johnson, Kate

    2003-12-01

    Inspection of electrophotographic print cartridge quality and compatibility requires analysis of hundreds of pages on a wide population of printers and copiers. Although print quality inspection is often achieved through the use of anchor prints and densitometry, more comprehensive analysis and quantitative data are desired for performance tracking, benchmarking and failure mode analysis. Image quality measurement systems range in price and performance, image capture paths and levels of automation. In order to address the requirements of a specific application, careful consideration was given to print volume, budgetary limits, and the scope of the desired image quality measurements. A flatbed scanner-based image quality measurement system was selected to support high throughput, maximal automation, and sufficient flexibility for both measurement methods and image sampling rates. Using an automatic document feeder (ADF) for sample management, half a ream of prints can be measured automatically without operator intervention. The system includes optical character recognition (OCR) for automatic determination of target type for measurement suite selection. This capability also enables measurement of mixed stacks of targets, since each sample is identified prior to measurement. In addition, OCR is used to read toner ID, machine ID, print count, and other pertinent information regarding the printing conditions and environment. These data are saved to a data file along with the measurement results for complete test documentation. Measurement methods were developed to replace current methods of visual inspection and densitometry. The features that were being analyzed visually could be addressed via standard measurement algorithms. Measurement of density proved less simple, since the scanner is not a densitometer and anything short of an excellent estimation would be meaningless. In order to address the measurement of density, a transfer curve was built to translate the

  14. Automated analysis of NF-κB nuclear translocation kinetics in high-throughput screening.

    Directory of Open Access Journals (Sweden)

    Zi Di

    Full Text Available Nuclear entry and exit of the NF-κB family of dimeric transcription factors plays an essential role in regulating cellular responses to inflammatory stress. The dynamics of this nuclear translocation can vary significantly within a cell population and may change dramatically, e.g. upon drug exposure. Furthermore, there is significant heterogeneity in individual cell responses upon stress signaling. In order to systematically determine factors that define NF-κB translocation dynamics, high-throughput screens that enable the analysis of dynamic NF-κB responses in individual cells in real time are essential. Thus far, only NF-κB downstream signaling responses of whole cell populations at the transcriptional level have been available in high-throughput mode. In this study, we developed a fully automated image analysis method to determine the time course of NF-κB translocation in individual cells, suitable for high-throughput screening in the context of compound screening and functional genomics. Two novel segmentation methods were used for defining the individual nuclear and cytoplasmic regions: watershed masked clustering (WMC) and best-fit ellipse of Voronoi cell (BEVC). The dynamic NF-κB oscillatory response at the single-cell and population level was coupled to automated extraction of 26 analogue translocation parameters, including the number of peaks, the time to reach each peak, and the amplitude of each peak. Our automated image analysis method was validated through a series of statistical tests demonstrating computationally efficient and accurate quantification of NF-κB translocation dynamics. Both pharmacological inhibition of NF-κB and short interfering RNAs targeting the inhibitor of NF-κB, IκBα, demonstrated the ability of our method to identify compounds and genetic players that interfere with the nuclear translocation of NF-κB.
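
    The parameter extraction step can be illustrated with a short sketch: given per-frame nuclear and cytoplasmic intensities (for instance from the WMC/BEVC masks), the nuclear/cytoplasmic ratio trace is searched for peaks, and the peak count, timing and amplitudes are reported. This is a simplified stand-in for the paper's 26-parameter extraction; the `height` and `distance` settings are placeholders.

```python
import numpy as np
from scipy.signal import find_peaks

def translocation_parameters(nuc, cyt, dt_min):
    """Extract simple NF-kB translocation parameters from one cell's trace.

    nuc, cyt : mean reporter intensity in the nuclear and cytoplasmic
               segments at each time point
    Returns the number of peaks, times to each peak [min], and amplitudes.
    """
    ratio = np.asarray(nuc, float) / np.asarray(cyt, float)
    peaks, props = find_peaks(ratio, height=1.2, distance=5)
    return len(peaks), peaks * dt_min, props["peak_heights"]

# Synthetic damped oscillation, as seen after an inflammatory stimulus
t = np.arange(0, 600, 6.0)                        # one frame every 6 minutes
nuc = 1.0 + 1.5 * np.exp(-t / 300) * np.maximum(np.sin(2 * np.pi * t / 100), 0)
cyt = np.ones_like(t)                             # idealized cytoplasmic signal
n, times, amps = translocation_parameters(nuc, cyt, 6.0)
print(n, "peaks at", times, "min, amplitudes", np.round(amps, 2))
```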

  15. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.

  16. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-08-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies present more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.

  17. Intelligent Control in Automation Based on Wireless Traffic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kurt Derr; Milos Manic

    2007-09-01

    Wireless technology is a central component of many factory automation infrastructures in both the commercial and government sectors, providing connectivity among various components in industrial realms (distributed sensors, machines, mobile process controllers). However, wireless technologies present more threats to computer security than wired environments. The advantageous features of Bluetooth technology resulted in Bluetooth unit shipments climbing to five million per week at the end of 2005 [1, 2]. This is why the real-time interpretation and understanding of Bluetooth traffic behavior is critical both in maintaining the integrity of computer systems and in increasing the efficient use of this technology in control-type applications. Although neuro-fuzzy approaches have been applied to wireless 802.11 behavior analysis in the past, the significantly different Bluetooth protocol framework has not been extensively explored using this technology. This paper presents a new neuro-fuzzy traffic analysis algorithm for this still new territory of Bluetooth traffic. Further enhancements of this algorithm are presented along with a comparison against the traditional, numerical approach. Through test examples, interesting Bluetooth traffic behavior characteristics were captured, and the comparative elegance of this computationally inexpensive approach was demonstrated. This analysis can be used to provide directions for future development and use of this prevailing technology in various control-type applications, as well as making its use more secure.

  18. Development of a software for INAA analysis automation

    Energy Technology Data Exchange (ETDEWEB)

    Zahn, Guilherme S.; Genezini, Frederico A.; Figueiredo, Ana Maria G.; Ticianelli, Regina B., E-mail: gzahn@ipen [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this work, software to automate the post-counting tasks in comparative INAA has been developed that aims to be more flexible than the available options, integrating itself with some of the routines currently in use in the IPEN Activation Analysis Laboratory and allowing the user to choose between a fully automatic analysis or an Excel-oriented one. The software makes use of the Genie 2000 data importing and analysis routines and stores each 'energy-counts-uncertainty' table as a separate ASCII file that can be used later if required by the analyst. Moreover, it generates an Excel-compatible CSV (comma-separated values) file with only the relevant results from the analyses for each sample or comparator, as well as the results of the concentration calculations and the results obtained with four different statistical tools (unweighted average, weighted average, normalized residuals and Rajeval technique), allowing the analyst to double-check the results. Finally, a 'summary' CSV file is also produced, with the final concentration results obtained for each element in each sample. (author)
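
    Two of the four statistical tools named above can be sketched in a few lines. The following Python fragment illustrates the commonly used definitions (it is not the IPEN code itself, and the rejection limit is an assumption): an inverse-variance weighted average, and a normalized-residuals check that flags results discrepant with the weighted mean of the others.

```python
import numpy as np

def weighted_average(values, sigmas):
    """Inverse-variance weighted mean and its internal uncertainty."""
    w = 1.0 / np.asarray(sigmas, float)**2
    mean = np.sum(w * np.asarray(values, float)) / np.sum(w)
    return mean, np.sqrt(1.0 / np.sum(w))

def normalized_residuals(values, sigmas, limit=1.96):
    """Normalized residual of each result against the weighted mean of the
    remaining ones; |r| above the limit flags a discrepant measurement."""
    values = np.asarray(values, float)
    sigmas = np.asarray(sigmas, float)
    flags = []
    for i in range(len(values)):
        keep = np.arange(len(values)) != i
        mu, sigma_mu = weighted_average(values[keep], sigmas[keep])
        r = (values[i] - mu) / np.hypot(sigmas[i], sigma_mu)
        flags.append(abs(r) > limit)
    return np.array(flags)

# Concentrations [mg/kg] from replicate irradiations of one sample
c = np.array([10.2, 10.5, 9.9, 13.0])
s = np.array([0.3, 0.4, 0.3, 0.4])
print("weighted mean:", weighted_average(c, s))
print("discrepant:", normalized_residuals(c, s))   # flags the 13.0 result
```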

  19. Automated Integrated Analog Filter Design Issues

    OpenAIRE

    2015-01-01

    An analysis of modern automated integrated analog circuit design methods and their use in integrated filter design is presented. Current automated analog circuit design tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are suited only to gmC and switched-current filter topologies. Here, an algorithm for active RC integrated filter design is proposed that can be used in automated filter design. The algorithm is t...

  20. An automated method for 'clumped-isotope' measurements on small carbonate samples.

    Science.gov (United States)

    Schmid, Thomas W; Bernasconi, Stefano M

    2010-07-30

    Clumped-isotope geochemistry deals with the state of ordering of rare isotopes in molecules, in particular with their tendency to form bonds with other rare isotopes rather than with the most abundant ones. Among its possible applications, carbonate clumped-isotope thermometry is the one that has gained most attention because of the wide potential of applications in many disciplines of earth sciences. Clumped-isotope thermometry allows reconstructing the temperature of formation of carbonate minerals without knowing the isotopic composition of the water from which they were formed. This feature enables new approaches in paleothermometry. The currently published method is, however, limited by sample weight requirements of 10-15 mg and because measurements are performed manually. In this paper we present a new method using an automated sample preparation device coupled to an isotope ratio mass spectrometer. The method is based on the repeated analysis (n = 6-8) of 200 microg aliquots of sample material and completely automated measurements. In addition, we propose to use precisely calibrated carbonates spanning a wide range in Delta(47) instead of heated gases to correct for isotope effects caused by the source of the mass spectrometer, following the principle of equal treatment of the samples and standards. We present data for international standards (NBS 19 and LSVEC) and different carbonates formed at temperatures exceeding 600 degrees C to show that precisions in the range of 10 to 15 ppm (1 SE) can be reached for repeated analyses of a single sample. Finally, we discuss and validate the correction procedure based on high-temperature carbonates instead of heated gases.
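
    The carbonate-standard correction described above amounts to a linear transfer function from raw to accepted Delta47 values. A minimal sketch with invented numbers follows; the standards, their accepted values and the replicate scheme are placeholders, not the published calibration.

```python
import numpy as np

# Hypothetical calibration: raw Delta47 measured on carbonate standards
# spanning a wide Delta47 range (the "equal treatment" replacement for
# heated-gas correction lines) versus their accepted values.
accepted = np.array([0.267, 0.453, 0.705])        # per mil, assumed values
measured = np.array([[0.231, 0.236, 0.229],       # replicate raw Delta47
                     [0.402, 0.409, 0.398],
                     [0.648, 0.655, 0.651]])

slope, intercept = np.polyfit(measured.mean(axis=1), accepted, 1)

def correct_delta47(raw):
    """Map a raw sample Delta47 onto the accepted reference frame."""
    return slope * raw + intercept

# Six repeated 200-microgram aliquots of one sample, as in the method
sample_reps = np.array([0.512, 0.505, 0.509, 0.515, 0.508, 0.511])
corr = correct_delta47(sample_reps)
print("Delta47 = %.4f +/- %.4f (1 SE, n=%d)"
      % (corr.mean(), corr.std(ddof=1) / np.sqrt(len(corr)), len(corr)))
```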

  1. Twelve automated thresholding methods for segmentation of PET images: a phantom study.

    Science.gov (United States)

    Prieto, Elena; Lecumberri, Pablo; Pagola, Miguel; Gómez, Marisol; Bilbao, Izaskun; Ecay, Margarita; Peñuelas, Iván; Martí-Climent, Josep M

    2012-06-21

    Tumor volume delineation on positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms that are classical in the fields of optical character recognition, tissue engineering, and non-destructive testing of high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information about the segmented object or any special calibration of the tomograph, as opposed to the usual thresholding methods for PET. Spherical (18)F-filled objects of different volumes were acquired on a clinical PET/CT and on a small animal PET scanner, with three different signal-to-background ratios. Images were segmented with the 12 automatic thresholding algorithms and the results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools.
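
    As an example of the class of algorithms tested, the Ridler (ISODATA) threshold can be written in a few lines. The sketch below uses synthetic sphere-plus-background data in place of real PET images; the intensities and class sizes are assumptions.

```python
import numpy as np

def ridler_threshold(image, tol=1e-4):
    """Ridler-Calvard (ISODATA) automatic threshold.

    Iterates t = (mean below t + mean above t) / 2 until convergence;
    voxels above the returned value form the segmented uptake volume.
    """
    x = np.asarray(image, float).ravel()
    t = x.mean()                                  # initial guess
    while True:
        t_new = 0.5 * (x[x <= t].mean() + x[x > t].mean())
        if abs(t_new - t) < tol:
            return t_new
        t = t_new

# Synthetic PET-like data: warm background plus one hot sphere
rng = np.random.default_rng(1)
background = rng.normal(1.0, 0.2, size=8000)
sphere = rng.normal(4.0, 0.4, size=500)
t = ridler_threshold(np.concatenate([background, sphere]))
print("ISODATA threshold:", round(t, 2))          # lands between the classes
```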

  2. Automated calibration methods for robotic multisensor landmine detection

    Science.gov (United States)

    Keranen, Joe G.; Miller, Jonathan; Schultz, Gregory; Topolosky, Zeke

    2007-04-01

    Both force protection and humanitarian demining missions require efficient and reliable detection and discrimination of buried anti-tank and anti-personnel landmines. Widely varying surface and subsurface conditions, mine types and placement, as well as environmental regimes challenge the robustness of the automatic target recognition process. In this paper we present applications created for the U.S. Army Nemesis detection platform. Nemesis is an unmanned rubber-tracked vehicle-based system designed to eradicate a wide variety of anti-tank and anti-personnel landmines for humanitarian demining missions. The detection system integrates advanced ground penetrating synthetic aperture radar (GPSAR) and electromagnetic induction (EMI) arrays, highly accurate global and local positioning, and on-board target detection/classification software on the front loader of a semi-autonomous UGV. An automated procedure is developed to estimate the soil's dielectric constant using surface reflections from the ground penetrating radar. The results have implications not only for calibration of system data acquisition parameters, but also for user awareness and tuning of automatic target recognition detection and discrimination algorithms.
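
    A common way to estimate the soil's relative permittivity from GPR data, and plausibly the kind of computation meant above (this sketch is a standard textbook approach, not necessarily the Nemesis implementation), is to normalize the ground-surface reflection amplitude by a metal-plate measurement and invert the normal-incidence reflection coefficient.

```python
def soil_permittivity(a_ground, a_plate):
    """Relative dielectric constant of soil from the GPR surface reflection.

    a_ground : peak amplitude of the air-ground reflection
    a_plate  : peak amplitude over a metal plate (a perfect reflector),
               used to normalize antenna gain and geometry
    The reflection coefficient of an air-soil interface at normal
    incidence is R = (1 - sqrt(er)) / (1 + sqrt(er)); invert for er.
    """
    r = abs(a_ground / a_plate)                   # |R|, must be < 1
    return ((1.0 + r) / (1.0 - r))**2

print(soil_permittivity(0.33, 1.0))               # ~3.9, e.g. dry sandy soil
```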

  3. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.

  4. OpenComet: an automated tool for comet assay image analysis.

    Science.gov (United States)

    Gyori, Benjamin M; Venkatachalam, Gireedhar; Thiagarajan, P S; Hsu, David; Clement, Marie-Veronique

    2014-01-01

    Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.

  5. An automated method to quantify microglia morphology and application to monitor activation state longitudinally in vivo.

    Directory of Open Access Journals (Sweden)

    Cleopatra Kozlowski

    Full Text Available Microglia are specialized immune cells of the brain. Upon insult, microglia initiate a cascade of cellular responses including a characteristic change in cell morphology. To study the dynamics of microglia immune response in situ, we developed an automated image analysis method that enables the quantitative assessment of microglia activation state within tissue based solely on cell morphology. Per cell morphometric analysis of fluorescently labeled microglia is achieved through local iterative threshold segmentation, which reduces errors caused by signal-to-noise variation across large volumes. We demonstrate, utilizing systemic application of lipopolysaccharide as a model of immune challenge, that several morphological parameters, including cell perimeter length, cell roundness and soma size, quantitatively distinguish resting versus activated populations of microglia within tissue comparable to traditional immunohistochemistry methods. Furthermore, we provide proof-of-concept data that monitoring soma size enables the longitudinal assessment of microglia activation in the mouse neocortex imaged via 2-photon in vivo microscopy. The ability to quantify microglia activation automatically by shape alone allows unbiased and rapid analysis of both fixed and in vivo central nervous system tissue.
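
    The morphological parameters named above are straightforward to compute once cells are segmented. A minimal sketch using scikit-image region properties follows; the pixel scale and the toy mask are assumptions, and the per-cell segmentation itself (the paper's local iterative thresholding) is taken as given.

```python
import numpy as np
from skimage.measure import label, regionprops

def microglia_morphometrics(binary_mask, pixel_um=0.5):
    """Per-cell shape parameters from a segmented (binary) microglia image.

    Returns perimeter [um], roundness (4*pi*A/P**2, 1.0 for a circle) and
    area [um^2] for every connected component. Activated (amoeboid) cells
    show shorter perimeters and higher roundness than ramified, resting ones.
    """
    results = []
    for region in regionprops(label(binary_mask)):
        area = region.area * pixel_um**2
        perimeter = region.perimeter * pixel_um
        roundness = 4.0 * np.pi * area / perimeter**2 if perimeter > 0 else 0.0
        results.append((perimeter, roundness, area))
    return results

# Toy mask: one compact ("activated") and one elongated ("ramified") object
mask = np.zeros((60, 60), dtype=bool)
mask[5:15, 5:15] = True                           # compact blob
mask[30:33, 5:55] = True                          # long thin process
for p, r, a in microglia_morphometrics(mask):
    print("perimeter %.1f um, roundness %.2f, area %.1f um^2" % (p, r, a))
```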

  6. AUTOMATION METHODS FOR FORMING AND RECTIFYING STIFFENED PARTS WITH ROLLING MACHINES

    Directory of Open Access Journals (Sweden)

    A.Ye. Pashkov

    2015-12-01

    Full Text Available To improve the capabilities of forming and rectifying stiffened parts, rolling as one of the implemented methods of local plastic deformation has been examined. The tools for edge and sheet rolling have been described. The methods of process automation have been developed.

  7. Automated Analysis of Vital Signs Identified Patients with Substantial Bleeding Prior to Hospital Arrival

    Science.gov (United States)

    2015-10-01

    culminating with the first and only deployment of an automated emergency care decision system on board active air ambulances: the APPRAISE system, a hardware/software platform for automated, real-time analysis of vital-sign data. After developing the APPRAISE system using data from trauma patients

  8. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 1 2010-07-01 2010-07-01 false Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. 13.19 Section 13.19 Protection of Environment...; automation; prevention of overpayments, delinquencies or defaults. (a) The Administrator may...

  9. RFI detection by automated feature extraction and statistical analysis

    Science.gov (United States)

    Winkel, B.; Kerp, J.; Stanko, S.

    2007-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high-speed storage of spectra on time scales of less than a second. The high dynamic range of the device ensures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength, RFI signals can be identified down to the 4σ_rms level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) the mean signal strength is comparable to the astronomical line emission of the Milky Way, (2) interferences are polarised, and (3) electronic devices in the neighbourhood of the telescope contribute significantly to the RFI radiation. We also show that the radiometer equation is no longer fulfilled in the presence of RFI signals.
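
    The detection principle, a baseline fit in the time-frequency plane followed by sigma clipping, can be sketched compactly. The fragment below substitutes two orthogonal 1-D polynomial fits for the paper's full two-dimensional fit and flags samples above 4 robust sigma; the polynomial degree and synthetic spectrogram are assumptions.

```python
import numpy as np

def flag_rfi(dynamic_spectrum, deg=3, nsigma=4.0):
    """Flag RFI in a time-frequency dynamic spectrum.

    A low-order polynomial baseline is fitted along each axis (a simple
    stand-in for a full two-dimensional baseline fit) and samples more
    than nsigma * rms above the baseline are flagged.
    """
    d = np.asarray(dynamic_spectrum, float)
    t = np.arange(d.shape[0])
    f = np.arange(d.shape[1])
    base = np.empty_like(d)
    for j in range(d.shape[1]):                   # baseline along time
        base[:, j] = np.polyval(np.polyfit(t, d[:, j], deg), t)
    for i in range(d.shape[0]):                   # refine along frequency
        resid = d[i] - base[i]
        base[i] += np.polyval(np.polyfit(f, resid, deg), f)
    resid = d - base
    rms = 1.4826 * np.median(np.abs(resid - np.median(resid)))  # robust rms
    return resid > nsigma * rms

# Synthetic spectrogram: drifting baseline plus one narrow-band RFI burst
rng = np.random.default_rng(2)
spec = rng.normal(0, 1, (200, 256)) + np.linspace(0, 5, 256)
spec[80:90, 100] += 25.0                          # strong RFI spike
print("flagged samples:", int(flag_rfi(spec).sum()))
```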

  10. RFI detection by automated feature extraction and statistical analysis

    CERN Document Server

    Winkel, B; Stanko, S; Winkel, Benjamin; Kerp, Juergen; Stanko, Stephan

    2006-01-01

    In this paper we present an interference detection toolbox consisting of a high dynamic range Digital Fast-Fourier-Transform spectrometer (DFFT, based on FPGA technology) and data analysis software for automated radio frequency interference (RFI) detection. The DFFT spectrometer allows high-speed storage of spectra on time scales of less than a second. The high dynamic range of the device ensures constant calibration even during extremely powerful RFI events. The software uses an algorithm which performs a two-dimensional baseline fit in the time-frequency domain, searching automatically for RFI signals superposed on the spectral data. We demonstrate that the software operates successfully on computer-generated RFI data as well as on real DFFT data recorded at the Effelsberg 100-m telescope. At 21-cm wavelength, RFI signals can be identified down to the 4-sigma level. A statistical analysis of all RFI events detected in our observational data revealed that: (1) mean signal strength is comparable to the a...

  11. Semi-Automated Detection of Surface Degradation on Bridges Based on a Level Set Method

    Science.gov (United States)

    Masiero, A.; Guarnieri, A.; Pirotti, F.; Vettore, A.

    2015-08-01

    Due to the effect of climate factors, natural phenomena and human usage, buildings and infrastructures are subject to progressive degradation. The deterioration of these structures has to be monitored in order to avoid hazards for human beings and for the natural environment in their neighborhood. Hence, on the one hand, monitoring such infrastructures is of primary importance. On the other hand, this monitoring is nowadays mostly done by expert and skilled personnel, who follow the overall data acquisition, analysis and result reporting process, making the whole monitoring procedure quite expensive for public (and private) agencies. This paper proposes the use of a partially user-assisted procedure in order to reduce the monitoring cost and to make the obtained results less subjective as well. The developed method relies on images acquired with standard cameras, even by inexperienced personnel. Deterioration on the infrastructure surface is detected by image segmentation based on a level set method. The results of the semi-automated analysis procedure are remapped onto a 3D model of the infrastructure obtained by means of a terrestrial laser scanning acquisition. The proposed method has been successfully tested on a portion of a road bridge in Perarolo di Cadore (BL), Italy.
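
    As an illustration of level-set segmentation for this task, the sketch below uses scikit-image's morphological Chan-Vese implementation as a stand-in for the paper's level set formulation; the toy facade image, the iteration count, and the polarity heuristic are assumptions.

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

def degraded_regions(gray_image, iterations=100):
    """Segment candidate degradation patches on a facade photograph with a
    morphological Chan-Vese level set; returns a binary mask."""
    img = (gray_image - gray_image.min()) / np.ptp(gray_image)
    mask = morphological_chan_vese(img, iterations) > 0
    # Keep the darker phase as "degraded": the polarity of the evolved
    # level set is arbitrary, so check against the mean intensities.
    if img[mask].mean() > img[~mask].mean():
        mask = ~mask
    return mask

# Toy example: darker "damaged" patch on a brighter concrete surface
rng = np.random.default_rng(3)
surface = 0.8 + 0.05 * rng.standard_normal((128, 128))
surface[40:80, 50:100] -= 0.4                     # degraded area
print("degraded fraction: %.2f" % degraded_regions(surface).mean())
```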

  12. THE METHOD OF FORMING THE PIGGYBACK TECHNOLOGIES USING THE AUTOMATED HEURISTIC SYSTEM

    Directory of Open Access Journals (Sweden)

    Ye. Nahornyi

    2015-07-01

    Full Text Available In order to choose a rational piggyback technology, a method is offered that envisages improving the automated system by giving it a heuristic nature. The automated system is based on a set of methods, techniques and strategies aimed at creating optimal resource-saving technologies, which makes it possible to take into account, with maximum efficiency, the interests of all the participants in the delivery process. When organizing piggyback traffic, the coordination of operations between the participants is presupposed in order to minimize the cargo travel time.

  13. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    Science.gov (United States)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast-enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, the fibroglandular tissues, and the enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were computed automatically. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, and thus achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
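
    The clustering step can be sketched with a plain fuzzy c-means on voxel intensities. The automatic selection of the cluster number described above is omitted here, and the intensities are synthetic; this is an illustration of the technique, not the authors' code.

```python
import numpy as np

def fuzzy_cmeans(x, n_clusters, m=2.0, n_iter=100, seed=0):
    """Plain fuzzy c-means on 1-D intensities.

    x : flattened voxel intensities inside the segmented whole breast
    Returns cluster centers and the membership matrix U (n_voxels x c).
    """
    rng = np.random.default_rng(seed)
    u = rng.random((x.size, n_clusters))
    u /= u.sum(axis=1, keepdims=True)             # memberships sum to 1
    for _ in range(n_iter):
        um = u**m
        centers = (um * x[:, None]).sum(0) / um.sum(0)
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / dist**(2.0 / (m - 1.0))         # standard FCM update
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Toy intensities: fatty (low) and fibroglandular (high) voxels
rng = np.random.default_rng(4)
vox = np.concatenate([rng.normal(100, 15, 5000), rng.normal(300, 30, 2000)])
centers, u = fuzzy_cmeans(vox, 2)
fgt_fraction = (u.argmax(axis=1) == centers.argmax()).mean()
print("centers:", np.round(np.sort(centers), 1),
      "FGT fraction: %.2f" % fgt_fraction)
```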

  14. Automated preparation of Kepler time series of planet hosts for asteroseismic analysis

    DEFF Research Database (Denmark)

    Handberg, R.; Lund, M. N.

    2014-01-01

    One of the tasks of the Kepler Asteroseismic Science Operations Center (KASOC) is to provide asteroseismic analyses on Kepler Objects of Interest (KOIs). However, asteroseismic analysis of planetary host stars presents some unique complications with respect to data preprocessing, compared to pure...... photometric time series than the original data. The methods are automated and can therefore easily be applied to a large number of stars. The application of the filter is not restricted to planetary hosts, but can be applied to any solar-like or red giant stars observed by Kepler/K2....

  15. An approach for model-based energy cost analysis of industrial automation systems

    Energy Technology Data Exchange (ETDEWEB)

    Beck, A.; Goehner, P. [Institute of Industrial Automation and Software Engineering, University of Stuttgart, Pfaffenwaldring 47, 70550 Stuttgart (Germany)

    2012-08-15

    Current energy reports confirm the steadily widening gap between available conventional energy resources and future energy demand. This gap results in increasing energy costs and has become a determining factor in economies. Hence, politics, industry, and research focus either on regenerative energy resources or on energy-efficient concepts, methods, and technologies for energy-consuming devices. A remaining challenge is the energy optimization of complex systems during their operation time. In addition to optimization measures that can be applied in development and engineering, the generation of optimization measures customized to the specific dynamic operational situation promises high cost-saving potential. During operation, the systems are located in unique situations and environments and are operated according to the individual requirements of their users. Hence, in addition to the complexity of the systems, the individuality and dynamic variability of their surroundings during operation complicate the identification of goal-oriented optimization measures. This contribution introduces a model-based approach for user-centric energy cost analysis of industrial automation systems. The approach allows automated generation and application of individual optimization proposals. The focus of this paper is on a basic variant for a single industrial automation system and its operational parameters.

  16. Automated Mineral Analysis to Characterize Metalliferous Mine Waste

    Science.gov (United States)

    Hensler, Ana-Sophie; Lottermoser, Bernd G.; Vossen, Peter; Langenberg, Lukas C.

    2016-10-01

    The objective of this study was to investigate the applicability of automated QEMSCAN® mineral analysis combined with bulk geochemical analysis to evaluate the environmental risk of non-acid producing mine waste present at the historic Albertsgrube Pb-Zn mine site, Hastenrath, North Rhine-Westphalia, Germany. Geochemical analyses revealed elevated average abundances of As, Cd, Cu, Mn, Pb, Sb and Zn and near neutral to slightly alkaline paste pH values. Mineralogical analyses using the QEMSCAN® revealed diverse mono- and polymineralic particles across all samples, with grain sizes ranging from a few μm up to 2000 μm. Calcite and dolomite (up to 78 %), smithsonite (up to 24 %) and Ca sulphate (up to 11.5 %) are present mainly as coarse-grained particles. By contrast, significant amounts of quartz, muscovite/illite, sphalerite (up to 10.8 %), galena (up to 1 %), pyrite (up to 3.4 %) and cerussite/anglesite (up to 4.3 %) are present as fine-grained (<500 μm) particles. QEMSCAN® analysis also identified disseminated sauconite, coronadite/chalcophanite, chalcopyrite, jarosite, apatite, rutile, K-feldspar, biotite, Fe (hydr) oxides/CO3 and unknown Zn Pb(Fe) and Zn Pb Ca (Fe Ti) phases. Many of the metal-bearing sulphide grains occur as separate particles with exposed surface areas and thus may be a matter of environmental concern, because such mineralogical hosts will continue to release metals and metalloids (As, Cd, Sb, Zn) at near neutral pH into ground and surface waters. QEMSCAN® mineral analysis allows acquisition of fully quantitative data on the mineralogical composition, textural characteristics and grain size estimation of mine waste material and permits the recognition of mine waste as “high-risk” material that would have otherwise been classified by traditional geochemical tests as benign.

  17. Technical aspects and evaluation methodology for the application of two automated brain MRI tumor segmentation methods in radiation therapy planning.

    Science.gov (United States)

    Beyer, Gloria P; Velthuizen, Robert P; Murtagh, F Reed; Pearlman, James L

    2006-11-01

    The purpose of this study was to design the steps necessary to create a tumor volume outline from the results of two automated multispectral magnetic resonance imaging segmentation methods and integrate these contours into radiation therapy treatment planning. Algorithms were developed to create a closed, smooth contour that encompassed the tumor pixels resulting from two automated segmentation methods: k-nearest neighbors and knowledge guided. These included an automatic three-dimensional (3D) expansion of the results to compensate for their undersegmentation and match the extended contouring technique used in practice by radiation oncologists. Each resulting radiation treatment plan generated from the automated segmentation and from the outlining by two radiation oncologists for 11 brain tumor patients was compared against the volume and treatment plan from an expert radiation oncologist who served as the control. As part of this analysis, a quantitative and qualitative evaluation mechanism was developed to aid in this comparison. It was found that the expert physician reference volume was irradiated within the same level of conformity when using the plans generated from the contours of the segmentation methods. In addition, any uncertainty in the identification of the actual gross tumor volume by the segmentation methods, as identified by previous research into this area, had small effects when used to generate 3D radiation therapy treatment planning due to the averaging process in the generation of margins used in defining a planning target volume.

  18. Automated Reasoning and Equation Solving with the Characteristic Set Method

    Institute of Scientific and Technical Information of China (English)

    Wen-Tsun Wu; Xiao-Shan Gao

    2006-01-01

    A brief introduction to the characteristic set method is given for solving algebraic equation systems and then the method is extended to algebraic difference systems. The method can be used to decompose the zero set for a difference polynomial set in general form to the union of difference polynomial sets in triangular form. Based on the characteristic set method, a decision procedure for the first order theory over an algebraically closed field and a procedure to prove certain difference identities are proposed.
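
    As a concrete illustration (our example, not one from the paper), putting a system in triangular form means ordering the polynomials so that each one introduces exactly one new variable, after which the zero set is found by back-substitution:

```latex
% For P = {x^2 + y^2 - 1, x - y}, substituting x = y into the first
% polynomial yields a triangular (ascending) set with the same zeros:
\[
  \{\, x^2 + y^2 - 1,\; x - y \,\}
  \;\longrightarrow\;
  \{\, 2y^2 - 1,\; x - y \,\}
\]
% The first polynomial involves only y; the second introduces x.
% Solving 2y^2 - 1 = 0 and back-substituting gives all common zeros,
% which is the pattern the characteristic set method generalizes.
```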

  19. Synchronous Control Method and Realization of Automated Pharmacy Elevator

    Science.gov (United States)

    Liu, Xiang-Quan

    Firstly, the control method for the elevator's synchronous motion is presented, and a synchronous control structure for dual servo motors based on PMAC is constructed. Secondly, the synchronous control program of the elevator is implemented by using the PMAC linear interpolation motion model and a position error compensation method. Finally, the PID parameters of the servo motors were tuned. Experiments show that the control method has high stability and reliability.
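
    The error-compensation idea can be sketched with a generic PID loop that trims one motor's command to track the other. This is an illustrative model only, not the paper's PMAC implementation; the gains and the read_position/set_velocity helpers are hypothetical:

```python
class PID:
    """Textbook PID controller, used here to null the position error
    between two elevator drive motors."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, error, dt):
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative


def read_position(motor):
    """Hypothetical encoder feedback; a real system would query the drive."""
    return 0.0


def set_velocity(motor, v):
    """Hypothetical drive command; a real system would write to the drive."""


pid = PID(kp=2.0, ki=0.5, kd=0.05)   # gains would be tuned on hardware
dt = 0.01                            # 100 Hz control loop
base_velocity = 50.0                 # shared motion profile for both motors

for _ in range(1000):                # bounded loop for illustration
    error = read_position("A") - read_position("B")
    correction = pid.update(error, dt)
    set_velocity("A", base_velocity)
    set_velocity("B", base_velocity + correction)
```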

  20. Hybrid digital signal processing and neural networks for automated diagnostics using NDE methods

    Energy Technology Data Exchange (ETDEWEB)

    Upadhyaya, B.R.; Yan, W. [Tennessee Univ., Knoxville, TN (United States). Dept. of Nuclear Engineering

    1993-11-01

    The primary purpose of the current research was to develop an integrated approach, combining information compression methods and artificial neural networks, for the monitoring of plant components using nondestructive examination data. Specifically, data from eddy current inspection of heat exchanger tubing were utilized to evaluate this technology. The focus of the research was to develop and test various data compression methods (for eddy current data) and the performance of different neural network paradigms for defect classification and defect parameter estimation. Feedforward, fully connected neural networks, which use the back-propagation algorithm for network training, were implemented for defect classification and defect parameter estimation using a modular network architecture. A large eddy current tube inspection database was acquired from the Metals and Ceramics Division of ORNL. These data were used to study the performance of artificial neural networks for defect type classification and for estimating defect parameters. A PC-based data preprocessing and display program was also developed as part of an expert system for data management and decision making. The results of the analysis showed that for effective (low-error) defect classification and estimation of parameters, it is necessary to identify proper feature vectors using different data representation methods. The integration of data compression and artificial neural networks for information processing was established as an effective technique for the automation of diagnostics using nondestructive examination methods.
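
    A modern equivalent of such a defect classifier can be sketched with scikit-learn; the feature vectors standing in for compressed eddy current signatures are synthetic, and the architecture is our assumption, not the one used in the report:

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Synthetic stand-ins for compressed eddy current feature vectors:
# 600 inspections, 16 features, 3 defect classes tied to the features
# so that the network has a learnable signal.
rng = np.random.default_rng(0)
X = rng.normal(size=(600, 16))
y = np.argmax(X[:, :3], axis=1)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Feedforward, fully connected network trained by backpropagation,
# analogous in spirit to the modular networks described above.
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(X_train, y_train)
print("defect classification accuracy:", clf.score(X_test, y_test))
```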

  1. Automated preparation of Kepler time series of planet hosts for asteroseismic analysis

    CERN Document Server

    Handberg, R

    2014-01-01

    One of the tasks of the Kepler Asteroseismic Science Operations Center (KASOC) is to provide asteroseismic analyses on Kepler Objects of Interest (KOIs). However, asteroseismic analysis of planetary host stars presents some unique complications with respect to data preprocessing, compared to pure asteroseismic targets. If not accounted for, the presence of planetary transits in the photometric time series often greatly complicates or even hinders these asteroseismic analyses. This drives the need for specialised methods of preprocessing data to make them suitable for asteroseismic analysis. In this paper we present the KASOC Filter, which is used to automatically prepare data from the Kepler/K2 mission for asteroseismic analyses of solar-like planet host stars. The methods are very effective at removing unwanted signals of both instrumental and planetary origins and produce significantly cleaner photometric time series than the original data. The methods are automated and can therefore easily be applied to a large number of stars. The application of the filter is not restricted to planetary hosts, but can be applied to any solar-like or red giant stars observed by Kepler/K2.

  2. Application of automated image analysis to coal petrography

    Science.gov (United States)

    Chao, E.C.T.; Minkin, J.A.; Thompson, C.L.

    1982-01-01

    The coal petrologist seeks to determine the petrographic characteristics of organic and inorganic coal constituents and their lateral and vertical variations within a single coal bed or different coal beds of a particular coal field. Definitive descriptions of coal characteristics and coal facies provide the basis for interpretation of depositional environments, diagenetic changes, and burial history and determination of the degree of coalification or metamorphism. Numerous coal core or columnar samples must be studied in detail in order to adequately describe and define coal microlithotypes, lithotypes, and lithologic facies and their variations. The large amount of petrographic information required can be obtained rapidly and quantitatively by use of an automated image-analysis system (AIAS). An AIAS can be used to generate quantitative megascopic and microscopic modal analyses for the lithologic units of an entire columnar section of a coal bed. In our scheme for megascopic analysis, distinctive bands 2 mm or more thick are first demarcated by visual inspection. These bands consist of either nearly pure microlithotypes or lithotypes such as vitrite/vitrain or fusite/fusain, or assemblages of microlithotypes. Megascopic analysis with the aid of the AIAS is next performed to determine volume percentages of vitrite, inertite, minerals, and microlithotype mixtures in bands 0.5 to 2 mm thick. The microlithotype mixtures are analyzed microscopically by use of the AIAS to determine their modal composition in terms of maceral and optically observable mineral components. Megascopic and microscopic data are combined to describe the coal unit quantitatively in terms of (V) for vitrite, (E) for liptite, (I) for inertite or fusite, (M) for mineral components other than iron sulfide, (S) for iron sulfide, and (VEIM) for the composition of the mixed phases (X_i), i = 1, 2, etc., in terms of the maceral groups vitrinite V, exinite E, inertinite I, and optically observable minerals.

  3. An Automated Method for Semantic Classification of Regions in Coastal Images

    NARCIS (Netherlands)

    Hoonhout, B.M.; Radermacher, M.; Baart, F.; Van der Maaten, L.J.P.

    2015-01-01

    Large, long-term coastal imagery datasets are nowadays a low-cost source of information for various coastal research disciplines. However, the applicability of many existing algorithms for coastal image analysis is limited for these large datasets due to a lack of automation and robustness. Therefore ...

  4. VisioTracker, an innovative automated approach to oculomotor analysis.

    Science.gov (United States)

    Mueller, Kaspar P; Schnaedelbach, Oliver D R; Russig, Holger D; Neuhauss, Stephan C F

    2011-10-12

    Investigations into visual system development and function necessitate quantifiable behavioral models of visual performance that are easy to elicit, robust, and simple to manipulate. A suitable model has been found in the optokinetic response (OKR), a reflexive behavior present in all vertebrates due to its high selection value. The OKR involves slow stimulus-following movements of the eyes alternated with rapid resetting saccades. The measurement of this behavior is easily carried out in zebrafish larvae, due to its early and stable onset (fully developed after 96 hours post fertilization (hpf)), and benefits from the thorough knowledge of zebrafish genetics, the zebrafish having been one of the favored model organisms in this field for decades. Meanwhile, the analysis of similar mechanisms in adult fish has gained importance, particularly for pharmacological and toxicological applications. Here we describe VisioTracker, a fully automated, high-throughput system for quantitative analysis of visual performance. The system is based on research carried out in the group of Prof. Stephan Neuhauss and was re-designed by TSE Systems. It consists of an immobilizing device for small fish monitored by a high-quality video camera equipped with a high-resolution zoom lens. The fish container is surrounded by a drum screen, upon which computer-generated stimulus patterns can be projected. Eye movements are recorded and automatically analyzed by the VisioTracker software package in real time. Data analysis enables immediate recognition of parameters such as slow- and fast-phase duration, movement cycle frequency, slow-phase gain, visual acuity, and contrast sensitivity. Typical results allow, for example, the rapid identification of visual system mutants that show no apparent alteration in wild-type morphology, or the determination of quantitative effects of pharmacological, toxic, or mutagenic agents on visual system performance.

  5. Multimodal microscopy for automated histologic analysis of prostate cancer

    Directory of Open Access Journals (Sweden)

    Sinha Saurabh

    2011-02-01

    Background: Prostate cancer is the single most prevalent cancer in US men, whose gold standard of diagnosis is histologic assessment of biopsies. Manual assessment of stained tissue of all biopsies limits speed and accuracy in clinical practice and research of prostate cancer diagnosis. We sought to develop a fully automated multimodal microscopy method to distinguish cancerous from non-cancerous tissue samples. Methods: We recorded chemical data from an unstained tissue microarray (TMA) using Fourier transform infrared (FT-IR) spectroscopic imaging. Using pattern recognition, we identified epithelial cells without user input. We fused the cell type information with the corresponding stained images commonly used in clinical practice. Extracted morphological features, optimized by a two-stage feature selection method using a minimum-redundancy-maximal-relevance (mRMR) criterion and sequential floating forward selection (SFFS), were applied to classify tissue samples as cancer or non-cancer. Results: We achieved high accuracy (area under the ROC curve (AUC) > 0.97) in cross-validations on each of two data sets that were stained under different conditions. When the classifier was trained on one data set and tested on the other data set, an AUC value of ~0.95 was observed. In the absence of IR data, the performance of the same classification system dropped for both data sets and between data sets. Conclusions: We were able to achieve very effective fusion of the information from two different images that provide very different types of data with different characteristics. The method is entirely transparent to a user and does not involve any adjustment or decision-making based on spectral data. By combining the IR and optical data, we achieved highly accurate classification.
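
    Feature selection of this general flavor can be approximated with scikit-learn. Note that SequentialFeatureSelector implements plain (non-floating) forward selection, so the snippet below is a simplified stand-in for the mRMR + SFFS pipeline described above, run on synthetic data:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import LogisticRegression

# Synthetic stand-in for morphological features of epithelial regions.
X, y = make_classification(n_samples=300, n_features=40, n_informative=8,
                           random_state=0)

# Greedy forward selection toward a compact discriminative subset;
# SFFS additionally allows conditional backward steps, omitted here.
estimator = LogisticRegression(max_iter=2000)
sfs = SequentialFeatureSelector(estimator, n_features_to_select=8,
                                direction="forward", cv=5)
sfs.fit(X, y)
print("selected feature indices:", np.flatnonzero(sfs.get_support()))
```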

  6. Automated segmentation refinement of small lung nodules in CT scans by local shape analysis.

    Science.gov (United States)

    Diciotti, Stefano; Lombardo, Simone; Falchini, Massimo; Picozzi, Giulia; Mascalchi, Mario

    2011-12-01

    One of the most important problems in the segmentation of lung nodules in CT imaging arises from possible attachments between nodules and other lung structures, such as vessels or pleura. In this report, we address the problem of vessel attachments by proposing an automated correction method applied to an initial rough segmentation of the lung nodule. The method is based on a local shape analysis of the initial segmentation making use of 3-D geodesic distance map representations. The correction method has the advantage that it locally refines the nodule segmentation along recognized vessel attachments only, without modifying the nodule boundary elsewhere. The method was tested using a simple initial rough segmentation obtained by fixed image thresholding. The validation of the complete segmentation algorithm was carried out on small lung nodules identified in the ITALUNG screening trial and on small nodules of the lung image database consortium (LIDC) dataset. In fully automated mode, 217/256 (84.8%) lung nodules of ITALUNG and 139/157 (88.5%) individual marks of lung nodules of LIDC were correctly outlined, and excellent reproducibility was also observed. By using an additional interactive mode, based on a controlled manual interaction, 233/256 (91.0%) lung nodules of ITALUNG and 144/157 (91.7%) individual marks of lung nodules of LIDC were overall correctly segmented. The proposed correction method could also be usefully applied to any existing nodule segmentation algorithm for improving the segmentation quality of juxta-vascular nodules.
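
    A geodesic distance map restricted to a segmentation mask, of the kind used for this local shape analysis, can be sketched in 2-D with scikit-image's minimum cost path machinery; the toy mask and seed choice are illustrative assumptions:

```python
import numpy as np
from skimage.graph import MCP_Geometric

# Toy binary segmentation: a blob with a thin vessel-like protrusion.
mask = np.zeros((64, 64), dtype=bool)
mask[20:44, 20:44] = True      # nodule body
mask[30:34, 44:60] = True      # attached vessel stub

# Travel cost is 1 inside the mask and infinite outside, so distances
# are measured along paths that stay within the segmentation.
costs = np.where(mask, 1.0, np.inf)
mcp = MCP_Geometric(costs)
seed = (32, 32)                # e.g., the nodule core
geodesic, _ = mcp.find_costs([seed])

# Pixels whose geodesic distance greatly exceeds their straight-line
# distance from the core are candidates for vessel attachments.
print("max geodesic distance inside mask:", geodesic[mask].max())
```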

  7. Automated methods and control when mining seams prone to outburst

    Energy Technology Data Exchange (ETDEWEB)

    Kolesov, O.A.; Agaphonov, A.V.; Kolchin, G.I. [Makeyevka Safety in Mines Research Institute (Ukraine)

    1995-12-31

    Drawbacks in existing methods of predicting outburst zones in Donbas (Ukraine) thin coal seams led specialists at MakNII to investigate methods based on artificially excited acoustic signals, with processing by personal computers. The paper describes investigations to correlate different acoustic signal parameters with the stress-strain state of the massif face. The method proved reliable in determining the relief zone in 12 Donbas mines. The paper goes on to describe the development of a control method for another widely used method of coal and gas outburst prevention in Donbas, that of water injection into the coal seam, known as 'hydroripping'. This method includes recording acoustic signals and determining face zone parameters during drilling of the infusion holes, as well as recording and processing, in real time, the acoustic signal created during water infusion. 8 refs.

  8. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Background: During the past decade, many software packages have been developed for the analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of analysis errors or bugs reported by users. Results: We report here the implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusion: The dChip automation module is a step toward reproducible research, and it can prompt a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  9. Automated multidimensional image analysis reveals a role for Abl in embryonic wound repair.

    Science.gov (United States)

    Zulueta-Coarasa, Teresa; Tamada, Masako; Lee, Eun J; Fernandez-Gonzalez, Rodrigo

    2014-07-01

    The embryonic epidermis displays a remarkable ability to repair wounds rapidly. Embryonic wound repair is driven by the evolutionarily conserved redistribution of cytoskeletal and junctional proteins around the wound. Drosophila has emerged as a model to screen for factors implicated in wound closure. However, genetic screens have been limited by the use of manual analysis methods. We introduce MEDUSA, a novel image-analysis tool for the automated quantification of multicellular and molecular dynamics from time-lapse confocal microscopy data. We validate MEDUSA by quantifying wound closure in Drosophila embryos, and we show that the results of our automated analysis are comparable to analysis by manual delineation and tracking of the wounds, while significantly reducing the processing time. We demonstrate that MEDUSA can also be applied to the investigation of cellular behaviors in three and four dimensions. Using MEDUSA, we find that the conserved nonreceptor tyrosine kinase Abelson (Abl) contributes to rapid embryonic wound closure. We demonstrate that Abl plays a role in the organization of filamentous actin and the redistribution of the junctional protein β-catenin at the wound margin during embryonic wound repair. Finally, we discuss different models for the role of Abl in the regulation of actin architecture and adhesion dynamics at the wound margin.

  10. Analysis of Divergence of PM2.5 Concentration Measured by the Gravimetric Measurement Method and the Automated Monitoring Method and Its Relationship with Meteorological Conditions

    Institute of Scientific and Technical Information of China (English)

    郑翔翔; 洪正昉; 黄芳; 陈浩; 吕晶

    2015-01-01

    An intercomparison of PM2.5 concentrations measured simultaneously by the gravimetric (manual) measurement method and the automated monitoring method was carried out in Hangzhou, Quzhou and Wenzhou in order to study the divergence between the two methods and its relationship with meteorological conditions. The experience gained from this intercomparison supports the on-site manual verification of PM2.5 automated monitoring and helps ensure the accuracy of automated PM2.5 monitoring data.

  11. Automated Finite Element Analysis of Elastically-Tailored Plates

    Science.gov (United States)

    Jegley, Dawn C. (Technical Monitor); Tatting, Brian F.; Guerdal, Zafer

    2003-01-01

    A procedure for analyzing and designing elastically tailored composite laminates using the STAGS finite element solver has been presented. The methodology used to produce the elastic tailoring, namely computer-controlled steering of unidirectionally reinforced composite material tows, has been reduced to a handful of design parameters along with a selection of construction methods. The generality of the tow-steered ply definition provides the user a wide variety of options for laminate design, which can be automatically incorporated with any finite element model that is composed of STAGS shell elements. Furthermore, the variable stiffness parameterization is formulated so that manufacturability can be assessed during the design process, plus new ideas using tow steering concepts can be easily integrated within the general framework of the elastic tailoring definitions. Details for the necessary implementation of the tow-steering definitions within the STAGS hierarchy is provided, and the format of the ply definitions is discussed in detail to provide easy access to the elastic tailoring choices. Integration of the automated STAGS solver with laminate design software has been demonstrated, so that the large design space generated by the tow-steering options can be traversed effectively. Several design problems are presented which confirm the usefulness of the design tool as well as further establish the potential of tow-steered plies for laminate design.

  12. Automated Chemical Analysis of Internally Mixed Aerosol Particles Using X-ray Spectromicroscopy at the Carbon K-Edge

    Energy Technology Data Exchange (ETDEWEB)

    Gilles, Mary K; Moffet, R.C.; Henn, T.; Laskin, A.

    2011-01-20

    We have developed an automated data analysis method for atmospheric particles using scanning transmission X-ray microscopy coupled with near edge X-ray fine structure spectroscopy (STXM/NEXAFS). This method is applied to complex internally mixed submicrometer particles containing organic and inorganic material. Several algorithms were developed to exploit NEXAFS spectral features in the energy range from 278 to 320 eV for quantitative mapping of the spatial distribution of elemental carbon, organic carbon, potassium, and noncarbonaceous elements in particles of mixed composition. This energy range encompasses the carbon K-edge and potassium L2 and L3 edges. STXM/NEXAFS maps of different chemical components were complemented with a subsequent analysis using elemental maps obtained by scanning electron microscopy coupled with energy dispersive X-ray analysis (SEM/EDX). We demonstrate the application of the automated mapping algorithms for data analysis and the statistical classification of particles.

  13. ANALYSIS OF MULTISCALE METHODS

    Institute of Scientific and Technical Information of China (English)

    Wei-nan E; Ping-bing Ming

    2004-01-01

    The heterogeneous multiscale method gives a general framework for the analysis of multiscale methods. In this paper, we demonstrate this by applying this framework to two canonical problems: The elliptic problem with multiscale coefficients and the quasicontinuum method.

  14. Linking Automated Data Analysis and Visualization with Applications in Developmental Biology and High-Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Ruebel, Oliver [Technical Univ. of Darmstadt (Germany)

    2009-11-20

    Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the growing number of data dimensions and data objects presents tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data, and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The analysis framework and MATLAB have been integrated with the visualization, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle ...

  15. Research into the automation of the proximate analysis of coal (II): the establishment of a method for the rapid determination of ash in coal, and combustion residues in coal ash

    Energy Technology Data Exchange (ETDEWEB)

    Hase, Y.

    1986-01-01

    The JIS method for coal ash analysis requires 2.5-3 hours for ashing and a total of 3-3.5 hours for the complete determination. The author reports a new method in which ashing time is reduced to about 3 minutes and overall analysis time to approximately 30 minutes. The former is achieved by employing oxygen and using a new type of ashing vessel, while the latter time reduction is due to the introduction of cooling. Measurement precision with the new method is adequate for all practical purposes, apart from in the case of Miike coal, which has a particularly high sulfur content. 2 references, 4 figures, 17 tables.

  16. Empirical Analysis and Automated Classification of Security Bug Reports

    Science.gov (United States)

    Tyo, Jacob P.

    2016-01-01

    With the ever-expanding amount of sensitive data being placed into computer systems, the need for effective cybersecurity is of utmost importance. However, there is a shortage of detailed empirical studies of security vulnerabilities from which cybersecurity metrics and best practices could be determined. This thesis has two main research goals: (1) to explore the distribution and characteristics of security vulnerabilities based on the information provided in bug tracking systems and (2) to develop data analytics approaches for automatic classification of bug reports as security or non-security related. This work is based on using three NASA datasets as case studies. The empirical analysis showed that the majority of software vulnerabilities belong to only a small number of types. Addressing these types of vulnerabilities will consequently lead to cost-efficient improvement of software security. Since this analysis requires labeling of each bug report in the bug tracking system, we explored using machine learning to automate the classification of each bug report as security or non-security related (two-class classification), as well as of each security-related bug report as a specific security type (multiclass classification). In addition to using supervised machine learning algorithms, a novel unsupervised machine learning approach is proposed. An accuracy of 92%, recall of 96%, precision of 92%, probability of false alarm of 4%, F-score of 81%, and G-score of 90% were the best results achieved during two-class classification. Furthermore, an accuracy of 80%, recall of 80%, precision of 94%, and F-score of 85% were the best results achieved during multiclass classification.
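
    A minimal version of the two-class bug report classifier can be sketched as a TF-IDF pipeline; the example reports are invented, and the model choice is ours, not the thesis's:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented miniature corpus standing in for bug tracking system reports.
reports = [
    "buffer overflow when parsing oversized telemetry packet",
    "privilege escalation via unchecked command argument",
    "GUI button label truncated on small screens",
    "scheduler crashes when log directory is missing",
]
labels = [1, 1, 0, 0]  # 1 = security related, 0 = non-security

# TF-IDF features feeding a linear classifier (two-class case).
clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(reports, labels)
print(clf.predict(["heap corruption triggered by malformed uplink frame"]))
```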

  17. Automation of a center pivot using the temperature-time-threshold method of irrigation scheduling

    Science.gov (United States)

    A center pivot was completely automated using the temperature-time-threshold (TTT) method of irrigation scheduling. An array of infrared thermometers was mounted on the center pivot and these were used to remotely determine the crop leaf temperature as an indicator of crop water stress. We describe ...

  18. Foreign object detection and removal to improve automated analysis of chest radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Hogeweg, Laurens; Sanchez, Clara I.; Melendez, Jaime; Maduskar, Pragnya; Ginneken, Bram van [Diagnostic Image Analysis Group, Radboud University Nijmegen Medical Centre, Nijmegen 6525 GA (Netherlands); Story, Alistair; Hayward, Andrew [University College London, Centre for Infectious Disease Epidemiology, London NW3 2PF (United Kingdom)

    2013-07-15

    Purpose: Chest radiographs commonly contain projections of foreign objects, such as buttons, brassiere clips, jewellery, or pacemakers and wires. The presence of these structures can substantially affect the output of computer analysis of these images. An automated method is presented to detect, segment, and remove foreign objects from chest radiographs. Methods: Detection is performed using supervised pixel classification with a kNN classifier, resulting in a probability estimate per pixel of belonging to a projected foreign object. Segmentation is performed by grouping and post-processing pixels with a probability above a certain threshold. Next, the objects are replaced by texture inpainting. Results: The method is evaluated in experiments on 257 chest radiographs. The detection at pixel level is evaluated with receiver operating characteristic analysis on pixels within the unobscured lung fields, and an A_z value of 0.949 is achieved. Free-response operating characteristic analysis is performed at the object level, and 95.6% of objects are detected with on average 0.25 false positive detections per image. To investigate the effect of removing the detected objects through inpainting, a texture analysis system for tuberculosis detection is applied to images with and without pathology and with and without foreign object removal. Unprocessed, the texture analysis abnormality score of normal images with foreign objects is comparable to those with pathology. After removing foreign objects, the texture score of normal images with and without foreign objects is similar, while abnormal images, whether they contain foreign objects or not, achieve on average higher scores. Conclusions: The authors conclude that removal of foreign objects from chest radiographs is feasible and beneficial for automated image analysis.
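
    Supervised pixel classification with a kNN classifier reduces, in sketch form, to labeling per-pixel feature vectors. Here the features (intensity plus local statistics), the toy image, and the training mask are illustrative assumptions, not the paper's feature set:

```python
import numpy as np
from scipy.ndimage import uniform_filter
from sklearn.neighbors import KNeighborsClassifier

def pixel_features(img):
    """Stack simple per-pixel features: intensity, local mean, local variance."""
    mean = uniform_filter(img, size=5)
    var = uniform_filter(img ** 2, size=5) - mean ** 2
    return np.stack([img, mean, var], axis=-1).reshape(-1, 3)

# Toy radiograph with a bright "foreign object" patch.
img = np.random.default_rng(0).random((64, 64)) * 0.3
img[20:30, 20:30] += 0.6
truth = np.zeros((64, 64), dtype=int)
truth[20:30, 20:30] = 1                   # annotated object pixels

X = pixel_features(img)
knn = KNeighborsClassifier(n_neighbors=15)
knn.fit(X, truth.ravel())

# Per-pixel probability of belonging to a projected foreign object;
# thresholding, grouping, and inpainting would follow, as in the method.
prob = knn.predict_proba(X)[:, 1].reshape(img.shape)
print("pixels above 0.5:", int((prob > 0.5).sum()))
```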

  19. Interobserver and Intraobserver Variability in pH-Impedance Analysis between 10 Experts and Automated Analysis

    DEFF Research Database (Denmark)

    Loots, Clara M; van Wijk, Michiel P; Blondeau, Kathleen;

    2011-01-01

    OBJECTIVE: To determine interobserver and intraobserver variability in pH-impedance interpretation between experts and the accuracy of automated analysis (AA). STUDY DESIGN: Ten pediatric 24-hour pH-impedance tracings were analyzed by 10 observers from 7 world groups and with AA. Detection of gastroesophageal reflux (GER) episodes ... CONCLUSION: Interobserver agreement in combined pH-multichannel intraluminal impedance analysis in experts is moderate; only 42% of GER episodes were detected by the majority of observers. Detection of total GER numbers is more consistent. Considering these poor outcomes, AA seems favorable compared ...

  20. Analysis methods for neutron-induced resonances in time-of-flight transmission experiments and automation of these methods on the IBM 7094 II computer

    Energy Technology Data Exchange (ETDEWEB)

    Corge, C

    1967-07-01

    The analysis of neutron-induced resonances aims to determine the resonance characteristics: excitation energies; de-excitation probabilities by gamma radiation emission, neutron emission, or fission; spins; and parities. This document describes the methods developed or adapted, the calculation schemes, and the algorithms implemented to perform such analyses on a computer, from data obtained during time-of-flight experiments on the linear accelerator of Saclay. (A.L.B.)

  1. Automated Design and Analysis Tool for CEV Structural and TPS Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CEV structures and TPS. This developed process will...

  2. Automated Design and Analysis Tool for CLV/CEV Composite and Metallic Structural Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The innovation of the proposed effort is a unique automated process for the analysis, design, and sizing of CLV/CEV composite and metallic structures. This developed...

  3. Evaluation of automated and manual DNA purification methods for detecting Ricinus communis DNA during ricin investigations.

    Science.gov (United States)

    Hutchins, Anne S; Astwood, Michael J; Saah, J Royden; Michel, Pierre A; Newton, Bruce R; Dauphin, Leslie A

    2014-03-01

    In April of 2013, letters addressed to the President of the United States and other government officials were intercepted and found to be contaminated with ricin, heightening awareness about the need to evaluate laboratory methods for detecting ricin. This study evaluated commercial DNA purification methods for isolating Ricinus communis DNA as measured by real-time polymerase chain reaction (PCR). Four commercially available DNA purification methods (two automated, MagNA Pure compact and MagNA Pure LC, and two manual, MasterPure complete DNA and RNA purification kit and QIAamp DNA blood mini kit) were evaluated. We compared their ability to purify detectable levels of R. communis DNA from four different sample types, including crude preparations of ricin that could be used for biological crimes or acts of bioterrorism. Castor beans, spiked swabs, and spiked powders were included to simulate sample types typically tested during criminal and public health investigations. Real-time PCR analysis indicated that the QIAamp kit resulted in the greatest sensitivity for ricin preparations; the MasterPure kit performed best with spiked powders. The four methods detected equivalent levels by real-time PCR when castor beans and spiked swabs were used. All four methods yielded DNA free of PCR inhibitors, as determined by the use of a PCR inhibition control assay. This study demonstrated that DNA purification methods differ in their ability to purify R. communis DNA; therefore, the purification method used for a given sample type can influence the sensitivity of real-time PCR assays for R. communis.

  4. Wine analysis to check quality and authenticity by fully-automated 1H-NMR

    Directory of Open Access Journals (Sweden)

    Spraul Manfred

    2015-01-01

    Fully automated high-resolution 1H-NMR spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis (15-20 min from acquisition to report). The advantage of high-resolution 1H-NMR is its absolute reproducibility and transferability from laboratory to laboratory, which is not equaled by any other method currently used in food analysis. NMR reproducibility allows statistical investigations, e.g., for detection of variety, geographical origin, and adulterations, where the smallest changes in many ingredients at the same time must be recorded. Reproducibility and transferability of the solutions shown are user-, instrument- and laboratory-independent. Sample preparation, measurement, and processing are based on strict standard operating procedures, which are essential for this fully automated solution. The non-targeted approach to the data allows detection of even unknown deviations, if they are visible in the 1H-NMR spectra of, e.g., fruit juice, wine, or honey. The same data acquired in high-throughput mode are also subjected to quantification of multiple compounds. This 1H-NMR methodology is briefly introduced, then results on wine are presented and the advantages of the solutions shown. The method has been proven on juice, honey, and wine, where previously unknown frauds could be detected while, at the same time, targeted parameters are obtained.

  5. Automated Production Flow Line Failure Rate Mathematical Analysis with Probability Theory

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2014-12-01

    Automated lines have been widely used in industry, especially for mass production and product customization. The productivity of an automated line is a crucial indicator of the output and performance of production. Failure or breakdown of stations or mechanisms commonly occurs in automated lines under real conditions, due to technological and technical problems, and strongly affects productivity. The failure rates of automated lines are usually not expressed or analysed in mathematical form. This paper presents a mathematical analysis, based on probability theory, of failure conditions in an automated line. The resulting expressions for the failure rates can be used to forecast the productivity of the line accurately.
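
    To make the flavor of such an analysis concrete, the sketch below evaluates the classic textbook transfer-line relation Rp = 1 / (Tc + F * Td), where F is the expected number of station failures per cycle; this standard model and all parameter values are our illustration, not the paper's derivation:

```python
# Productivity of an n-station automated line with per-cycle station
# failure probabilities p_i (classic transfer-line model; the values
# below are invented for illustration).

cycle_time = 0.5            # ideal cycle time Tc, minutes per part
downtime = 8.0              # average downtime Td per failure, minutes
station_failure_probs = [0.004, 0.002, 0.006, 0.003]  # per cycle

# Expected failures per cycle: approximation summing the (assumed
# independent, small) station failure probabilities.
F = sum(station_failure_probs)

effective_cycle = cycle_time + F * downtime
production_rate = 60.0 / effective_cycle       # parts per hour
line_efficiency = cycle_time / effective_cycle

print(f"expected failures/cycle F = {F:.3f}")
print(f"production rate = {production_rate:.1f} parts/h")
print(f"line efficiency = {line_efficiency:.1%}")
```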

  6. The Model and Control Methods of Access to Information and Technology Resources of Automated Control Systems in Water Supply Industry

    Science.gov (United States)

    Rytov, M. Yu; Spichyack, S. A.; Fedorov, V. P.; Petreshin, D. I.

    2017-01-01

    The paper describes a formalized control model of access to the information and technological resources of automated control systems at water supply enterprises. The model takes into account the availability of various communication links with information systems and technological equipment. Control methods for access to the information and technological resources of automated control systems at water supply enterprises are also studied. On the basis of the formalized control model and the appropriate methods, a software-hardware complex for rapid access to the information and technological resources of automated control systems was developed, which contains an automated workplace for the administrator and workplaces for end users.

  7. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Directory of Open Access Journals (Sweden)

    Phlypo Ronald

    2010-01-01

    We propose a fully automated active-contours-based method for the detection and segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object, to overcome these problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome by an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves convergence and processing time while providing the advantages of a fully automated method.
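
    Region-based active contour segmentation of this kind is available off the shelf in scikit-image; the snippet below uses the morphological Chan-Vese variant on a synthetic grayscale image as a stand-in for the paper's color-map-driven pipeline:

```python
import numpy as np
from skimage.segmentation import morphological_chan_vese

# Synthetic underwater-style scene: a bright object on a noisy background.
rng = np.random.default_rng(0)
image = rng.normal(0.2, 0.05, size=(128, 128))
image[40:80, 50:100] += 0.5                     # the "manufactured object"

# Region-based active contour; the checkerboard initialization makes
# the result largely independent of a manual starting contour.
# (The num_iter argument was named `iterations` in older scikit-image.)
segmentation = morphological_chan_vese(
    image, num_iter=100, init_level_set="checkerboard", smoothing=3
)
print("object pixels:", int(segmentation.sum()))
```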

  8. Image cytometer method for automated assessment of human spermatozoa concentration

    DEFF Research Database (Denmark)

    Egeberg, D L; Kjaerulff, S; Hansen, C

    2013-01-01

    In the basic clinical work-up of infertile couples, a semen analysis is mandatory and the sperm concentration is one of the most essential variables to be determined. Sperm concentration is usually assessed by manual counting using a haemocytometer and is hence labour intensive and may be subject to investigator bias. Here we show that image cytometry can be used to accurately measure the sperm concentration of human semen samples with great ease and reproducibility. The impact of several factors (pipetting, mixing, round cell content, sperm concentration), which can influence the read-out, was also investigated ... allowing accurate and easy measurement of human sperm concentration.

  9. Automated Analysis of the SCR-Style Requirements Specifications

    Institute of Scientific and Technical Information of China (English)

    WU Guoqing; LIU Xiang; YING Shi; Tetsuo Tamai

    1999-01-01

    The SCR (Software Cost Reduction) requirements method is an effective method for specifying software system requirements. This paper presents a formal model for analyzing SCR-style requirements. The analysis model mainly applies state translation rules, semantic computing rules and attributes to define the formal semantics of a tabular notation in the SCR requirements method, and may be used to analyze requirements specifications specified by the SCR requirements method. Using a simple example, this paper introduces how to analyze the consistency and completeness of requirements specifications.

  10. Automated quantification and integrative analysis of 2D and 3D mitochondrial shape and network properties.

    Directory of Open Access Journals (Sweden)

    Julie Nikolaisen

    Mitochondrial morphology and function are coupled in healthy cells, during pathological conditions, and during adaptation to endogenous and exogenous stress. In this sense mitochondrial shape can range from small globular compartments to complex filamentous networks, even within the same cell. Understanding how mitochondrial morphological changes (i.e., "mitochondrial dynamics") are linked to cellular (patho)physiology is currently the subject of intense study and requires detailed quantitative information. During the last decade, various computational approaches have been developed for automated 2-dimensional (2D) analysis of mitochondrial morphology and number in microscopy images. Although these strategies are well suited for analysis of adhering cells with a flat morphology, they are not applicable to thicker cells, which require a three-dimensional (3D) image acquisition and analysis procedure. Here we developed and validated an automated image analysis algorithm allowing simultaneous 3D quantification of mitochondrial morphology and network properties in human endothelial cells (HUVECs). Cells expressing a mitochondria-targeted green fluorescent protein (mitoGFP) were visualized by 3D confocal microscopy and mitochondrial morphology was quantified using both the established 2D method and the new 3D strategy. We demonstrate that both analyses can be used to characterize and discriminate between various mitochondrial morphologies and network properties. However, the results from 2D and 3D analysis were not equivalent when filamentous mitochondria in normal HUVECs were compared with circular/spherical mitochondria in metabolically stressed HUVECs treated with rotenone (ROT). 2D quantification suggested that metabolic stress induced mitochondrial fragmentation and loss of biomass. In contrast, 3D analysis revealed that the mitochondrial network structure was dissolved without affecting the amount and size of the organelles. Thus, our results demonstrate ...

  11. Automated patient and medication payment method for clinical trials

    Directory of Open Access Journals (Sweden)

    Yawn BP

    2013-01-01

    Barbara P Yawn,1 Suzanne Madison,1 Susan Bertram,1 Wilson D Pace,2 Anne Fuhlbrigge,3 Elliot Israel,3 Dawn Littlefield,1 Margary Kurland,1 Michael E Wechsler4; 1Olmsted Medical Center, Department of Research, Rochester, MN; 2UCDHSC, Department of Family Medicine, University of Colorado Health Science Centre, Aurora, CO; 3Brigham and Women's Hospital, Pulmonary and Critical Care Division, Boston, MA; 4National Jewish Medical Center, Division of Pulmonology, Denver, CO, USA. Background: Published reports and studies related to patient compensation for clinical trials focus primarily on the ethical issues related to appropriate amounts to reimburse for a patient's time and risk burden. Little has been published regarding the method of payment for patient participation. As clinical trials move into widely dispersed community practices and more complex designs, the method of payment also becomes more complex. Here we review the decision process and payment method selected for a primary care-based randomized clinical trial of asthma management in Black Americans. Methods: The method selected is a credit card system designed specifically for clinical trials that allows both fixed and variable real-time payments. We operationalized the study design by providing each patient with two cards, one for reimbursement for study visits and one for payment of medication costs directly to the pharmacies. Results: Of the 1015 patients enrolled, only two refused use of the ClinCard, requesting cash payments for visits, and only rarely did a weekend or fill-in pharmacist refuse to use the card system for payment directly to the pharmacy. Overall, the system has been well accepted by patients and local study teams. The ClinCard administrative system facilitates the fiscal accounting and medication adherence record-keeping by the central teams. Monthly fees are modest, and all 12 study institutional review boards approved use of the system without concern for patient ...

  12. Interferences in automated phenol red method for determination of bromide in water

    Science.gov (United States)

    Basel, C.L.; Defreese, J.D.; Whittemore, D.O.

    1982-01-01

    The phenol red method for the determination of bromide in water has been automated by segmented flow analysis. Samples can be analyzed at a rate of 20 samples/h with a method detection limit, defined as the concentration giving a signal about three times the standard deviation of replicate analyte determinations in reagent water, of 10 μg/L. Samples studied include oil-field brines, halite solution brines, groundwaters contaminated with these brines, and fresh groundwaters. Chloride and bicarbonate cause significant positive interferences at levels as low as 100 mg/L and 50 mg/L, respectively. Ammonia gives a negative interference that is important at levels as low as 0.05 mg/L. An ionic strength buffer is used to suppress a positive ionic strength interference, correction curves are used to compensate for the chloride interference, the bicarbonate interference is minimized by acidification, and the ammonia interference is eliminated by removal with ion exchange. Reaction product studies are used to suggest a plausible mode of chloride interference. © 1982 American Chemical Society.

  13. Automation Tools for Finite Element Analysis of Adhesively Bonded Joints

    Science.gov (United States)

    Tahmasebi, Farhad; Brodeur, Stephen J. (Technical Monitor)

    2002-01-01

    This article presents two new automation tools that obtain stresses and strains (shear and peel) in adhesively bonded joints. For a given finite element model of an adhesively bonded joint, in which the adhesive is characterised using springs, these automation tools read the corresponding input and output files, use the spring forces and deformations to obtain the adhesive stresses and strains, sort the stresses and strains in descending order, and generate plot files for 3D visualisation of the stress and strain fields. Grids (nodes) and elements can be numbered in any order that is convenient for the user. Using the automation tools, trade-off studies, which are needed for the design of adhesively bonded joints, can be performed very quickly.

  14. A method for fast automated microscope image stitching.

    Science.gov (United States)

    Yang, Fan; Deng, Zhen-Sheng; Fan, Qiu-Hong

    2013-05-01

    Image stitching is an important technology for producing a panorama or larger image by combining several images with overlapping areas. In many biomedical research settings, image stitching is highly desirable for acquiring a panoramic image that represents large areas of certain structures or whole sections, while retaining microscopic resolution. In this study, we develop a fast normal-light microscope image stitching algorithm based on feature extraction. First, an algorithm for scale-space reconstruction of speeded-up robust features (SURF) was proposed to extract features from the images to be stitched in a short time and with high repeatability. Second, the histogram equalization (HE) method was employed to preprocess the images to enhance their contrast for extracting more features. Third, the rough overlapping zones of the preprocessed images were calculated by phase correlation, and the improved SURF was used to extract the image features in the rough overlapping areas. Fourth, the features were matched and the transformation parameters were estimated, after which the images were blended seamlessly. Finally, this procedure was applied to stitch normal-light microscope images to verify its validity. Our experimental results demonstrate that the improved SURF algorithm is very robust to viewpoint, illumination, blur, rotation, and zoom of the images, and our method is able to stitch microscope images automatically with high precision and high speed. Also, the method proposed in this paper is applicable to the registration and stitching of common images, as well as to stitching microscope images in the field of virtual microscopy for the purposes of observing, exchanging, and saving images, and establishing a database of microscope images.
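
    The core pipeline (features, matching, transformation estimation, warping) can be sketched with OpenCV. ORB is used here in place of SURF, which ships only in the non-free opencv-contrib build; the input file names are placeholders, and the phase-correlation prefiltering step is omitted:

```python
import cv2
import numpy as np

# Load two overlapping microscope fields (placeholder file names).
img1 = cv2.imread("field_left.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("field_right.png", cv2.IMREAD_GRAYSCALE)
assert img1 is not None and img2 is not None

# Contrast enhancement before feature extraction, as in the paper.
img1, img2 = cv2.equalizeHist(img1), cv2.equalizeHist(img2)

# ORB stands in for SURF here (SURF requires opencv-contrib).
orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors and keep the best correspondences.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:200]
src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Robust transformation estimation, then warp into a shared canvas.
H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
h, w = img2.shape
canvas = cv2.warpPerspective(img1, H, (w * 2, h))
canvas[0:h, 0:w] = np.maximum(canvas[0:h, 0:w], img2)  # crude blend
cv2.imwrite("stitched.png", canvas)
```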

  15. Evaluation of an automated erythrocyte sedimentation rate analyzer as compared to the Westergren manual method in measurement of erythrocyte sedimentation rate

    Directory of Open Access Journals (Sweden)

    Arulselvi Subramanian

    2011-01-01

    Context: Monitor 100® (Electa Lab, Italy) is a newly developed automated method for measurement of the erythrocyte sedimentation rate (ESR). Aims: The aim of our study was to compare the ESR values from Monitor 100® against the standard Westergren method. Patients and Methods: This cross-sectional study was conducted at a Level I trauma care center on 200 patients. The samples were taken as per the recommendations charted out by the International Council for Standardization in Haematology (ICSH) for comparing automated methods with the manual Westergren method. Statistical Analysis Used: Bland-Altman analysis was applied to evaluate Monitor 100® against the conventional Westergren method. Results: The analysis revealed a low degree of agreement between the manual and automated methods, especially for higher ESR values: mean difference -11.2 (95% limits of agreement, -46.3 to 23.9) and mean difference -13.4 (95% limits of agreement, -58.9 to 32.1) for 1 and 2 hours, respectively. This discrepancy, which is of clinical significance, was less evident for ESR values in the normal range <25 mm/hour (mean difference -7.7; limits of agreement, -18.9 to 3.5). Conclusions: The fully automated Monitor 100® system for ESR measurement tends to underestimate the manual ESR readings. Hence it is recommended that a correction factor be applied across the range of ESR values while using this equipment. Further studies and validation experiments would be required.
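
    Bland-Altman agreement statistics of the kind reported here are straightforward to compute; the paired readings below are invented for illustration:

```python
import numpy as np

# Invented paired ESR readings (mm/h): manual Westergren vs. analyzer.
manual = np.array([5, 12, 25, 40, 62, 85, 98, 110], dtype=float)
auto = np.array([6, 10, 20, 31, 50, 70, 82, 95], dtype=float)

# Bland-Altman: mean difference (bias) and 95% limits of agreement.
diff = auto - manual
bias = diff.mean()
sd = diff.std(ddof=1)
lower, upper = bias - 1.96 * sd, bias + 1.96 * sd

print(f"bias = {bias:.1f} mm/h")
print(f"95% limits of agreement: {lower:.1f} to {upper:.1f} mm/h")
```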

  16. Automated analysis of heterogeneous carbon nanostructures by high-resolution electron microscopy and on-line image processing

    Energy Technology Data Exchange (ETDEWEB)

    Toth, P., E-mail: toth.pal@uni-miskolc.hu [Department of Chemical Engineering, University of Utah, 50 S. Central Campus Drive, Salt Lake City, UT 84112-9203 (United States); Farrer, J.K. [Department of Physics and Astronomy, Brigham Young University, N283 ESC, Provo, UT 84602 (United States); Palotas, A.B. [Department of Combustion Technology and Thermal Energy, University of Miskolc, H3515, Miskolc-Egyetemvaros (Hungary); Lighty, J.S.; Eddings, E.G. [Department of Chemical Engineering, University of Utah, 50 S. Central Campus Drive, Salt Lake City, UT 84112-9203 (United States)

    2013-06-15

    High-resolution electron microscopy is an efficient tool for characterizing heterogeneous nanostructures; however, currently the analysis is a laborious and time-consuming manual process. In order to be able to accurately and robustly quantify heterostructures, one must obtain a statistically high number of micrographs showing images of the appropriate sub-structures. The second step of analysis is usually the application of digital image processing techniques in order to extract meaningful structural descriptors from the acquired images. In this paper it will be shown that by applying on-line image processing and basic machine vision algorithms, it is possible to fully automate the image acquisition step; therefore, the number of acquired images in a given time can be increased drastically without the need for additional human labor. The proposed automation technique works by computing fields of structural descriptors in situ and thus outputs sets of the desired structural descriptors in real-time. The merits of the method are demonstrated by using combustion-generated black carbon samples. - Highlights: ► The HRTEM analysis of heterogeneous nanostructures is a tedious manual process. ► Automatic HRTEM image acquisition and analysis can improve data quantity and quality. ► We propose a method based on on-line image analysis for the automation of HRTEM image acquisition. ► The proposed method is demonstrated using HRTEM images of soot particles.

  17. AUTOMATION OF QUALITY CONTROL OF MILK HOMOGENIZATION BY ULTRASONIC SPECTROSCOPY METHODS

    OpenAIRE

    V. K. Bityukov; A. A. Khvostov; D. I. Rebrikov; V. E. Merzlikin

    2015-01-01

    The paper deals with the possibility of determining the homogenization degree of milk and dairy products from the absorption spectra of ultrasonic vibrations. The advantages of applying this method in automated manufacturing systems are examined. The theoretical background of the method is substantiated, as well as the possibility of determining the size distribution of fat globules in milk. We derived mathematical equations showing the relationship between the homogenization de...

  18. Automated multivariate analysis of comprehensive two-dimensional gas chromatograms of petroleum

    DEFF Research Database (Denmark)

    Skov, Søren Furbo

    of separated compounds makes the analysis of GC×GC chromatograms tricky, as there is too much data for manual analysis, and automated analysis is not always trouble-free: manual checking of the results is often necessary. In this work, I will investigate the possibility of another approach to the analysis of GC×GC...

  19. Ecological Momentary Assessments and Automated Time Series Analysis to Promote Tailored Health Care: A Proof-of-Principle Study

    Science.gov (United States)

    Emerencia, Ando C; Bos, Elisabeth H; Rosmalen, Judith GM; Riese, Harriëtte; Aiello, Marco; Sytema, Sjoerd; de Jonge, Peter

    2015-01-01

    Background Health promotion can be tailored by combining ecological momentary assessments (EMA) with time series analysis. This combined method allows for studying the temporal order of dynamic relationships among variables, which may provide concrete indications for intervention. However, application of this method in health care practice is hampered because analyses are conducted manually and advanced statistical expertise is required. Objective This study aims to show how this limitation can be overcome by introducing automated vector autoregressive modeling (VAR) of EMA data and to evaluate its feasibility through comparisons with results of previously published manual analyses. Methods We developed a Web-based open source application, called AutoVAR, which automates time series analyses of EMA data and provides output that is intended to be interpretable by nonexperts. The statistical technique we used was VAR. AutoVAR tests and evaluates all possible VAR models within a given combinatorial search space and summarizes their results, thereby replacing the researcher’s tasks of conducting the analysis, making an informed selection of models, and choosing the best model. We compared the output of AutoVAR to the output of a previously published manual analysis (n=4). Results An illustrative example consisting of 4 analyses was provided. Compared to the manual output, the AutoVAR output presents similar model characteristics and statistical results in terms of the Akaike information criterion, the Bayesian information criterion, and the test statistic of the Granger causality test. Conclusions Results suggest that automated analysis and interpretation of time series is feasible. Compared to a manual procedure, the automated procedure is more robust and can save days of time. These findings may pave the way for using time series analysis for health promotion on a larger scale. AutoVAR was evaluated using the results of a previously conducted manual analysis
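
    AutoVAR itself is a Web application, but its core idea, exhaustively fitting VAR models over a search space and ranking them by information criteria, can be illustrated in a few lines. The sketch below uses Python's statsmodels with synthetic two-variable EMA data; the variable names and lag range are purely illustrative and this is not AutoVAR's code.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Synthetic EMA diary: two daily variables over 90 days (illustrative).
rng = np.random.default_rng(0)
data = pd.DataFrame(rng.normal(size=(90, 2)), columns=["mood", "activity"])

# Fit candidate VAR models over a small lag-order search space.
fits = [VAR(data).fit(lag) for lag in range(1, 5)]
best = min(fits, key=lambda f: f.aic)   # rank by AIC (BIC works the same way)
print("selected lag order:", best.k_ar, "AIC:", round(best.aic, 2))

# Granger causality test within the selected model.
print(best.test_causality("mood", ["activity"]).summary())
```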

  20. High-resolution quantitative metabolome analysis of urine by automated flow injection NMR.

    Science.gov (United States)

    Da Silva, Laeticia; Godejohann, Markus; Martin, François-Pierre J; Collino, Sebastiano; Bürkle, Alexander; Moreno-Villanueva, María; Bernhardt, Jürgen; Toussaint, Olivier; Grubeck-Loebenstein, Beatrix; Gonos, Efstathios S; Sikora, Ewa; Grune, Tilman; Breusing, Nicolle; Franceschi, Claudio; Hervonen, Antti; Spraul, Manfred; Moco, Sofia

    2013-06-18

    Metabolism is essential to understand human health. To characterize human metabolism, a high-resolution read-out of the metabolic status under various physiological conditions, either in health or disease, is needed. Metabolomics offers an unprecedented approach for generating system-specific biochemical definitions of a human phenotype through the capture of a variety of metabolites in a single measurement. The emergence of large cohorts in clinical studies increases the demand for technologies able to analyze a large number of measurements, in an automated fashion, in the most robust way. NMR is an established metabolomics tool for obtaining metabolic phenotypes. Here, we describe the analysis of NMR-based urinary profiles for metabolic studies, challenged with a large human study (3007 samples). This method includes the acquisition of nuclear Overhauser effect spectroscopy one-dimensional and J-resolved two-dimensional (J-Res-2D) (1)H NMR spectra obtained on a 600 MHz spectrometer, equipped with a 120 μL flow probe, coupled to a flow-injection analysis system, in full automation under the control of a sample manager. Samples were acquired at a throughput of ~20 (or 40 when J-Res-2D is included) min/sample. The associated technical error over the full series of analyses is 12%, which demonstrates the robustness of the method. With the aim of describing an overall metabolomics workflow, the quantification of 36 metabolites, mainly related to central carbon metabolism and gut microbial host cometabolism, was obtained, as well as multivariate data analysis of the full spectral profiles. The metabolic read-outs generated using our analytical workflow can therefore be considered for further pathway modeling and/or biological interpretation.

  1. Prototype Software for Automated Structural Analysis of Systems

    DEFF Research Database (Denmark)

    Jørgensen, A.; Izadi-Zamanabadi, Roozbeh; Kristensen, M.

    2004-01-01

    In this paper we present a prototype software tool developed to analyse the structural model of automated systems in order to identify redundant information, which is then utilized for fault detection and isolation (FDI) purposes. The dedicated algorithms in this software tool use a tri...

  2. Automated cleaning of foraminifera shells before Mg/Ca analysis using a pipette robot

    Science.gov (United States)

    Johnstone, Heather J. H.; Steinke, Stephan; Kuhnert, Henning; Bickert, Torsten; Pälike, Heiko; Mohtadi, Mahyar

    2016-08-01

    The molar ratio of magnesium to calcium (Mg/Ca) in foraminiferal calcite is a widely used proxy for reconstructing past seawater temperatures. Thorough cleaning of tests is required before analysis to remove contaminant phases such as clay and organic matter. We have adapted a commercial pipette robot to automate an established cleaning procedure, the "Mg-cleaning" protocol of Barker et al. (2003). Efficiency of the automated nine-step method was assessed through monitoring Al/Ca of trial samples (GeoB4420-2 core catcher). Planktonic foraminifera Globigerinoides ruber, Globigerinoides sacculifer, and Neogloboquadrina dutertrei from this sample gave Mg/Ca consistent with the habitat range of the three species, and 40-60% sample recovery after cleaning. Comparison between manually cleaned and robot-cleaned samples of G. ruber (white) from a sediment core (GeoB16602) showed good correspondence between the two methods for Mg/Ca (r = 0.93); the mean difference between manually cleaned and robot-cleaned samples was 0.05 mmol/mol, showing that the samples are cleaned effectively by the robot. The robot offers increased sample throughput, as batch sizes of up to 88 samples/blanks can be processed in ~7 h with little intervention.

  3. Endoscope reprocessing methods: a prospective study on the impact of human factors and automation.

    Science.gov (United States)

    Ofstead, Cori L; Wetzler, Harry P; Snyder, Alycea K; Horton, Rebecca A

    2010-01-01

    The main cause of endoscopy-associated infections is failure to adhere to reprocessing guidelines. More information about factors impacting compliance is needed to support the development of effective interventions. The purpose of this multisite, observational study was to evaluate reprocessing practices, employee perceptions, and occupational health issues. Data were collected utilizing interviews, surveys, and direct observation. Written reprocessing policies and procedures were in place at all five sites, and employees affirmed the importance of most recommended steps. Nevertheless, observers documented guideline adherence for only 1.4% of endoscopes reprocessed using manual cleaning methods with automated high-level disinfection, versus 75.4% of those reprocessed using an automated endoscope cleaner and reprocessor. The majority of reprocessing staff reported health problems (i.e., pain, decreased flexibility, numbness, or tingling). Physical discomfort was associated with time spent reprocessing (p = .041). Discomfort diminished after installation of automated endoscope cleaners and reprocessors (p = .001). Enhanced training and accountability, combined with increased automation, may ensure guideline adherence and patient safety while improving employee satisfaction and health.

  4. The LBI-method for automated indexing of diagnoses by using SNOMED. Part 2. Evaluation.

    Science.gov (United States)

    Brigl, B; Mieth, M; Haux, R; Glück, E

    1995-02-01

    We present a simple, formal, lexicon-based method for automated indexing of diagnoses based on the Systematized Nomenclature of Medicine (SNOMED), called LBI-method. Part 1 gave an introduction to the LBI-method and presented its realisation as application system SALBIDH. Part 2 presents the design and the results of an evaluation study to judge the quality of the LBI-method. In this evaluation study the quality of automated indexing as well as the quality of the retrieval of patient data by using automated indexed diagnoses was examined. The results show that the retrieval based on SNOMED indices is at least as good as the retrieval based on ICD classes despite a lot of indexing errors. From this we gather that our system is not yet good enough for immediate routine use but that an appropriate indexing quality and, as a result, a higher retrieval quality can be achieved after few improvements of the LBI-method, especially after revision of the lexicons.

  5. RoboSCell: An automated single cell arraying and analysis instrument

    KAUST Repository

    Sakaki, Kelly

    2009-09-09

    Single cell research has the potential to revolutionize experimental methods in biomedical sciences and contribute to clinical practices. Recent studies suggest analysis of single cells reveals novel features of intracellular processes, cell-to-cell interactions and cell structure. The methods of single cell analysis require mechanical resolution and accuracy that is not possible using conventional techniques. Robotic instruments and novel microdevices can achieve higher throughput and repeatability; however, the development of such instrumentation is a formidable task. A void exists in the state-of-the-art for automated analysis of single cells. With the increase in interest in single cell analyses in stem cell and cancer research, the ability to facilitate higher throughput and repeatable procedures is necessary. In this paper, a high-throughput, single cell microarray-based robotic instrument, called the RoboSCell, is described. The proposed instrument employs a partially transparent single cell microarray (SCM) integrated with a robotic biomanipulator for in vitro analyses of live single cells trapped at the array sites. Cells, labeled with immunomagnetic particles, are captured at the array sites by channeling magnetic fields through encapsulated permalloy channels in the SCM. The RoboSCell is capable of systematically scanning the captured cells temporarily immobilized at the array sites and using optical methods to repeatedly measure extracellular and intracellular characteristics over time. The instrument's capabilities are demonstrated by arraying human T lymphocytes and measuring the uptake dynamics of calcein acetoxymethylester, all in a fully automated fashion. © 2009 Springer Science+Business Media, LLC.

  6. Quantification of Eosinophilic Granule Protein Deposition in Biopsies of Inflammatory Skin Diseases by Automated Image Analysis of Highly Sensitive Immunostaining

    Directory of Open Access Journals (Sweden)

    Peter Kiehl

    1999-01-01

    Full Text Available Eosinophilic granulocytes are major effector cells in inflammation. Extracellular deposition of toxic eosinophilic granule proteins (EGPs), but not the presence of intact eosinophils, is crucial for their functional effect in situ. As even recent morphometric approaches to quantify the involvement of eosinophils in inflammation have been based only on cell counting, we developed a new method for the cell-independent quantification of EGPs by image analysis of immunostaining. Highly sensitive, automated immunohistochemistry was done on paraffin sections of inflammatory skin diseases with 4 different primary antibodies against EGPs. Image analysis of immunostaining was performed by colour translation, linear combination and automated thresholding. Using strictly standardized protocols, the assay was proven to be specific and accurate concerning segmentation in 8916 fields of 520 sections, well reproducible in repeated measurements and reliable over a 16-week observation period. The method may be valuable for the cell-independent segmentation of immunostaining in other applications as well.

  7. Long-term live cell imaging and automated 4D analysis of drosophila neuroblast lineages.

    Directory of Open Access Journals (Sweden)

    Catarina C F Homem

    Full Text Available The developing Drosophila brain is a well-studied model system for neurogenesis and stem cell biology. In the Drosophila central brain, around 200 neural stem cells called neuroblasts undergo repeated rounds of asymmetric cell division. These divisions typically generate a larger self-renewing neuroblast and a smaller ganglion mother cell that undergoes one terminal division to create two differentiating neurons. Although single mitotic divisions of neuroblasts can easily be imaged in real time, the lack of long term imaging procedures has limited the use of neuroblast live imaging for lineage analysis. Here we describe a method that allows live imaging of cultured Drosophila neuroblasts over multiple cell cycles for up to 24 hours. We describe a 4D image analysis protocol that can be used to extract cell cycle times and growth rates from the resulting movies in an automated manner. We use it to perform lineage analysis in type II neuroblasts where clonal analysis has indicated the presence of a transit-amplifying population that potentiates the number of neurons. Indeed, our experiments verify type II lineages and provide quantitative parameters for all cell types in those lineages. As defects in type II neuroblast lineages can result in brain tumor formation, our lineage analysis method will allow more detailed and quantitative analysis of tumorigenesis and asymmetric cell division in the Drosophila brain.

  8. Automated analysis of radiation damage on plastic surfaces; Analisis automatizado de danos por radiacion en superficies plasticas

    Energy Technology Data Exchange (ETDEWEB)

    Andrade, C.; Camacho M, E.; Tavera, L.; Balcazar, M. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)

    1990-02-15

    Analysis of radiation-induced damage in a polymer characterized by better optical surface polish, uniformity, and chemical resistance than acrylic; it is resistant to temperatures of up to 150 degrees Celsius and weighs approximately half as much as glass. An objective of this work is the development of a method to analyze, in an automated fashion, radiation-induced surface damage in plastic materials by means of an image analyzer. (Author)

  9. Alert management for home healthcare based on home automation analysis.

    Science.gov (United States)

    Truong, T T; de Lamotte, F; Diguet, J-Ph; Said-Hocine, F

    2010-01-01

    Rising healthcare costs for elderly and disabled people can be contained by offering people autonomy at home by means of information technology. In this paper, we present an original, sensorless alert management solution that performs multimedia and home automation service discrimination and extracts highly regular home activities for use as virtual sensors for alert management. The results on simulated data, based on a real context, allow us to evaluate our approach before application to real data.

  10. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N;

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either...

  11. Comparison of Automated Continuous Flow Method With Shake- Flask Method in Determining Partition Coefficients of Bidentate Hydroxypyridinone Ligands

    Directory of Open Access Journals (Sweden)

    Lotfollah Saghaie

    2003-08-01

    Full Text Available The partition coefficients (Kpart) in the octanol/water system of a range of bidentate ligands containing the 3-hydroxypyridin-4-one moiety were determined using the shake-flask and automated continuous flow (filter probe) methods. The shake-flask method was used for extremely hydrophilic or hydrophobic compounds with Kpart values greater than 100 or less than 0.01. For the other ligands, which possess moderate lipophilicity (Kpart values between 0.01 and 100), the filter probe method was used. The partition coefficients of four ligands with moderate lipophilicity were also determined by the shake-flask method in order to check the comparability of the two methods. While the shake-flask method was able to handle extremely hydrophilic or hydrophobic compounds efficiently, the filter probe method was unable to measure such Kpart values. Although determination of the Kpart values of all compounds is possible with the classical shake-flask method, the procedure is time consuming. In contrast, the filter probe method offers many advantages over the traditional shake-flask method in terms of speed, efficiency of separation and degree of automation. The shake-flask method is the method of choice for determination of partition coefficients of extremely hydrophilic and hydrophobic ligands.

  12. Assessment of paclitaxel induced sensory polyneuropathy with "Catwalk" automated gait analysis in mice.

    Directory of Open Access Journals (Sweden)

    Petra Huehnchen

    Full Text Available Neuropathic pain as a symptom of sensory nerve damage is a frequent side effect of chemotherapy. The most common behavioral observation in animal models of chemotherapy induced polyneuropathy is the development of mechanical allodynia, which is quantified with von Frey filaments. The data from one study, however, cannot be easily compared with other studies owing to influences of environmental factors, inter-rater variability and differences in test paradigms. To overcome these limitations, automated quantitative gait analysis was proposed as an alternative, but its usefulness for assessing animals suffering from polyneuropathy has remained unclear. In the present study, we used a novel mouse model of paclitaxel induced polyneuropathy to compare results from electrophysiology and the von Frey method to gait alterations measured with the Catwalk test. To mimic recently improved clinical treatment strategies of gynecological malignancies, we established a mouse model of dose-dense paclitaxel therapy on the common C57Bl/6 background. In this model paclitaxel treated animals developed mechanical allodynia as well as reduced caudal sensory nerve action potential amplitudes indicative of a sensory polyneuropathy. Gait analysis with the Catwalk method detected distinct alterations of gait parameters in animals suffering from sensory neuropathy, revealing a minimized contact of the hind paws with the floor. Treatment of mechanical allodynia with gabapentin improved altered dynamic gait parameters. This study establishes a novel mouse model for investigating the side effects of dose-dense paclitaxel therapy and underlines the usefulness of automated gait analysis as an additional easy-to-use objective test for evaluating painful sensory polyneuropathy.

  13. Use of automated video analysis for the evaluation of bicycle movement and interaction

    Science.gov (United States)

    Twaddle, Heather; Schendzielorz, Tobias; Fakler, Oliver; Amini, Sasan

    2014-03-01

    With the purpose of developing valid models of microscopic bicycle behavior, a large quantity of video data is collected at three busy urban intersections in Munich, Germany. Due to the volume of data, manual processing is infeasible and an automated or semi-automated analysis method must be implemented. The open-source software "Traffic Intelligence" is used and extended to analyze the collected video data with regard to research questions concerning the tactical behavior of bicyclists. In a first step, the feature detection parameters, the tracking parameters and the object grouping parameters are calibrated, making it possible to accurately track and group the objects at intersections used by large volumes of motor vehicles, bicycles and pedestrians. The resulting parameters for the three intersections are presented. A methodology for the classification of road users as cars, bicycles or pedestrians is presented and evaluated. This is achieved by making hypotheses about which features belong to cars or to bicycles and pedestrians, and using grouping parameters specified for that road user group to cluster the features into objects. These objects are then classified based on their dynamic characteristics. A classification structure for the maneuvers of different road users is presented and future applications are discussed.

  14. A fully automated linear polyacrylamide coating and regeneration method for capillary electrophoresis of proteins.

    Science.gov (United States)

    Bodnar, Judit; Hajba, Laszlo; Guttman, Andras

    2016-12-01

    Surface modification of the inner capillary wall in CE of proteins is frequently required to alter EOF and to prevent protein adsorption. Manual protocols for such coating techniques are cumbersome. In this paper, an automated covalent linear polyacrylamide coating and regeneration process is described to support long-term stability of fused-silica capillaries for protein analysis. The stability of the resulting capillary coatings was evaluated by a large number of separations using a three-protein test mixture in pH 6 and 3 buffer systems. The results were compared to that obtained with the use of bare fused-silica capillaries. If necessary, the fully automated capillary coating process was easily applied to regenerate the capillary to extend its useful life-time.

  15. Automated Dermoscopy Image Analysis of Pigmented Skin Lesions

    Directory of Open Access Journals (Sweden)

    Alfonso Baldi

    2010-03-01

    Full Text Available Dermoscopy (dermatoscopy, epiluminescence microscopy) is a non-invasive diagnostic technique for the in vivo observation of pigmented skin lesions (PSLs), allowing a better visualization of surface and subsurface structures (from the epidermis to the papillary dermis). This diagnostic tool permits the recognition of morphologic structures not visible to the naked eye, thus opening a new dimension in the analysis of the clinical morphologic features of PSLs. In order to reduce the learning curve of non-expert clinicians and to mitigate problems inherent in the reliability and reproducibility of the diagnostic criteria used in pattern analysis, several indicative methods based on diagnostic algorithms have been introduced in the last few years. Recently, numerous systems designed to provide computer-aided analysis of digital images obtained by dermoscopy have been reported in the literature. The goal of this article is to review these systems, focusing on the most recent approaches based on content-based image retrieval (CBIR) systems.

  16. The BoneXpert method for automated determination of skeletal maturity

    DEFF Research Database (Denmark)

    Thodberg, Hans Henrik; Kreiborg, Sven; Juul, Anders

    2009-01-01

    Bone age rating is associated with considerable variability from the human interpretation, and this is the motivation for presenting a new method for automated determination of bone age (skeletal maturity). The method, called BoneXpert, reconstructs, from radiographs of the hand, the borders of 15 bones automatically and then computes "intrinsic" bone ages for each of 13 bones (radius, ulna, and 11 short bones). Finally, it transforms the intrinsic bone ages into Greulich-Pyle (GP) or Tanner-Whitehouse (TW) bone age. The bone reconstruction method automatically rejects images with abnormal...

  17. A method for the automated, reliable retrieval of publication-citation records.

    Directory of Open Access Journals (Sweden)

    Derek Ruths

    Full Text Available BACKGROUND: Publication records and citation indices often are used to evaluate academic performance. For this reason, obtaining or computing them accurately is important. This can be difficult, largely due to a lack of complete knowledge of an individual's publication list and/or lack of time available to manually obtain or construct the publication-citation record. While online publication search engines have somewhat addressed these problems, using raw search results can yield inaccurate estimates of publication-citation records and citation indices. METHODOLOGY: In this paper, we present a new, automated method that produces estimates of an individual's publication-citation record from an individual's name and a set of domain-specific vocabulary that may occur in the individual's publication titles. Because this vocabulary can be harvested directly from a research web page or online (partial) publication list, our method delivers an easy way to obtain estimates of a publication-citation record and the relevant citation indices. Our method works by applying a series of stringent name and content filters to the raw publication search results returned by an online publication search engine. In this paper, our method is run using Google Scholar, but the underlying filters can be easily applied to any existing publication search engine. When compared against a manually constructed data set of individuals and their publication-citation records, our method provides significant improvements over raw search results. The estimated publication-citation records returned by our method have an average sensitivity of 98% and specificity of 72% (in contrast to a raw search result specificity of less than 10%). When citation indices are computed using these records, the estimated indices are within 10% of the true value, compared to raw search results, which overestimate by, on average, 75%. CONCLUSIONS: These results confirm that our method provides
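
    The name and content filtering at the heart of the method is simple to sketch. The records and vocabulary below are hypothetical, and no search-engine API is called; this illustrates only the filtering logic, not the authors' code.

```python
# Hypothetical raw search results as (title, author string) pairs.
results = [
    ("Flux balance analysis of metabolic networks", "D. Ruths, L. Nakhleh"),
    ("A history of garden gnomes", "D. Ruth"),
    ("Signaling pathway inference from perturbations", "Derek Ruths"),
]

surname = "ruths"                                   # name filter target
vocabulary = {"metabolic", "pathway", "signaling"}  # harvested title terms

def keep(title, authors):
    # Name filter: the target surname must appear in the author string.
    if surname not in authors.lower():
        return False
    # Content filter: the title must share a term with the vocabulary.
    return bool(vocabulary & set(title.lower().split()))

print([title for title, authors in results if keep(title, authors)])
```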

  18. Automated Dissolution for Enteric-Coated Aspirin Tablets: A Case Study for Method Transfer to a RoboDis II.

    Science.gov (United States)

    Ibrahim, Sarah A; Martini, Luigi

    2014-08-01

    Dissolution method transfer is a complicated yet common process in the pharmaceutical industry. With increased pharmaceutical product manufacturing and dissolution acceptance requirements, dissolution testing has become one of the most labor-intensive quality control testing methods. There is an increased trend for automation in dissolution testing, particularly for large pharmaceutical companies to reduce variability and increase personnel efficiency. There is no official guideline for dissolution testing method transfer from a manual, semi-automated, to automated dissolution tester. In this study, a manual multipoint dissolution testing procedure for an enteric-coated aspirin tablet was transferred effectively and reproducibly to a fully automated dissolution testing device, RoboDis II. Enteric-coated aspirin samples were used as a model formulation to assess the feasibility and accuracy of media pH change during continuous automated dissolution testing. Several RoboDis II parameters were evaluated to ensure the integrity and equivalency of dissolution method transfer from a manual dissolution tester. This current study provides a systematic outline for the transfer of the manual dissolution testing protocol to an automated dissolution tester. This study further supports that automated dissolution testers compliant with regulatory requirements and similar to manual dissolution testers facilitate method transfer.

  19. Method and System for Protection of Automated Control Systems for “Smart Buildings”

    Directory of Open Access Journals (Sweden)

    Dmitry Mikhaylov

    2013-07-01

    Full Text Available The paper is related to a system and method for the protection of an automated control system (ACS) against unauthorized devices connected to the ACS via wired or wireless channels, which substantially obviates the disadvantages of the related art. The protection system monitors the signals spreading in the network, analyzing the performance of the network for malicious code or hidden connections of an attacker. The system is developed specifically for this purpose, and it can protect industrial control systems more effectively than standard anti-virus programs. Specific anti-virus software installed on a central server of the automated control system protects it from software-based attacks from both internal and external offenders. The system comprises a plurality of bus protection devices of different types, including any of a twisted-pair protection device, a power-line protection device, an On-Board Diagnostics signal protocol protection device, and a wireless protection device.

  20. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2012-12-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing their first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 TB of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize data range, persistence, and stochasticity for each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.
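
    Two of the plausibility tests described, a range test and a persistence test, can be sketched as binary checks in a few lines. The thresholds below are illustrative assumptions, not NEON's operational values.

```python
import numpy as np

def range_test(x, lo, hi):
    """Flag samples outside the plausible [lo, hi] range (1 = fail)."""
    x = np.asarray(x, float)
    return ((x < lo) | (x > hi)).astype(int)

def persistence_test(x, window, min_var):
    """Flag windows whose variance is suspiciously low (stuck sensor)."""
    x = np.asarray(x, float)
    flags = np.zeros(x.size, int)
    for i in range(x.size - window + 1):
        if np.var(x[i:i + window]) < min_var:
            flags[i:i + window] = 1
    return flags

temps = np.array([21.0, 21.2, 21.2, 21.2, 21.2, 21.2, 55.0])
print(range_test(temps, -40.0, 45.0))      # only the 55.0 spike fails
print(persistence_test(temps, 5, 1e-3))    # the flat run is flagged
```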

  1. Automated quality control methods for sensor data: a novel observatory approach

    Directory of Open Access Journals (Sweden)

    J. R. Taylor

    2013-07-01

    Full Text Available National and international networks and observatories of terrestrial-based sensors are emerging rapidly. As such, there is demand for a standardized approach to data quality control, as well as interoperability of data among sensor networks. The National Ecological Observatory Network (NEON) has begun constructing their first terrestrial observing sites, with 60 locations expected to be distributed across the US by 2017. This will result in over 14 000 automated sensors recording more than 100 TB of data per year. These data are then used to create other datasets and subsequent "higher-level" data products. In anticipation of this challenge, an overall data quality assurance plan has been developed and the first suite of data quality control measures defined. This data-driven approach focuses on automated methods for defining a suite of plausibility test parameter thresholds. Specifically, these plausibility tests scrutinize the data range and variance of each measurement type by employing a suite of binary checks. The statistical basis for each of these tests is developed, and the methods for calculating test parameter thresholds are explored here. While these tests have been used elsewhere, we apply them in a novel approach by calculating their relevant test parameter thresholds. Finally, implementing automated quality control is demonstrated with preliminary data from a NEON prototype site.

  2. Comprehensive automation of the solid phase extraction gas chromatographic mass spectrometric analysis (SPE-GC/MS) of opioids, cocaine, and metabolites from serum and other matrices.

    Science.gov (United States)

    Lerch, Oliver; Temme, Oliver; Daldrup, Thomas

    2014-07-01

    The analysis of opioids, cocaine, and metabolites from blood serum is a routine task in forensic laboratories. Commonly, the employed methods include many manual or partly automated steps like protein precipitation, dilution, solid phase extraction, evaporation, and derivatization preceding a gas chromatography (GC)/mass spectrometry (MS) or liquid chromatography (LC)/MS analysis. In this study, a comprehensively automated method was developed from a validated, partly automated routine method. This was possible by replicating the method parameters on the automated system; only marginal optimization of parameters was necessary. The automation, relying on an x-y-z robot after manual protein precipitation, includes the solid phase extraction, evaporation of the eluate, derivatization (silylation with N-methyl-N-trimethylsilyltrifluoroacetamide, MSTFA), and injection into a GC/MS. A quantitative analysis of almost 170 authentic serum samples and more than 50 authentic samples of other matrices such as urine, different tissues, and heart blood for cocaine, benzoylecgonine, methadone, morphine, codeine, 6-monoacetylmorphine, dihydrocodeine, and 7-aminoflunitrazepam was conducted with both methods, proving that the analytical results are equivalent even near the limits of quantification (low ng/ml range). To the best of our knowledge, this application is the first one reported in the literature employing this sample preparation system.

  3. Mass asymmetry and tricyclic wobble motion assessment using automated launch video analysis

    Institute of Scientific and Technical Information of China (English)

    Ryan DECKER; Joseph DONINI; William GARDNER; Jobin JOHN; Walter KOENIG

    2016-01-01

    This paper describes an approach to identify epicyclic and tricyclic motion during projectile flight caused by mass asymmetries in spin-stabilized projectiles. Flight video was captured following projectile launch of several M110A2E1 155 mm artillery projectiles. These videos were then analyzed using the automated flight video analysis method to attain their initial position and orientation histories. Examination of the pitch and yaw histories clearly indicates that in addition to epicyclic motion’s nutation and precession oscillations, an even faster wobble amplitude is present during each spin revolution, even though some of the amplitudes of the oscillation are smaller than 0.02 degree. The results are compared to a sequence of shots where little appreciable mass asymmetries were present, and only nutation and precession frequencies are predominantly apparent in the motion history results. Magnitudes of the wobble motion are estimated and compared to product of inertia measurements of the asymmetric projectiles.

  4. Automation of C-terminal sequence analysis of 2D-PAGE separated proteins

    Directory of Open Access Journals (Sweden)

    P.P. Moerman

    2014-06-01

    Full Text Available Experimental assignment of the protein termini remains essential to define the functional protein structure. Here, we report on the improvement of a proteomic C-terminal sequence analysis method. The approach aims to discriminate the C-terminal peptide in a CNBr digest, where cleavage of Met-Xxx peptide bonds leaves internal peptides ending in a homoserine lactone (hsl) derivative. pH-dependent partial opening of the lactone ring results in the formation of doublets for all internal peptides. C-terminal peptides are distinguished as singlet peaks by MALDI-TOF MS, and MS/MS is then used for their identification. We present a fully automated protocol established on a robotic liquid-handling station.

  5. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    Energy Technology Data Exchange (ETDEWEB)

    Sudowe, Ralf [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program and Health Physics Dept.; Roman, Audrey [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Dailey, Ashlee [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Go, Elaine [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program

    2013-07-18

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and is also exhibiting signs of a possible synergistic effect in the presence of iron.

  6. SigMate: a Matlab-based automated tool for extracellular neuronal signal processing and analysis.

    Science.gov (United States)

    Mahmud, Mufti; Bertoldo, Alessandra; Girardi, Stefano; Maschietto, Marta; Vassanelli, Stefano

    2012-05-30

    Rapid advances in neuronal probe technology for multisite recording of brain activity have posed a significant challenge to neuroscientists for processing and analyzing the recorded signals. To be able to infer meaningful conclusions quickly and accurately from large datasets, automated and sophisticated signal processing and analysis tools are required. This paper presents SigMate, a novel Matlab-based tool incorporating standard methods to analyze spikes and EEG signals, and in-house solutions for local field potential (LFP) analysis. The available modules at present are: 1. in-house developed algorithms for data display (2D and 3D), file operations (file splitting, file concatenation, and file column rearranging), baseline correction, slow stimulus artifact removal, noise characterization and signal quality assessment, current source density (CSD) analysis, latency estimation from LFPs and CSDs, determination of cortical layer activation order using LFPs and CSDs, and single LFP clustering; 2. existing modules for spike detection, sorting and spike train analysis, and EEG signal analysis. SigMate has the flexibility of analyzing multichannel signals as well as signals from multiple recording sources. The in-house developed tools for LFP analysis have been extensively tested with signals recorded using standard extracellular recording electrodes, and planar and implantable multi-transistor array (MTA) based neural probes. SigMate will be disseminated shortly to the neuroscience community under the open-source GNU General Public License.
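
    SigMate's CSD module is only listed above, but the standard CSD estimate it refers to is the negative second spatial derivative of the LFP across recording depths. A minimal NumPy sketch of that estimator follows; the conductivity and electrode spacing are illustrative assumptions, and this is not SigMate's actual code.

```python
import numpy as np

def csd_estimate(lfp, spacing_m, sigma=0.3):
    """Second-spatial-derivative CSD estimate.

    lfp: array (channels, samples), channels ordered by depth.
    sigma: extracellular conductivity in S/m (illustrative value).
    Returns an array of shape (channels - 2, samples).
    """
    lfp = np.asarray(lfp, float)
    return -sigma * np.diff(lfp, n=2, axis=0) / spacing_m**2

lfp = np.random.default_rng(1).normal(size=(16, 1000))  # fake 16-channel probe
print(csd_estimate(lfp, spacing_m=100e-6).shape)        # (14, 1000)
```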

  7. Comparison of Automated Image-Based Grain Sizing to Standard Pebble Count Methods

    Science.gov (United States)

    Strom, K. B.

    2009-12-01

    This study explores the use of an automated, image-based method for characterizing grain-size distributions (GSDs) of exposed, open-framework gravel beds. This was done by comparing the GSDs measured with an image-based method to distributions obtained with two pebble-count methods. Selection of grains for the two pebble-count methods was carried out using a gridded sampling frame and the heel-to-toe Wolman walk method at six field sites. At each site, 500-particle pebble-count samples were collected with each of the two pebble-count methods, and digital images were systematically collected over the same sampling area. For the methods used, the pebble counts collected with the gridded sampling frame were assumed to be the most accurate representation of the true grain-size population, and results from the image-based method were compared to the grid-derived GSDs for accuracy estimates; comparisons between the grid and Wolman walk methods were conducted to give an indication of possible variation between commonly used methods at each field site. Comparisons of grain size were made at two spatial scales. At the larger scale, results from the image-based method were integrated over the sampling area required to collect the 500-particle pebble-count samples. At the smaller sampling scale, the image-derived GSDs were compared to those from 100-particle, pebble-count samples obtained with the gridded sampling frame. The comparisons show that the image-based method performed reasonably well on five of the six study sites. For those five sites, the image-based method slightly underestimated all grain-size percentiles relative to the pebble counts collected with the gridded sampling frame. The average bias for Ψ5, Ψ50, and Ψ95 between the image and grid count methods at the larger sampling scale was 0.07Ψ, 0.04Ψ, and 0.19Ψ, respectively; at the smaller sampling scale the average bias was 0.004Ψ, 0.03Ψ, and 0.18Ψ, respectively. The average bias between the

  8. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  9. Automated analysis of image mammogram for breast cancer diagnosis

    Science.gov (United States)

    Nurhasanah, Sampurno, Joko; Faryuni, Irfana Diah; Ivansyah, Okto

    2016-03-01

    Medical imaging helps doctors diagnose and detect diseases inside the body without surgery. A mammogram is a medical image of the inner breast. Diagnosis of breast cancer needs to be done in detail and as soon as possible to determine the next medical treatment. The aim of this work is to increase the objectivity of clinical diagnosis by using fractal analysis. This study applies a fractal method based on 2D Fourier analysis to determine the density of normal and abnormal tissue, and applies a segmentation technique based on the K-Means clustering algorithm to abnormal images to determine the boundary of the organ and calculate the area of the organ from the segmentation results. The results show that the fractal method based on 2D Fourier analysis can be used to distinguish between normal and abnormal breast tissue, and that the segmentation technique with the K-Means clustering algorithm is able to generate the boundaries of normal and abnormal tissue, so the area of abnormal tissue can be determined.
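
    The K-Means segmentation step pairs naturally with a short sketch. Below, scikit-learn clusters the pixel intensities of a synthetic stand-in image (not a real mammogram) into two groups, and the pixel count of the brighter cluster is taken as the segmented area; the brightest-cluster heuristic is an assumption for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

# Synthetic stand-in image: dark background with a brighter region.
rng = np.random.default_rng(2)
img = rng.normal(0.2, 0.05, size=(64, 64))
img[20:30, 25:40] += 0.6

labels = KMeans(n_clusters=2, n_init=10, random_state=0) \
    .fit_predict(img.reshape(-1, 1)).reshape(img.shape)

# Treat the cluster with the higher mean intensity as tissue of interest.
bright = np.argmax([img[labels == k].mean() for k in range(2)])
print("segmented area (pixels):", int((labels == bright).sum()))
```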

  10. Evaluation of robot automated chromogenic substrate LAL endotoxin assay method for pharmaceutical products testing.

    Science.gov (United States)

    Tsuji, K; Martin, P A

    1985-01-01

    The robot automated chromogenic substrate LAL assay method was evaluated for endotoxin testing using three lots each of 12 pharmaceutical products. As many as 216 assays, including automated standard curve construction and sample preparation, can be performed in a single day of unattended operation. The method is linear (r > 0.99) in the range of 0 to 0.2 EU/ml. The precision of the method, determined by assaying a lot of calcium gluconate for four days, was 6%, 10%, and 10% within an assay block, between assay blocks, and between assay days, respectively. Recovery of endotoxin spiked into products ranged from 81% to 110% and was within the statistical variation (2 sigma limit) of the method. The endotoxin levels detected in a biological raw material by the chromogenic substrate assay method correlated well with those of the gel-clot LAL assay method. The endotoxin content of the majority of the pharmaceutical products tested was well below the sensitivity of both the chromogenic substrate and the gel-clot LAL assay methods.

  11. Automated analysis of images acquired with electronic portal imaging device during delivery of quality assurance plans for inversely optimized arc therapy

    DEFF Research Database (Denmark)

    Fredh, Anna; Korreman, Stine; Rosenschöld, Per Munck af

    2010-01-01

    This work presents an automated method for comprehensively analyzing EPID images acquired for quality assurance of RapidArc treatment delivery. In-house-developed software has been used for the analysis, and long-term results from measurements on three linacs are presented.

  12. A New Automated Design Method Based on Machine Learning for CMOS Analog Circuits

    Science.gov (United States)

    Moradi, Behzad; Mirzaei, Abdolreza

    2016-11-01

    A new simulation-based automated CMOS analog circuit design method, which applies a multi-objective non-Darwinian-type evolutionary algorithm based on the Learnable Evolution Model (LEM), is proposed in this article. The multi-objective property of this automated design of CMOS analog circuits is governed by a modified Strength Pareto Evolutionary Algorithm (SPEA) incorporated in the LEM algorithm presented here. LEM includes a machine learning method, such as decision trees, that makes a distinction between high- and low-fitness areas in the design space. The learning process can detect the right directions of the evolution and lead to large steps in the evolution of the individuals. The learning phase shortens the evolution process and brings a remarkable reduction in the number of individual evaluations. The expert designer's knowledge of the circuit is applied in the design process in order to reduce the design space as well as the design time. Circuit evaluation is performed by the HSPICE simulator. In order to improve the design accuracy, the bsim3v3 CMOS transistor model is adopted in this proposed design method. The proposed design method is tested on three different operational amplifier circuits, and its performance is verified by comparing it with the evolutionary strategy algorithm and other similar methods.

  13. Image patch-based method for automated classification and detection of focal liver lesions on CT

    Science.gov (United States)

    Safdari, Mustafa; Pasari, Raghav; Rubin, Daniel; Greenspan, Hayit

    2013-03-01

    We developed a method for automated classification and detection of liver lesions in CT images based on image patch representation and bag-of-visual-words (BoVW) analysis. BoVW analysis has been extensively used in the computer vision domain to analyze scenery images. In the current work we discuss how it can be used for liver lesion classification and detection. The methodology includes building a dictionary for a training set using local descriptors and representing a region in the image by a visual-word histogram. Two tasks are described: a classification task, for lesion characterization, and a detection task, in which a scan window moves across the image and is determined to be normal liver tissue or a lesion. Data: In the classification task, 73 CT images of liver lesions were used: 25 images with cysts, 24 with metastases and 24 with hemangiomas. A radiologist circumscribed the lesions, creating a region of interest (ROI) in each of the images, and then provided the diagnosis, which was established either by biopsy or by clinical follow-up. Thus our data set comprises 73 images and 73 ROIs. In the detection task, a radiologist drew ROIs around each liver lesion and two regions of normal liver, for a total of 159 liver lesion ROIs and 146 normal liver ROIs. The radiologist also demarcated the liver boundary. Results: Classification accuracy of more than 95% was obtained. In the detection task, the F1 score obtained is 0.76, with a recall of 84% and a precision of 73%. The results show the ability to detect lesions regardless of shape.
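
    The BoVW pipeline described, clustering local descriptors into a dictionary and representing each ROI as a visual-word histogram, can be sketched with scikit-learn. Raw random patches stand in for the paper's local descriptors, and the ROIs and labels are synthetic; this illustrates the pipeline shape, not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(3)

def patches(image, size=8, n=50):
    """Sample n random square patches as raw local descriptors."""
    h, w = image.shape
    ys = rng.integers(0, h - size, n)
    xs = rng.integers(0, w - size, n)
    return np.stack([image[y:y+size, x:x+size].ravel() for y, x in zip(ys, xs)])

# Synthetic ROIs and labels (0=cyst, 1=metastasis, 2=hemangioma).
rois = [rng.normal(loc=l, size=(64, 64)) for l in (0, 1, 2) for _ in range(8)]
labels = [l for l in (0, 1, 2) for _ in range(8)]

dictionary = KMeans(n_clusters=20, n_init=5, random_state=0)
dictionary.fit(np.vstack([patches(r) for r in rois]))     # learn visual words

def bovw_histogram(roi):
    words = dictionary.predict(patches(roi))
    return np.bincount(words, minlength=20) / len(words)  # normalized histogram

X = np.stack([bovw_histogram(r) for r in rois])
clf = SVC(kernel="linear").fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```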

  14. Automated method for determining the flow of surface functionalized nanoparticles through a hydraulically fractured mineral formation using plasmonic silver nanoparticles.

    Science.gov (United States)

    Maguire-Boyle, Samuel J; Garner, David J; Heimann, Jessica E; Gao, Lucy; Orbaek, Alvin W; Barron, Andrew R

    2014-02-01

    Quantifying nanoparticle (NP) transport within porous geological media is imperative in the design of tracers and sensors to monitor the environmental impact of hydraulic fracturing, which has seen increasing concern in recent years, in particular over the potential pollution and contamination of aquifers. Because the surface chemistry of a NP defines many of its solubility and transport properties, there is a wide range of functionality that it is desirable to screen for optimum transport. Most prior transport methods are limited to determining whether significant adsorption of a NP occurs over a limited column distance; translating this to effects over large distances is difficult. Herein we report an automated method that allows for the simulation of adsorption effects of a dilute nanoparticle solution over large distances under a range of solution parameters. Using plasmonic silver NPs and UV-visible spectroscopic detection allows for low concentrations to be used while offering greater consistency in peak absorbance, leading to a higher degree of data reliability and better statistics. As an example, breakthrough curves were determined for mercaptosuccinic acid (MSA) and cysteamine (CYS) functionalized Ag NPs passing through an Ottawa sand (typical proppant material) immobile phase (C) or bypassing the immobile phase (C0). Automation allows for multiple sequences, such that the absorbance plateau after each breakthrough and the rate of breakthrough can be compared over multiple runs to provide statistical analysis. The mobility of the NPs as a function of pH is readily determined. The stickiness (α) of the NP to the immobile phase, calculated from the C/C0 ratio, shows that MSA-Ag NPs have good mobility, with a slight decrease around neutral pH, while CYS-Ag NPs show an almost sinusoidal variation. The automated process described herein allows for rapid screening of NP functionality, as a function of immobile phase (proppant versus reservoir material), hydraulic
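
    The abstract does not give the expression used to compute the stickiness α from C/C0, so the sketch below assumes the standard clean-bed colloid filtration theory relation, α = -2 d_c ln(C/C0) / [3 (1 - θ) η0 L]; all numerical values are illustrative, not the study's data.

```python
import numpy as np

def attachment_efficiency(c_ratio, d_c, porosity, eta0, length):
    """Clean-bed filtration estimate of the stickiness alpha.

    c_ratio: breakthrough plateau C/C0
    d_c: collector (sand grain) diameter, m
    eta0: single-collector contact efficiency (model-derived)
    length: packed-column length, m
    """
    return -2.0 * d_c * np.log(c_ratio) / (3.0 * (1.0 - porosity) * eta0 * length)

# Illustrative numbers only:
alpha = attachment_efficiency(c_ratio=0.85, d_c=250e-6, porosity=0.35,
                              eta0=0.01, length=0.1)
print(f"alpha = {alpha:.3f}")
```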

  15. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  16. Non-invasive automated assessment of the ratio of pulmonary to systemic flow in patients with atrial septal defects by the colour Doppler velocity profile integration method

    OpenAIRE

    Ueda, Y.; Hozumi, T; Yoshida, K.; Watanabe, H; Akasaka, T; Takagi, T; Yamamuro, A; Homma, S; Yoshikawa, J

    2002-01-01

    Background: The recent introduction of the automated cardiac flow measurement (ACM) method, using spatiotemporal integration of the Doppler velocity profile, provides a quick and accurate automated calculation of cardiac output.

  17. Automated analysis of pumping tests; Analise automatizada de testes de bombeamento

    Energy Technology Data Exchange (ETDEWEB)

    Sugahara, Luiz Alberto Nozaki

    1996-01-01

    An automated procedure for the analysis of pumping test data from groundwater wells is described. A computer software package was developed for use under the Windows operating system. The software allows the choice of three mathematical models representing aquifer behavior: confined aquifer (Theis model); leaky aquifer (Hantush model); unconfined aquifer (Boulton model). The analysis of pumping test data using the proper aquifer model allows for the determination of model parameters such as transmissivity, storage coefficient, leakage coefficient and delay index. The computer program can be used for the analysis of data obtained from pumping tests with one or more pumping rates, as well as from recovery tests. In the multiple-rate case, a desuperposition procedure has been implemented in order to obtain the equivalent aquifer response for the first flow rate, which is used to obtain an initial estimate of the model parameters. Such an initial estimate is required by the non-linear regression analysis method. The solutions to the partial differential equations describing aquifer behavior were obtained in Laplace space, followed by numerical inversion of the transformed solution using the Stehfest algorithm. The data analysis procedure is based on non-linear regression, matching the field data to the theoretical response of a selected aquifer model for a given type of test. A least-squares regression method was implemented using either Gauss-Newton or Levenberg-Marquardt procedures for minimization of an objective function. The computer software can also be applied to multiple-rate test data in order to determine the non-linear well coefficient, allowing for the computation of the well inflow performance curve. (author)
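
    For the confined-aquifer option, the Theis solution has a closed form via the exponential integral, so the forward model and the regression step can be sketched directly. The sketch below uses synthetic data and scipy's Levenberg-Marquardt curve_fit in place of the software described; fitting log10(T) and log10(S) is a design choice that keeps both parameters positive during the search.

```python
import numpy as np
from scipy.special import exp1
from scipy.optimize import curve_fit

Q, r = 0.01, 30.0   # pumping rate (m^3/s), observation radius (m); illustrative

def theis(t, logT, logS):
    """Theis drawdown s = Q/(4*pi*T) * W(u), with u = r^2*S/(4*T*t)."""
    T, S = 10.0**logT, 10.0**logS
    u = r**2 * S / (4.0 * T * t)
    return Q / (4.0 * np.pi * T) * exp1(u)   # W(u) is the exponential integral

# Synthetic "field" drawdowns from known parameters plus noise.
t = np.logspace(1, 5, 30)                    # observation times in seconds
rng = np.random.default_rng(4)
s_obs = theis(t, np.log10(5e-3), np.log10(2e-4)) + rng.normal(0, 1e-3, t.size)

(logT, logS), _ = curve_fit(theis, t, s_obs, p0=(-3.0, -4.0), method="lm")
print(f"T = {10**logT:.2e} m^2/s, S = {10**logS:.2e}")
```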

  18. Deriving pathway maps from automated text analysis using a grammar-based approach.

    Science.gov (United States)

    Olsson, Björn; Gawronska, Barbara; Erlendsson, Björn

    2006-04-01

    We demonstrate how automated text analysis can be used to support the large-scale analysis of metabolic and regulatory pathways by deriving pathway maps from textual descriptions found in the scientific literature. The main assumption is that correct syntactic analysis combined with domain-specific heuristics provides a good basis for relation extraction. Our method uses an algorithm that searches through the syntactic trees produced by a parser based on a Referent Grammar formalism, identifies relations mentioned in the sentence, and classifies them with respect to their semantic class and epistemic status (facts, counterfactuals, hypotheses). The semantic categories used in the classification are based on the relation set used in KEGG (Kyoto Encyclopedia of Genes and Genomes), so that pathway maps using KEGG notation can be automatically generated. We present the current version of the relation extraction algorithm and an evaluation based on a corpus of abstracts obtained from PubMed. The results indicate that the method is able to combine reasonable coverage with high accuracy. We found that 61% of all sentences were parsed, and 97% of the parse trees were judged to be correct. The extraction algorithm was tested on a sample of 300 parse trees and was found to produce correct extractions in 90.5% of the cases.

  19. Widely applicable MATLAB routines for automated analysis of saccadic reaction times.

    Science.gov (United States)

    Leppänen, Jukka M; Forssman, Linda; Kaatiala, Jussi; Yrttiaho, Santeri; Wass, Sam

    2015-06-01

    Saccadic reaction time (SRT) is a widely used dependent variable in eye-tracking studies of human cognition and its disorders. SRTs are also frequently measured in studies with special populations, such as infants and young children, who are limited in their ability to follow verbal instructions and remain in a stable position over time. In this article, we describe a library of MATLAB routines (Mathworks, Natick, MA) that are designed to (1) enable completely automated implementation of SRT analysis for multiple data sets and (2) cope with the unique challenges of analyzing SRTs from eye-tracking data collected from poorly cooperating participants. The library includes preprocessing and SRT analysis routines. The preprocessing routines (i.e., moving median filter and interpolation) are designed to remove technical artifacts and missing samples from raw eye-tracking data. The SRTs are detected by a simple algorithm that identifies the last point of gaze in the area of interest, but, critically, the extracted SRTs are further subjected to a number of postanalysis verification checks to exclude values contaminated by artifacts. Example analyses of data from 5- to 11-month-old infants demonstrated that SRTs extracted with the proposed routines were in high agreement with SRTs obtained manually from video records, robust against potential sources of artifact, and exhibited moderate to high test-retest stability. We propose that the present library has wide utility in standardizing and automating SRT-based cognitive testing in various populations. The MATLAB routines are open source and can be downloaded from http://www.uta.fi/med/icl/methods.html .
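
    The preprocessing described above is straightforward to reproduce; the following is a minimal Python sketch of the idea (the published routines are MATLAB, and the window size and gap-filling strategy here are illustrative assumptions, not the authors' code):

```python
import numpy as np

def preprocess_gaze(x, window=5):
    """Interpolate missing samples (NaN) linearly, then apply a moving
    median filter to suppress impulsive eye-tracking artifacts."""
    x = np.asarray(x, dtype=float)
    idx = np.arange(x.size)
    valid = ~np.isnan(x)
    x = np.interp(idx, idx[valid], x[valid])          # fill gaps
    half = window // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([np.median(padded[i:i + window]) for i in range(x.size)])

gaze = [100.0, 101.0, np.nan, 103.0, 250.0, 105.0, 106.0]  # 250 = blink spike
print(preprocess_gaze(gaze))
```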

  20. Automated analysis of craniofacial morphology using magnetic resonance images.

    Directory of Open Access Journals (Sweden)

    M Mallar Chakravarty

    Full Text Available Quantitative analysis of craniofacial morphology is of interest to scholars working in a wide variety of disciplines, such as anthropology, developmental biology, and medicine. T1-weighted (anatomical) magnetic resonance images (MRI) provide excellent contrast between soft tissues. Given its three-dimensional nature, MRI represents an ideal imaging modality for the analysis of craniofacial structure in living individuals. Here we describe how T1-weighted MR images, acquired to examine brain anatomy, can also be used to analyze facial features. Using a sample of typically developing adolescents from the Saguenay Youth Study (N = 597; 292 male, 305 female; ages 12 to 18 years), we quantified inter-individual variations in craniofacial structure in two ways. First, we adapted existing nonlinear registration-based morphological techniques to generate iteratively a group-wise population average of craniofacial features. The nonlinear transformations were used to map the craniofacial structure of each individual to the population average. Using voxel-wise measures of expansion and contraction, we then examined the effects of sex and age on inter-individual variations in facial features. Second, we employed a landmark-based approach to quantify variations in face surfaces. This approach involves: (a) placing 56 landmarks (forehead, nose, lips, jaw-line, cheekbones, and eyes) on a surface representation of the MRI-based group average; (b) warping the landmarks to the individual faces using the inverse nonlinear transformation estimated for each person; and (c) using a principal components analysis (PCA) of the warped landmarks to identify facial features (i.e. clusters of landmarks that vary in our sample in a correlated fashion). As with the voxel-wise analysis of the deformation fields, we examined the effects of sex and age on the PCA-derived spatial relationships between facial features. Both methods demonstrated significant sexual dimorphism in

  1. AUTOMATION OF MORPHOMETRIC MEASUREMENTS FOR PLANETARY SURFACE ANALYSIS AND CARTOGRAPHY

    Directory of Open Access Journals (Sweden)

    A. A. Kokhanov

    2016-06-01

    Full Text Available For automation of measurements of morphometric parameters of surface relief, various tools were developed and integrated into GIS. We have created a tool which calculates statistical characteristics of the surface: interquartile range of heights and slopes, as well as second derivatives of height fields as measures of topographic roughness. Other tools were created for morphological studies of craters. One of them allows automatic placing of topographic profiles through the geometric center of a crater. Another tool was developed for calculation of small crater depths and shape estimation, using the C++ programming language. Additionally, we have prepared a tool for calculating volumes of relief features from DTM rasters. The created software modules and models will be available in a newly developed web-GIS system operating in a distributed cloud environment.

  2. Automated Image Processing for the Analysis of DNA Repair Dynamics

    CERN Document Server

    Riess, Thorsten; Tomas, Martin; Ferrando-May, Elisa; Merhof, Dorit

    2011-01-01

    The efficient repair of cellular DNA is essential for the maintenance and inheritance of genomic information. In order to cope with the high frequency of spontaneous and induced DNA damage, a multitude of repair mechanisms have evolved. These are enabled by a wide range of protein factors specifically recognizing different types of lesions and finally restoring the normal DNA sequence. This work focuses on the repair factor XPC (xeroderma pigmentosum complementation group C), which identifies bulky DNA lesions and initiates their removal via the nucleotide excision repair pathway. The binding of XPC to damaged DNA can be visualized in living cells by following the accumulation of a fluorescent XPC fusion at lesions induced by laser microirradiation in a fluorescence microscope. In this work, an automated image processing pipeline is presented which allows the identification and quantification of the accumulation reaction without any user interaction. The image processing pipeline comprises a preprocessing stage where the ima...

  3. Automation of Morphometric Measurements for Planetary Surface Analysis and Cartography

    Science.gov (United States)

    Kokhanov, A. A.; Bystrov, A. Y.; Kreslavsky, M. A.; Matveev, E. V.; Karachevtseva, I. P.

    2016-06-01

    For automation of measurements of morphometric parameters of surface relief, various tools were developed and integrated into GIS. We have created a tool which calculates statistical characteristics of the surface: interquartile range of heights and slopes, as well as second derivatives of height fields as measures of topographic roughness. Other tools were created for morphological studies of craters. One of them allows automatic placing of topographic profiles through the geometric center of a crater. Another tool was developed for calculation of small crater depths and shape estimation, using the C++ programming language. Additionally, we have prepared a tool for calculating volumes of relief features from DTM rasters. The created software modules and models will be available in a newly developed web-GIS system operating in a distributed cloud environment.

  4. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design--a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.

  5. 3D Assembly Group Analysis for Cognitive Automation

    Directory of Open Access Journals (Sweden)

    Christian Brecher

    2012-01-01

    Full Text Available A concept that allows the cognitive automation of robotic assembly processes is introduced. An assembly cell comprised of two robots was designed to verify the concept. For the purpose of validation a customer-defined part group consisting of Hubelino bricks is assembled. One of the key aspects for this process is the verification of the assembly group. Hence a software component was designed that utilizes the Microsoft Kinect to perceive both depth and color data in the assembly area. This information is used to determine the current state of the assembly group and is compared to a CAD model for validation purposes. In order to efficiently resolve erroneous situations, the results are interactively accessible to a human expert. The implications for an industrial application are demonstrated by transferring the developed concepts to an assembly scenario for switch-cabinet systems.

  6. An automated classification system for the differentiation of obstructive lung diseases based on the textural analysis of HRCT images

    Energy Technology Data Exchange (ETDEWEB)

    Park, Seong Hoon; Seo, Joon Beom; Kim, Nam Kug; Lee, Young Kyung; Kim, Song Soo; Chae, Eun Jin [University of Ulsan, College of Medicine, Asan Medical Center, Seoul (Korea, Republic of); Lee, June Goo [Seoul National University College of Medicine, Seoul (Korea, Republic of)

    2007-07-15

    To develop an automated classification system for the differentiation of obstructive lung diseases based on the textural analysis of HRCT images, and to evaluate the accuracy and usefulness of the system. For textural analysis, histogram features, gradient features, run length encoding, and a co-occurrence matrix were employed. A Bayesian classifier was used for automated classification. The images (n = 256) were selected from the HRCT images obtained from 17 healthy subjects (n = 67), 26 patients with bronchiolitis obliterans (n = 70), 28 patients with mild centrilobular emphysema (n = 65), and 21 patients with panlobular emphysema or severe centrilobular emphysema (n = 63). A five-fold cross-validation method was used to assess the performance of the system. Class-specific sensitivities were analyzed and the overall accuracy of the system was assessed with kappa statistics. The sensitivity of the system for each class was as follows: normal lung 84.9%, bronchiolitis obliterans 83.8%, mild centrilobular emphysema 77.0%, and panlobular emphysema or severe centrilobular emphysema 95.8%. The overall performance for differentiating each disease and the normal lung was satisfactory with a kappa value of 0.779. An automated classification system for the differentiation between obstructive lung diseases based on the textural analysis of HRCT images was developed. The proposed system discriminates well between the various obstructive lung diseases and the normal lung.
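
    As a rough illustration of the co-occurrence-matrix features used in such systems, this Python sketch (not the authors' implementation; the quantization depth and pixel offset are arbitrary choices) computes a GLCM and three classic texture features:

```python
import numpy as np

def glcm_features(img, levels=8):
    """Grey-level co-occurrence matrix for a (0, 1) pixel offset and three
    classic Haralick-style features (contrast, energy, homogeneity)."""
    img = np.asarray(img, dtype=float)
    q = np.minimum((img / (img.max() + 1e-9) * levels).astype(int), levels - 1)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1          # count horizontal neighbour pairs
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return {
        "contrast": float((glcm * (i - j) ** 2).sum()),
        "energy": float((glcm ** 2).sum()),
        "homogeneity": float((glcm / (1.0 + np.abs(i - j))).sum()),
    }

print(glcm_features(np.random.randint(0, 255, (64, 64))))
```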

  7. Automated combustion accelerator mass spectrometry for the analysis of biomedical samples in the low attomole range.

    Science.gov (United States)

    van Duijn, Esther; Sandman, Hugo; Grossouw, Dimitri; Mocking, Johannes A J; Coulier, Leon; Vaes, Wouter H J

    2014-08-05

    The increasing role of accelerator mass spectrometry (AMS) in biomedical research necessitates modernization of the traditional sample handling process. AMS was originally developed and used for carbon dating, therefore focusing on a very high precision but with a comparably low sample throughput. Here, we describe the combination of automated sample combustion with an elemental analyzer (EA) online coupled to an AMS via a dedicated interface. This setup allows direct radiocarbon measurements for over 70 samples daily by AMS. No sample processing is required apart from the pipetting of the sample into a tin foil cup, which is placed in the carousel of the EA. In our system, up to 200 AMS analyses are performed automatically without the need for manual interventions. We present results on the direct total (14)C count measurements in <2 μL human plasma samples. The method shows linearity over a range of 0.65-821 mBq/mL, with a lower limit of quantification of 0.65 mBq/mL (corresponding to 0.67 amol for acetaminophen). At these extremely low levels of activity, it becomes important to quantify plasma specific carbon percentages. This carbon percentage is automatically generated upon combustion of a sample on the EA. Apparent advantages of the present approach include complete omission of sample preparation (reduced hands-on time) and fully automated sample analysis. These improvements clearly stimulate the standard incorporation of microtracer research in the drug development process. In combination with the particularly low sample volumes required and extreme sensitivity, AMS strongly improves its position as a bioanalysis method.

  8. Real-time whole slide mosaicing for non-automated microscopes in histopathology analysis

    Directory of Open Access Journals (Sweden)

    Alessandro Gherardi

    2013-01-01

    Full Text Available Context: Mosaics of Whole Slides (WS) are a valuable resource for pathologists to have the whole sample available at high resolution. The WS mosaic provides pathologists with an overview of the whole sample at a glance, helping them to make a reliable diagnosis. Although recent solutions exist for creating WS mosaics, based for instance on automated microscopes with motorized stages or WS scanners, most histopathology analyses are still performed in laboratories equipped with standard manual-stage microscopes. Nowadays, there are many dedicated devices and hardware solutions for achieving WS automatically and in batch, but only a few of them are conceived to work tightly connected with a microscope, and none is capable of working in real-time with common light microscopes. However, there is a need for low-cost yet effective mosaicing applications even in small laboratories, to improve routine histopathological analyses or to perform remote diagnoses. Aims: The purpose of this work is to study and develop a real-time mosaicing algorithm that works even on non-automated microscopes, enabling pathologists to achieve WS while moving the holder manually, without any dedicated device. This choice enables pathologists to build WS in real-time, while browsing the sample as they are accustomed to, helping them to identify, locate, and digitally annotate lesions quickly. Materials and Methods: Our method exploits a fast feature tracker and frame-to-frame registration, which we implemented on common graphics processing unit cards. The system works with common light microscopes endowed with a digital camera and connected to a commodity personal computer. Results and Conclusion: The system has been tested on several histological samples to assess the effectiveness of the algorithm on mosaics of different appearance as far as brightness, contrast, texture, and detail levels are concerned, attaining sub-pixel registration accuracy at real

  9. Functional MRI Preprocessing in Lesioned Brains: Manual Versus Automated Region of Interest Analysis.

    Science.gov (United States)

    Garrison, Kathleen A; Rogalsky, Corianne; Sheng, Tong; Liu, Brent; Damasio, Hanna; Winstein, Carolee J; Aziz-Zadeh, Lisa S

    2015-01-01

    Functional magnetic resonance imaging (fMRI) has significant potential in the study and treatment of neurological disorders and stroke. Region of interest (ROI) analysis in such studies allows for testing of strong a priori clinical hypotheses with improved statistical power. A commonly used automated approach to ROI analysis is to spatially normalize each participant's structural brain image to a template brain image and define ROIs using an atlas. However, in studies of individuals with structural brain lesions, such as stroke, the gold standard approach may be to manually hand-draw ROIs on each participant's non-normalized structural brain image. Automated approaches to ROI analysis are faster and more standardized, yet are susceptible to preprocessing error (e.g., normalization error) that can be greater in lesioned brains. The manual approach to ROI analysis has high demand for time and expertise, but may provide a more accurate estimate of brain response. In this study, commonly used automated and manual approaches to ROI analysis were directly compared by reanalyzing data from a previously published hypothesis-driven cognitive fMRI study, involving individuals with stroke. The ROI evaluated is the pars opercularis of the inferior frontal gyrus. Significant differences were identified in task-related effect size and percent-activated voxels in this ROI between the automated and manual approaches to ROI analysis. Task interactions, however, were consistent across ROI analysis approaches. These findings support the use of automated approaches to ROI analysis in studies of lesioned brains, provided they employ a task interaction design.

  10. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    Science.gov (United States)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized as it requires large, complex, and costly instrumentation which has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  11. [The actual possibilities of robotic microscopy in analysis automation and laboratory telemedicine].

    Science.gov (United States)

    Medovyĭ, V S; Piatnitskiĭ, A M; Sokolinskiĭ, B Z; Balugian, R Sh

    2012-10-01

    The article discusses the capabilities of automated microscopy complexes manufactured by Cellavision and MEKOS for performing medical analyses of blood films and other biomaterials. Joint operation of the complex and the physician, with automated loading, screening, sampling, and sorting of cell types with simple morphology, and visual sorting of the sub-sample with complex morphology, provides a significant increase in method sensitivity, a reduced workload, and better working conditions for the physician. The included information technologies, virtual slides and laboratory telemedicine make it possible to develop representative samples of rare cell types and pathologies, advancing both automation methods and medical research.

  12. A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction

    Science.gov (United States)

    2014-07-01

    A Meta-Analysis of Factors Influencing the Development of Trust in Automation: Implications for Human-Robot Interaction, by Kristin E. Schaefer. Among the cited references: Lin, P.; Bekey, G.; Abney, K. Autonomous Military Robotics: Risk, Ethics, and Design; California Polytechnic State University: San Luis Obispo.

  13. An automated enzymatic method for measurement of D-arabinitol, a metabolite of pathogenic Candida species.

    OpenAIRE

    Switchenko, A C; Miyada, C G; Goodman, T C; Walsh, T. J.; Wong, B; Becker, M J; Ullman, E F

    1994-01-01

    An automated enzymatic method was developed for the measurement of D-arabinitol in human serum. The assay is based on a novel, highly specific D-arabinitol dehydrogenase from Candida tropicalis. This enzyme catalyzes the oxidation of D-arabinitol to D-ribulose and the concomitant reduction of NAD+ to NADH. The NADH produced is used in a second reaction to reduce p-iodonitrotetrazolium violet (INT) to INT-formazan, which is measured spectrophotometrically. The entire reaction sequence can be performed automatically on a COBAS MIRA-S clinical chemistry analyzer (Roche Diagnostic Systems, Inc., Montclair, N.J.).

  14. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  15. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods

    DEFF Research Database (Denmark)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N

    2013-01-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express; QIAsymphony DNA Investigator kit, either with the sample pre-treatment recommended by Qiagen or with an in-house optimized sample pre-treatment, on a QIAsymphony SP) and one manual method (Chelex) with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were...

  16. Semi-automated solid-phase extraction method for studying the biodegradation of ochratoxin A by human intestinal microbiota.

    Science.gov (United States)

    Camel, Valérie; Ouethrani, Minale; Coudray, Cindy; Philippe, Catherine; Rabot, Sylvie

    2012-04-15

    A simple and rapid semi-automated solid-phase extraction (SPE) method has been developed for the analysis of ochratoxin A in aqueous matrices related to biodegradation experiments (namely digestive contents and faecal excreta), with a view to using this method to follow OTA biodegradation by human intestinal microbiota. The influence of extraction parameters that could affect semi-automated SPE efficiency was studied, using C18-silica as the sorbent and water as the simplest matrix, before application to the matrices of interest. The conditions finally retained were as follows: 5-mL aqueous samples (pH 3) containing an organic modifier (20% ACN) were applied to 100-mg cartridges. After drying (9 mL of air), the cartridge was rinsed with 5 mL of H(2)O/ACN (80:20, v/v), before eluting the compounds with 3 × 1 mL of MeOH/THF (10:90, v/v). Acceptable recoveries and limits of quantification could be obtained considering the complexity of the investigated matrices and the low volumes sampled; the method was also suitable for the analysis of ochratoxin B in faecal extracts. The applicability of the method is illustrated by preliminary results of ochratoxin A biodegradation studies by human intestinal microbiota under simple in vitro conditions. Interestingly, partial degradation of ochratoxin A was observed, with efficiencies ranging from 14% to 47% after 72 h of incubation. In addition, three phase I metabolites could be identified using high resolution mass spectrometry, namely ochratoxin α, open ochratoxin A and ochratoxin B.

  17. Programmed automation of modulator cold jet flow for comprehensive two-dimensional gas chromatographic analysis of vacuum gas oils.

    Science.gov (United States)

    Rathbun, Wayne

    2007-01-01

    A method is described for automating the regulation of cold jet flow of a comprehensive two-dimensional gas chromatograph (GCxGC) configured with flame ionization detection. This new capability enables the routine automated separation, identification, and quantitation of hydrocarbon types in petroleum fractions extending into the vacuum gas oil (VGO) range (IBP-540 degrees C). Chromatographic data acquisition software is programmed to precisely change the rate of flow from the cold jet of a nitrogen cooled loop modulator of a GCxGC instrument during sample analysis. This provides for the proper modulation of sample compounds across a wider boiling range. The boiling point distribution of the GCxGC separation is shown to be consistent with high temperature simulated distillation results indicating recovery of higher boiling semi-volatile VGO sample components. GCxGC configured with time-of-flight mass spectrometry is used to determine the molecular identity of individual sample components and boundaries of different molecular types.

  18. SparkMaster: automated calcium spark analysis with ImageJ.

    Science.gov (United States)

    Picht, Eckard; Zima, Aleksey V; Blatter, Lothar A; Bers, Donald M

    2007-09-01

    Ca sparks are elementary Ca-release events from intracellular Ca stores that are observed in virtually all types of muscle. Typically, Ca sparks are measured in the line-scan mode with confocal laser-scanning microscopes, yielding two-dimensional images (distance vs. time). The manual analysis of these images is time consuming and prone to errors as well as investigator bias. Therefore, we developed SparkMaster, an automated analysis program that allows rapid and reliable spark analysis. The underlying analysis algorithm is adapted from the threshold-based standard method of spark analysis developed by Cheng et al. (Biophys J 76: 606-617, 1999) and is implemented here in the freely available image-processing software ImageJ. SparkMaster offers a graphical user interface through which all analysis parameters and output options are selected. The analysis includes general image parameters (number of detected sparks, spark frequency) and individual spark parameters (amplitude, full width at half-maximum amplitude, full duration at half-maximum amplitude, full width, full duration, time to peak, maximum steepness of spark upstroke, time constant of spark decay). We validated the algorithm using images with synthetic sparks embedded into backgrounds with different signal-to-noise ratios to determine analysis criteria at which high sensitivity is combined with a low frequency of false-positive detections. Finally, we applied SparkMaster to analyze experimental data of sparks measured in intact and permeabilized ventricular cardiomyocytes, permeabilized mammalian skeletal muscle, and intact smooth muscle cells. We found that SparkMaster provides a reliable, easy to use, and fast way of analyzing Ca sparks in a wide variety of experimental conditions.
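
    A minimal Python sketch of threshold-based spark detection in this spirit (SparkMaster itself is an ImageJ plugin; the background and noise estimates below are simplifying assumptions):

```python
import numpy as np
from scipy import ndimage

def detect_sparks(img, criteria=3.8):
    """Flag candidate Ca sparks as connected regions whose fluorescence
    exceeds background mean + criteria * SD (threshold-style detection)."""
    background = np.median(img)                          # crude background level
    noise = np.std(img[img < np.percentile(img, 75)])    # noise from dim pixels
    labels, n = ndimage.label(img > background + criteria * noise)
    peaks = ndimage.maximum(img, labels, index=range(1, n + 1)) if n else []
    return n, peaks

# Synthetic line-scan image: noise plus one bright spark-like blob
img = np.random.normal(100.0, 5.0, (128, 256))
img[60:70, 100:115] += 60.0
print(detect_sparks(img))  # expect 1 region (plus the odd isolated noise hit)
```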

  19. Automated Analysis of Crackles in Patients with Interstitial Pulmonary Fibrosis

    Directory of Open Access Journals (Sweden)

    B. Flietstra

    2011-01-01

    Full Text Available Background. The crackles in patients with interstitial pulmonary fibrosis (IPF can be difficult to distinguish from those heard in patients with congestive heart failure (CHF and pneumonia (PN. Misinterpretation of these crackles can lead to inappropriate therapy. The purpose of this study was to determine whether the crackles in patients with IPF differ from those in patients with CHF and PN. Methods. We studied 39 patients with IPF, 95 with CHF and 123 with PN using a 16-channel lung sound analyzer. Crackle features were analyzed using machine learning methods including neural networks and support vector machines. Results. The IPF crackles had distinctive features that allowed them to be separated from those in patients with PN with a sensitivity of 0.82, a specificity of 0.88 and an accuracy of 0.86. They were separated from those of CHF patients with a sensitivity of 0.77, a specificity of 0.85 and an accuracy of 0.82. Conclusion. Distinctive features are present in the crackles of IPF that help separate them from the crackles of CHF and PN. Computer analysis of crackles at the bedside has the potential of aiding clinicians in diagnosing IPF more easily and thus helping to avoid medication errors.

  20. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, as has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying that designers need to ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate their control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation, using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.

  1. Automated analysis of small animal PET studies through deformable registration to an atlas

    Energy Technology Data Exchange (ETDEWEB)

    Gutierrez, Daniel F. [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Zaidi, Habib [Geneva University Hospital, Division of Nuclear Medicine and Molecular Imaging, Geneva 4 (Switzerland); Geneva University, Geneva Neuroscience Center, Geneva (Switzerland); University of Groningen, Department of Nuclear Medicine and Molecular Imaging, University Medical Center Groningen, Groningen (Netherlands)

    2012-11-15

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is
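
    The Dice coefficient used above to score registration accuracy is simple to compute; a minimal Python sketch (illustrative only, not the authors' pipeline):

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary segmentation masks."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

ref = np.zeros((10, 10), bool); ref[2:7, 2:7] = True   # reference organ mask
reg = np.zeros((10, 10), bool); reg[3:8, 3:8] = True   # registered organ mask
print(dice(ref, reg))  # 0.64
```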

  2. Automated Segmentation of Coronary Arteries Based on Statistical Region Growing and Heuristic Decision Method

    Directory of Open Access Journals (Sweden)

    Yun Tian

    2016-01-01

    Full Text Available The segmentation of coronary arteries is a vital process that helps cardiovascular radiologists detect and quantify stenosis. In this paper, we propose a fully automated coronary artery segmentation method for cardiac data volumes. The method is built on statistical region growing together with a heuristic decision rule. First, the heart region is extracted using a multi-atlas-based approach. Second, the vessel structures are enhanced via a 3D multiscale line filter. Next, seed points are detected automatically through threshold preprocessing and a subsequent morphological operation. Based on the set of detected seed points, statistics-based region growing is applied. Finally, results are obtained by setting conservative parameters. A heuristic decision method is then used to obtain the desired result automatically, because the region-growing parameters vary between patients and the segmentation requires full automation. The experiments are carried out on a dataset that includes eight patients' multivendor cardiac computed tomography angiography (CTA) volume data. The DICE similarity index, mean distance, and Hausdorff distance metrics are employed to compare the proposed algorithm with two state-of-the-art methods. Experimental results indicate that the proposed algorithm is capable of performing complete, robust, and accurate extraction of coronary arteries.
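
    A minimal Python sketch of statistical region growing of the kind described (the acceptance rule, 6-neighbourhood and running statistics below are simplifying assumptions, not the authors' exact algorithm):

```python
import numpy as np
from collections import deque

def region_grow(volume, seeds, k=2.5):
    """Statistical region growing: accept a neighbour voxel if its intensity
    lies within k standard deviations of the running region statistics."""
    grown = np.zeros(volume.shape, dtype=bool)
    queue = deque(seeds)
    values = [volume[s] for s in seeds]
    for s in seeds:
        grown[s] = True
    while queue:
        z, y, x = queue.popleft()
        mean, sd = np.mean(values), np.std(values) + 1e-6
        for dz, dy, dx in [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < volume.shape[i] for i in range(3)) and not grown[n]:
                if abs(volume[n] - mean) <= k * sd:
                    grown[n] = True
                    values.append(volume[n])
                    queue.append(n)
    return grown

# Toy volume: dim background with one bright block, grown from a single seed
vol = np.random.normal(100.0, 2.0, (20, 20, 20))
vol[5:15, 5:15, 5:15] += 60.0
print(region_grow(vol, seeds=[(10, 10, 10)]).sum())  # ~1000 voxels
```

    A production implementation would update the region mean and variance incrementally rather than recomputing them on every step.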

  3. Archaeological field survey automation: concurrent multisensor site mapping and automated analysis

    Science.gov (United States)

    Józefowicz, Mateusz; Sokolov, Oleksandr; Meszyński, Sebastian; Siemińska, Dominika; Kołosowski, Przemysław

    2016-04-01

    control the platform from a remote location via satellite, with only a servicing person on the site and the survey team operating from their office, globally. The method is under development. The team contributing to the project also includes: Oleksii Sokolov, Michał Koepke, Krzysztof Rydel, Michał Stypczyński, Maciej Ślęk, Łukasz Zapała, Michał Dąbrowski.

  4. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.;

    1999-01-01

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  5. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Bagnoli, F.;

    implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As main result, this technique will provide the design engineer with decision tables for fault handling...

  6. Chapter 2: Predicting Newcomer Integration in Online Knowledge Communities by Automated Dialog Analysis

    NARCIS (Netherlands)

    Nistor, Nicolae; Dascalu, Mihai; Stavarache, Lucia; Tarnai, Christian; Trausan-Matu, Stefan

    2016-01-01

    Nistor, N., Dascalu, M., Stavarache, L.L., Tarnai, C., & Trausan-Matu, S. (2015). Predicting Newcomer Integration in Online Knowledge Communities by Automated Dialog Analysis. In Y. Li, M. Chang, M. Kravcik, E. Popescu, R. Huang, Kinshuk & N.-S. Chen (Eds.), State-of-the-Art and Future Directions of

  7. Comparing a perceptual and an automated vision-based method for lie detection in younger children

    Directory of Open Access Journals (Sweden)

    Mariana Serras Pereira

    2016-12-01

    Full Text Available The present study investigates how easily it can be detected whether a child is being truthful or not in a game situation, and it explores the cue validity of bodily movements for this type of classification. To achieve this, we introduce an innovative methodology: the combination of perception studies (in which one uses eye-tracking technology) and automated movement analysis. Film fragments from truthful and deceptive children were shown to human judges who were given the task to decide whether the recorded child was being truthful or not. Results reveal that judges are able to accurately distinguish truthful clips from lying clips in both perception studies. Even though the automated movement analysis for overall and specific body regions did not yield significant differences between the experimental conditions, we did find a positive correlation between the amount of movement in a child and the perception of lies, i.e., the more movement the children exhibited during a clip, the higher the chance that the clip was perceived as a lie. The eye-tracking study revealed that, even when there is movement happening in different body regions, judges tend to focus their attention mainly on the face region.

  8. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis.

    Science.gov (United States)

    Vasdev, Neil; Collier, Thomas Lee

    2016-08-17

    Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer.

  9. Design and Prototype of an Automated Column-Switching HPLC System for Radiometabolite Analysis

    Directory of Open Access Journals (Sweden)

    Neil Vasdev

    2016-08-01

    Full Text Available Column-switching high performance liquid chromatography (HPLC) is extensively used for the critical analysis of radiolabeled ligands and their metabolites in plasma. However, the lack of streamlined apparatus and consequently varying protocols remain as a challenge among positron emission tomography laboratories. We report here the prototype apparatus and implementation of a fully automated and simplified column-switching procedure to allow for the easy and automated determination of radioligands and their metabolites in up to 5 mL of plasma. The system has been used with conventional UV and coincidence radiation detectors, as well as with a single quadrupole mass spectrometer.

  10. Fully automated fluorescent in situ hybridization (FISH staining and digital analysis of HER2 in breast cancer: a validation study.

    Directory of Open Access Journals (Sweden)

    Elise M J van der Logt

    Full Text Available HER2 assessment is routinely used to select patients with invasive breast cancer who might benefit from HER2-targeted therapy. The aim of this study was to validate a fully automated in situ hybridization (ISH) procedure that combines the automated Leica HER2 fluorescent ISH system for Bond with supervised automated analysis on the Visia imaging D-Sight digital imaging platform. HER2 assessment was performed on 328 formalin-fixed/paraffin-embedded invasive breast cancer tumors on tissue microarrays (TMA) and 100 full-sized slides (50 selected IHC 2+ and 50 random IHC scores) of resections/biopsies previously obtained for diagnostic purposes. For digital analysis, slides were pre-screened at 20x and 100x magnification for all fluorescent signals, and supervised automated scoring was performed on at least two pictures (in total at least 20 nuclei were counted) with the D-Sight HER2 FISH analysis module by two observers independently. Results were compared to data obtained previously with the manual Abbott FISH test. The overall agreement with Abbott FISH data among TMA samples and the 50 selected IHC 2+ cases was 98.8% (κ = 0.94) and 93.8% (κ = 0.88), respectively. The results of the 50 additionally tested unselected IHC cases were concordant with previously obtained IHC and/or FISH data. The combination of the Leica FISH system with the D-Sight digital imaging platform is a feasible method for HER2 assessment in routine clinical practice for patients with invasive breast cancer.

  11. Dynamic analysis of flexible gear trains/transmissions - An automated approach

    Science.gov (United States)

    Amirouche, F. M. L.; Shareef, N. H.; Xie, M.

    1992-01-01

    In this paper an automated algorithmic method is presented for the dynamic analysis of geared trains/transmissions, which are treated as a system of interconnected flexible bodies. The procedure developed accounts for the switching of constraints with time as a result of changes in the contacting areas at the gear teeth. The elastic behavior of the system is studied through the employment of three-dimensional isoparametric elements having six degrees of freedom at each node. Contact between the bodies is assumed at the various nodes, which could lie along either a line or a plane. The kinematical expressions, together with the equations of motion derived using Kane's method and strain energy concepts, are presented in a matrix form suitable for computer implementation. The constraint Jacobian matrices are generated automatically based on the contact information between the bodies. The concept of relative velocity at the contacting points of the tooth pairs, and the subsequent use of transmission ratios in the analysis, is presented.

  12. Automated red blood cell analysis compared with routine red blood cell morphology by smear review

    Directory of Open Access Journals (Sweden)

    Dr.Poonam Radadiya

    2015-01-01

    Full Text Available The RBC histogram is an integral part of automated haematology analysis and is now routinely available on all automated cell counters. This histogram and other associated complete blood count (CBC) parameters have been found to be abnormal in various haematological conditions and may provide major clues in the diagnosis and management of significant red cell disorders. Performing manual blood smears is important to ensure the quality of blood count results and to make a presumptive diagnosis. In this article we examined 100 samples in a comparative study of RBC histograms obtained by an automated haematology analyzer against peripheral blood smears. The article also discusses some morphological features of dimorphism and the ensuing characteristic changes in the RBC histograms.

  13. Automated Bearing Fault Diagnosis Using 2D Analysis of Vibration Acceleration Signals under Variable Speed Conditions

    Directory of Open Access Journals (Sweden)

    Sheraz Ali Khan

    2016-01-01

    Full Text Available Traditional fault diagnosis methods of bearings detect characteristic defect frequencies in the envelope power spectrum of the vibration signal. These defect frequencies depend upon the inherently nonstationary shaft speed. Time-frequency and subband signal analysis of vibration signals has been used to deal with random variations in speed, whereas design variations require retraining a new instance of the classifier for each operating speed. This paper presents an automated approach for fault diagnosis in bearings based upon the 2D analysis of vibration acceleration signals under variable speed conditions. Images created from the vibration signals exhibit unique textures for each fault, which show minimal variation with shaft speed. Microtexture analysis of these images is used to generate distinctive fault signatures for each fault type, which can be used to detect those faults at different speeds. A k-nearest neighbor classifier trained using fault signatures generated for one operating speed is used to detect faults at all the other operating speeds. The proposed approach is tested on the bearing fault dataset of Case Western Reserve University, and the results are compared with those of a spectrum imaging-based approach.
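
    To illustrate the signal-to-image idea and the k-NN classification step, a Python sketch (segment length, normalization and feature choice are illustrative assumptions, not the authors' pipeline):

```python
import numpy as np

def signal_to_image(signal, size=64):
    """Reshape a vibration segment into a square grey-scale image whose
    texture serves as a (speed-insensitive) fault signature."""
    seg = np.asarray(signal[: size * size], dtype=float)
    seg = (seg - seg.min()) / (seg.max() - seg.min() + 1e-12)
    return seg.reshape(size, size)

def knn_predict(train_feats, train_labels, feat, k=3):
    """Plain k-nearest-neighbour vote in feature space."""
    d = np.linalg.norm(np.asarray(train_feats) - np.asarray(feat), axis=1)
    nearest = np.asarray(train_labels)[np.argsort(d)[:k]]
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[np.argmax(counts)]

# Usage idea: img = signal_to_image(vibration_segment), then compute texture
# features from img (e.g., the GLCM features sketched earlier) and classify
# them with knn_predict against signatures learned at one operating speed.
```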

  14. Automated Detection of Connective Tissue by Tissue Counter Analysis and Classification and Regression Trees

    Directory of Open Access Journals (Sweden)

    Josef Smolle

    2001-01-01

    Full Text Available Objective: To evaluate the feasibility of the CART (Classification and Regression Tree) procedure for the recognition of microscopic structures in tissue counter analysis. Methods: Digital microscopic images of H&E-stained slides of normal human skin and of primary malignant melanoma were overlaid with regularly distributed square measuring masks (elements), and grey value, texture and colour features within each mask were recorded. In the learning set, elements were interactively labeled as representing either connective tissue of the reticular dermis, other tissue components or background. Subsequently, CART models were based on these data sets. Results: Implementation of the CART classification rules into the image analysis program showed that in an independent test set 94.1% of elements classified as connective tissue of the reticular dermis were correctly labeled. Automated measurements of the total amount of tissue and of the amount of connective tissue within a slide showed high reproducibility (r=0.97 and r=0.94, respectively; p < 0.001). Conclusions: The CART procedure in tissue counter analysis yields simple and reproducible classification rules for tissue elements.

  15. A Novel Method for Automation of 3D Hydro Break Line Generation from LiDAR Data Using MATLAB

    Science.gov (United States)

    Toscano, G. J.; Gopalam, U.; Devarajan, V.

    2013-08-01

    Water body detection is necessary to generate hydro break lines, which are in turn useful in creating deliverables such as TINs, contours, DEMs from LiDAR data. Hydro flattening follows the detection and delineation of water bodies (lakes, rivers, ponds, reservoirs, streams etc.) with hydro break lines. Manual hydro break line generation is time consuming and expensive. Accuracy and processing time depend on the number of vertices marked for delineation of break lines. Automation with minimal human intervention is desired for this operation. This paper proposes using a novel histogram analysis of LiDAR elevation data and LiDAR intensity data to automatically detect water bodies. Detection of water bodies using elevation information was verified by checking against LiDAR intensity data since the spectral reflectance of water bodies is very small compared with that of land and vegetation in near infra-red wavelength range. Detection of water bodies using LiDAR intensity data was also verified by checking against LiDAR elevation data. False detections were removed using morphological operations and 3D break lines were generated. Finally, a comparison of automatically generated break lines with their semi-automated/manual counterparts was performed to assess the accuracy of the proposed method and the results were discussed.
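
    Histogram-based water/land separation of this kind is often implemented with an automatic threshold such as Otsu's method; the sketch below is an illustrative stand-in, not the paper's exact MATLAB histogram analysis:

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold that maximizes between-class
    variance of the histogram (here, of LiDAR intensity values)."""
    hist, edges = np.histogram(values, bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # class-0 probability up to each bin
    mu = np.cumsum(p * centers)       # cumulative first moment
    w1 = 1.0 - w0
    valid = (w0 > 0) & (w1 > 0)
    between = np.zeros_like(w0)
    between[valid] = (mu[-1] * w0[valid] - mu[valid]) ** 2 / (w0[valid] * w1[valid])
    return centers[np.argmax(between)]

# Water returns have low near-infrared intensity, so values below the
# threshold are flagged as candidate water pixels.
intensity = np.concatenate([np.random.normal(20, 5, 1000),    # water
                            np.random.normal(120, 20, 3000)]) # land/vegetation
print(otsu_threshold(intensity))  # threshold between the two modes
```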

  16. A semi-automated method for measuring femoral shape to derive version and its comparison with existing methods.

    Science.gov (United States)

    Berryman, F; Pynsent, P; McBryde, C

    2014-11-01

    The measurement of femoral version is important in the surgical planning of derotational osteotomies, particularly for patients with proximal femoral deformity. It is, however, difficult to measure version accurately, and differences of 10° to 15° have been found between repeated measurements. The aim of this work was, first, to develop a method of measuring the femoral version angle in which the definition of the neck axis is based on the three-dimensional point cloud making up the neck; second, to automate many of the processes involved, thus reducing the influence of human error; and third, to ensure the method could run on freely available software suitable for most computer platforms. A CT scan was performed on 44 cadaveric femurs to generate point clouds of the femoral surfaces. The point clouds were then analysed semi-automatically to determine the femoral version angle between a neck axis, defined by the bone surface points belonging only to the neck, and a femoral condylar axis. The results from the neck-fitting method were compared against three other methods typically used in the clinic (the Murphy, Reikeras and Lee methods). The version angle measured by the new method was 19.1° ± 7.3° (mean ± standard deviation) for the set of cadaveric femurs, 3.5° lower than the Murphy method and 6.8° and 11.0° higher than the Reikeras and Lee 2D methods respectively. The results demonstrate a method of measuring femoral version angle that incorporates a high level of automation and runs on free software.
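
    Fitting an axis to the neck point cloud can be phrased as a principal-direction fit; the following Python sketch (a simplified assumption, not the published semi-automated pipeline) fits axes by SVD and measures a projected angle between them:

```python
import numpy as np

def fit_axis(points):
    """Least-squares line fit to a 3D point cloud: the first principal
    direction of the centred points (via SVD)."""
    pts = np.asarray(points, dtype=float)
    centred = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    return vt[0]  # unit direction vector

def version_angle(neck_pts, condylar_axis, shaft_axis):
    """Angle between the neck axis and condylar axis, both projected onto
    the plane perpendicular to the femoral shaft (one plausible definition)."""
    def project(v, n):
        v = v - np.dot(v, n) * n
        return v / np.linalg.norm(v)
    n = np.asarray(shaft_axis, float) / np.linalg.norm(shaft_axis)
    a = project(fit_axis(neck_pts), n)
    b = project(np.asarray(condylar_axis, float), n)
    return np.degrees(np.arccos(np.clip(abs(np.dot(a, b)), -1.0, 1.0)))

# Elongated synthetic "neck" cloud along x: fitted axis is roughly [±1, 0, 0]
neck = np.random.normal(0.0, 1.0, (500, 3)) * [5.0, 1.0, 1.0]
print(fit_axis(neck))
```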

  17. An automated enzymatic method for measurement of D-arabinitol, a metabolite of pathogenic Candida species.

    Science.gov (United States)

    Switchenko, A C; Miyada, C G; Goodman, T C; Walsh, T J; Wong, B; Becker, M J; Ullman, E F

    1994-01-01

    An automated enzymatic method was developed for the measurement of D-arabinitol in human serum. The assay is based on a novel, highly specific D-arabinitol dehydrogenase from Candida tropicalis. This enzyme catalyzes the oxidation of D-arabinitol to D-ribulose and the concomitant reduction of NAD+ to NADH. The NADH produced is used in a second reaction to reduce p-iodonitrotetrazolium violet (INT) to INT-formazan, which is measured spectrophotometrically. The entire reaction sequence can be performed automatically on a COBAS MIRA-S clinical chemistry analyzer (Roche Diagnostic Systems, Inc., Montclair, N.J.). Replicate analyses of human sera supplemented with D-arabinitol over a concentration range of 0 to 40 microM demonstrated that the pentitol could be measured with an accuracy of +/- 7% and a precision (standard deviation) of +/- 0.4 microM. Serum D-arabinitol measurements correlated with those determined by gas chromatography (r = 0.94). The enzymatic method is unaffected by L-arabinitol, D-mannitol, or other polyols commonly found in human serum. None of 17 therapeutic drugs potentially present in serum significantly influenced assay performance. Data illustrating the application of the assay in patients for possible diagnosis of invasive candidiasis and the monitoring of therapeutic intervention are presented. The automated assay described here was developed to facilitate the investigation of D-arabinitol as a serum marker for invasive Candida infections.

  18. A Novel Vision Localization Method of Automated Micro-Polishing Robot

    Institute of Scientific and Technical Information of China (English)

    Zhao-jun Yang; Fei Chen; Ji Zhao; Xiao-jie Wu

    2009-01-01

    Based on photogrammetry technology, a novel localization method for a micro-polishing robot restricted to a certain working space is presented in this paper. On the basis of the pinhole camera model, a new mathematical model of vision localization for the automated polishing robot is established. The vision localization is based on the distance constraints of feature points, and the method for solving the mathematical model is discussed. According to the characteristics of the gray image, an adaptive method of automatic threshold selection based on connected components is presented. The center coordinates of the feature image points are resolved by a gray-square-weighted bilinear interpolation algorithm. Finally, the mathematical model of the testing system is verified by a global localization test. The experimental results show that the vision localization system achieves high precision within the working space.

  19. Accelerating Chart Review Using Automated Methods on Electronic Health Record Data for Postoperative Complications

    Science.gov (United States)

    Hu, Zhen; Melton, Genevieve B.; Moeller, Nathan D.; Arsoniadis, Elliot G.; Wang, Yan; Kwaan, Mary R.; Jensen, Eric H.; Simon, Gyorgy J.

    2016-01-01

    Manual Chart Review (MCR) is an important but labor-intensive task for clinical research and quality improvement. In this study, aiming to accelerate the process of extracting postoperative outcomes from medical charts, we developed an automated postoperative complications detection application using structured electronic health record (EHR) data. We applied several machine learning methods to the detection of commonly occurring complications, including three subtypes of surgical site infection, pneumonia, urinary tract infection, sepsis, and septic shock. In particular, we applied one single-task and five multi-task learning methods and compared their detection performance. The models demonstrated high detection performance, which ensures the feasibility of accelerating MCR. Specifically, one of the multi-task learning methods, propensity weighted observations (PWO), demonstrated the highest detection performance, with single-task learning being a close second.

  20. An automated method for determining the cytoadhesion of Plasmodium falciparum-infected erythrocytes to immobilized cells

    DEFF Research Database (Denmark)

    Hempel, Casper; Boisen, Ida M; Efunshile, Akinwale;

    2015-01-01

    BACKGROUND: Plasmodium falciparum exports antigens to the surface of infected erythrocytes, causing cytoadhesion to the host vasculature. This is central in malaria pathogenesis, but in vitro studies of cytoadhesion rely mainly on manual counting methods. The current study aimed at developing an automated high-throughput method for this purpose, utilizing the pseudoperoxidase activity of intra-erythrocytic haemoglobin. METHODS: Chinese hamster ovary (CHO) cells were grown to confluence in chamber slides and microtiter plates. Cytoadhesion of co-cultured P. falciparum, selected for binding to CHO cells, was assessed using: i) binding of P. falciparum-infected erythrocytes to CHO cells over-expressing chondroitin sulfate A, and ii) CHO cells transfected with CD36. Binding of infected erythrocytes, including field isolates, to primary endothelial cells was also performed. Data was analysed using linear regression...

  1. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander;

    2014-01-01

    . The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across...... multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both...

  2. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33...

  3. An automated image processing method to quantify collagen fibre organization within cutaneous scar tissue.

    Science.gov (United States)

    Quinn, Kyle P; Golberg, Alexander; Broelsch, G Felix; Khan, Saiqa; Villiger, Martin; Bouma, Brett; Austen, William G; Sheridan, Robert L; Mihm, Martin C; Yarmush, Martin L; Georgakoudi, Irene

    2015-01-01

    Standard approaches to evaluate scar formation within histological sections rely on qualitative evaluations and scoring, which limits our understanding of the remodelling process. We have recently developed an image analysis technique for the rapid quantification of fibre alignment at each pixel location. The goal of this study was to evaluate its application for quantitatively mapping scar formation in histological sections of cutaneous burns. To this end, we utilized directional statistics to define maps of fibre density and directional variance from Masson's trichrome-stained sections for quantifying changes in collagen organization during scar remodelling. Significant increases in collagen fibre density are detectable soon after burn injury in a rat model. Decreased fibre directional variance in the scar was also detectable between 3 weeks and 6 months after injury, indicating increasing fibre alignment. This automated analysis of fibre organization can provide objective surrogate endpoints for evaluating cutaneous wound repair and regeneration.
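
    Directional variance for fibre orientations is commonly computed with circular statistics on doubled angles, since orientations are axial (theta and theta + pi describe the same fibre direction). A minimal sketch under that standard convention, which is not necessarily the authors' exact estimator:

    ```python
    import numpy as np

    def directional_variance(theta):
        """Directional variance of fibre orientations theta (radians, axial data).

        Angles are doubled so that theta and theta + pi count as the same fibre
        direction; the result is 0 for perfect alignment, 1 for uniform orientations.
        """
        return 1.0 - np.abs(np.mean(np.exp(2j * np.asarray(theta))))
    ```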

  4. Automated Integrated Analog Filter Design Issues

    Directory of Open Access Journals (Sweden)

    Karolis Kiela

    2015-07-01

    Modern automated design methods for integrated analog circuits and their use in integrated filter design are analysed. Current automated analog design tools are based on optimization algorithms and/or new circuit generation methods. Most automated integrated filter design methods are suited only to gm-C and switched-current filter topologies. Here, an algorithm for active RC integrated filter design is proposed that can be used in automated filter design. The algorithm is tested by designing an integrated active RC filter in a 65 nm CMOS technology.

  5. Automated image analysis for space debris identification and astrometric measurements

    Science.gov (United States)

    Piattoni, Jacopo; Ceruti, Alessandro; Piergentili, Fabrizio

    2014-10-01

    Space debris is a challenging problem for human activity in space. Observation campaigns are conducted around the globe to detect and track uncontrolled space objects. One of the main problems in optical observation is obtaining useful information about the debris dynamical state from the images collected. For orbit determination, the most relevant information embedded in an optical observation is the precise angular position, which can be evaluated by astrometry procedures, comparing the stars inside the image with star catalogs. This is typically a time-consuming process if done by a human operator, which makes the task impractical when dealing with large amounts of data, on the order of thousands of images per night, generated by routinely conducted observations. An automated procedure is investigated in this paper that is capable of recognizing the debris track inside a picture, calculating the celestial coordinates of the image's center, and using this information to compute the debris angular position in the sky. This procedure has been implemented in a software code that does not require human interaction and works without any supplemental information besides the image itself, detecting space objects and solving for their angular position without a priori information. The algorithm for object detection was developed within the research team. For the star field computation, the software package astrometry.net, released under the GPL v2 license, was used. The complete procedure was validated by extensive testing, using the images obtained in an observation campaign performed in a joint project between the Italian Space Agency (ASI) and the University of Bologna at the Broglio Space Center, Kenya.
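
    A minimal sketch of the astrometric step only, assuming the astrometry.net command-line solver (solve-field) is installed and astropy is available; the in-house debris-detection algorithm is not reproduced here, and the file names and pixel position are placeholders.

    ```python
    import subprocess
    from astropy.io import fits
    from astropy.wcs import WCS

    # Blind plate solve: on success, solve-field writes image.wcs with the fitted WCS.
    subprocess.run(["solve-field", "--overwrite", "--no-plots", "image.fits"],
                   check=True)

    # Map the measured pixel position of the debris track to RA/Dec (degrees).
    wcs = WCS(fits.getheader("image.wcs"))
    ra, dec = wcs.all_pix2world([[512.0, 512.0]], 0)[0]
    print(f"RA = {ra:.5f} deg, Dec = {dec:.5f} deg")
    ```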

  6. Grcarma: A fully automated task-oriented interface for the analysis of molecular dynamics trajectories.

    Science.gov (United States)

    Koukos, Panagiotis I; Glykos, Nicholas M

    2013-10-05

    We report the availability of grcarma, a program encoding for a fully automated set of tasks aiming to simplify the analysis of molecular dynamics trajectories of biological macromolecules. It is a cross-platform, Perl/Tk-based front-end to the program carma and is designed to facilitate the needs of the novice as well as those of the expert user, while at the same time maintaining a user-friendly and intuitive design. Particular emphasis was given to the automation of several tedious tasks, such as extraction of clusters of structures based on dihedral and Cartesian principal component analysis, secondary structure analysis, calculation and display of root-mean-square deviation (RMSD) matrices, calculation of entropy, calculation and analysis of variance–covariance matrices, calculation of the fraction of native contacts, etc. The program is free, open-source software available immediately for download.

  7. Automated Student Model Improvement

    Science.gov (United States)

    Koedinger, Kenneth R.; McLaughlin, Elizabeth A.; Stamper, John C.

    2012-01-01

    Student modeling plays a critical role in developing and improving instruction and instructional technologies. We present a technique for automated improvement of student models that leverages the DataShop repository, crowd sourcing, and a version of the Learning Factors Analysis algorithm. We demonstrate this method on eleven educational…

  8. Automating methods to improve precision in Monte-Carlo event generation for particle colliders

    Energy Technology Data Exchange (ETDEWEB)

    Gleisberg, Tanju

    2008-07-01

    The subject of this thesis was the development of tools for the automated calculation of exact matrix elements, which are key to the systematic improvement of precision and confidence in theoretical predictions. Part I of this thesis concentrates on the calculation of cross sections at tree level. A number of extensions have been implemented in the matrix element generator AMEGIC++, namely new interaction models such as effective loop-induced couplings of the Higgs boson with massless gauge bosons, required for a number of channels in the Higgs boson search at the LHC, and anomalous gauge couplings, parameterizing a number of models beyond the SM. Further, a special treatment for complicated decay chains of heavy particles has been constructed. A significant effort went into the implementation of methods to push the limits on particle multiplicities. Two recursive methods have been implemented, the Cachazo-Svrcek-Witten recursion and the colour-dressed Berends-Giele recursion. For the latter, the new module COMIX has been added to the SHERPA framework. The Monte-Carlo phase space integration techniques have been completely revised, which led to significantly reduced statistical error estimates when calculating cross sections and a greatly improved unweighting efficiency for the event generation. Special integration methods have been developed to cope with the newly accessible final states. The event generation framework SHERPA directly benefits from these new developments, improving both precision and efficiency. Part II addressed the automation of QCD calculations at next-to-leading order. A code has been developed that, for the first time, fully automates the real-correction part of an NLO calculation. To calculate the correction for an m-parton process obeying the Catani-Seymour dipole subtraction method, the following components are provided: 1. the corresponding m+1-parton tree level matrix elements, 2. a number of dipole subtraction terms to remove
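
    For reference, the dipole subtraction that the real-correction automation implements has the standard Catani-Seymour structure; this is the textbook master formula, not SHERPA-specific notation:

    ```latex
    \sigma^{\mathrm{NLO}}
      = \int_{m+1} \left[ \mathrm{d}\sigma^{R} - \mathrm{d}\sigma^{A} \right]_{\epsilon=0}
      + \int_{m} \left[ \mathrm{d}\sigma^{V} + \int_{1} \mathrm{d}\sigma^{A} \right]_{\epsilon=0}
    ```

    The auxiliary term dσ^A is the sum of dipole subtraction terms: it reproduces the soft and collinear singularities of the real correction dσ^R pointwise, so both bracketed integrals are separately finite and can be evaluated by Monte-Carlo integration in four dimensions.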

  9. Automated local bright feature image analysis of nuclear protein distribution identifies changes in tissue phenotype

    Energy Technology Data Exchange (ETDEWEB)

    Knowles, David; Sudar, Damir; Bator, Carol; Bissell, Mina

    2006-02-01

    The organization of nuclear proteins is linked to cell and tissue phenotypes. When cells arrest proliferation, undergo apoptosis, or differentiate, the distribution of nuclear proteins changes. Conversely, forced alteration of the distribution of nuclear proteins modifies cell phenotype. Immunostaining and fluorescence microscopy have been critical for such findings. However, there is an increasing need for quantitative analysis of nuclear protein distribution to decipher epigenetic relationships between nuclear structure and cell phenotype, and to unravel the mechanisms linking nuclear structure and function. We have developed imaging methods to quantify the distribution of fluorescently-stained nuclear protein NuMA in different mammary phenotypes obtained using three-dimensional cell culture. Automated image segmentation of DAPI-stained nuclei was generated to isolate thousands of nuclei from three-dimensional confocal images. Prominent features of fluorescently-stained NuMA were detected using a novel local bright feature analysis technique, and their normalized spatial density calculated as a function of the distance from the nuclear perimeter to its center. The results revealed marked changes in the distribution of the density of NuMA bright features as non-neoplastic cells underwent phenotypically normal acinar morphogenesis. In contrast, we did not detect any reorganization of NuMA during the formation of tumor nodules by malignant cells. Importantly, the analysis also discriminated proliferating non-neoplastic cells from proliferating malignant cells, suggesting that these imaging methods are capable of identifying alterations linked not only to the proliferation status but also to the malignant character of cells. We believe that this quantitative analysis will have additional applications for classifying normal and pathological tissues.
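
    The normalized radial density measurement described above can be sketched as follows, assuming binary masks for the segmented nucleus and for the detected bright features; this is a simplified stand-in for the authors' pipeline, not their code.

    ```python
    import numpy as np
    from scipy import ndimage

    def radial_feature_density(nucleus_mask, feature_mask, n_bins=10):
        """Bright-feature density vs. normalized distance from the nuclear perimeter.

        Bin 0 lies at the perimeter and the last bin at the nuclear center; the
        density is the number of feature pixels per nuclear pixel in each shell.
        """
        dist = ndimage.distance_transform_edt(nucleus_mask)
        norm = dist / dist.max()                    # 0 at perimeter, 1 at center
        bins = np.linspace(0.0, 1.0, n_bins + 1)
        feat, _ = np.histogram(norm[feature_mask & nucleus_mask], bins=bins)
        area, _ = np.histogram(norm[nucleus_mask], bins=bins)
        return feat / np.maximum(area, 1)
    ```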

  10. Genomic typing of Escherichia coli O157:H7 by semi-automated fluorescent AFLP analysis.

    Science.gov (United States)

    Zhao, S; Mitchell, S E; Meng, J; Kresovich, S; Doyle, M P; Dean, R E; Casa, A M; Weller, J W

    2000-02-01

    Escherichia coli serotype O157:H7 isolates were analyzed using a relatively new DNA fingerprinting method, amplified fragment length polymorphism (AFLP). Total genomic DNA was digested with two restriction endonucleases (EcoRI and MseI), and compatible oligonucleotide adapters were ligated to the ends of the resulting DNA fragments. Subsets of fragments from the total pool of cleaved DNA were then amplified by the polymerase chain reaction (PCR) using selective primers that extended beyond the adapter and restriction site sequences. One of the primers from each set was labeled with a fluorescent dye, which enabled amplified fragments to be detected and sized automatically on an automated DNA sequencer. Three AFLP primer sets generated a total of 37 unique genotypes among the 48 E. coli O157:H7 isolates tested. Prior fingerprinting analysis of large restriction fragments from these same isolates by pulsed-field gel electrophoresis (PFGE) resulted in only 21 unique DNA profiles. Also, AFLP fingerprinting was successful for one DNA sample that was not typable by PFGE, presumably because of template degradation. AFLP analysis, therefore, provided greater genetic resolution and was less sensitive to DNA quality than PFGE. Consequently, this DNA typing technology should be very useful for genetic subtyping of bacterial pathogens in epidemiologic studies.

  11. Optimization and Quality Control of Automated Quantitative Mineralogy Analysis for Acid Rock Drainage Prediction

    Directory of Open Access Journals (Sweden)

    Robert Pooler

    2017-01-01

    Low ore-grade waste samples from the Codelco Andina mine, analyzed in an environmental and mineralogical test program for acid rock drainage prediction, revealed inconsistencies between the quantitative mineralogical data (QEMSCAN®) and the results of geochemical characterizations by atomic absorption spectroscopy (AAS), LECO® furnace, and sequential extractions. For the QEMSCAN® results, biases were observed in the proportions of pyrite and calcium sulfate minerals detected. An analysis of the results indicated that the problems observed were likely associated with polished section preparation. Therefore, six different sample preparation protocols were tested and evaluated using three samples from the previous study. One of the methods, which involved particle size reduction and transverse section preparation, was identified as having the greatest potential for correcting the errors observed in the mineralogical analyses. Further, the biases in the quantities of calcium sulfate minerals detected were reduced through the use of ethylene glycol as a polishing lubricant. It is recommended that the sample preparation methodology described in this study be used in order to accurately quantify percentages of pyrite and calcium sulfate minerals in environmental mineralogical studies which use automated mineralogical analysis.

  12. Experimental saltwater intrusion in coastal aquifers using automated image analysis: Applications to homogeneous aquifers

    Science.gov (United States)

    Robinson, G.; Ahmed, Ashraf A.; Hamill, G. A.

    2016-07-01

    This paper presents the application of a novel methodology to quantify saltwater intrusion parameters in laboratory-scale experiments. The methodology uses an automated image analysis procedure, minimising manual inputs and the systematic errors they can introduce. This allowed the quantification of the width of the mixing zone, which is difficult to measure in experimental methods based on visual observations. Glass beads of different grain sizes were tested under both steady-state and transient conditions. The transient results showed good correlation between experimental and numerical intrusion rates. The experimental intrusion rates revealed that the saltwater wedge reached a steady-state condition sooner while receding than while advancing. The hydrodynamics of the experimental mixing zone exhibited similar traits; a greater increase in the width of the mixing zone was observed in the receding saltwater wedge, which indicates faster fluid velocities and higher dispersion. The angle-of-intrusion analysis revealed the formation of a volume of diluted saltwater at the toe position when the saltwater wedge is prompted to recede. In addition, results of different physical repeats of the experiment produced an average coefficient of variation of less than 0.18 for the measured toe length and width of the mixing zone.
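
    One way to quantify the mixing-zone width from a normalized concentration field derived from the images is sketched below, assuming concentration varies monotonically with depth in each column; the threshold levels and field layout are illustrative assumptions, not the authors' exact procedure.

    ```python
    import numpy as np

    def mixing_zone_width(conc, dy, lo=0.25, hi=0.75):
        """Mixing-zone width per image column from a normalized concentration field.

        conc: 2D array (rows = depth, columns = horizontal position), values 0..1;
        dy: vertical pixel size. The width is the vertical extent over which the
        relative concentration lies between the lo and hi contour levels.
        """
        inside = (conc >= lo) & (conc <= hi)
        return inside.sum(axis=0) * dy
    ```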

  13. Automated simultaneous monitoring of nitrate and nitrite in surface water by sequential injection analysis.

    Science.gov (United States)

    Legnerová, Zlatuse; Solich, Petr; Sklenárová, Hana; Satínský, Dalibor; Karlícek, Rolf

    2002-06-01

    A fully automated procedure based on Sequential Injection Analysis (SIA) methodology for the simultaneous monitoring of nitrate and nitrite in surface water samples is described. Nitrite was determined directly using the Griess diazo-coupling reaction, and the azo dye formed was measured at 540 nm in the flow cell of a fibre-optic spectrophotometer. The nitrate zone was passed through a reducing mini-column containing copperised cadmium. After the reduction of nitrate to nitrite, the sample was aspirated by flow reversal to the holding coil, treated with the reagent, and finally passed through the flow cell. The calibration curve was linear over the range 0.05-1.00 mg N l(-1) for nitrite and 0.50-50.00 mg N l(-1) for nitrate; correlation coefficients were 0.9993 and 0.9988 for nitrite and nitrate, respectively. Detection limits were 0.015 and 0.10 mg N l(-1) for nitrite and nitrate, respectively. The relative standard deviation (RSD) values (n = 3) were 1.10% and 1.32% for nitrite and nitrate, respectively. The total time of one measuring cycle was 250 s, giving a sample throughput of about 14 h(-1). Nitrate and nitrite were determined in real samples of surface water, and the results were compared with those obtained by two other flow methods: flow injection analysis based on the same reactions, and an isotachophoretic determination used in a routine environmental control laboratory.
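
    For illustration, the calibration and detection-limit arithmetic can be reproduced with a least-squares fit; the standards below are invented, and the 3-sigma LOD convention is one common choice that may differ from the procedure used in the paper.

    ```python
    import numpy as np

    # Hypothetical nitrite standards (mg N/L) and absorbance readings at 540 nm
    conc   = np.array([0.05, 0.10, 0.25, 0.50, 0.75, 1.00])
    signal = np.array([0.021, 0.043, 0.108, 0.214, 0.323, 0.429])

    slope, intercept = np.polyfit(conc, signal, 1)
    fitted = slope * conc + intercept
    r = np.corrcoef(signal, fitted)[0, 1]          # correlation coefficient

    resid_sd = np.std(signal - fitted, ddof=2)     # residual standard deviation
    lod = 3 * resid_sd / slope                     # one common LOD convention
    print(f"slope = {slope:.4f} AU per mg N/L, r = {r:.4f}, LOD = {lod:.3f} mg N/L")
    ```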

  14. AVR Microcontroller-based automated technique for analysis of DC motors

    Science.gov (United States)

    Kaur, P.; Chatterji, S.

    2014-01-01

    This paper provides essential information on the development of a 'dc motor test and analysis control card' using the AVR series ATMega32 microcontroller. The card can be interfaced to a PC, calculates parameters such as motor losses and efficiency, and plots characteristics for dc motors. Different tests and methods are presently available to evaluate motor parameters, but this paper discusses a single, universal, user-friendly automated set-up. It has been accomplished by designing data acquisition and SCR bridge firing hardware based on the AVR ATMega32 microcontroller. This hardware has the capability to drive phase-controlled rectifiers and acquire real-time values of current, voltage, temperature and speed of the motor. The various analyses feasible with the designed hardware are of immense importance for dc motor manufacturers and quality-sensitive users. Through this paper, the authors aim to provide details of this AVR-based hardware, which can be used for dc motor parameter analysis and also for motor control applications.
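
    The parameter calculations such a card supports are straightforward; a hedged sketch of the kind of post-processing involved, with all measurement values invented for illustration:

    ```python
    import math

    def dc_motor_summary(v_arm, i_arm, torque_nm, speed_rpm):
        """Input power, output power, total losses and efficiency of a DC motor."""
        p_in = v_arm * i_arm                            # electrical input power (W)
        omega = 2.0 * math.pi * speed_rpm / 60.0        # shaft speed (rad/s)
        p_out = torque_nm * omega                       # mechanical output power (W)
        return {"P_in_W": p_in, "P_out_W": p_out,
                "losses_W": p_in - p_out, "efficiency": p_out / p_in}

    # All values invented for illustration
    print(dc_motor_summary(v_arm=220.0, i_arm=8.5, torque_nm=11.0, speed_rpm=1450))
    ```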

  15. Sample preparation and in situ hybridization techniques for automated molecular cytogenetic analysis of white blood cells

    Energy Technology Data Exchange (ETDEWEB)

    Rijke, F.M. van de; Vrolijk, H.; Sloos, W. [Leiden Univ. (Netherlands)] [and others]

    1996-06-01

    With the advent of in situ hybridization techniques for the analysis of chromosome copy number or structure in interphase cells, the diagnostic and prognostic potential of cytogenetics has been augmented considerably. In theory, the strategies for detection of cytogenetically aberrant cells by in situ hybridization are simple and straightforward. In practice, however, they are fallible, because false classification of hybridization spot number or patterns occurs. When a decision has to be made on the molecular cytogenetic normalcy or abnormalcy of a cell sample, the problem of false classification becomes particularly prominent if the fraction of aberrant cells is relatively small. In such mosaic situations, often > 200 cells have to be evaluated to reach a statistically sound figure. The manual enumeration of in situ hybridization spots in many cells in many patient samples is tedious. Assistance in the evaluation process by automation of microscope functions and image analysis techniques is, therefore, strongly indicated. Next to research and development of microscope hardware, camera technology, and image analysis, the optimization of the specimen for (semi)automated microscopic analysis is essential, since factors such as cell density, thickness, and overlap have dramatic influences on the speed and complexity of the analysis process. Here we describe experiments that have led to a protocol for blood cell specimens that results in microscope preparations well suited for automated molecular cytogenetic analysis. 13 refs., 4 figs., 1 tab.

  16. Automated quantification and sizing of unbranched filamentous cyanobacteria by model-based object-oriented image analysis.

    Science.gov (United States)

    Zeder, Michael; Van den Wyngaert, Silke; Köster, Oliver; Felder, Kathrin M; Pernthaler, Jakob

    2010-03-01

    Quantification and sizing of filamentous cyanobacteria in environmental samples or cultures are time-consuming and are often performed by using manual or semiautomated microscopic analysis. Automation of conventional image analysis is difficult because filaments may exhibit great variations in length and patchy autofluorescence. Moreover, individual filaments frequently cross each other in microscopic preparations, as deduced by modeling. This paper describes a novel approach based on object-oriented image analysis to simultaneously determine (i) filament number, (ii) individual filament lengths, and (iii) the cumulative filament length of unbranched cyanobacterial morphotypes in fluorescent microscope images in a fully automated high-throughput manner. Special emphasis was placed on correct detection of overlapping objects by image analysis and on appropriate coverage of filament length distribution by using large composite images. The method was validated with a data set for Planktothrix rubescens from field samples and was compared with manual filament tracing, the line intercept method, and the Utermöhl counting approach. The computer program described allows batch processing of large images from any appropriate source and annotation of detected filaments. It requires no user interaction, is available free, and thus might be a useful tool for basic research and drinking water quality control.
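
    A simplified sketch of automated filament sizing via skeletonization, using scikit-image; the paper's model-based handling of crossing filaments is deliberately omitted here, so overlapping filaments would merge into one component.

    ```python
    import numpy as np
    from skimage.measure import label, regionprops
    from skimage.morphology import skeletonize

    def filament_lengths(binary_image, pixel_size_um):
        """Approximate per-filament lengths from a binary fluorescence image.

        Each connected component is skeletonized and its skeleton pixel count,
        scaled by the pixel size, is taken as the filament length.
        """
        return np.array([skeletonize(r.image).sum() * pixel_size_um
                         for r in regionprops(label(binary_image))])
    ```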

  17. A simple viability analysis for unicellular cyanobacteria using a new autofluorescence assay, automated microscopy, and ImageJ

    Directory of Open Access Journals (Sweden)

    Schulze Katja

    2011-11-01

    Background: Currently established methods to identify viable and non-viable cells of cyanobacteria are either time-consuming (e.g. plating) or preparation-intensive (e.g. fluorescent staining). In this paper we present a new and fast viability assay for unicellular cyanobacteria, which uses red chlorophyll fluorescence and an unspecific green autofluorescence for the differentiation of viable and non-viable cells without the need for sample preparation. Results: The viability assay for unicellular cyanobacteria using red and green autofluorescence was established and validated for the model organism Synechocystis sp. PCC 6803. Both autofluorescence signals could be observed simultaneously, allowing a direct classification of viable and non-viable cells. The results were confirmed by plating/colony count, absorption spectra and chlorophyll measurements. The use of an automated fluorescence microscope and a novel ImageJ-based image analysis plugin allows a semi-automated analysis. Conclusions: The new method simplifies the process of viability analysis and allows a quick and accurate analysis. Furthermore, the results indicate that a combination of the new assay with absorption spectra or chlorophyll concentration measurements allows the estimation of the vitality of cells.

  18. Evaporation from weighing precipitation gauges: impacts on automated gauge measurements and quality assurance methods

    Directory of Open Access Journals (Sweden)

    R. D. Leeper

    2014-12-01

    The effects of evaporation on precipitation measurements have been understood to bias total precipitation low. For automated weighing-bucket gauges, the World Meteorological Organization (WMO) suggests the use of evaporative suppressants with frequent observations. However, the use of evaporation suppressants is not always feasible due to environmental hazards and the added cost of maintenance, transport, and disposal of the gauge additive. In addition, research has suggested that evaporation prior to precipitation may affect precipitation measurements from auto-recording gauges operating at sub-hourly frequencies. For further evaluation, a field campaign was conducted to monitor evaporation and its impacts on the quality of precipitation measurements from gauges used at US Climate Reference Network (USCRN) stations. Collocated Geonor gauges with (nonEvap) and without (evap) an evaporative suppressant were compared to evaluate evaporative losses and evaporation biases on precipitation measurements. From June to August, evaporative losses from the evap gauge exceeded accumulated precipitation, with an average loss of 0.12 mm h−1. However, the impact of evaporation on precipitation measurements was sensitive to the calculation method. In general, methods that utilized a longer time series to smooth out sensor noise were more sensitive to gauge evaporation (−4.6% bias with respect to the control) than methods computing depth change without smoothing (< +1% bias). These results indicate that, while climate and gauge design affect gauge evaporation rates, computational methods can influence the magnitude of the evaporation bias on precipitation measurements. It is hoped this study will advance QA techniques that mitigate the impact of evaporation biases on precipitation measurements from other automated networks.

  19. An Automated High Throughput Proteolysis and Desalting Platform for Quantitative Proteomic Analysis

    Directory of Open Access Journals (Sweden)

    Albert-Baskar Arul

    2013-06-01

    Proteomics for biomarker validation needs high-throughput instrumentation to analyze large sets of clinical samples with quantitative, reproducible results in minimal time and without manual experimental errors. Sample preparation, a vital step in proteomics, plays a major role in the identification and quantification of proteins from biological samples. Tryptic digestion, a major checkpoint in sample preparation for mass spectrometry-based proteomics, needs to be more accurate with rapid processing times. The present study focuses on establishing a high-throughput automated online system for the proteolytic digestion and desalting of proteins from biological samples in a quantitative, qualitative, and reproducible manner. The study compares online protein digestion and desalting of BSA with the conventional off-line (in-solution) method and validates the system on a real sample for reproducibility. Proteins were identified using the SEQUEST database search engine and the data were quantified using the IDEALQ software. The study shows that the online system, capable of handling high-throughput samples in a 96-well format, carries out protein digestion and peptide desalting efficiently in a reproducible and quantitative manner. Label-free quantification showed a clear increase of peptide quantities with increasing concentration, with much better linearity compared to the off-line method. Hence, we suggest that inclusion of this online system in the proteomic pipeline will be effective for protein quantification in comparative proteomics, where quantification is crucial.

  20. Automated Brain Image classification using Neural Network Approach and Abnormality Analysis

    Directory of Open Access Journals (Sweden)

    P. Muthu Krishnammal

    2015-06-01

    Image segmentation of surgical images plays an important role in diagnosing and analyzing the anatomical structure of the human body. Magnetic Resonance Imaging (MRI) helps in obtaining a structural image of internal parts of the body. This paper aims at developing an automatic support system for stage classification using a learning machine, and at detecting brain tumors by fuzzy clustering methods in their early stages and analyzing anatomical structures. The three stages involved are: feature extraction using GLCM, tumor classification using a PNN-RBF network, and segmentation using SFCM. Here, the fast discrete curvelet transform is used to analyze the texture of an image, which serves as a base for a Computer Aided Diagnosis (CAD) system. The Probabilistic Neural Network with a radial basis function is employed to implement automated brain tumor classification: it automatically classifies the stage of a brain tumor as benign, malignant, or normal. The brain abnormality is then segmented using spatial FCM, and the severity of the tumor is analyzed using the number of tumor cells in the detected abnormal region. The proposed method reports promising results in terms of training performance and classification accuracy.
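
    The GLCM feature-extraction stage can be illustrated with scikit-image (in versions before 0.19 the functions are spelled greycomatrix/greycoprops); the chosen distances, angles, and properties below are assumptions for illustration, not the paper's configuration.

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    def glcm_features(image_8bit):
        """Rotation-averaged GLCM texture features for an 8-bit image region."""
        glcm = graycomatrix(image_8bit,
                            distances=[1],
                            angles=[0, np.pi / 4, np.pi / 2, 3 * np.pi / 4],
                            levels=256, symmetric=True, normed=True)
        return {p: graycoprops(glcm, p).mean()
                for p in ("contrast", "homogeneity", "energy", "correlation")}
    ```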

  1. Automated nanoliter solution deposition for total reflection X-ray fluorescence analysis of semiconductor samples

    Energy Technology Data Exchange (ETDEWEB)

    Sparks, Chris M. [Process Characterization Laboratory, ATDF, Austin, TX 78741 (United States)]. E-mail: chris.sparks@atdf.com; Gondran, Carolyn H. [Process Characterization Laboratory, ATDF, Austin, TX 78741 (United States); Havrilla, George J. [Chemistry Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States); Hastings, Elizabeth P. [Chemistry Division, Los Alamos National Laboratory, Los Alamos, NM 87545 (United States)

    2006-11-15

    In this study, a BioDot BioJet dispensing system was investigated as a nanoliter sample deposition method for total reflection X-ray fluorescence (TXRF) analysis. The BioDot system was programmed to dispense arrays of 20 nL droplets of sample solution on Si wafers. Each 20 nL droplet was approximately 100 µm in diameter. A 10 x 10 array (100 droplets) was deposited and dried in less than 2 min at room temperature and pressure, demonstrating the efficiency of the automated deposition method. Solutions of various concentrations of Ni, and of Ni in different matrices, were made from stock trace element standards to investigate the effect of the matrix on the TXRF signal. The concentrations were such that the levels of TXRF signal saturation could be examined. Arrays were deposited to demonstrate the capability of drying 100 µL of vapor phase decomposition-like residue in the area of a typical TXRF detector.

  2. Automated segmentation of muscle and adipose tissue on CT images for human body composition analysis

    Science.gov (United States)

    Chung, Howard; Cobzas, Dana; Birdsell, Laura; Lieffers, Jessica; Baracos, Vickie

    2009-02-01

    The ability to compute body composition in cancer patients lends itself to determining the specific clinical outcomes associated with fat and lean tissue stores. For example, a wasting syndrome of advanced disease is associated with shortened survival. Moreover, certain tissue compartments represent sites for drug distribution and are likely determinants of chemotherapy efficacy and toxicity. CT images are abundant, but they cannot be fully exploited unless practical and fast approaches for tissue quantification exist. Here we propose a fully automated method for segmenting muscle, visceral and subcutaneous adipose tissues, taking the approach of shape modeling for the analysis of skeletal muscle. Muscle shape is represented using PCA-encoded free-form deformations with respect to a mean shape. The shape model is learned from manually segmented images and used in conjunction with a tissue appearance prior. VAT and SAT are segmented based on the final deformed muscle shape. In comparing the automatic and manual methods, coefficients of variation (COV) (1-2%) were similar to or smaller than the inter- and intra-observer COVs reported for manual segmentation.
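
    The shape-statistics idea behind the muscle model can be sketched with ordinary landmark PCA; note the paper actually encodes free-form deformation grids with PCA, which this simplified stand-in does not reproduce, and all data below are synthetic.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA

    # Synthetic stand-in: 30 aligned training contours, 50 (x, y) landmarks each
    rng = np.random.default_rng(1)
    shapes = rng.normal(size=(30, 100))

    pca = PCA(n_components=5).fit(shapes)

    # New plausible shapes: the mean shape plus a weighted sum of the main modes,
    # with weights expressed in standard deviations of each mode.
    b = np.array([1.5, -0.5, 0.0, 0.0, 0.0]) * np.sqrt(pca.explained_variance_)
    new_shape = pca.mean_ + pca.components_.T @ b
    ```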

  3. Automated detection of regions of interest for tissue microarray experiments: an image texture analysis

    Directory of Open Access Journals (Sweden)

    Tözeren Aydin

    2007-03-01

    Background: Recent research with tissue microarrays led to rapid progress toward quantifying the expression of large sets of biomarkers in normal and diseased tissue. However, standard procedures for sampling tissue for molecular profiling have not yet been established. Methods: This study presents a high-throughput analysis of texture heterogeneity on breast tissue images for the purpose of identifying regions of interest in the tissue for molecular profiling via tissue microarray technology. Image texture of breast histology slides was described in terms of three parameters: the percentage of area occupied in an image block by chromatin (B), the percentage occupied by stroma-like regions (P), and a statistical heterogeneity index (H) commonly used in image analysis. Texture parameters were defined and computed for each of the thousands of image blocks in our dataset using both gray scale and color segmentation. The image blocks were then classified into three categories using the texture feature parameters in a novel statistical learning algorithm: image blocks specific to normal breast tissue, blocks specific to cancerous tissue, and image blocks non-specific to normal and disease states. Results: Gray scale and color segmentation techniques led to identification of the same regions in histology slides as cancer-specific. Moreover, the image blocks identified as cancer-specific belonged to those cell-crowded regions in whole-section image slides that were marked by two pathologists as regions of interest for further histological studies. Conclusion: These results indicate the high efficiency of our automated method for identifying pathologic regions of interest on histology slides. Automation of critical region identification will help minimize the inter-rater variability among different raters (pathologists) as hundreds of tumors that are used to develop an array have typically been evaluated

  4. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    While increasing evidence appoints diverse types of RNA as key players in the regulatory networks underlying cellular differentiation and metabolism, the potential functions of thousands of conserved RNA structures encoded in mammalian genomes remain to be determined. Since the functions of most...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA......-protein pulldown combined with mass spectrometry analysis is applied for in vivo as well as in vitro identification of RNA-binding proteins, the latter succeeding in verifying known RNA-protein interactions. Secondly, acknowledging the significance of flexible promoter usage for the diversification...

  5. Analysis of the thoracic aorta using a semi-automated post processing tool

    Energy Technology Data Exchange (ETDEWEB)

    Entezari, Pegah, E-mail: p-entezari@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Kino, Aya, E-mail: ayakino@gmail.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Honarmand, Amir R., E-mail: arhonarmand@yahoo.com [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Galizia, Mauricio S., E-mail: maugalizia@yahoo.com.br [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yang, Yan, E-mail: yyang@vitalimages.com [Vital images Inc, Minnetonka, MN (United States); Collins, Jeremy, E-mail: collins@fsm.northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Yaghmai, Vahid, E-mail: vyaghmai@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States); Carr, James C., E-mail: jcarr@northwestern.edu [Department of Radiology, Cardiovascular Imaging, Northwestern University, Chicago, IL (United States)

    2013-09-15

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters, outer wall to outer wall, were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreement was evaluated using the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center-line adjustment for the semi-automated method, as well as the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of the center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while the measurements took 8.26 ± 2.1 min with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels, with the highest and lowest numbers of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  6. Semi-automated competitive protein binding analysis of serum thyroxine on reusable Sephadex columns and its advantages over radioimmunoassay.

    Science.gov (United States)

    Alexander, N M

    1976-06-01

    Competitive protein-binding analysis of serum thyroxine on small, reusable Sephadex columns has been further studied and improved. The improved, semi-automated procedure results in reduced working time and costs. It has also been established that triiodothyronine cross-reacts only 1/6 to 1/9 as well as thyroxine, and can be ignored because it represents only about 1/80 of the total serum iodothyronine content. The economic and methodological advantages of the improved method over radioimmunoassay and other displacement assays are discussed.

  7. Automated detection, 3D segmentation and analysis of high resolution spine MR images using statistical shape models

    Science.gov (United States)

    Neubert, A.; Fripp, J.; Engstrom, C.; Schwarz, R.; Lauer, L.; Salvado, O.; Crozier, S.

    2012-12-01

    Recent advances in high resolution magnetic resonance (MR) imaging of the spine provide a basis for the automated assessment of intervertebral disc (IVD) and vertebral body (VB) anatomy. High resolution three-dimensional (3D) morphological information contained in these images may be useful for early detection and monitoring of common spine disorders, such as disc degeneration. This work proposes an automated approach to extract 3D segmentations of lumbar and thoracic IVDs and VBs from MR images using statistical shape analysis and registration of grey level intensity profiles. The algorithm was validated on a dataset of volumetric scans of the thoracolumbar spine of asymptomatic volunteers obtained on a 3T scanner using the relatively new 3D T2-weighted SPACE pulse sequence. Manual segmentations and expert radiological findings of early signs of disc degeneration were used in the validation. There was good agreement between manual and automated segmentation of the IVD and VB volumes, with mean Dice scores of 0.89 ± 0.04 and 0.91 ± 0.02 and mean absolute surface distances of 0.55 ± 0.18 mm and 0.67 ± 0.17 mm, respectively. The method compares favourably to existing 3D MR segmentation techniques for VBs. This is the first time IVDs have been automatically segmented from 3D volumetric scans, and the shape parameters obtained were used in preliminary analyses to accurately classify (100% sensitivity, 98.3% specificity) disc abnormalities associated with early degenerative changes.
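
    The validation metric is easy to reproduce; a minimal sketch of the Dice overlap score used above, for two binary masks of equal shape (mask names are placeholders):

    ```python
    import numpy as np

    def dice(a, b):
        """Dice overlap between two binary segmentation masks of equal shape."""
        a, b = a.astype(bool), b.astype(bool)
        return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

    # Example usage: dice(automated_mask, manual_mask) -> value in [0, 1]
    ```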

  8. RBioplot: an easy-to-use R pipeline for automated statistical analysis and data visualization in molecular biology and biochemistry

    Directory of Open Access Journals (Sweden)

    Jing Zhang

    2016-09-01

    Background: Statistical analysis and data visualization are two crucial aspects in molecular biology and biochemistry. For analyses that compare one dependent variable between standard (e.g., control) and one or multiple independent variables, a comprehensive yet highly streamlined solution is valuable. The computer programming language R is a popular platform for researchers to develop tools that are tailored specifically for their research needs. Here we present an R package, RBioplot, that takes raw input data for automated statistical analysis and plotting, highly compatible with various molecular biology and biochemistry lab techniques, such as, but not limited to, western blotting, PCR, and enzyme activity assays. Method: The package is built on workflows operating on a simple raw data layout, with minimal user input or data manipulation required. The package is distributed through GitHub and can be easily installed through a single-line R command. A detailed installation guide is available at http://kenstoreylab.com/?page_id=2448. Users can also download demo datasets from the same website. Results and Discussion: By integrating selected functions from existing statistical and data visualization packages with extensive customization, RBioplot features both statistical analysis and data visualization functionalities. Key properties of RBioplot include:
    - fully automated and comprehensive statistical analysis, including normality tests, equal variance tests, Student's t-test and ANOVA (with post-hoc tests);
    - fully automated histogram, heatmap and joint-point curve plotting modules;
    - detailed output files for statistical analysis, data manipulation and high-quality graphs;
    - axis range finding and user-customizable tick settings;
    - high user-customizability.

  9. RBioplot: an easy-to-use R pipeline for automated statistical analysis and data visualization in molecular biology and biochemistry

    Science.gov (United States)

    Zhang, Jing

    2016-01-01

    Background: Statistical analysis and data visualization are two crucial aspects in molecular biology and biochemistry. For analyses that compare one dependent variable between standard (e.g., control) and one or multiple independent variables, a comprehensive yet highly streamlined solution is valuable. The computer programming language R is a popular platform for researchers to develop tools that are tailored specifically for their research needs. Here we present an R package, RBioplot, that takes raw input data for automated statistical analysis and plotting, highly compatible with various molecular biology and biochemistry lab techniques, such as, but not limited to, western blotting, PCR, and enzyme activity assays. Method: The package is built on workflows operating on a simple raw data layout, with minimal user input or data manipulation required. The package is distributed through GitHub and can be easily installed through a single-line R command. A detailed installation guide is available at http://kenstoreylab.com/?page_id=2448. Users can also download demo datasets from the same website. Results and Discussion: By integrating selected functions from existing statistical and data visualization packages with extensive customization, RBioplot features both statistical analysis and data visualization functionalities. Key properties of RBioplot include:
    - fully automated and comprehensive statistical analysis, including normality tests, equal variance tests, Student's t-test and ANOVA (with post-hoc tests);
    - fully automated histogram, heatmap and joint-point curve plotting modules;
    - detailed output files for statistical analysis, data manipulation and high-quality graphs;
    - axis range finding and user-customizable tick settings;
    - high user-customizability. PMID:27703842

  10. Photogrammetry-Based Automated Measurements for Tooth Shape and Occlusion Analysis

    Science.gov (United States)

    Knyaz, V. A.; Gaboutchian, A. V.

    2016-06-01

    Tooth measurements (odontometry) are performed for various scientific and practical applications, including dentistry. Present-day techniques are increasingly based on the use of 3D models, which provide wider prospects in comparison to measurements on real objects: teeth or their plaster copies. The main advantages emerge through the application of new measurement methods which provide the needed degree of non-invasiveness, precision, convenience and detail. Tooth measurement has always been regarded as time-consuming research, even more so with the use of new methods, due to their wider opportunities. This is where automation becomes essential for the further development and implementation of measurement techniques. In our research, automation in obtaining 3D models and automation of measurements provided essential data that were analysed to suggest recommendations for tooth preparation, one of the most critical clinical procedures in prosthetic dentistry, within a comparatively short period of time. The original photogrammetric 3D reconstruction system allows generation of 3D models of dental arches, reproduction of their closure (occlusion), and performance of a set of standard measurements in automated mode.

  11. GenePublisher: automated analysis of DNA microarray data

    DEFF Research Database (Denmark)

    Knudsen, Steen; Workman, Christopher; Sicheritz-Ponten, T.

    2003-01-01

    GenePublisher, a system for automatic analysis of data from DNA microarray experiments, has been implemented with a web interface at http://www.cbs.dtu.dk/services/GenePublisher. Raw data are uploaded to the server together with a specification of the data. The server performs normalization......, statistical analysis and visualization of the data. The results are run against databases of signal transduction pathways, metabolic pathways and promoter sequences in order to extract more information. The results of the entire analysis are summarized in report form and returned to the user.

  12. A Method to Identify and Analyze Biological Programs through Automated Reasoning

    Science.gov (United States)

    Kugler, Hillel; Smith, Austin; Martello, Graziano; Emmott, Stephen

    2016-01-01

    Predictive biology is elusive because rigorous, data-constrained, mechanistic models of complex biological systems are difficult to derive and validate. Current approaches tend to construct and examine static interaction network models, which are descriptively rich but often lack explanatory and predictive power, or dynamic models that can be simulated to reproduce known behavior. However, in such approaches implicit assumptions are introduced as typically only one mechanism is considered, and exhaustively investigating all scenarios is impractical using simulation. To address these limitations, we present a methodology based on automated formal reasoning, which permits the synthesis and analysis of the complete set of logical models consistent with experimental observations. We test hypotheses against all candidate models, and remove the need for simulation by characterizing and simultaneously analyzing all mechanistic explanations of observed behavior. Our methodology transforms knowledge of complex biological processes from sets of possible interactions and experimental observations to precise, predictive biological programs governing cell function. PMID:27668090

  13. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  14. Infrascope: Full-Spectrum Phonocardiography with Automated Signal Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Using digital signal analysis tools, we will generate a repeatable output from the infrascope and compare it to the output of a traditional electrocardiogram, and...

  15. Automated Techniques for Rapid Analysis of Momentum Exchange Devices

    Science.gov (United States)

    2013-12-01

    Contiguousness At this point, it is necessary to introduce the concept of contiguousness. In this thesis, a state space analysis representation is... concept of contiguousness was established to ensure that the results of the analysis would allow for the CMGs to reach every state in the defined...forces at the attachment points of the RWs and CMGs throughout a spacecraft maneuver. Current pedagogy on this topic focuses on the transfer of

  16. Modelling application for cognitive reliability and error analysis method

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2013-10-01

    The automation of production systems has delegated to machines the execution of highly repetitive and standardized tasks. In the last decade, however, the failure of the fully automatic factory model has led to partially automated configurations of production systems. In this scenario, the centrality and responsibility of the role entrusted to human operators is heightened, because the work requires problem-solving and decision-making ability. The human operator is thus the core of a cognitive process that leads to decisions, and their reliability influences the safety of the whole system. The aim of this paper is to propose a modelling application for the cognitive reliability and error analysis method.

  17. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA) ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  18. Development of a full automation solid phase microextraction method for investigating the partition coefficient of organic pollutant in complex sample.

    Science.gov (United States)

    Jiang, Ruifen; Lin, Wei; Wen, Sijia; Zhu, Fang; Luan, Tiangang; Ouyang, Gangfeng

    2015-08-07

    A fully automated solid phase microextraction (SPME) depletion method was developed to study the partition coefficients of organic compounds between a complex matrix and a water sample. The SPME depletion process was conducted by pre-loading the fiber with a specific amount of organic compound from a proposed standard gas generation vial, and then desorbing the fiber into the targeted samples. Based on the proposed method, the partition coefficients (Kmatrix) of 4 polyaromatic hydrocarbons (PAHs) between humic acid (HA)/hydroxypropyl-β-cyclodextrin (β-HPCD) and aqueous sample were determined. The results showed that the logKmatrix values of the 4 PAHs with HA and β-HPCD ranged from 3.19 to 4.08 and from 2.45 to 3.15, respectively. In addition, the logKmatrix values decreased by about 0.12-0.27 log units for every 10°C increase in temperature. The effect of temperature on the partition coefficient followed the van't Hoff plot, and the partition coefficient at any temperature can be predicted from the plot. Furthermore, the proposed method was applied to real biological fluid analysis. The partition coefficients of 6 PAHs between the complex matrices in fetal bovine serum and water were determined and compared to those obtained from the SPME extraction method. The results demonstrated that the proposed method can be applied to determine the sorption coefficients of hydrophobic compounds between complex matrices and water in a variety of samples.
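
    The temperature dependence can be illustrated with a van't Hoff-style fit, linear in 1/T; the three data points below are invented for illustration and are not the paper's measurements.

    ```python
    import numpy as np

    # Invented example: log K(matrix) measured at three temperatures (Kelvin)
    T    = np.array([288.15, 298.15, 308.15])
    logK = np.array([3.45, 3.26, 3.08])

    # van't Hoff behaviour: log K is linear in 1/T, so fit once and extrapolate
    slope, intercept = np.polyfit(1.0 / T, logK, 1)

    def logK_at(temp_K):
        return slope / temp_K + intercept

    print(f"predicted logK at 293.15 K: {logK_at(293.15):.2f}")
    ```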

  19. Automated four color CD4/CD8 analysis of leukocytes by scanning fluorescence microscopy using Quantum dots

    Science.gov (United States)

    Bocsi, Jozsef; Mittag, Anja; Varga, Viktor S.; Molnar, Bela; Tulassay, Zsolt; Sack, Ulrich; Lenz, Dominik; Tarnok, Attila

    2006-02-01

    The Scanning Fluorescence Microscope (SFM) is a new technique for automated motorized microscopes to measure multiple fluorochrome-labeled cells (Bocsi et al. Cytometry 2004, 61A:1). The CD4+/CD8+ cell ratio is important in immune diagnostics of immunodeficiency and HIV. Therefore, a four-color staining protocol (DNA, CD3, CD4 and CD8) for automated SFM analysis of lymphocytes was developed. EDTA-anticoagulated blood was stained with organic and inorganic (Quantum dot) fluorochromes in different combinations. Aliquots of samples were measured by Flow Cytometry (FCM) and SFM. For SFM, specimens were scanned and digitized using four fluorescence filter sets. Automated cell detection (based on Hoechst 33342 fluorescence) and CD3, CD4 and CD8 detection were performed, and the CD4/CD8 ratio was calculated. Fluorescence signals were well separable by SFM and FCM. Passing and Bablok regression of all CD4/CD8 ratios obtained by FCM and SFM (F(X)=0.0577+0.9378x) lies in the 95% confidence interval. The cusum test did not show significant deviation from linearity (P>0.10). This comparison indicates that there is no systematic bias between the two methods. In SFM analyses, the inorganic Quantum dot staining was very stable in PBS, in contrast to the organic fluorescent dyes, but bleached shortly after mounting with antioxidant and free-radical scavenger mounting media. This shows the difficulty of combining organic dyes and Quantum dots. A slide-based multi-fluorescence labeling system and automated SFM are applicable tools for CD4/CD8 ratio determination in peripheral blood samples. Quantum dots are stable inorganic fluorescence labels that may be used as reliable high-resolution dyes for cell labeling.

  20. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods.

    Science.gov (United States)

    Suleimanov, Yury V; Green, William H

    2015-09-08

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using double- and single-ended transition-state optimization algorithms in cooperation: the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several single-molecule systems of importance in combustion and atmospheric chemistry is investigated. The proposed algorithm allowed us to detect, without any human intervention, not only "known" reaction pathways, manually detected in previous studies, but also new, previously "unknown" reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  1. Automated Discovery of Elementary Chemical Reaction Steps Using Freezing String and Berny Optimization Methods

    CERN Document Server

    Suleimanov, Yury V

    2015-01-01

    We present a simple protocol which allows fully automated discovery of elementary chemical reaction steps using single- and double-ended transition-state optimization algorithms in cooperation: the freezing string and Berny optimization methods, respectively. To demonstrate the utility of the proposed approach, the reactivity of several systems of importance in combustion and atmospheric chemistry is investigated. The proposed algorithm allowed us to detect, without any human intervention, not only "known" reaction pathways, manually detected in previous studies, but also new, previously "unknown" reaction pathways which involve significant atom rearrangements. We believe that applying such a systematic approach to elementary reaction path finding will greatly accelerate the discovery of new chemistry and will lead to more accurate computer simulations of various chemical processes.

  2. A method for automating calibration and records management for instrumentation and dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, J.M. Jr.; Rushton, R.O.; Burns, R.E. Jr. [Atlan-Tech, Inc., Roswell, GA (United States)

    1993-12-31

    Current industry requirements are becoming more stringent on quality assurance records and documentation for calibration of instruments and dosimetry. A novel method is presented here that will allow a progressive automation scheme to be used in pursuit of that goal. This concept is based on computer-controlled irradiators that can act as stand-alone devices or be interfaced to other components via a computer local area network. In this way, complete systems can be built with modules to create a records management system to meet the needs of small laboratories or large multi-building calibration groups. Different database engines or formats can be used simply by replacing a module. Modules for temperature and pressure monitoring or shipping and receiving can be added, as well as equipment modules for direct IEEE-488 interface to electrometers and other instrumentation.

  3. Device and method for automated separation of a sample of whole blood into aliquots

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.

    1989-01-01

    A device and a method for automated processing and separation of an unmeasured sample of whole blood into multiple aliquots of plasma. Capillaries are radially oriented on a rotor, with the rotor defining a sample chamber, transfer channels, overflow chamber, overflow channel, vent channel, cell chambers, and processing chambers. A sample of whole blood is placed in the sample chamber, and when the rotor is rotated, the blood moves outward through the transfer channels to the processing chambers where the blood is centrifugally separated into a solid cellular component and a liquid plasma component. When the rotor speed is decreased, the plasma component backfills the capillaries resulting in uniform aliquots of plasma which may be used for subsequent analytical procedures.

  4. METHOD OF PLANNING CONTROL AUTOMATED SYSTEM INTEGRITY IN THE OPERATION AND SUPPORT

    Directory of Open Access Journals (Sweden)

    Anton S. Lysunets

    2014-01-01

    Full Text Available The article addresses the problem of monitoring the integrity of sophisticated automated banking systems during operation and support. An understanding of the system's integrity level is developed using integrity-level estimates within a planning methodology for testing the automated banking system. The results of the article can be used in courses on Software Engineering and Banking Systems Automation.

  5. Mouse Social Interaction Test (MoST): a quantitative computer automated analysis of behavior.

    Science.gov (United States)

    Thanos, Panayotis K; Restif, Christophe; O'Rourke, Joseph R; Lam, Chiu Yin; Metaxas, Dimitris

    2017-01-01

    Rodents are the most commonly used preclinical model of human disease: they are used to assess the mechanisms involved and the roles of genetics, epigenetics, and pharmacotherapy, and to identify vulnerability factors and assess risk, all of which are critical for developing improved treatment strategies. Unfortunately, the majority of rodent preclinical studies use single-housed approaches, in which animals are either housed and tested in solitary environments or group-housed but tested in solitary environments. This approach, however, ignores the important contribution of social interaction and social behavior. Social interaction in rodents is a major criterion for the ethological validity of rodent species-specific behavioral characteristics (Zurn et al. 2007; Analysis 2011). A significant and growing number of reports also illustrates the important role of social environment and social interaction in all diseases, with particular significance in neuropsychiatric diseases. It is therefore imperative that research studies be able to add large-scale evaluations of social interaction and behavior in mice, and to benefit from automated tracking of behaviors and measurements that removes user bias and quantifies aspects of behavior that cannot be assessed by a human observer. Single-mouse setups have been used routinely, but cannot easily be extended to multiple-animal studies where social behavior is key, e.g., autism, depression, anxiety, substance and non-substance addictive disorders, aggression, sexual behavior, or parenting. While recent efforts focus on multiple-animal tracking alone, a significant limitation remains the lack of insightful measures of social interactions. We present a novel, non-invasive, single-camera automated tracking method, the Mouse Social Test (MoST), and a set of measures designed for estimating the interactions of multiple mice at the

  6. Semi-automated calibration method for modelling of mountain permafrost evolution in Switzerland

    Science.gov (United States)

    Marmy, Antoine; Rajczak, Jan; Delaloye, Reynald; Hilbich, Christin; Hoelzle, Martin; Kotlarski, Sven; Lambiel, Christophe; Noetzli, Jeannette; Phillips, Marcia; Salzmann, Nadine; Staub, Benno; Hauck, Christian

    2016-11-01

    Permafrost is a widespread phenomenon in mountainous regions of the world such as the European Alps. Many important topics such as the future evolution of permafrost related to climate change and the detection of permafrost related to potential natural hazards sites are of major concern to our society. Numerical permafrost models are the only tools which allow for the projection of the future evolution of permafrost. Due to the complexity of the processes involved and the heterogeneity of Alpine terrain, models must be carefully calibrated, and results should be compared with observations at the site (borehole) scale. However, for large-scale applications, a site-specific model calibration for a multitude of grid points would be very time-consuming. To tackle this issue, this study presents a semi-automated calibration method using the Generalized Likelihood Uncertainty Estimation (GLUE) as implemented in a 1-D soil model (CoupModel) and applies it to six permafrost sites in the Swiss Alps. We show that this semi-automated calibration method is able to accurately reproduce the main thermal condition characteristics with some limitations at sites with unique conditions such as 3-D air or water circulation, which have to be calibrated manually. The calibration obtained was used for global and regional climate model (GCM/RCM)-based long-term climate projections under the A1B climate scenario (EU-ENSEMBLES project) specifically downscaled at each borehole site. The projection shows general permafrost degradation with thawing at 10 m, even partially reaching 20 m depth by the end of the century, but with different timing among the sites and with partly considerable uncertainties due to the spread of the applied climatic forcing.

  7. Dental wear estimation using a digital intra-oral optical scanner and an automated 3D computer vision method.

    Science.gov (United States)

    Meireles, Agnes Batista; Vieira, Antonio Wilson; Corpas, Livia; Vandenberghe, Bart; Bastos, Flavia Souza; Lambrechts, Paul; Campos, Mario Montenegro; Las Casas, Estevam Barbosa de

    2016-01-01

    The objective of this work was to propose an automated and direct process to grade tooth wear intra-orally. Eight extracted teeth were etched with acid for different times to produce wear and scanned with an intra-oral optical scanner. Computer vision algorithms were used for alignment and comparison among models. Wear volume was estimated, and visual scoring was performed to assess reliability. Results demonstrated that it is possible to directly detect submillimeter differences in tooth surfaces with an automated method, with results similar to those obtained by direct visual inspection. The investigated method proved to be reliable for comparison of measurements over time.

  8. AMAB: Automated measurement and analysis of body motion

    NARCIS (Netherlands)

    Poppe, Ronald; Zee, van der Sophie; Heylen, Dirk K.J.; Taylor, Paul J.

    2014-01-01

    Technologies that measure human nonverbal behavior have existed for some time, and their use in the analysis of social behavior has become more popular following the development of sensor technologies that record full-body movement. However, a standardized methodology to efficiently represent and an

  9. Analysis of the automated systems of planning of spatial constructions

    Directory of Open Access Journals (Sweden)

    М.С. Барабаш

    2004-04-01

    Full Text Available The article is devoted to the analysis of existing CAD (SAPR) systems and to the development of new information technologies for design, based on the integration of software packages through a unified information-logical model of the object.

  10. ADDIS : an automated way to do network meta-analysis

    NARCIS (Netherlands)

    Zhao, Jing; van Valkenhoef, Gert; de Brock, E.O.; Hillege, Hans

    2012-01-01

    In evidence-based medicine, meta-analysis is an important statistical technique for combining the findings from independent clinical trials which have attempted to answer similar questions about a treatment's clinical effectiveness [1]. Normally, such meta-analyses are pair-wise treatment comparisons, w

  11. Automated analysis of three-dimensional stress echocardiography

    NARCIS (Netherlands)

    K.Y.E. Leung (Esther); M. van Stralen (Marijn); M.G. Danilouchkine (Mikhail); G. van Burken (Gerard); M.L. Geleijnse (Marcel); J.H.C. Reiber (Johan); N. de Jong (Nico); A.F.W. van der Steen (Ton); J.G. Bosch (Johan)

    2011-01-01

    textabstractReal-time three-dimensional (3D) ultrasound imaging has been proposed as an alternative for two-dimensional stress echocardiography for assessing myocardial dysfunction and underlying coronary artery disease. Analysis of 3D stress echocardiography is no simple task and requires considera

  12. Automated Frequency Domain Decomposition for Operational Modal Analysis

    DEFF Research Database (Denmark)

    Brincker, Rune; Andersen, Palle; Jacobsen, Niels-Jørgen

    2007-01-01

    The Frequency Domain Decomposition (FDD) technique is known as one of the most user friendly and powerful techniques for operational modal analysis of structures. However, the classical implementation of the technique requires some user interaction. The present paper describes an algorithm for au...

  13. Automated detection and analysis of particle beams in laser-plasma accelerator simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela Mayumi; Geddes, C.G.; Cormier-Michel, E.; Bethel, E. Wes; Jacobsen, J.; Prabhat; Rübel, O.; Weber, G.; Hamann, B.

    2010-05-21

    scientific data mining is increasingly considered. In plasma simulations, Bagherjeiran et al. presented a comprehensive report on applying graph-based techniques for orbit classification. They used the KAM classifier to label points and components in single and multiple orbits. Love et al. conducted an image-space analysis of coherent structures in plasma simulations, using a number of segmentation and region-growing techniques to isolate regions of interest in orbit plots. Both approaches analyzed particle accelerator data, targeting the system dynamics in terms of particle orbits. However, they did not address particle dynamics as a function of time or inspect the behavior of bunches of particles. Ruebel et al. addressed the visual analysis of massive laser wakefield acceleration (LWFA) simulation data using interactive procedures to query the data, providing sophisticated visualization tools to inspect the data manually. Ruebel et al. integrated these tools into the visualization and analysis system VisIt, in addition to utilizing efficient data management based on HDF5, H5Part, and the index/query tool FastBit. Ruebel et al. also proposed automatic beam path analysis using a suite of methods to classify particles in simulation data and to analyze their temporal evolution. To enable researchers to accurately define particle beams, the method computes a set of measures based on the path of particles relative to the distance of the particles to a beam. To achieve good performance, this framework uses an analysis pipeline designed to quickly reduce the amount of data that needs to be considered in the actual path distance computation. As part of this process, region-growing methods are utilized to detect particle bunches at single time steps. Efficient data reduction is essential to enable automated analysis of large data sets as described in the next section, where data reduction methods are steered to the particular requirements of our clustering analysis.

  14. An Empirical Study on the Impact of Automation on the Requirements Analysis Process

    Institute of Scientific and Technical Information of China (English)

    Giuseppe Lami; Robert W. Ferguson

    2007-01-01

    Requirements analysis is an important phase in a software project. The analysis is often performed in an informal way by specialists who review documents looking for ambiguities, technical inconsistencies and incomplete parts. Automation is still far from being applied in requirements analyses, above all since natural languages are informal and thus difficult to treat automatically. There are only a few tools that can analyse texts. One of them, called QuARS, was developed by the Istituto di Scienza e Tecnologie dell'Informazione and can analyse texts in terms of ambiguity. This paper describes how QuARS was used in a formal empirical experiment to assess the impact in terms of effectiveness and efficacy of the automation in the requirements review process of a software company.

  15. Applying shot boundary detection for automated crystal growth analysis during in situ transmission electron microscope experiments

    Energy Technology Data Exchange (ETDEWEB)

    Moeglein, W. A.; Griswold, R.; Mehdi, B. L.; Browning, N. D.; Teuton, J.

    2017-01-03

    In-situ (scanning) transmission electron microscopy (S/TEM) is being developed for numerous applications in the study of nucleation and growth under electrochemical driving forces. For this type of experiment, one of the key parameters is to identify when nucleation initiates. Typically, the process of identifying the moment that crystals begin to form is a manual process requiring the user to perform an observation and respond accordingly (adjust focus or magnification, translate the stage, etc.). However, as the speed of the cameras being used to perform these observations increases, the ability of a user to “catch” the important initial stage of nucleation decreases (there is more information available in the first few milliseconds of the process). Here we show that video shot boundary detection (SBD) can automatically detect frames where a change in the image occurs. We show that this method can be applied to quickly and accurately identify points of change during crystal growth. This technique allows for automated segmentation of a digital stream for further analysis and the assignment of arbitrary time stamps for the initiation of processes that are independent of the user’s ability to observe and react.
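
    A minimal sketch of the shot-boundary idea, flagging frames whose intensity histogram differs sharply from the previous frame (the file name and threshold are hypothetical; the published pipeline is more elaborate than this):

    ```python
    import cv2

    cap = cv2.VideoCapture("insitu_stem_movie.avi")  # hypothetical recording
    prev_hist, frame_idx, boundaries = None, 0, []

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        hist = cv2.calcHist([gray], [0], None, [64], [0, 256])
        hist = cv2.normalize(hist, hist).flatten()
        if prev_hist is not None:
            # Chi-square distance between consecutive frame histograms
            d = cv2.compareHist(prev_hist, hist, cv2.HISTCMP_CHISQR)
            if d > 0.5:  # assumed threshold; tune per experiment
                boundaries.append(frame_idx)  # candidate change event
        prev_hist, frame_idx = hist, frame_idx + 1

    cap.release()
    print(boundaries)
    ```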

  16. Using causal reasoning for automated failure modes and effects analysis (FMEA)

    Science.gov (United States)

    Bell, Daniel; Cox, Lisa; Jackson, Steve; Schaefer, Phil

    The authors have developed a tool that automates the reasoning portion of a failure modes and effects analysis (FMEA). It is built around a flexible causal reasoning module that has been adapted to the FMEA procedure. The approach and software architecture have been proven. A prototype tool has been created and successfully passed a test and evaluation program. The authors are expanding the operational capability and adapting the tool to various CAD/CAE (computer-aided design and engineering) platforms.

  17. Automated static image analysis as a novel tool in describing the physical properties of dietary fiber

    OpenAIRE

    Kurek, Marcin Andrzej; Piwińska, Monika; Wyrwisz, Jarosław; Wierzbicka, Agnieszka

    2015-01-01

    Abstract The growing interest in the usage of dietary fiber in food has caused the need to provide precise tools for describing its physical properties. This research examined two dietary fibers from oats and beets, respectively, in variable particle sizes. The application of automated static image analysis for describing the hydration properties and particle size distribution of dietary fiber was analyzed. Conventional tests for water holding capacity (WHC) were conducted. The particles were...

  18. Automated extraction of DNA from biological stains on fabric from crime cases. A comparison of a manual and three automated methods.

    Science.gov (United States)

    Stangegaard, Michael; Hjort, Benjamin B; Hansen, Thomas N; Hoflund, Anders; Mogensen, Helle S; Hansen, Anders J; Morling, Niels

    2013-05-01

    The presence of PCR inhibitors in extracted DNA may interfere with the subsequent quantification and short tandem repeat (STR) reactions used in forensic genetic DNA typing. DNA extraction from fabric for forensic genetic purposes may be challenging due to the occasional presence of PCR inhibitors that may be co-extracted with the DNA. Using 120 forensic trace evidence samples consisting of various types of fabric, we compared three automated DNA extraction methods based on magnetic beads (PrepFiler Express Forensic DNA Extraction Kit on an AutoMate Express, QIAsymphony DNA Investigator kit either with the sample pre-treatment recommended by Qiagen or an in-house optimized sample pre-treatment on a QIAsymphony SP) and one manual method (Chelex), with the aim of reducing the amount of PCR inhibitors in the DNA extracts and increasing the proportion of reportable STR-profiles. A total of 480 samples were processed. The highest DNA recovery was obtained with the PrepFiler Express kit on an AutoMate Express, while the lowest DNA recovery was obtained using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen. Extraction using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen resulted in the lowest percentage of PCR inhibition (0%), while extraction using manual Chelex resulted in the highest percentage of PCR inhibition (51%). The largest number of reportable STR-profiles was obtained with DNA from samples extracted with the PrepFiler Express kit (75%), while the lowest number was obtained with DNA from samples extracted using a QIAsymphony SP with the sample pre-treatment recommended by Qiagen (41%).

  19. Automated analysis for microcalcifications in high resolution digital mammograms

    Science.gov (United States)

    Mascio, Laura N.

    1996-01-01

    A method for automatically locating microcalcifications indicating breast cancer. The invention assists mammographers in finding very subtle microcalcifications and in recognizing the pattern formed by all the microcalcifications. It also draws attention to microcalcifications that might be overlooked because a more prominent feature draws attention away from an important object. A new filter has been designed to weed out false positives in one of the steps of the method. Previously, an iterative selection threshold was used to separate microcalcifications from the spurious signals resulting from texture or other background. A Selective Erosion or Enhancement (SEE) Filter has been invented to improve this step. Since the algorithm detects areas containing potential calcifications on the mammogram, it can be used to determine which areas need to be stored at the highest resolution available, while, in addition, the full mammogram can be reduced to an appropriate resolution for the remaining cancer signs.

  20. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, M.; Rosenvinge, F. S.; Spillum, E.

    2015-01-01

    Background: Antibiotics of the beta-lactam group are able to alter the shape of the bacterial cell wall, e.g. filamentation or a spheroplast formation. Early determination of antimicrobial susceptibility may be complicated by filamentation of bacteria as this can be falsely interpreted as growth...... displaying different resistant profiles and differences in filamentation kinetics were used to study a novel image analysis algorithm to quantify length of bacteria and bacterial filamentation. A total of 12 beta-lactam antibiotics or beta-lactam-beta-lactamase inhibitor combinations were analyzed...

  1. Automated MALDI matrix deposition method with inkjet printing for imaging mass spectrometry.

    Science.gov (United States)

    Baluya, Dodge L; Garrett, Timothy J; Yost, Richard A

    2007-09-01

    Careful matrix deposition on tissue samples for matrix-assisted laser desorption/ionization (MALDI) is critical for producing reproducible analyte ion signals. Traditional methods for matrix deposition are often considered an art rather than a science, with significant sample-to-sample variability. Here we report an automated method for matrix deposition employing a desktop inkjet printer, whose printer tray, designed to hold CDs and DVDs, was modified to hold microscope slides. Empty ink cartridges were filled with MALDI matrix solutions, including DHB in methanol/water (70:30) at concentrations up to 40 mg/mL. Various samples (including rat brain tissue sections and standards of small drug molecules) were prepared using three deposition methods (electrospray, airbrush, inkjet). A linear ion trap equipped with an intermediate-pressure MALDI source was used for analyses. Optical microscopic examination showed that matrix crystals were formed evenly across the sample. There was minimal background signal after storing the matrix in the cartridges over a 6-month period. Overall, the mass spectral images gathered from inkjet-printed tissue specimens were of better quality and more reproducible than those from specimens prepared by the electrospray and airbrush methods.

  2. An automated method to build groundwater model hydrostratigraphy from airborne electromagnetic data and lithological borehole logs

    Directory of Open Access Journals (Sweden)

    P. A. Marker

    2015-02-01

    Full Text Available Large-scale integrated hydrological models are important decision support tools in water resources management. The largest source of uncertainty in such models is the hydrostratigraphic model. Geometry and configuration of hydrogeological units are often poorly determined from hydrogeological data alone. Due to sparse sampling in space, lithological borehole logs may overlook structures that are important for groundwater flow at larger scales. Good spatial coverage along with high spatial resolution makes airborne time-domain electromagnetic (AEM data valuable for the structural input to large-scale groundwater models. We present a novel method to automatically integrate large AEM data-sets and lithological information into large-scale hydrological models. Clay-fraction maps are produced by translating geophysical resistivity into clay-fraction values using lithological borehole information. Voxel models of electrical resistivity and clay fraction are classified into hydrostratigraphic zones using k-means clustering. Hydraulic conductivity values of the zones are estimated by hydrological calibration using hydraulic head and stream discharge observations. The method is applied to a Danish case study. Benchmarking hydrological performance by comparison of simulated hydrological state variables, the cluster model performed competitively. Calibrations of 11 hydrostratigraphic cluster models with 1–11 hydraulic conductivity zones showed improved hydrological performance with increasing number of clusters. Beyond the 5-cluster model hydrological performance did not improve. Due to reproducibility and possibility of method standardization and automation, we believe that hydrostratigraphic model generation with the proposed method has important prospects for groundwater models used in water resources management.
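
    The clustering step can be sketched as follows, assuming co-located resistivity and clay-fraction voxel values (synthetic data; the hydrological calibration of the resulting zones is not shown):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(0)
    # Synthetic voxel attributes: log10 resistivity (ohm m) and clay fraction (0-1)
    log_res = rng.normal(1.8, 0.5, 10_000)
    clay = rng.uniform(0.0, 1.0, 10_000)

    # Standardize so both attributes weigh equally, then cluster into a chosen
    # number of hydrostratigraphic zones (the study compared 1-11 zones).
    X = np.column_stack([log_res, clay])
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    zones = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)
    print(np.bincount(zones))  # voxel count per hydrostratigraphic zone
    ```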

  3. Wireless Controlled Methods via Voice and Internet (e-mail for Home Automation System

    Directory of Open Access Journals (Sweden)

    R.A.Ramlee

    2013-08-01

    Full Text Available This paper presents a wireless Home Automation System (HAS) that is mainly operated by a computer. The system is designed with several control methods for controlling the target electrical appliances; these methods are implemented to meet the needs of users both at home and away. The computer application is designed for the Microsoft Windows OS and integrates speech-recognition voice control using the Microsoft Speech Application Programming Interface (SAPI). The voice control method provides extra convenience, especially to blind and paralyzed users at home. The system performs short-distance control using wireless Bluetooth technology and long-distance control using a Simple Mail Transfer Protocol (SMTP) email control method. Short-distance control is performed inside the house, whereas long-distance control can be performed from anywhere using a device with a browser or email application and internet access. The system is intended to control electrical appliances at home with a relatively low-cost design, a user-friendly interface, and ease of installation.

  4. Automated Method to Determine Two Critical Growth Stages of Wheat: Heading and Flowering

    Science.gov (United States)

    Sadeghi-Tehran, Pouria; Sabermanesh, Kasra; Virlet, Nicolas; Hawkesford, Malcolm J.

    2017-01-01

    Recording growth stage information is an important aspect of precision agriculture, crop breeding and phenotyping. In practice, crop growth stage is still primarily monitored by eye, which is not only laborious and time-consuming, but also subjective and error-prone. The application of computer vision on digital images offers a high-throughput and non-invasive alternative to manual observations, and its use in agriculture and high-throughput phenotyping is increasing. This paper presents an automated method, based on computer vision applied to digital images, to detect the wheat heading and flowering stages. The bag-of-visual-words technique is used to identify the growth stage during heading and flowering within digital images. The scale-invariant feature transform (SIFT) is used for low-level feature extraction; subsequently, locality-constrained linear coding and spatial pyramid matching are applied in the mid-level representation stage. Finally, support vector machine classification is used to train and test the data samples. The method outperformed existing algorithms, yielding 95.24%, 97.79%, and 99.59% accuracy at the early, medium and late stages of heading, respectively, and 85.45% accuracy for flowering detection. The results also illustrate that the proposed method is robust enough to handle complex environmental changes (illumination, occlusion). Although the proposed method is applied only to identifying growth stage in wheat, there is potential for application to other crops and categorization concepts, such as disease classification. PMID:28289423
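
    A compact bag-of-visual-words sketch in the same spirit (the image arrays and labels are assumed to be supplied by the caller; the published pipeline additionally uses locality-constrained coding and spatial pyramid matching rather than this plain histogram encoding):

    ```python
    import numpy as np
    import cv2
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    def sift_descriptors(image):
        """SIFT descriptors of a grayscale uint8 image (possibly empty)."""
        _, desc = cv2.SIFT_create().detectAndCompute(image, None)
        return desc if desc is not None else np.empty((0, 128), np.float32)

    def bovw_histogram(image, vocab):
        """Normalized histogram of the image's descriptors over the vocabulary."""
        desc = sift_descriptors(image)
        words = vocab.predict(desc) if len(desc) else np.zeros(0, dtype=int)
        hist = np.bincount(words, minlength=vocab.n_clusters).astype(float)
        return hist / max(hist.sum(), 1.0)

    def train_stage_classifier(images, labels, n_words=200):
        """Build the visual vocabulary, encode every image, and fit an SVM."""
        all_desc = np.vstack([sift_descriptors(im) for im in images])
        vocab = KMeans(n_clusters=n_words, n_init=4, random_state=0).fit(all_desc)
        X = np.array([bovw_histogram(im, vocab) for im in images])
        return vocab, SVC(kernel="rbf").fit(X, labels)

    # usage (hypothetical): vocab, clf = train_stage_classifier(crops, stages)
    ```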

  5. Automated blood cell count: a sensitive and reliable method to study corticosterone-related stress in broilers.

    NARCIS (Netherlands)

    Post, J.; Rebel, J.M.J.; Huurne, ter A.A.H.M.

    2003-01-01

    In chickens the heterophil/lymphocyte ratio (H/L) has proved to be a valuable tool in stress related research. In general, H/L is determined with the microscopic differential count on a blood film. We evaluated automated analysis for measuring blood cell parameters in relation to corticosterone in a

  6. Comparison among single-phase test, automated screening method and GC/MS for the traceability of ketamine in urine

    Directory of Open Access Journals (Sweden)

    Alice Visione

    2016-12-01

    CONCLUSION: Under current legal provisions, ketamine is not screened for; this limitation prevents the authorities from applying the penalties provided for road-law violations. Automation is essential to guarantee the reliability of toxicological screening tests, especially those of medico-legal significance. These results highlight the absolute necessity of performing a confirmation test after the screening analysis.

  7. Automated Experimental Data Analysis at the National Ignition Facility

    Energy Technology Data Exchange (ETDEWEB)

    Azevedo, S G; Bettenhausen, R C; Beeler, R G; Bond, E J; Edwards, P W; Glenn, S M; Liebman, J A; Tappero, J D; Warrick, A L; Williams, W H

    2009-09-24

    The National Ignition Facility (NIF) at the Lawrence Livermore National Laboratory is a 192-beam 1.8 MJ ultraviolet laser system designed to support high-energy-density science, including demonstration of inertial confinement fusion ignition. After each target shot lasting ~20 ns, scientists require data acquisition, analysis and display within 30 minutes from more than 20 specialized high-speed diagnostic instruments. These diagnostics measure critical x-ray, optical and nuclear phenomena during target burn to quantify ignition results and compare to computational models. All diagnostic data (hundreds of Gbytes) are automatically transferred to an Oracle database that triggers the NIF Shot Data Analysis (SDA) Engine, which distributes the signal and image processing tasks to a Linux cluster. The SDA Engine integrates commercial workflow tools and messaging technologies into a scientific software architecture that is highly parallel, scalable, and flexible. Results are archived in the database for scientist approval and displayed using a web-based tool. The unique architecture and functionality of the SDA Engine will be presented along with an example.

  8. Automated quantification of the synchrogram by recurrence plot analysis.

    Science.gov (United States)

    Nguyen, Chinh Duc; Wilson, Stephen James; Crozier, Stuart

    2012-04-01

    Recently, the concept of phase synchronization of two weakly coupled oscillators has attracted great research interest and has been applied to characterize synchronization phenomena in physiological data. Phase synchronization of cardiorespiratory coupling is often studied by a synchrogram analysis, a graphical tool investigating the relationship between the instantaneous phases of two signals. Although several techniques have been proposed to automatically quantify the synchrogram, most of them require a preselection of a phase-locking ratio by trial and error. One technique does not require this information; however, it is based on the power spectrum of the phase distribution in the synchrogram, which is vulnerable to noise. This study aims to introduce a new technique to automatically quantify the synchrogram by studying its dynamic structure. Our technique exploits recurrence plot analysis, which is a well-established tool for characterizing recurring patterns and nonstationarities in experiments. We applied our technique to detect synchronization in simulated and measured infants' cardiorespiratory data. Our results suggest that the proposed technique is able to systematically detect synchronization in noisy and chaotic data without preselecting the phase-locking ratio. By embedding phase information of the synchrogram into phase space, the phase-locking ratio is automatically unveiled as the number of attractors.
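
    Building the recurrence matrix at the heart of such an analysis can be sketched as follows (synthetic phase series; the published method derives its synchronization measures from the structure of this matrix):

    ```python
    import numpy as np

    # Synthetic wrapped relative phase of a cardiorespiratory synchrogram
    t = np.linspace(0, 60, 600)
    phase = np.mod(0.3 * t + 0.2 * np.sin(0.5 * t), 2 * np.pi)

    # Recurrence matrix: R[i, j] = 1 when phases i and j are closer than eps,
    # using circular distance because phase wraps at 2*pi
    eps = 0.2
    diff = np.abs(phase[:, None] - phase[None, :])
    circ = np.minimum(diff, 2 * np.pi - diff)
    R = (circ < eps).astype(np.uint8)

    # Recurrence rate, one of the standard recurrence quantification measures
    print("recurrence rate:", R.mean())
    ```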

  9. Methodical Approaches to Creation of Dividing Automation at Industrial Enterprises with Generating Power Plants

    Directory of Open Access Journals (Sweden)

    E. V. Kalentionok

    2010-01-01

    Full Text Available The paper considers the problem of creating dividing automation at industrial enterprises which have their own generating plants. Algorithms for the operation of dividing automation are proposed that ensure the minimum possible power imbalance when the generating plants switch to autonomous operation, together with possible parameters for its response.

  10. Automated Extraction of Archaeological Traces by a Modified Variance Analysis

    Directory of Open Access Journals (Sweden)

    Tiziana D'Orazio

    2015-03-01

    Full Text Available This paper considers the problem of detecting archaeological traces in digital aerial images by analyzing the pixel variance over regions around selected points. In order to decide if a point belongs to an archaeological trace or not, its surrounding regions are considered. The one-way ANalysis Of VAriance (ANOVA) is applied several times to detect the differences among these regions; in particular, the expected shape of the mark to be detected is used in each region. Furthermore, an effect size parameter is defined by comparing the statistics of these regions with the statistics of the entire population in order to measure how strongly the trace stands out. Experiments on synthetic and real images demonstrate the effectiveness of the proposed approach with respect to some state-of-the-art methodologies.
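
    The statistical core can be sketched with a one-way ANOVA over pixel samples from regions around a candidate point, plus an eta-squared effect size (synthetic data; the published method also encodes the expected mark shape and compares against the whole-image population):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Synthetic pixel intensities from three regions around a candidate point;
    # region B is brighter, as a crop mark over a buried structure might be.
    region_a = rng.normal(100, 8, 200)
    region_b = rng.normal(110, 8, 200)
    region_c = rng.normal(101, 8, 200)

    f, p = stats.f_oneway(region_a, region_b, region_c)

    # Eta-squared effect size: between-group over total sum of squares
    groups = [region_a, region_b, region_c]
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_total = sum(((g - grand) ** 2).sum() for g in groups)
    print(f"F={f:.1f} p={p:.2g} eta^2={ss_between / ss_total:.2f}")
    ```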

  11. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    Directory of Open Access Journals (Sweden)

    Kevin A. Huck

    2008-01-01

    Full Text Available The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments present a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.

  12. TScratch: a novel and simple software tool for automated analysis of monolayer wound healing assays.

    Science.gov (United States)

    Gebäck, Tobias; Schulz, Martin Michael Peter; Koumoutsakos, Petros; Detmar, Michael

    2009-04-01

    Cell migration plays a major role in development, physiology, and disease, and is frequently evaluated in vitro by the monolayer wound healing assay. The assay analysis, however, is a time-consuming task that is often performed manually. In order to accelerate this analysis, we have developed TScratch, a new, freely available image analysis technique and associated software tool that uses the fast discrete curvelet transform to automate the measurement of the area occupied by cells in the images. This tool helps to significantly reduce the time needed for analysis and enables objective and reproducible quantification of assays. The software also offers a graphical user interface which allows easy inspection of analysis results and, if desired, manual modification of analysis parameters. The automated analysis was validated by comparing its results with manual-analysis results for a range of different cell lines. The comparisons demonstrate a close agreement for the vast majority of images that were examined and indicate that the present computational tool can reproduce statistically significant results in experiments with well-known cell migration inhibitors and enhancers.
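
    TScratch itself measures the cell-covered area with the fast discrete curvelet transform; as a much simpler stand-in for the same coverage measurement, plain Otsu thresholding can be sketched (the file name is hypothetical, and whether cells are brighter or darker than the open wound depends on the imaging):

    ```python
    from skimage import io, filters, color

    img = io.imread("wound_assay.png")  # hypothetical monolayer image
    gray = color.rgb2gray(img[..., :3]) if img.ndim == 3 else img

    # Otsu threshold separates cell-covered pixels from the open wound area;
    # TScratch replaces this step with a curvelet-based texture measure.
    mask = gray > filters.threshold_otsu(gray)
    print("cell coverage: %.1f%%" % (100.0 * mask.mean()))
    ```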

  13. Some selected quantitative methods of thermal image analysis in Matlab.

    Science.gov (United States)

    Koprowski, Robert

    2016-05-01

    The paper presents a new algorithm based on selected automatic quantitative methods for analysing thermal images, and shows the practical implementation of these image analysis methods in Matlab. It enables fully automated and reproducible measurements of selected parameters in thermal images. The paper also shows two examples of the use of the proposed image analysis methods, for the skin of a human foot and of a face. The full source code of the developed application is also provided as an attachment. (Figure: the main window of the program during dynamic analysis of the foot thermal image.)

  14. Establishing a novel automated magnetic bead-based method for the extraction of DNA from a variety of forensic samples.

    Science.gov (United States)

    Witt, Sebastian; Neumann, Jan; Zierdt, Holger; Gébel, Gabriella; Röscheisen, Christiane

    2012-09-01

    Automated systems have been increasingly utilized for DNA extraction by many forensic laboratories to handle growing numbers of forensic casework samples while minimizing the risk of human errors and assuring high reproducibility. The step towards automation, however, is not easy: the automated extraction method has to be very versatile to reliably prepare high yields of pure genomic DNA from a broad variety of sample types on different carrier materials. To prevent possible cross-contamination of samples or the loss of DNA, the components of the kit have to be designed in a way that allows for the automated handling of the samples with no manual intervention necessary. DNA extraction using paramagnetic particles coated with a DNA-binding surface is predestined for an automated approach. For this study, we tested different DNA extraction kits using DNA-binding paramagnetic particles with regard to DNA yield and handling by a Freedom EVO® 150 extraction robot (Tecan) equipped with a Te-MagS magnetic separator. Among others, the extraction kits tested were the ChargeSwitch® Forensic DNA Purification Kit (Invitrogen), the PrepFiler™ Automated Forensic DNA Extraction Kit (Applied Biosystems) and NucleoMag™ 96 Trace (Macherey-Nagel). After an extensive test phase, we established a novel magnetic bead extraction method based upon the NucleoMag™ extraction kit (Macherey-Nagel). The new method is readily automatable and produces high yields of DNA from different sample types (blood, saliva, sperm, contact stains) on various substrates (filter paper, swabs, cigarette butts) with no evidence of a loss of magnetic beads or sample cross-contamination.

  15. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Rosas-Castor, J.M. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Portugal, L.; Ferrer, L. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Guzmán-Mar, J.L.; Hernández-Ramírez, A. [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico); Cerdà, V. [Group of Analytical Chemistry, Automation and Environment, University of Balearic Islands, Cra. Valldemossa km 7.5, 07122 Palma de Mallorca (Spain); Hinojosa-Reyes, L., E-mail: laura.hinojosary@uanl.edu.mx [Universidad Autónoma de Nuevo León, UANL, Facultad de Ciencias Químicas, Cd. Universitaria, San Nicolás de los Garza, Nuevo León, C.P. 66451 Nuevo León (Mexico)

    2015-05-18

    Highlights: • A fully automated flow-based modified-BCR extraction method was developed to evaluate the extractable As of soil. • The MSFIA–HG-AFS system included a UV photo-oxidation step for organic species degradation. • The accuracy and precision of the proposed method were found satisfactory. • The analysis time can be reduced up to eight times by using the proposed flow-based BCR method. • The labile As (F1 + F2) was <50% of total As in soil samples from As-contaminated mining zones. - Abstract: A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil, based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L⁻¹ for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013–0.800, 0.011–0.900 and 0.079–1.400 mg L⁻¹ for F1, F2, and F3, respectively. The precision of the automated MSFIA–HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L⁻¹ As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural

  16. Arsenic fractionation in agricultural soil using an automated three-step sequential extraction method coupled to hydride generation-atomic fluorescence spectrometry.

    Science.gov (United States)

    Rosas-Castor, J M; Portugal, L; Ferrer, L; Guzmán-Mar, J L; Hernández-Ramírez, A; Cerdà, V; Hinojosa-Reyes, L

    2015-05-18

    A fully automated modified three-step BCR flow-through sequential extraction method was developed for the fractionation of the arsenic (As) content from agricultural soil based on a multi-syringe flow injection analysis (MSFIA) system coupled to hydride generation-atomic fluorescence spectrometry (HG-AFS). Critical parameters that affect the performance of the automated system were optimized by exploiting a multivariate approach using a Doehlert design. The validation of the flow-based modified-BCR method was carried out by comparison with the conventional BCR method. Thus, the total As content was determined in the following three fractions: fraction 1 (F1), the acid-soluble or interchangeable fraction; fraction 2 (F2), the reducible fraction; and fraction 3 (F3), the oxidizable fraction. The limits of detection (LOD) were 4.0, 3.4, and 23.6 μg L(-1) for F1, F2, and F3, respectively. A wide working concentration range was obtained for the analysis of each fraction, i.e., 0.013-0.800, 0.011-0.900 and 0.079-1.400 mg L(-1) for F1, F2, and F3, respectively. The precision of the automated MSFIA-HG-AFS system, expressed as the relative standard deviation (RSD), was evaluated for a 200 μg L(-1) As standard solution, and RSD values between 5 and 8% were achieved for the three BCR fractions. The new modified three-step BCR flow-based sequential extraction method was satisfactorily applied for arsenic fractionation in real agricultural soil samples from an arsenic-contaminated mining zone to evaluate its extractability. The frequency of analysis of the proposed method was eight times higher than that of the conventional BCR method (6 vs 48 h), and the kinetics of lixiviation were established for each fraction.

  17. Selective Detection and Automated Counting of Fluorescently-Labeled Chrysotile Asbestos Using a Dual-Mode High-Throughput Microscopy (DM-HTM) Method

    Directory of Open Access Journals (Sweden)

    Jung Kyung Kim

    2013-05-01

    Full Text Available Phase contrast microscopy (PCM) is a widely used analytical method for airborne asbestos, but it is unable to distinguish asbestos from non-asbestos fibers and requires time-consuming and laborious manual counting of fibers. Previously, we developed a high-throughput microscopy (HTM) method that could greatly reduce human intervention and analysis time through automated image acquisition and counting of fibers. In this study, we designed a dual-mode HTM (DM-HTM) device for the combined reflection and fluorescence imaging of asbestos, and automated a series of built-in image processing commands of ImageJ software to test its capabilities. We used DksA, a chrysotile-adhesive protein, for selective detection of chrysotile fibers in a dust-free mixed suspension of chrysotile and amosite prepared in the laboratory. We demonstrate that fluorescently-stained chrysotile and total fibers can be identified and enumerated automatically in a high-throughput manner by the DM-HTM system. Combined with more advanced software that can correctly identify overlapping and branching fibers and distinguish between fibers and elongated dust particles, the DM-HTM method should enable fully automated counting of airborne asbestos.
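
    A minimal sketch of the counting idea, using connected-component labelling with a crude aspect-ratio test for fiber-like objects on a synthetic binary mask (the published pipeline automates comparable steps with ImageJ commands):

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(2)
    # Synthetic binary fluorescence mask (True = stained pixel)
    mask = rng.random((256, 256)) > 0.995
    mask = ndimage.binary_dilation(mask, iterations=2)

    # Label connected components, then keep elongated ones as fiber candidates
    labels, n = ndimage.label(mask)
    fibers = 0
    for obj in ndimage.find_objects(labels):
        h = obj[0].stop - obj[0].start
        w = obj[1].stop - obj[1].start
        if max(h, w) >= 3 * max(1, min(h, w)):  # crude 3:1 aspect-ratio test
            fibers += 1
    print(f"{n} objects, {fibers} fiber-like")
    ```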

  18. Method 349.0 Determination of Ammonia in Estuarine and Coastal Waters by Gas Segmented Continuous Flow Colorimetric Analysis

    Science.gov (United States)

    This method provides a procedure for the determination of ammonia in estuarine and coastal waters. The method is based upon the indophenol reaction [1-5], here adapted to automated gas-segmented continuous flow analysis.

  19. AUTOMATION OF QUALITY CONTROL OF MILK HOMOGENIZATION BY ULTRASONIC SPECTROSCOPY METHODS

    Directory of Open Access Journals (Sweden)

    V. K. Bityukov

    2015-01-01

    Full Text Available The paper deals with the possibility of determining the homogenization degree of milk and dairy products from the absorption spectra of ultrasonic vibrations, and examines the advantages of applying this method in automated manufacturing systems. The theoretical background of the method is presented, together with the possibility of determining the size distribution of fat globules in milk. We derived mathematical equations relating the homogenization degree of dairy products to the acoustic properties of the medium, such as the propagation velocity of ultrasonic vibrations in the tested medium and the absorption coefficient in the medium at a specific frequency of ultrasonic excitation and a given medium temperature. It is shown that measuring the frequency dependence of the acoustic properties of milk, estimating the density function of the relaxation-time spectrum, and then converting relaxation times to particle masses allows in-line monitoring of the distribution of fat globule masses across fractions. The theoretical studies were confirmed by experiments whose results clearly demonstrate the dependence of the absorption of ultrasonic vibrations on the degree of milk homogenization. The relationship between estimates of the relaxation spectra and the first two moments of the statistical distribution of milk fat globules was also studied. Possible improvements to the method for increasing the reliability of the results are proposed.

  20. Unsupervised EEG analysis for automated epileptic seizure detection

    Science.gov (United States)

    Birjandtalab, Javad; Pouyan, Maziyar Baran; Nourani, Mehrdad

    2016-07-01

    Epilepsy is a neurological disorder which can, if not controlled, potentially cause unexpected death. It is crucial to have accurate automatic pattern recognition and data mining techniques to detect the onset of seizures and inform care-givers to help the patients. EEG signals are the preferred biosignals for diagnosis of epileptic patients. Most of the existing pattern recognition techniques used in EEG analysis leverage supervised machine learning algorithms. Since seizure data are heavily under-represented, such techniques are not always practical, particularly when labeled data are not sufficiently available or when disease progression is rapid and the corresponding EEG footprint is not robust. Furthermore, EEG pattern change is highly individual-dependent and requires experienced specialists to annotate seizure and non-seizure events. In this work, we present an unsupervised technique to discriminate seizure and non-seizure events. We employ the power spectral density of EEG signals in different frequency bands as informative features to accurately cluster seizure and non-seizure events. The experimental results obtained so far indicate more than 90% accuracy in clustering seizure and non-seizure events without any prior knowledge of the patient's history.
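
    A minimal sketch of the feature extraction and clustering described above (synthetic epochs and an assumed sampling rate; real EEG and per-channel handling would be needed in practice):

    ```python
    import numpy as np
    from scipy.signal import welch
    from sklearn.cluster import KMeans

    fs = 256                                  # assumed sampling rate (Hz)
    rng = np.random.default_rng(3)
    epochs = rng.normal(size=(120, fs * 4))   # synthetic 4-second EEG epochs

    bands = [(0.5, 4), (4, 8), (8, 13), (13, 30), (30, 60)]  # delta..gamma

    def band_powers(epoch):
        """Mean PSD in each clinical frequency band (Welch estimate)."""
        f, p = welch(epoch, fs=fs, nperseg=fs)
        return [p[(f >= lo) & (f < hi)].mean() for lo, hi in bands]

    X = np.log(np.array([band_powers(e) for e in epochs]))
    labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    # One cluster is then interpreted as seizure-like, the other as background
    print(np.bincount(labels))
    ```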

  1. Semi-automated 3D segmentation of major tracts in the rat brain: comparing DTI with standard histological methods.

    Science.gov (United States)

    Gyengesi, Erika; Calabrese, Evan; Sherrier, Matthew C; Johnson, G Allan; Paxinos, George; Watson, Charles

    2014-03-01

    Researchers working with rodent models of neurological disease often require an accurate map of the anatomical organization of the white matter of the rodent brain. With the increasing popularity of small animal MRI techniques, including diffusion tensor imaging (DTI), there is considerable interest in rapid segmentation methods of neurological structures for quantitative comparisons. DTI-derived tractography allows simple and rapid segmentation of major white matter tracts, but the anatomic accuracy of these computer-generated fibers is open to question and has not been rigorously evaluated in the rat brain. In this study, we examine the anatomic accuracy of tractography-based segmentation in the adult rat brain. We analysed 12 major white matter pathways using semi-automated tractography-based segmentation alongside manual segmentation of Gallyas silver-stained histology sections. We applied four fiber-tracking algorithms to the DTI data-two integration methods and two deflection methods. In many cases, tractography-based segmentation closely matched histology-based segmentation; however different tractography algorithms produced dramatically different results. Results suggest that certain white matter pathways are more amenable to tractography-based segmentation than others. We believe that these data will help researchers decide whether it is appropriate to use tractography-based segmentation of white matter structures for quantitative DTI-based analysis of neurologic disease models.

  2. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Science.gov (United States)

    Nishihara, Kyoko; Ohki, Noboru; Kamata, Hideo; Ryo, Eiji; Horiuchi, Shigeko

    2015-01-01

    Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK) including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.
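
    PABAK itself is simple to compute: for two raters it is 2*Po - 1, where Po is the observed proportion of agreeing epochs. A minimal sketch with hypothetical epoch codes:

    ```python
    import numpy as np

    def pabak(rater_a, rater_b):
        """Prevalence-adjusted bias-adjusted kappa: PABAK = 2*Po - 1,
        where Po is the observed proportion of agreeing epochs."""
        a, b = np.asarray(rater_a), np.asarray(rater_b)
        po = (a == b).mean()
        return 2.0 * po - 1.0

    # Hypothetical movement/no-movement codes for the same 10-sec epochs
    manual = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
    auto = [1, 0, 0, 1, 0, 0, 1, 0, 0, 1]
    print(pabak(manual, auto))  # 0.8 for 9/10 agreement
    ```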

  3. Automated Software Analysis of Fetal Movement Recorded during a Pregnant Woman's Sleep at Home.

    Directory of Open Access Journals (Sweden)

    Kyoko Nishihara

    Full Text Available Fetal movement is an important biological index of fetal well-being. Since 2008, we have been developing an original capacitive acceleration sensor and device that a pregnant woman can easily use to record fetal movement by herself at home during sleep. In this study, we report a newly developed automated software system for analyzing recorded fetal movement. This study will introduce the system and compare its results to those of a manual analysis of the same fetal movement signals (Experiment I). We will also demonstrate an appropriate way to use the system (Experiment II). In Experiment I, fetal movement data reported previously for six pregnant women at 28-38 gestational weeks were used. We evaluated the agreement of the manual and automated analyses for the same 10-sec epochs using prevalence-adjusted bias-adjusted kappa (PABAK), including quantitative indicators for prevalence and bias. The mean PABAK value was 0.83, which can be considered almost perfect. In Experiment II, twelve pregnant women at 24-36 gestational weeks recorded fetal movement at night once every four weeks. Overall, mean fetal movement counts per hour during maternal sleep significantly decreased along with gestational weeks, though individual differences in fetal development were noted. This newly developed automated analysis system can provide important data throughout late pregnancy.

  4. Automated Image Analysis for the Detection of Benthic Crustaceans and Bacterial Mat Coverage Using the VENUS Undersea Cabled Network

    Directory of Open Access Journals (Sweden)

    Jacopo Aguzzi

    2011-11-01

    Coverage and Fractal Dimension. A constant Region of Interest (ROI) was defined and background extraction by a Gaussian Blurring Filter was performed. Image subtraction within the ROI was followed by the sum of the RGB channel matrices. Percent Coverage was calculated on the resulting image. Fractal Dimension was estimated using the box-counting method. The images were then resized to a dimension in pixels equal to a power of 2, allowing subdivision into sub-multiple quadrants. In comparisons of manual and automated estimates of Percent Coverage and Fractal Dimension, the manual estimates tended to overestimate both parameters. The primary limitations on the automatic analysis of benthic images were habitat variations in sediment texture and water column turbidity. The application of filters for background corrections is a required preliminary step for the efficient recognition of animals and bacterial mat patches.
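
    The box-counting estimate mentioned above fits a line to log(box count) against log(1/box size); resizing images to power-of-two dimensions makes the subdivision exact. A minimal sketch for a binary mask (e.g., detected animals or mat patches), assuming a non-empty mask and power-of-two sides:

      import numpy as np

      def box_counting_dimension(binary):
          """Estimate the fractal dimension of a 2-D binary image whose
          sides are powers of 2 (the reason the images are resized)."""
          assert all((s & (s - 1)) == 0 for s in binary.shape), "sides must be powers of 2"
          sizes, counts = [], []
          s = min(binary.shape)
          while s >= 2:
              # partition into s x s boxes and count boxes touching foreground
              count = 0
              for i in range(0, binary.shape[0], s):
                  for j in range(0, binary.shape[1], s):
                      if binary[i:i + s, j:j + s].any():
                          count += 1
              sizes.append(s)
              counts.append(count)
              s //= 2
          # the dimension is the slope of log(count) vs. log(1/size)
          slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
          return slope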

  5. Automated image analysis for the detection of benthic crustaceans and bacterial mat coverage using the VENUS undersea cabled network.

    Science.gov (United States)

    Aguzzi, Jacopo; Costa, Corrado; Robert, Katleen; Matabos, Marjolaine; Antonucci, Francesca; Juniper, S Kim; Menesatti, Paolo

    2011-01-01

    and Fractal Dimension. A constant Region of Interest (ROI) was defined and background extraction by a Gaussian Blurring Filter was performed. Image subtraction within ROI was followed by the sum of the RGB channels matrices. Percent Coverage was calculated on the resulting image. Fractal Dimension was estimated using the box-counting method. The images were then resized to a dimension in pixels equal to a power of 2, allowing subdivision into sub-multiple quadrants. In comparisons of manual and automated Percent Coverage and Fractal Dimension estimates, the former showed an overestimation tendency for both parameters. The primary limitations on the automatic analysis of benthic images were habitat variations in sediment texture and water column turbidity. The application of filters for background corrections is a required preliminary step for the efficient recognition of animals and bacterial mat patches.

  6. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high-throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  7. Optimization of RNA Purification and Analysis for Automated, Pre-Symptomatic Disease Diagnostics

    Energy Technology Data Exchange (ETDEWEB)

    Vaidya, A; Nasarabadi, S; Milanovich, F

    2005-06-28

    When diagnosing disease, time is often a more formidable enemy than the pathogen itself. Current detection methods rely primarily on post-symptomatic protein production (i.e. antibodies), which does not occur in noticeable levels until several weeks after infection. As such, a major goal among researchers today is to expedite pre-symptomatic disease recognition and treatment. Since most pathogens are known to leave a unique signature on the genetic expression of the host, one potential diagnostic tool is host mRNA. In my experiments, I examined several methods of isolating RNA and reading its genetic sequence. I first used two types of reverse transcriptase polymerase chain reactions (using commercial RNA) and examined the resultant complementary DNA through gel electrophoresis. I then proceeded to isolate and purify whole RNA from actual human monocytes and THP-1 cells using several published methods, and examined gene expression on the RNA itself. I compared the two RT-PCR methods and concluded that a double step RT-PCR is superior to the single step method. I also compared the various techniques of RNA isolation by examining the yield and purity of the resultant RNA. Finally, I studied the level of cellular IL-8 and IL-1 gene expression, two genes involved in the human immune response, which can serve as a baseline for future genetic comparison with LPS-exposed cells. Based on the results, I have determined which conditions and procedures are optimal for RNA isolation, RT-PCR, and RNA yield assessment. The overall goal of my research is to develop a flow-through system of RNA analysis, whereby blood samples can be collected and analyzed for disease prior to the onset of symptoms. The Pathomics group hopes to automate this process by removing the human labor factor, thereby decreasing the procedure's cost and increasing its availability to the general population. Eventually, our aim is to have an autonomous diagnostic system based on RNA analysis that would

  8. An automated method for comparing motion artifacts in cine four-dimensional computed tomography images.

    Science.gov (United States)

    Cui, Guoqiang; Jew, Brian; Hong, Julian C; Johnston, Eric W; Loo, Billy W; Maxim, Peter G

    2012-11-08

    The aim of this study is to develop an automated method to objectively compare motion artifacts in two four-dimensional computed tomography (4D CT) image sets, and identify the one that would appear to human observers with fewer or smaller artifacts. Our proposed method is based on the difference of the normalized correlation coefficients between edge slices at couch transitions, which we hypothesize may be a suitable metric to identify motion artifacts. We evaluated our method using ten pairs of 4D CT image sets that showed subtle differences in artifacts between images in a pair, which were identifiable by human observers. One set of 4D CT images was sorted using breathing traces in which our clinically implemented 4D CT sorting software miscalculated the respiratory phase, which expectedly led to artifacts in the images. The other set of images consisted of the same images; however, these were sorted using the same breathing traces but with corrected phases. Next we calculated the normalized correlation coefficients between edge slices at all couch transitions for all respiratory phases in both image sets to evaluate for motion artifacts. For nine image set pairs, our method identified the 4D CT sets sorted using the breathing traces with the corrected respiratory phase to result in images with fewer or smaller artifacts, whereas for one image pair, no difference was noted. Two observers independently assessed the accuracy of our method. Both observers identified 9 image sets that were sorted using the breathing traces with corrected respiratory phase as having fewer or smaller artifacts. In summary, using the 4D CT data of ten pairs of 4D CT image sets, we have demonstrated proof of principle that our method is able to replicate the results of two human observers in identifying the image set with fewer or smaller artifacts.
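
    The comparison metric here is the normalized correlation coefficient between adjacent edge slices at couch transitions: smoother anatomy across a transition yields higher correlation. A minimal sketch, with the array shapes and the scoring convention assumed rather than taken from the paper:

      import numpy as np

      def ncc(a, b):
          """Normalized correlation coefficient between two image slices."""
          a = a.astype(float).ravel()
          b = b.astype(float).ravel()
          a -= a.mean()
          b -= b.mean()
          denom = np.linalg.norm(a) * np.linalg.norm(b)
          return float(np.dot(a, b) / denom) if denom else 0.0

      def artifact_score(volume_phases, transition_indices):
          """Sum NCC over edge-slice pairs at every couch transition for
          every respiratory phase; under the study's hypothesis, the image
          set with the higher total has fewer or smaller artifacts.
          Assumes each z and z + 1 index a valid slice pair."""
          total = 0.0
          for phase in volume_phases:        # phase: (Z, Y, X) array
              for z in transition_indices:   # slice just before a transition
                  total += ncc(phase[z], phase[z + 1])
          return total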

  9. Methods for semi-automated indexing for high precision information retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  10. An automated analysis of highly complex flow cytometry-based proteomic data.

    Science.gov (United States)

    Stuchlý, Jan; Kanderová, Veronika; Fišer, Karel; Cerná, Daniela; Holm, Anders; Wu, Weiwei; Hrušák, Ondřej; Lund-Johansen, Fridtjof; Kalina, Tomáš

    2012-02-01

    The combination of color-coded microspheres as carriers and flow cytometry as a detection platform provides new opportunities for multiplexed measurement of biomolecules. Here, we developed a software tool capable of automated gating of color-coded microspheres, automatic extraction of statistics from all subsets and validation, normalization, and cross-sample analysis. The approach presented in this article enabled us to harness the power of high-content cellular proteomics. In size exclusion chromatography-resolved microsphere-based affinity proteomics (Size-MAP), antibody-coupled microspheres are used to measure biotinylated proteins that have been separated by size exclusion chromatography. The captured proteins are labeled with streptavidin phycoerythrin and detected by multicolor flow cytometry. When the results from multiple size exclusion chromatography fractions are combined, binding is detected as discrete reactivity peaks (entities). The information obtained might be approximated to a multiplexed western blot. We used a microsphere set with >1,000 subsets, presenting an approach to extract biologically relevant information. The R-project environment was used to sequentially recognize subsets in two-dimensional space and gate them. The aim was to extract the median streptavidin phycoerythrin fluorescence intensity for all 1,000+ microsphere subsets from a series of 96 measured samples. The resulting text files were subjected to algorithms that identified entities across the 24 fractions. Thus, the original 24 data points for each antibody were compressed to 1-4 integrated values representing the areas of individual antibody reactivity peaks. Finally, we provide experimental data on cellular protein changes induced by treatment of leukemia cells with imatinib mesylate. The approach presented here exemplifies how large-scale flow cytometry data analysis can be efficiently processed to employ flow cytometry as a high-content proteomics method.

  11. Automated localization and segmentation of lung tumor from PET-CT thorax volumes based on image feature analysis.

    Science.gov (United States)

    Cui, Hui; Wang, Xiuying; Feng, Dagan

    2012-01-01

    Positron emission tomography - computed tomography (PET-CT) plays an essential role in early tumor detection, diagnosis, staging and treatment. Automated and accurate lung tumor detection and delineation from PET-CT are challenging. In this paper, on the basis of quantitative analysis of the contrast features of the PET volume in SUV (standardized uptake value), our method first localizes the lung tumor automatically. Then, based on analysis of the CT features surrounding the initial tumor definition, a decision strategy determines whether the tumor segmentation is taken from CT or from PET. The algorithm has been validated on 20 PET-CT studies involving non-small cell lung cancer (NSCLC). Experimental results demonstrated that our method was able to segment the tumor even when it was adjacent to the mediastinum or chest wall, and that the algorithm outperformed the five other lung segmentation methods evaluated, in terms of overlap measure.
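
    As a rough illustration of SUV-based localization (not the authors' decision strategy), the sketch below thresholds the SUV volume at a fraction of its maximum and keeps the largest connected component; the threshold fraction is an assumption.

      import numpy as np
      from scipy import ndimage

      def localize_tumor(suv, frac=0.4):
          """Roughly localize a high-uptake lesion: threshold the SUV volume
          at frac * max and keep the largest connected component."""
          mask = suv >= frac * suv.max()
          labels, n = ndimage.label(mask)
          if n == 0:
              return None
          sizes = ndimage.sum(mask, labels, range(1, n + 1))
          largest = labels == (np.argmax(sizes) + 1)
          # return the lesion centroid and its voxel mask
          return ndimage.center_of_mass(largest), largest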

  12. Automating the Purple Crow Lidar

    Directory of Open Access Journals (Sweden)

    Hicks Shannon

    2016-01-01

    Full Text Available The Purple Crow LiDAR (PCL) was built to measure short- and long-term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  13. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Woo, B; Kim, J [Seoul National University, Seoul (Korea, Republic of); Jamshidi, N; Kuo, M [UCLA School of Medicine, Los Angeles, CA (United States)

    2015-06-15

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (a deformable model and the grow cut method) were used by two independent observers to segment contrast enhancement, necrosis and edema regions. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate reproducibility. Results: Inter-observer correlations and COVs of imaging features ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, with the deformable model, and from 0.799 to 0.976 and 3.5% to 26.6%, respectively, with the grow cut method. COVs for features previously reported as predictive of patient survival were: 3.4% with the deformable model and 7.4% with the grow cut method for the proportion of contrast-enhanced tumor region; 5.5% and 25.7%, respectively, for the proportion of necrosis; and 2.1% and 4.4%, respectively, for edge sharpness of the tumor on CE-T1WI. Conclusion: Comparison of the two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric brain MRI.
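
    The COV reported here is the sample standard deviation over the mean, expressed in percent. A minimal sketch, with hypothetical feature values:

      import numpy as np

      def coefficient_of_variation(values):
          """COV in percent for repeated measurements of one imaging feature,
          e.g. the same feature extracted from two observers' segmentations."""
          values = np.asarray(values, dtype=float)
          return 100.0 * values.std(ddof=1) / values.mean()

      # hypothetical: proportion of necrosis measured by two observers
      print(coefficient_of_variation([0.21, 0.24]))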

  14. A method for automated snow avalanche debris detection through use of synthetic aperture radar (SAR) imaging

    Science.gov (United States)

    Vickers, H.; Eckerstorfer, M.; Malnes, E.; Larsen, Y.; Hindberg, H.

    2016-11-01

    Avalanches are a natural winter hazard in the mountainous regions of Troms County in northern Norway and can cause loss of human life and damage to infrastructure. Knowledge of when and where they occur, especially in remote, high mountain areas, is often lacking due to difficult access. However, complete spatiotemporal avalanche activity data sets are important for accurate avalanche forecasting, as well as for a deeper understanding of the link between avalanche occurrences and the triggering snowpack and meteorological factors. It is therefore desirable to develop a technique that enables active mapping and monitoring of avalanches over an entire winter. Avalanche debris can be observed remotely over large spatial areas, under all weather and light conditions, by synthetic aperture radar (SAR) satellites. The recently launched Sentinel-1A satellite acquires SAR images covering the entire Troms County with frequent updates. Focusing on a case study from New Year 2015, we use Sentinel-1A images to develop an automated avalanche debris detection algorithm that utilizes change detection and unsupervised object classification methods. We compare our results with manually identified avalanche debris and field-based images to quantify the algorithm's accuracy. Our results indicate that a correct detection rate of over 60% can be achieved, which is sensitive to several algorithm parameters that may need revising. With further development and refinement of the algorithm, we believe that this method could play an effective role in future operational monitoring of avalanches within Troms and has potential application in avalanche forecasting areas worldwide.
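
    Change detection on SAR backscatter is commonly performed on the log-ratio of co-registered before/after images; avalanche debris typically raises backscatter, so positive changes beyond a threshold are flagged. The sketch below shows this generic operator, not the published algorithm, and the threshold constant is an assumption.

      import numpy as np

      def log_ratio_change(sigma0_before, sigma0_after, k=2.5):
          """Flag pixels whose backscatter increase in the log-ratio image
          exceeds the image mean by k standard deviations. Inputs are
          co-registered backscatter arrays on a linear scale."""
          eps = 1e-10  # guard against log of zero
          lr = np.log10(sigma0_after + eps) - np.log10(sigma0_before + eps)
          threshold = lr.mean() + k * lr.std()
          return lr > threshold  # boolean mask of candidate debris pixels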

  15. Predicting the Evolution of CO2 Emissions in Bahrain with Automated Forecasting Methods

    Directory of Open Access Journals (Sweden)

    Cristiana Tudor

    2016-09-01

    Full Text Available The 2012 Doha meeting established the continuation of the Kyoto Protocol, the legally-binding global agreement under which signatory countries had agreed to reduce their carbon emissions. Contrary to this assumed obligation, all G20 countries with the exception of France and the UK saw significant increases in their CO2 emissions over the last 25 years, surpassing 300% in the case of China. This paper attempts to forecast the evolution of carbon dioxide emissions in Bahrain over the 2012–2021 decade by employing seven automated forecasting methods, including the exponential smoothing state space model (ETS), the Holt–Winters model, the BATS/TBATS model, ARIMA, the structural time series model (STS), the naive model, and the neural network time series forecasting method (NNAR). Results indicate a reversal of the current decreasing trend of pollution in the country, with a point estimate of 23.09 metric tons per capita at the end of 2020 and 23.17 at the end of 2021, as compared to the 19.34 metric tons per capita achieved in 2010. The country's baseline level corresponding to year 1990 (as specified by the Doha amendment of the Kyoto Protocol) is approximately 25.54 metric tons per capita, which implies a maximum level of 20.96 metric tons per capita for the year 2020 (corresponding to a decrease of 18% relative to the baseline level) in order for Bahrain to comply with the protocol. Our results therefore suggest that Bahrain cannot meet its assumed target.
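
    Several of the models named above (ETS, BATS/TBATS, NNAR) come from R's forecast package; the sketch below reproduces the same workflow with one of them, ARIMA, in Python's statsmodels. The emissions series here is synthetic and every parameter choice is an assumption, not the paper's specification.

      import numpy as np
      import pandas as pd
      from statsmodels.tsa.arima.model import ARIMA

      # hypothetical annual per-capita CO2 series (metric tons); real data
      # would come from the World Bank or a similar source
      years = pd.date_range("1990", periods=22, freq="YS")
      co2 = pd.Series(25.54 + np.random.randn(22).cumsum() * 0.3, index=years)

      model = ARIMA(co2, order=(1, 1, 1)).fit()
      forecast = model.forecast(steps=10)   # point estimates for 2012-2021
      print(forecast)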

  16. Fully automated objective-based method for master recession curve separation.

    Science.gov (United States)

    Posavec, Kristijan; Parlov, Jelena; Nakić, Zoran

    2010-01-01

    The fully automated objective-based method for master recession curve (MRC) separation was developed using a Microsoft Excel spreadsheet and Visual Basic for Applications (VBA) code. The core of the program code is used to construct an MRC by using the adapted matching strip method (Posavec et al. 2006). Criteria for separating the MRC into two or three segments are determined from the flow-duration curve and are represented as the probable range of percent of flow rate duration. Successive separations are performed automatically on two and three MRCs using sets of percent of flow rate duration from the selected ranges, and the optimal separation model scenario, having the highest average coefficient of determination R², is selected as the most appropriate one. The resulting separated master recession curves are presented graphically, whereas the statistics are presented numerically, all in separate sheets. Examples of field data obtained from two springs in Istria, Croatia, are used to illustrate its application. The freely available Excel spreadsheet and VBA program ensure ease of use and applicability for larger data sets.

  17. Analysis of Complexity Evolution Management and Human Performance Issues in Commercial Aircraft Automation Systems

    Science.gov (United States)

    Vakil, Sanjay S.; Hansman, R. John

    2000-01-01

    Autoflight systems in the current generation of aircraft have been implicated in several recent incidents and accidents. A contributory aspect to these incidents may be the manner in which aircraft transition between differing behaviours or 'modes.' The current state of aircraft automation was investigated and the incremental development of the autoflight system was tracked through a set of aircraft to gain insight into how these systems developed. This process appears to have resulted in a system without a consistent global representation. In order to evaluate and examine autoflight systems, a 'Hybrid Automation Representation' (HAR) was developed. This representation was used to examine several specific problems known to exist in aircraft systems. Cyclomatic complexity is an analysis tool from computer science which counts the number of linearly independent paths through a program graph. This approach was extended to examine autoflight mode transitions modelled with the HAR. A survey was conducted of pilots to identify those autoflight mode transitions which airline pilots find difficult. The transitions identified in this survey were analyzed using cyclomatic complexity to gain insight into the apparent complexity of the autoflight system from the perspective of the pilot. Mode transitions which had been identified as complex by pilots were found to have a high cyclomatic complexity. Further examination was made into a set of specific problems identified in aircraft: the lack of a consistent representation of automation, concern regarding appropriate feedback from the automation, and the implications of physical limitations on the autoflight systems. Mode transitions involved in changing to and leveling at a new altitude were identified across multiple aircraft by numerous pilots. Where possible, evaluation and verification of the behaviour of these autoflight mode transitions was investigated via aircraft-specific high fidelity simulators. Three solution
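
    Cyclomatic complexity, as used above, is V(G) = E − N + 2P for a graph with E edges, N nodes, and P connected components. A minimal sketch applied to a toy mode-transition graph; the mode names and edges are hypothetical, not taken from any particular autoflight system.

      def cyclomatic_complexity(edges, nodes=None):
          """McCabe's cyclomatic complexity V(G) = E - N + 2P for a graph
          given as a list of (src, dst) edges."""
          if nodes is None:
              nodes = {n for e in edges for n in e}
          # count connected components P via union-find on the node set
          parent = {n: n for n in nodes}
          def find(x):
              while parent[x] != x:
                  parent[x] = parent[parent[x]]  # path halving
                  x = parent[x]
              return x
          for a, b in edges:
              parent[find(a)] = find(b)
          p = len({find(n) for n in nodes})
          return len(edges) - len(nodes) + 2 * p

      # toy altitude-capture mode graph: VS -> ALT* -> ALT, plus extra paths
      edges = [("VS", "ALT*"), ("ALT*", "ALT"), ("VS", "ALT"), ("ALT*", "VS")]
      print(cyclomatic_complexity(edges))   # 4 - 3 + 2 = 3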

  18. Development of a System and Method for Automated Isolation of Stromal Vascular Fraction from Adipose Tissue Lipoaspirate

    Directory of Open Access Journals (Sweden)

    Swathi SundarRaj

    2015-01-01

    Full Text Available Autologous fat grafting for soft tissue reconstruction is challenged by unpredictable long-term graft survival. Fat-derived stromal vascular fraction (SVF) is gaining popularity in tissue reconstruction, as SVF-enriched fat grafts demonstrate improved engraftment. SVF also has potential in regenerative medicine for remodeling of ischemic tissues by promoting angiogenesis. Since SVF cells do not require culture expansion, attempts are being made to develop automated devices to isolate SVF at the point of care. We report development of a closed, automated system to process up to 500 mL lipoaspirate using cell size-dependent filtration technology. The yield of SVF obtained by automated tissue digestion and filtration (1.17 ± 0.5 × 10⁵ cells/gram) was equivalent to that obtained by manual isolation (1.15 ± 0.3 × 10⁵; p = 0.8), and the viability of the cells isolated by both methods was greater than 90%. Cell composition included CD34+CD31− adipose stromal cells, CD34+CD31+ endothelial progenitor cells, and CD34−CD31+ endothelial cells, and their relative percentages were equivalent to SVF isolated by the manual method. CFU-F capacity and expression of angiogenic factors were also comparable with the manual method, establishing proof-of-concept for fully automated SVF isolation, suitable for use in reconstructive surgeries and regenerative medicine applications.

  19. AGAPE (Automated Genome Analysis PipelinE) for pan-genome analysis of Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Giltae Song

    Full Text Available The characterization and public release of genome sequences from thousands of organisms is expanding the scope for genetic variation studies. However, understanding the phenotypic consequences of genetic variation remains a challenge in eukaryotes due to the complexity of the genotype-phenotype map. One approach to this is the intensive study of model systems for which diverse sources of information can be accumulated and integrated. Saccharomyces cerevisiae is an extensively studied model organism, with well-known protein functions and thoroughly curated phenotype data. To develop and expand the available resources linking genomic variation with function in yeast, we aim to model the pan-genome of S. cerevisiae. To initiate the yeast pan-genome, we newly sequenced or re-sequenced the genomes of 25 strains that are commonly used in the yeast research community using advanced sequencing technology at high quality. We also developed a pipeline for automated pan-genome analysis, which integrates the steps of assembly, annotation, and variation calling. To assign strain-specific functional annotations, we identified genes that were not present in the reference genome. We classified these according to their presence or absence across strains and characterized each group of genes with known functional and phenotypic features. The functional roles of novel genes not found in the reference genome and associated with strains or groups of strains appear to be consistent with anticipated adaptations in specific lineages. As more S. cerevisiae strain genomes are released, our analysis can be used to collate genome data and relate it to lineage-specific patterns of genome evolution. Our new tool set will enhance our understanding of genomic and functional evolution in S. cerevisiae, and will be available to the yeast genetics and molecular biology community.

  20. AGAPE (Automated Genome Analysis PipelinE) for pan-genome analysis of Saccharomyces cerevisiae.

    Science.gov (United States)

    Song, Giltae; Dickins, Benjamin J A; Demeter, Janos; Engel, Stacia; Gallagher, Jennifer; Choe, Kisurb; Dunn, Barbara; Snyder, Michael; Cherry, J Michael

    2015-01-01

    The characterization and public release of genome sequences from thousands of organisms is expanding the scope for genetic variation studies. However, understanding the phenotypic consequences of genetic variation remains a challenge in eukaryotes due to the complexity of the genotype-phenotype map. One approach to this is the intensive study of model systems for which diverse sources of information can be accumulated and integrated. Saccharomyces cerevisiae is an extensively studied model organism, with well-known protein functions and thoroughly curated phenotype data. To develop and expand the available resources linking genomic variation with function in yeast, we aim to model the pan-genome of S. cerevisiae. To initiate the yeast pan-genome, we newly sequenced or re-sequenced the genomes of 25 strains that are commonly used in the yeast research community using advanced sequencing technology at high quality. We also developed a pipeline for automated pan-genome analysis, which integrates the steps of assembly, annotation, and variation calling. To assign strain-specific functional annotations, we identified genes that were not present in the reference genome. We classified these according to their presence or absence across strains and characterized each group of genes with known functional and phenotypic features. The functional roles of novel genes not found in the reference genome and associated with strains or groups of strains appear to be consistent with anticipated adaptations in specific lineages. As more S. cerevisiae strain genomes are released, our analysis can be used to collate genome data and relate it to lineage-specific patterns of genome evolution. Our new tool set will enhance our understanding of genomic and functional evolution in S. cerevisiae, and will be available to the yeast genetics and molecular biology community.

  1. Semi-automated calibration method for modelling of mountain permafrost evolution in Switzerland

    Directory of Open Access Journals (Sweden)

    A. Marmy

    2015-09-01

    Full Text Available Permafrost is a widespread phenomenon in the European Alps. Many important topics, such as the future evolution of permafrost related to climate change and the detection of permafrost at potential natural hazard sites, are of major concern to our society. Numerical permafrost models are the only tools which facilitate the projection of the future evolution of permafrost. Due to the complexity of the processes involved and the heterogeneity of Alpine terrain, models must be carefully calibrated, and results should be compared with observations at the site (borehole) scale. However, a large number of local point data are necessary to obtain a broad overview of the thermal evolution of mountain permafrost over a larger area, such as the Swiss Alps, and site-specific model calibration of each point would be time-consuming. To address this issue, this paper presents a semi-automated calibration method using Generalized Likelihood Uncertainty Estimation (GLUE) as implemented in a 1-D soil model (CoupModel) and applies it to six permafrost sites in the Swiss Alps prior to long-term permafrost evolution simulations. We show that this automated calibration method is able to accurately reproduce the main thermal condition characteristics, with some limitations at sites with unique conditions such as 3-D air or water circulation, which have to be calibrated manually. The calibration obtained was used for RCM-based long-term simulations under the A1B climate scenario, specifically downscaled at each borehole site. The projection shows general permafrost degradation with thawing at 10 m, even partially reaching 20 m depth, until the end of the century, but with different timing among the sites. The degradation is more rapid at bedrock sites, whereas ice-rich sites with a blocky surface cover showed a reduced sensitivity to climate change. The snow cover duration is expected to be reduced drastically (between −20 and −37 %), impacting the ground thermal
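
    GLUE calibration samples many parameter sets, scores each simulation against observations with an informal likelihood measure, and keeps the "behavioural" sets; the retained ensemble then drives the projections. Below is a minimal sketch in which the simulator interface, the likelihood measure (Nash-Sutcliffe efficiency), and the acceptance threshold are all assumptions rather than details from the paper.

      import numpy as np

      def glue_calibrate(simulate, observed, bounds, n_samples=5000,
                         threshold=0.5, rng=None):
          """Minimal GLUE loop.

          simulate : callable mapping a parameter vector to a simulated
                     ground-temperature series matching `observed` in length
          bounds   : list of (low, high) tuples, one per parameter
          """
          rng = rng or np.random.default_rng(0)
          obs = np.asarray(observed, dtype=float)
          behavioural = []
          for _ in range(n_samples):
              theta = np.array([rng.uniform(lo, hi) for lo, hi in bounds])
              sim = np.asarray(simulate(theta), dtype=float)
              # Nash-Sutcliffe efficiency as the likelihood measure
              nse = 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
              if nse >= threshold:
                  behavioural.append((nse, theta))
          # predictions can be weighted by nse to form uncertainty bounds
          return behavioural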

  2. Automated kidney morphology measurements from ultrasound images using texture and edge analysis

    Science.gov (United States)

    Ravishankar, Hariharan; Annangi, Pavan; Washburn, Michael; Lanning, Justin

    2016-04-01

    In a typical ultrasound scan, a sonographer measures kidney morphology to assess renal abnormalities. Kidney morphology can also help to discriminate between chronic and acute kidney failure. The caliper placements and volume measurements are often time consuming, and an automated solution would help to improve accuracy, repeatability and throughput. In this work, we developed an automated kidney morphology measurement solution from long-axis ultrasound scans. Automated kidney segmentation is challenging due to wide variability in kidney shape and size, weak contrast of the kidney boundaries, and the presence of strong edges such as the diaphragm and fat layers. To address these challenges and accurately localize and detect kidney regions, we present a two-step algorithm that makes use of edge and texture information in combination with anatomical cues. First, we use an edge analysis technique to localize the kidney region by matching the edge map with predefined templates. To accurately estimate the kidney morphology, we use textural information in a machine learning framework using Haar features and a gradient boosting classifier. We have tested the algorithm on 45 unseen cases; performance against ground truth is measured by computing the Dice overlap and the percent error in the major and minor axes of the kidney. The algorithm shows successful performance on 80% of cases.
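
    As a rough sketch of the texture-classification stage (not the authors' feature set or tuning), the code below computes a few Haar-like responses per image patch and trains scikit-learn's GradientBoostingClassifier; the patches and labels are hypothetical.

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier

      def haar_like_features(patch):
          """Two simple Haar-like responses (left-right and top-bottom
          intensity differences) plus the patch mean and variance."""
          h, w = patch.shape
          lr = patch[:, : w // 2].mean() - patch[:, w // 2 :].mean()
          tb = patch[: h // 2, :].mean() - patch[h // 2 :, :].mean()
          return [lr, tb, patch.mean(), patch.var()]

      # hypothetical training patches labelled kidney (1) / background (0)
      rng = np.random.default_rng(0)
      patches = rng.random((200, 16, 16))
      labels = rng.integers(0, 2, 200)

      X = np.array([haar_like_features(p) for p in patches])
      clf = GradientBoostingClassifier().fit(X, labels)
      print(clf.predict(X[:5]))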

  3. Comparison of automated brain volumetry methods with stereology in children aged 2 to 3 years

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, Kristina N. [University Children's Hospital of Zurich, Center for MR Research, Zurich (Switzerland); University Children's Hospital, Pediatric Cardiology, Zurich (Switzerland)]; Latal, Beatrice [University Children's Hospital, Child Development Center, Zurich (Switzerland); University Children's Hospital, Children's Research Center, Zurich (Switzerland)]; Knirsch, Walter [University Children's Hospital, Pediatric Cardiology, Zurich (Switzerland); University Children's Hospital, Children's Research Center, Zurich (Switzerland)]; Scheer, Ianina [University Children's Hospital, Department for Diagnostic Neuroradiology, Zurich (Switzerland)]; Rhein, Michael von [University Children's Hospital, Child Development Center, Zurich (Switzerland)]; Reich, Bettina; Bauer, Juergen; Gummel, Kerstin [Justus-Liebig University, Pediatric Heart Center, University Hospital Giessen, Giessen (Germany)]; Roberts, Neil [University of Edinburgh, Clinical Research and Imaging Centre (CRIC), The Queens Medical Research Institute (QMRI), Edinburgh (United Kingdom)]; O'Gorman Tuura, Ruth [University Children's Hospital of Zurich, Center for MR Research, Zurich (Switzerland); University Children's Hospital, Children's Research Center, Zurich (Switzerland)]

    2016-09-15

    The accurate and precise measurement of brain volumes in young children is important for early identification of children with reduced brain volumes and an increased risk for neurodevelopmental impairment. Brain volumes can be measured from cerebral MRI (cMRI), but most neuroimaging tools used for cerebral segmentation and volumetry were developed for use in adults and have not been validated in infants or young children. Here, we investigate the feasibility and accuracy of three automated software methods (i.e., SPM, FSL, and FreeSurfer) for brain volumetry in young children and compare the measures with corresponding volumes obtained using the Cavalieri method of modern design stereology. Cerebral MRI data were collected from 21 children with a complex congenital heart disease (CHD) before Fontan procedure, at a median age of 27 months (range 20.9-42.4 months). Data were segmented with SPM, FSL, and FreeSurfer, and total intracranial volume (ICV) and total brain volume (TBV) were compared with corresponding measures obtained using the Cavalieri method. Agreement between the estimated brain volumes (ICV and TBV) relative to the gold standard stereological volumes was strongest for FreeSurfer (p < 0.001) and moderate for SPM segment (ICV p = 0.05; TBV p = 0.006). No significant association was evident between ICV and TBV obtained using SPM NewSegment and FSL FAST and the corresponding stereological volumes. FreeSurfer provides an accurate method for measuring brain volumes in young children, even in the presence of structural brain abnormalities. (orig.)

  4. Hyper-Cam automated calibration method for continuous hyperspectral imaging measurements

    Science.gov (United States)

    Gagnon, Jean-Philippe; Habte, Zewdu; George, Jacks; Farley, Vincent; Tremblay, Pierre; Chamberland, Martin; Romano, Joao; Rosario, Dalton

    2010-04-01

    The midwave and longwave infrared regions of the electromagnetic spectrum contain rich information which can be captured by hyperspectral sensors, thus enabling enhanced detection of targets of interest. A continuous hyperspectral imaging measurement capability, operated 24/7 over varying seasons and weather conditions, permits the evaluation of hyperspectral imaging for detection of different types of targets in real-world environments. Such a measurement site was built at Picatinny Arsenal under the Spectral and Polarimetric Imagery Collection Experiment (SPICE), where two Hyper-Cam hyperspectral imagers are installed at the Precision Armament Laboratory (PAL) and have operated autonomously since fall 2009. The Hyper-Cam sensors are currently collecting a complete hyperspectral database that contains MWIR and LWIR hyperspectral measurements of several targets under day, night, sunny, cloudy, foggy, rainy and snowy conditions. The Telops Hyper-Cam sensor is an imaging spectrometer that provides spatial and spectral analysis capabilities in a single sensor. It is based on Fourier-transform technology, yielding high spectral resolution and enabling high-accuracy radiometric calibration. It provides datacubes of up to 320x256 pixels at spectral resolutions of up to 0.25 cm⁻¹. The MWIR version covers the 3 to 5 μm spectral range and the LWIR version covers the 8 to 12 μm spectral range. This paper describes the automated operation of the two Hyper-Cam sensors being used in the SPICE data collection. The Reveal Automation Control Software (RACS), developed collaboratively between Telops, ARDEC, and ARL, enables flexible operating parameters and autonomous calibration. Under the RACS software, the Hyper-Cam sensors can autonomously calibrate themselves using their internal blackbody targets, and the calibration events are initiated at user-defined time intervals and by internal beamsplitter temperature monitoring. The RACS software is the first software developed for

  5. Automated multi-plug filtration cleanup for liquid chromatographic-tandem mass spectrometric pesticide multi-residue analysis in representative crop commodities.

    Science.gov (United States)

    Qin, Yuhong; Zhang, Jingru; Zhang, Yuan; Li, Fangbing; Han, Yongtao; Zou, Nan; Xu, Haowei; Qian, Meiyuan; Pan, Canping

    2016-09-01

    An automated multi-plug filtration cleanup (m-PFC) method on modified QuEChERS (quick, easy, cheap, effective, rugged, and safe) extracts was developed. The automated device was designed to reduce the labor-intensive manual workload in the cleanup steps. It could control the volume and the speed of the pulling and pushing cycles accurately. In this work, m-PFC was based on multi-walled carbon nanotubes (MWCNTs) mixed with other sorbents and anhydrous magnesium sulfate (MgSO4) in a packed tip for analysis of pesticide multi-residues in crop commodities, followed by liquid chromatography with tandem mass spectrometric (LC-MS/MS) detection. It was validated by analyzing 25 pesticides in six representative matrices spiked at two concentration levels of 10 and 100 μg/kg. Salts, sorbents, the m-PFC procedure, the automated pulling and pushing volume, the automated pulling speed, and the pushing speed were optimized for each matrix. After optimization, two general automated m-PFC methods were introduced for relatively simple (apple, citrus fruit, peanut) and relatively complex (spinach, leek, green tea) matrices. Spike recoveries were between 83 and 108%, with 1-14% RSD, for most analytes in the tested matrices. Matrix-matched calibrations were performed, with coefficients of determination >0.997 between concentration levels of 10 and 1000 μg/kg. The developed method was successfully applied to the determination of pesticide residues in market samples.

  6. Comparison between automated system and PCR-based method for identification and antimicrobial susceptibility profile of clinical Enterococcus spp.

    Science.gov (United States)

    Furlaneto-Maia, Luciana; Rocha, Kátia Real; Siqueira, Vera Lúcia Dias; Furlaneto, Márcia Cristina

    2014-01-01

    Enterococci are increasingly responsible for nosocomial infections worldwide. This study was undertaken to compare the identification and antimicrobial susceptibility profiling of clinical Enterococcus spp. isolates using an automated MicroScan system, a PCR-based assay and the disk diffusion assay. We evaluated 30 clinical isolates of Enterococcus spp. Isolates were identified by the MicroScan system and by the PCR-based assay. The presence of antibiotic resistance genes (vancomycin, gentamicin, tetracycline and erythromycin) was also determined by PCR. Antimicrobial susceptibilities to vancomycin (30 µg), gentamicin (120 µg), tetracycline (30 µg) and erythromycin (15 µg) were tested by the automated system and the disk diffusion method, and were interpreted according to the criteria recommended in the CLSI guidelines. Concerning Enterococcus identification, the general agreement between data obtained by the PCR method and by the automated system was 90.0% (27/30). For all isolates of E. faecium and E. faecalis we observed 100% agreement. Resistance frequencies were higher in E. faecium than in E. faecalis. The resistance rates obtained were highest for erythromycin (86.7%), followed by vancomycin (80.0%), tetracycline (43.3%) and gentamicin (33.3%). The correlation between disk diffusion and automation revealed agreement for the majority of the antibiotics, with category agreement rates of >80%. In the PCR-based assay, the vanA gene was detected in 100% of vancomycin-resistant enterococci. This assay is simple to conduct and reliable for the identification of clinically relevant enterococci. The data obtained reinforce the need for improvement of the automated system to identify some enterococci.

  7. A semi-automated method for measuring thickness and white matter integrity of the corpus callosum

    Directory of Open Access Journals (Sweden)

    S Andronikou

    2012-12-01

    Full Text Available Aim. Diseases affecting cerebral white matter may lead to left-right asymmetries and atrophy of interhemispheric connections, i.e. the corpus callosum (CC). Our aim was to describe and test a semi-automated system that divides the midline CC into a number of segments, determines the thickness at each, and then performs fibre tracking from these segments. Methods. Six normal female volunteers (average age 25.8 ± 6.7 years) and a female patient with diagnosed multiple sclerosis (age 26 years) were scanned on a 3T MRI. We performed diffusion-weighted imaging in 12 directions, and calculated diffusion tensors and fractional anisotropy (FA) maps from this pre-processed data. Fibre tracking from a region-of-interest encompassing the entire CC was done. This fibre data, together with the FA maps and the unweighted diffusion tensor imaging (DTI) image (b = 0 s/mm²), were imported into a custom tool written in MATLAB. The midline sagittal position was carefully defined by selecting multiple midline points in coronal and axial views and rotating the image volume and fibre co-ordinates accordingly. Using the customised tool, dorsal and ventral CC contours were manually drawn on the mid-sagittal FA image, initiating automated calculation of a contour midway between these manually drawn lines. The programme was designed to then divide the midline contour into a pre-selected number of segments; from each segment border, perpendicular spokes were projected until they intersected with the dorsal and ventral contours. This technique divided the CC into a pre-set number of segments, the number of which was limited by the spatial resolution. It was decided to set the number at 40 to ensure that each segment depicted a contiguous strip of voxels across the CC from the dorsal to the ventral contour. The system allows these segments to then be used as seeds for separate fibre tracking in each cerebral hemisphere, and various parameters are automatically plotted as a function of

  8. Automated analysis of short responses in an interactive synthetic tutoring system for introductory physics

    Science.gov (United States)

    Nakamura, Christopher M.; Murphy, Sytil K.; Christel, Michael G.; Stevens, Scott M.; Zollman, Dean A.

    2016-06-01

    Computer-automated assessment of students' text responses to short-answer questions represents an important enabling technology for online learning environments. We have investigated the use of machine learning to train computer models capable of automatically classifying short-answer responses and assessed the results. Our investigations are part of a project to develop and test an interactive learning environment designed to help students learn introductory physics concepts. The system is designed around an interactive video tutoring interface. We analyzed 9 questions, each with about 150 responses or fewer. For 4 of the 9 questions, we observed automated assessment with interrater agreement of 70% or better with the human rater. This level of agreement may represent a baseline for practical utility in instruction and indicates that the method warrants further investigation for use in this type of application. Our results also suggest strategies that may be useful for writing activities and questions that are more appropriate for automated assessment. These strategies include building activities that have relatively few conceptually distinct ways of perceiving the physical behavior of relatively few physical objects. Further success in this direction may allow us to promote interactivity and better provide feedback in online learning systems. These capabilities could enable our system to function more like a real tutor.
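
    The abstract does not specify the model family; as one plausible configuration, the sketch below trains a bag-of-words classifier on rater-labelled responses with scikit-learn. The example responses, labels, and pipeline choices are all assumptions.

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline

      # hypothetical student responses with rater-assigned categories
      responses = [
          "the ball speeds up because gravity pulls it down",
          "it moves at a constant speed",
          "acceleration is constant so velocity increases",
          "the speed stays the same the whole time",
      ]
      categories = ["accelerating", "constant", "accelerating", "constant"]

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression())
      # cross-validated agreement with the human labels
      scores = cross_val_score(model, responses, categories, cv=2)
      print(scores.mean())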

  9. Automated methods of predicting the function of biological sequences using GO and BLAST

    Directory of Open Access Journals (Sweden)

    Baumann Ute

    2005-11-01

    Full Text Available Abstract Background With the exponential increase in genomic sequence data there is a need to develop automated approaches to deducing the biological functions of novel sequences with high accuracy. Our aim is to demonstrate how accuracy benchmarking can be used in a decision-making process evaluating competing designs of biological function predictors. We utilise the Gene Ontology (GO), a directed acyclic graph of functional terms, to annotate sequences with functional information describing their biological context. Initially we examine the effect on accuracy scores of increasing the allowed distance between predicted terms and a test set of curator-assigned terms. Next we evaluate several annotator methods using accuracy benchmarking. Given an unannotated sequence, we use the Basic Local Alignment Search Tool (BLAST) to find similar sequences that have already been assigned GO terms by curators. A number of methods were developed that utilise terms associated with the best five matching sequences. These methods were compared against a benchmark method of simply using terms associated with the best BLAST-matched sequence (the best BLAST approach). Results The precision and recall of estimates increase rapidly as the amount of distance permitted between a predicted term and a correct term assignment increases. Accuracy benchmarking allows a comparison of annotation methods. A covering graph approach performs poorly, except where the term assignment rate is high. A term distance concordance approach has a similar accuracy to the best BLAST approach, demonstrating lower precision but higher recall. However, a discriminant function method has higher precision and recall than the best BLAST approach and the other methods shown here. Conclusion Allowing term predictions to be counted correct if closely related to a correct term decreases the reliability of the accuracy score. As such we recommend using accuracy measures that require exact matching of predicted

  10. Computer-automated multi-disciplinary analysis and design optimization of internally cooled turbine blades

    Science.gov (United States)

    Martin, Thomas Joseph

    This dissertation presents the theoretical methodology, organizational strategy, conceptual demonstration and validation of a fully automated computer program for the multi-disciplinary analysis, inverse design and optimization of convectively cooled axial gas turbine blades and vanes. Parametric computer models of the three-dimensional cooled turbine blades and vanes were developed, including the automatic generation of discretized computational grids. Several new analysis programs were written and incorporated with existing computational tools to provide computer models of the engine cycle, aero-thermodynamics, heat conduction and thermofluid physics of the internally cooled turbine blades and vanes. A generalized information transfer protocol was developed to provide the automatic mapping of geometric and boundary condition data between the parametric design tool and the numerical analysis programs. A constrained hybrid optimization algorithm controlled the overall operation of the system and guided the multi-disciplinary internal turbine cooling design process towards the objectives and constraints of engine cycle performance, aerodynamic efficiency, cooling effectiveness and turbine blade and vane durability. Several boundary element computer programs were written to solve the steady-state non-linear heat conduction equation inside the internally cooled and thermal barrier-coated turbine blades and vanes. The boundary element method (BEM) did not require grid generation inside the internally cooled turbine blades and vanes, so the parametric model was very robust. Implicit differentiations of the BEM thermal and thermo-elastic analyses were done to compute design sensitivity derivatives faster and more accurately than via explicit finite differencing. A factor of three savings of computer processing time was realized for two-dimensional thermal optimization problems, and a factor of twenty was obtained for three-dimensional thermal optimization problems

  11. A rapid and automated relocation method of an AFM probe for high-resolution imaging

    Science.gov (United States)

    Zhou, Peilin; Yu, Haibo; Shi, Jialin; Jiao, Niandong; Wang, Zhidong; Wang, Yuechao; Liu, Lianqing

    2016-09-01

    The atomic force microscope (AFM) is one of the most powerful tools for high-resolution imaging and high-precision positioning for nanomanipulation. The selection of the scanning area of the AFM depends on the use of the optical microscope. However, the resolution of an optical microscope is generally no larger than 200 nm owing to wavelength limitations of visible light. Taking into consideration the two determinants of relocation—relative angular rotation and positional offset between the AFM probe and nano target—it is therefore extremely challenging to precisely relocate the AFM probe to the initial scan/manipulation area for the same nano target after the AFM probe has been replaced, or after the sample has been moved. In this paper, we investigate a rapid automated relocation method for the nano target of an AFM using a coordinate transformation. The relocation process is both simple and rapid; moreover, multiple nano targets can be relocated by only identifying a pair of reference points. It possesses a centimeter-scale location range and nano-scale precision. The main advantages of this method are that it overcomes the limitations associated with the resolution of optical microscopes, and that it is label-free on the target areas, which means that it does not require the use of special artificial markers on the target sample areas. Relocation experiments using nanospheres, DNA, SWCNTs, and nano patterns amply demonstrate the practicality and efficiency of the proposed method, which provides technical support for mass nanomanipulation and detection based on AFM for multiple nano targets that are widely distributed in a large area.
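
    Recovering the frame change from a single pair of reference points amounts to estimating a 2-D rotation plus translation. A minimal sketch follows; the coordinates are hypothetical, and the paper's actual transformation may also account for scale.

      import numpy as np

      def relocation_transform(p_old, q_old, p_new, q_new):
          """Recover the rotation R and translation t that map old-frame
          coordinates to the new frame from one pair of reference points
          (p, q) identified in both frames."""
          v_old = np.subtract(q_old, p_old)
          v_new = np.subtract(q_new, p_new)
          theta = np.arctan2(v_new[1], v_new[0]) - np.arctan2(v_old[1], v_old[0])
          R = np.array([[np.cos(theta), -np.sin(theta)],
                        [np.sin(theta),  np.cos(theta)]])
          t = np.asarray(p_new) - R @ np.asarray(p_old)
          return R, t

      def relocate(points_old, R, t):
          """Map previously recorded target coordinates into the new frame."""
          return (R @ np.atleast_2d(points_old).T).T + t

      # two reference points seen before and after probe/sample exchange
      # (hypothetical: a 45-degree rotation plus an offset of (5, 5))
      R, t = relocation_transform((0, 0), (10, 0), (5, 5),
                                  (5 + 10 / np.sqrt(2), 5 + 10 / np.sqrt(2)))
      print(relocate([(3, 4)], R, t))   # new-frame position of an old target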

  12. A rapid and automated relocation method of an AFM probe for high-resolution imaging.

    Science.gov (United States)

    Zhou, Peilin; Yu, Haibo; Shi, Jialin; Jiao, Niandong; Wang, Zhidong; Wang, Yuechao; Liu, Lianqing

    2016-09-30

    The atomic force microscope (AFM) is one of the most powerful tools for high-resolution imaging and high-precision positioning for nanomanipulation. The selection of the scanning area of the AFM depends on the use of the optical microscope. However, the resolution of an optical microscope is generally no larger than 200 nm owing to wavelength limitations of visible light. Taking into consideration the two determinants of relocation (relative angular rotation and positional offset between the AFM probe and nano target), it is therefore extremely challenging to precisely relocate the AFM probe to the initial scan/manipulation area for the same nano target after the AFM probe has been replaced, or after the sample has been moved. In this paper, we investigate a rapid automated relocation method for the nano target of an AFM using a coordinate transformation. The relocation process is both simple and rapid; moreover, multiple nano targets can be relocated by only identifying a pair of reference points. It possesses a centimeter-scale location range and nano-scale precision. The main advantages of this method are that it overcomes the limitations associated with the resolution of optical microscopes, and that it is label-free on the target areas, which means that it does not require the use of special artificial markers on the target sample areas. Relocation experiments using nanospheres, DNA, SWCNTs, and nano patterns amply demonstrate the practicality and efficiency of the proposed method, which provides technical support for mass nanomanipulation and detection based on AFM for multiple nano targets that are widely distributed in a large area.

  13. Integrating automated structured analysis and design with Ada programming support environments

    Science.gov (United States)

    Hecht, Alan; Simmons, Andy

    1986-01-01

    Ada Programming Support Environments (APSE) include many powerful tools that address the implementation of Ada code. These tools do not address the entire software development process. Structured analysis is a methodology that addresses the creation of complete and accurate system specifications. Structured design takes a specification, derives a plan to decompose the system into subcomponents, and provides heuristics to optimize the software design to minimize errors and maintenance. It can also aid the creation of reusable modules. Studies have shown that most software errors result from poor system specifications, and that these errors also become more expensive to fix as the development process continues. Structured analysis and design help to uncover errors in the early stages of development. The APSE tools help to ensure that the code produced is correct, and aid in finding obscure coding errors. However, they do not have the capability to detect errors in specifications or to detect poor designs. TEAMWORK, an automated system for structured analysis and design that can be integrated with an APSE to support software systems development from specification through implementation, is described. These tools complement each other to help developers improve quality and productivity, as well as to reduce development and maintenance costs. Complete system documentation and reusable code also result from the use of these tools. Integrating an APSE with automated tools for structured analysis and design provides capabilities and advantages beyond those realized with any of these systems used by themselves.

  14. Comparative analysis of continuous sampling with laboratory analysis versus an automated air quality monitoring system

    Institute of Scientific and Technical Information of China (English)

    严业华

    2012-01-01

    Three pollutants in ambient air, sulfur dioxide, nitrogen dioxide, and inhalable particulate matter, were measured by two methods, continuous sampling with laboratory analysis and an automated air quality monitoring system; the results of the two methods were compared and the causes of the differences analyzed.

  15. A new automated assessment method for contrast-detail images by applying support vector machine and its robustness to nonlinear image processing.

    Science.gov (United States)

    Takei, Takaaki; Ikeda, Mitsuru; Imai, Kuniharu; Yamauchi-Kawaura, Chiyo; Kato, Katsuhiko; Isoda, Haruo

    2013-09-01

    The automated contrast-detail (C-D) analysis methods developed so far cannot be expected to work well on images processed with nonlinear methods, such as noise reduction methods. Therefore, we have devised a new automated C-D analysis method by applying a support vector machine (SVM), and tested its robustness to nonlinear image processing. We acquired the CDRAD (a commercially available C-D test object) images at a tube voltage of 120 kV and a milliampere-second product (mAs) of 0.5-5.0. A partial-diffusion-equation-based technique was used as the noise reduction method. Three radiologists and three university students participated in the observer performance study. The training data for our SVM method were the classification data scored by one radiologist for the CDRAD images acquired at 1.6 and 3.2 mAs and their noise-reduced images. We also compared the performance of our SVM method with the CDRAD Analyser algorithm. The mean C-D diagrams (plots of the mean smallest visible hole diameter vs. hole depth) obtained from our SVM method agreed well with those averaged across the six human observers for both original and noise-reduced CDRAD images, whereas the mean C-D diagrams from the CDRAD Analyser algorithm disagreed with those from the human observers for both original and noise-reduced CDRAD images. In conclusion, our proposed SVM method for C-D analysis will work well for images processed with the nonlinear noise reduction method as well as for the original radiographic images.
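
    A minimal sketch of the classification core (not the paper's features or training protocol): an RBF-kernel SVM trained on hypothetical per-hole feature vectors, where the predicted visibility at each hole would trace out a C-D diagram.

      import numpy as np
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      # hypothetical per-hole feature vectors from CDRAD images
      # (e.g., local contrast, local noise, hole diameter, hole depth),
      # labelled 1 if the reference radiologist scored the hole as visible
      rng = np.random.default_rng(0)
      X = rng.random((300, 4))
      y = (X[:, 0] * X[:, 2] + 0.1 * rng.standard_normal(300) > 0.25).astype(int)

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      clf.fit(X, y)
      # the smallest diameter predicted visible at each depth gives the C-D curve
      print(clf.score(X, y))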

  16. A feasibility assessment of automated FISH image and signal analysis to assist cervical cancer detection

    Science.gov (United States)

    Wang, Xingwei; Li, Yuhua; Liu, Hong; Li, Shibo; Zhang, Roy R.; Zheng, Bin

    2012-02-01

    Fluorescence in situ hybridization (FISH) technology provides a promising molecular imaging tool to detect cervical cancer. Since manual FISH analysis is difficult, time-consuming, and inconsistent, automated FISH image scanning systems have been developed. Due to the limited focal depth of the scanned microscopic images, a FISH-probed specimen needs to be scanned in multiple layers, which generates huge image data volumes. To improve the diagnostic efficiency of automated FISH image analysis, we developed a computer-aided detection (CAD) scheme. In this experiment, four pap-smear specimen slides were scanned by a dual-detector fluorescence image scanning system that acquired two spectrum images simultaneously, representing images of interphase cells and of FISH-probed chromosome X. During image scanning, once a cell signal was detected, the system captured nine image slices by automatically adjusting the optical focus. Based on the sharpness index and maximum intensity measurement, cells and FISH signals distributed in 3-D space were projected into a 2-D con-focal image. The CAD scheme was applied to each con-focal image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm and to detect FISH-probed signals using a top-hat transform. The ratio of abnormal cells was calculated to detect positive cases. In four scanned specimen slides, CAD generated 1676 con-focal images that depicted analyzable cells. FISH-probed signals wer
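
    The white top-hat transform mentioned above subtracts a morphological opening from the image, leaving only structures smaller than the structuring element, i.e. the small bright FISH spots. A minimal sketch with scipy; the spot size and threshold constant are assumptions.

      import numpy as np
      from scipy import ndimage

      def detect_fish_signals(confocal, spot_size=5, k=3.0):
          """Enhance small bright spots with a white top-hat transform,
          then threshold and label the candidate FISH signals."""
          img = confocal.astype(float)
          opened = ndimage.grey_opening(img, size=(spot_size, spot_size))
          tophat = img - opened   # keeps structures smaller than spot_size
          mask = tophat > tophat.mean() + k * tophat.std()
          labels, n = ndimage.label(mask)
          return labels, n        # n candidate FISH signals in this image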