WorldWideScience

Sample records for semi-automated mesoscale analysis

  1. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for the determination of the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded from a 3D OCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement and calculation of central retinal vessel equivalents. Mean vessel recognition rate was 78%, vessel class designation rate 75% and reproducibility between 0.78 and 0.91. Mean A/V ratio was 0.84. Application on a healthy norm cohort showed high congruence with prior published manual methods. Processing time per image was one minute. Quantitative geometrical assessment of the retinal vasculature may be performed in a semi-automated manner using dedicated software tools. Yielding reproducible numerical data within a short time, this may contribute additional value to mere morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
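
    The record above describes a three-step, threshold-based pipeline: vessel recognition with artery/vein designation, width measurement, and calculation of central vessel equivalents. The published MATLAB tool is not reproduced here; the sketch below only illustrates the general flavour of threshold-based width estimation and an A/V ratio from manually designated artery and vein masks, with the filter sizes and helper names being assumptions.

```python
# Illustrative sketch only (not the published MATLAB tool): threshold-based
# vessel-width estimation and a simple arterio-venous (A/V) ratio.
import numpy as np
from scipy import ndimage
from skimage import filters, morphology

def vessel_mask(green_channel: np.ndarray) -> np.ndarray:
    """Step 1 (sketch): enhance dark vessels, then threshold them."""
    inverted = green_channel.max() - green_channel                   # vessels become bright
    tophat = morphology.white_tophat(inverted, morphology.disk(8))   # assumed filter size
    return tophat > filters.threshold_otsu(tophat)

def mean_width(mask: np.ndarray) -> float:
    """Step 2 (sketch): width ~ 2 x distance-to-background sampled on the centreline."""
    dist = ndimage.distance_transform_edt(mask)
    centreline = morphology.skeletonize(mask)
    return float(2.0 * dist[centreline].mean()) if centreline.any() else 0.0

def av_ratio(artery_masks, vein_masks) -> float:
    """Step 3 (sketch): mean arterial calibre over mean venous calibre.
    The published tool combines widths into central retinal vessel equivalents;
    that formula is not reproduced here."""
    a = np.mean([mean_width(m) for m in artery_masks])
    v = np.mean([mean_width(m) for m in vein_masks])
    return a / v

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    green = rng.integers(0, 255, (256, 256)).astype(float)  # placeholder fundus green channel
    print("mean width (px):", round(mean_width(vessel_mask(green)), 2))
```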

  2. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33 ... of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping ...

  3. Semi-automated digital image analysis of patellofemoral joint space width from lateral knee radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Grochowski, S.J. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Amrami, K.K. [Mayo Clinic, Department of Radiology, Rochester (United States); Kaufman, K. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Mayo Clinic/Foundation, Biomechanics Laboratory, Department of Orthopedic Surgery, Charlton North L-110L, Rochester (United States)

    2005-10-01

    To design a semi-automated program to measure minimum patellofemoral joint space width (JSW) using standing lateral view radiographs. Lateral patellofemoral knee radiographs were obtained from 35 asymptomatic subjects. The radiographs were analyzed to report both the repeatability of the image analysis program and the reproducibility of JSW measurements within a 2 week period. The results were also compared with manual measurements done by an experienced musculoskeletal radiologist. The image analysis program was shown to have an excellent coefficient of repeatability of 0.18 and 0.23 mm for intra- and inter-observer measurements respectively. The manual method measured a greater minimum JSW than the automated method. Reproducibility between days was comparable to other published results, but was less satisfactory for both manual and semi-automated measurements. The image analysis program had an inter-day coefficient of repeatability of 1.24 mm, which was lower than 1.66 mm for the manual method. A repeatable semi-automated method for measurement of the patellofemoral JSW from radiographs has been developed. The method is more accurate than manual measurements. However, the between-day reproducibility is higher than the intra-day reproducibility. Further investigation of the protocol for obtaining sequential lateral knee radiographs is needed in order to reduce the between-day variability. (orig.)
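
    The repeatability figures quoted above (0.18 mm, 0.23 mm, 1.24 mm, 1.66 mm) are coefficients of repeatability. The abstract does not spell out the formula; a common Bland-Altman style definition, 1.96 times the standard deviation of paired differences, is sketched below with made-up joint space width values.

```python
import numpy as np

def coefficient_of_repeatability(first: np.ndarray, second: np.ndarray) -> float:
    """Bland-Altman style repeatability: 1.96 * SD of the paired differences.
    (Assumed convention; the abstract does not state the exact formula.)"""
    diffs = np.asarray(first, float) - np.asarray(second, float)
    return 1.96 * diffs.std(ddof=1)

# Hypothetical repeated minimum JSW measurements (mm) on the same radiographs.
day1 = np.array([4.1, 3.8, 5.0, 4.4, 3.9])
day2 = np.array([4.3, 3.7, 4.8, 4.6, 4.0])
print(f"inter-day coefficient of repeatability: {coefficient_of_repeatability(day1, day2):.2f} mm")
```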

  4. Semi-automated volumetric analysis of artificial lymph nodes in a phantom study

    International Nuclear Information System (INIS)

    Fabel, M.; Biederer, J.; Jochens, A.; Bornemann, L.; Soza, G.; Heller, M.; Bolte, H.

    2011-01-01

    Purpose: Quantification of tumour burden in oncology requires accurate and reproducible image evaluation. The current standard is one-dimensional measurement (e.g. RECIST) with inherent disadvantages. Volumetric analysis is discussed as an alternative for therapy monitoring of lung and liver metastases. The aim of this study was to investigate the accuracy of semi-automated volumetric analysis of artificial lymph node metastases in a phantom study. Materials and methods: Fifty artificial lymph nodes were produced in a size range from 10 to 55 mm; some of them enhanced using iodine contrast media. All nodules were placed in an artificial chest phantom (artiCHEST®) within different surrounding tissues. MDCT was performed using different collimations (1–5 mm) at varying reconstruction kernels (B20f, B40f, B60f). Volume and RECIST measurements were performed using Oncology Software (Siemens Healthcare, Forchheim, Germany) and were compared to reference volume and diameter by calculating absolute percentage errors. Results: The software performance allowed a robust volumetric analysis in a phantom setting. Unsatisfying segmentation results were frequently found for native nodules within surrounding muscle. The absolute percentage error (APE) for volumetric analysis varied between 0.01 and 225%. No significant differences were seen between different reconstruction kernels. The most unsatisfactory segmentation results occurred at higher slice thicknesses (4 and 5 mm). Contrast-enhanced lymph nodes tended to show better segmentation results. Conclusion: The semi-automated 3D-volumetric analysis software tool allows a reliable and convenient segmentation of artificial lymph nodes in a phantom setting. Lymph nodes adjacent to tissue of similar density cause segmentation problems. For volumetric analysis of lymph node metastases in clinical routine, a slice thickness of ≤3 mm and a medium soft reconstruction kernel (e.g. B40f for Siemens scan systems) may be a suitable choice.

  5. Semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma stage III/IV - a feasibility study

    International Nuclear Information System (INIS)

    Fabel, M.; Tengg-Kobligk, H. von; Giesel, F.L.; Delorme, S.; Kauczor, H.-U.; Bornemann, L.; Dicken, V.; Kopp-Schneider, A.; Moser, C.

    2008-01-01

    Therapy monitoring in oncological patient care requires accurate and reliable imaging and post-processing methods. RECIST criteria are the current standard, with inherent disadvantages. The aim of this study was to investigate the feasibility of semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma compared to manual volumetric analysis and RECIST. Multislice CT was performed in 47 patients, covering the chest, abdomen and pelvis. In total, 227 suspicious, enlarged lymph nodes were evaluated retrospectively by two radiologists regarding diameters (RECIST), manually measured volume by placement of ROIs and semi-automated volumetric analysis. Volume (ml), quality of segmentation (++/-) and time effort (s) were evaluated in the study. The semi-automated volumetric analysis software tool was rated acceptable to excellent in 81% of all cases (reader 1) and 79% (reader 2). Median time for the entire segmentation process and necessary corrections was shorter with the semi-automated software than by manual segmentation. Bland-Altman plots showed a significantly lower interobserver variability for semi-automated volumetric than for RECIST measurements. The study demonstrated feasibility of volumetric analysis of lymph node metastases. The software allows a fast and robust segmentation in up to 80% of all cases. Ease of use and time needed are acceptable for application in the clinical routine. Variability and interuser bias were reduced to about one third of the values found for RECIST measurements. (orig.)

  6. Volumetric analysis of pelvic hematomas after blunt trauma using semi-automated seeded region growing segmentation: a method validation study.

    Science.gov (United States)

    Dreizin, David; Bodanapally, Uttam K; Neerchal, Nagaraj; Tirada, Nikki; Patlas, Michael; Herskovits, Edward

    2016-11-01

    Manually segmented traumatic pelvic hematoma volumes are strongly predictive of active bleeding at conventional angiography, but the method is time intensive, limiting its clinical applicability. We compared volumetric analysis using semi-automated region growing segmentation to manual segmentation and diameter-based size estimates in patients with pelvic hematomas after blunt pelvic trauma. A 14-patient cohort was selected in an anonymous randomized fashion from a dataset of patients with pelvic binders at MDCT, collected retrospectively as part of a HIPAA-compliant IRB-approved study from January 2008 to December 2013. To evaluate intermethod differences, one reader (R1) performed three volume measurements using the manual technique and three volume measurements using the semi-automated technique. To evaluate interobserver differences for semi-automated segmentation, a second reader (R2) performed three semi-automated measurements. One-way analysis of variance was used to compare differences in mean volumes. Time effort was also compared. Correlation between the two methods as well as two shorthand appraisals (greatest diameter, and the ABC/2 method for estimating ellipsoid volumes) was assessed with Spearman's rho (r). Intraobserver variability was lower for semi-automated compared to manual segmentation, with standard deviations ranging between ±5-32 mL and ±17-84 mL, respectively (p = 0.0003). There was no significant difference in mean volumes between the two readers' semi-automated measurements (p = 0.83); however, means were lower for the semi-automated compared with the manual technique (manual: mean and SD 309.6 ± 139 mL; R1 semi-auto: 229.6 ± 88.2 mL, p = 0.004; R2 semi-auto: 243.79 ± 99.7 mL, p = 0.021). Despite differences in means, the correlation between the two methods was very strong and highly significant (r = 0.91, p < ...). Semi-automatically segmented hematoma volumes correlate strongly with manually segmented volumes. Since semi-automated segmentation ...
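
    One of the shorthand appraisals compared above is the ABC/2 method, which estimates an ellipsoid volume from three orthogonal diameters and closely approximates the exact π/6 × A × B × C formula. A worked sketch with hypothetical hematoma diameters:

```python
import math

def abc_over_2(a_cm: float, b_cm: float, c_cm: float) -> float:
    """ABC/2 ellipsoid approximation; diameters in cm give volume in mL."""
    return (a_cm * b_cm * c_cm) / 2.0

def ellipsoid(a_cm: float, b_cm: float, c_cm: float) -> float:
    """Exact ellipsoid volume from the same diameters: pi/6 * A * B * C."""
    return math.pi / 6.0 * a_cm * b_cm * c_cm

# Hypothetical pelvic hematoma measuring 10 x 8 x 6 cm on MDCT.
print("ABC/2 estimate:", abc_over_2(10, 8, 6), "mL")            # 240.0 mL
print("pi/6 ellipsoid:", round(ellipsoid(10, 8, 6), 1), "mL")   # ~251.3 mL
```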

  7. Semi-automated vectorial analysis of anorectal motion by magnetic resonance defecography in healthy subjects and fecal incontinence.

    Science.gov (United States)

    Noelting, J; Bharucha, A E; Lake, D S; Manduca, A; Fletcher, J G; Riederer, S J; Joseph Melton, L; Zinsmeister, A R

    2012-10-01

    Inter-observer variability limits the reproducibility of pelvic floor motion measured by magnetic resonance imaging (MRI). Our aim was to develop a semi-automated program measuring pelvic floor motion in a reproducible and refined manner. Pelvic floor anatomy and motion during voluntary contraction (squeeze) and rectal evacuation were assessed by MRI in 64 women with fecal incontinence (FI) and 64 age-matched controls. A radiologist measured anorectal angles and anorectal junction motion. A semi-automated program did the same and also dissected anorectal motion into perpendicular vectors representing the puborectalis and other pelvic floor muscles, assessed the pubococcygeal angle, and evaluated pelvic rotation. Manual and semi-automated measurements of anorectal junction motion were correlated (r = 0.70; P < ...) ... controls. This semi-automated program provides a reproducible, efficient, and refined analysis of pelvic floor motion by MRI. Puborectalis injury is independently associated with impaired motion of the puborectalis, but not of other pelvic floor muscles, in controls and women with FI. © 2012 Blackwell Publishing Ltd.

  8. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of right pulmonary artery, proximal aortic arch (PROX) immediately proximal to innominate artery, distal aortic arch (DIST) immediately distal to left subclavian artery, and descending aorta (DESC) at the level of diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to intraclass correlation coefficient (ICC) and Bland–Altman plot. The number of cases with manual contouring or center line adjustment for the semi-automated method and also the post-processing time for each method were recorded. Results: The mean difference between semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels with the highest and lowest number of corrections at the levels of the annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is ...

  9. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo

    2012-07-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".

  10. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo; Huang, Jianhua Z.; Huitink, David; Kundu, Subrata; Mallick, Bani K.; Liang, Hong; Ding, Yu

    2012-01-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".

  11. Comparison of manual and semi-automated delineation of regions of interest for radioligand PET imaging analysis

    International Nuclear Information System (INIS)

    Chow, Tiffany W; Verhoeff, Nicolaas PLG; Takeshita, Shinichiro; Honjo, Kie; Pataky, Christina E; St Jacques, Peggy L; Kusano, Maggie L; Caldwell, Curtis B; Ramirez, Joel; Black, Sandra

    2007-01-01

    As imaging centers produce higher resolution research scans, the number of man-hours required to process regional data has become a major concern. Comparison of automated vs. manual methodology has not been reported for functional imaging. We explored validation of using automation to delineate regions of interest on positron emission tomography (PET) scans. The purpose of this study was to ascertain improvements in image processing time and reproducibility of a semi-automated brain region extraction (SABRE) method over manual delineation of regions of interest (ROIs). We compared 2 sets of partial volume corrected serotonin 1a receptor binding potentials (BPs) resulting from manual vs. semi-automated methods. BPs were obtained from subjects meeting consensus criteria for frontotemporal degeneration and from age- and gender-matched healthy controls. Two trained raters provided each set of data to conduct comparisons of inter-rater mean image processing time, rank order of BPs for 9 PET scans, intra- and inter-rater intraclass correlation coefficients (ICC), repeatability coefficients (RC), percentages of the average parameter value (RM%), and effect sizes of either method. SABRE saved approximately 3 hours of processing time per PET subject over manual delineation (p < .001). Quality of the SABRE BP results was preserved relative to the rank order of subjects by manual methods. Intra- and inter-rater ICC were high (>0.8) for both methods. RC and RM% were lower for the manual method across all ROIs, indicating less intra-rater variance across PET subjects' BPs. SABRE demonstrated significant time savings and no significant difference in reproducibility over manual methods, justifying the use of SABRE in serotonin 1a receptor radioligand PET imaging analysis. This implies that semi-automated ROI delineation is a valid methodology for future PET imaging analysis

  12. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Mariangela eIorio

    2013-12-01

    White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with Mild Cognitive Impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of a WMH map and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student's t tests and similarity was evaluated using a linear regression model and the Dice Similarity Coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t-value = -1.79, DF = 29, p = 0.839 for rater 1; t-value = 1.113, DF = 29, p = 0.2749 for rater 2) and were highly correlated (R² = 0.921, F(1,29) = 155.54, p < ...
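
    Similarity between the manual and semi-automated WMH masks is reported with the Dice Similarity Coefficient (DSC). A minimal sketch of the DSC on binary masks, using synthetic arrays rather than the study data, is shown below.

```python
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """DSC = 2|A intersect B| / (|A| + |B|) for binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Two synthetic lesion masks that mostly overlap.
semi_auto = np.zeros((10, 10), dtype=bool); semi_auto[2:7, 2:7] = True
manual    = np.zeros((10, 10), dtype=bool); manual[3:8, 2:7] = True
print(f"DSC = {dice(semi_auto, manual):.2f}")  # 0.80
```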

  13. MSCT follow-up in malignant lymphoma. Comparison of manual linear measurements with semi-automated lymph node analysis for therapy response classification

    International Nuclear Information System (INIS)

    Wessling, J.; Puesken, M.; Kohlhase, N.; Persigehl, T.; Mesters, R.; Heindel, W.; Buerke, B.; Koch, R.

    2012-01-01

    Purpose: To assess semi-automated lymph node analysis compared to manual measurements for therapy response classification of malignant lymphoma in MSCT. Materials and Methods: MSCT scans of 63 malignant lymphoma patients before and after 2 cycles of chemotherapy (307 target lymph nodes) were evaluated. The long axis diameter (LAD), short axis diameter (SAD) and bi-dimensional WHO were determined manually and semi-automatically. The time for manual and semi-automatic segmentation was evaluated. The reference standard response was defined as the mean relative change across all manual and semi-automatic measurements (mean manual/semi-automatic LAD, SAD, semi-automatic volume). Statistical analysis encompassed the t-test and McNemar's test for clustered data. Results: Response classification per lymph node revealed semi-automated volumetry and bi-dimensional WHO to be significantly more accurate than manual linear metric measurements. Response classification per patient based on RECIST revealed more patients to be correctly classified by semi-automatic measurements, e.g. 96.0 %/92.9 % (WHO bi-dimensional/volume) compared to 85.7/84.1 % for manual LAD and SAD, respectively (mean reduction in misclassified patients of 9.95 %). Considering the use of correction tools, the time expenditure for lymph node segmentation (29.7 ± 17.4 sec) was the same as with the manual approach (29.1 ± 14.5 sec). Conclusion: Semi-automatically derived 'lymph node volume' and 'bi-dimensional WHO' significantly reduce the number of misclassified patients in the CT follow-up of malignant lymphoma by at least 10 %. However, lymph node volumetry does not outperform bi-dimensional WHO. (orig.)
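
    The per-patient classification above is based on RECIST, i.e. on the relative change of the summed target-lesion measurements between baseline and follow-up. The sketch below applies the standard RECIST 1.1 cut-offs (a decrease of at least 30 % as partial response, an increase of at least 20 % as progression); whether the study applied exactly these thresholds to its volumetric and bi-dimensional variants is not stated in the abstract.

```python
def recist_response(baseline_sum: float, followup_sum: float) -> str:
    """Classify response from the change in summed target-lesion measurements.
    Standard RECIST 1.1 thresholds are assumed (CR handling simplified)."""
    if followup_sum == 0:
        return "CR"   # complete response: all target lesions disappeared
    change = (followup_sum - baseline_sum) / baseline_sum
    if change <= -0.30:
        return "PR"   # partial response
    if change >= 0.20:
        return "PD"   # progressive disease
    return "SD"       # stable disease

# Hypothetical sums of long-axis diameters (mm) before and after 2 cycles.
print(recist_response(120.0, 70.0))   # PR  (-42 %)
print(recist_response(120.0, 150.0))  # PD  (+25 %)
print(recist_response(120.0, 110.0))  # SD  (-8 %)
```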

  14. MSCT follow-up in malignant lymphoma. Comparison of manual linear measurements with semi-automated lymph node analysis for therapy response classification

    Energy Technology Data Exchange (ETDEWEB)

    Wessling, J.; Puesken, M.; Kohlhase, N.; Persigehl, T.; Mesters, R.; Heindel, W.; Buerke, B. [Muenster Univ. (Germany). Dept. of Clinical Radiology; Koch, R. [Muenster Univ. (Germany). Inst. of Biostatistics and Clinical Research

    2012-09-15

    Purpose: To assess semi-automated lymph node analysis compared to manual measurements for therapy response classification of malignant lymphoma in MSCT. Materials and Methods: MSCT scans of 63 malignant lymphoma patients before and after 2 cycles of chemotherapy (307 target lymph nodes) were evaluated. The long axis diameter (LAD), short axis diameter (SAD) and bi-dimensional WHO were determined manually and semi-automatically. The time for manual and semi-automatic segmentation was evaluated. The reference standard response was defined as the mean relative change across all manual and semi-automatic measurements (mean manual/semi-automatic LAD, SAD, semi-automatic volume). Statistical analysis encompassed the t-test and McNemar's test for clustered data. Results: Response classification per lymph node revealed semi-automated volumetry and bi-dimensional WHO to be significantly more accurate than manual linear metric measurements. Response classification per patient based on RECIST revealed more patients to be correctly classified by semi-automatic measurements, e.g. 96.0 %/92.9 % (WHO bi-dimensional/volume) compared to 85.7/84.1 % for manual LAD and SAD, respectively (mean reduction in misclassified patients of 9.95 %). Considering the use of correction tools, the time expenditure for lymph node segmentation (29.7 ± 17.4 sec) was the same as with the manual approach (29.1 ± 14.5 sec). Conclusion: Semi-automatically derived 'lymph node volume' and 'bi-dimensional WHO' significantly reduce the number of misclassified patients in the CT follow-up of malignant lymphoma by at least 10 %. However, lymph node volumetry does not outperform bi-dimensional WHO. (orig.)

  15. PCR evaluation : considering transition from manual to semi-automated pavement distress collection and analysis.

    Science.gov (United States)

    2013-07-01

    This study is designed to assist the Ohio Department of Transportation (ODOT) in determining whether transitioning from manual to state-of-the-practice semi-automated pavement distress data collection is feasible and recommended. Statistical and ...

  16. Semi-automated scoring of triple-probe FISH in human sperm using confocal microscopy.

    Science.gov (United States)

    Branch, Francesca; Nguyen, GiaLinh; Porter, Nicholas; Young, Heather A; Martenies, Sheena E; McCray, Nathan; Deloid, Glen; Popratiloff, Anastas; Perry, Melissa J

    2017-09-01

    Structural and numerical sperm chromosomal aberrations result from abnormal meiosis and are directly linked to infertility. Any live births that arise from aneuploid conceptuses can result in syndromes such as Klinefelter, Turner, XYY and Edwards. Multi-probe fluorescence in situ hybridization (FISH) is commonly used to study sperm aneuploidy; however, manual FISH scoring in sperm samples is labor-intensive and introduces errors. Automated scoring methods are continuously evolving. One challenging aspect for optimizing automated sperm FISH scoring has been the overlap in excitation and emission of the fluorescent probes used to enumerate the chromosomes of interest. Our objective was to demonstrate the feasibility of combining confocal microscopy and spectral imaging with high-throughput methods for accurately measuring sperm aneuploidy. Our approach used confocal microscopy to analyze numerical chromosomal abnormalities in human sperm using enhanced slide preparation and rigorous semi-automated scoring methods. FISH for chromosomes X, Y, and 18 was conducted to determine sex chromosome disomy in sperm nuclei. Application of online spectral linear unmixing was used for effective separation of four fluorochromes while decreasing data acquisition time. Semi-automated image processing, segmentation, classification, and scoring were performed on 10 slides using custom image processing and analysis software and results were compared with manual methods. No significant differences in disomy frequencies were seen between the semi-automated and manual methods. Samples treated with pepsin were observed to have reduced background autofluorescence and more uniform distribution of cells. These results demonstrate that semi-automated methods using spectral imaging on a confocal platform are a feasible approach for analyzing numerical chromosomal aberrations in sperm, and are comparable to manual methods. © 2017 International Society for Advancement of Cytometry.
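
    The spectral imaging step separates overlapping fluorochromes by linear unmixing: each measured emission spectrum is modelled as a combination of reference spectra and solved by least squares. The sketch below uses synthetic reference spectra and a crude non-negativity clip; the vendor's online unmixing on the confocal platform is certainly more sophisticated.

```python
import numpy as np

def linear_unmix(measured: np.ndarray, references: np.ndarray) -> np.ndarray:
    """Least-squares abundances for each measured spectrum.
    measured: (n_pixels, n_channels); references: (n_fluorochromes, n_channels)."""
    coeffs, *_ = np.linalg.lstsq(references.T, measured.T, rcond=None)
    return np.clip(coeffs.T, 0.0, None)   # crude non-negativity, for illustration only

# Synthetic 8-channel reference spectra for 3 partially overlapping fluorochromes.
channels = np.arange(8)
refs = np.stack([np.exp(-0.5 * ((channels - c) / 1.5) ** 2) for c in (2, 4, 6)])
truth = np.array([[1.0, 0.5, 0.0], [0.0, 0.3, 0.9]])        # true abundances for 2 pixels
pixels = truth @ refs + 0.01 * np.random.default_rng(1).normal(size=(2, 8))
print(np.round(linear_unmix(pixels, refs), 2))               # approximately recovers `truth`
```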

  17. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Walbran, A.C.; Unsworth, C.P.; Gunn, A.J.; Bennet, L.

    2010-01-01

    Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (first 6-8 h post hypoxic-ischemic insult) is the lead candidate for treatment; however, currently there is no means to identify which infants can benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data that is time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data was obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
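
    The detection scheme outlined above computes a discretised continuous wavelet transform of the EEG, takes the power of the coefficients, and thresholds regions of high power as spikes. The sketch below follows that outline with PyWavelets and a Morlet wavelet on a synthetic trace; the wavelet family, scale range and threshold used in the actual study are not given in the abstract.

```python
import numpy as np
import pywt

def detect_spikes(eeg: np.ndarray, fs: float, threshold_sd: float = 4.0) -> np.ndarray:
    """Flag samples whose summed CWT power exceeds mean + threshold_sd * SD."""
    scales = np.arange(1, 64)                                   # assumed scale range
    coeffs, _ = pywt.cwt(eeg, scales, "morl", sampling_period=1.0 / fs)
    power = (np.abs(coeffs) ** 2).sum(axis=0)                   # power summed over scales
    return np.flatnonzero(power > power.mean() + threshold_sd * power.std())

# Synthetic 10 s EEG-like trace at 256 Hz with two sharp transients added.
fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)
eeg = rng.normal(scale=10.0, size=t.size)
eeg[1280] += 120.0   # transient at 5 s
eeg[2048] += 150.0   # transient at 8 s
spike_samples = detect_spikes(eeg, fs)
print("spike times (s):", np.unique(np.round(spike_samples / fs, 2)))
```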

  18. Fast-FISH Detection and Semi-Automated Image Analysis of Numerical Chromosome Aberrations in Hematological Malignancies

    Directory of Open Access Journals (Sweden)

    Arif Esa

    1998-01-01

    A new fluorescence in situ hybridization (FISH) technique called Fast-FISH in combination with semi-automated image analysis was applied to detect numerical aberrations of chromosomes 8 and 12 in interphase nuclei of peripheral blood lymphocytes and bone marrow cells from patients with acute myelogenous leukemia (AML) and chronic lymphocytic leukemia (CLL). Commercially available α-satellite DNA probes specific for the centromere regions of chromosome 8 and chromosome 12, respectively, were used. After application of the Fast-FISH protocol, the microscopic images of the fluorescence-labelled cell nuclei were recorded by the true color CCD camera Kappa CF 15 MC and evaluated quantitatively by computer analysis on a PC. These results were compared to results obtained from the same type of specimens using the same analysis system but with a standard FISH protocol. In addition, automated spot counting after both FISH techniques was compared to visual spot counting after standard FISH. A total number of about 3,000 cell nuclei was evaluated. For quantitative brightness parameters, a good correlation between standard FISH labelling and Fast-FISH was found. Automated spot counting after Fast-FISH coincided within a few percent with automated and visual spot counting after standard FISH. The examples shown indicate the reliability and reproducibility of Fast-FISH and its potential for automated interphase cell diagnostics of numerical chromosome aberrations. Since the Fast-FISH technique requires a hybridization time as low as 1/20 of established standard FISH techniques, omitting most of the time-consuming working steps in the protocol, it may contribute considerably to clinical diagnostics. This may especially be interesting in cases where an accurate result is required within a few hours.

  19. Development and Evaluation of a Semi-automated Segmentation Tool and a Modified Ellipsoid Formula for Volumetric Analysis of the Kidney in Non-contrast T2-Weighted MR Images.

    Science.gov (United States)

    Seuss, Hannes; Janka, Rolf; Prümmer, Marcus; Cavallaro, Alexander; Hammon, Rebecca; Theis, Ragnar; Sandmair, Martin; Amann, Kerstin; Bäuerle, Tobias; Uder, Michael; Hammon, Matthias

    2017-04-01

    Volumetric analysis of the kidney parenchyma provides additional information for the detection and monitoring of various renal diseases. Therefore the purposes of the study were to develop and evaluate a semi-automated segmentation tool and a modified ellipsoid formula for volumetric analysis of the kidney in non-contrast T2-weighted magnetic resonance (MR) images. Three readers performed semi-automated segmentation of the total kidney volume (TKV) in axial, non-contrast-enhanced T2-weighted MR images of 24 healthy volunteers (48 kidneys) twice. A semi-automated threshold-based segmentation tool was developed to segment the kidney parenchyma. Furthermore, the three readers measured renal dimensions (length, width, depth) and applied different formulas to calculate the TKV. Manual segmentation served as a reference volume. Volumes of the different methods were compared and the time required was recorded. There was no significant difference between the semi-automatically and manually segmented TKV (p = 0.31). The difference in mean volumes was 0.3 ml (95% confidence interval (CI), -10.1 to 10.7 ml). Semi-automated segmentation was significantly faster than manual segmentation, with a mean difference of 188 s (220 vs. 408 s; p < ...). Semi-automated segmentation of the kidney in non-contrast T2-weighted MR data delivers accurate and reproducible results and was significantly faster than manual segmentation. Applying a modified ellipsoid formula quickly provides an accurate kidney volume.

  20. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284, January 2018, US Army Research Laboratory: Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. ... although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each ...

  1. Suitability of semi-automated tumor response assessment of liver metastases using a dedicated software package

    International Nuclear Information System (INIS)

    Kalkmann, Janine; Ladd, S.C.; Greiff, A. de; Forsting, M.; Stattaus, J.

    2010-01-01

    Purpose: to evaluate the suitability of semi-automated compared to manual tumor response assessment (TRA) of liver metastases. Materials and methods: in total, 32 patients with colorectal cancer and liver metastases were followed by an average of 2.8 contrast-enhanced CT scans. Two observers (O1, O2) measured the longest diameter (LD) of 269 liver metastases manually and semi-automatically using software installed as thin-client on a PACS workstation (LMS-Liver, MEDIAN Technologies). LD and TRA ("progressive", "stable", "partial remission") were performed according to RECIST (Response Evaluation Criteria in Solid Tumors) and analyzed for between-method, interobserver and intraobserver variability. The time needed for evaluation was compared for both methods. Results: all measurements correlated excellently (r ≥ 0.96). Intraobserver (semi-automated), interobserver (manual) and between-method differences (by O1) in LD of 1.4 ± 2.6 mm, 1.9 ± 1.9 mm and 2.1 ± 2.0 mm, respectively, were not significant. Interobserver (semi-automated) and between-method (by O2) differences in LD of 3.0 ± 3.0 mm and 2.6 ± 2.0 mm, respectively, reflected a significant variability (p < 0.01). The interobserver agreement in manual and semi-automated TRA was 91.4%. The intraobserver agreement in semi-automated TRA was 84.5%. Between both methods a TRA agreement of 86.2% was obtained. Semi-automated evaluation (2.7 min) took slightly more time than manual evaluation (2.3 min). Conclusion: semi-automated and manual evaluation of liver metastases yield comparable results in response assessments and require comparable effort. (orig.)

  2. The influence of image setting on intracranial translucency measurement by manual and semi-automated system.

    Science.gov (United States)

    Zhen, Li; Yang, Xin; Ting, Yuen Ha; Chen, Min; Leung, Tak Yeung

    2013-09-01

    To investigate the agreement between manual and semi-automated systems and the effect of different image settings on intracranial translucency (IT) measurement. A prospective study was conducted on 55 women carrying singleton pregnancies who attended first trimester Down syndrome screening. IT was measured both manually and by a semi-automated system at the same default image setting. The IT measurements were then repeated with post-processing changes in the image setting, one at a time. The difference in IT measurements between the altered and the original images was assessed. Intracranial translucency was successfully measured on 55 images both manually and by the semi-automated method. There was strong agreement in IT measurements between the two methods, with a mean difference (manual minus semi-automated) of 0.011 mm (95% confidence interval, -0.052 mm to 0.094 mm). There were statistically significant variations in both manual and semi-automated IT measurement after changing the Gain and the Contrast. The greatest changes occurred when the Contrast was reduced to 1 (IT reduced by 0.591 mm in semi-automated; 0.565 mm in manual), followed by when the Gain was increased to 15 (IT reduced by 0.424 mm in semi-automated; 0.524 mm in manual). The image settings may affect IT identification and measurement. Increased Gain and reduced Contrast are the most influential factors and may cause under-measurement of IT. © 2013 John Wiley & Sons, Ltd.

  3. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    Science.gov (United States)

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
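
    MBE maps connectivity by correlating region-of-interest time courses after temporal filtering. The core computation, a zero-lag Pearson correlation matrix across ROI traces, is sketched below on synthetic data; the toolbox itself adds filtering, trial averaging and visualization on top of this.

```python
import numpy as np

def connectivity_matrix(roi_traces: np.ndarray) -> np.ndarray:
    """Pearson correlation between ROI time courses.
    roi_traces: (n_rois, n_timepoints) array of (already filtered) signals."""
    return np.corrcoef(roi_traces)

# Synthetic resting-state-like data: 4 ROIs, 1000 frames, two correlated pairs.
rng = np.random.default_rng(0)
shared_a, shared_b = rng.normal(size=(2, 1000))
traces = np.vstack([
    shared_a + 0.3 * rng.normal(size=1000),
    shared_a + 0.3 * rng.normal(size=1000),
    shared_b + 0.3 * rng.normal(size=1000),
    shared_b + 0.3 * rng.normal(size=1000),
])
print(np.round(connectivity_matrix(traces), 2))  # high within-pair, near-zero across pairs
```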

  4. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    Science.gov (United States)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Mesoscale circulation processes generated by landscape discontinuities (e.g., sea breezes) are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large scale, mesoscale, and turbulent scale, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit of mass is used, defined as Ẽ = 0.5⟨u′ᵢ²⟩, where u′ᵢ represents the three Cartesian components of a mesoscale circulation, the angle brackets denote the grid-scale, horizontal averaging operator in the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for Ẽ, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of Ẽ. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes.
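
    Restating the decomposition described above in equation form may help; this is only a transcription of the abstract's verbal definitions, not the paper's derivation, and the double-prime notation for the turbulent part is an assumed convention.

```latex
% Transcription of the abstract's definitions (assumed notation): tilde = large-scale
% grid mean, prime = mesoscale perturbation, double prime = turbulent perturbation,
% angle brackets = grid-scale horizontal averaging operator of the large-scale model.
\phi \;=\; \tilde{\phi} + \phi' + \phi'' ,
\qquad
\tilde{E} \;=\; \tfrac{1}{2}\,\big\langle u_i'\,u_i' \big\rangle , \quad i = 1,2,3 .
```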

  5. Semi-automated uranium analysis by a modified Davies-Gray procedure

    International Nuclear Information System (INIS)

    Swanson, G.C.

    1977-01-01

    To rapidly and reliably determine uranium in fuel materials, a semi-automated implementation of the Davies-Gray uranium titration was developed. The Davies-Gray method is essentially a three-step procedure. First, uranium is reduced quantitatively from +6 valence to +4 valence by an excess of iron(II) in strong phosphoric acid in the absence of nitrite. Prior to the uranium reduction, nitrite is destroyed by addition of sulfamic acid. In the second step, iron(II) is selectively oxidized to iron(III) by nitric acid in the presence of a Mo(VI) catalyst. Finally, after dilution to reduce the phosphate concentration, the uranium is titrated to U(VI) by standard dichromate. The original sluggish colorimetric endpoint determination used by Davies and Gray is seldom used, since New Brunswick Laboratory discovered that addition of vanadium(IV) just prior to titration sufficiently improves the reaction rate to allow a potentiometric endpoint determination. One of the advantages of the Davies-Gray uranium titration is that it is quite specific for uranium: most common impurity elements do not interfere with the analysis, and specifically high levels of Pu, Th, and Fe are tolerated

  6. Semi-automated microwave assisted solid-phase peptide synthesis

    DEFF Research Database (Denmark)

    Pedersen, Søren Ljungberg

    with microwaves for SPPS has gained in popularity as it for many syntheses has provided significant improvement in terms of speed, purity, and yields, maybe especially in the synthesis of long and "difficult" peptides. Thus, precise microwave heating has emerged as one new parameter for SPPS, in addition...... to coupling reagents, resins, solvents etc. We have previously reported on microwave heating to promote a range of solid-phase reactions in SPPS. Here we present a new, flexible semi-automated instrument for the application of precise microwave heating in solid-phase synthesis. It combines a slightly modified...... Biotage Initiator microwave instrument, which is available in many laboratories, with a modified semi-automated peptide synthesizer from MultiSynTech. A custom-made reaction vessel is placed permanently in the microwave oven, thus the reactor does not have to be moved between steps. Mixing is achieved...

  7. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori L Ito

    2017-05-01

    Quantifying lesions in a reliable manner is fundamental for studying the effects of neuroanatomical changes related to recovery in the post-stroke brain. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This often makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. Thus, we developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.557114) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups, and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space. Here, we describe the methods implemented in the toolbox.
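
    One of the toolbox outputs is an automated report of descriptive lesion statistics. The sketch below shows the kind of computation involved, reading a binary lesion mask in NIfTI format with nibabel and reporting voxel count and volume; the file name and the particular statistics SRQL reports are assumptions.

```python
import nibabel as nib
import numpy as np

def lesion_stats(mask_path: str) -> dict:
    """Voxel count and volume (mL) of a binary lesion mask stored as NIfTI."""
    img = nib.load(mask_path)
    mask = img.get_fdata() > 0
    voxel_volume_mm3 = float(np.prod(img.header.get_zooms()[:3]))
    n_vox = int(mask.sum())
    return {"voxels": n_vox, "volume_ml": n_vox * voxel_volume_mm3 / 1000.0}

if __name__ == "__main__":
    # Hypothetical path to a subject's lesion mask in native space.
    print(lesion_stats("sub-01_lesion_mask.nii.gz"))
```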

  8. Refuelling: Swiss station will be semi-automated

    International Nuclear Information System (INIS)

    Fontaine, B.; Ribaux, P.

    1981-01-01

    The first semi-automated LWR refuelling machine in Europe has been supplied to the Leibstadt General Electric BWR in Switzerland. The system relieves operators of the boring and repetitive job of moving and accurately positioning the refuelling machine during fuelling operations and will thus contribute to plant safety. The machine and its mode of operation are described. (author)

  9. Enhanced detection levels in a semi-automated sandwich ...

    African Journals Online (AJOL)

    A peptide nucleic acid (PNA) signal probe was tested as a replacement for a typical DNA oligonucleotidebased signal probe in a semi-automated sandwich hybridisation assay designed to detect the harmful phytoplankton species Alexandrium tamarense. The PNA probe yielded consistently higher fluorescent signal ...

  10. Method for semi-automated microscopy of filtration-enriched circulating tumor cells.

    Science.gov (United States)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-07-14

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, ERG-rearrangement were detected by filter-adapted-FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45(-) cells, cytomorphological staining, then scanning and analysis of CD45(-) cell phenotypical and cytomorphological characteristics. CD45(-) cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm(2). The second assay sequentially combined fluorescent staining, automated selection of CD45(-) cells, FISH scanning on CD45(-) cells, then analysis of CD45(-) cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82 %, 91 %, and 95 % of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here
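
    The first assay selects CD45-negative candidate cells from the scanned filter using DAPI and CD45 intensities together with a nuclear-area cut-off (>55 μm²). A minimal sketch of that gating step on a per-cell feature table is shown below; the intensity thresholds are placeholders, since only the area criterion is quantified in the abstract.

```python
import pandas as pd

def select_cd45_negative(cells: pd.DataFrame,
                         dapi_min: float = 500.0,      # placeholder intensity thresholds
                         cd45_max: float = 200.0,
                         nuclear_area_min: float = 55.0) -> pd.DataFrame:
    """Keep DAPI-positive, CD45-negative events with nuclear area > 55 square microns."""
    keep = (
        (cells["dapi_intensity"] >= dapi_min)
        & (cells["cd45_intensity"] <= cd45_max)
        & (cells["nuclear_area_um2"] > nuclear_area_min)
    )
    return cells[keep]

# Hypothetical per-cell measurements exported from the scanner.
cells = pd.DataFrame({
    "dapi_intensity":   [900, 850, 120, 950],
    "cd45_intensity":   [100, 800, 90, 150],
    "nuclear_area_um2": [70.0, 65.0, 60.0, 40.0],
})
print(select_cd45_negative(cells))   # only the first row passes all three gates
```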

  11. Feasibility of a semi-automated method for cardiac conduction velocity analysis of high-resolution activation maps

    NARCIS (Netherlands)

    Doshi, Ashish N.; Walton, Richard D.; Krul, Sébastien P.; de Groot, Joris R.; Bernus, Olivier; Efimov, Igor R.; Boukens, Bastiaan J.; Coronel, Ruben

    2015-01-01

    Myocardial conduction velocity is important for the genesis of arrhythmias. In the normal heart, conduction is primarily dependent on fiber direction (anisotropy) and may be discontinuous at sites with tissue heterogeneities (trabeculated or fibrotic tissue). We present a semi-automated method for

  12. Chemical composition dispersion in bi-metallic nanoparticles: semi-automated analysis using HAADF-STEM

    International Nuclear Information System (INIS)

    Epicier, T.; Sato, K.; Tournus, F.; Konno, T.

    2012-01-01

    We present a method using high-angle annular dark field scanning transmission electron microscopy (HAADF-STEM) to determine the chemical composition of bi-metallic nanoparticles. This method, which can be applied in a semi-automated way, allows large-scale analysis of a statistically meaningful number of particles (several hundred) in a short time. Once a calibration curve has been obtained, e.g., using energy-dispersive X-ray spectroscopy (EDX) measurements on a few particles, the HAADF integrated intensity of each particle can indeed be directly related to its chemical composition. After a theoretical description, this approach is applied to the case of iron–palladium nanoparticles (expected to be nearly stoichiometric) with a mean size of 8.3 nm. It will be shown that an accurate chemical composition histogram is obtained, i.e., the Fe content has been determined to be 49.0 at.% with a dispersion of 10.4%. HAADF-STEM analysis represents a powerful alternative to fastidious single-particle EDX measurements for determining the compositional dispersion in alloy nanoparticles.
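
    The central idea is a calibration curve that maps the HAADF integrated intensity of a segmented particle to its composition, anchored by EDX measurements on a few particles. A hypothetical sketch of fitting and applying such a curve is given below; the intensity normalisation and Z-contrast model of the actual paper are not reproduced.

```python
import numpy as np

def fit_calibration(norm_intensity: np.ndarray, edx_fe_at_pct: np.ndarray) -> np.poly1d:
    """Linear calibration from normalised HAADF intensity to Fe content (at.%)."""
    return np.poly1d(np.polyfit(norm_intensity, edx_fe_at_pct, deg=1))

# Hypothetical calibration particles: higher Z-contrast intensity -> more Pd, less Fe.
calib_intensity = np.array([0.42, 0.47, 0.51, 0.55, 0.60])
calib_fe_at_pct = np.array([62.0, 55.0, 50.0, 44.0, 38.0])
calibration = fit_calibration(calib_intensity, calib_fe_at_pct)

# Apply the curve to a batch of segmented particles from the HAADF image.
particle_intensity = np.array([0.45, 0.50, 0.58])
fe = calibration(particle_intensity)
print(np.round(fe, 1), "at.% Fe")
print(f"mean {fe.mean():.1f} at.% Fe, dispersion (SD) {fe.std(ddof=1):.1f} at.%")
```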

  13. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori Ito

    2017-02-01

    Quantifying lesions in a robust manner is fundamental for studying the effects of neuroanatomical changes in the post-stroke brain on recovery. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. We developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.267213) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups, and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space, or comparisons between spaces. Here, we describe the methods implemented in the toolbox and demonstrate the outputs of the SRQL toolbox.

  14. Semi-Automated Digital Image Analysis of Pick's Disease and TDP-43 Proteinopathy.

    Science.gov (United States)

    Irwin, David J; Byrne, Matthew D; McMillan, Corey T; Cooper, Felicia; Arnold, Steven E; Lee, Edward B; Van Deerlin, Vivianna M; Xie, Sharon X; Lee, Virginia M-Y; Grossman, Murray; Trojanowski, John Q

    2016-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data is scant in frontotemporal lobar degeneration (FTLD), which has an added challenge of study due to morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including: Pick's disease (PiD, n=11) with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG) by immunohistochemistry. We used a color deconvolution process to isolate signal from the chromogen and applied both object detection and intensity thresholding algorithms to quantify pathological burden. We found object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. © The Author(s) 2015.
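
    The pipeline isolates the chromogen by colour deconvolution and quantifies burden with object-detection or intensity-thresholding algorithms. The sketch below shows an analogous open-source route: scikit-image's built-in haematoxylin/eosin/DAB stain separation followed by Otsu thresholding to obtain a percent-area burden. The stain vectors, chromogen and thresholds of the published method are not reproduced, so this is illustrative only.

```python
import numpy as np
from skimage import color, data, filters

def dab_area_fraction(rgb_image: np.ndarray) -> float:
    """Percent area positive for the DAB channel after colour deconvolution.
    (Illustrative only: assumes a DAB chromogen and standard H&E-DAB stain vectors.)"""
    hed = color.rgb2hed(rgb_image)          # channels: haematoxylin, eosin, DAB
    dab = hed[..., 2]
    positive = dab > filters.threshold_otsu(dab)
    return 100.0 * positive.mean()

if __name__ == "__main__":
    # Demo on a bundled RGB image; a real analysis would use IHC-stained sections.
    rgb = data.immunohistochemistry()       # sample IHC image shipped with scikit-image
    print(f"DAB-positive area: {dab_area_fraction(rgb):.1f} %")
```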

  15. Semi-automated technique for the separation and determination of barium and strontium in surface waters by ion exchange chromatography and atomic emission spectrometry

    International Nuclear Information System (INIS)

    Pierce, F.D.; Brown, H.R.

    1977-01-01

    A semi-automated method for the separation and the analysis of barium and strontium in surface waters by atomic emission spectrometry is described. The method employs a semi-automated separation technique using ion exchange and an automated aspiration-analysis procedure. Forty specimens can be prepared in approximately 90 min and can be analyzed for barium and strontium content in 20 min. The detection limits and sensitivities provided by the described technique are 0.003 mg/l and 0.01 mg/l respectively for barium and 0.00045 mg/l and 0.003 mg/l respectively for strontium

  16. Application of semi-automated ultrasonography on nutritional support for severe acute pancreatitis.

    Science.gov (United States)

    Li, Ying; Ye, Yu; Yang, Mei; Ruan, Haiying; Yu, Yuan

    2018-04-25

    To evaluate the application value of semi-automated ultrasound for guiding nasogastrojejunal tube placement in patients with severe acute pancreatitis (SAP), as well as the value of nutritional support in standardized treatment in clinical practice. This retrospective research was performed in our hospital, and 34 patients suffering from SAP were enrolled into the study. All identified participants received CT scans in order to make definitive diagnoses. These patients then received semi-automated ultrasound examinations within 1 day after onset, in order to provide enteral nutrition treatment via nasogastrojejunal tube, or freehand nasogastrojejunal tube placement. In terms of statistical analysis, the application value of semi-automated ultrasound guidance for nasogastrojejunal tube placement was evaluated and compared with tube placement without guidance. After catheterization, additional enteral nutrition was provided, and its therapeutic effect on SAP was analyzed further. A total of 34 patients with pancreatitis were identified in this research, 29 cases with necrosis of the pancreas parenchyma. After further examinations, 32 cases were SAP and 2 cases were mild acute pancreatitis. When the firm diagnosis was made, additional enteral nutrition (EN) was given; all the patients' conditions appeared good, and they all were satisfied with this kind of nutritional support. According to our clinical experience, when there was 200-250 ml of liquid in the stomach, the success rate of intubation appeared higher. Additionally, a comparison between ultrasound-guided and freehand nasogastrojejunal tube placement was made. According to the statistical results, in terms of the utilization ratio of nutritional support, the ultrasound-guided group was better than the freehand group within 1 day, after 3 days and after 7 days (7/20 versus 2/14; P < ...), while ... between the two groups was not statistically different (P > 0.05). It can ...

  17. Semi-Automated Quantification of Finger Joint Space Narrowing Using Tomosynthesis in Patients with Rheumatoid Arthritis.

    Science.gov (United States)

    Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Kasahara, Hideki; Shimizu, Yuka; Fujimori, Motoshi; Yasojima, Nobutoshi; Ono, Yohei; Kaneda, Takahiko; Koike, Takao

    2017-06-01

    The purpose of the study was to validate a semi-automated method using tomosynthesis images for the assessment of finger joint space narrowing (JSN) in patients with rheumatoid arthritis (RA), using the semi-quantitative scoring method as the reference standard. Twenty patients (14 females and 6 males) with RA were included in this retrospective study. All patients underwent radiography and tomosynthesis of the bilateral hand and wrist. Two rheumatologists and a radiologist independently scored JSN with the two modalities according to the Sharp/van der Heijde score. Two observers independently measured joint space width on tomosynthesis images using an in-house semi-automated method. More joints with JSN were revealed with the tomosynthesis score (243 joints) and the semi-automated method (215 joints) than with radiography (120 joints), and significant associations between tomosynthesis scores and radiography scores were demonstrated. Joint space width measured with the semi-automated method correlated negatively with tomosynthesis scores (r = -0.606), and agreement between measurements on tomosynthesis images was almost perfect, with intra-class correlation coefficient (ICC) values of 0.964 and 0.963, respectively. The semi-automated method using tomosynthesis images provided sensitive, quantitative, and reproducible measurement of finger joint space in patients with RA.

  18. Method for semi-automated microscopy of filtration-enriched circulating tumor cells

    International Nuclear Information System (INIS)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R.; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-01-01

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However, filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, and ERG-rearrangements were detected by filter-adapted-FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45− cells, cytomorphological staining, then scanning and analysis of CD45− cell phenotypical and cytomorphological characteristics. CD45− cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm². The second assay sequentially combined fluorescent staining, automated selection of CD45− cells, FISH scanning on CD45− cells, then analysis of CD45− cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82%, 91%, and 95% of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here

  19. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that we consider may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.

  20. A semi-automated method for bone age assessment using cervical vertebral maturation.

    Science.gov (United States)

    Baptista, Roberto S; Quaglio, Camila L; Mourad, Laila M E H; Hummel, Anderson D; Caetano, Cesar Augusto C; Ortolani, Cristina Lúcia F; Pisa, Ivan T

    2012-07-01

    To propose a semi-automated method for pattern classification to predict individuals' stage of growth based on morphologic characteristics that are described in the modified cervical vertebral maturation (CVM) method of Baccetti et al. A total of 188 lateral cephalograms were collected, digitized, evaluated manually, and grouped into cervical stages by two expert examiners. Landmarks were located on each image and measured. Three pattern classifiers based on the Naïve Bayes algorithm were built and assessed using a software program. The classifier with the greatest accuracy according to the weighted kappa test was considered best. The classifier showed a weighted kappa coefficient of 0.861 ± 0.020. If an adjacent estimated pre-stage or poststage value was taken to be acceptable, the classifier would show a weighted kappa coefficient of 0.992 ± 0.019. Results from this study show that the proposed semi-automated pattern classification method can help orthodontists identify the stage of CVM. However, additional studies are needed before this semi-automated classification method for CVM assessment can be implemented in clinical practice.
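    As an illustration of the classification and agreement statistics described above, the following minimal Python sketch trains a Gaussian Naive Bayes classifier on a hypothetical matrix of landmark-derived measurements and reports a weighted kappa; it is not the authors' software, and all data, feature names and the choice of quadratic weights are placeholder assumptions.

        # Minimal sketch (not the authors' software): Naive Bayes CVM staging plus a
        # weighted kappa. X and y below are random placeholders, not study data.
        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.model_selection import cross_val_predict
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(0)
        X = rng.normal(size=(188, 6))        # placeholder landmark measurements
        y = rng.integers(1, 7, size=188)     # placeholder expert CVM stages (1-6)

        pred = cross_val_predict(GaussianNB(), X, y, cv=5)

        # Weighted kappa penalises predictions further from the true stage more heavily.
        kappa = cohen_kappa_score(y, pred, weights="quadratic")
        print(f"weighted kappa: {kappa:.3f}")

    With real cephalometric features in X, the same cross-validated kappa would be directly comparable to the 0.861 reported in the study.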

  1. Literature classification for semi-automated updating of biological knowledgebases

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Winther, Ole

    2013-01-01

    abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion: We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining...... types of biological data, such as sequence data, are extensively stored in biological databases, functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results: We defined and applied a machine...
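    The record describes classifying abstracts to support curation of a knowledgebase. A generic sketch of such a text classifier, using TF-IDF features and a linear model rather than the authors' exact text-mining pipeline, might look as follows; the corpus and labels are placeholders.

        # Generic sketch of abstract classification (relevant vs. not relevant for
        # curation); not the authors' pipeline. Abstracts and labels are placeholders.
        from sklearn.feature_extraction.text import TfidfVectorizer
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline
        from sklearn.model_selection import cross_val_score

        abstracts = [
            "T-cell epitope mapping of influenza hemagglutinin ...",
            "Crystal structure of a bacterial transporter ...",
            "Identification of HLA-restricted epitopes in dengue virus ...",
            "Phylogenetic analysis of ribosomal RNA sequences ...",
        ] * 25                                   # placeholder corpus
        labels = [1, 0, 1, 0] * 25               # 1 = contains epitope data worth curating

        clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2), min_df=2),
                            LogisticRegression(max_iter=1000))
        acc = cross_val_score(clf, abstracts, labels, cv=5, scoring="accuracy")
        print(f"cross-validated accuracy: {acc.mean():.2f}")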

  2. Contribution of mesoscale eddies to Black Sea ventilation

    Science.gov (United States)

    Capet, Arthur; Mason, Evan; Pascual, Ananda; Grégoire, Marilaure

    2017-04-01

    The shoaling of the Black Sea oxycline is one of the most urgent environmental issues in the Black Sea. The permanent oxycline derives directly from the Black Sea permanent stratification and has shoaled alarmingly in recent decades, due to a shifting balance between oxygen consumption and ventilation processes (Capet et al. 2016). Understanding this balance is thus of the utmost importance and requires quantifying (1) the export of nutrients and organic materials from the shelf regions to the open sea and (2) the ventilation processes. Because both processes are influenced by mesoscale features, it is critical to understand the role of the semi-permanent mesoscale structures in horizontal (center/periphery) and vertical (diapycnal and isopycnal) exchanges. Useful insight can be obtained by merging observations from satellite altimetry and in situ profilers (ARGO). In such composite analyses, eddies are first automatically identified and tracked from altimeter data (Mason et al. 2014, py-eddy-tracker). Vertical ARGO profiles are then expressed in terms of their position relative to eddy centers and radii. Derived statistics indicate how consistently mesoscale eddies alter the vertical structure, and provide a deeper understanding of the associated horizontal and vertical fluxes. However, this data-based approach is limited in the Black Sea due to the lower quality of gridded altimetric products in the vicinity of the coast, where semi-permanent mesoscale structures prevail. To complement the difficult analysis of this sparse dataset, the compositing methodology is also applied to model outputs from the 5 km GHER-BHAMBI Black Sea implementation (CMEMS BS-MFC). Characteristic biogeochemical anomalies associated with eddies in the model are analyzed per se, and compared to the observation-based analysis. Capet, A., Stanev, E. V., Beckers, J.-M., Murray, J. W., and Grégoire, M.: Decline of the Black Sea oxygen inventory, Biogeosciences, 13, 1287-1297, doi:10
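    The core compositing step, expressing each profile's position in eddy-centred coordinates normalised by the eddy radius, can be sketched as below. Distances use a simple equirectangular approximation, all values are placeholders, and this is not the py-eddy-tracker implementation.

        # Sketch of the compositing step: profile position relative to an eddy centre,
        # in units of eddy radius. Not the py-eddy-tracker code; values are placeholders.
        import numpy as np

        EARTH_KM_PER_DEG = 111.32

        def eddy_relative_position(prof_lon, prof_lat, eddy_lon, eddy_lat, eddy_radius_km):
            """Return (dx, dy, r) of a profile relative to an eddy centre, in eddy radii."""
            dx_km = (prof_lon - eddy_lon) * EARTH_KM_PER_DEG * np.cos(np.radians(eddy_lat))
            dy_km = (prof_lat - eddy_lat) * EARTH_KM_PER_DEG
            dx, dy = dx_km / eddy_radius_km, dy_km / eddy_radius_km
            return dx, dy, np.hypot(dx, dy)

        # Example: one profile near a hypothetical 40 km eddy centred at 31.0E, 43.5N.
        dx, dy, r = eddy_relative_position(31.3, 43.6, 31.0, 43.5, 40.0)
        print(f"profile at {r:.2f} eddy radii from centre")
        # Profiles with r <= 1 would typically be composited as 'inside eddy',
        # 1 < r <= 2 as 'periphery', and larger r as background.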

  3. Semi-Automated Digital Image Analysis of Pick’s Disease and TDP-43 Proteinopathy

    Science.gov (United States)

    Irwin, David J.; Byrne, Matthew D.; McMillan, Corey T.; Cooper, Felicia; Arnold, Steven E.; Lee, Edward B.; Van Deerlin, Vivianna M.; Xie, Sharon X.; Lee, Virginia M.-Y.; Grossman, Murray; Trojanowski, John Q.

    2015-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data are scant in frontotemporal lobar degeneration (FTLD), which is particularly challenging to study because of its morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including Pick’s disease (PiD, n=11), with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG) by immunohistochemistry. We used a color deconvolution process to isolate signal from the chromogen and applied both object detection and intensity thresholding algorithms to quantify pathological burden. We found object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. PMID:26538548
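    The two quantification strategies named in the abstract, colour deconvolution to isolate the chromogen followed by intensity thresholding and object detection, can be sketched with scikit-image as follows. This is not the authors' validated pipeline; it assumes a DAB chromogen (HED stain separation) and uses a bundled example image as a placeholder.

        # Minimal sketch of chromogen isolation and quantification with scikit-image,
        # assuming a DAB chromogen; not the authors' validated pipeline.
        import numpy as np
        from skimage import data, color, filters, measure, morphology

        section_rgb = data.immunohistochemistry()      # example IHC image (placeholder)
        dab = color.rgb2hed(section_rgb)[..., 2]       # DAB channel from colour deconvolution

        # Intensity thresholding: fraction of tissue area above an automatic threshold.
        thresh = filters.threshold_otsu(dab)
        mask = dab > thresh
        percent_area = 100.0 * mask.mean()

        # Object detection: count discrete inclusions after removing small specks.
        labels = measure.label(morphology.remove_small_objects(mask, min_size=30))
        print(f"%area burden: {percent_area:.1f}, objects detected: {labels.max()}")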

  4. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Ebersberger, Ullrich [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Marcus, Roy P.; Nikolaou, Konstantin; Bamberg, Fabian [University of Munich, Institute of Clinical Radiology, Munich (Germany); Schoepf, U.J.; Gray, J.C.; McQuiston, Andrew D. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Lo, Gladys G. [Hong Kong Sanatorium and Hospital, Department of Diagnostic and Interventional Radiology, Hong Kong (China); Wang, Yining [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Department of Radiology, Beijing (China); Blanke, Philipp [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University Hospital Freiburg, Department of Diagnostic Radiology, Freiburg (Germany); Geyer, Lucas L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University of Munich, Institute of Clinical Radiology, Munich (Germany); Cho, Young Jun [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Konyang University College of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Scheuering, Michael; Canstein, Christian [Siemens Healthcare, CT Division, Forchheim (Germany); Hoffmann, Ellen [Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany)

    2014-01-15

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min respectively (P < 0.01). There was strong agreement between the two approaches for both measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01), and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. (orig.)
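    The agreement statistic quoted above, the intraclass correlation coefficient, can be computed directly from its two-way ANOVA decomposition. The sketch below implements ICC(2,1) (two-way random effects, absolute agreement, single measurement) on placeholder manual and software MBF readings; it is not the study's statistical software.

        # Sketch of ICC(2,1) from the ANOVA decomposition. 'Y' is a placeholder
        # subjects x methods matrix, not study data.
        import numpy as np

        def icc_2_1(Y):
            n, k = Y.shape
            grand = Y.mean()
            row_means = Y.mean(axis=1, keepdims=True)    # per-subject means
            col_means = Y.mean(axis=0, keepdims=True)    # per-method means
            msr = k * np.sum((row_means - grand) ** 2) / (n - 1)        # between subjects
            msc = n * np.sum((col_means - grand) ** 2) / (k - 1)        # between methods
            mse = np.sum((Y - row_means - col_means + grand) ** 2) / ((n - 1) * (k - 1))
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        rng = np.random.default_rng(1)
        true_mbf = rng.normal(100, 20, size=37)                    # placeholder MBF values
        Y = np.column_stack([true_mbf + rng.normal(0, 6, 37),      # manual reading
                             true_mbf + rng.normal(0, 6, 37)])     # software reading
        print(f"ICC(2,1) = {icc_2_1(Y):.2f}")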

  5. Fast semi-automated lesion demarcation in stroke

    Directory of Open Access Journals (Sweden)

    Bianca de Haan

    2015-01-01

    Full Text Available Lesion–behaviour mapping analyses require the demarcation of the brain lesion on each (usually transverse) slice of the individual stroke patient's brain image. To date, this is generally thought to be most precise when done manually, which is, however, both time-consuming and potentially observer-dependent. Fully automated lesion demarcation methods have been developed to address these issues, but these are often not practicable in acute stroke research where for each patient only a single image modality is available and the available image modality differs over patients. In the current study, we evaluated a semi-automated lesion demarcation approach, the so-called Clusterize algorithm, in acute stroke patients scanned in a range of common image modalities. Our results suggest that, compared to the standard of manual lesion demarcation, the semi-automated Clusterize algorithm is capable of significantly speeding up lesion demarcation in the most commonly used image modalities, without loss of either lesion demarcation precision or lesion demarcation reproducibility. For the three investigated acute datasets (CT, DWI, T2FLAIR), containing a total of 44 patient images obtained in a regular clinical setting at patient admission, the reduction in processing time was on average 17.8 min per patient and this advantage increased with increasing lesion volume (up to 60 min per patient for the largest lesion volumes in our datasets). Additionally, our results suggest that performance of the Clusterize algorithm in a chronic dataset with 11 T1 images was comparable to its performance in the acute datasets. We thus advocate the use of the Clusterize algorithm, integrated into a simple, freely available SPM toolbox, for the precise, reliable and fast preparation of imaging data for lesion–behaviour mapping analyses.

  6. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.
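    The photogrammetric length step mentioned above, converting a pixel measurement to total length (TL) via an object of known size in the image plane, reduces to a simple scale calculation. The sketch below is illustrative only; the coordinates, reference length and straight-line simplification are assumptions, not the authors' software.

        # Sketch of photogrammetric TL estimation from digitised image coordinates.
        # All coordinates and the 300 mm reference length are hypothetical.
        import math

        def pixel_distance(p1, p2):
            return math.hypot(p2[0] - p1[0], p2[1] - p1[1])

        # Scale calibration from a reference bar of known length visible in the frame.
        ref_px = pixel_distance((102, 540), (702, 540))   # reference bar endpoints (px)
        mm_per_px = 300.0 / ref_px                        # assumed 300 mm reference bar

        # Eel TL from digitised snout and tail-tip coordinates.
        tl_px = pixel_distance((150, 210), (980, 330))
        tl_mm = tl_px * mm_per_px
        print(f"estimated TL: {tl_mm:.0f} mm")

    In practice a curved eel would be digitised as a series of midline points and the segment lengths summed; the straight-line case above only keeps the sketch short.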

  7. A semi-automated method for measuring thickness and white matter ...

    African Journals Online (AJOL)

    A semi-automated method for measuring thickness and white matter integrity of the corpus callosum. ... and interhemispheric differences. Future research will determine normal values for age and compare CC thickness with peripheral white matter volume loss in large groups of patients, using the semiautomated technique.

  8. Accuracy and reproducibility of aortic annular measurements obtained from echocardiographic 3D manual and semi-automated software analyses in patients referred for transcatheter aortic valve implantation: implication for prosthesis size selection.

    Science.gov (United States)

    Stella, Stefano; Italia, Leonardo; Geremia, Giulia; Rosa, Isabella; Ancona, Francesco; Marini, Claudia; Capogrosso, Cristina; Giglio, Manuela; Montorfano, Matteo; Latib, Azeem; Margonato, Alberto; Colombo, Antonio; Agricola, Eustachio

    2018-02-06

    A 3D transoesophageal echocardiography (3D-TOE) reconstruction tool has recently been introduced. The system automatically configures a geometric model of the aortic root and performs quantitative analysis of these structures. We compared the measurements of the aortic annulus (AA) obtained by semi-automated 3D-TOE quantitative software and manual analysis vs. multislice computed tomography (MSCT) ones. One hundred and seventy-five patients (mean age 81.3 ± 6.3 years, 77 men) who underwent both MSCT and 3D-TOE for annulus assessment before transcatheter aortic valve implantation were analysed. Hypothetical prosthetic valve sizing was evaluated for both the 3D manual and semi-automated measurements, using the manufacturer-recommended CT-based sizing algorithm as the gold standard. Good correlation between both 3D-TOE methods and MSCT measurements was found, but the semi-automated analysis demonstrated slightly better correlations for AA major diameter (r = 0.89), perimeter (r = 0.89), and area (r = 0.85) (all P values statistically significant). Both 3D methods underestimated the MSCT measurements, but semi-automated measurements showed narrower limits of agreement and lesser bias than manual measurements for most AA parameters. On average, 3D-TOE semi-automated major diameter, area, and perimeter underestimated the respective MSCT measurements by 7.4%, 3.5%, and 4.4%, respectively, whereas minor diameter was overestimated by 0.3%. Moderate agreement for valve sizing was found for both 3D-TOE techniques: kappa agreement 0.5 for both semi-automated and manual analysis. Interobserver and intraobserver agreement for the AA measurements was excellent for both techniques (intraclass correlation coefficients for all parameters >0.80). The 3D-TOE semi-automated analysis of AA is feasible and reliable and can be used in clinical practice as an alternative to MSCT for AA assessment. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author(s) 2018. For permissions, please
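    The quantities reported above (bias, limits of agreement, percentage under- or overestimation relative to MSCT) follow the usual Bland-Altman style calculation. The sketch below uses placeholder arrays rather than study data.

        # Sketch of bias, 95% limits of agreement and mean percentage difference of
        # 3D-TOE measurements relative to the MSCT reference. Arrays are placeholders.
        import numpy as np

        rng = np.random.default_rng(2)
        msct_area = rng.normal(450, 80, size=175)                    # mm^2, placeholder
        toe_area = msct_area * 0.965 + rng.normal(0, 20, size=175)   # ~3.5% underestimation

        diff = toe_area - msct_area
        bias = diff.mean()
        loa = (bias - 1.96 * diff.std(ddof=1), bias + 1.96 * diff.std(ddof=1))
        pct = 100.0 * (diff / msct_area).mean()

        print(f"bias: {bias:.1f} mm^2, 95% LoA: [{loa[0]:.1f}, {loa[1]:.1f}], mean diff: {pct:.1f}%")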

  9. A Study on the Cost-Effectiveness of a SemiAutomated Cutting Process at a Garment Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Castro, Mark Daniel

    2017-11-01

    Full Text Available The subject of the study, Company X, has been experiencing variations in the quantity report from the cutting department and the transmittal reports. The management found that these processes are hugely affected by manual labor. To reduce the system's proneness to human error, the management decided to explore the possibility of adopting a semi-automated spreading and cutting process in the system. This research aims to evaluate the pre-sewing processes of Company X and whether introducing automation can be beneficial to the company and the garments industry. The researchers used process mapping tools, descriptive research, and process flowcharts to assess the current and proposed systems, and engineering economics to evaluate the costs and benefits of implementing the semi-automated system. The results showed that with the implementation of the semi-automated system, the company will incur 66.61% more savings per year than with the current system. In terms of cycle time, the semi-automated system eliminated the relaxation of fabric before the cutting process, thereby greatly reducing cycle time. In addition, the researchers found that as long as the company produces more than 4,140 pieces per day, the system will be economically feasible. Unquantifiable benefits of introducing the semi-automated system to the company were also identified. The company can have a cleaner work environment that will lead to more productivity and greater quality of goods. This will lead to a better company image that will encourage more customers to place job orders.
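    The break-even logic behind a daily-volume threshold such as the 4,140 pieces/day quoted above can be sketched as follows. Every cost figure in the example is hypothetical; only the structure of the calculation is intended to mirror the study's engineering-economics approach.

        # Sketch of a break-even daily volume: the point at which annual savings from
        # the lower per-piece cost cover the annualised machine cost. All figures are
        # hypothetical, not the study's numbers.
        def break_even_pieces_per_day(annualised_machine_cost, manual_cost_per_piece,
                                      auto_cost_per_piece, working_days_per_year):
            saving_per_piece = manual_cost_per_piece - auto_cost_per_piece
            if saving_per_piece <= 0:
                raise ValueError("semi-automated variable cost must be lower than manual")
            return annualised_machine_cost / (saving_per_piece * working_days_per_year)

        threshold = break_even_pieces_per_day(
            annualised_machine_cost=900_000,    # depreciation + maintenance per year
            manual_cost_per_piece=2.00,         # labour + error/rework cost
            auto_cost_per_piece=1.25,           # labour + consumables with machine
            working_days_per_year=300,
        )
        print(f"break-even volume: {threshold:,.0f} pieces/day")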

  10. Intelligent, Semi-Automated Procedure Aid (ISAPA) for ISS Flight Control, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop the Intelligent, Semi-Automated Procedure Aid (ISAPA) intended for use by International Space Station (ISS) ground controllers to increase the...

  11. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects have pushed forward the development of automated crystallization platforms that are now commonly used. This created an urgent need for adapted and automated equipment for crystal analysis. However, these crystals first have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam represents the bottleneck in the analysis process. In this article, a new method to accelerate this process is presented, based on accurately recording the local geometry coordinates of each crystal in the crystallization plate. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for automated crystal centering in the beam, in situ screening or data collection. Here the preliminary results of such a semi-automated pipeline are reported for two distinct test proteins. (authors)

  12. Evaluation of an automated karyotyping system for chromosome aberration analysis

    International Nuclear Information System (INIS)

    Prichard, H.M.

    1987-01-01

    Chromosome aberration analysis is a promising complement to conventional radiation dosimetry, particularly in the complex radiation fields encountered in the space environment. The capabilities of a recently developed automated karyotyping system were evaluated both to determine current capabilities and limitations and to suggest areas where future development should be emphasized. Cells exposed to radiomimetic chemicals and to photon and particulate radiation were evaluated by manual inspection and by automated karyotyping. It was demonstrated that the evaluated programs were appropriate for image digitization, storage, and transmission. However, automated and semi-automated scoring techniques must be advanced significantly if in-flight chromosome aberration analysis is to be practical. A degree of artificial intelligence may be necessary to realize this goal.

  13. Preliminary clinical evaluation of semi-automated nailfold capillaroscopy in the assessment of patients with Raynaud's phenomenon.

    Science.gov (United States)

    Murray, Andrea K; Feng, Kaiyan; Moore, Tonia L; Allen, Phillip D; Taylor, Christopher J; Herrick, Ariane L

    2011-08-01

      Nailfold capillaroscopy is well established in screening patients with Raynaud's phenomenon for underlying SSc-spectrum disorders, by identifying abnormal capillaries. Our aim was to compare semi-automatic feature measurement from newly developed software with manual measurements, and determine the degree to which semi-automated data allows disease group classification.   Images from 46 healthy controls, 21 patients with PRP and 49 with SSc were preprocessed, and semi-automated measurements of intercapillary distance and capillary width, tortuosity, and derangement were performed. These were compared with manual measurements. Features were used to classify images into the three subject groups.   Comparison of automatic and manual measures for distance, width, tortuosity, and derangement had correlations of r=0.583, 0.624, 0.495 (p<0.001), and 0.195 (p=0.040). For automatic measures, correlations were found between width and intercapillary distance, r=0.374, and width and tortuosity, r=0.573 (p<0.001). Significant differences between subject groups were found for all features (p<0.002). Overall, 75% of images correctly matched clinical classification using semi-automated features, compared with 71% for manual measurements.   Semi-automatic and manual measurements of distance, width, and tortuosity showed moderate (but statistically significant) correlations. Correlation for derangement was weaker. Semi-automatic measurements are faster than manual measurements. Semi-automatic parameters identify differences between groups, and are as good as manual measurements for between-group classification. © 2011 John Wiley & Sons Ltd.

  14. Semi-automated, occupationally safe immunofluorescence microtip sensor for rapid detection of Mycobacterium cells in sputum.

    Directory of Open Access Journals (Sweden)

    Shinnosuke Inoue

    Full Text Available An occupationally safe (biosafe) sputum liquefaction protocol was developed for use with a semi-automated antibody-based microtip immunofluorescence sensor. The protocol effectively liquefied sputum and inactivated microorganisms including Mycobacterium tuberculosis, while preserving the antibody-binding activity of Mycobacterium cell surface antigens. Sputum was treated with a synergistic chemical-thermal protocol that included moderate concentrations of NaOH and detergent at 60°C for 5 to 10 min. Samples spiked with M. tuberculosis complex cells showed approximately 10⁶-fold inactivation of the pathogen after treatment. Antibody binding was retained post-treatment, as determined by analysis with a microtip immunosensor. The sensor correctly distinguished between Mycobacterium species and other cell types naturally present in biosafe-treated sputum, with a detection limit of 100 CFU/mL for M. tuberculosis, in a 30-minute sample-to-result process. The microtip device was also semi-automated and shown to be compatible with low-cost, LED-powered fluorescence microscopy. The device and biosafe sputum liquefaction method open the door to rapid detection of tuberculosis in settings with limited laboratory infrastructure.

  15. Toward the use of a mesoscale model at a very high resolution

    Energy Technology Data Exchange (ETDEWEB)

    Gasset, N.; Benoit, R.; Masson, C. [Canada Research Chair on Nordic Environment Aerodynamics of Wind Turbines, Ottawa, ON (Canada)

    2008-07-01

    This presentation described a new compressible mesoscale model designed to obtain wind speed data for potential wind power resource development. Microscale modelling and computational fluid dynamics (CFD) are used to study the mean properties of the surface layer of the atmospheric boundary layer (ABL). Mesoscale models simulate the temporal evolution of synoptic- to mesoscale atmospheric phenomena and are used for environmental modelling. Mesoscale modelling is essential for wind energy applications and large-scale resource evaluation, and can be compared with microscale models in order to validate input data and determine boundary conditions. The compressible community mesoscale model (MC2) comprised a numerical weather prediction (NWP) model with semi-implicit semi-Lagrangian (SISL) dynamics and compressible Euler equation solutions. Physical parameterizations included radiation; microphysics; thermal stratification; turbulence; and convection. The turbulence diffusion feature included unsteady Reynolds-averaged Navier-Stokes; transport equations for turbulent kinetic energy; and mixing lengths. Operating modes included 3-D weather data and surface and ground properties, as well as 1-way self-nesting abilities. The validation framework for the model included a simulation of a set of realistic cases and theoretical cases including full dynamics and physics. Theoretical cases included manually imposed initial and boundary conditions and minimalist physics. Further research is being conducted to refine operating modes and boundary conditions. tabs., figs.

  16. Investigating Semi-Automated Cadastral Boundaries Extraction from Airborne Laser Scanned Data

    Directory of Open Access Journals (Sweden)

    Xianghuan Luo

    2017-09-01

    Full Text Available Many developing countries have witnessed the urgent need to accelerate cadastral surveying processes. Previous studies found that large portions of cadastral boundaries coincide with visible physical objects, namely roads, fences, and building walls. This research explores the application of airborne laser scanning (ALS) techniques to cadastral surveys. A semi-automated workflow is developed to extract cadastral boundaries from an ALS point cloud. Firstly, a two-phased workflow was developed that focused on extracting digital representations of physical objects. In the automated extraction phase, after classifying points into semantic components, the outlines of planar objects such as building roofs and road surfaces were generated by an α-shape algorithm, whilst a centerline delineation approach was applied to the linear object—a fence. Afterwards, the extracted vector lines were edited and refined during the post-refinement phase. Secondly, we quantitatively evaluated the workflow performance by comparing results against an existing cadastral map as reference. It was found that the workflow achieved promising results: around 80% completeness and 60% correctness on average, although the spatial accuracy is still modest. It is argued that the semi-automated extraction workflow could effectively speed up cadastral surveying, with both human resources and equipment costs being reduced.
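    The α-shape outlining step mentioned for roof and road-surface points can be sketched with a Delaunay-based concave hull: keep triangles whose circumradius is below 1/α and return the edges used by exactly one kept triangle. This is a generic sketch, not the authors' implementation, and the point set and α value below are placeholders.

        # Sketch of a 2-D alpha-shape outline via a Delaunay circumradius criterion.
        # Not the authors' code; points and alpha are placeholders.
        import numpy as np
        from scipy.spatial import Delaunay

        def alpha_shape_edges(points, alpha):
            tri = Delaunay(points)
            edge_count = {}
            for ia, ib, ic in tri.simplices:
                pa, pb, pc = points[ia], points[ib], points[ic]
                a, b, c = (np.linalg.norm(pb - pc), np.linalg.norm(pa - pc),
                           np.linalg.norm(pa - pb))
                s = 0.5 * (a + b + c)
                area = max(s * (s - a) * (s - b) * (s - c), 1e-12) ** 0.5
                circumradius = a * b * c / (4.0 * area)
                if circumradius < 1.0 / alpha:                 # keep "small" triangles
                    for e in ((ia, ib), (ib, ic), (ic, ia)):
                        e = tuple(sorted(e))
                        edge_count[e] = edge_count.get(e, 0) + 1
            # Edges belonging to exactly one kept triangle trace the outline.
            return [e for e, n in edge_count.items() if n == 1]

        pts = np.random.default_rng(3).uniform(0, 20, size=(400, 2))   # projected roof points
        outline = alpha_shape_edges(pts, alpha=0.5)                    # ~2 m level of detail
        print(f"{len(outline)} outline edges")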

  17. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    Science.gov (United States)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
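    The kind of dosimetry parameters such a tool extracts (mean dose, near-maximum dose, dose-volume indices) can be sketched in a few lines. The example below is in Python rather than the Java/MATLAB used in the note, and the voxel doses are placeholders.

        # Sketch of typical plan-quality indices from a structure's voxel doses (Gy).
        # Python stand-in for the note's Java/MATLAB routines; values are placeholders.
        import numpy as np

        rng = np.random.default_rng(4)
        dose = rng.normal(60.0, 4.0, size=50_000)    # placeholder target-volume voxel doses

        d_mean = dose.mean()
        d_max = dose.max()
        d95 = np.percentile(dose, 5)                 # dose received by 95% of the volume
        v20 = 100.0 * np.mean(dose >= 20.0)          # % of volume receiving >= 20 Gy

        print(f"Dmean={d_mean:.1f} Gy, Dmax={d_max:.1f} Gy, D95={d95:.1f} Gy, V20={v20:.1f}%")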

  18. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    International Nuclear Information System (INIS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Xiao, Ying; Lee, Andrew J

    2013-01-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10–20 min to 2 min by applying the semi-automated plan-quality evaluation program. (note)

  19. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging.

    Science.gov (United States)

    Patel, Tapan P; Man, Karen; Firestein, Bonnie L; Meaney, David F

    2015-03-30

    Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s-1000+neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high speed fluorescence imaging data is lacking. Here we introduce FluoroSNNAP, Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single-cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule associated protein tau and wild-type tau. We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. Copyright © 2015. Published by Elsevier B.V.
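    The template-based transient detection highlighted above can be illustrated by correlating a canonical fast-rise, slow-decay template with a ΔF/F trace and flagging windows above a correlation threshold. The sketch below is a simplified stand-in for FluoroSNNAP's detector; the sampling rate, template time constants and threshold are hypothetical.

        # Sketch of template-based calcium transient detection via sliding Pearson
        # correlation; simplified stand-in for FluoroSNNAP, parameters hypothetical.
        import numpy as np

        fs = 10.0                                               # imaging rate (Hz)
        t = np.arange(0, 3, 1 / fs)
        template = (1 - np.exp(-t / 0.1)) * np.exp(-t / 1.0)    # rise ~0.1 s, decay ~1 s
        template = (template - template.mean()) / template.std()

        rng = np.random.default_rng(5)
        trace = rng.normal(0, 0.02, size=600)                   # 60 s of baseline noise
        for onset in (50, 220, 400):                            # inject synthetic transients
            trace[onset:onset + len(template)] += 0.3 * (template - template.min())

        corr = np.zeros(len(trace) - len(template))
        for i in range(len(corr)):
            w = trace[i:i + len(template)]
            corr[i] = np.dot((w - w.mean()) / (w.std() + 1e-12), template) / len(template)

        hits = np.flatnonzero(corr > 0.75)
        onsets = hits[np.insert(np.diff(hits) > 1, 0, True)] if hits.size else hits
        print("detected onsets (frames):", onsets)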

  20. Rapid and convenient semi-automated microwave-assisted solid-phase synthesis of arylopeptoids

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Ewald; Boccia, Marcello Massimo; Nielsen, John

    2014-01-01

    A facile and expedient route to the synthesis of arylopeptoid oligomers (N-alkylated aminomethyl benz-amides) using semi-automated microwave-assisted solid-phase synthesis is presented. The synthesis was optimized for the incorporation of side chains derived from sterically hindered or unreactive...

  1. Smartnotebook: A semi-automated approach to protein sequential NMR resonance assignments

    International Nuclear Information System (INIS)

    Slupsky, Carolyn M.; Boyko, Robert F.; Booth, Valerie K.; Sykes, Brian D.

    2003-01-01

    Complete and accurate NMR spectral assignment is a prerequisite for high-throughput automated structure determination of biological macromolecules. However, completely automated assignment procedures generally encounter difficulties for all but the most ideal data sets. Sources of these problems include difficulty in resolving correlations in crowded spectral regions, as well as complications arising from dynamics, such as weak or missing peaks, or atoms exhibiting more than one peak due to exchange phenomena. Smartnotebook is a semi-automated assignment software package designed to combine the best features of the automated and manual approaches. The software finds and displays potential connections between residues, while the spectroscopist makes decisions on which connection is correct, allowing rapid and robust assignment. In addition, smartnotebook helps the user fit chains of connected residues to the primary sequence of the protein by comparing the experimentally determined chemical shifts with expected shifts derived from a chemical shift database, while providing bookkeeping throughout the assignment procedure

  2. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaborating with others. Collaborative Networks (CNs) in service industry sector, however, face many challenges related to sharing and integration of their collection of provided BSs and their corresponding software services. Therefore, the topic of service interoperability for which this article introduces a framework is gaining momentum in research for supporting CNs. It contributes to generation of formal machine readable specification for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and applying desired service behaviour specified by users for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.

  3. A comparison of semi-automated volumetric vs linear measurement of small vestibular schwannomas.

    Science.gov (United States)

    MacKeith, Samuel; Das, Tilak; Graves, Martin; Patterson, Andrew; Donnelly, Neil; Mannion, Richard; Axon, Patrick; Tysome, James

    2018-04-01

    Accurate and precise measurement of vestibular schwannoma (VS) size is key to clinical management decisions. Linear measurements are used in routine clinical practice but are prone to measurement error. This study aims to compare a semi-automated volume segmentation tool against the standard linear method for measuring small VS. This study also examines whether oblique tumour orientation can contribute to linear measurement error. Experimental comparison of observer agreement using two measurement techniques. Tertiary skull base unit. Twenty-four patients with unilateral sporadic small VS were included; measurements comprised semi-automated volumes and the maximum linear dimension, the latter also following reformatting to correct for oblique orientation of the VS. Intra-observer ICC was higher for semi-automated volumetric than for linear measurements, 0.998 (95% CI 0.994-0.999) vs 0.936 (95% CI 0.856-0.972), as was inter-observer ICC, 0.989 (95% CI 0.975-0.995) vs 0.946 (95% CI 0.880-0.976), p = 0.0045. The intra-observer %SDD was similar for volumetric and linear measurements, 9.9% vs 11.8%. However, the inter-observer %SDD was greater for volumetric than linear measurements, 20.1% vs 10.6%. Following oblique reformatting to correct tumour angulation, the mean increase in size was 1.14 mm (p = 0.04). Semi-automated volumetric measurements are more repeatable than linear measurements when measuring small VS and should be considered for use in clinical practice. Oblique orientation of VS may contribute to linear measurement error.

  4. Percutaneous biopsy of a metastatic common iliac lymph node using hydrodissection and a semi-automated biopsy gun

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Seong Yoon; Park, Byung Kwan [Dept. of Radiology, amsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)

    2017-06-15

    Percutaneous biopsy is a less invasive technique for sampling the tissue than laparoscopic biopsy or exploratory laparotomy. However, it is difficult to perform biopsy of a deep-seated lesion because of the possibility of damage to the critical organs. Recently, we successfully performed CT-guided biopsy of a metastatic common iliac lymph node using hydrodissection and semi-automated biopsy devices. The purpose of this case report was to show how to perform hydrodissection and how to use a semi-automated gun for safe biopsy of a metastatic common iliac lymph node.

  5. Vessel suppressed chest Computed Tomography for semi-automated volumetric measurements of solid pulmonary nodules.

    Science.gov (United States)

    Milanese, Gianluca; Eberhard, Matthias; Martini, Katharina; Vittoria De Martini, Ilaria; Frauenfelder, Thomas

    2018-04-01

    To evaluate whether vessel-suppressed computed tomography (VSCT) can be reliably used for semi-automated volumetric measurements of solid pulmonary nodules, as compared to standard CT (SCT). MATERIAL AND METHODS: Ninety-three SCT examinations were processed by dedicated software (ClearRead CT, Riverain Technologies, Miamisburg, OH, USA) that allows subtracting vessels from lung parenchyma. Semi-automated volumetric measurements of 65 solid nodules were compared between SCT and VSCT. The measurements were repeated by two readers. For each solid nodule, the volume measured on SCT by Reader 1 and Reader 2 was averaged, and this average volume between readers acted as the standard of reference value. Concordance between measurements was assessed using Lin's Concordance Correlation Coefficient (CCC). Limits of agreement (LoA) between readers and CT datasets were evaluated. Standard of reference nodule volume ranged from 13 to 366 mm³. The mean overestimation between readers was 3 mm³ and 2.9 mm³ on SCT and VSCT, respectively. Semi-automated volumetric measurements on VSCT showed substantial agreement with the standard of reference (Lin's CCC = 0.990 for Reader 1; 0.985 for Reader 2). The upper and lower LoA between readers' measurements were (16.3, -22.4 mm³) and (15.5, -21.4 mm³) for SCT and VSCT, respectively. VSCT datasets are feasible for the measurement of solid nodules, showing an almost perfect concordance between readers and with measurements on SCT. Copyright © 2018 Elsevier B.V. All rights reserved.
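    Lin's concordance correlation coefficient quoted above has a closed form and is straightforward to compute. The sketch below uses placeholder volume arrays, not study data.

        # Sketch of Lin's concordance correlation coefficient (CCC) between measured
        # nodule volumes and the reference standard. Arrays are placeholders.
        import numpy as np

        def lins_ccc(x, y):
            mx, my = x.mean(), y.mean()
            sxy = np.mean((x - mx) * (y - my))
            return 2 * sxy / (x.var() + y.var() + (mx - my) ** 2)

        rng = np.random.default_rng(6)
        reference = rng.uniform(13, 366, size=65)             # mm^3, placeholder
        vsct = reference + rng.normal(3, 8, size=65)          # small overestimation + noise
        print(f"Lin's CCC = {lins_ccc(reference, vsct):.3f}")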

  6. Automated Detection of Microaneurysms Using Scale-Adapted Blob Analysis and Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Adal, Kedir M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sidebe, Desire [Univ. of Burgundy, Dijon (France); Ali, Sharib [Univ. of Burgundy, Dijon (France); Chaum, Edward [Univ. of Tennessee, Knoxville, TN (United States); Karnowski, Thomas Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Meriaudeau, Fabrice [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-01-07

    Despite several attempts, automated detection of microaneurysm (MA) from digital fundus images still remains to be an open issue. This is due to the subtle nature of MAs against the surrounding tissues. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs from an image and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are then introduced to characterize these blob regions. A semi-supervised based learning approach, which requires few manually annotated learning examples, is also proposed to train a classifier to detect true MAs. The developed system is built using only few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the art techniques as well as the applicability of the proposed features to analyze fundus images.
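    The blob-detection idea at the core of this approach can be illustrated with scikit-image's multi-scale Laplacian-of-Gaussian detector. Because microaneurysms appear as dark spots, the image is inverted before detection; the synthetic image below is a placeholder, and this sketch does not reproduce the paper's scale-adapted descriptors or semi-supervised classifier.

        # Sketch of multi-scale blob detection on a synthetic dark-spot image; not the
        # paper's full pipeline. The image and detector parameters are placeholders.
        import numpy as np
        from skimage.feature import blob_log

        rng = np.random.default_rng(7)
        img = np.full((256, 256), 0.8) + rng.normal(0, 0.02, (256, 256))
        yy, xx = np.mgrid[0:256, 0:256]
        for cy, cx, r in [(60, 80, 3), (150, 200, 5), (200, 60, 4)]:    # synthetic MAs
            img -= 0.4 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * r ** 2))

        # Invert so dark spots become bright blobs, then search a range of scales.
        blobs = blob_log(1.0 - img, min_sigma=2, max_sigma=8, num_sigma=7, threshold=0.1)
        for y, x, sigma in blobs:
            print(f"candidate MA at ({x:.0f}, {y:.0f}), radius ~{sigma * np.sqrt(2):.1f} px")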

  7. Interobserver agreement of semi-automated and manual measurements of functional MRI metrics of treatment response in hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Bonekamp, David; Bonekamp, Susanne; Halappa, Vivek Gowdra; Geschwind, Jean-Francois H.; Eng, John; Corona-Villalobos, Celia Pamela; Pawlik, Timothy M.; Kamel, Ihab R.

    2014-01-01

    Purpose: To assess the interobserver agreement in 50 patients with hepatocellular carcinoma (HCC) before and 1 month after intra-arterial therapy (IAT) using two semi-automated methods and a manual approach for the following functional, volumetric and morphologic parameters: (1) apparent diffusion coefficient (ADC), (2) arterial phase enhancement (AE), (3) portal venous phase enhancement (VE), (4) tumor volume, and assessment according to (5) the Response Evaluation Criteria in Solid Tumors (RECIST), and (6) the European Association for the Study of the Liver (EASL). Materials and methods: This HIPAA-compliant retrospective study had institutional review board approval. The requirement for patient informed consent was waived. Tumor ADC, AE, VE, volume, RECIST, and EASL in 50 index lesions was measured by three observers. Interobserver reproducibility was evaluated using intraclass correlation coefficients (ICC). P < 0.05 was considered to indicate a significant difference. Results: Semi-automated volumetric measurements of functional parameters (ADC, AE, and VE) before and after IAT as well as change in tumor ADC, AE, or VE had better interobserver agreement (ICC = 0.830–0.974) compared with manual ROI-based axial measurements (ICC = 0.157–0.799). Semi-automated measurements of tumor volume and size in the axial plane before and after IAT had better interobserver agreement (ICC = 0.854–0.996) compared with manual size measurements (ICC = 0.543–0.596), and interobserver agreement for change in tumor RECIST size was also higher using semi-automated measurements (ICC = 0.655) compared with manual measurements (ICC = 0.169). EASL measurements of tumor enhancement in the axial plane before and after IAT ((ICC = 0.758–0.809), and changes in EASL after IAT (ICC = 0.653) had good interobserver agreement. Conclusion: Semi-automated measurements of functional changes assessed by ADC and VE based on whole-lesion segmentation demonstrated better reproducibility than

  8. Expert-driven semi-automated geomorphological mapping for a mountainous area using a laser DTM

    NARCIS (Netherlands)

    van Asselen, S.; Seijmonsbergen, A.C.

    2006-01-01

    In this paper a semi-automated method is presented to recognize and spatially delineate geomorphological units in mountainous forested ecosystems, using statistical information extracted from a 1-m resolution laser digital elevation dataset. The method was applied to a mountainous area in Austria.

  9. Semi-automated high-efficiency reflectivity chamber for vacuum UV measurements

    Science.gov (United States)

    Wiley, James; Fleming, Brian; Renninger, Nicholas; Egan, Arika

    2017-08-01

    This paper presents the design and theory of operation for a semi-automated reflectivity chamber for ultraviolet optimized optics. A graphical user interface designed in LabVIEW controls the stages, interfaces with the detector system, takes semi-autonomous measurements, and monitors the system in case of error. Samples and an optical photodiode sit on an optics plate mounted to a rotation stage in the middle of the vacuum chamber. The optics plate rotates the samples and diode between an incident and reflected position to measure the absolute reflectivity of the samples at wavelengths limited by the monochromator operational bandpass of 70 nm to 550 nm. A collimating parabolic mirror on a fine steering tip-tilt motor enables beam steering for detector peak-ups. This chamber is designed to take measurements rapidly and with minimal oversight, increasing lab efficiency for high cadence and high accuracy vacuum UV reflectivity measurements.

  10. Semi-automated analysis of three-dimensional track images

    International Nuclear Information System (INIS)

    Meesen, G.; Poffijn, A.

    2001-01-01

    In the past, three-dimensional (3-d) track images in solid state detectors were difficult to obtain. With the introduction of the confocal scanning laser microscope it is now possible to record 3-d track images in a non-destructive way. These 3-d track images can later be used to measure typical track parameters. Preparing the detectors and recording the 3-d images, however, is only the first step. The second step in this process is enhancing the image quality by means of deconvolution techniques to obtain the maximum possible resolution. The third step is extracting the typical track parameters. This can be done on-screen by an experienced operator. For large sets of data, however, this manual technique is not desirable. This paper will present some techniques to analyse 3-d track data in an automated way by means of image analysis routines. Advanced thresholding techniques guarantee stable results in different recording situations. By using pre-knowledge about the track shape, reliable object identification is obtained. In case of ambiguity, manual intervention is possible.

  11. Intra- and interoperator variability of lobar pulmonary volumes and emphysema scores in patients with chronic obstructive pulmonary disease and emphysema: comparison of manual and semi-automated segmentation techniques.

    Science.gov (United States)

    Molinari, Francesco; Pirronti, Tommaso; Sverzellati, Nicola; Diciotti, Stefano; Amato, Michele; Paolantonio, Guglielmo; Gentile, Luigia; Parapatt, George K; D'Argento, Francesco; Kuhnigk, Jan-Martin

    2013-01-01

    We aimed to compare the intra- and interoperator variability of lobar volumetry and emphysema scores obtained by semi-automated and manual segmentation techniques in lung emphysema patients. In two sessions held three months apart, two operators performed lobar volumetry of unenhanced chest computed tomography examinations of 47 consecutive patients with chronic obstructive pulmonary disease and lung emphysema. Both operators used the manual and semi-automated segmentation techniques. The intra- and interoperator variability of the volumes and emphysema scores obtained by semi-automated segmentation was compared with the variability obtained by manual segmentation of the five pulmonary lobes. The intra- and interoperator variability of the lobar volumes decreased when using semi-automated lobe segmentation (coefficients of repeatability for the first operator: right upper lobe, 147 vs. 96.3; right middle lobe, 137.7 vs. 73.4; right lower lobe, 89.2 vs. 42.4; left upper lobe, 262.2 vs. 54.8; and left lower lobe, 260.5 vs. 56.5; coefficients of repeatability for the second operator: right upper lobe, 61.4 vs. 48.1; right middle lobe, 56 vs. 46.4; right lower lobe, 26.9 vs. 16.7; left upper lobe, 61.4 vs. 27; and left lower lobe, 63.6 vs. 27.5; coefficients of reproducibility in the interoperator analysis: right upper lobe, 191.3 vs. 102.9; right middle lobe, 219.8 vs. 126.5; right lower lobe, 122.6 vs. 90.1; left upper lobe, 166.9 vs. 68.7; and left lower lobe, 168.7 vs. 71.6). The coefficients of repeatability and reproducibility of emphysema scores also decreased when using semi-automated segmentation and had ranges that varied depending on the target lobe and selected threshold of emphysema. Semi-automated segmentation reduces the intra- and interoperator variability of lobar volumetry and provides a more objective tool than manual technique for quantifying lung volumes and severity of emphysema.

  12. Terminal digit bias is not an issue for properly trained healthcare personnel using manual or semi-automated devices - biomed 2010.

    Science.gov (United States)

    Butler, Kenneth R; Minor, Deborah S; Benghuzzi, Hamed A; Tucci, Michelle

    2010-01-01

    The objective of this study was to evaluate terminal digit preference in blood pressure (BP) measurements taken from a sample of clinics at a large academic health sciences center. We hypothesized that terminal digit preference would occur more frequently in BP measurements taken with manual mercury sphygmomanometry compared to those obtained with semi-automated instruments. A total of 1,393 BP measures were obtained in 16 ambulatory and inpatient sites by personnel using both mercury (n=1,286) and semi-automated (n=107) devices. For the semi-automated devices, a trained observer repeated the patient's BP following American Heart Association recommendations using a similar device with a known calibration history. At least two recorded systolic and diastolic blood pressures (average of two or more readings for each) were obtained for all manual mercury readings. Data were evaluated using descriptive statistics and chi-square tests as appropriate (SPSS software, 17.0). Overall, zero and other terminal digit preference was observed more frequently in systolic readings taken with manual instruments (χ² = 883.21, df = 9, p < 0.001), while all end digits obtained by clinic staff using semi-automated devices were more evenly distributed (χ² = 8.23, df = 9, p = 0.511 for systolic and χ² = 10.48, df = 9, p = 0.313 for diastolic). In addition to zero digit bias in mercury readings, even numbers were reported with significantly higher frequency than odd numbers. There was no detectable digit preference observed when examining semi-automated measurements by clinic staff or device type for either systolic or diastolic BP measures. These findings demonstrate that terminal digit preference was more likely to occur with manual mercury sphygmomanometry. This phenomenon was most likely the result of mercury column graduation in 2 mm Hg increments producing a higher than expected frequency of even digits.
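    The terminal-digit analysis itself is a chi-square goodness-of-fit test of the last-digit counts against a uniform expectation. The sketch below uses synthetic readings in which an observer rounds to the nearest 10 mm Hg, so the test flags strong zero preference; the data are placeholders.

        # Sketch of the terminal-digit test: tabulate the last digit of recorded
        # systolic values and compare against a uniform expectation. Data are synthetic.
        import numpy as np
        from scipy.stats import chisquare

        rng = np.random.default_rng(8)
        true_sbp = rng.normal(130, 15, size=1286)
        recorded = (np.round(true_sbp / 10) * 10).astype(int)   # observer rounding to 0

        digits = recorded % 10
        counts = np.bincount(digits, minlength=10)
        chi2, p = chisquare(counts)                             # expected: uniform over 0-9
        print(f"chi-square = {chi2:.1f}, df = 9, p = {p:.3g}")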

  13. Semi-automated relative quantification of cell culture contamination with mycoplasma by Photoshop-based image analysis on immunofluorescence preparations.

    Science.gov (United States)

    Kumar, Ashok; Yerneni, Lakshmana K

    2009-01-01

    Mycoplasma contamination in cell culture is a serious setback for the cell-culturist. The experiments undertaken using contaminated cell cultures are known to yield unreliable or false results due to various morphological, biochemical and genetic effects. Earlier surveys revealed incidences of mycoplasma contamination in cell cultures to range from 15 to 80%. Out of a vast array of methods for detecting mycoplasma in cell culture, the cytological methods directly demonstrate the contaminating organism present in association with the cultured cells. In this investigation, we report the adoption of a cytological immunofluorescence assay (IFA), in an attempt to obtain a semi-automated relative quantification of contamination by employing the user-friendly Photoshop-based image analysis. The study performed on 77 cell cultures randomly collected from various laboratories revealed mycoplasma contamination in 18 cell cultures simultaneously by IFA and Hoechst DNA fluorochrome staining methods. It was observed that the Photoshop-based image analysis on IFA stained slides was very valuable as a sensitive tool in providing quantitative assessment on the extent of contamination both per se and in comparison to cellularity of cell cultures. The technique could be useful in estimating the efficacy of anti-mycoplasma agents during decontaminating measures.

  14. Automated detection of microaneurysms using scale-adapted blob analysis and semi-supervised learning.

    Science.gov (United States)

    Adal, Kedir M; Sidibé, Désiré; Ali, Sharib; Chaum, Edward; Karnowski, Thomas P; Mériaudeau, Fabrice

    2014-04-01

    Despite several attempts, automated detection of microaneurysms (MAs) from digital fundus images still remains an open issue. This is due to the subtle nature of MAs against the surrounding tissues. In this paper, the microaneurysm detection problem is modeled as finding interest regions or blobs from an image, and an automatic local-scale selection technique is presented. Several scale-adapted region descriptors are introduced to characterize these blob regions. A semi-supervised learning approach, which requires only a few manually annotated learning examples, is also proposed to train a classifier which can detect true MAs. The developed system is built using only a few manually labeled and a large number of unlabeled retinal color fundus images. The performance of the overall system is evaluated on the Retinopathy Online Challenge (ROC) competition database. A competition performance measure (CPM) of 0.364 shows the competitiveness of the proposed system against state-of-the-art techniques as well as the applicability of the proposed features to analyze fundus images. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
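
    As a generic illustration of the blob-detection step, a multi-scale Laplacian-of-Gaussian detector from scikit-image can be applied to the green channel of a fundus image; the file name, preprocessing, and parameters below are assumptions, not the authors' pipeline.

```python
# Sketch: multi-scale blob detection as a stand-in for the interest-region step above.
import numpy as np
from skimage import io, exposure
from skimage.feature import blob_log

img = io.imread("fundus.png")              # placeholder path, assumed RGB
green = img[..., 1].astype(float) / 255.0  # MAs contrast best in the green channel
green = exposure.equalize_adapthist(green)

# MAs appear as small dark blobs; invert so they become bright maxima.
blobs = blob_log(1.0 - green, min_sigma=1, max_sigma=5, num_sigma=10, threshold=0.05)
if len(blobs):
    # Each row is (row, col, sigma); sigma * sqrt(2) approximates the blob radius.
    print(f"{len(blobs)} candidate blobs, median radius "
          f"{np.median(blobs[:, 2]) * np.sqrt(2):.1f} px")
else:
    print("no candidate blobs found")
```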

  15. Semi Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion

    Science.gov (United States)

    Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.

    2018-04-01

    Technological improvements in mass data gathering and analysis in recent years have influenced the traditional methods of updating and maintaining the national topographic database, and have brought a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to replace traditional data collection methods have been developed in many National Mapping and Cadastre Agencies, and there has been significant progress in semi-automated methodologies aimed at facilitating the updating of a national topographic geodatabase. Their implementation is expected to allow a considerable reduction in updating costs and operation times. Our previous activity focused on building automatic extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common practice for interpreter identification to be as detailed as possible in order to maintain the most reliable database. When semi-automatic updating methodologies are used, the ability to insert knowledge based on human insight is limited. Our motivation was therefore to reduce this gap by allowing end-users to add their own data inputs to the basic geometric database. In this article, we present a simple land cover database updating method which combines insights extracted from the analyzed image with given spatial vector layers. The main stages of the proposed practice are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping the required shape editing work low. All coding was done using open-source software components.

  16. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    International Nuclear Information System (INIS)

    Lee, Myung Eun; Kim, Jong Hyo; Woo, Bo Yeong; Ko, Micheal D.; Jamshidi, Neema

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability; thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  17. Evaluation of training nurses to perform semi-automated three-dimensional left ventricular ejection fraction using a customised workstation-based training protocol.

    Science.gov (United States)

    Guppy-Coles, Kristyan B; Prasad, Sandhir B; Smith, Kym C; Hillier, Samuel; Lo, Ada; Atherton, John J

    2015-06-01

    We aimed to determine the feasibility of training cardiac nurses to evaluate left ventricular function utilising a semi-automated, workstation-based protocol on three dimensional echocardiography images. Assessment of left ventricular function by nurses is an attractive concept. Recent developments in three dimensional echocardiography coupled with border detection assistance have reduced inter- and intra-observer variability and analysis time. This could allow abbreviated training of nurses to assess cardiac function. A comparative, diagnostic accuracy study evaluating left ventricular ejection fraction assessment utilising a semi-automated, workstation-based protocol performed by echocardiography-naïve nurses on previously acquired three dimensional echocardiography images. Nine cardiac nurses underwent two brief lectures about cardiac anatomy, physiology and three dimensional left ventricular ejection fraction assessment, before a hands-on demonstration in 20 cases. We then selected 50 cases from our three dimensional echocardiography library based on optimal image quality with a broad range of left ventricular ejection fractions, which was quantified by two experienced sonographers and the average used as the comparator for the nurses. Nurses independently measured three dimensional left ventricular ejection fraction using the Auto lvq package with semi-automated border detection. The left ventricular ejection fraction range was 25-72% (70% with a reduced left ventricular ejection fraction); the nurses showed excellent agreement with the sonographers. Minimal intra-observer variability was noted on both short-term (same day) and long-term (>2 weeks later) retest. It is feasible to train nurses to measure left ventricular ejection fraction utilising a semi-automated, workstation-based protocol on previously acquired three dimensional echocardiography images. Further study is needed to determine the feasibility of training nurses to acquire three dimensional echocardiography

  18. Quantifying biodiversity using digital cameras and automated image analysis.

    Science.gov (United States)

    Roadknight, C. M.; Rose, R. J.; Barber, M. L.; Price, M. C.; Marshall, I. W.

    2009-04-01

    Monitoring the effects on biodiversity of extensive grazing in complex semi-natural habitats is labour intensive. There are also concerns about the standardization of semi-quantitative data collection. We have chosen to focus initially on automating the most time-consuming aspect - the image analysis. The advent of cheaper and more sophisticated digital camera technology has led to a sudden increase in the number of habitat monitoring images and information that is being collected. We report on the use of automated trail cameras (designed for the game hunting market) to continuously capture images of grazer activity in a variety of habitats at Moor House National Nature Reserve, which is situated in the North of England at an average altitude of over 600m. Rainfall is high, and in most areas the soil consists of deep peat (1m to 3m), populated by a mix of heather, mosses and sedges. The cameras have been continuously in operation over a 6 month period; daylight images are in full colour and night images (IR flash) are black and white. We have developed artificial intelligence based methods to assist in the analysis of the large number of images collected, generating alert states for new or unusual image conditions. This paper describes the data collection techniques, outlines the quantitative and qualitative data collected and proposes online and offline systems that can reduce the manpower overheads and increase focus on important subsets in the collected data. By converting digital image data into statistical composite data it can be handled in a similar way to other biodiversity statistics, thus improving the scalability of monitoring experiments. Unsupervised feature detection methods and supervised neural methods were tested and offered solutions to simplifying the process. Accurate (85 to 95%) categorization of faunal content can be obtained, requiring human intervention for only those images containing rare animals or unusual (undecidable) conditions, and

  19. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Full Text Available Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy to curate metabolic networks and facilitate identification of metabolic pathways that may not be directly inferable solely from genome annotation was developed. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure developed allows for new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358 ± 0.12, closely matching the experimentally determined growth rate of M. gallisepticum of 0.244 ± 0.03. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as presenting the first genome-scale reconstruction of M. gallisepticum.
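
    Flux balance analysis itself is a linear program: maximize an objective flux subject to the steady-state constraint S·v = 0 and flux bounds. A minimal sketch with a toy stoichiometric matrix follows; it is not the M. gallisepticum reconstruction, just the generic optimization at the core of the method.

```python
# Sketch: flux balance analysis as a linear program on a toy 3-metabolite,
# 4-reaction network. scipy minimizes, so the objective flux is negated.
import numpy as np
from scipy.optimize import linprog

S = np.array([  # rows = metabolites, cols = reactions (uptake, r1, r2, biomass)
    [ 1, -1,  0,  0],
    [ 0,  1, -1,  0],
    [ 0,  0,  1, -1],
])
c = np.array([0, 0, 0, -1.0])                         # maximize flux through "biomass"
bounds = [(0, 10), (0, 1000), (0, 1000), (0, 1000)]   # uptake capped at 10 units

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal biomass flux:", res.x[-1])
```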

  20. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    Science.gov (United States)

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate functional significance of a majority of genome sequences and their long range interactions. As a detailed examination of genome organization and function requires very high quality genome sequence, the objective of this study was to improve reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools proposed an expert mode for a user who can decide on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long insert size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super scaffolds. Finally, a consensus gene annotation was projected on the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes compared to the previous 70%. Unknown sites (N) were reduced from 17.3 to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. Bioinformatics tools developed in this work can be used to improve genome sequence assemblies in

  1. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myung Eun; Kim, Jong Hyo [Center for Medical-IT Convergence Technology Research, Advanced Institutes of Convergence Technology, Seoul National University, Suwon (Korea, Republic of); Woo, Bo Yeong [Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon (Korea, Republic of); Ko, Micheal D.; Jamshidi, Neema [Dept. of Radiological Sciences, University of California, Los Angeles, Los Angeles (United States)

    2017-06-15

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability; thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.
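
    Feature stability in such studies is typically summarized with an intra-class correlation coefficient; a generic two-way random-effects, absolute-agreement ICC (ICC(2,1), after Shrout and Fleiss) can be computed directly from a subjects-by-raters matrix, as sketched below with made-up feature values.

```python
# Sketch: ICC(2,1) (two-way random effects, absolute agreement, single rater)
# for one radiomic feature measured by two raters on the same tumors.
import numpy as np

def icc_2_1(X):
    """X: (n subjects) x (k raters) matrix of one feature's values."""
    n, k = X.shape
    grand = X.mean()
    row_means = X.mean(axis=1)
    col_means = X.mean(axis=0)
    msr = k * np.sum((row_means - grand) ** 2) / (n - 1)     # between-subject mean square
    msc = n * np.sum((col_means - grand) ** 2) / (k - 1)     # between-rater mean square
    sse = np.sum((X - row_means[:, None] - col_means[None, :] + grand) ** 2)
    mse = sse / ((n - 1) * (k - 1))                          # residual mean square
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rater1 = np.array([10.2, 14.8, 9.5, 12.1, 20.3, 16.7])       # illustrative feature values
rater2 = np.array([10.6, 14.1, 9.9, 12.4, 19.8, 16.2])
print(f"ICC(2,1) = {icc_2_1(np.column_stack([rater1, rater2])):.3f}")
```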

  2. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digest and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol had performance equal to or better than a manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. Simulation and analysis of the mesoscale circulation in the northwestern Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    V. Echevin

    Full Text Available The large-scale and mesoscale circulation of the northwestern Mediterranean Sea are simulated with an eddy-resolving primitive-equation regional model (RM of 1/16° resolution embedded in a general circulation model (GM of the Mediterranean Sea of 1/8° resolution. The RM is forced by a monthly climatology of heat fluxes, precipitation and wind stress. The GM, which uses the same atmospheric forcing, provides initial and boundary conditions for the RM. Analysis of the RM results shows that several realistic features of the large-scale and mesoscale circulation are evident in this region. The mean cyclonic circulation is in good agreement with observations. Mesoscale variability is intense along the coasts of Sardinia and Corsica, in the Gulf of Lions and in the Catalan Sea. The length scales of the Northern Current meanders along the Provence coast and in the Gulf of Lions’ shelf are in good agreement with observations. Winter Intermediate Water is formed along most of the north-coast shelves, between the Gulf of Genoa and Cape Creus. Advection of this water by the mean cyclonic circulation generates a complex eddy field in the Catalan Sea. Intense anticyclonic eddies are generated northeast of the Balearic Islands. These results are in good agreement with mesoscale activity inferred from satellite altimetric data. This work demonstrates the feasibility of a down-scaling system composed of a general-circulation, a regional and a coastal model, which is one of the goals of the Mediterranean Forecasting System Pilot Project.

    Key words. Oceanography: physical (currents; eddies and mesoscale processes; general circulation)

  4. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods and factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with Konan specular microscope NSP-9900. All images were analyzed by trained graders using Konan CellChek Software, employing the fully- and semi-automated methods as well as Center Method. Images with low cell count (input cells number <100) and/or guttata were compared with the Center and Flex-Center Methods. ECDs were compared and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in ECD of normal controls <40 yrs old between the fully-automated method and manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully-automated method (p=0.034) and semi-automated method (p=0.025) as compared to manual method. Our findings show that automated analysis significantly overestimates ECD in the eyes with high polymegathism and/or large cell size, compared to the manual method. Therefore, we discourage reliance upon the fully-automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative. Copyright © 2017. Published by Elsevier España, S.L.U.

  5. Evaluation of a semi-automated computer algorithm for measuring total fat and visceral fat content in lambs undergoing in vivo whole body computed tomography.

    Science.gov (United States)

    Rosenblatt, Alana J; Scrivani, Peter V; Boisclair, Yves R; Reeves, Anthony P; Ramos-Nieves, Jose M; Xie, Yiting; Erb, Hollis N

    2017-10-01

    Computed tomography (CT) is a suitable tool for measuring body fat, since it is non-destructive and can be used to differentiate metabolically active visceral fat from total body fat. Whole body analysis of body fat is likely to be more accurate than single CT slice estimates of body fat. The aim of this study was to assess the agreement between semi-automated computer analysis of whole body volumetric CT data and conventional proximate (chemical) analysis of body fat in lambs. Data were collected prospectively from 12 lambs that underwent duplicate whole body CT, followed by slaughter and carcass analysis by dissection and chemical analysis. Agreement between methods for quantification of total and visceral fat was assessed by Bland-Altman plot analysis. The repeatability of CT was assessed for these measures using the mean difference of duplicated measures. When compared to chemical analysis, CT systematically underestimated total and visceral fat contents by more than 10% of the mean fat weight. Therefore, carcass analysis and semi-automated CT computer measurements were not interchangeable for quantifying body fat content without the use of a correction factor. CT acquisition was repeatable, with a mean difference of repeated measures being close to zero. Therefore, uncorrected whole body CT might have an application for assessment of relative changes in fat content, especially in growing lambs. Copyright © 2017 Elsevier Ltd. All rights reserved.
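
    The Bland-Altman analysis used here reduces to the mean difference (bias) between the paired methods and the 95% limits of agreement; a minimal sketch with made-up paired fat measurements follows.

```python
# Sketch: Bland-Altman bias and 95% limits of agreement for paired measurements
# (e.g., CT-derived vs. chemically determined fat weight). Values are illustrative.
import numpy as np

ct_fat = np.array([3.1, 4.0, 2.7, 5.2, 3.8, 4.5, 2.9, 3.6])      # kg, method A
chem_fat = np.array([3.5, 4.4, 3.0, 5.9, 4.1, 5.0, 3.3, 4.0])    # kg, method B

diff = ct_fat - chem_fat
bias = diff.mean()
half_width = 1.96 * diff.std(ddof=1)          # half-width of the limits of agreement

print(f"bias = {bias:.2f} kg, "
      f"limits of agreement = [{bias - half_width:.2f}, {bias + half_width:.2f}] kg")
```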

  6. Moving from proprietary to open-source solutions for academic research in remote sensing: Example with semi-automated land cover mapping

    OpenAIRE

    Grippa, Taïs

    2017-01-01

    GRASS GIS has recently experienced significant improvements for Object-Based Image Analysis. At ULB the choice was made to combine GRASS GIS and Python in a semi-automated processing chain for land-cover mapping. The latter proved its ability to be quickly customized in order to match the requirements of different projects. In order to promote the OSGEO software, we decided to make it freely available, allowing anyone interested to review, reuse and/or enhance it for further studies.

  7. A semi-automated algorithm for hypothalamus volumetry in 3 Tesla magnetic resonance images.

    Science.gov (United States)

    Wolff, Julia; Schindler, Stephanie; Lucas, Christian; Binninger, Anne-Sophie; Weinrich, Luise; Schreiber, Jan; Hegerl, Ulrich; Möller, Harald E; Leitzke, Marco; Geyer, Stefan; Schönknecht, Peter

    2018-07-30

    The hypothalamus, a small diencephalic gray matter structure, is part of the limbic system. Volumetric changes of this structure occur in psychiatric diseases, therefore there is increasing interest in precise volumetry. Based on our detailed volumetry algorithm for 7 Tesla magnetic resonance imaging (MRI), we developed a method for 3 Tesla MRI, adopting anatomical landmarks and working in a triplanar view. We overlaid T1-weighted MR images with gray matter-tissue probability maps to combine anatomical information with tissue class segmentation. Then, we outlined regions of interest (ROIs) that covered potential hypothalamus voxels. Within these ROIs, a seed-growing technique helped define the hypothalamic volume using gray matter probabilities from the tissue probability maps. This yielded a semi-automated method with short processing times of 20-40 min per hypothalamus. In the MRIs of ten subjects, reliabilities were determined as intraclass correlations (ICC) and volume overlaps in percent. Three raters achieved very good intra-rater reliabilities (ICC 0.82-0.97) and good inter-rater reliabilities (ICC 0.78 and 0.82). Overlaps of intra- and inter-rater runs were very good (≥ 89.7%). We present a fast, semi-automated method for in vivo hypothalamus volumetry in 3 Tesla MRI. Copyright © 2018 Elsevier B.V. All rights reserved.
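
    The seed-growing step — expanding from a seed to connected voxels whose gray matter probability exceeds a threshold, restricted to the drawn ROI — can be sketched generically in 2-D; the probability map, ROI, seed, and threshold below are illustrative assumptions, not the published protocol.

```python
# Sketch: threshold-based seed growing on a tissue probability map, confined to an ROI.
from collections import deque
import numpy as np

def grow_region(prob_map, roi_mask, seed, threshold=0.5):
    """Return a boolean mask of pixels connected to `seed` with prob >= threshold."""
    grown = np.zeros_like(prob_map, dtype=bool)
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        if grown[y, x] or not roi_mask[y, x] or prob_map[y, x] < threshold:
            continue
        grown[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):     # 4-connected neighbours
            ny, nx = y + dy, x + dx
            if 0 <= ny < prob_map.shape[0] and 0 <= nx < prob_map.shape[1]:
                queue.append((ny, nx))
    return grown

rng = np.random.default_rng(0)
prob = rng.random((64, 64))                    # stand-in for a gray matter probability map
roi = np.zeros((64, 64), dtype=bool)
roi[20:45, 20:45] = True                       # stand-in for the manually outlined ROI
mask = grow_region(prob, roi, seed=(32, 32), threshold=0.3)
print("grown pixels:", int(mask.sum()))
```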

  8. Subregional characterization of mesoscale eddies across the Brazil-Malvinas Confluence

    Science.gov (United States)

    Mason, Evan; Pascual, Ananda; Gaube, Peter; Ruiz, Simón; Pelegrí, Josep L.; Delepoulle, Antoine

    2017-04-01

    Horizontal and vertical motions associated with coherent mesoscale structures, including eddies and meanders, are responsible for significant global transports of many properties, including heat and mass. Mesoscale vertical fluxes also influence upper ocean biological productivity by mediating the supply of nutrients into the euphotic layer, with potential impacts on the global carbon cycle. The Brazil-Malvinas Confluence (BMC) is a western boundary current region in the South Atlantic with intense mesoscale activity. This region has an active role in the genesis and transformation of water masses and thus is a critical component of the Atlantic meridional overturning circulation. The collision between the Malvinas and Brazil Currents over the Patagonian shelf/slope creates an energetic front that translates offshore to form a vigorous eddy field. Recent improvements in gridded altimetric sea level anomaly fields allow us to track BMC mesoscale eddies with high spatial and temporal resolutions using an automated eddy tracker. We characterize the eddies across fourteen 5° × 5° subregions. Eddy-centric composites of tracers and geostrophic currents diagnosed from a global reanalysis of surface and in situ data reveal substantial subregional heterogeneity. The in situ data are also used to compute the evolving quasi-geostrophic vertical velocity (QG-ω) associated with each instantaneous eddy instance. The QG-ω eddy composites have the expected dipole patterns of alternating upwelling/downwelling, however, the magnitude and sign of azimuthally averaged vertical velocity varies among subregions. Maximum eddy values are found near fronts and sharp topographic gradients. In comparison with regional eddy composites, subregional composites provide refined information about mesoscale eddy heterogeneity.

  9. Semi-automated CCTV surveillance: the effects of system confidence, system accuracy and task complexity on operator vigilance, reliance and workload.

    Science.gov (United States)

    Dadashi, N; Stedmon, A W; Pridmore, T P

    2013-09-01

    Recent advances in computer vision technology have led to the development of various automatic surveillance systems; however, their effectiveness is adversely affected by many factors and they are not completely reliable. This study investigated the potential of a semi-automated surveillance system to reduce CCTV operator workload in both detection and tracking activities. A further focus of interest was the degree of user reliance on the automated system. A simulated prototype was developed which mimicked an automated system that provided different levels of system confidence information. Dependent variable measures were taken for secondary task performance, reliance and subjective workload. When the automatic component of a semi-automatic CCTV surveillance system provided reliable system confidence information to operators, workload significantly decreased and spare mental capacity significantly increased. Providing feedback about system confidence and accuracy appears to be one important way of making the status of the automated component of the surveillance system more 'visible' to users and hence more effective to use. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  10. Mesoscale wind fluctuations over Danish waters

    Energy Technology Data Exchange (ETDEWEB)

    Vincent, C.L.

    2010-12-15

    Mesoscale wind fluctuations affect the large scale integration of wind power because they undermine the day-ahead predictability of wind speed and power production, and because they can result in large fluctuations in power generation that must be balanced using reserve power. Large fluctuations in generated power are a particular problem for offshore wind farms because the typically high concentration of turbines within a limited geographical area means that fluctuations can be correlated across large numbers of turbines. Furthermore, organised mesoscale structures that often form over water, such as convective rolls and cellular convection, have length scales of tens of kilometers, and can cause large wind fluctuations on a time scale of around an hour. This thesis is an exploration of the predictability of mesoscale wind fluctuations using observations from the world's first two large offshore wind farms - Horns Rev I in the North Sea, and Nysted in the Baltic Sea. The thesis begins with a climatological analysis of wind fluctuations on time scales of 1-10 hours at the two sites. A novel method for calculating conditional climatologies of spectral information is proposed, based on binning and averaging the time axis of the Hilbert spectrum. Results reveal clear patterns between wind fluctuations and locally observed meteorological conditions. The analysis is expanded by classifying wind fluctuations on time scales of 1-3 hours according to synoptic patterns, satellite pictures and wind classes. Results indicate that cold air outbreaks and open cellular convection are a significant contributor to mesoscale wind variability at Horns Rev. The predictability of mesoscale wind fluctuations is tested by implementing standard statistical models that relate local wind variability to parameters based on a large scale weather analysis. The models show some skill, but only achieve a 15% improvement on a persistence forecast. The possibility of explicitly modelling

  11. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    Science.gov (United States)

    2014-01-01

    Background Computational methods for mining of biomedical literature can be useful in augmenting manual searches of the literature using keywords for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER, and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine significance of putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer
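
    The z-score ranking of putative biomarkers can be illustrated as a comparison of how often an entity is tagged in the positive abstract set versus the negative set; the entity counts below are invented and the two-proportion scoring is a simplified stand-in for the authors' scripts.

```python
# Sketch: rank candidate biomarkers by a two-proportion z-score comparing mention rates
# in "disease + biofluid" abstracts vs. control abstracts. Entity counts are illustrative.
import math

n_pos, n_neg = 34296, 2653396      # abstract counts from the breast cancer example above

candidates = {                      # entity: (mentions in positive set, in negative set)
    "HER2":   (1200, 5200),
    "CA15-3": (310, 480),
    "actin":  (400, 41000),
}

def z_score(k_pos, k_neg):
    p_pos, p_neg = k_pos / n_pos, k_neg / n_neg
    p_pool = (k_pos + k_neg) / (n_pos + n_neg)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_pos + 1 / n_neg))
    return (p_pos - p_neg) / se

for name, (kp, kn) in sorted(candidates.items(), key=lambda kv: -z_score(*kv[1])):
    print(f"{name:8s} z = {z_score(kp, kn):8.1f}")
```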

  12. Semi-automated extraction and characterization of Stromal Vascular Fraction using a new medical device.

    Science.gov (United States)

    Hanke, Alexander; Prantl, Lukas; Wenzel, Carina; Nerlich, Michael; Brockhoff, Gero; Loibl, Markus; Gehmert, Sebastian

    2016-01-01

    The stem cell rich Stromal Vascular Fraction (SVF) can be harvested by processing lipo-aspirate or fat tissue with an enzymatic digestion followed by centrifugation. To date neither a standardised extraction method for SVF nor a generally admitted protocol for cell application in patients exists. A novel commercially available semi-automated device for the extraction of SVF promises sterility, consistent results and usability in the clinical routine. The aim of this work was to compare the quantity and quality of the SVF between the new system and an established manual laboratory method. SVF was extracted from lipo-aspirate both by a prototype of the semi-automated UNiStation™ (NeoGenesis, Seoul, Korea) and by hand preparation with common laboratory equipment. Cell composition of the SVF was characterized by multi-parametric flow-cytometry (FACSCanto-II, BD Biosciences). The total cell number (quantity) of the SVF was determined, as well as the percentage of cells expressing the stem cell marker CD34, the leucocyte marker CD45 and the marker CD271 for highly proliferative stem cells (quality). Lipo-aspirate obtained from six patients was processed with both the novel device (d) and the hand preparation (h), which always resulted in a macroscopically visible SVF. However, there was a tendency towards a lower cell yield per gram of used lipo-aspirate with the device (d: 1.1×10⁵ ± 1.1×10⁵ vs. h: 2.0×10⁵ ± 1.7×10⁵; p = 0.06). Noteworthy, the percentage of CD34+ cells was significantly lower when using the device (d: 57.3% ±23.8% vs. h: 74.1% ±13.4%; p = 0.02) and CD45+ leukocyte counts tended to be higher when compared to the hand preparation (d: 20.7% ±15.8% vs. h: 9.8% ±7.1%; p = 0.07). The percentage of highly proliferative CD271+ cells was similar for both methods (d: 12.9% ±9.6% vs. h: 13.4% ±11.6%; p = 0.74) and no differences were found for double positive cells of CD34+/CD45+ (d: 5.9% ±1.7% vs. h: 1.7% ±1.1%; p = 0.13), CD34+/CD271+ (d: 24

  13. Modeling mesoscale eddies

    Science.gov (United States)

    Canuto, V. M.; Dubovikov, M. S.

    Mesoscale eddies are not resolved in coarse resolution ocean models and must be modeled. They affect both mean momentum and scalars. At present, no generally accepted model exists for the former; in the latter case, mesoscales are modeled with a bolus velocity u∗ to represent a sink of mean potential energy. However, comparison of u∗(model) vs. u∗ (eddy resolving code, [J. Phys. Ocean. 29 (1999) 2442]) has shown that u∗(model) is incomplete and that additional terms, "unrelated to thickness source or sinks", are required. Thus far, no form of the additional terms has been suggested. To describe mesoscale eddies, we employ the Navier-Stokes and scalar equations and a turbulence model to treat the non-linear interactions. We then show that the problem reduces to an eigenvalue problem for the mesoscale Bernoulli potential. The solution, which we derive in analytic form, is used to construct the momentum and thickness fluxes. In the latter case, the bolus velocity u∗ is found to contain two types of terms: the first type entails the gradient of the mean potential vorticity and represents a positive contribution to the production of mesoscale potential energy; the second type of terms, which is new, entails the velocity of the mean flow and represents a negative contribution to the production of mesoscale potential energy, or equivalently, a backscatter process whereby a fraction of the mesoscale potential energy is returned to the original reservoir of mean potential energy. This type of terms satisfies the physical description of the additional terms given by [J. Phys. Ocean. 29 (1999) 2442]. The mesoscale flux that enters the momentum equations is also contributed by two types of terms of the same physical nature as those entering the thickness flux. The potential vorticity flux is also shown to contain two types of terms: the first is of the gradient-type while the other terms entail the velocity of the mean flow. An expression is derived for the mesoscale

  14. Geostatistical Analysis of Mesoscale Spatial Variability and Error in SeaWiFS and MODIS/Aqua Global Ocean Color Data

    Science.gov (United States)

    Glover, David M.; Doney, Scott C.; Oestreich, William K.; Tullo, Alisdair W.

    2018-01-01

    Mesoscale (10-300 km, weeks to months) physical variability strongly modulates the structure and dynamics of planktonic marine ecosystems via both turbulent advection and environmental impacts upon biological rates. Using structure function analysis (geostatistics), we quantify the mesoscale biological signals within global 13 year SeaWiFS (1998-2010) and 8 year MODIS/Aqua (2003-2010) chlorophyll a ocean color data (Level-3, 9 km resolution). We present geographical distributions, seasonality, and interannual variability of key geostatistical parameters: unresolved variability or noise, resolved variability, and spatial range. Resolved variability is nearly identical for both instruments, indicating that geostatistical techniques isolate a robust measure of biophysical mesoscale variability largely independent of measurement platform. In contrast, unresolved variability in MODIS/Aqua is substantially lower than in SeaWiFS, especially in oligotrophic waters where previous analysis identified a problem for the SeaWiFS instrument likely due to sensor noise characteristics. Both records exhibit a statistically significant relationship between resolved mesoscale variability and the low-pass filtered chlorophyll field horizontal gradient magnitude, consistent with physical stirring acting on large-scale gradient as an important factor supporting observed mesoscale variability. Comparable horizontal length scales for variability are found from tracer-based scaling arguments and geostatistical decorrelation. Regional variations between these length scales may reflect scale dependence of biological mechanisms that also create variability directly at the mesoscale, for example, enhanced net phytoplankton growth in coastal and frontal upwelling and convective mixing regions. Global estimates of mesoscale biophysical variability provide an improved basis for evaluating higher resolution, coupled ecosystem-ocean general circulation models, and data assimilation.
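
    Structure function analysis amounts to computing the mean squared difference of a field as a function of separation distance, from which a nugget (unresolved variability), sill (resolved variability), and decorrelation range can be read off; a one-dimensional sketch on a synthetic 9 km resolution transect follows, with illustrative data.

```python
# Sketch: second-order structure function (semivariogram-like) of a 1-D transect.
# The small-lag intercept plays the role of unresolved variability/noise, the plateau
# of resolved variability, and the lag where it flattens of the decorrelation scale.
import numpy as np

rng = np.random.default_rng(1)
x = np.arange(0, 900, 9.0)                                   # along-track distance, km
chl = np.sin(2 * np.pi * x / 300) + 0.2 * rng.standard_normal(x.size)  # synthetic anomaly

def structure_function(values, spacing, max_lag_bins=40):
    lags, sf = [], []
    for h in range(1, max_lag_bins + 1):
        d = values[h:] - values[:-h]
        lags.append(h * spacing)
        sf.append(0.5 * np.mean(d ** 2))                     # semivariance at lag h
    return np.array(lags), np.array(sf)

lags, gamma = structure_function(chl, spacing=9.0)
print("lag (km) -> semivariance:", list(zip(lags[:5].round(0), gamma[:5].round(3))))
```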

  15. Applicability Of A Semi-Automated Clinical Chemistry Analyzer In Determining The Antioxidant Concentrations Of Selected Plants

    Directory of Open Access Journals (Sweden)

    Allan L. Hilario

    2017-07-01

    Full Text Available Plants are rich sources of antioxidants that are protective against diseases associated with oxidative stress. There is a need for a high-throughput screening method for determining the antioxidant concentration in plants; such a screening method should significantly simplify and speed up most antioxidant assays. This paper aimed at comparing the applicability of a semi-automated clinical chemistry analyzer (Pointe Scientific, MI, USA) with the traditional standard curve method using a Vis spectrophotometer in performing the DPPH assay for antioxidant screening. Samples of crude aqueous leaf extract of kulitis (Amaranthus viridis Linn.) and chayote (Sechium edule Linn.) were screened for the Total Antioxidant Concentration (TAC) using the two methods. Results, presented as mean ± SD in μg/dl, were compared using an unpaired Student's t-test (P < 0.05 considered significant). All runs were done in triplicate. The mean TAC of A. viridis was 646.0 ± 45.5 μg/dl using the clinical chemistry analyzer and 581.9 ± 19.4 μg/dl using the standard curve-spectrophotometer method. On the other hand, the mean TAC of S. edule was 660.2 ± 35.9 μg/dl using the semi-automated clinical chemistry analyzer and 672.3 ± 20.9 μg/dl using the spectrophotometer. No significant differences were observed between the readings of the two methods for either A. viridis (P > 0.05) or S. edule (P > 0.05). This implies that the clinical chemistry analyzer can be an alternative method for conducting the DPPH assay to determine the TAC in plants. This study presented the applicability of a semi-automated clinical chemistry analyzer in performing the DPPH assay. Further validation can be conducted by performing other antioxidant assays using this equipment.

  16. Delayed shear enhancement in mesoscale atmospheric dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Moran, M.D. [Atmospheric Environment Service, Ontario (Canada); Pielke, R.A. [Colorado State Univ., Fort Collins, CO (United States)

    1994-12-31

    Mesoscale atmospheric dispersion (MAD) is more complicated than smaller-scale dispersion because the mean wind field can no longer be considered steady or horizontally homogeneous over mesoscale time and space scales. Wind shear also plays a much more important role on the mesoscale: horizontal dispersion can be enhanced and often dominated by vertical wind shear on these scales through the interaction of horizontal differential advection and vertical mixing. Just over 30 years ago, Pasquill suggested that this interaction need not be simultaneous and that the combination of differential horizontal advection with delayed or subsequent vertical mixing could maintain effective horizontal diffusion in spite of temporal or spatial reductions in boundary-layer turbulence intensity. This two-step mechanism has not received much attention since then, but a recent analysis of observations from and numerical simulations of two mesoscale tracer experiments suggests that delayed shear enhancement can play an important role in MAD. This paper presents an overview of this analysis, with particular emphasis on the influence of resolvable vertical shear on MAD in these two case studies and the contributions made by delayed shear enhancement.

  17. NetFCM: A Semi-Automated Web-Based Method for Flow Cytometry Data Analysis

    DEFF Research Database (Denmark)

    Frederiksen, Juliet Wairimu; Buggert, Marcus; Karlsson, Annika C.

    2014-01-01

    data analysis has become more complex and labor-intensive than previously. We have therefore developed a semi-automatic gating strategy (NetFCM) that uses clustering and principal component analysis (PCA) together with other statistical methods to mimic manual gating approaches. NetFCM is an online...... tool both for subset identification as well as for quantification of differences between samples. Additionally, NetFCM can classify and cluster samples based on multidimensional data. We tested the method using a data set of peripheral blood mononuclear cells collected from 23 HIV-infected individuals...... corresponding to those obtained by manual gating strategies. These data demonstrate that NetFCM has the potential to identify relevant T cell populations by mimicking classical FCM data analysis and reduce the subjectivity and amount of time associated with such analysis. (c) 2014 International Society...
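
    The clustering-plus-PCA idea can be illustrated on a toy event matrix (cells by fluorescence channels); scikit-learn's PCA and KMeans below are generic stand-ins for NetFCM's actual statistical machinery, and the synthetic two-population data is invented.

```python
# Sketch: project flow cytometry events into PCA space and cluster them into
# candidate subsets; synthetic two-population data stands in for real events.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
pop_a = rng.normal(loc=[8, 2, 5, 1], scale=0.7, size=(3000, 4))   # events x channels
pop_b = rng.normal(loc=[2, 8, 5, 1], scale=0.7, size=(2000, 4))
events = np.vstack([pop_a, pop_b])

scores = PCA(n_components=2).fit_transform(events)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)

frequencies = np.bincount(labels) / len(labels) * 100
print("subset frequencies (%):", frequencies.round(1))
```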

  18. Semi-automated identification of artefact and noise signals in MEG sensors

    International Nuclear Information System (INIS)

    Rettich, E.

    2006-09-01

    Magnetoencephalography (MEG) is a noninvasive method of measuring cerebral activity. It is based on the registration of magnetic fields that are induced by synaptic ion currents as the brain processes information. These magnetic fields are of a very small magnitude, ranging from a few femtotesla (1 fT = 10⁻¹⁵ T) to several thousand fT (1 pT). This is equivalent to a ten thousandth to a billionth of the Earth's magnetic field. When applied with a time resolution in the range of milliseconds this technique permits research on time-critical neurophysiological processes. A meaningful analysis of MEG data presupposes that signals have been measured at low noise levels. This in turn requires magnetic shielding, normally in the form of a shielded cabin, and low-noise detectors. Data input from high-noise channels impairs the result of the measurement, possibly rendering it useless. To prevent this it is necessary to identify high-noise channels and remove them from the measurement data. At Juelich Research Center, like at most MEG laboratories, this is done by visual inspection. However, being dependent on the individual observer, this method does not yield objective results. Furthermore, visual inspection presupposes a high degree of experience and is time-consuming. This situation could be significantly improved by automated identification of high-noise channels. The purpose of the present study was to develop an algorithm that analyses measurement signals in a given time and frequency interval on the basis of statistical traits. Using a suitably designed user interface this permits searching MEG data for high-noise channel data below or above statistical threshold values on the basis of predetermined decision criteria. The identified high-noise channels are then output in a selection list, and the measurement data and results of the statistical analysis are displayed. This information enables the user to make changes and decide which high-noise channels to extract
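
    One simple realization of such statistical flagging is to compute a per-channel noise metric and mark channels that deviate from the array median by more than a few robust standard deviations; the channel count, data, and threshold factor below are illustrative, not the algorithm developed in the thesis.

```python
# Sketch: flag MEG channels whose variance deviates from the sensor-array median
# by more than k robust standard deviations (median-absolute-deviation based).
import numpy as np

rng = np.random.default_rng(3)
n_channels, n_samples = 248, 10000
data = rng.standard_normal((n_channels, n_samples)) * 1e-13   # ~100 fT noise floor
data[[12, 87]] *= 25                                          # two artificially noisy channels

variances = data.var(axis=1)
med = np.median(variances)
mad = np.median(np.abs(variances - med))
robust_sd = 1.4826 * mad                       # MAD -> sigma for Gaussian-like data
k = 5.0
noisy = np.where(np.abs(variances - med) > k * robust_sd)[0]
print("flagged channels:", noisy.tolist())
```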

  19. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Seol, Hae Young [Korea University Guro Hospital, Department of Radiology, Seoul (Korea, Republic of); Noh, Kyoung Jin [Soonchunhyang University, Department of Electronic Engineering, Asan (Korea, Republic of); Shim, Hackjoon [Toshiba Medical Systems Korea Co., Seoul (Korea, Republic of)

    2017-05-15

    We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, which is a measure of permeability of capillaries), of brain tumors were generated by a commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors that showed consistent perfusion trends with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limit of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates perfusion parameters of brain tumors. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP. (orig.)

  20. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    Science.gov (United States)

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, which is a measure of permeability of capillaries), of brain tumors were generated by a commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors that showed consistent perfusion trends with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρ c ) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limit of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates perfusion parameters of brain tumors. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.
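
    Lin's concordance correlation coefficient used for this validation combines precision (correlation) and accuracy (bias) in a single index, ρc = 2·cov(x, y) / (var(x) + var(y) + (mean(x) − mean(y))²); a minimal sketch with made-up paired perfusion values follows.

```python
# Sketch: Lin's concordance correlation coefficient between two measurement methods.
import numpy as np

def lin_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]          # population covariance
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

auto = np.array([52.1, 48.3, 60.0, 45.2, 55.7, 49.9])    # e.g., semi-automated AF values
manual = np.array([51.8, 49.0, 59.2, 45.5, 56.1, 50.3])  # manual segmentation values
print(f"rho_c = {lin_ccc(auto, manual):.4f}")
```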

  1. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    Directory of Open Access Journals (Sweden)

    Kottawattage S. A. Kottawatta

    2017-11-01

    Full Text Available Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Both semi-automated and wet market processing resulted in contamination of the broiler neck skins, at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  2. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples.

    Science.gov (United States)

    Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-11-29

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Both semi-automated and wet market processing resulted in contamination of the broiler neck skins, at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  3. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. The normalized closed contour method applied to a hydrology-based relief model derived from a multiple-direction flow-routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized relief closed contour method may be the most capable method to date, but more development is required.
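
    As a purely illustrative sketch of the general idea behind the normalized closed contour approach (normalize a local relief model, then keep closed contours as LSB candidates), the following Python fragment uses an assumed window size and contour level that are not the authors' parameters:

        # Illustrative only: normalize local relief, then extract closed contours
        # as candidate longitudinal subglacial bedform (LSB) footprints.
        import numpy as np
        from scipy.ndimage import minimum_filter, maximum_filter
        from skimage import measure

        def candidate_lsb_contours(dtm, window=51, level=0.5):
            """Return closed contours of a locally normalized relief model."""
            local_min = minimum_filter(dtm, size=window)
            local_max = maximum_filter(dtm, size=window)
            nlr = (dtm - local_min) / (local_max - local_min + 1e-9)  # relief scaled to [0, 1]
            contours = measure.find_contours(nlr, level)
            # A closed contour starts and ends at the same vertex.
            return [c for c in contours if np.allclose(c[0], c[-1])]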

  4. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    Science.gov (United States)

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM-files. For each item it is possible to query web services which return unique concept codes without leaving the context of the document. Although it was not feasible to perform fully automated coding, we implemented a dialog-based method to perform an efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  5. Unifying Inference of Meso-Scale Structures in Networks.

    Science.gov (United States)

    Tunç, Birkan; Verma, Ragini

    2015-01-01

    Networks are among the most prevalent formal representations in scientific studies, employed to depict interactions between objects such as molecules, neuronal clusters, or social groups. Studies performed at meso-scale that involve grouping of objects based on their distinctive interaction patterns form one of the main lines of investigation in network science. In a social network, for instance, meso-scale structures can correspond to isolated social groupings or groups of individuals that serve as a communication core. Currently, the research on different meso-scale structures such as community and core-periphery structures has been conducted via independent approaches, which precludes the possibility of an algorithmic design that can handle multiple meso-scale structures and deciding which structure explains the observed data better. In this study, we propose a unified formulation for the algorithmic detection and analysis of different meso-scale structures. This facilitates the investigation of hybrid structures that capture the interplay between multiple meso-scale structures and statistical comparison of competing structures, all of which have been hitherto unavailable. We demonstrate the applicability of the methodology in analyzing the human brain network, by determining the dominant organizational structure (communities) of the brain, as well as its auxiliary characteristics (core-periphery).
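
    For orientation only, the snippet below detects a single kind of meso-scale structure (communities) with an off-the-shelf method; the unified formulation described above additionally covers core-periphery and hybrid structures, which this sketch does not attempt:

        # Illustrative community detection on a toy graph (assumes networkx is installed).
        import networkx as nx
        from networkx.algorithms.community import greedy_modularity_communities

        G = nx.karate_club_graph()                      # example network
        communities = greedy_modularity_communities(G)  # list of frozensets of nodes
        for i, nodes in enumerate(communities):
            print(f"community {i}: {sorted(nodes)}")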

  6. Unifying Inference of Meso-Scale Structures in Networks.

    Directory of Open Access Journals (Sweden)

    Birkan Tunç

    Full Text Available Networks are among the most prevalent formal representations in scientific studies, employed to depict interactions between objects such as molecules, neuronal clusters, or social groups. Studies performed at meso-scale that involve grouping of objects based on their distinctive interaction patterns form one of the main lines of investigation in network science. In a social network, for instance, meso-scale structures can correspond to isolated social groupings or groups of individuals that serve as a communication core. Currently, the research on different meso-scale structures such as community and core-periphery structures has been conducted via independent approaches, which precludes the possibility of an algorithmic design that can handle multiple meso-scale structures and deciding which structure explains the observed data better. In this study, we propose a unified formulation for the algorithmic detection and analysis of different meso-scale structures. This facilitates the investigation of hybrid structures that capture the interplay between multiple meso-scale structures and statistical comparison of competing structures, all of which have been hitherto unavailable. We demonstrate the applicability of the methodology in analyzing the human brain network, by determining the dominant organizational structure (communities) of the brain, as well as its auxiliary characteristics (core-periphery).

  7. Technical note on the validation of a semi-automated image analysis software application for estrogen and progesterone receptor detection in breast cancer

    Science.gov (United States)

    2011-01-01

    Background The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole slide digitalization supported by dedicated software tools allows quantization of the image objects (e.g. cell membrane, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens provided strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. Methods The effectiveness of two connected semi-automated image analysis applications (NuclearQuant v. 1.13 for Pannoramic™ Viewer v. 1.14) for determination of ER and PR status in formalin-fixed paraffin embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First, the detection algorithm was calibrated to the scores provided by an independent assessor (pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean number of 195 cells. Each cell was manually marked and scored according to the Allred-system combining frequency and intensity scores. The performance of the calibrated algorithm was tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. Results The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986) from slight or moderate agreement at the start of the study, using the un-calibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa provided almost perfect agreement (κ = 0.981) between the two methods.
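
    The agreement statistics quoted above can be reproduced in a few lines; the sketch below uses made-up score vectors purely to show how Cohen's kappa and the quadratic weighted kappa are computed:

        # Agreement between pathologist and algorithm Allred scores (toy data, not the study's).
        from sklearn.metrics import cohen_kappa_score

        pathologist = [0, 2, 3, 5, 7, 8, 8, 6]
        algorithm   = [0, 2, 4, 5, 7, 8, 7, 6]

        kappa = cohen_kappa_score(pathologist, algorithm)
        qwk   = cohen_kappa_score(pathologist, algorithm, weights="quadratic")
        print(f"kappa={kappa:.3f}, quadratic weighted kappa={qwk:.3f}")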

  8. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    Science.gov (United States)

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.
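
    NeuronMetrics itself is a set of ImageJ modules; as a language-agnostic illustration of the skeletonization step it relies on, a minimal Python sketch (assuming scikit-image) might look like this:

        # Binarize a fluorescence image and skeletonize it; total neurite length
        # can then be approximated from the skeleton pixel count.
        from skimage.filters import threshold_otsu
        from skimage.morphology import skeletonize

        def skeleton_from_image(gray_image):
            binary = gray_image > threshold_otsu(gray_image)
            return skeletonize(binary)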

  9. Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf

    Science.gov (United States)

    Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.

    2017-12-01

    We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found that strong correlations between backscatter intensity and sediment texture exist. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slope with high backscatter correspond to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. Our method worked well
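
    A minimal sketch of the combined unsupervised/supervised idea (not the exact USGS/NOAA workflow): cluster slope and backscatter grids first, then refine with a supervised classifier trained on manually labelled pixels. The cluster count and classifier choice here are assumptions:

        import numpy as np
        from sklearn.cluster import KMeans
        from sklearn.ensemble import RandomForestClassifier

        def classify_seafloor(slope, backscatter, labels=None, labeled_mask=None, n_clusters=4):
            """Unsupervised clustering, optionally refined by supervised classification."""
            X = np.column_stack([slope.ravel(), backscatter.ravel()])
            clusters = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
            if labels is None or labeled_mask is None:
                return clusters.reshape(slope.shape)
            train = labeled_mask.ravel()                     # boolean mask of labelled pixels
            clf = RandomForestClassifier(n_estimators=100)
            clf.fit(X[train], labels.ravel()[train])
            return clf.predict(X).reshape(slope.shape)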

  10. Evaluation and optimisation of preparative semi-automated electrophoresis systems for Illumina library preparation.

    Science.gov (United States)

    Quail, Michael A; Gu, Yong; Swerdlow, Harold; Mayho, Matthew

    2012-12-01

    Size selection can be a critical step in preparation of next-generation sequencing libraries. Traditional methods employing gel electrophoresis lack reproducibility, are labour intensive, do not scale well and employ hazardous interchelating dyes. In a high-throughput setting, solid-phase reversible immobilisation beads are commonly used for size-selection, but result in quite a broad fragment size range. We have evaluated and optimised the use of two semi-automated preparative DNA electrophoresis systems, the Caliper Labchip XT and the Sage Science Pippin Prep, for size selection of Illumina sequencing libraries. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Quality of Radiomic Features in Glioblastoma Multiforme: Impact of Semi-Automated Tumor Segmentation Software.

    Science.gov (United States)

    Lee, Myungeun; Woo, Boyeong; Kuo, Michael D; Jamshidi, Neema; Kim, Jong Hyo

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.8). Many features showed acceptable NDR (≥ 1), while above 35% of the texture features showed poor NDR (< 1). Both software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  12. Contaminant analysis automation, an overview

    International Nuclear Information System (INIS)

    Hollen, R.; Ramos, O. Jr.

    1996-01-01

    To meet the environmental restoration and waste minimization goals of government and industry, several government laboratories, universities, and private companies have formed the Contaminant Analysis Automation (CAA) team. The goal of this consortium is to design and fabricate robotics systems that standardize and automate the hardware and software of the most common environmental chemical methods. In essence, the CAA team takes conventional, regulatory-approved (EPA Methods) chemical analysis processes and automates them. The automation consists of standard laboratory modules (SLMs) that perform the work in a much more efficient, accurate, and cost-effective manner

  13. Shape indexes for semi-automated detection of windbreaks in thematic tree cover maps from the central United States

    Science.gov (United States)

    Greg C. Liknes; Dacia M. Meneguzzo; Todd A. Kellerman

    2017-01-01

    Windbreaks are an important ecological resource across the large expanse of agricultural land in the central United States and are often planted in straight-line or L-shaped configurations to serve specific functions. As high-resolution (i.e., <5 m) land cover datasets become more available for these areas, semi- or fully-automated methods for distinguishing...

  14. Applicability Of A Semi-Automated Clinical Chemistry Analyzer In Determining The Antioxidant Concentrations Of Selected Plants

    OpenAIRE

    Allan L. Hilario; Phylis C. Rio; Geraldine Susan C. Tengco; Danilo M. Menorca

    2017-01-01

    Plants are rich sources of antioxidants that are protective against diseases associated with oxidative stress. There is a need for a high-throughput screening method that should be useful in determining the antioxidant concentration in plants. Such a screening method should significantly simplify and speed up most antioxidant assays. This paper aimed at comparing the applicability of a semi-automated clinical chemistry analyzer (Pointe Scientific, MI, USA) with the traditional standard curve method and...

  15. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

    to detect equipment failure and identify defective products at the assembly line. The research work in this thesis combines machine vision and image processing technology to build a digital imaging and processing system for monitoring and measuring lake ice thickness in real time. An ultra-compact USB camera is programmed to acquire and transmit high resolution imagery for processing with MATLAB Image Processing toolbox. The image acquisition and transmission process is fully automated; image analysis is semi-automated and requires limited user input. Potential design changes to the prototype and ideas on fully automating the imaging and processing procedure are presented to conclude this research work.

  16. Semi-automated 86Y purification using a three-column system

    International Nuclear Information System (INIS)

    Park, Luke S.; Szajek, Lawrence P.; Wong, Karen J.; Plascjak, Paul S.; Garmestani, Kayhan; Googins, Shawn; Eckelman, William C.; Carrasquillo, Jorge A.; Paik, Chang H.

    2004-01-01

    The separation of 86Y from 86Sr was optimized by a semi-automated purification system involving the passage of the target sample through three sequential columns. The target material was dissolved in 4 N HNO3 and loaded onto a Sr-selective (Sr-Spec) column to retain the 86Sr. The yttrium was eluted with 4 N HNO3 onto the second Y-selective (RE-Spec) column with quantitative retention. The RE-Spec column was eluted with a stepwise decreasing concentration of HNO3 to wash out potential metallic impurities to a waste container. The eluate was then pumped onto an Aminex A5 column with 0.1 N HCl and finally with 3 N HCl to collect the radioyttrium in 0.6-0.8 mL with a >80% recovery. This method enabled us to decontaminate Sr by 250,000 times and label 30 μg of DOTA-Biotin with a >95% yield.

  17. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years

  18. Semi-automated detection of fractional shortening in zebrafish embryo heart videos

    Directory of Open Access Journals (Sweden)

    Nasrat Sara

    2016-09-01

    Full Text Available Quantifying cardiac functions in model organisms like embryonic zebrafish is of high importance in small molecule screens for new therapeutic compounds. One relevant cardiac parameter is the fractional shortening (FS). A method for semi-automatic quantification of FS in video recordings of zebrafish embryo hearts is presented. The software provides automated visual information about the end-systolic and end-diastolic stages of the heart by displaying corresponding colored lines into a Motion-mode display. After manually marking the ventricle diameters in frames of end-systolic and end-diastolic stages, the FS is calculated. The software was evaluated by comparing the results of the determination of FS with results obtained from another established method. Correlations of 0.96 < r < 0.99 between the two methods were found indicating that the new software provides comparable results for the determination of the FS.
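
    Fractional shortening follows directly from the two marked diameters; a one-line helper (standard definition, placeholder values) makes the calculation explicit:

        def fractional_shortening(end_diastolic_diameter, end_systolic_diameter):
            """FS (%) = (EDD - ESD) / EDD * 100."""
            return (end_diastolic_diameter - end_systolic_diameter) / end_diastolic_diameter * 100.0

        print(fractional_shortening(1.00, 0.65))  # 35.0 (percent)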

  19. Semi-Automated Discovery of Application Session Structure

    Energy Technology Data Exchange (ETDEWEB)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
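
    The statistical intuition (connections within a session arrive close together and roughly as a Poisson process, while gaps between sessions are long) can be sketched as below; the gap threshold and the exponential goodness-of-fit test are illustrative assumptions, not the authors' algorithm:

        import numpy as np
        from scipy import stats

        def split_into_sessions(connection_times, gap_threshold):
            """Group sorted connection start times into sessions by a simple gap rule."""
            times = np.sort(np.asarray(connection_times, dtype=float))
            breaks = np.where(np.diff(times) > gap_threshold)[0]
            return np.split(times, breaks + 1)

        def exponential_fit_pvalue(inter_arrivals):
            """Kolmogorov-Smirnov p-value for an exponential fit to inter-arrival times."""
            scale = float(np.mean(inter_arrivals))
            return stats.kstest(inter_arrivals, "expon", args=(0, scale)).pvalue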

  20. Automated metabolic gas analysis systems: a review.

    Science.gov (United States)

    Macfarlane, D J

    2001-01-01

    The use of automated metabolic gas analysis systems or metabolic measurement carts (MMC) in exercise studies is common throughout the industrialised world. They have become essential tools for diagnosing many hospital patients, especially those with cardiorespiratory disease. Moreover, the measurement of maximal oxygen uptake (VO2max) is routine for many athletes in fitness laboratories and has become a de facto standard in spite of its limitations. The development of metabolic carts has also facilitated the noninvasive determination of the lactate threshold and cardiac output, respiratory gas exchange kinetics, as well as studies of outdoor activities via small portable systems that often use telemetry. Although the fundamental principles behind the measurement of oxygen uptake (VO2) and carbon dioxide production (VCO2) have not changed, the techniques used have, and indeed, some have almost turned through a full circle. Early scientists often employed a manual Douglas bag method together with separate chemical analyses, but the need for faster and more efficient techniques fuelled the development of semi- and fully-automated systems by private and commercial institutions. Yet, recently some scientists are returning to the traditional Douglas bag or Tissot-spirometer methods, or are using less complex automated systems to not only save capital costs, but also to have greater control over the measurement process. Over the last 40 years, a considerable number of automated systems have been developed, with over a dozen commercial manufacturers producing in excess of 20 different automated systems. The validity and reliability of all these different systems is not well known, with relatively few independent studies having been published in this area. For comparative studies to be possible and to facilitate greater consistency of measurements in test-retest or longitudinal studies of individuals, further knowledge about the performance characteristics of these systems is needed.

  1. Semi-automated preparation of the dopamine transporter ligand [18F]FECNT for human PET imaging studies

    International Nuclear Information System (INIS)

    Voll, Ronald J.; McConathy, Jonathan; Waldrep, Michael S.; Crowe, Ronald J.; Goodman, Mark M.

    2005-01-01

    The fluorine-18 labeled dopamine transporter (DAT) ligand 2β-carbomethoxy-3β-(4-chlorophenyl)-8-(2-fluoroethyl)nortropane (FECNT) has shown promising properties as an in vivo DAT imaging agent in human and monkey PET studies. A semi-automated synthesis has been developed to reliably produce [18F]FECNT in a 16% decay-corrected yield. This method utilizes a new [18F]fluoroalkylating agent and provides high purity [18F]FECNT in a formulation suitable for human use.

  2. Parameterization of Mixed Layer and Deep-Ocean Mesoscales Including Nonlinearity

    Science.gov (United States)

    Canuto, V. M.; Cheng, Y.; Dubovikov, M. S.; Howard, A. M.; Leboissetier, A.

    2018-01-01

    In 2011, Chelton et al. carried out a comprehensive census of mesoscales using altimetry data and reached the following conclusions: "essentially all of the observed mesoscale features are nonlinear" and "mesoscales do not move with the mean velocity but with their own drift velocity," which is "the most germane of all the nonlinear metrics." Accounting for these results in a mesoscale parameterization presents conceptual and practical challenges since linear analysis is no longer usable and one needs a model of nonlinearity. A mesoscale parameterization is presented that has the following features: 1) it is based on the solutions of the nonlinear mesoscale dynamical equations, 2) it describes arbitrary tracers, 3) it includes adiabatic (A) and diabatic (D) regimes, 4) the eddy-induced velocity is the sum of a Gent and McWilliams (GM) term plus a new term representing the difference between drift and mean velocities, 5) the new term lowers the transfer of mean potential energy to mesoscales, 6) the isopycnal slopes are not as flat as in the GM case, 7) deep-ocean stratification is enhanced compared to previous parameterizations where being more weakly stratified allowed a large heat uptake that is not observed, 8) the strength of the Deacon cell is reduced. The numerical results are from a stand-alone ocean code with Coordinated Ocean-Ice Reference Experiment I (CORE-I) normal-year forcing.
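
    For reference, the GM term mentioned in point 4 is commonly written in terms of an eddy transfer coefficient and the isopycnal slope; this is the standard textbook form, not the authors' full nonlinear closure:

        % Standard Gent-McWilliams eddy-induced (bolus) velocity
        \[
          \mathbf{u}^{*} = -\,\frac{\partial}{\partial z}\bigl(\kappa\,\mathbf{S}\bigr),
          \qquad
          w^{*} = \nabla_{h}\!\cdot\bigl(\kappa\,\mathbf{S}\bigr),
          \qquad
          \mathbf{S} = -\,\frac{\nabla_{h}\rho}{\partial\rho/\partial z},
        \]
        % where \kappa is the eddy transfer coefficient and \mathbf{S} the isopycnal slope.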

  3. Thermally forced mesoscale atmospheric flow over complex terrain in Southern Italy

    International Nuclear Information System (INIS)

    Baldi, M.; Colacino, M.; Dalu, G. A.; Piervitali, E.; Ye, Z.

    1998-01-01

    In this paper the Authors discuss some results concerning the analysis of the local atmospheric flow over the southern part of Italy, the peninsula of Calabria, using a mesoscale numerical model. Our study is focused on two different but related topics: a detailed analysis of the meteorology and climate of the region based on a data collection, reported in Colacino et al., 'Elementi di Climatologia della Calabria', edited by A. Guerrini, in the series P. S., 'Clima, Ambiente e Territorio nel Mezzogiorno' (CNR, Rome) 1997, pp. 218, and an analysis of the results based on the simulated flow produced using a mesoscale numerical model. The Colorado State University mesoscale numerical model has been applied to study several different climatic situations of particular interest for the region, as discussed in this paper

  4. Thermally forced mesoscale atmospheric flow over complex terrain in Southern Italy

    Energy Technology Data Exchange (ETDEWEB)

    Baldi, M.; Colacino, M.; Dalu, G. A.; Piervitali, E.; Ye, Z. [CNR, Rome (Italy). Ist. di Fisica dell'Atmosfera]

    1998-07-01

    In this paper the Authors discuss some results concerning the analysis of the local atmospheric flow over the southern part of Italy, the peninsula of Calabria, using a mesoscale numerical model. Our study is focused on two different but related topics: a detailed analysis of the meteorology and climate of the region based on a data collection, reported in Colacino et al., 'Elementi di Climatologia della Calabria', edited by A. Guerrini, in the series P. S., 'Clima, Ambiente e Territorio nel Mezzogiorno' (CNR, Rome) 1997, pp. 218, and an analysis of the results based on the simulated flow produced using a mesoscale numerical model. The Colorado State University mesoscale numerical model has been applied to study several different climatic situations of particular interest for the region, as discussed in this paper.

  5. Scaling up Ecological Measurements of Coral Reefs Using Semi-Automated Field Image Collection and Analysis

    Directory of Open Access Journals (Sweden)

    Manuel González-Rivero

    2016-01-01

    Full Text Available Ecological measurements in marine settings are often constrained in space and time, with spatial heterogeneity obscuring broader generalisations. While advances in remote sensing, integrative modelling and meta-analysis enable generalisations from field observations, there is an underlying need for high-resolution, standardised and geo-referenced field data. Here, we evaluate a new approach aimed at optimising data collection and analysis to assess broad-scale patterns of coral reef community composition using automatically annotated underwater imagery, captured along 2 km transects. We validate this approach by investigating its ability to detect spatial (e.g., across regions) and temporal (e.g., over years) change, and by comparing automated annotation errors to those of multiple human annotators. Our results indicate that change of coral reef benthos can be captured at high resolution both spatially and temporally, with an average error below 5%, among key benthic groups. Cover estimation errors using automated annotation varied between 2% and 12%, slightly larger than human errors (which varied between 1% and 7%), but small enough to detect significant changes among dominant groups. Overall, this approach allows a rapid collection of in-situ observations at larger spatial scales (km) than previously possible, and provides a pathway to link, calibrate, and validate broader analyses across even larger spatial scales (10–10,000 km2).
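
    Point-based cover estimates and the cover-estimation error quoted above reduce to simple proportions; the toy labels below are placeholders, not the study's data:

        from collections import Counter

        def percent_cover(labels):
            counts = Counter(labels)
            total = sum(counts.values())
            return {group: 100.0 * n / total for group, n in counts.items()}

        auto  = ["coral", "coral", "algae", "sand", "coral", "algae"]
        human = ["coral", "coral", "algae", "sand", "sand",  "algae"]
        auto_cover, human_cover = percent_cover(auto), percent_cover(human)
        error = {g: abs(auto_cover.get(g, 0.0) - human_cover.get(g, 0.0))
                 for g in set(auto_cover) | set(human_cover)}
        print(error)  # absolute cover error per benthic group, in percentage points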

  6. Analysis of new bone, cartilage, and fibrosis tissue in healing murine allografts using whole slide imaging and a new automated histomorphometric algorithm

    OpenAIRE

    Zhang, Longze; Chang, Martin; Beck, Christopher A; Schwarz, Edward M; Boyce, Brendan F

    2016-01-01

    Histomorphometric analysis of histologic sections of normal and diseased bone samples, such as healing allografts and fractures, is widely used in bone research. However, the utility of traditional semi-automated methods is limited because they are labor-intensive and can have high interobserver variability depending upon the parameters being assessed, and primary data cannot be re-analyzed automatically. Automated histomorphometry has long been recognized as a solution for these issues, and ...

  7. Automated analysis of small animal PET studies through deformable registration to an atlas

    International Nuclear Information System (INIS)

    Gutierrez, Daniel F.; Zaidi, Habib

    2012-01-01

    This work aims to develop a methodology for automated atlas-guided analysis of small animal positron emission tomography (PET) data through deformable registration to an anatomical mouse model. A non-rigid registration technique is used to put into correspondence relevant anatomical regions of rodent CT images from combined PET/CT studies to corresponding CT images of the Digimouse anatomical mouse model. The latter provides a pre-segmented atlas consisting of 21 anatomical regions suitable for automated quantitative analysis. Image registration is performed using a package based on the Insight Toolkit allowing the implementation of various image registration algorithms. The optimal parameters obtained for deformable registration were applied to simulated and experimental mouse PET/CT studies. The accuracy of the image registration procedure was assessed by segmenting mouse CT images into seven regions: brain, lungs, heart, kidneys, bladder, skeleton and the rest of the body. This was accomplished prior to image registration using a semi-automated algorithm. Each mouse segmentation was transformed using the parameters obtained during CT to CT image registration. The resulting segmentation was compared with the original Digimouse atlas to quantify image registration accuracy using established metrics such as the Dice coefficient and Hausdorff distance. PET images were then transformed using the same technique and automated quantitative analysis of tracer uptake performed. The Dice coefficient and Hausdorff distance show fair to excellent agreement and a mean registration mismatch distance of about 6 mm. The results demonstrate good quantification accuracy in most of the regions, especially the brain, but not in the bladder, as expected. Normalized mean activity estimates were preserved between the reference and automated quantification techniques with relative errors below 10 % in most of the organs considered. The proposed automated quantification technique is
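
    The two agreement metrics used above are straightforward to compute on binary masks; this sketch (illustrative, not the study's pipeline) assumes NumPy and SciPy:

        import numpy as np
        from scipy.spatial.distance import directed_hausdorff

        def dice_coefficient(mask_a, mask_b):
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

        def hausdorff_distance(mask_a, mask_b):
            pts_a, pts_b = np.argwhere(mask_a), np.argwhere(mask_b)
            return max(directed_hausdorff(pts_a, pts_b)[0],
                       directed_hausdorff(pts_b, pts_a)[0])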

  8. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Woo, B; Kim, J [Seoul National University, Seoul (Korea, Republic of); Jamshidi, N; Kuo, M [UCLA School of Medicine, Los Angeles, CA (United States)

    2015-06-15

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used to segment contrast enhancement, necrosis and edema regions by two independent observers. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficient of variation of imaging features with the deformable model ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, and the grow cut method ranged from 0.799 to 0.976 and 3.5% to 26.6%, respectively. Coefficient of variation for especially important features which were previously reported as predictive of patient survival were: 3.4% with deformable model and 7.4% with grow cut method for the proportion of contrast enhanced tumor region; 5.5% with deformable model and 25.7% with grow cut method for the proportion of necrosis; and 2.1% with deformable model and 4.4% with grow cut method for edge sharpness of tumor on CE-T1W1. Conclusion: Comparison of two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric Brain MRI.

  9. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    International Nuclear Information System (INIS)

    Lee, M; Woo, B; Kim, J; Jamshidi, N; Kuo, M

    2015-01-01

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used to segment contrast enhancement, necrosis and edema regions by two independent observers. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficient of variation of imaging features with the deformable model ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, and the grow cut method ranged from 0.799 to 0.976 and 3.5% to 26.6%, respectively. Coefficient of variation for especially important features which were previously reported as predictive of patient survival were: 3.4% with deformable model and 7.4% with grow cut method for the proportion of contrast enhanced tumor region; 5.5% with deformable model and 25.7% with grow cut method for the proportion of necrosis; and 2.1% with deformable model and 4.4% with grow cut method for edge sharpness of tumor on CE-T1W1. Conclusion: Comparison of two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric Brain MRI

  10. A High Throughput, 384-Well, Semi-Automated, Hepatocyte Intrinsic Clearance Assay for Screening New Molecular Entities in Drug Discovery.

    Science.gov (United States)

    Heinle, Lance; Peterkin, Vincent; de Morais, Sonia M; Jenkins, Gary J; Badagnani, Ilaria

    2015-01-01

    A high throughput, semi-automated clearance screening assay in hepatocytes was developed allowing a scientist to generate data for 96 compounds in one week. The 384-well format assay utilizes a Thermo Multidrop Combi and an optimized LC-MS/MS method. The previously reported LCMS/ MS method reduced the analytical run time by 3-fold, down to 1.2 min injection-to-injection. The Multidrop was able to deliver hepatocytes to 384-well plates with minimal viability loss. Comparison of results from the new 384-well and historical 24-well assays yielded a correlation of 0.95. In addition, results obtained for 25 marketed drugs with various metabolism pathways had a correlation of 0.75 when compared with literature values. Precision was maintained in the new format as 8 compounds tested in ≥39 independent experiments had coefficients of variation ≤21%. The ability to predict in vivo clearances using the new stability assay format was also investigated using 22 marketed drugs and 26 AbbVie compounds. Correction of intrinsic clearance values with binding to hepatocytes (in vitro data) and plasma (in vivo data) resulted in a higher in vitro to in vivo correlation when comparing 22 marketed compounds in human (0.80 vs 0.35) and 26 AbbVie Discovery compounds in rat (0.56 vs 0.17), demonstrating the importance of correcting for binding in clearance studies. This newly developed high throughput, semi-automated clearance assay allows for rapid screening of Discovery compounds to enable Structure Activity Relationship (SAR) analysis based on high quality hepatocyte stability data in sufficient quantity and quality to drive the next round of compound synthesis.
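
    The binding correction referred to above is usually applied through a well-stirred liver model; the expression below is the standard textbook form and is only assumed to approximate the study's scaling approach:

        % Well-stirred model with unbound-fraction corrections:
        % Q_h = hepatic blood flow, f_{u,p} = unbound fraction in plasma,
        % f_{u,inc} = unbound fraction in the hepatocyte incubation.
        \[
          CL_{h} \;=\;
          \frac{Q_{h}\, f_{u,p}\,\bigl(CL_{int}/f_{u,inc}\bigr)}
               {Q_{h} \;+\; f_{u,p}\,\bigl(CL_{int}/f_{u,inc}\bigr)}
        \]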

  11. Semi-automated De-identification of German Content Sensitive Reports for Big Data Analytics.

    Science.gov (United States)

    Seuss, Hannes; Dankerl, Peter; Ihle, Matthias; Grandjean, Andrea; Hammon, Rebecca; Kaestle, Nicola; Fasching, Peter A; Maier, Christian; Christoph, Jan; Sedlmayr, Martin; Uder, Michael; Cavallaro, Alexander; Hammon, Matthias

    2017-07-01

    Purpose: Projects involving collaborations between different institutions require data security via selective de-identification of words or phrases. A semi-automated de-identification tool was developed and evaluated on different types of medical reports natively and after adapting the algorithm to the text structure. Materials and Methods: A semi-automated de-identification tool was developed and evaluated for its sensitivity and specificity in detecting sensitive content in written reports. Data from 4671 pathology reports (4105 + 566 in two different formats), 2804 medical reports, 1008 operation reports, and 6223 radiology reports of 1167 patients suffering from breast cancer were de-identified. The content was itemized into four categories: direct identifiers (name, address), indirect identifiers (date of birth/operation, medical ID, etc.), medical terms, and filler words. The software was tested natively (without training) in order to establish a baseline. The reports were manually edited and the model re-trained for the next test set. After manually editing 25, 50, 100, 250, 500 and if applicable 1000 reports of each type, re-training was applied. Results: In the native test, 61.3 % of direct and 80.8 % of the indirect identifiers were detected. The performance (P) increased to 91.4 % (P25), 96.7 % (P50), 99.5 % (P100), 99.6 % (P250), 99.7 % (P500) and 100 % (P1000) for direct identifiers and to 93.2 % (P25), 97.9 % (P50), 97.2 % (P100), 98.9 % (P250), 99.0 % (P500) and 99.3 % (P1000) for indirect identifiers. Without training, 5.3 % of medical terms were falsely flagged as critical data. The performance increased, after training, to 4.0 % (P25), 3.6 % (P50), 4.0 % (P100), 3.7 % (P250), 4.3 % (P500), and 3.1 % (P1000). Roughly 0.1 % of filler words were falsely flagged. Conclusion: Training of the developed de-identification tool continuously improved its performance. Training with roughly 100 edited

  12. Defining Mediterranean and Black Sea biogeochemical subprovinces and synthetic ocean indicators using mesoscale oceanographic features

    DEFF Research Database (Denmark)

    Nieblas, Anne-Elise; Drushka, Kyla; Reygondeau, Gabriel

    2014-01-01

    The Mediterranean and Black Seas are semi-enclosed basins characterized by high environmental variability and growing anthropogenic pressure. This has led to an increasing need for a bioregionalization of the oceanic environment at local and regional scales that can be used for managerial applications as a geographical reference. We aim to identify biogeochemical subprovinces within this domain, and develop synthetic indices of the key oceanographic dynamics of each subprovince to quantify baselines from which to assess variability and change. To do this, we compile a data set of 101 months of ... variables to define integrative indices to monitor the environmental changes within each resultant subprovince at monthly resolutions. Using both the classical and mesoscale features, we find five biogeochemical subprovinces for the Mediterranean and Black Seas. Interestingly, the use of mesoscale variables...

  13. Simple heuristics: A bridge between manual core design and automated optimization methods

    International Nuclear Information System (INIS)

    White, J.R.; Delmolino, P.M.

    1993-01-01

    The primary function of RESCUE is to serve as an aid in the analysis and identification of feasible loading patterns for LWR reload cores. The unique feature of RESCUE is that its physics model is based on some recent advances in generalized perturbation theory (GPT) methods. The high order GPT techniques offer the accuracy, computational efficiency, and flexibility needed for the implementation of a full range of capabilities within a set of compatible interactive (manual and semi-automated) and automated design tools. The basic design philosophy and current features within RESCUE are reviewed, and the new semi-automated capability is highlighted. The online advisor facility appears quite promising and it provides a natural bridge between the traditional trial-and-error manual process and the recent progress towards fully automated optimization sequences. (orig.)

  14. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RCrane is a new tool for the partially automated building of RNA crystallographic models into electron-density maps of low or intermediate resolution. This tool helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems

  15. SAMPO 90 - High resolution interactive gamma spectrum analysis including automation with macros

    International Nuclear Information System (INIS)

    Aarnio, P.A.; Nikkinen, M.T.; Routti, J.T.

    1991-01-01

    SAMPO 90 is a high performance gamma spectrum analysis program for personal computers. It uses high resolution color graphics to display calibrations, spectra, fitting results as multiplet components, and analysis results. All the analysis phases can be done either under full interactive user control or by using macros for automated measurement and analysis sequences including the control of MCAs and sample changers. Semi-automated calibrations for peak shapes (Gaussian with exponential tails), detector efficiency, and energy are available with a possibility for user intervention through interactive graphics. Accurate peak area determination of even the most complex multiplets, of up to 32 components, is accomplished using linear, non-linear and mixed mode fitting, where the component energies and areas can be either frozen or allowed to float in arbitrary combinations. Nuclide identification is done using associated lines techniques which allow interference correction for fully overlapping peaks. Peaked Background Subtraction can be performed and Minimum Detectable Activities calculated. Attenuation corrections can be taken into account in detector efficiency calculation. The most common PC-based MCA spectrum formats (Canberra S100, Ortec ACE, Nucleus PCA, ND AccuSpec) are supported as well as ASCII spectrum files. A gamma-line library is included together with an editor for user configurable libraries. The analysis reports and program parameters are fully customizable. Function key macros can be used to automate the most common analysis procedures. Small batch type modules are additionally available for routine work. SAMPO 90 is a result of over twenty man years of programming and contains 25,000 lines of Fortran, 10,000 lines of C, and 12,000 lines of assembler

  16. Semi-automated potentiometric titration method for uranium characterization

    Energy Technology Data Exchange (ETDEWEB)

    Cristiano, B.F.G., E-mail: barbara@ird.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Delgado, J.U.; Silva, J.W.S. da; Barros, P.D. de; Araujo, R.M.S. de [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear (PEN/COPPE), Universidade Federal do Rio de Janeiro (UFRJ), Ilha do Fundao, PO Box 68509, Rio de Janeiro, 21945-970 RJ (Brazil)

    2012-07-15

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. - Highlights: • We developed a semi-automatic version of the potentiometric titration method. • The method is used for certification and characterization of uranium compounds. • The traceability of the method was assured by a K2Cr2O7 primary standard. • The results for the U3O8 reference material analyzed were consistent with the certified value. • The uncertainty obtained, near 0.01%, is useful for characterization purposes.

  17. Unmet needs in automated cytogenetics

    International Nuclear Information System (INIS)

    Bender, M.A.

    1976-01-01

    Though some, at least, of the goals of automation systems for analysis of clinical cytogenetic material seem either at hand, like automatic metaphase finding, or at least likely to be met in the near future, like operator-assisted semi-automatic analysis of banded metaphase spreads, important areas of cytogenetic analysis, most importantly the determination of chromosomal aberration frequencies in populations of cells or in samples of cells from people exposed to environmental mutagens, await practical methods of automation. Important as are the clinical diagnostic applications, it is apparent that increasing concern over the clastogenic effects of the multitude of potentially clastogenic chemical and physical agents to which human populations are being increasingly exposed, and the resulting emergence of extensive cytogenetic testing protocols, makes the development of automation not only economically feasible but almost mandatory. The nature of the problems involved, and actual or possible approaches to their solution, are discussed.

  18. Evaluation of automated analysis of 15N and total N in plant material and soil

    DEFF Research Database (Denmark)

    Jensen, E.S.

    1991-01-01

    Simultaneous determination of N-15 and total N using an automated nitrogen analyser interfaced to a continuous-flow isotope ratio mass spectrometer (ANA-MS method) was evaluated. The coefficient of variation (CV) of repeated analyses of homogeneous standards and samples at natural abundance...... was lower than 0.1%. The CV of repeated analyses of N-15-labelled plant material and soil samples varied between 0.3% and 1.1%. The reproducibility of repeated total N analyses using the automated method was comparable to results obtained with a semi-micro Kjeldahl procedure. However, the automated method...... analysis showed that the recovery of inorganic N in the NH3 trap was lower when the N was diffused from water than from 2 M KCl. The results also indicated that different proportions of the NO3- and the NH4+ in aqueous solution were recovered in the trap after combined diffusion. The method is most suited...

  19. Semi-automated contour recognition using DICOMautomaton

    International Nuclear Information System (INIS)

    Clark, H; Duzenli, C; Wu, J; Moiseenko, V; Lee, R; Gill, B; Thomas, S

    2014-01-01

    Purpose: A system has been developed which recognizes and classifies Digital Imaging and Communication in Medicine contour data with minimal human intervention. It allows researchers to overcome obstacles which tax analysis and mining systems, including inconsistent naming conventions and differences in data age or resolution. Methods: Lexicographic and geometric analysis is used for recognition. Well-known lexicographic methods implemented include Levenshtein-Damerau, bag-of-characters, Double Metaphone, Soundex, and (word and character)-N-grams. Geometrical implementations include 3D Fourier Descriptors, probability spheres, boolean overlap, simple feature comparison (e.g. eccentricity, volume) and rule-based techniques. Both analyses implement custom, domain-specific modules (e.g. emphasis differentiating left/right organ variants). Contour labels from 60 head and neck patients are used for cross-validation. Results: Mixed-lexicographical methods show an effective improvement in more than 10% of recognition attempts compared with a pure Levenshtein-Damerau approach when withholding 70% of the lexicon. Domain-specific and geometrical techniques further boost performance. Conclusions: DICOMautomaton allows users to recognize contours semi-automatically. As usage increases and the lexicon is filled with additional structures, performance improves, increasing the overall utility of the system.
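
    As a small illustration of one of the lexicographic measures listed above, here is a plain dynamic-programming Levenshtein distance (the contour names are hypothetical examples; DICOMautomaton combines several such measures with geometric ones):

        def levenshtein(a: str, b: str) -> int:
            prev = list(range(len(b) + 1))
            for i, ca in enumerate(a, start=1):
                curr = [i]
                for j, cb in enumerate(b, start=1):
                    curr.append(min(prev[j] + 1,                 # deletion
                                    curr[j - 1] + 1,             # insertion
                                    prev[j - 1] + (ca != cb)))   # substitution
                prev = curr
            return prev[-1]

        print(levenshtein("lt parotid", "parotid_l"))  # edit distance between two label variants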

  20. Comparison of semi-automated center-dot and fully automated endothelial cell analyses from specular microscopy images.

    Science.gov (United States)

    Maruoka, Sachiko; Nakakura, Shunsuke; Matsuo, Naoko; Yoshitomi, Kayo; Katakami, Chikako; Tabuchi, Hitoshi; Chikama, Taiichiro; Kiuchi, Yoshiaki

    2017-10-30

    To evaluate two specular microscopy analysis methods across different endothelial cell densities (ECDs). Endothelial images of one eye from each of 45 patients were taken by using three different specular microscopes (three replicates each). To determine the consistency of the center-dot method, we compared SP-6000 and SP-2000P images. CME-530 and SP-6000 images were compared to assess the consistency of the fully automated method. The SP-6000 images from the two methods were compared. Intraclass correlation coefficients (ICCs) for the three measurements were calculated, and parametric multiple comparisons tests and Bland-Altman analysis were performed. The ECD mean value was 2425 ± 883 (range 516-3707) cells/mm2. ICC values were > 0.9 for all three microscopes for ECD, but the coefficients of variation (CVs) were 0.3-0.6. For ECD measurements, Bland-Altman analysis revealed that the mean difference was 42 cells/mm2 between the SP-2000P and SP-6000 for the center-dot method; 57 cells/mm2 between the SP-6000 measurements from both methods; and -5 cells/mm2 between the SP-6000 and CME-530 for the fully automated method (95% limits of agreement: -201 to 284 cells/mm2, -410 to 522 cells/mm2, and -327 to 318 cells/mm2, respectively). For CV measurements, the mean differences were -3, -12, and 13% (95% limits of agreement -18 to 11, -26 to 2, and -5 to 32%, respectively). Despite using three replicate measurements, the precision of the center-dot method with the SP-2000P and SP-6000 software was only ± 10% for ECD data and was even worse for the fully automated method. Japan Clinical Trials Register (http://www.umin.ac.jp/ctr/index/htm9) number UMIN 000015236.
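
    The Bland-Altman quantities reported above (bias and 95% limits of agreement) come from the paired differences; a short sketch with placeholder ECD values:

        import numpy as np

        def bland_altman(x, y):
            diff = np.asarray(x, float) - np.asarray(y, float)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, (bias - half_width, bias + half_width)

        bias, (lo, hi) = bland_altman([2400, 2500, 2300], [2380, 2450, 2350])
        print(f"bias={bias:.0f} cells/mm2, 95% limits of agreement=({lo:.0f}, {hi:.0f})")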

  1. Automated Motion Estimation for 2D Cine DENSE MRI

    Science.gov (United States)

    Gilliam, Andrew D.; Epstein, Frederick H.

    2013-01-01

    Cine displacement encoding with stimulated echoes (DENSE) is a magnetic resonance (MR) method that directly encodes tissue displacement into MR phase images. This technique has successfully interrogated many forms of tissue motion, but is most commonly used to evaluate cardiac mechanics. Currently, motion analysis from cine DENSE images requires manually delineated anatomical structures. An automated analysis would improve measurement throughput, simplify data interpretation, and potentially access important physiological information during the MR exam. In this article, we present the first fully automated solution for the estimation of tissue motion and strain from 2D cine DENSE data. Results using both simulated and human cardiac cine DENSE data indicate good agreement between the automated algorithm and the standard semi-manual analysis method. PMID:22575669
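    In DENSE, tissue displacement is encoded linearly into image phase, so (ignoring phase wrapping and noise) displacement can be recovered by dividing the phase by the encoding frequency. The snippet below is only a schematic illustration with an assumed encoding frequency and random phase data; the published method additionally handles phase unwrapping, tissue tracking and strain computation.

```python
import numpy as np

ke = 0.1  # assumed displacement-encoding frequency in cycles/mm (illustrative value)

# Hypothetical phase images (radians) for x- and y-encoding directions.
rng = np.random.default_rng(0)
phase_x = rng.uniform(-np.pi, np.pi, size=(128, 128))
phase_y = rng.uniform(-np.pi, np.pi, size=(128, 128))

# Displacement (mm) along each direction, using phase = 2*pi*ke*displacement.
disp_x = phase_x / (2 * np.pi * ke)
disp_y = phase_y / (2 * np.pi * ke)
print(disp_x.mean(), disp_y.mean())
```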

  2. Serum bactericidal assay for the evaluation of typhoid vaccine using a semi-automated colony-counting method.

    Science.gov (United States)

    Jang, Mi Seon; Sahastrabuddhe, Sushant; Yun, Cheol-Heui; Han, Seung Hyun; Yang, Jae Seung

    2016-08-01

    Typhoid fever, mainly caused by Salmonella enterica serovar Typhi (S. Typhi), is a life-threatening disease, mostly in developing countries. Enzyme-linked immunosorbent assay (ELISA) is widely used to quantify antibodies against S. Typhi in serum but does not provide information about functional antibody titers. Although the serum bactericidal assay (SBA) using an agar plate is often used to measure functional antibody titers against various bacterial pathogens in clinical specimens, it has rarely been used for typhoid vaccines because it is time-consuming and labor-intensive. In the present study, we established an improved SBA against S. Typhi using a semi-automated colony-counting system with a square agar plate harboring 24 samples. The semi-automated SBA efficiently measured bactericidal titers of sera from individuals immunized with S. Typhi Vi polysaccharide vaccines. The assay specifically responded to S. Typhi Ty2 but not to other irrelevant enteric bacteria including Vibrio cholerae and Shigella flexneri. Baby rabbit complement was a more appropriate source for the SBA against S. Typhi than complement from adult rabbit, guinea pig, or human. We also examined the correlation between SBA and ELISA for measuring antibody responses against S. Typhi using pre- and post-vaccination sera from 18 human volunteers. The SBA titer showed a good correlation with anti-Vi IgG quantity in the serum, as determined by a Spearman correlation coefficient of 0.737, suggesting that the semi-automated SBA can be used to measure functional antibody titers against S. Typhi in sera from human subjects immunized with typhoid vaccines. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  3. Reproducibility of Corneal Graft Thickness measurements with COLGATE in patients who have undergone DSAEK (Descemet Stripping Automated Endothelial Keratoplasty)

    Directory of Open Access Journals (Sweden)

    Wong Melissa HY

    2012-08-01

    Full Text Available Abstract Background The CorneaL GrAft Thickness Evaluation (COLGATE) system was recently developed to facilitate the evaluation of corneal graft thickness from OCT images. Graft thickness measurement can be a surrogate indicator for detecting graft failure or success. The purpose of this study was to determine the reproducibility of the COLGATE system in measuring DSAEK graft area between two observers. Methods This was a prospective case series in which 50 anterior segment OCT images of patients who had undergone DSAEK in either eye were analysed. Two observers (MW, AC) independently obtained the image analysis for the graft area using both the semi-automated and automated methods. One week later, each observer repeated the analysis for the same set of images. Bland-Altman analysis was performed to analyze inter- and intra-observer agreement. Results There was strong intraobserver correlation between the 2 semi-automated readings obtained by both observers (r = 0.936 and r = 0.962). Intraobserver ICC for observer 1 was 0.936 (95% CI 0.890 to 0.963) and 0.967 (95% CI 0.942 to 0.981) for observer 2. Likewise, there was also strong interobserver correlation (r = 0.913 and r = 0.969). The interobserver ICC for the first measurements was 0.911 (95% CI 0.849 to 0.949) and 0.968 (95% CI 0.945 to 0.982) for the second. There was a statistical difference between the automated and the semi-automated readings for both observers (p = 0.006, p = 0.003). The automated readings gave consistently higher values than the semi-automated readings, especially in thin grafts. Conclusion The analysis from the COLGATE programme can be reproducible between different observers. Care must be taken when interpreting the automated analysis as it tends to overestimate measurements.

  4. Semi-automated potentiometric titration method for uranium characterization.

    Science.gov (United States)

    Cristiano, B F G; Delgado, J U; da Silva, J W S; de Barros, P D; de Araújo, R M S; Lopes, R T

    2012-07-01

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. Copyright © 2011 Elsevier Ltd. All rights reserved.

  5. Automated sample analysis and remediation

    International Nuclear Information System (INIS)

    Hollen, R.; Settle, F.

    1995-01-01

    The Contaminant Analysis Automation Project is developing an automated chemical analysis system to address the current needs of the US Department of Energy (DOE). These needs focus on the remediation of large amounts of radioactive and chemically hazardous wastes stored, buried and still being processed at numerous DOE sites. This paper outlines the advantages of the system under development, and details the hardware and software design. A prototype system for characterizing polychlorinated biphenyls in soils is also described

  6. Semi-automated tabulation of the 3D topology and morphology of branching networks using CT: application to the airway tree

    International Nuclear Information System (INIS)

    Sauret, V.; Bailey, A.G.

    1999-01-01

    Detailed information on biological branching networks (optical nerves, airways or blood vessels) is often required to improve the analysis of 3D medical imaging data. A semi-automated algorithm has been developed to obtain the full 3D topology and dimensions (direction cosine, length, diameter, branching and gravity angles) of branching networks using their CT images. It has been tested using CT images of a simple Perspex branching network and applied to the CT images of a human cast of the airway tree. The morphology and topology of the computer-derived network were compared with the manually measured dimensions. Good agreement was found. The airway dimensions also compared well with values previously quoted in the literature. This algorithm can provide complete data set analysis much more quickly than manual measurements. Its use is limited by the CT resolution, which means that very small branches are not visible. New data are presented on the branching angles of the airway tree. (author)
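    A rough sketch of the geometric quantities tabulated by such an algorithm (branch length, direction cosines and branching angle), computed here from branch endpoint coordinates; the coordinates are invented, and the extraction of branch points from CT is not shown.

```python
import numpy as np

def branch_geometry(p_start, p_end):
    """Length and direction cosines of a branch given its endpoints (mm)."""
    v = np.asarray(p_end, float) - np.asarray(p_start, float)
    length = np.linalg.norm(v)
    return length, v / length  # direction cosines are the unit-vector components

def branching_angle(parent_dir, child_dir):
    """Angle (degrees) between a parent branch and one of its children."""
    c = np.clip(np.dot(parent_dir, child_dir), -1.0, 1.0)
    return np.degrees(np.arccos(c))

# Hypothetical endpoints of a parent branch and one daughter branch.
_, parent = branch_geometry((0, 0, 0), (0, 0, 20))
_, child = branch_geometry((0, 0, 20), (5, 0, 32))
print(branching_angle(parent, child))   # branching angle of the daughter
```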

  7. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  8. Semi-classical signal analysis

    KAUST Repository

    Laleg-Kirati, Taous-Meriem; Crépeau, Emmanuelle; Sorine, Michel

    2012-01-01

    This study introduces a new signal analysis method, based on a semi-classical approach. The main idea in this method is to interpret a pulse-shaped signal as a potential of a Schrödinger operator and then to use the discrete spectrum

  9. Semi-classical signal analysis

    KAUST Repository

    Laleg-Kirati, Taous-Meriem

    2012-09-30

    This study introduces a new signal analysis method, based on a semi-classical approach. The main idea in this method is to interpret a pulse-shaped signal as a potential of a Schrödinger operator and then to use the discrete spectrum of this operator for the analysis of the signal. We present some numerical examples and the first results obtained with this method on the analysis of arterial blood pressure waveforms. © 2012 Springer-Verlag London Limited.
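    A minimal numerical sketch of the idea: the pulse-shaped signal y(x) is used as the potential of the operator -h² d²/dx² - y(x), the operator is discretized by finite differences, and its negative eigenvalues (the discrete spectrum) are extracted. The pulse shape, grid and semi-classical parameter h below are arbitrary choices for illustration, not values from the study.

```python
import numpy as np

# Pulse-shaped signal used as the potential.
x = np.linspace(-10, 10, 400)
dx = x[1] - x[0]
y = np.exp(-x**2)          # toy pulse standing in for an arterial pressure beat
h = 0.5                    # semi-classical parameter (arbitrary)

# Finite-difference Schrodinger operator: -h^2 d^2/dx^2 - y(x).
n = len(x)
lap = (np.diag(-2.0 * np.ones(n)) + np.diag(np.ones(n - 1), 1)
       + np.diag(np.ones(n - 1), -1)) / dx**2
H = -h**2 * lap - np.diag(y)

eigvals = np.linalg.eigvalsh(H)
discrete_spectrum = eigvals[eigvals < 0]   # negative eigenvalues used for the analysis
print(discrete_spectrum)
```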

  10. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, only requesting user input when un-resolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical to ontological mappings in the form of WordNet, and Suggested Upper Merged Ontology (SUMO) integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts of speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user assisted semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural

  11. Mesoscale Connections Summer 2017

    Energy Technology Data Exchange (ETDEWEB)

    Kippen, Karen Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bourke, Mark Andrew M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-21

    Our challenge derives from the fact that in metals or explosives grains, interfaces and defects control engineering performance in ways that are neither amenable to continuum codes (which fail to rigorously describe the heterogeneities derived from microstructure) nor computationally tractable to first principles atomistic calculations. This is a region called the mesoscale, which stands at the frontier of our desire to translate fundamental science insights into confidence in aging system performance over the range of extreme conditions relevant in a nuclear weapon. For dynamic problems, the phenomena of interest can require extremely good temporal resolutions. A shock wave traveling at 1000 m/s (or 1 mm/μs) passes through a grain with a diameter of 1 micron in a nanosecond (10⁻⁹ s). Thus, to observe the mesoscale phenomena—such as dislocations or phase transformations—as the shock passes, temporal resolution better than picoseconds (10⁻¹² s) may be needed. As we anticipate the science challenges over the next decade, experimental insights on material performance at the micron spatial scale with picosecond temporal resolution—at the mesoscale—are a clear challenge. This is a challenge fit for Los Alamos in partnership with our sister labs and academia. Mesoscale Connections will draw attention to our progress as we tackle the mesoscale challenge. We hope you like it and encourage suggestions of content you are interested in.

  12. Network analysis of mesoscale optical recordings to assess regional, functional connectivity.

    Science.gov (United States)

    Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H

    2015-10-01

    With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
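    A schematic of the kind of regional, functional network analysis described: a correlation matrix computed from ROI time series is thresholded and a few graph-theoretic properties are derived with networkx. The data are random and the threshold is arbitrary; the published analysis script differs.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
signals = rng.standard_normal((12, 500))      # 12 hypothetical ROIs x 500 imaging frames
corr = np.corrcoef(signals)                   # functional connectivity matrix

adj = (np.abs(corr) > 0.1).astype(float)      # arbitrary threshold on |correlation|
np.fill_diagonal(adj, 0)                      # no self-connections

G = nx.from_numpy_array(adj)
print("density:", nx.density(G))
print("mean clustering:", nx.average_clustering(G))
print("degree per ROI:", dict(G.degree()))
```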

  13. Mesoscale Modeling, Forecasting and Remote Sensing Research.

    Science.gov (United States)

    Remote sensing, cyclonic-scale diagnostic studies and mesoscale numerical modeling and forecasting are summarized. Mechanisms involved in the release of potential instability are discussed and simulated quantitatively, giving particular attention to the convective formulation. The basic mesoscale model is documented, including the equations, boundary conditions, finite differences and initialization through an idealized frontal zone. Results of tests are presented, including a three-dimensional test with real data, tests of convective/mesoscale interaction and tests with a detailed

  14. Design considerations on user-interaction for semi-automated driving

    NARCIS (Netherlands)

    van den Beukel, Arie Paul; van der Voort, Mascha C.

    2015-01-01

    The automotive industry has recently made first steps towards the implementation of automated driving by introducing lateral control as an addition to longitudinal control (i.e. ACC). This automated control is allowed during specific situations within existing infrastructure (e.g. motorway cruising).

  15. Meso-scale wind variability. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, S.; Larsen, X.; Vincent, C.; Soerensen, P.; Pinson, P.; Trombe, P.-J.; Madsen, H.; Cutululis, N.

    2011-11-15

    The project has aimed to characterize mesoscale meteorological phenomena for the North Sea and the Inner Danish waters, and additionally aimed at improving the predictability and quality of the power production from offshore windfarms. The meso-scale meteorology has been characterized with respect to the physical processes, climatology, spectral characteristics and correlation properties based on measurements from wind farms, satellite data (SAR) and mesoscale numerical modeling (WRF). The ability of the WRF model to characterize and predict relevant mesoscale phenomena has been proven. Additionally, the application of statistical forecasting, using a Markov switching approach that can be related to the meteorological conditions, to analyze and short-term predict the power production from an offshore wind farm has been documented. Two PhD studies have been conducted in connection with the project. The project has been a cooperative project between Risoe DTU, IMM DTU, DONG Energy, Vattenfall and VESTAS. It is registered as Energinet.dk, project no. 2007-1-7141. (Author)

  16. Automation of activation analysis

    International Nuclear Information System (INIS)

    Ivanov, I.N.; Ivanets, V.N.; Filippov, V.V.

    1985-01-01

    The basic data on the methods and equipment of activation analysis are presented. Recommendations on the selection of activation analysis techniques, and especially the technique envisaging the use of short-lived isotopes, are given. The possibilities of the equipment to increase data-handling capacity, using modern computers for the automation of the analysis and the data-processing procedure, are shown

  17. A mobile, high-throughput semi-automated system for testing cognition in large non-primate animal models of Huntington disease.

    Science.gov (United States)

    McBride, Sebastian D; Perentos, Nicholas; Morton, A Jennifer

    2016-05-30

    For reasons of cost and ethical concerns, models of neurodegenerative disorders such as Huntington disease (HD) are currently being developed in farm animals, as an alternative to non-human primates. Developing reliable methods of testing cognitive function is essential to determining the usefulness of such models. Nevertheless, cognitive testing of farm animal species presents a unique set of challenges. The primary aims of this study were to develop and validate a mobile operant system suitable for high throughput cognitive testing of sheep. We designed a semi-automated testing system with the capability of presenting stimuli (visual, auditory) and reward at six spatial locations. Fourteen normal sheep were used to validate the system using a two-choice visual discrimination task. Four stages of training devised to acclimatise animals to the system are also presented. All sheep progressed rapidly through the training stages, over eight sessions. All sheep learned the 2CVDT and performed at least one reversal stage. The mean number of trials the sheep took to reach criterion in the first acquisition learning was 13.9±1.5 and for the reversal learning was 19.1±1.8. This is the first mobile semi-automated operant system developed for testing cognitive function in sheep. We have designed and validated an automated operant behavioural testing system suitable for high throughput cognitive testing in sheep and other medium-sized quadrupeds, such as pigs and dogs. Sheep performance in the two-choice visual discrimination task was very similar to that reported for non-human primates and strongly supports the use of farm animals as pre-clinical models for the study of neurodegenerative diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Interscan variation of semi-automated volumetry of subsolid pulmonary nodules

    International Nuclear Information System (INIS)

    Scholten, Ernst T.; Jong, Pim A. de; Willemink, Martin J.; Mali, Willem P.T.M.; Gietema, Hester A.; Jacobs, Colin; Riel, Sarah van; Ginneken, Bram van; Vliegenthart, Rozemarijn; Oudkerk, Matthijs; Koning, Harry J. de; Horeweg, Nanda; Prokop, Mathias

    2015-01-01

    We aimed to test the interscan variation of semi-automatic volumetry of subsolid nodules (SSNs), as growth evaluation is important for SSN management. From a lung cancer screening trial all SSNs that were stable over at least 3 months were included (N = 44). SSNs were quantified on the baseline CT by two observers using semi-automatic volumetry software for effective diameter, volume, and mass. One observer also measured the SSNs on the second CT 3 months later. Interscan variation was evaluated using Bland-Altman plots. Observer agreement was calculated as intraclass correlation coefficient (ICC). Data are presented as mean (± standard deviation) or median and interquartile range (IQR). A Mann-Whitney U test was used for the analysis of the influence of adjustments on the measurements. Semi-automatic measurements were feasible in all 44 SSNs. The interscan limits of agreement ranged from -12.0 % to 9.7 % for diameter, -35.4 % to 28.6 % for volume and -27.6 % to 30.8 % for mass. Agreement between observers was good with intraclass correlation coefficients of 0.978, 0.957, and 0.968 for diameter, volume, and mass, respectively. Our data suggest that when using our software an increase in mass of 30 % can be regarded as significant growth. (orig.)

  19. Interscan variation of semi-automated volumetry of subsolid pulmonary nodules

    Energy Technology Data Exchange (ETDEWEB)

    Scholten, Ernst T. [University Medical Center, Department of Radiology, Utrecht (Netherlands); Kennemer Gasthuis, Department of Radiology, Haarlem (Netherlands); Jong, Pim A. de; Willemink, Martin J.; Mali, Willem P.T.M.; Gietema, Hester A. [University Medical Center, Department of Radiology, Utrecht (Netherlands); Jacobs, Colin; Riel, Sarah van [Radboud University Medical Center, Diagnostic Image Analysis Group, Nijmegen (Netherlands); Ginneken, Bram van [Radboud University Medical Center, Diagnostic Image Analysis Group, Nijmegen (Netherlands); Fraunhofer MEVIS, Bremen (Germany); Vliegenthart, Rozemarijn [University of Groningen, University Medical Center Groningen, Department of Radiology, Groningen (Netherlands); University of Groningen, University Medical Centre Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Oudkerk, Matthijs [University of Groningen, University Medical Centre Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Koning, Harry J. de [Erasmus Medical Center, Department of Public Health, Rotterdam (Netherlands); Horeweg, Nanda [Erasmus Medical Center, Department of Public Health, Rotterdam (Netherlands); Erasmus Medical Center, Department of Pulmonology, Rotterdam (Netherlands); Prokop, Mathias [Radboud University Medical Center, Department of Radiology, Nijmegen (Netherlands)

    2015-04-01

    We aimed to test the interscan variation of semi-automatic volumetry of subsolid nodules (SSNs), as growth evaluation is important for SSN management. From a lung cancer screening trial all SSNs that were stable over at least 3 months were included (N = 44). SSNs were quantified on the baseline CT by two observers using semi-automatic volumetry software for effective diameter, volume, and mass. One observer also measured the SSNs on the second CT 3 months later. Interscan variation was evaluated using Bland-Altman plots. Observer agreement was calculated as intraclass correlation coefficient (ICC). Data are presented as mean (± standard deviation) or median and interquartile range (IQR). A Mann-Whitney U test was used for the analysis of the influence of adjustments on the measurements. Semi-automatic measurements were feasible in all 44 SSNs. The interscan limits of agreement ranged from -12.0 % to 9.7 % for diameter, -35.4 % to 28.6 % for volume and -27.6 % to 30.8 % for mass. Agreement between observers was good with intraclass correlation coefficients of 0.978, 0.957, and 0.968 for diameter, volume, and mass, respectively. Our data suggest that when using our software an increase in mass of 30 % can be regarded as significant growth. (orig.)

  20. Improvement of Binary Analysis Components in Automated Malware Analysis Framework

    Science.gov (United States)

    2017-02-21

    AFRL-AFOSR-JP-TR-2017-0018: Improvement of Binary Analysis Components in Automated Malware Analysis Framework. Keiji Takeda, Keio University. Final report, 26 May 2015 to 25 Nov 2016. The report describes a system to analyze malicious software (malware) with minimum human interaction; the system autonomously analyzes malware samples by analyzing the malware binary program.

  1. Semi-automated digital measurement as the method of choice for beta cell mass analysis.

    Directory of Open Access Journals (Sweden)

    Violette Coppens

    Full Text Available Pancreas injury by partial duct ligation (PDL) activates beta cell differentiation and proliferation in adult mouse pancreas but remains controversial regarding the anticipated increase in beta cell volume. Several reports unable to show beta cell volume augmentation in PDL pancreas used automated digital image analysis software. We hypothesized that fully automatic beta cell morphometry without manual micrograph artifact remediation introduces bias and therefore might be responsible for reported discrepancies and controversy. However, our present results prove that standard digital image processing with automatic thresholding is sufficiently robust, albeit less sensitive and less adequate, to demonstrate a significant increase in beta cell volume in PDL versus Sham-operated pancreas. We therefore conclude that other confounding factors, such as quality of surgery, selection of samples based on relative abundance of the transcription factor Neurogenin 3 (Ngn3) and tissue processing, give rise to inter-laboratory inconsistencies in beta cell volume quantification in PDL pancreas.
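    The automatic thresholding referred to here can be sketched generically with scikit-image: a stained section is thresholded with Otsu's method and the beta cell area fraction is taken as the fraction of positive pixels. This is an illustration under assumed inputs, not the authors' exact pipeline.

```python
import numpy as np
from skimage.filters import threshold_otsu

# Hypothetical grayscale micrograph of an insulin-stained section.
rng = np.random.default_rng(1)
image = rng.random((512, 512))

thresh = threshold_otsu(image)        # automatic global threshold
positive = image > thresh             # "beta cell" (stained) pixels

area_fraction = positive.mean()       # stained area / total image area
print(f"beta cell area fraction: {area_fraction:.3f}")
```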

  2. Feasibility of geometric-intensity-based semi-automated delineation of the tentorium cerebelli from MRI scans.

    Science.gov (United States)

    Penumetcha, Neeraja; Kabadi, Suraj; Jedynak, Bruno; Walcutt, Charles; Gado, Mokhtar H; Wang, Lei; Ratnanather, J Tilak

    2011-04-01

    This paper describes a feasibility study of a method for delineating the tentorium cerebelli in magnetic resonance imaging (MRI) brain scans. The tentorium cerebelli is a thin sheet of dura mater covering the cerebellum and separating it from the posterior part of the temporal lobe and the occipital lobe of the cerebral hemispheres. Cortical structures such as the parahippocampal gyrus can be indistinguishable from tentorium in magnetization-prepared rapid gradient echo and T1-weighted MRI scans. Similar intensities in these neighboring regions make it difficult to perform accurate cortical analysis in neuroimaging studies of schizophrenia and Alzheimer's disease. A semi-automated, geometric, intensity-based procedure for delineating the tentorium from a whole-brain scan is described. Initial and final curves are traced within the tentorium. A cost function, based on intensity and Euclidean distance, is computed between the two curves using the Fast Marching method. The initial curve is then evolved to the final curve based on the gradient of the computed costs, generating a series of intermediate curves. These curves are then used to generate a triangulated surface of the tentorium. For 3 scans, surfaces were found to be within 2 voxels of hand segmentations. Copyright © 2009 by the American Society of Neuroimaging.
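    The cost-accumulation step can be illustrated with a small Dijkstra search on a 2D grid, where stepping onto a pixel costs its intensity plus a Euclidean distance term. This is only a sketch of the idea on a synthetic image; the paper itself uses the Fast Marching method on MRI volumes.

```python
import heapq
import numpy as np

def min_cost_from(seed, image, alpha=1.0):
    """Accumulated minimum cost from a seed pixel; cost = intensity + alpha * step length."""
    h, w = image.shape
    cost = np.full((h, w), np.inf)
    cost[seed] = 0.0
    heap = [(0.0, seed)]
    steps = [(-1, 0), (1, 0), (0, -1), (0, 1), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    while heap:
        c, (i, j) = heapq.heappop(heap)
        if c > cost[i, j]:
            continue                             # stale heap entry
        for di, dj in steps:
            ni, nj = i + di, j + dj
            if 0 <= ni < h and 0 <= nj < w:
                step = (di * di + dj * dj) ** 0.5        # Euclidean step length
                nc = c + image[ni, nj] + alpha * step    # intensity + distance term
                if nc < cost[ni, nj]:
                    cost[ni, nj] = nc
                    heapq.heappush(heap, (nc, (ni, nj)))
    return cost

image = np.random.default_rng(2).random((64, 64))   # synthetic intensity image
costs = min_cost_from((32, 5), image)                # costs from a point on the initial curve
print(costs.max())
```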

  3. Operating procedure automation to enhance safety of nuclear power plants

    International Nuclear Information System (INIS)

    Husseiny, A.A.; Sabri, Z.A.; Adams, S.K.; Rodriguez, R.J.; Packer, D.; Holmes, J.W.

    1989-01-01

    The use of logic statements and computer assistance is explored as a means for automation and improvement of the design of operating procedures, including those employed in abnormal and emergency situations. Operating procedures for downpower and loss of forced circulation are used for demonstration. Human-factors analysis is performed on generic emergency operating procedures for three strategies of control: manual, semi-automatic and automatic, using standard emergency operating procedures. Such preliminary analysis shows that automation of procedures is feasible provided that fault-tolerant software and hardware become available for the design of the controllers. Recommendations are provided for tests to substantiate the promise of enhancement of plant safety. Adequate design of operating procedures through automation may alleviate several major operational problems of nuclear power plants. Also, automation of procedures is necessary for partial or overall automatic control of plants. Fully automatic operations are needed for space applications, while supervised automation of land-based and offshore plants may become the thrust of a new generation of nuclear power plants. (orig.)

  4. 21 CFR 864.5600 - Automated hematocrit instrument.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hematocrit instrument. 864.5600 Section 864.5600 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  5. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    NARCIS (Netherlands)

    Kottawatta, Kottawattage S A; van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-01-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on

  6. Semi-automated scar detection in delayed enhanced cardiac magnetic resonance images

    Science.gov (United States)

    Morisi, Rita; Donini, Bruno; Lanconelli, Nico; Rosengarden, James; Morgan, John; Harden, Stephen; Curzen, Nick

    2015-06-01

    Late enhancement cardiac magnetic resonance imaging (MRI) can precisely delineate myocardial scars. We present a semi-automated method for detecting scars in cardiac MRI. This model has the potential to improve routine clinical practice, since quantification is not currently offered due to time constraints. A first segmentation step was developed for extracting the target regions for potential scar and determining pre-candidate objects. Pattern recognition methods are then applied to the segmented images in order to detect the position of the myocardial scar. The database of late gadolinium enhancement (LE) cardiac MR images consists of 111 blocks of images acquired from 63 patients at the University Hospital Southampton NHS Foundation Trust (UK). At least one scar was present for each patient, and all the scars were manually annotated by an expert. A group of images (around one third of the entire set) was used for training the system, which was subsequently tested on all the remaining images. Four different classifiers were trained (Support Vector Machine (SVM), k-nearest neighbor (KNN), Bayesian and feed-forward neural network) and their performance was evaluated by using Free-response Receiver Operating Characteristic (FROC) analysis. Feature selection was implemented for analyzing the importance of the various features. The segmentation method proposed allowed the region affected by the scar to be extracted correctly in 96% of the blocks of images. The SVM was shown to be the best classifier for our task, and our system reached an overall sensitivity of 80% with fewer than 7 false positives per patient. The method we present provides an effective tool for detection of scars on cardiac MRI. This may be of value in clinical practice by permitting routine reporting of scar quantification.
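    The classification stage can be sketched with scikit-learn: each candidate region is described by a feature vector and an SVM separates scar from non-scar candidates. The features and labels below are random stand-ins, not the image-derived features used in the study.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
X = rng.standard_normal((300, 8))               # 300 candidate regions x 8 features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)   # synthetic scar / non-scar labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.33, random_state=0)

clf = SVC(kernel="rbf", probability=True).fit(X_tr, y_tr)

scores = clf.predict_proba(X_te)[:, 1]          # scores for FROC-style thresholding
sensitivity = ((scores > 0.5) & (y_te == 1)).sum() / (y_te == 1).sum()
print(f"sensitivity at threshold 0.5: {sensitivity:.2f}")
```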

  7. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second of the four basic steps in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette, and a fully automated pipetting station, both providing for fast and accurate dispensing of the reagent or for diluting the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  8. Some problems concerning the use of automated radiochemical separation systems in destructive neutron activation analysis

    International Nuclear Information System (INIS)

    Nagy, L.G.; Toeroek, G.

    1977-01-01

    The present state of a long-term program is reviewed. It was started to elaborate a remote-controlled automated radiochemical processing system for the neutron activation analysis of biological materials. The system is based on wet ashing of the sample followed by reactive desorption of some volatile components. The distillation residue is passed through a series of columns filled with selective ion screening materials to remove the matrix activity. The solution is thus "stripped" of the interfering radioions, and it is processed to single elements through group separations using ion-exchange chromatographic techniques. Some special problems concerning this system are treated. (a) General aspects of the construction of a (semi)automated radiochemical processing system are discussed. (b) Comparison is made between various technical realizations of the same basic concept. (c) Some problems concerning the "reconstruction" of an already published processing system are outlined. (T.G.)

  9. Wake modelling combining mesoscale and microscale models

    DEFF Research Database (Denmark)

    Badger, Jake; Volker, Patrick; Prospathospoulos, J.

    2013-01-01

    In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake paramet...

  10. 21 CFR 864.5620 - Automated hemoglobin system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  11. 21 CFR 864.5200 - Automated cell counter.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell counter. 864.5200 Section 864.5200 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  12. 21 CFR 864.5680 - Automated heparin analyzer.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated heparin analyzer. 864.5680 Section 864.5680 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  13. 21 CFR 864.5850 - Automated slide spinner.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated slide spinner. 864.5850 Section 864.5850 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices § 864...

  14. A semi-automated methodology for finding lipid-related GO terms.

    Science.gov (United States)

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R; Wong, Limsoon

    2014-01-01

    Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, there is as yet no method to systematically find a class of GO terms sharing a common property with high accuracy (e.g., involving human curation). We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. Those terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. http://compbio.ddns.comp.nus.edu.sg/∼lipidgo. © The Author(s) 2014. Published by Oxford University Press.

  15. Acoustic Characterization of Mesoscale Objects

    Energy Technology Data Exchange (ETDEWEB)

    Chinn, D; Huber, R; Chambers, D; Cole, G; Balogun, O; Spicer, J; Murray, T

    2007-03-13

    This report describes the science and engineering performed to provide state-of-the-art acoustic capabilities for nondestructively characterizing mesoscale (millimeter-sized) objects, allowing micrometer resolution over the object's entire volume. Materials and structures used in mesoscale objects necessitate the use of (1) GHz acoustic frequencies and (2) non-contacting laser generation and detection of acoustic waves. This effort demonstrated that acoustic methods at gigahertz frequencies have the necessary penetration depth and spatial resolution to effectively detect density discontinuities, gaps, and delaminations. A prototype laser-based ultrasonic system was designed and built. The system uses a micro-chip laser for excitation of broadband ultrasonic waves with frequency components reaching 1.0 GHz, and a path-stabilized Michelson interferometer for detection. The proof-of-concept for mesoscale characterization is demonstrated by imaging a micro-fabricated etched pattern in a 70 μm thick silicon wafer.

  16. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance such as heating and cooling rates restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes

  17. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to its relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance such as heating and cooling rates restrict the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  18. Mesoscale Effects on Carbon Export: A Global Perspective

    Science.gov (United States)

    Harrison, Cheryl S.; Long, Matthew C.; Lovenduski, Nicole S.; Moore, Jefferson K.

    2018-04-01

    Carbon export from the surface to the deep ocean is a primary control on global carbon budgets and is mediated by plankton that are sensitive to physical forcing. Earth system models generally do not resolve ocean mesoscale circulation (O(10-100) km), scales that strongly affect transport of nutrients and plankton. The role of mesoscale circulation in modulating export is evaluated by comparing global ocean simulations conducted at 1° and 0.1° horizontal resolution. Mesoscale resolution produces a small reduction in globally integrated export production; regional differences in export production can be large (±50%), with compensating effects in different ocean basins. With mesoscale resolution, improved representation of coastal jets blocks off-shelf transport, leading to lower export in regions where shelf-derived nutrients fuel production. Export is further reduced in these regions by resolution of mesoscale turbulence, which restricts the spatial area of production. Maximum mixed layer depths are narrower and deeper across the Subantarctic at higher resolution, driving locally stronger nutrient entrainment and enhanced summer export production. In energetic regions with seasonal blooms, such as the Subantarctic and North Pacific, internally generated mesoscale variability drives substantial interannual variation in local export production. These results suggest that biogeochemical tracer dynamics show different sensitivities to transport biases than temperature and salinity, which should be considered in the formulation and validation of physical parameterizations. Efforts to compare estimates of export production from observations and models should account for large variability in space and time expected for regions strongly affected by mesoscale circulation.

  19. Semi-automated quantitative Drosophila wings measurements.

    Science.gov (United States)

    Loh, Sheng Yang Michael; Ogawa, Yoshitaka; Kawana, Sara; Tamura, Koichiro; Lee, Hwee Kuan

    2017-06-28

    Drosophila melanogaster is an important organism used in many fields of biological research such as genetics and developmental biology. Drosophila wings have been widely used to study the genetics of development, morphometrics and evolution. Therefore there is much interest in quantifying wing structures of Drosophila. Advances in technology have increased the ease with which images of Drosophila can be acquired. However, such studies have been limited by the slow and tedious process of acquiring phenotypic data. We have developed a system that automatically detects and measures key points and vein segments on a Drosophila wing. Key points are detected by performing image transformations and template matching on Drosophila wing images, while vein segments are detected using an Active Contour algorithm. The accuracy of our key point detection was compared against key point annotations of users. We also performed key point detection using different training data sets of Drosophila wing images. We compared our software with an existing automated image analysis system for Drosophila wings and showed that our system performs better than the state of the art. Vein segments were manually measured and compared against the measurements obtained from our system. Our system was able to detect specific key points and vein segments from Drosophila wing images with high accuracy.
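    The key-point step can be illustrated with OpenCV template matching: a small template of a landmark is slid over the wing image and the best-scoring location is taken as the detected key point. The file names below are placeholders; the published system also applies image transformations and an Active Contour stage for vein segments.

```python
import cv2

# Placeholder file names for a wing image and a landmark template (assumed to exist).
wing = cv2.imread("wing.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("landmark_template.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation between the template and every image location.
response = cv2.matchTemplate(wing, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(response)

# Key point estimate = centre of the best-matching window.
h, w = template.shape
key_point = (max_loc[0] + w // 2, max_loc[1] + h // 2)
print(f"key point at {key_point} (score {max_val:.2f})")
```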

  20. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day

  1. Impact of Automation on Drivers' Performance in Agricultural Semi-Autonomous Vehicles.

    Science.gov (United States)

    Bashiri, B; Mann, D D

    2015-04-01

    Drivers' inadequate mental workload has been reported as one of the negative effects of driving assistant systems and in-vehicle automation. The increasing trend of automation in agricultural vehicles raises some concerns about drivers' mental workload in such vehicles. Thus, a human factors perspective is needed to identify the consequences of such automated systems. In this simulator study, the effects of vehicle steering task automation (VSTA) and implement control and monitoring task automation (ICMTA) were investigated using a tractor-air seeder system as a case study. Two performance parameters (reaction time and accuracy of actions) were measured to assess drivers' perceived mental workload. Experiments were conducted using the tractor driving simulator (TDS) located in the Agricultural Ergonomics Laboratory at the University of Manitoba. Study participants were university students with tractor driving experience. According to the results, reaction time and number of errors made by drivers both decreased as the automation level increased. Correlations were found among performance parameters and subjective mental workload reported by the drivers.

  2. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publicly available. Digital elevation models (DEM) of the transient glacier surface preserved in each imagery timestamp can be derived, then differenced to calculate glacier volume and mass change to improve regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure from motion (SfM) photogrammetry software to semi-automate DEM creation, and orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, airborne LiDAR data, and DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on: one, applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State and two, evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co-registration.
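    The OpenCV preprocessing described (rotation, cropping to a common size, histogram equalization) might look roughly like the sketch below. The directory layout, rotation angle and crop size are assumptions for illustration, and the subsequent Photoscan steps are not shown.

```python
import glob
import cv2

CROP = 6000          # assumed common output size in pixels (trims fiducial marks)
ANGLE = 90           # assumed rotation needed to normalise scan orientation

for path in glob.glob("scans/*.tif"):              # hypothetical scan directory
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    if img is None:
        continue                                   # skip unreadable files
    if ANGLE == 90:
        img = cv2.rotate(img, cv2.ROTATE_90_CLOCKWISE)
    h, w = img.shape
    top, left = (h - CROP) // 2, (w - CROP) // 2
    img = img[top:top + CROP, left:left + CROP]    # centre crop to remove fiducials
    img = cv2.equalizeHist(img)                    # batch histogram equalization
    cv2.imwrite(path.replace("scans", "prepped"), img)
```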

  3. 21 CFR 864.5700 - Automated platelet aggregation system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated platelet aggregation system. 864.5700 Section 864.5700 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  4. 21 CFR 864.5220 - Automated differential cell counter.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated differential cell counter. 864.5220 Section 864.5220 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  5. 21 CFR 864.5260 - Automated cell-locating device.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated cell-locating device. 864.5260 Section 864.5260 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  6. 21 CFR 864.5800 - Automated sedimentation rate device.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated sedimentation rate device. 864.5800 Section 864.5800 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  7. A simple viability analysis for unicellular cyanobacteria using a new autofluorescence assay, automated microscopy, and ImageJ

    Directory of Open Access Journals (Sweden)

    Schulze Katja

    2011-11-01

    Full Text Available Abstract Background Currently established methods to identify viable and non-viable cells of cyanobacteria are either time-consuming (e.g. plating) or preparation-intensive (e.g. fluorescent staining). In this paper we present a new and fast viability assay for unicellular cyanobacteria, which uses red chlorophyll fluorescence and an unspecific green autofluorescence for the differentiation of viable and non-viable cells without the need for sample preparation. Results The viability assay for unicellular cyanobacteria using red and green autofluorescence was established and validated for the model organism Synechocystis sp. PCC 6803. Both autofluorescence signals could be observed simultaneously, allowing a direct classification of viable and non-viable cells. The results were confirmed by plating/colony count, absorption spectra and chlorophyll measurements. The use of an automated fluorescence microscope and a novel ImageJ-based image analysis plugin allows a semi-automated analysis. Conclusions The new method simplifies the process of viability analysis and allows a quick and accurate analysis. Furthermore, results indicate that a combination of the new assay with absorption spectra or chlorophyll concentration measurements allows the estimation of the vitality of cells.
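    The per-cell red/green decision could be sketched as follows: cells are segmented from the red (chlorophyll) channel and a cell is called viable when its mean red fluorescence exceeds its mean green autofluorescence. The channel images and the decision rule here are simplified assumptions, not the published ImageJ plugin.

```python
import numpy as np
from skimage.measure import label, regionprops

rng = np.random.default_rng(4)
red = rng.random((256, 256))    # stand-in chlorophyll (red) channel
green = rng.random((256, 256))  # stand-in unspecific green autofluorescence channel

cells = label(red > 0.8)        # crude cell segmentation on the red channel

viable, dead = 0, 0
for region in regionprops(cells):
    coords = tuple(region.coords.T)            # pixel coordinates of this cell
    if red[coords].mean() > green[coords].mean():
        viable += 1                            # red-dominant: classified viable
    else:
        dead += 1                              # green-dominant: classified non-viable
print(f"viable: {viable}, non-viable: {dead}")
```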

  8. Semi-Automatic Rename Refactoring for JavaScript

    DEFF Research Database (Denmark)

    Feldthaus, Asger; Møller, Anders

    2013-01-01

    Modern IDEs support automated refactoring for many programming languages, but support for JavaScript is still primitive. To perform renaming, which is one of the fundamental refactorings, there is often no practical alternative to simple syntactic search-and-replace. Although more sophisticated alternatives have been developed, they are limited by whole-program assumptions and poor scalability. We propose a technique for semi-automatic refactoring for JavaScript, with a focus on renaming. Unlike traditional refactoring algorithms, semi-automatic refactoring works by a combination of static analysis and interaction with the programmer. With this pragmatic approach, we can provide scalable and effective refactoring support for real-world code, including libraries and incomplete applications. Through a series of experiments that estimate how much manual effort our technique demands from the programmer, we show...

  9. Fast and accurate semi-automated segmentation method of spinal cord MR images at 3T applied to the construction of a cervical spinal cord template.

    Directory of Open Access Journals (Sweden)

    Mohamed-Mounir El Mendili

    Full Text Available To design a fast and accurate semi-automated segmentation method for spinal cord 3T MR images and to construct a template of the cervical spinal cord. A semi-automated double threshold-based method (DTbM) was proposed enabling both cross-sectional and volumetric measures from 3D T2-weighted turbo spin echo MR scans of the spinal cord at 3T. Eighty-two healthy subjects, 10 patients with amyotrophic lateral sclerosis, 10 with spinal muscular atrophy and 10 with spinal cord injuries were studied. DTbM was compared with the active surface method (ASM), threshold-based method (TbM) and manual outlining (ground truth). Accuracy of segmentations was scored visually by a radiologist in cervical and thoracic cord regions. Accuracy was also quantified at the cervical and thoracic levels as well as at C2 vertebral level. To construct a cervical template from healthy subjects' images (n=59), a standardization pipeline was designed leading to well-centered straight spinal cord images and an accurate probability tissue map. Visual scoring showed better performance for DTbM than for ASM. Mean Dice similarity coefficient (DSC) was 95.71% for DTbM and 90.78% for ASM at the cervical level and 94.27% for DTbM and 89.93% for ASM at the thoracic level. Finally, at C2 vertebral level, mean DSC was 97.98% for DTbM compared with 98.02% for TbM and 96.76% for ASM. DTbM showed similar accuracy compared with TbM, but with the advantage of limited manual interaction. A semi-automated segmentation method with limited manual intervention was introduced and validated on 3T images, enabling the construction of a cervical spinal cord template.

  10. Fast and accurate semi-automated segmentation method of spinal cord MR images at 3T applied to the construction of a cervical spinal cord template.

    Science.gov (United States)

    El Mendili, Mohamed-Mounir; Chen, Raphaël; Tiret, Brice; Villard, Noémie; Trunet, Stéphanie; Pélégrini-Issac, Mélanie; Lehéricy, Stéphane; Pradat, Pierre-François; Benali, Habib

    2015-01-01

    To design a fast and accurate semi-automated segmentation method for spinal cord 3T MR images and to construct a template of the cervical spinal cord. A semi-automated double threshold-based method (DTbM) was proposed enabling both cross-sectional and volumetric measures from 3D T2-weighted turbo spin echo MR scans of the spinal cord at 3T. Eighty-two healthy subjects, 10 patients with amyotrophic lateral sclerosis, 10 with spinal muscular atrophy and 10 with spinal cord injuries were studied. DTbM was compared with active surface method (ASM), threshold-based method (TbM) and manual outlining (ground truth). Accuracy of segmentations was scored visually by a radiologist in cervical and thoracic cord regions. Accuracy was also quantified at the cervical and thoracic levels as well as at C2 vertebral level. To construct a cervical template from healthy subjects' images (n=59), a standardization pipeline was designed leading to well-centered straight spinal cord images and accurate probability tissue map. Visual scoring showed better performance for DTbM than for ASM. Mean Dice similarity coefficient (DSC) was 95.71% for DTbM and 90.78% for ASM at the cervical level and 94.27% for DTbM and 89.93% for ASM at the thoracic level. Finally, at C2 vertebral level, mean DSC was 97.98% for DTbM compared with 98.02% for TbM and 96.76% for ASM. DTbM showed similar accuracy compared with TbM, but with the advantage of limited manual interaction. A semi-automated segmentation method with limited manual intervention was introduced and validated on 3T images, enabling the construction of a cervical spinal cord template.
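    Two ingredients of the validation described above, a double intensity threshold and the Dice similarity coefficient, are simple enough to sketch on a toy array. The thresholds and the toy image are invented; the actual DTbM pipeline includes further steps (per-slice processing, template construction) not shown here.

```python
import numpy as np

def double_threshold_mask(image, low, high):
    """Keep voxels whose intensity falls between the two thresholds."""
    return (image >= low) & (image <= high)

def dice_coefficient(mask_a, mask_b):
    """Dice similarity coefficient between two binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    overlap = np.logical_and(a, b).sum()
    denom = a.sum() + b.sum()
    return 2.0 * overlap / denom if denom else 1.0

# Toy image: a bright cord-like structure on a darker background
image = np.zeros((8, 8))
image[2:6, 3:5] = 100.0
manual = image > 50                              # stand-in for manual outlining
auto = double_threshold_mask(image, 60.0, 150.0)
print(f"DSC = {dice_coefficient(auto, manual):.3f}")   # 1.000 on this toy case
```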

  11. Semi-Automated Air-Coupled Impact-Echo Method for Large-Scale Parkade Structure

    Directory of Open Access Journals (Sweden)

    Tyler Epp

    2018-03-01

    Full Text Available Structural Health Monitoring (SHM has moved to data-dense systems, utilizing numerous sensor types to monitor infrastructure, such as bridges and dams, more regularly. One of the issues faced in this endeavour is the scale of the inspected structures and the time it takes to carry out testing. Installing automated systems that can provide measurements in a timely manner is one way of overcoming these obstacles. This study proposes an Artificial Neural Network (ANN application that determines intact and damaged locations from a small training sample of impact-echo data, using air-coupled microphones from a reinforced concrete beam in lab conditions and data collected from a field experiment in a parking garage. The impact-echo testing in the field is carried out in a semi-autonomous manner to expedite the front end of the in situ damage detection testing. The use of an ANN removes the need for a user-defined cutoff value for the classification of intact and damaged locations when a least-square distance approach is used. It is postulated that this may contribute significantly to testing time reduction when monitoring large-scale civil Reinforced Concrete (RC structures.
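    A small feed-forward network of the kind used for the intact/damaged classification can be prototyped with standard tooling. The sketch below uses scikit-learn's MLPClassifier on made-up feature vectors that merely stand in for the air-coupled impact-echo features of the study.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Hypothetical training set: 40 impact-echo feature vectors (e.g. spectral bins),
# half from intact and half from damaged locations.
X_intact = rng.normal(loc=0.0, scale=1.0, size=(20, 16))
X_damaged = rng.normal(loc=1.5, scale=1.0, size=(20, 16))
X = np.vstack([X_intact, X_damaged])
y = np.array([0] * 20 + [1] * 20)       # 0 = intact, 1 = damaged

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)

# A new measurement from the field survey (invented numbers)
new_point = rng.normal(loc=1.4, scale=1.0, size=(1, 16))
print("damaged" if clf.predict(new_point)[0] else "intact")
```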

  12. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    Science.gov (United States)

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

    Historically, marine ecologists have lacked efficient tools that are capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m2 of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling, and better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.

  13. A semi-automated 2D/3D marker-based registration algorithm modelling prostate shrinkage during radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    Budiharto, Tom; Slagmolen, Pieter; Hermans, Jeroen; Maes, Frederik; Verstraete, Jan; Heuvel, Frank Van den; Depuydt, Tom; Oyen, Raymond; Haustermans, Karin

    2009-01-01

    Background and purpose: Currently, most available patient alignment tools based on implanted markers use manual marker matching and rigid registration transformations to measure the needed translational shifts. To quantify the particular effect of prostate gland shrinkage, implanted gold markers were tracked during a course of radiotherapy including an isotropic scaling factor to model prostate shrinkage. Materials and methods: Eight patients with prostate cancer had gold markers implanted transrectally and seven were treated with (neo)adjuvant androgen deprivation therapy. After patient alignment to skin tattoos, orthogonal electronic portal images (EPIs) were taken. A semi-automated 2D/3D marker-based registration was performed to calculate the necessary couch shifts. The registration consists of a rigid transformation combined with an isotropic scaling to model prostate shrinkage. Results: The inclusion of an isotropic shrinkage model in the registration algorithm cancelled the corresponding increase in registration error. The mean scaling factor was 0.89 ± 0.09. For all but two patients, a decrease of the isotropic scaling factor during treatment was observed. However, there was almost no difference in the translation offset between the manual matching of the EPIs to the digitally reconstructed radiographs and the semi-automated 2D/3D registration. A decrease in the intermarker distance was found correlating with prostate shrinkage rather than with random marker migration. Conclusions: Inclusion of shrinkage in the registration process reduces registration errors during a course of radiotherapy. Nevertheless, this did not lead to a clinically significant change in the proposed table translations when compared to translations obtained with manual marker matching without a scaling correction.
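    The shrinkage model amounts to estimating an isotropic scale factor alongside the translation. A minimal sketch of how such quantities could be derived from implanted-marker coordinates is given below, using the ratio of mean inter-marker distances and the centroid shift; the study's actual 2D/3D registration optimises these parameters against portal images, and the coordinates here are invented.

```python
import numpy as np
from itertools import combinations

def mean_intermarker_distance(points):
    """Average pairwise distance between implanted markers."""
    return np.mean([np.linalg.norm(a - b) for a, b in combinations(points, 2)])

def estimate_shift_and_scale(reference, current):
    """Translation (centroid shift) and isotropic scale factor relative
    to the reference marker configuration."""
    reference, current = np.asarray(reference, float), np.asarray(current, float)
    translation = current.mean(axis=0) - reference.mean(axis=0)
    scale = mean_intermarker_distance(current) / mean_intermarker_distance(reference)
    return translation, scale

# Invented marker coordinates (mm): planning CT vs. a later treatment fraction
planning = [[0.0, 0.0, 0.0], [20.0, 0.0, 0.0], [10.0, 15.0, 5.0]]
fraction = [[2.0, 1.0, 0.0], [19.5, 1.0, 0.0], [10.8, 14.0, 4.5]]
t, s = estimate_shift_and_scale(planning, fraction)
print(f"couch shift ~ {t}, scaling factor ~ {s:.2f}")   # scale < 1 indicates shrinkage
```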

  14. Improvement of Hydrological Simulations by Applying Daily Precipitation Interpolation Schemes in Meso-Scale Catchments

    Directory of Open Access Journals (Sweden)

    Mateusz Szcześniak

    2015-02-01

    Full Text Available Ground-based precipitation data are still the dominant input type for hydrological models. Spatial variability in precipitation can be represented by spatially interpolating gauge data using various techniques. In this study, the effect of daily precipitation interpolation methods on discharge simulations using the semi-distributed SWAT (Soil and Water Assessment Tool) model over a 30-year period is examined. The study was carried out in 11 meso-scale (119–3935 km2) sub-catchments lying in the Sulejów reservoir catchment in central Poland. Four methods were tested: the default SWAT method (Def) based on the Nearest Neighbour technique, Thiessen Polygons (TP), Inverse Distance Weighted (IDW) and Ordinary Kriging (OK). The evaluation of methods was performed using a semi-automated calibration program SUFI-2 (Sequential Uncertainty Fitting Procedure Version 2) with two objective functions: Nash-Sutcliffe Efficiency (NSE) and the adjusted R2 coefficient (bR2). The results show that: (1) the most complex OK method outperformed other methods in terms of NSE; and (2) OK, IDW, and TP outperformed Def in terms of bR2. The median difference in daily/monthly NSE between OK and Def/TP/IDW calculated across all catchments ranged between 0.05 and 0.15, while the median difference between TP/IDW/OK and Def ranged between 0.05 and 0.07. The differences between pairs of interpolation methods were, however, spatially variable and a part of this variability was attributed to catchment properties: catchments characterised by low station density and low coefficient of variation of daily flows experienced more pronounced improvement resulting from using interpolation methods. Methods providing higher precipitation estimates often resulted in a better model performance. The implication from this study is that appropriate consideration of spatial precipitation variability (often neglected by model users) that can be achieved using relatively simple interpolation methods can
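    Two of the building blocks named above, Inverse Distance Weighted interpolation of gauge precipitation and the Nash-Sutcliffe Efficiency used to score simulations, can be sketched directly. The gauge coordinates, rainfall values and discharge series below are invented, and the SWAT/SUFI-2 machinery itself is not reproduced.

```python
import numpy as np

def idw_interpolate(station_xy, station_values, target_xy, power=2.0):
    """Inverse Distance Weighted estimate of precipitation at a target point."""
    station_xy = np.asarray(station_xy, float)
    values = np.asarray(station_values, float)
    dist = np.linalg.norm(station_xy - np.asarray(target_xy, float), axis=1)
    if np.any(dist == 0):                      # target coincides with a gauge
        return values[dist == 0][0]
    weights = 1.0 / dist ** power
    return float(np.sum(weights * values) / np.sum(weights))

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is perfect, 0 matches the mean of observations."""
    obs, sim = np.asarray(observed, float), np.asarray(simulated, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# Invented daily example: three gauges and a short discharge record
gauges = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rain = [5.0, 12.0, 8.0]
print(idw_interpolate(gauges, rain, (4.0, 4.0)))
print(nash_sutcliffe([3.0, 4.0, 6.0, 5.0], [2.8, 4.2, 5.5, 5.1]))
```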

  15. Process analysis of the modelled 3-D mesoscale impact of aircraft emissions on the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J; Ebel, A; Lippert, E; Petry, H [Koeln Univ. (Germany). Inst. fuer Geophysik und Meteorologie

    1998-12-31

    A mesoscale chemistry transport model is applied to study the impact of aircraft emissions on the atmospheric trace gas composition. A special analysis of the simulations is conducted to separate the effects of chemistry, transport, diffusion and cloud processes on the transformation of the exhausts of a subsonic fleet cruising over the North Atlantic. The aircraft induced ozone production strongly depends on the tropopause height and the cruise altitude. Aircraft emissions may undergo an effective downward transport under the influence of stratosphere-troposphere exchange activity. (author) 12 refs.

  16. Process analysis of the modelled 3-D mesoscale impact of aircraft emissions on the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J.; Ebel, A.; Lippert, E.; Petry, H. [Koeln Univ. (Germany). Inst. fuer Geophysik und Meteorologie

    1997-12-31

    A mesoscale chemistry transport model is applied to study the impact of aircraft emissions on the atmospheric trace gas composition. A special analysis of the simulations is conducted to separate the effects of chemistry, transport, diffusion and cloud processes on the transformation of the exhausts of a subsonic fleet cruising over the North Atlantic. The aircraft induced ozone production strongly depends on the tropopause height and the cruise altitude. Aircraft emissions may undergo an effective downward transport under the influence of stratosphere-troposphere exchange activity. (author) 12 refs.

  17. Mesoscale carbon sequestration site screening and CCS infrastructure analysis.

    Science.gov (United States)

    Keating, Gordon N; Middleton, Richard S; Stauffer, Philip H; Viswanathan, Hari S; Letellier, Bruce C; Pasqualini, Donatella; Pawar, Rajesh J; Wolfsberg, Andrew V

    2011-01-01

    We explore carbon capture and sequestration (CCS) at the meso-scale, a level of study between regional carbon accounting and highly detailed reservoir models for individual sites. We develop an approach to CO(2) sequestration site screening for industries or energy development policies that involves identification of appropriate sequestration basin, analysis of geologic formations, definition of surface sites, design of infrastructure, and analysis of CO(2) transport and storage costs. Our case study involves carbon management for potential oil shale development in the Piceance-Uinta Basin, CO and UT. This study uses new capabilities of the CO(2)-PENS model for site screening, including reservoir capacity, injectivity, and cost calculations for simple reservoirs at multiple sites. We couple this with a model of optimized source-sink-network infrastructure (SimCCS) to design pipeline networks and minimize CCS cost for a given industry or region. The CLEAR(uff) dynamical assessment model calculates the CO(2) source term for various oil production levels. Nine sites in a 13,300 km(2) area have the capacity to store 6.5 GtCO(2), corresponding to shale-oil production of 1.3 Mbbl/day for 50 years (about 1/4 of U.S. crude oil production). Our results highlight the complex, nonlinear relationship between the spatial deployment of CCS infrastructure and the oil-shale production rate.

  18. Automated Technology for Verification and Analysis

    DEFF Research Database (Denmark)

    This volume contains the papers presented at the 7th International Symposium on Automated Technology for Verification and Analysis held during October 13-16 in Macao SAR, China. The primary objective of the ATVA conferences remains the same: to exchange and promote the latest advances of state-of-the-art research on theoretical and practical aspects of automated analysis, verification, and synthesis. Among 74 research papers and 10 tool papers submitted to ATVA 2009, the Program Committee accepted 23 as regular papers and 3 as tool papers. In all, 33 experts from 17 countries worked hard to make sure...

  19. An overview of the contaminant analysis automation program

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.; Beugelsdijk, T.J.

    1992-01-01

    The Department of Energy (DOE) has significant amounts of radioactive and hazardous wastes stored, buried, and still being generated at many sites within the United States. These wastes must be characterized to determine the elemental, isotopic, and compound content before remediation can begin. In this paper, the authors project that sampling requirements will necessitate generating more than 10 million samples by 1995, which will far exceed the capabilities of our current manual chemical analysis laboratories. The Contaminant Analysis Automation effort (CAA), with Los Alamos National Laboratory (LANL) as the coordinating laboratory, is designing and fabricating robotic systems that will standardize and automate both the hardware and the software of the most common environmental chemical methods. This will be accomplished by designing and producing several unique analysis systems called Standard Analysis Methods (SAM). Each SAM will automate a specific chemical method, including sample preparation, the analytical measurement, and the data interpretation, by using a building block known as the Standard Laboratory Module (SLM). This concept allows the chemist to assemble an automated environmental method using standardized SLMs easily and without the worry of hardware compatibility or the necessity of generating complicated control programs

  20. Development of a web-based CANDU core management procedures automation system

    International Nuclear Information System (INIS)

    Lee, S.; Park, D.; Yeom, C.; Suh, H.

    2007-01-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application that semi-automates several CANDU core management tasks. It provides various functionalities including selection and evaluation of refueling channel, detector calibration, coolant flow estimation and thermal power calculation through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes brand-new .NET computing technology such as ASP.NET, smart client, web services and so on. Since almost all functions are abstracted from the previous experiences of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to an efficient and safe operation of CANDU plants. (author)

  1. Development of a web-based CANDU core management procedures automation system

    Energy Technology Data Exchange (ETDEWEB)

    Lee, S.; Park, D.; Yeom, C. [Inst. for Advanced Engineering (IAE), Yongin (Korea, Republic of); Suh, H. [Korea Hydro and Nuclear Power (KHNP), Wolsong (Korea, Republic of)

    2007-07-01

    This paper introduces the CANDU core management procedures automation system (COMPAS), a web-based application that semi-automates several CANDU core management tasks. It provides various functionalities including selection and evaluation of refueling channel, detector calibration, coolant flow estimation and thermal power calculation through automated interfacing with analysis codes (RFSP, NUCIRC, etc.) and plant data. It also utilizes brand-new .NET computing technology such as ASP.NET, smart client, web services and so on. Since almost all functions are abstracted from the previous experiences of the current working members of the Wolsong Nuclear Power Plant (NPP), it will lead to an efficient and safe operation of CANDU plants. (author)

  2. 1st workshop on situational awareness in semi-Automated vehicles

    NARCIS (Netherlands)

    McCall, R.; Baumann, M.; Politis, I.; Borojeni, S.S.; Alvarez, I.; Mirnig, A.; Meschtscherjakov, A.; Tscheligi, M.; Chuang, L.; Terken, J.M.B.

    2016-01-01

    This workshop will focus on the problem of occupant and vehicle situational awareness with respect to automated vehicles when the driver must take over control. It will explore the future of fully automated and mixed traffic situations where vehicles are assumed to be operating at level 3 or above.

  3. Mesoscale Benchmark Demonstration Problem 1: Mesoscale Simulations of Intra-granular Fission Gas Bubbles in UO2 under Post-irradiation Thermal Annealing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan; Hu, Shenyang Y.; Montgomery, Robert; Gao, Fei; Sun, Xin; Tonks, Michael; Biner, Bullent; Millet, Paul; Tikare, Veena; Radhakrishnan, Balasubramaniam; Andersson , David

    2012-04-11

    A study was conducted to evaluate the capabilities of different numerical methods used to represent microstructure behavior at the mesoscale for irradiated material using an idealized benchmark problem. The purpose of the mesoscale benchmark problem was to provide a common basis to assess several mesoscale methods with the objective of identifying the strengths and areas of improvement in the predictive modeling of microstructure evolution. In this work, mesoscale models (phase-field, Potts, and kinetic Monte Carlo) developed by PNNL, INL, SNL, and ORNL were used to calculate the evolution kinetics of intra-granular fission gas bubbles in UO2 fuel under post-irradiation thermal annealing conditions. The benchmark problem was constructed to include important microstructural evolution mechanisms on the kinetics of intra-granular fission gas bubble behavior such as the atomic diffusion of Xe atoms, U vacancies, and O vacancies, the effect of vacancy capture and emission from defects, and the elastic interaction of non-equilibrium gas bubbles. An idealized set of assumptions was imposed on the benchmark problem to simplify the mechanisms considered. The capability and numerical efficiency of different models are compared against selected experimental and simulation results. These comparisons find that the phase-field methods, by the nature of the free energy formulation, are able to represent a larger subset of the mechanisms influencing the intra-granular bubble growth and coarsening mechanisms in the idealized benchmark problem as compared to the Potts and kinetic Monte Carlo methods. It is recognized that the mesoscale benchmark problem as formulated does not specifically highlight the strengths of the discrete particle modeling used in the Potts and kinetic Monte Carlo methods. Future efforts are recommended to construct increasingly more complex mesoscale benchmark problems to further verify and validate the predictive capabilities of the mesoscale modeling

  4. A spatio-temporal evaluation of the WRF physical parameterisations for numerical rainfall simulation in semi-humid and semi-arid catchments of Northern China

    Science.gov (United States)

    Tian, Jiyang; Liu, Jia; Wang, Jianhua; Li, Chuanzhe; Yu, Fuliang; Chu, Zhigang

    2017-07-01

    Mesoscale Numerical Weather Prediction systems can provide rainfall products at high resolutions in space and time, playing an increasingly more important role in water management and flood forecasting. The Weather Research and Forecasting (WRF) model is one of the most popular mesoscale systems and has been extensively used in research and practice. However, for hydrologists, an unsolved question must be addressed before each model application in a different target area. That is, how are the most appropriate combinations of physical parameterisations from the vast WRF library selected to provide the best downscaled rainfall? In this study, the WRF model was applied with 12 designed parameterisation schemes with different combinations of physical parameterisations, including microphysics, radiation, planetary boundary layer (PBL), land-surface model (LSM) and cumulus parameterisations. The selected study areas are two semi-humid and semi-arid catchments located in the Daqinghe River basin, Northern China. The performance of WRF with different parameterisation schemes is tested for simulating eight typical 24-h storm events with different evenness in space and time. In addition to the cumulative rainfall amount, the spatial and temporal patterns of the simulated rainfall are evaluated based on a two-dimensional composed verification statistic. Among the 12 parameterisation schemes, Scheme 4 outperforms the other schemes with the best average performance in simulating rainfall totals and temporal patterns; in contrast, Scheme 6 is generally a good choice for simulations of spatial rainfall distributions. Regarding the individual parameterisations, Single-Moment 6 (WSM6), Yonsei University (YSU), Kain-Fritsch (KF) and Grell-Devenyi (GD) are better choices for microphysics, planetary boundary layers (PBL) and cumulus parameterisations, respectively, in the study area. These findings provide helpful information for WRF rainfall downscaling in semi-humid and semi

  5. Mesoscale Frontogenesis: An Analysis of Two Cold Front Case Studies

    Science.gov (United States)

    1993-01-01

    Fragmentary excerpts from the full text: the report discusses the boundary of warm air, or "warm sector", in the cyclone model further developed by Bjerknes and Solberg (1922) and Bergeron (1928); figure-caption residue describes wind vectors representing 25 m s-1, relative humidity greater than 80% indicated by gray shading, and frontal zones marked with solid black lines; a citation fragment refers to Zuckerberg, J.T. Schaefer, and G.E. Rasch, 1986: Forecast problems: The meteorological and operational factors, in Mesoscale Meteorology and Forecasting.

  6. Time efficiency and diagnostic accuracy of new automated myocardial perfusion analysis software in 320-row CT cardiac imaging

    Energy Technology Data Exchange (ETDEWEB)

    Rief, Matthias; Stenzei, Fabian; Kranz, Anisha; Schlattmann, Peter; Dewey, Marc [Dept. of Radiology, Charité - Universitätsmedizin Berlin, Berlin (Germany)

    2013-01-15

    We aimed to evaluate the time efficiency and diagnostic accuracy of automated myocardial computed tomography perfusion (CTP) image analysis software. 320-row CTP was performed in 30 patients, and analyses were conducted independently by three different blinded readers using two recent software releases (version 4.6 and novel version 4.71GR001, Toshiba, Tokyo, Japan). Analysis times were compared, and automated epi- and endocardial contour detection was subjectively rated in five categories (excellent, good, fair, poor and very poor). As semi-quantitative perfusion parameters, myocardial attenuation and transmural perfusion ratio (TPR) were calculated for each myocardial segment and agreement was tested by using the intraclass correlation coefficient (ICC). Conventional coronary angiography served as the reference standard. The analysis time was significantly reduced with the novel automated software version as compared with the former release (Reader 1: 43:08 ± 11:39 min vs. 09:47 ± 04:51 min, Reader 2: 42:07 ± 06:44 min vs. 09:42 ± 02:50 min and Reader 3: 21:38 ± 3:44 min vs. 07:34 ± 02:12 min; p < 0.001 for all). Epi- and endocardial contour detection for the novel software was rated to be significantly better (p < 0.001) than with the former software. ICCs demonstrated strong agreement (≥ 0.75) for myocardial attenuation in 93% and for TPR in 82%. Diagnostic accuracy for the two software versions was not significantly different (p = 0.169) as compared with conventional coronary angiography. The novel automated CTP analysis software offers enhanced time efficiency with an improvement by a factor of about four, while maintaining diagnostic accuracy.

  7. Automated assembly of micro mechanical parts in a Microfactory setup

    DEFF Research Database (Denmark)

    Eriksson, Torbjörn Gerhard; Hansen, Hans Nørgaard; Gegeckaite, Asta

    2006-01-01

    Many micro products in use today are manufactured using semi-automatic assembly. Handling, assembly and transport of the parts are especially labour-intensive processes. Automation of these processes holds a large potential, especially if flexible, modular microfactories can be developed. This paper focuses on the issues that have to be taken into consideration in order to go from semi-automatic production to an automated microfactory. The application in this study is a switch consisting of 7 parts. The development of a microfactory setup to take care of the automated assembly of the switch...

  8. Automating dChip: toward reproducible sharing of microarray data analysis

    Directory of Open Access Journals (Sweden)

    Li Cheng

    2008-05-01

    Full Text Available Abstract Background During the past decade, many software packages have been developed for analysis and visualization of various types of microarrays. We have developed and maintained the widely used dChip as a microarray analysis software package accessible to both biologists and data analysts. However, challenges arise when dChip users want to analyze large numbers of arrays automatically and share data analysis procedures and parameters. Improvement is also needed when the dChip user support team tries to identify the causes of reported analysis errors or bugs from users. Results We report here the implementation and application of the dChip automation module. Through this module, dChip automation files can be created to include menu steps, parameters, and data viewpoints to run automatically. A data-packaging function allows convenient transfer from one user to another of the dChip software, microarray data, and analysis procedures, so that the second user can reproduce the entire analysis session of the first user. An analysis report file can also be generated during an automated run, including analysis logs, user comments, and viewpoint screenshots. Conclusion The dChip automation module is a step toward reproducible research, and it can prompt a more convenient and reproducible mechanism for sharing microarray software, data, and analysis procedures and results. Automation data packages can also be used as publication supplements. Similar automation mechanisms could be valuable to the research community if implemented in other genomics and bioinformatics software packages.

  9. AUTOMATED BATCH CHARACTERIZATION OF ICF SHELLS WITH VISION-ENABLED OPTICAL MICROSCOPE SYSTEM

    International Nuclear Information System (INIS)

    HUANG, H.; STEPHENS, R.B.; HILL, D.W.; LYON, C.; NIKROO, A.; STEINMAN, D.A.

    2003-09-01

    Inertial Confinement Fusion (ICF) shells are mesoscale objects with nano-scale dimensional and surface-finish requirements. Currently, the shell dimensions are measured by white-light interferometry and an image analysis method. These two methods complement each other and give a rather complete data set on a single shell. The process is, however, labor intensive. The authors have developed an automation routine to fully characterize a shell in one shot and perform unattended batch measurements. The method is useful to the ICF program both for production screening and for full characterization. It also has potential for an Inertial Fusion Energy (IFE) power plant, where half a million shells need to be processed daily.

  10. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    Directory of Open Access Journals (Sweden)

    Jingshan Huang

    Full Text Available As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.

  11. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    Science.gov (United States)

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A; Natale, Darren A; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.

  12. OMIT: Dynamic, Semi-Automated Ontology Development for the microRNA Domain

    Science.gov (United States)

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M.; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A.; Natale, Darren A.; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology. PMID:25025130

  13. Automated concept and relationship extraction for the semi-automated ontology management (SEAM) system.

    Science.gov (United States)

    Doing-Harris, Kristina; Livnat, Yarden; Meystre, Stephane

    2015-01-01

    We develop medical-specialty specific ontologies that contain the settled science and common term usage. We leverage current practices in information and relationship extraction to streamline the ontology development process. Our system combines different text types with information and relationship extraction techniques in a low overhead modifiable system. Our SEmi-Automated ontology Maintenance (SEAM) system features a natural language processing pipeline for information extraction. Synonym and hierarchical groups are identified using corpus-based semantics and lexico-syntactic patterns. The semantic vectors we use are term frequency by inverse document frequency and context vectors. Clinical documents contain the terms we want in an ontology. They also contain idiosyncratic usage and are unlikely to contain the linguistic constructs associated with synonym and hierarchy identification. By including both clinical and biomedical texts, SEAM can recommend terms from those appearing in both document types. The set of recommended terms is then used to filter the synonyms and hierarchical relationships extracted from the biomedical corpus. We demonstrate the generality of the system across three use cases: ontologies for acute changes in mental status, Medically Unexplained Syndromes, and echocardiogram summary statements. Across the three use cases, we held the number of recommended terms relatively constant by changing SEAM's parameters. Experts seem to find more than 300 recommended terms to be overwhelming. The approval rate of recommended terms increased as the number and specificity of clinical documents in the corpus increased. It was 60% when there were 199 clinical documents that were not specific to the ontology domain and 90% when there were 2879 documents very specific to the target domain. We found that fewer than 100 recommended synonym groups were also preferred. Approval rates for synonym recommendations remained low, varying from 43% to 25% as the
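    The corpus-based semantics mentioned above rely on term frequency by inverse document frequency (TF-IDF) weighting. The sketch below illustrates TF-IDF vectors and cosine similarity on a toy corpus with scikit-learn; it is a generic illustration of the weighting scheme, not the SEAM implementation, and the documents are invented.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Toy corpus standing in for the clinical/biomedical documents used by SEAM
documents = [
    "patient presents with acute confusion and delirium",
    "delirium is an acute change in mental status",
    "echocardiogram shows normal left ventricular function",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(documents)

# Cosine similarity between documents; high similarity suggests shared terminology
similarity = cosine_similarity(doc_vectors)
print(similarity.round(2))

# Terms appearing across the corpus are candidate ontology terms
print(sorted(vectorizer.get_feature_names_out()))
```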

  14. 21 CFR 864.5240 - Automated blood cell diluting apparatus.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated blood cell diluting apparatus. 864.5240 Section 864.5240 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL DEVICES HEMATOLOGY AND PATHOLOGY DEVICES Automated and Semi-Automated Hematology Devices...

  15. Mechatronic Design Automation

    DEFF Research Database (Denmark)

    Fan, Zhun

    The approach has been used to successfully design analogue filters, vibration absorbers, micro-electro-mechanical systems, and vehicle suspension systems, all in an automatic or semi-automatic way. The work also investigates the very important issue of co-designing plant structures and dynamic controllers in the automated design of mechatronic...

  16. An optimized method for automated analysis of algal pigments by HPLC

    NARCIS (Netherlands)

    van Leeuwe, M. A.; Villerius, L. A.; Roggeveld, J.; Visser, R. J. W.; Stefels, J.

    2006-01-01

    A recent development in algal pigment analysis by high-performance liquid chromatography (HPLC) is the application of automation. An optimization of a complete sampling and analysis protocol applied specifically in automation has not yet been performed. In this paper we show that automation can only

  17. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  18. A standard analysis method (SAM) for the automated analysis of polychlorinated biphenyls (PCBs) in soils using the chemical analysis automation (CAA) paradigm: validation and performance

    International Nuclear Information System (INIS)

    Rzeszutko, C.; Johnson, C.R.; Monagle, M.; Klatt, L.N.

    1997-10-01

    The Chemical Analysis Automation (CAA) program is developing a standardized modular automation strategy for chemical analysis. In this automation concept, analytical chemistry is performed with modular building blocks that correspond to individual elements of the steps in the analytical process. With a standardized set of behaviors and interactions, these blocks can be assembled in a 'plug and play' manner into a complete analysis system. These building blocks, which are referred to as Standard Laboratory Modules (SLM), interface to a host control system that orchestrates the entire analytical process, from sample preparation through data interpretation. The integrated system is called a Standard Analysis Method (SAME). A SAME for the automated determination of Polychlorinated Biphenyls (PCB) in soils, assembled in a mobile laboratory, is undergoing extensive testing and validation. The SAME consists of the following SLMs: a four channel Soxhlet extractor, a High Volume Concentrator, column clean up, a gas chromatograph, a PCB data interpretation module, a robot, and a human-computer interface. The SAME is configured to meet the requirements specified in U.S. Environmental Protection Agency's (EPA) SW-846 Methods 3541/3620A/8082 for the analysis of PCBs in soils. The PCB SAME will be described along with the developmental test plan. Performance data obtained during developmental testing will also be discussed

  19. North Pacific Mesoscale Coupled Air-Ocean Simulations Compared with Observations

    Energy Technology Data Exchange (ETDEWEB)

    Cerovecki, Ivana [Univ. of California, San Diego, CA (United States). Scripps Inst. of Oceanography; McClean, Julie [Univ. of California, San Diego, CA (United States). Scripps Inst. of Oceanography; Koracin, Darko [Desert Research Inst. (DRI), Reno, NV (United States). Division of Atmospheric Sciences

    2014-11-14

    The overall objective of this study was to improve the representation of regional ocean circulation in the North Pacific by using high resolution atmospheric forcing that accurately represents mesoscale processes in ocean-atmosphere regional (North Pacific) model configuration. The goal was to assess the importance of accurate representation of mesoscale processes in the atmosphere and the ocean on large scale circulation. This is an important question, as mesoscale processes in the atmosphere which are resolved by the high resolution mesoscale atmospheric models such as Weather Research and Forecasting (WRF), are absent in commonly used atmospheric forcing such as CORE forcing, employed in e.g. the Community Climate System Model (CCSM).

  20. Design of a robotic automation system for transportation of goods in hospitals

    DEFF Research Database (Denmark)

    Özkil, Ali Gürcan; Sørensen, Torben; Fan, Zhun

    2007-01-01

    Hospitals face heavy traffic of goods every day, and the transportation tasks are mainly carried out by humans. Analysis of the current transportation situation in a typical hospital showed that several transportation tasks are suitable for automation. This paper presents a system consisting of a fleet of robot vehicles, automatic stations and smart containers for the automation of goods transportation in hospitals. The design of the semi-autonomous robot vehicles, containers and stations is presented and the overall system architecture is described. Implementing such a system in an existing hospital showed...

  1. From Quanta to the Continuum: Opportunities for Mesoscale Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); Sarrao, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alivisatos, Paul [Univ. of California, Berkeley, CA (United States); Barletta, William [MIT (Massachusetts Inst. of Technology), Cambridge, MA (United States); Bates, Frank [Univ. of Minnesota, Minneapolis, MN (United States); Brown, Gordon [Stanford Univ., CA (United States); French, Roger [Case Western Reserve Univ., Cleveland, OH (United States); Greene, Laura [Univ. of Illinois, Urbana, IL (United States); Hemminger, John [Univ. of California, Irvine, CA (United States); Kastner, Marc [MIT (Massachusetts Inst. of Technology), Cambridge, MA (United States); Kay, Bruce [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lewis, Jennifer [Univ. of Illinois, Urbana, IL (United States); Ratner, Mark [Northwestern Univ., Evanston, IL (United States); Anthony, Rollett [Carnegie Mellon Univ., Pittsburgh, PA (United States); Rubloff, Gary [University of Maryland, College Park, MD (United States); Spence, John [Arizona State Univ., Mesa, AZ (United States); Tobias, Douglas [Univ. of California, Irvine, CA (United States); Tranquada, John [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2012-09-01

    This report explores the opportunity and defines the research agenda for mesoscale science—discovering, understanding, and controlling interactions among disparate systems and phenomena to reach the full potential of materials complexity and functionality. The ability to predict and control mesoscale phenomena and architectures is essential if atomic and molecular knowledge is to blossom into a next generation of technology opportunities, societal benefits, and scientific advances. The body of this report outlines the need, the opportunities, the challenges, and the benefits of mastering mesoscale science.

  2. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with a shorter turnaround time for their ever-increasing workload. This article discusses the various issues involved in the process.

  3. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1986-01-01

    An automated procedure for performing sensitivity analysis has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies
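    The idea of letting the compiler propagate derivatives alongside the original computation, as GRESS does for FORTRAN, can be mimicked with forward-mode automatic differentiation. The dual-number class below is a generic sketch of that principle applied to a toy model, not the GRESS system itself.

```python
class Dual:
    """Minimal forward-mode automatic differentiation: each value carries
    its derivative with respect to a chosen input parameter."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value * other.value,
                    self.deriv * other.value + self.value * other.deriv)
    __rmul__ = __mul__

def model(k):
    """Toy 'computer model' output as a function of an uncertain parameter k."""
    return 3.0 * k * k + 2.0 * k + 1.0

# Sensitivity of the model output to k at k = 2 (expected derivative: 6k + 2 = 14)
k = Dual(2.0, 1.0)
out = model(k)
print(out.value, out.deriv)   # 17.0 14.0
```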

  4. Assessment of tobacco smoke effects on neonatal cardiorespiratory control using a semi-automated processing approach.

    Science.gov (United States)

    Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy

    2018-05-10

    A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) on the cardiorespiratory control of newborn lambs. The system consists of several steps beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiograph and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The unique results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early life challenges, such as prematurity or viral infection.

  5. Semi-automated categorization of open-ended questions

    Directory of Open Access Journals (Sweden)

    Matthias Schonlau

    2016-08-01

    Full Text Available Text data from open-ended questions in surveys are difficult to analyze and are frequently ignored. Yet open-ended questions are important because they do not constrain respondents’ answer choices. Where open-ended questions are necessary, sometimes multiple human coders hand-code answers into one of several categories. At the same time, computer scientists have made impressive advances in text mining that may allow automation of such coding. Automated algorithms do not achieve an overall accuracy high enough to entirely replace humans. We categorize open-ended questions soliciting narrative responses using text mining for easy-to-categorize answers and humans for the remainder using expected accuracies to guide the choice of the threshold delineating between “easy” and “hard”. Employing multinomial boosting avoids the common practice of converting machine learning “confidence scores” into pseudo-probabilities. This approach is illustrated with examples from open-ended questions related to respondents’ advice to a patient in a hypothetical dilemma, a follow-up probe related to respondents’ perception of disclosure/privacy risk, and from a question on reasons for quitting smoking from a follow-up survey from the Ontario Smoker’s Helpline. Targeting 80% combined accuracy, we found that 54%-80% of the data could be categorized automatically in research surveys.
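    The routing rule described above, accept a category automatically when the model is confident enough and otherwise send the answer to a human coder, can be written down directly. The classifier, the probability threshold and the example answers below are placeholders; the paper's multinomial boosting model is not reproduced.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labelled answers to an open-ended question (invented)
texts = ["quit for my health", "doctor told me to stop", "too expensive",
         "cost of cigarettes", "kids asked me to quit", "family pressure"]
labels = ["health", "health", "cost", "cost", "family", "family"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(texts, labels)

def route(answer, threshold=0.7):
    """Auto-code answers the model is confident about; flag the rest for a human."""
    probs = model.predict_proba([answer])[0]
    best = probs.argmax()
    if probs[best] >= threshold:
        return model.classes_[best], "automatic"
    return None, "send to human coder"

print(route("I stopped because of my doctor"))
print(route("it just felt like the right time"))
```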

  6. Assessment of Automated Data Analysis Application on VVER Steam Generator Tubing

    International Nuclear Information System (INIS)

    Picek, E.; Barilar, D.

    2006-01-01

    INETEC - Institute for Nuclear Technology has developed a software package named EddyOne that includes an option for automated analysis of bobbin coil eddy current data. During its development and on-site use, some features were noticed that prevent the wide use of automatic analysis on VVER SG data. This article discusses these specific problems and evaluates possible solutions. With regard to the current state of automated analysis technology, an overview of the advantages and disadvantages of automated analysis on VVER SG is summarized as well. (author)

  7. Automated Analysis of Infinite Scenarios

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2005-01-01

    The security of a network protocol crucially relies on the scenario in which the protocol is deployed. This paper describes syntactic constructs for modelling network scenarios and presents an automated analysis tool, which can guarantee that security properties hold in all of the (infinitely many...

  8. Complex Domains Call for Automation but Automation Requires More Knowledge and Learning

    DEFF Research Database (Denmark)

    Madsen, Erik Skov; Mikkelsen, Lars Lindegaard

    The studies investigate the operation and automation of oil and gas production in the North Sea. Semi-structured interviews, surveys, and observations are the main methods used. The paper provides a novel conceptual framework around which management may generate discussions about productivity and the need...

  9. Automated analysis of slitless spectra. II. Quasars

    International Nuclear Information System (INIS)

    Edwards, G.; Beauchemin, M.; Borra, F.

    1988-01-01

    Automated software has been developed to process slitless spectra. The software, described in a previous paper, automatically separates stars from extended objects and quasars from stars. This paper describes the quasar search techniques and discusses the results. The performance of the software is compared and calibrated with a plate taken in a region of SA 57 that has been extensively surveyed by others using a variety of techniques: the proposed automated software performs very well. It is found that an eye search of the same plate is less complete than the automated search: surveys that rely on eye searches suffer from incompleteness starting at least a magnitude brighter than the plate limit. It is shown how the complete automated analysis of a plate and computer simulations are used to calibrate and understand the characteristics of the present data. 20 references

  10. Methods for semi-automated indexing for high precision information retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contributes substantially to increased indexing speed and accuracy.

  11. Semi-automated preparation of a 11C-labelled antibiotic - [N-methyl-11C]erythromycin A lactobionate

    International Nuclear Information System (INIS)

    Pike, V.W.; Palmer, A.J.; Horlock, P.L.; Liss, R.H.

    1984-01-01

    A fast semi-automated method is described for labelling the antibiotic, erythromycin A (1), with the short-lived positron-emitting radionuclide 11C (t1/2 = 20.4 min), in order to permit the non-invasive study of its tissue uptake in vivo. Labelling was achieved by the fast reductive methylation of N-demethylerythromycin A (2) with [11C]formaldehyde, itself prepared from cyclotron-produced [11C]carbon dioxide. Rapid chemical and radiochemical purification of the [N-methyl-11C]erythromycin A (3) was achieved by HPLC and verified by TLC with autoradiography. The purified material was formulated for human i.v. injection as a sterile apyrogenic solution of the lactobionate salt. The preparation takes 42 min from the end of radionuclide production and produces, from [11C]carbon dioxide, [N-methyl-11C]erythromycin A lactobionate in 4-12% radiochemical yield, corrected for radioactive decay. (author)

  12. Combined process automation for large-scale EEG analysis.

    Science.gov (United States)

    Sfondouris, John L; Quebedeaux, Tabitha M; Holdgraf, Chris; Musto, Alberto E

    2012-01-01

    Epileptogenesis is a dynamic process producing increased seizure susceptibility. Electroencephalography (EEG) data provides information critical in understanding the evolution of epileptiform changes throughout epileptic foci. We designed an algorithm to facilitate efficient large-scale EEG analysis via linked automation of multiple data processing steps. Using EEG recordings obtained from electrical stimulation studies, the following steps of EEG analysis were automated: (1) alignment and isolation of pre- and post-stimulation intervals, (2) generation of user-defined band frequency waveforms, (3) spike-sorting, (4) quantification of spike and burst data and (5) power spectral density analysis. This algorithm allows for quicker, more efficient EEG analysis. Copyright © 2011 Elsevier Ltd. All rights reserved.
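    Several of the linked steps listed above (band filtering, spike detection, power spectral density analysis) map onto standard scipy.signal calls. The sketch below chains them on a synthetic trace; the sampling rate, band edges and spike threshold are arbitrary choices, not those of the published algorithm.

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 500.0                                     # sampling rate, Hz (assumed)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
eeg = np.sin(2 * np.pi * 8 * t) + 0.5 * rng.standard_normal(t.size)   # synthetic trace

# (2) user-defined band frequency waveform, e.g. a 4-12 Hz band
b, a = butter(4, [4 / (fs / 2), 12 / (fs / 2)], btype="band")
band = filtfilt(b, a, eeg)

# (3) naive spike detection: samples exceeding a multiple of the standard deviation
threshold = 4 * band.std()
spike_idx = np.flatnonzero(np.abs(band) > threshold)

# (5) power spectral density of the filtered signal
freqs, psd = welch(band, fs=fs, nperseg=1024)

print(f"{spike_idx.size} candidate spikes, PSD peak near {freqs[psd.argmax()]:.1f} Hz")
```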

  13. Automating sensitivity analysis of computer models using computer calculus

    International Nuclear Information System (INIS)

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with 'direct' and 'adjoint' sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs

  14. Accuracy and Feasibility of Estimated Tumour Volumetry in Primary Gastric Gastrointestinal Stromal Tumours: Validation Using Semi-automated Technique in 127 Patients

    Science.gov (United States)

    Tirumani, Sree Harsha; Shinagare, Atul B.; O’Neill, Ailbhe C.; Nishino, Mizuki; Rosenthal, Michael H.; Ramaiya, Nikhil H.

    2015-01-01

    Objective To validate estimated tumour volumetry in primary gastric gastrointestinal stromal tumours (GISTs) using semi-automated volumetry. Materials and Methods In this IRB-approved retrospective study, we measured the three longest diameters in x, y, z axes on CTs of primary gastric GISTs in 127 consecutive patients (52 women, 75 men, mean age: 61 years) at our institute between 2000 and 2013. Segmented volumes (Vsegmented) were obtained using commercial software by two radiologists. Estimated volumes (V1–V6) were obtained using formulae for spheres and ellipsoids. Intra- and inter-observer agreement of Vsegmented and agreement of V1–6 with Vsegmented were analysed with concordance correlation coefficients (CCC) and Bland-Altman plots. Results Median Vsegmented and V1–V6 were 75.9 cm3, 124.9 cm3, 111.6 cm3, 94.0 cm3, 94.4 cm3, 61.7 cm3 and 80.3 cm3 respectively. There was strong intra- and inter-observer agreement for Vsegmented. Agreement with Vsegmented was highest for V6 (scalene ellipsoid, x≠y≠z), with CCC of 0.96 [95%CI: 0.95–0.97]. Mean relative difference was smallest for V6 (0.6%), while it was −19.1% for V5, +14.5% for V4, +17.9% for V3, +32.6% for V2 and +47% for V1. Conclusion Ellipsoidal approximations of volume using three measured axes may be used to closely estimate Vsegmented when semi-automated techniques are unavailable. PMID:25991487
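    The best-performing estimate above (V6, the scalene ellipsoid) follows directly from the three measured diameters; the sketch below shows the formula with made-up diameters, not values from the study.

    ```python
    import math

    def ellipsoid_volume(dx, dy, dz):
        """Scalene ellipsoid volume from three orthogonal diameters: V = (pi/6) * x * y * z."""
        return math.pi / 6.0 * dx * dy * dz

    def sphere_volume(d):
        """Single-diameter sphere estimate: V = (pi/6) * d**3."""
        return math.pi / 6.0 * d ** 3

    # Hypothetical tumour diameters in cm (illustrative only)
    print(round(ellipsoid_volume(6.0, 5.0, 4.5), 1))   # ~70.7 cm^3
    print(round(sphere_volume(6.0), 1))                # ~113.1 cm^3: the single-axis overestimate
    ```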

  15. A Quality-Control-Oriented Database for a Mesoscale Meteorological Observation Network

    Science.gov (United States)

    Lussana, C.; Ranci, M.; Uboldi, F.

    2012-04-01

    In the operational context of a local weather service, data accessibility and quality-related issues must be managed by taking into account a wide set of user needs. This work describes the structure and the operational choices made for the operational implementation of a database system storing data from highly automated observing stations, metadata and information on data quality. Lombardy's environmental protection agency, ARPA Lombardia, manages a highly automated mesoscale meteorological network. A Quality Assurance System (QAS) ensures that reliable observational information is collected and disseminated to the users. The weather unit in ARPA Lombardia, at the same time an important QAS component and an intensive data user, has developed a database specifically aimed at: 1) providing quick access to data for operational activities and 2) ensuring data quality for real-time applications, by means of an Automatic Data Quality Control (ADQC) procedure. Quantities stored in the archive include hourly aggregated observations of precipitation amount, temperature, wind, relative humidity, pressure, and global and net solar radiation. The ADQC performs several independent tests on raw data and compares their results in a decision-making procedure. An important ADQC component is the Spatial Consistency Test based on Optimal Interpolation. Interpolated and cross-validation analysis values are also stored in the database, providing further information to human operators and useful estimates in case of missing data. The technical solution adopted is based on a LAMP (Linux, Apache, MySQL and PHP) system, constituting an open source environment suitable for both development and operational practice. The ADQC procedure itself is performed by R scripts directly interacting with the MySQL database. Users and network managers can access the database by using a set of web-based PHP applications.

  16. Experimental Study on Meso-Scale Milling Process Using Nanofluid Minimum Quantity Lubrication

    International Nuclear Information System (INIS)

    Lee, P. H.; Nam, T. S.; Li, Cheng Jun; Lee, S. W.

    2010-01-01

    This paper presents the characteristics of micro- and meso-scale milling processes in which compressed cold air, minimum quantity lubrication (MQL) and MoS2 nanofluid MQL are used. For process characterization, the micro- and meso-scale milling experiments are conducted using a desktop meso-scale machine tool system and the surface roughness is measured. The experimental results show that the use of compressed chilly air and nanofluid MQL in the micro- and meso-scale milling processes is effective in improving the surface finish.

  17. Semi-parametrical NAA method for paper analysis

    International Nuclear Information System (INIS)

    Medeiros, Ilca M.M.A.; Zamboni, Cibele B.; Cruz, Manuel T.F. da; Morel, Jose C.O.; Park, Song W.

    2007-01-01

    The semi-parametric Neutron Activation Analysis technique, using Au as a flux monitor, was applied to determine element concentrations in commercially available white paper, with the aim of supporting quality control of its production in the industrial process. (author)

  18. Mesoscale modeling of solute precipitation and radiation damage

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ke, Huibin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Univ. of Wisconsin, Madison, WI (United States); Bai, Xianming [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hales, Jason [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes the low length scale effort during FY 2014 in developing mesoscale capabilities for microstructure evolution in reactor pressure vessels. During operation, reactor pressure vessels are subject to hardening and embrittlement caused by irradiation-induced defect accumulation and irradiation-enhanced solute precipitation. Both defect production and solute precipitation start from the atomic scale, and manifest their eventual effects as degradation in engineering-scale properties. To predict the property degradation, multiscale modeling and simulation are needed to deal with the microstructure evolution, and to link the microstructure feature to material properties. In this report, the development of mesoscale capabilities for defect accumulation and solute precipitation are summarized. Atomic-scale efforts that supply information for the mesoscale capabilities are also included.

  19. The terminological performance of the Information Science descriptors of the SIBi/USP Controlled Vocabulary in manual, automatic and semi-automatic indexing processes

    Directory of Open Access Journals (Sweden)

    Vania Mara Alves Lima

    Full Text Available The terminological performance of the descriptors of the SIBi/USP Controlled Vocabulary that represent the Information Science domain was evaluated in manual, automatic and semi-automatic indexing processes. It was concluded that, in order to adequately represent the content of the indexed corpus, the current Information Science descriptors of the SIBi/USP Controlled Vocabulary must be expanded and contextualized through terminological definitions, so as to meet the information needs of their users.

  20. Automated migration analysis based on cell texture: method & reliability

    Directory of Open Access Journals (Sweden)

    Chittenden Thomas W

    2005-03-01

    Full Text Available Abstract Background In this paper, we present and validate a way to measure automatically the extent of cell migration based on automated examination of a series of digital photographs. It was designed specifically to identify the impact of Second Hand Smoke (SHS) on endothelial cell migration but has broader applications. The analysis has two stages: (1) preprocessing of image texture, and (2) migration analysis. Results The output is a graphic overlay that indicates the front lines of cell migration superimposed on each original image, with automated reporting of the distance traversed vs. time. Expert comparison with manual placement of the leading edge shows complete equivalence of automated vs. manual leading edge definition for cell migration measurement. Conclusion Our method is indistinguishable from careful manual determinations of cell front lines, with the advantages of full automation, objectivity, and speed.

  1. Automated spectral and timing analysis of AGNs

    Science.gov (United States)

    Munz, F.; Karas, V.; Guainazzi, M.

    2006-12-01

    We have developed an autonomous script that helps the user to automate the XMM-Newton data analysis for the purposes of extensive statistical investigations. We test this approach by examining X-ray spectra of bright AGNs pre-selected from the public database. The event lists extracted in this process were studied further by constructing their energy-resolved Fourier power-spectrum density. This analysis combines energy distributions, light-curves, and their power-spectra, and it proves useful to assess the variability patterns present in the data. As another example, an automated search was based on the XSPEC package to reveal the emission features in the 2-8 keV range.

  2. Impact of SLA assimilation in the Sicily Channel Regional Model: model skills and mesoscale features

    Directory of Open Access Journals (Sweden)

    A. Olita

    2012-07-01

    Full Text Available The impact of the assimilation of MyOcean sea level anomaly along-track data on the analyses of the Sicily Channel Regional Model was studied. The numerical model has a resolution of 1/32° and is capable of reproducing mesoscale and sub-mesoscale features. The impact of the SLA assimilation is studied by comparing a simulation (SIM), which does not assimilate data, with an analysis (AN) assimilating SLA along-track multi-mission data produced in the framework of the MyOcean project. The quality of the analysis was evaluated by computing the RMSE of the misfits between analysis background and observations (sea level before assimilation). A qualitative evaluation of the ability of the analyses to reproduce mesoscale structures is accomplished by comparing model results with ocean colour and SST satellite data, able to detect such features on the ocean surface. CTD profiles allowed us to evaluate the impact of the SLA assimilation along the water column. We found a significant improvement for the AN solution in terms of SLA RMSE with respect to SIM (the averaged RMSE of AN SLA misfits over 2 years is about 0.5 cm smaller than SIM). Comparison with CTD data shows a questionable improvement produced by the assimilation process in terms of vertical features: AN is better in temperature while for salinity it gets worse than SIM at the surface. This suggests that a better a-priori description of the vertical error covariances would be desirable. The qualitative comparison of simulation and analyses with synoptic satellite independent data proves that SLA assimilation allows some dynamical features (above all the circulation in the Ionian portion of the domain) and mesoscale structures otherwise misplaced or neglected by SIM to be correctly reproduced. Such mesoscale changes also imply that the eddy momentum fluxes (i.e. Reynolds stresses) show major changes in the Ionian area. Changes in Reynolds stresses reflect a different pumping of eastward momentum from the eddy to

  3. The contaminant analysis automation robot implementation for the automated laboratory

    International Nuclear Information System (INIS)

    Younkin, J.R.; Igou, R.E.; Urenda, T.D.

    1995-01-01

    The Contaminant Analysis Automation (CAA) project defines the automated laboratory as a series of standard laboratory modules (SLM) serviced by a robotic standard support module (SSM). These SLMs are designed to allow plug-and-play integration into automated systems that perform standard analysis methods (SAM). While the SLMs are autonomous in the execution of their particular chemical processing task, the SAM concept relies on a high-level task sequence controller (TSC) to coordinate the robotic delivery of materials requisite for SLM operations, initiate an SLM operation with the chemical method dependent operating parameters, and coordinate the robotic removal of materials from the SLM when its operation is complete. A set of commands and events has been established to ready the SLMs for transport operations. The Supervisor and Subsystems (GENISAS) software governs events from the SLMs and the robot. The Intelligent System Operating Environment (ISOE) enables the inter-process communications used by GENISAS. CAA selected the Hewlett-Packard Optimized Robot for Chemical Analysis (ORCA) and its associated Windows-based Methods Development Software (MDS) as the robot SSM. The MDS software is used to teach the robot each SLM position and required material port motions. To allow the TSC to command these SLM motions, a hardware and software implementation was required that allowed message passing between different operating systems. This implementation involved the use of a Virtual Memory Extended (VME) rack with a Force CPU-30 computer running VxWorks, a real-time multitasking operating system, and a RadiSys PC-compatible VME computer running MDS. A GENISAS server on the Force computer accepts a transport command from the TSC, a GENISAS supervisor, over Ethernet and notifies software on the RadiSys PC of the pending command through VMEbus shared memory. The command is then delivered to the MDS robot control software using a Windows Dynamic Data Exchange conversation.

  4. Clinical feasibility of a myocardial signal intensity threshold-based semi-automated cardiac magnetic resonance segmentation method

    Energy Technology Data Exchange (ETDEWEB)

    Varga-Szemes, Akos; Schoepf, U.J.; Suranyi, Pal; De Cecco, Carlo N.; Fox, Mary A. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Muscogiuri, Giuseppe [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome "Sapienza", Department of Medical-Surgical Sciences and Translational Medicine, Rome (Italy); Wichmann, Julian L. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University Hospital Frankfurt, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Cannao, Paola M. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Milan, Scuola di Specializzazione in Radiodiagnostica, Milan (Italy); Renker, Matthias [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Kerckhoff Heart and Thorax Center, Bad Nauheim (Germany); Mangold, Stefanie [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Eberhard-Karls University Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany); Ruzsics, Balazs [Royal Liverpool and Broadgreen University Hospitals, Department of Cardiology, Liverpool (United Kingdom)

    2016-05-15

    To assess the accuracy and efficiency of a threshold-based, semi-automated cardiac MRI segmentation algorithm in comparison with conventional contour-based segmentation and aortic flow measurements. Short-axis cine images of 148 patients (55 ± 18 years, 81 men) were used to evaluate left ventricular (LV) volumes and mass (LVM) using conventional and threshold-based segmentations. Phase-contrast images were used to independently measure stroke volume (SV). LV parameters were evaluated by two independent readers. Evaluation times using the conventional and threshold-based methods were 8.4 ± 1.9 and 4.2 ± 1.3 min, respectively (P < 0.0001). LV parameters measured by the conventional and threshold-based methods, respectively, were end-diastolic volume (EDV) 146 ± 59 and 134 ± 53 ml; end-systolic volume (ESV) 64 ± 47 and 59 ± 46 ml; SV 82 ± 29 and 74 ± 28 ml (flow-based 74 ± 30 ml); ejection fraction (EF) 59 ± 16 and 58 ± 17 %; and LVM 141 ± 55 and 159 ± 58 g. Significant differences between the conventional and threshold-based methods were observed in EDV, ESV, and LVM measurements; SV from threshold-based and flow-based measurements were in agreement (P > 0.05) but were significantly different from conventional analysis (P < 0.05). Excellent inter-observer agreement was observed. Threshold-based LV segmentation provides improved accuracy and faster assessment compared to conventional contour-based methods. (orig.)

  5. Low-level wind response to mesoscale pressure systems

    Science.gov (United States)

    Garratt, J. R.; Physick, W. L.

    1983-09-01

    Observations are presented which show a strong correlation between low-level wind behaviour (e.g., rotation near the surface) and the passage of mesoscale pressure systems. The latter are associated with frontal transition zones, are dominated by a pressure-jump line and a mesoscale high pressure area, and produce locally large horizontal pressure gradients. The wind observations are simulated by specifying a time sequence of perturbation pressure gradient and subsequently solving the vertically-integrated momentum equations with appropriate initial conditions. Very good agreement is found between observed and calculated winds; in particular, (i) a 360 ° rotation in wind on passage of the mesoscale high; (ii) wind-shift lines produced dynamically by the pressure-jump line; (iii) rapid linear increase in wind speed on passage of the pressure jump.

  6. Intense mesoscale variability in the Sardinia Sea

    Science.gov (United States)

    Russo, Aniello; Borrione, Ines; Falchetti, Silvia; Knoll, Michaela; Fiekas, Heinz-Volker; Heywood, Karen; Oddo, Paolo; Onken, Reiner

    2015-04-01

    From 6 to 25 June 2014, the REP14-MED sea trial was conducted by CMRE, supported by 20 partners from six different nations. The at-sea activities were carried out onboard the research vessels Alliance (NATO) and Planet (German Ministry of Defense), covering a marine area of about 110 x 110 km2 to the west of the Sardinian coast. More than 300 CTD casts typically spaced at 10 km were collected; both ships continuously recorded vertical profiles of currents by means of their ADCPs, and a ScanFish® and a CTD chain were towed for almost three days by Alliance and Planet, respectively, following parallel routes. Twelve gliders from different manufacturers (Slocum, SeaGliderTM and SeaExplorer) were continuously sampling the study area following zonal tracks spaced at 10 km. In addition, six moorings, 17 surface drifters and one ARVOR float were deployed. From a first analysis of the observations, several mesoscale features were identified in the survey area, in particular: (i) a warm-core anticyclonic eddy in the southern part of the domain, about 50 km in diameter and with the strongest signal at about 50-m depth; (ii) another warm-core anticyclonic eddy of comparable dimensions in the central part of the domain, but extending to greater depth than the former one; and (iii) a small (less than 15 km in diameter) cold-core cyclonic eddy of Winter Intermediate Water in the depth range between 170 m and 370 m. All three eddies showed intensified currents, up to 50 cm s-1. The huge high-resolution observational data set and the variety of observation techniques enabled the mesoscale features and their variability to be tracked for almost three weeks. In order to obtain a deeper understanding of the mesoscale dynamical behaviour and eddy interactions, assimilation studies with an ocean circulation model are underway.

  7. Application of semi-supervised deep learning to lung sound analysis.

    Science.gov (United States)

    Chamberlain, Daniel; Kodgule, Rahul; Ganelin, Daniela; Miglani, Vivek; Fletcher, Richard Ribon

    2016-08-01

    The analysis of lung sounds, collected through auscultation, is a fundamental component of pulmonary disease diagnostics for primary care and general patient monitoring for telemedicine. Despite advances in computation and algorithms, the goal of automated lung sound identification and classification has remained elusive. Over the past 40 years, published work in this field has demonstrated only limited success in identifying lung sounds, with most published studies using only small numbers of patients. We present a semi-supervised deep learning algorithm for automatically classifying lung sounds from a relatively large number of patients (N=284). Focusing on the two most common lung sounds, wheeze and crackle, we present results from 11,627 sound files recorded from 11 different auscultation locations on these 284 patients with pulmonary disease. 890 of these sound files were labeled to evaluate the model, which is significantly larger than previously published studies. Data was collected with a custom mobile phone application and a low-cost (US$30) electronic stethoscope. On this data set, our algorithm achieves ROC curves with AUCs of 0.86 for wheeze and 0.74 for crackle. Most importantly, this study demonstrates how semi-supervised deep learning can be used with larger data sets without requiring extensive labeling of data.

  8. Semi-automated non-invasive diagnostics method for melanoma differentiation from nevi and pigmented basal cell carcinomas

    Science.gov (United States)

    Lihacova, I.; Bolocko, K.; Lihachev, A.

    2017-12-01

    The incidence of skin cancer is still increasing, mostly in industrialized countries with light-skinned people. Late tumour detection is the main reason for the high mortality associated with skin cancer. The accessibility of early diagnostics of skin cancer in Latvia is limited by several factors, such as the high cost of dermatology services, long waiting times for state-funded oncologist examinations, as well as the inaccessibility of oncologists in rural regions - this is a real clinical problem. The new strategies and guidelines for early skin cancer detection and post-surgical follow-up call for full body examination (FBE) by primary care physicians (general practitioners, interns) in combination with classical dermoscopy. To implement this approach, a semi-automated method was established. The developed software analyses the combination of 3 optical density images at 540 nm, 650 nm, and 950 nm from pigmented skin malformations and classifies them into three groups: nevi, pigmented basal cell carcinoma or melanoma.

  9. On the Development and Use of Four-Dimensional Data Assimilation in Limited-Area Mesoscale Models Used for Meteorological Analysis.

    Science.gov (United States)

    Stauffer, David R.

    1990-01-01

    The application of dynamic relationships to the analysis problem for the atmosphere is extended to use a full-physics limited-area mesoscale model as the dynamic constraint. A four-dimensional data assimilation (FDDA) scheme based on Newtonian relaxation or "nudging" is developed and evaluated in the Penn State/National Center for Atmospheric Research (PSU/NCAR) mesoscale model, which is used here as a dynamic-analysis tool. The aim of the thesis is to determine what assimilation strategies and what meteorological fields (mass, wind or both) have the greatest positive impact on the 72-h numerical simulations (dynamic analyses) of two mid-latitude, real-data cases. The basic FDDA methodology is tested in a 10-layer version of the model with a bulk-aerodynamic (single-layer) representation of the planetary boundary layer (PBL), and refined in a 15-layer version of the model by considering the effects of data assimilation within a multi-layer PBL scheme. As designed, the model solution can be relaxed toward either gridded analyses ("analysis nudging"), or toward the actual observations ("obs nudging"). The data used for assimilation include standard 12-hourly rawinsonde data, and also 3-hourly mesoalpha-scale surface data which are applied within the model's multi-layer PBL. Continuous assimilation of standard-resolution rawinsonde data into the 10-layer model successfully reduced large-scale amplitude and phase errors while the model realistically simulated mesoscale structures poorly defined or absent in the rawinsonde analyses and in the model simulations without FDDA. Nudging the model fields directly toward the rawinsonde observations generally produced results comparable to nudging toward gridded analyses. This obs-nudging technique is especially attractive for the assimilation of high-frequency, asynoptic data. Assimilation of 3-hourly surface wind and moisture data into the 15-layer FDDA system was most effective for improving the simulated precipitation fields because a
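    Newtonian relaxation adds a term proportional to the observation-minus-model difference to the model tendency. The one-variable sketch below illustrates the idea; the forcing, nudging coefficient and synthetic observations are placeholders, not the PSU/NCAR implementation.

    ```python
    import numpy as np

    def integrate_with_nudging(x0, obs, g, dt, forcing):
        """dx/dt = F(x) + G * (x_obs - x): relax the model state toward observations."""
        x = np.empty(len(obs))
        x[0] = x0
        for n in range(len(obs) - 1):
            tendency = forcing(x[n]) + g * (obs[n] - x[n])
            x[n + 1] = x[n] + dt * tendency
        return x

    forcing = lambda x: -0.1 * x                  # placeholder model dynamics F(x)
    t = np.arange(0.0, 100.0, 1.0)
    obs = 5.0 + np.sin(0.1 * t)                   # synthetic "observations"
    analysis = integrate_with_nudging(0.0, obs, g=0.05, dt=1.0, forcing=forcing)
    print(round(analysis[-1], 2), round(obs[-1], 2))   # model state pulled toward the observations
    ```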

  10. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be performed manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also, included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  11. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  12. Automated software analysis of nuclear core discharge data

    International Nuclear Information System (INIS)

    Larson, T.W.; Halbig, J.K.; Howell, J.A.; Eccleston, G.W.; Klosterbuer, S.F.

    1993-03-01

    Monitoring the fueling process of an on-load nuclear reactor is a full-time job for nuclear safeguarding agencies. Nuclear core discharge monitors (CDMS) can provide continuous, unattended recording of the reactor's fueling activity for later, qualitative review by a safeguards inspector. A quantitative analysis of this collected data could prove to be a great asset to inspectors because more information can be extracted from the data and the analysis time can be reduced considerably. This paper presents a prototype for an automated software analysis system capable of identifying when fuel bundle pushes occurred and monitoring the power level of the reactor. Neural network models were developed for calculating the region on the reactor face from which the fuel was discharged and predicting the burnup. These models were created and tested using actual data collected from a CDM system at an on-load reactor facility. Collectively, these automated quantitative analysis programs could help safeguarding agencies to gain a better perspective on the complete picture of the fueling activity of an on-load nuclear reactor. This type of system can provide a cost-effective solution for automated monitoring of on-load reactors significantly reducing time and effort

  13. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. The baseline in a spectral signal can induce uneven amplitude shifts across different wavenumbers and lead to poor results. Therefore, these amplitude shifts should be compensated for before further analysis. Many algorithms are used to remove the baseline; however, fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly introduced fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of ease of use.
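    A much-simplified sketch of the two stages described above (feature points from a continuous wavelet transform, then piecewise interpolation of the baseline) is given below, using SciPy's ridge-based peak finder as a stand-in; the widths, exclusion window and step are illustrative and do not reproduce the AWFPSI parameters.

    ```python
    import numpy as np
    from scipy.signal import find_peaks_cwt

    def estimate_baseline(y, widths=np.arange(5, 40), exclusion=30, step=20):
        """Simplified baseline estimate: CWT-detected peaks are excluded, the rest interpolated."""
        x = np.arange(len(y))
        peaks = find_peaks_cwt(y, widths)               # stage 1: wavelet-based feature points
        keep = np.ones(len(y), dtype=bool)
        for p in peaks:                                 # mask samples near detected peaks
            keep[max(0, p - exclusion):p + exclusion] = False
        anchors = x[keep][::step]                       # sparse baseline anchor points
        return np.interp(x, anchors, y[anchors])        # stage 2: segment (linear) interpolation

    # Synthetic Raman-like signal: two peaks sitting on a slowly varying baseline
    x = np.arange(1000)
    peaks_true = 50 * np.exp(-(x - 300) ** 2 / 200) + 80 * np.exp(-(x - 700) ** 2 / 300)
    y = peaks_true + 0.02 * x + 5
    corrected = y - estimate_baseline(y)
    print(round(np.abs(corrected[:100] - peaks_true[:100]).max(), 2))  # residual error near the start
    ```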

  14. Automating Trend Analysis for Spacecraft Constellations

    Science.gov (United States)

    Davis, George; Cooter, Miranda; Updike, Clark; Carey, Everett; Mackey, Jennifer; Rykowski, Timothy; Powers, Edward I. (Technical Monitor)

    2001-01-01

    Spacecraft trend analysis is a vital mission operations function performed by satellite controllers and engineers, who perform detailed analyses of engineering telemetry data to diagnose subsystem faults and to detect trends that may potentially lead to degraded subsystem performance or failure in the future. It is this latter function that is of greatest importance, for careful trending can often predict or detect events that may lead to a spacecraft's entry into safe-hold. Early prediction and detection of such events could result in the avoidance of, or rapid return to service from, spacecraft safing, which not only results in reduced recovery costs but also in a higher overall level of service for the satellite system. Contemporary spacecraft trending activities are manually intensive and are primarily performed diagnostically after a fault occurs, rather than proactively to predict its occurrence. They also tend to rely on information systems and software that are outdated when compared to current technologies. When coupled with the fact that flight operations teams often have limited resources, proactive trending opportunities are limited, and detailed trend analysis is often reserved for critical responses to safe holds or other on-orbit events such as maneuvers. While the contemporary trend analysis approach has sufficed for current single-spacecraft operations, it will be unfeasible for NASA's planned and proposed space science constellations. Missions such as the Dynamics, Reconnection and Configuration Observatory (DRACO), for example, are planning to launch as many as 100 'nanospacecraft' to form a homogeneous constellation. A simple extrapolation of resources and manpower based on single-spacecraft operations suggests that trending for such a large spacecraft fleet will be unmanageable, unwieldy, and cost-prohibitive. It is therefore imperative that an approach to automating the spacecraft trend analysis function be studied, developed, and applied to

  15. SMASH - semi-automatic muscle analysis using segmentation of histology: a MATLAB application.

    Science.gov (United States)

    Smith, Lucas R; Barton, Elisabeth R

    2014-01-01

    Histological assessment of skeletal muscle tissue is commonly applied to many areas of skeletal muscle physiological research. Histological parameters including fiber distribution, fiber type, centrally nucleated fibers, and capillary density are all frequently quantified measures of skeletal muscle. These parameters reflect functional properties of muscle and undergo adaptation in many muscle diseases and injuries. While standard operating procedures have been developed to guide analysis of many of these parameters, the software to freely, efficiently, and consistently analyze them is not readily available. In order to provide this service to the muscle research community we developed an open source MATLAB script to analyze immunofluorescent muscle sections incorporating user controls for muscle histological analysis. The software consists of multiple functions designed to provide tools for the analysis selected. Initial segmentation and fiber filter functions segment the image and remove non-fiber elements based on user-defined parameters to create a fiber mask. Establishing parameters set by the user, the software outputs data on fiber size and type, centrally nucleated fibers, and other structures. These functions were evaluated on stained soleus muscle sections from 1-year-old wild-type and mdx mice, a model of Duchenne muscular dystrophy. In accordance with previously published data, fiber size was not different between groups, but mdx muscles had much higher fiber size variability. The mdx muscle had a significantly greater proportion of type I fibers, but type I fibers did not change in size relative to type II fibers. Centrally nucleated fibers were highly prevalent in mdx muscle and were significantly larger than peripherally nucleated fibers. The MATLAB code described and provided along with this manuscript is designed for image processing of skeletal muscle immunofluorescent histological sections. The program allows for semi-automated fiber detection

  16. Comparison of methods for the identification of mesoscale wind speed fluctuations

    Directory of Open Access Journals (Sweden)

    Anna Rieke Mehrens

    2017-06-01

    Full Text Available Mesoscale wind speed fluctuations influence the characteristics of offshore wind energy. These recurring wind speed changes on time scales between tens of minutes and six hours lead to power output fluctuations. In order to investigate the meteorological conditions associated with mesoscale wind speed fluctuations, a measure is needed to detect these situations in wind speed time series. Previous studies used the empirical Hilbert-Huang Transform to determine the energy in the mesoscale frequency range or calculated the standard deviation of a band-pass filtered wind speed time series. The aim of this paper is to introduce newly developed empirical mesoscale fluctuation measures and to compare them with existing measures in regard to their sensitivity to recurring wind speed changes. One of the methods is based on the Hilbert-Huang Transform, two on the Fast Fourier Transform and one on wind speed increments. It is found that despite the varying complexity of the methods, all methods can identify days with highly variable mesoscale wind speeds equally well.
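    One of the existing measures the paper compares, the standard deviation of a band-pass filtered wind speed time series, can be sketched as below; the 30 min and 6 h cut-off periods, sampling interval and synthetic data are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    def mesoscale_fluctuation_measure(wind_speed, dt_s, t_short_s=1800.0, t_long_s=21600.0):
        """Standard deviation of the band-pass filtered series (30 min to 6 h band assumed)."""
        nyquist = 0.5 / dt_s                                 # Hz
        lo, hi = 1.0 / t_long_s, 1.0 / t_short_s             # pass band in Hz
        b, a = butter(3, [lo / nyquist, hi / nyquist], btype="band")
        return float(np.std(filtfilt(b, a, wind_speed)))

    dt = 600.0                                               # 10-min mean wind speeds (assumed)
    t = np.arange(0.0, 5 * 86400.0, dt)                      # five days of data
    speed = 8.0 + 2.0 * np.sin(2 * np.pi * t / 7200.0) + 0.5 * np.random.randn(len(t))
    print(mesoscale_fluctuation_measure(speed, dt))          # larger values indicate more variable days
    ```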

  17. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  18. Global scaling for semi-quantitative analysis in FP-CIT SPECT.

    Science.gov (United States)

    Kupitz, D; Apostolova, I; Lange, C; Ulrich, G; Amthauer, H; Brenner, W; Buchert, R

    2014-01-01

    Semi-quantitative characterization of dopamine transporter availability from single photon emission computed tomography (SPECT) with 123I-ioflupane (FP-CIT) is based on uptake ratios relative to a reference region. The aim of this study was to evaluate the whole brain as reference region for semi-quantitative analysis of FP-CIT SPECT. The rationale was that this might reduce statistical noise associated with the estimation of non-displaceable FP-CIT uptake. 150 FP-CIT SPECTs were categorized as neurodegenerative or non-neurodegenerative by an expert. Semi-quantitative analysis of specific binding ratios (SBR) was performed with a custom-made tool based on the Statistical Parametric Mapping software package using predefined regions of interest (ROIs) in the anatomical space of the Montreal Neurological Institute. The following reference regions were compared: predefined ROIs for frontal and occipital lobe and whole brain (without striata, thalamus and brainstem). Tracer uptake in the reference region was characterized by the mean, median or 75th percentile of its voxel intensities. The area (AUC) under the receiver operating characteristic curve was used as performance measure. The highest AUC of 0.973 was achieved by the SBR of the putamen with the 75th percentile in the whole brain as reference. The lowest AUC for the putamen SBR of 0.937 was obtained with the mean in the frontal lobe as reference. We recommend the 75th percentile in the whole brain as reference for semi-quantitative analysis in FP-CIT SPECT. This combination provided the best agreement of the semi-quantitative analysis with visual evaluation of the SPECT images by an expert and, therefore, is appropriate to support less experienced physicians.
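    The specific binding ratio relates target uptake to the non-displaceable uptake in the reference region; the sketch below uses the 75th-percentile whole-brain reference recommended above, with placeholder voxel intensities rather than real SPECT data.

    ```python
    import numpy as np

    def specific_binding_ratio(target_voxels, reference_voxels, percentile=75):
        """SBR = (target - reference) / reference, reference summarised by a percentile."""
        ref = np.percentile(reference_voxels, percentile)
        return (np.mean(target_voxels) - ref) / ref

    rng = np.random.default_rng(0)
    putamen = rng.normal(12.0, 1.0, 500)         # placeholder voxel intensities for the target ROI
    whole_brain = rng.normal(4.0, 0.8, 50000)    # whole brain without striata, thalamus and brainstem
    print(round(specific_binding_ratio(putamen, whole_brain), 2))
    ```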

  19. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  20. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and to maximize equipment production efficiency. This paper builds on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states, and presents a TEA system model that automatically analyses tool performance based on a finite state machine. The system was applied to fab tools and its effectiveness was verified; the parameter values used to measure equipment performance were obtained, along with recommendations for improvement.
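    A minimal sketch of finite-state-machine bookkeeping over tool states is given below, loosely in the spirit of the SEMI E10 state model; the state names, transition rules and event log are invented placeholders, not those defined by the TEA system.

    ```python
    from collections import defaultdict

    # Allowed transitions between tool states (illustrative only)
    TRANSITIONS = {
        "IDLE": {"PRODUCTIVE", "SCHEDULED_DOWN", "UNSCHEDULED_DOWN"},
        "PRODUCTIVE": {"IDLE", "UNSCHEDULED_DOWN"},
        "SCHEDULED_DOWN": {"IDLE"},
        "UNSCHEDULED_DOWN": {"IDLE"},
    }

    def accumulate_state_times(events):
        """events: list of (timestamp_s, new_state); returns seconds spent in each state."""
        times = defaultdict(float)
        t_prev, state = events[0]
        for t, new_state in events[1:]:
            if new_state not in TRANSITIONS[state]:
                raise ValueError(f"illegal transition {state} -> {new_state}")
            times[state] += t - t_prev
            t_prev, state = t, new_state
        return dict(times)

    log = [(0, "IDLE"), (600, "PRODUCTIVE"), (4200, "IDLE"), (4800, "SCHEDULED_DOWN"), (6000, "IDLE")]
    times = accumulate_state_times(log)
    print(times, "utilisation =", round(times["PRODUCTIVE"] / sum(times.values()), 2))
    ```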

  1. Comparison of semi-automated and manual measurements of carotid intima-media thickening.

    LENUS (Irish Health Repository)

    Mac Ananey, Oscar

    2014-01-01

    Carotid intima-media thickening (CIMT) is a marker of both arteriosclerotic and atherosclerotic risks. Technological advances have semiautomated CIMT image acquisition and quantification. Studies comparing manual and automated methods have yielded conflicting results possibly due to plaque inclusion in measurements. Low atherosclerotic risk subjects (n = 126) were recruited to minimise the effect of focal atherosclerotic lesions on CIMT variability. CIMT was assessed by high-resolution B-mode ultrasound (Philips HDX7E, Phillips, UK) images of the common carotid artery using both manual and semiautomated methods (QLAB, Phillips, UK). Intraclass correlation coefficient (ICC) and the mean differences of paired measurements (Bland-Altman method) were used to compare both methodologies. The ICC of manual (0.547 ± 0.095 mm) and automated (0.524 ± 0.068 mm) methods was R = 0.74 and an absolute mean bias ± SD of 0.023 ± 0.052 mm was observed. Interobserver and intraobserver ICC were greater for automated (R = 0.94 and 0.99) compared to manual (R = 0.72 and 0.88) methods. Although not considered to be clinically significant, manual measurements yielded higher values compared to automated measurements. Automated measurements were more reproducible and showed lower interobserver variation compared to manual measurements. These results offer important considerations for large epidemiological studies.
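    The agreement statistics quoted above (mean bias and its limits between paired manual and automated measurements) follow the Bland-Altman approach; a minimal sketch with made-up CIMT values is shown below.

    ```python
    import numpy as np

    def bland_altman(a, b):
        """Mean bias and 95% limits of agreement between paired measurements a and b."""
        diff = np.asarray(a) - np.asarray(b)
        bias = diff.mean()
        spread = 1.96 * diff.std(ddof=1)
        return bias, bias - spread, bias + spread

    manual = np.array([0.55, 0.52, 0.60, 0.48, 0.57])      # CIMT in mm (made-up values)
    automated = np.array([0.53, 0.51, 0.57, 0.47, 0.55])
    bias, lo, hi = bland_altman(manual, automated)
    print(f"bias = {bias:.3f} mm, limits of agreement [{lo:.3f}, {hi:.3f}] mm")
    ```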

  2. Mesoscale variability in the Bransfield Strait region (Antarctica) during Austral summer

    Directory of Open Access Journals (Sweden)

    M. A. García

    1994-08-01

    Full Text Available The Bransfield Strait is one of the best-known areas of Antarctica's oceanic surroundings. In spite of this, the study of the mesoscale variability of its local circulation has been addressed only recently. This paper focuses on the mesoscale structure of local physical oceanographic conditions in the Bransfield Strait during the Austral summer as derived from the BIOANTAR 93 cruise and auxiliary remote sensing data. Moreover, data recovered from moored current meters allow identification of transient mesoscale phenomena.

  3. Reload safety analysis automation tools

    International Nuclear Information System (INIS)

    Havlůj, F.; Hejzlar, J.; Vočka, R.

    2013-01-01

    Performing core physics calculations for the sake of reload safety analysis is a very demanding and time consuming process. This process generally begins with the preparation of libraries for the core physics code using a lattice code. The next step involves creating a very large set of calculations with the core physics code. Lastly, the results of the calculations must be interpreted, correctly applying uncertainties and checking whether applicable limits are satisfied. Such a procedure requires three specialized experts. One must understand the lattice code in order to correctly calculate and interpret its results. The next expert must have a good understanding of the physics code in order to create libraries from the lattice code results and to correctly define all the calculations involved. The third expert must have a deep knowledge of the power plant and the reload safety analysis procedure in order to verify, that all the necessary calculations were performed. Such a procedure involves many steps and is very time consuming. At ÚJV Řež, a.s., we have developed a set of tools which can be used to automate and simplify the whole process of performing reload safety analysis. Our application QUADRIGA automates lattice code calculations for library preparation. It removes user interaction with the lattice code and reduces his task to defining fuel pin types, enrichments, assembly maps and operational parameters all through a very nice and user-friendly GUI. The second part in reload safety analysis calculations is done by CycleKit, a code which is linked with our core physics code ANDREA. Through CycleKit large sets of calculations with complicated interdependencies can be performed using simple and convenient notation. CycleKit automates the interaction with ANDREA, organizes all the calculations, collects the results, performs limit verification and displays the output in clickable html format. Using this set of tools for reload safety analysis simplifies

  4. A semi-automated method for rapid detection of ripple events on interictal voltage discharges in the scalp electroencephalogram.

    Science.gov (United States)

    Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A

    2017-02-01

    High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
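    A minimal sketch of the first step only, flagging candidate intervals of increased ripple-band activity with a band-pass envelope threshold, is shown below; the band edges, threshold and minimum duration are placeholders, and the seven validation features of the published detector are not reproduced.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def candidate_intervals(eeg, fs, band=(100.0, 250.0), threshold_sd=3.0, min_cycles=3):
        """Return (start, stop) sample indices where the ripple-band envelope stays above threshold."""
        b, a = butter(4, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        envelope = np.abs(hilbert(filtfilt(b, a, eeg)))
        above = envelope > envelope.mean() + threshold_sd * envelope.std()
        min_len = int(min_cycles * fs / band[0])          # require roughly three cycles at the low edge
        runs, start = [], None
        for i, flag in enumerate(above):
            if flag and start is None:
                start = i
            elif not flag and start is not None:
                if i - start >= min_len:
                    runs.append((start, i))
                start = None
        return runs

    fs = 2000.0
    eeg = np.random.randn(int(10 * fs))                   # ten seconds of placeholder scalp EEG
    print(len(candidate_intervals(eeg, fs)), "candidate intervals flagged for visual review")
    ```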

  5. EddyOne automated analysis of PWR/WWER steam generator tubes eddy current data

    International Nuclear Information System (INIS)

    Nadinic, B.; Vanjak, Z.

    2004-01-01

    INETEC Institute for Nuclear Technology developed a software package called EddyOne, which has an option for automated analysis of bobbin coil eddy current data. During its development and on-site use, many valuable lessons were learned, which are described in this article. Accordingly, the following topics are covered: General requirements for automated analysis of bobbin coil eddy current data; Main approaches to automated analysis; Multi-rule algorithms for data screening; Landmark detection algorithms as a prerequisite for automated analysis (threshold algorithms and algorithms based on neural network principles); Field experience with EddyOne software; Development directions (use of artificial intelligence with self-learning abilities for indication detection and sizing); Automated analysis software qualification; Conclusions. Special emphasis is given to results obtained on different types of steam generators, condensers and heat exchangers. Such results are then compared with results obtained by other automated software vendors, giving a clear advantage to the INETEC approach. It has to be pointed out that INETEC field experience was also collected on WWER steam generators, which is so far a unique experience. (author)

  6. Automated quantitative cytological analysis using portable microfluidic microscopy.

    Science.gov (United States)

    Jagannadh, Veerendra Kalyan; Murthy, Rashmi Sreeramachandra; Srinivasan, Rajesh; Gorthi, Sai Siva

    2016-06-01

    In this article, a portable microfluidic microscopy based approach for automated cytological investigations is presented. Inexpensive optical and electronic components have been used to construct a simple microfluidic microscopy system. In contrast to the conventional slide-based methods, the presented method employs microfluidics to enable automated sample handling and image acquisition. The approach involves the use of simple in-suspension staining and automated image acquisition to enable quantitative cytological analysis of samples. The applicability of the presented approach to research in cellular biology is shown by performing an automated cell viability assessment on a given population of yeast cells. Further, the relevance of the presented approach to clinical diagnosis and prognosis has been demonstrated by performing detection and differential assessment of malaria infection in a given sample. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. The South China Sea Mesoscale Eddy Experiment (S-MEE) and Its Primary Findings

    Science.gov (United States)

    Zhang, Z.; Tian, J.; Zhao, W.; Qiu, B.

    2016-02-01

    The South China Sea (SCS), the largest marginal sea in the northwestern Pacific, has strong eddy activity as revealed by both satellite and in situ observations. The 3D structures of the SCS mesoscale eddies and their lifecycles, including the generation and dissipation processes, are, however, still not well understood at present because of the lack of well-designed field observations. In order to address the above two scientific issues (3D structure and lifecycle of SCS mesoscale eddies), the SCS Mesoscale Eddy Experiment (S-MEE for short) was designed and conducted in the period from October 2013 to June 2014. As part of S-MEE, two bottom-anchored subsurface mooring arrays, one consisting of 10 moorings and the other of 7 moorings, were deployed along the historical pathway of the mesoscale eddies in the northern SCS. All the moorings were equipped with ADCPs, RCMs, CTDs and temperature chains to make continuous measurements of horizontal current velocity and temperature/salinity in the whole water column. During the S-MEE, a total of 5 distinct mesoscale eddies were observed to cross the mooring arrays, among which one anticyclonic and cyclonic eddy pair was fully captured by the mooring arrays. In addition to moored observations, we also conducted two transects across the center of the anticyclonic eddy and made high-resolution hydrographic and turbulent mixing measurements. Based on the data collected by the S-MEE and concurrent satellite-derived observations, we constructed the full-depth 3D structure of the eddy pair and analyzed its generation and dissipation mechanisms. We found that the eddies extend from the surface to the sea bottom and display prominent tilted structures in the vertical. By conducting an eddy energy budget analysis, we further identified that generation of submesoscale motions constitutes the dominant mechanism for the oceanic eddy dissipation.

  8. Driver-centred vehicle automation: using network analysis for agent-based modelling of the driver in highly automated driving systems.

    Science.gov (United States)

    Banks, Victoria A; Stanton, Neville A

    2016-11-01

    To the average driver, the concept of automation in driving implies that they can become completely 'hands and feet free'. This is a common misconception, however, one that has been shown through the application of Network Analysis to new Cruise Assist technologies that may feature on our roads by 2020. Through the adoption of a Systems Theoretic approach, this paper introduces the concept of driver-initiated automation, which reflects the role of the driver in highly automated driving systems. Using a combination of traditional task analysis and the application of quantitative network metrics, this agent-based modelling paper shows how the role of the driver remains an integral part of the driving system, implying that designers need to ensure drivers are provided with the tools necessary to remain actively in-the-loop despite being given increasing opportunities to delegate control to the automated subsystems. Practitioner Summary: This paper describes and analyses a driver-initiated command and control system of automation using representations afforded by task and social networks to understand how drivers remain actively involved in the task. A network analysis of different driver commands suggests that such a strategy does maintain the driver in the control loop.
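    The kind of quantitative network metric applied in the paper can be sketched with NetworkX as below; the nodes and edges are invented placeholders standing in for a task network, not the published Cruise Assist model.

    ```python
    import networkx as nx

    # Placeholder task network: agents and subsystems as nodes, information exchange as edges
    edges = [
        ("driver", "cruise_assist"), ("driver", "steering"), ("driver", "display"),
        ("cruise_assist", "radar"), ("cruise_assist", "throttle"), ("display", "cruise_assist"),
    ]
    g = nx.Graph(edges)

    print("network density:", round(nx.density(g), 2))
    centrality = nx.degree_centrality(g)
    # a high driver centrality is consistent with the driver remaining integral to the control loop
    print("driver degree centrality:", round(centrality["driver"], 2))
    ```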

  9. Evaluation of a completely automated cold fiber device using compounds with varying volatility and polarity.

    Science.gov (United States)

    Jiang, Ruifen; Carasek, Eduardo; Risticevic, Sanja; Cudjoe, Erasmus; Warren, Jamie; Pawliszyn, Janusz

    2012-09-12

    A fully automated cold fiber solid phase microextraction device has been developed by coupling it to a GERSTEL multipurpose (MPS 2) autosampler and applied to the analysis of volatiles and semi-volatiles in aqueous and solid matrices. The proposed device was thoroughly evaluated for its extraction performance, robustness, reproducibility and reliability by gas chromatography/mass spectrometry (GC/MS). With the use of a septumless head injector, the entire automated setup was capable of analyzing over 200 samples without any GC injector leakages. Evaluation of the automated cold fiber device was carried out using a group of compounds characterized by different volatilities and polarities. Extraction efficiency as well as analytical figures of merit were compared to those of commercial solid phase microextraction fibers. The automated cold fiber device showed significantly improved extraction efficiency compared to the commercial polydimethylsiloxane (PDMS) fiber and the cold fiber without cooling for the analysis of aqueous standard samples, due to the low temperature of the coating. A comparison of results obtained from the cold fiber and the commercial divinylbenzene/carboxen/polydimethylsiloxane (DVB/CAR/PDMS) fiber temperature profiles demonstrated that the temperature gap between the sample matrix and the coating improved the distribution coefficient and therefore the amount extracted. The linear dynamic range of the cold fiber device was 0.5 ng mL(-1) to 100 ng mL(-1) with a linear regression coefficient ≥0.9963 for all compounds. The limit of detection for all analytes ranged from 1.0 ng mL(-1) to 9.4 ng mL(-1). The newly automated cold fiber device presents a platform for headspace analysis of volatiles and semi-volatiles for a large number of samples with improved throughput and sensitivity. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Global left ventricular function in cardiac CT. Evaluation of an automated 3D region-growing segmentation algorithm

    International Nuclear Information System (INIS)

    Muehlenbruch, Georg; Das, Marco; Hohl, Christian; Wildberger, Joachim E.; Guenther, Rolf W.; Mahnken, Andreas H.; Rinck, Daniel; Flohr, Thomas G.; Koos, Ralf; Knackstedt, Christian

    2006-01-01

    The purpose was to evaluate a new semi-automated 3D region-growing segmentation algorithm for functional analysis of the left ventricle in multislice CT (MSCT) of the heart. Twenty patients underwent contrast-enhanced MSCT of the heart (collimation 16 x 0.75 mm; 120 kV; 550 mAseff). Multiphase image reconstructions with 1-mm axial slices and 8-mm short-axis slices were performed. Left ventricular volume measurements (end-diastolic volume, end-systolic volume, ejection fraction and stroke volume) from manually drawn endocardial contours in the short axis slices were compared to semi-automated region-growing segmentation of the left ventricle from the 1-mm axial slices. The post-processing time for both methods was recorded. With the new region-growing algorithm, proper segmentation of the left ventricle was feasible in 13/20 patients (65%). In these patients, the signal-to-noise ratio was higher than in the remaining patients (3.2±1.0 vs. 2.6±0.6). Volume measurements of both segmentation algorithms showed an excellent correlation (all P≤0.0001); the limits of agreement for the ejection fraction were 2.3±8.3 ml. In the patients with proper segmentation, the mean post-processing time using the region-growing algorithm was diminished by 44.2%. On the basis of a good contrast-enhanced data set, a left ventricular volume analysis using the new semi-automated region-growing segmentation algorithm is technically feasible, accurate and more time-effective. (orig.)
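    A minimal 2D sketch of threshold-driven region growing from a seed point, the basic idea behind the evaluated 3D algorithm, is given below; the connectivity, thresholds and synthetic image are placeholders, not the vendor implementation.

    ```python
    import numpy as np
    from collections import deque

    def region_grow(image, seed, lo, hi):
        """Grow a 4-connected region from `seed`, keeping pixels whose value lies in [lo, hi]."""
        mask = np.zeros(image.shape, dtype=bool)
        queue = deque([seed])
        while queue:
            r, c = queue.popleft()
            if mask[r, c] or not (lo <= image[r, c] <= hi):
                continue
            mask[r, c] = True
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < image.shape[0] and 0 <= cc < image.shape[1] and not mask[rr, cc]:
                    queue.append((rr, cc))
        return mask

    # Synthetic "contrast-enhanced blood pool": a bright disc on a darker background
    yy, xx = np.mgrid[0:128, 0:128]
    img = np.where((yy - 64) ** 2 + (xx - 64) ** 2 < 30 ** 2, 400.0, 80.0)
    lv_mask = region_grow(img, seed=(64, 64), lo=300.0, hi=500.0)
    print("segmented pixels:", int(lv_mask.sum()))
    ```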

  11. Analysis of mesoscale factors at the onset of deep convection on hailstorm days in Southern France and their relation to the synoptic patterns

    Science.gov (United States)

    Sanchez, Jose Luis; Wu, Xueke; Gascón, Estibaliz; López, Laura; Melcón, Pablo; García-Ortega, Eduardo; Berthet, Claude; Dessens, Jean; Merino, Andrés

    2013-04-01

    Storms and the weather phenomena associated with them, such as intense precipitation, lightning, strong winds or hail, are among the most common and dangerous weather risks in many European countries. Obtaining a reliable forecast of their occurrence remains an open problem. The question is: how can the reliability of such forecasts be improved? Southwestern France is frequently affected by hailstorms, which produce severe damage to crops and property. Considerable efforts have been made to improve the forecast of hailfall in this area. First of all, improving this type of forecast requires a good "ground truth" of the hail days and of the zones affected by hailfall. Fortunately, ANELFA has deployed thousands of hailpad stations in Southern France and has processed the point hailfall data recorded during each hail season at these stations. This paper presents a methodology to improve the forecast of the occurrence of hailfall according to the synoptic environment and mesoscale factors in the study area. One hundred hail days occurring in the period 2000-2010 were selected following spatial and severity criteria. The mesoscale model WRF was applied to all cases to study the synoptic environment of mean geopotential and temperature fields at 500 hPa. Three nested domains were defined following a two-way nesting strategy, with horizontal spatial resolutions of 36, 12 and 4 km and 30 vertical terrain-following σ-levels. Then, using Principal Component Analysis in T-mode, 4 mesoscale configurations were defined for the fields of convective instability (CI), water vapor flux divergence, and wind flow and humidity in the lower layer (850 hPa), and several clusters were obtained using K-means clustering. Finally, we calculated several characteristic values of four hail forecast parameters: Convective Available Potential Energy (CAPE), Storm Relative Helicity between 0 and 3 km (SRH0-3), Energy-Helicity Index (EHI) and
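
    A minimal sketch of the classification step is given below, assuming the gridded 850-hPa fields have been flattened into one vector per hail day; scikit-learn's PCA and KMeans stand in for the T-mode PCA and K-means procedure described above, and the array sizes and cluster count are placeholders rather than the study's actual configuration.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        # Hypothetical input: one row per hail day, columns are a gridded 850-hPa
        # field (e.g. convective instability) flattened to a vector.
        fields = np.random.rand(100, 40 * 40)        # 100 days, 40x40 grid (placeholder data)

        # Reduce to the leading principal components ...
        scores = PCA(n_components=4).fit_transform(fields)

        # ... and group the days into mesoscale configurations.
        labels = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(scores)
        print(np.bincount(labels))                   # number of hail days per configuration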

  12. Fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    Energy Technology Data Exchange (ETDEWEB)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-03-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest, quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summed to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, as well as when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed.

  13. Failure mode and effect analysis oriented to risk-reduction interventions in intraoperative electron radiation therapy: the specific impact of patient transportation, automation, and treatment planning availability.

    Science.gov (United States)

    López-Tarjuelo, Juan; Bouché-Babiloni, Ana; Santos-Serra, Agustín; Morillo-Macías, Virginia; Calvo, Felipe A; Kubyshin, Yuri; Ferrer-Albiach, Carlos

    2014-11-01

    Industrial companies use failure mode and effect analysis (FMEA) to improve quality. Our objective was to describe an FMEA and subsequent interventions for an automated intraoperative electron radiotherapy (IOERT) procedure with computed tomography simulation, pre-planning, and a fixed conventional linear accelerator. A process map, an FMEA, and a fault tree analysis are reported. The equipment considered was the radiance treatment planning system (TPS), the Elekta Precise linac, and TN-502RDM-H metal-oxide-semiconductor-field-effect transistor in vivo dosimeters. Computerized order-entry and treatment-automation were also analyzed. Fifty-seven potential modes and effects were identified and classified into 'treatment cancellation' and 'delivering an unintended dose'. They were graded from 'inconvenience' or 'suboptimal treatment' to 'total cancellation' or 'potentially wrong' or 'very wrong administered dose', although these latter effects were never experienced. Risk priority numbers (RPNs) ranged from 3 to 324 and totaled 4804. After interventions such as double checking, interlocking, automation, and structural changes the final total RPN was reduced to 1320. FMEA is crucial for prioritizing risk-reduction interventions. In a semi-surgical procedure like IOERT double checking has the potential to reduce risk and improve quality. Interlocks and automation should also be implemented to increase the safety of the procedure. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
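
    The risk priority numbers quoted above are conventionally computed as the product of severity, occurrence and detectability scores; the short Python sketch below illustrates this bookkeeping with invented failure modes and scores, not the actual items from the IOERT analysis.

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            description: str
            severity: int       # 1-10
            occurrence: int     # 1-10
            detectability: int  # 1-10 (10 = hardest to detect)

            @property
            def rpn(self) -> int:
                # Risk priority number used to rank failure modes.
                return self.severity * self.occurrence * self.detectability

        modes = [
            FailureMode("wrong applicator docked", 8, 3, 4),              # illustrative values only
            FailureMode("treatment cancelled after anaesthesia", 6, 2, 2),
        ]
        for m in sorted(modes, key=lambda m: m.rpn, reverse=True):
            print(m.rpn, m.description)
        print("total RPN:", sum(m.rpn for m in modes))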

  14. Optogenetic stimulation of a meso-scale human cortical model

    Science.gov (United States)

    Selvaraj, Prashanth; Szeri, Andrew; Sleigh, Jamie; Kirsch, Heidi

    2015-03-01

    Neurological phenomena like sleep and seizures depend not only on the activity of individual neurons, but on the dynamics of neuron populations as well. Meso-scale models of cortical activity provide a means to study neural dynamics at the level of neuron populations. Additionally, they offer a safe and economical way to test the effects and efficacy of stimulation techniques on the dynamics of the cortex. Here, we use a physiologically relevant meso-scale model of the cortex to study the hypersynchronous activity of neuron populations during epileptic seizures. The model consists of a set of stochastic, highly non-linear partial differential equations. Next, we use optogenetic stimulation to control seizures in a hyperexcited cortex, and to induce seizures in a normally functioning cortex. The high spatial and temporal resolution this method offers makes a strong case for the use of optogenetics in treating meso-scale cortical disorders such as epileptic seizures. We use bifurcation analysis to investigate the effect of optogenetic stimulation in the meso-scale model, and its efficacy in suppressing the non-linear dynamics of seizures.

  15. Automated Behavior Property Verification Tool

    National Research Council Canada - National Science Library

    Leo, John K

    2008-01-01

    .... A type of CGF in which the entities have limited autonomy is semi-automated forces (SAF). The SAF system for this thesis research is OneSAF, a near real-time SAF that offers raw data collection of the entities in a particular simulation scenario...

  16. From Laser Scanning to Finite Element Analysis of Complex Buildings by Using a Semi-Automatic Procedure.

    Science.gov (United States)

    Castellazzi, Giovanni; D'Altri, Antonio Maria; Bitelli, Gabriele; Selvaggi, Ilenia; Lambertini, Alessandro

    2015-07-28

    In this paper, a new semi-automatic procedure to transform three-dimensional point clouds of complex objects to three-dimensional finite element models is presented and validated. The procedure conceives of the point cloud as a stacking of point sections. The complexity of the clouds is arbitrary, since the procedure is designed for terrestrial laser scanner surveys applied to buildings with irregular geometry, such as historical buildings. The procedure aims to solve the problems connected with the generation of finite element models of these complex structures by constructing a finely discretized geometry in a reduced amount of time, ready to be used for structural analysis. If the starting clouds represent the inner and outer surfaces of the structure, the resulting finite element model will accurately capture the whole three-dimensional structure, producing a complex solid made of voxel elements. A comparison analysis with a CAD-based model is carried out on a historical building damaged by a seismic event. The results indicate that the proposed procedure is effective and obtains comparable models in a shorter time, with an increased level of automation.
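
    The published procedure stacks point sections and fills the interior of the structure; the Python sketch below only illustrates the basic point-to-voxel mapping on which such voxel-element models rest, using a randomly generated stand-in for a laser-scan cloud.

        import numpy as np

        def voxelize(points, voxel_size):
            """Map a point cloud (N x 3 array of x, y, z) onto a regular voxel grid.
            Returns a boolean occupancy array; each True voxel can later be exported
            as one hexahedral finite element."""
            origin = points.min(axis=0)
            idx = np.floor((points - origin) / voxel_size).astype(int)
            grid = np.zeros(idx.max(axis=0) + 1, dtype=bool)
            grid[idx[:, 0], idx[:, 1], idx[:, 2]] = True
            return grid, origin

        # Usage with a hypothetical laser-scan cloud (coordinates in metres).
        points = np.random.rand(10000, 3) * [10.0, 8.0, 15.0]   # placeholder data
        grid, origin = voxelize(points, voxel_size=0.25)
        print(grid.shape, grid.sum(), "occupied voxels")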

  17. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which make them an ideal basis for tools targeted at non...

  18. Accuracy of Estimation of Graft Size for Living-Related Liver Transplantation: First Results of a Semi-Automated Interactive Software for CT-Volumetry

    Science.gov (United States)

    Mokry, Theresa; Bellemann, Nadine; Müller, Dirk; Lorenzo Bermejo, Justo; Klauß, Miriam; Stampfl, Ulrike; Radeleff, Boris; Schemmer, Peter; Kauczor, Hans-Ulrich; Sommer, Christof-Matthias

    2014-01-01

    Objectives To evaluate the accuracy of estimated graft size for living-related liver transplantation using semi-automated interactive software for CT-volumetry. Materials and Methods Sixteen donors for living-related liver transplantation (11 male; mean age: 38.2±9.6 years) underwent contrast-enhanced CT prior to graft removal. CT-volumetry was performed using semi-automated interactive software (P), and compared with manual commercial software (TR). For P, liver volumes were provided either with or without vessels. For TR, liver volumes were always provided with vessels. Intraoperative weight served as the reference standard. Major study goals included analyses of volumes using absolute numbers, linear regression analyses and inter-observer agreements. Minor study goals included the description of the software workflow: degree of manual correction, speed for completion, and overall intuitiveness using five-point Likert scales: 1–markedly lower/faster/higher for P compared with TR, 2–slightly lower/faster/higher for P compared with TR, 3–identical for P and TR, 4–slightly lower/faster/higher for TR compared with P, and 5–markedly lower/faster/higher for TR compared with P. Results Liver segments II/III, II–IV and V–VIII served in 6, 3, and 7 donors as transplanted liver segments. Volumes were 642.9±368.8 ml for TR with vessels, 623.8±349.1 ml for P with vessels, and 605.2±345.8 ml for P without vessels (P…). CT-volumetry performed with P can accurately predict graft size for living-related liver transplantation while improving workflow compared with TR. PMID:25330198

  19. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    Science.gov (United States)

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2014-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance. PMID:24847184

  20. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments.

    Science.gov (United States)

    Bass, Ellen J; Baumgart, Leigh A; Shepley, Kathryn Klein

    2013-03-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noisy sensor data are used by both the human and the information analysis automation to make judgments. In a simplified air traffic conflict prediction experiment, 32 participants made probability of horizontal conflict judgments under different display content conditions. After being exposed to the information analysis automation, judgment achievement significantly improved for all participants as compared to judgments without any of the automation's information. Participants provided with additional display content pertaining to cue variability in the task environment had significantly higher aided judgment achievement compared to those provided with only the automation's judgment of a probability of conflict. When designing information analysis automation for environments where the automation's judgment achievement is impacted by noisy environmental data, it may be beneficial to show additional task environment information to the human judge in order to improve judgment performance.

  1. Process Concepts for Semi-automatic Dismantling of LCD Televisions

    OpenAIRE

    Elo, Kristofer; Sundin, Erik

    2014-01-01

    There is a large variety of electrical and electronic equipment products, for example liquid crystal display television sets (LCD TVs), in the waste stream today. Many LCD TVs contain mercury, which is a challenge to treat at the recycling plants. Two currently used processes to recycle LCD TVs are automated shredding and manual disassembly. This paper aims to present concepts for semi-automated dismantling processes for LCD TVs in order to achieve higher productivity and flexibility, and in tu...

  2. Flow injection analysis: Emerging tool for laboratory automation in radiochemistry

    International Nuclear Information System (INIS)

    Egorov, O.; Ruzicka, J.; Grate, J.W.; Janata, J.

    1996-01-01

    Automation of routine and serial assays is a common practice of the modern analytical laboratory, while it is virtually nonexistent in the field of radiochemistry. Flow injection analysis (FIA) is a general solution handling methodology that has been extensively used for automation of routine assays in many areas of analytical chemistry. Reproducible automated solution handling and on-line separation capabilities are among several distinctive features that make FI a very promising, yet underutilized tool for automation in analytical radiochemistry. The potential of the technique is demonstrated through the development of an automated 90Sr analyzer and its application in the analysis of tank waste samples from the Hanford site. Sequential injection (SI), the latest generation of FIA, is used to rapidly separate 90Sr from interfering radionuclides and deliver the separated Sr zone to a flow-through liquid scintillation detector. The separation is performed on a mini column containing Sr-specific sorbent extraction material, which selectively retains Sr under acidic conditions. The 90Sr is eluted with water, mixed with scintillation cocktail, and sent through the flow cell of a flow-through counter, where 90Sr radioactivity is detected as a transient signal. Both peak area and peak height can be used for quantification of sample radioactivity. Alternatively, stopped flow detection can be performed to improve detection precision for low activity samples. The authors' current research activities are focused on expansion of radiochemical applications of FIA methodology, with an ultimate goal of creating a set of automated methods that will cover the basic needs of radiochemical analysis at the Hanford site. The results of preliminary experiments indicate that FIA is a highly suitable technique for the automation of chemically more challenging separations, such as separation of actinide elements

  3. Development of Semi-Automatic Lathe by using Intelligent Soft Computing Technique

    Science.gov (United States)

    Sakthi, S.; Niresh, J.; Vignesh, K.; Anand Raj, G.

    2018-03-01

    This paper discusses the enhancement of a conventional lathe machine to a semi-automated lathe machine by implementing a soft computing method. In the present scenario, the lathe machine plays a vital role in the engineering division of the manufacturing industry. While manual lathe machines are economical, their accuracy and efficiency are not up to the mark. On the other hand, CNC machines provide the desired accuracy and efficiency, but require a huge capital investment. In order to overcome this situation, a semi-automated approach towards the conventional lathe machine is developed by employing stepper motors on the horizontal and vertical drives, controlled by an Arduino UNO microcontroller. Based on the input parameters of the lathe operation, the Arduino code is generated and transferred to the UNO board. Thus upgrading from manual to semi-automatic lathe machines can significantly increase accuracy and efficiency while, at the same time, keeping a check on investment cost and consequently provide a much-needed boost to the manufacturing industry.

  4. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models; on the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
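
    The BT-FLEMO model itself is not reproduced in this record; the sketch below merely illustrates, with invented predictors, how a bagging ensemble of regression trees yields a loss distribution (and hence an uncertainty estimate) rather than a single deterministic value.

        import numpy as np
        from sklearn.ensemble import BaggingRegressor

        # Hypothetical predictors per land-use unit: water depth, duration, building
        # quality, precaution indicator (all scaled to 0-1); target: relative loss.
        X = np.random.rand(500, 4)                         # placeholder training data
        y = np.clip(0.6 * X[:, 0] + 0.1 * np.random.randn(500), 0, 1)

        # BaggingRegressor's default base estimator is a decision tree, i.e. bagging trees.
        model = BaggingRegressor(n_estimators=100, random_state=0).fit(X, y)

        # Each tree gives one loss estimate; the ensemble spread approximates the
        # predictive distribution for a new land-use unit.
        x_new = np.array([[0.4, 0.5, 0.7, 0.0]])
        per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
        print(per_tree.mean(), np.percentile(per_tree, [5, 95]))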

  5. Automated image analysis in the study of collagenous colitis

    DEFF Research Database (Denmark)

    Fiehn, Anne-Marie Kanstrup; Kristensson, Martin; Engel, Ulla

    2016-01-01

    PURPOSE: The aim of this study was to develop an automated image analysis software to measure the thickness of the subepithelial collagenous band in colon biopsies with collagenous colitis (CC) and incomplete CC (CCi). The software measures the thickness of the collagenous band on microscopic...... slides stained with Van Gieson (VG). PATIENTS AND METHODS: A training set consisting of ten biopsies diagnosed as CC, CCi, and normal colon mucosa was used to develop the automated image analysis (VG app) to match the assessment by a pathologist. The study set consisted of biopsies from 75 patients...

  6. ORIGAMI Automator Primer. Automated ORIGEN Source Terms and Spent Fuel Storage Pool Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Wieselquist, William A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thompson, Adam B. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bowman, Stephen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Peterson, Joshua L. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-04-01

    Source terms and spent nuclear fuel (SNF) storage pool decay heat load analyses for operating nuclear power plants require a large number of Oak Ridge Isotope Generation and Depletion (ORIGEN) calculations. SNF source term calculations also require a significant amount of bookkeeping to track quantities such as core and assembly operating histories, spent fuel pool (SFP) residence times, heavy metal masses, and enrichments. The ORIGEN Assembly Isotopics (ORIGAMI) module in the SCALE code system provides a simple scheme for entering these data. However, given the large scope of the analysis, extensive scripting is necessary to convert formats and process data to create thousands of ORIGAMI input files (one per assembly) and to process the results into formats readily usable by follow-on analysis tools. This primer describes a project within the SCALE Fulcrum graphical user interface (GUI) called ORIGAMI Automator that was developed to automate the scripting and bookkeeping in large-scale source term analyses. The ORIGAMI Automator enables the analyst to (1) easily create, view, and edit the reactor site and assembly information, (2) automatically create and run ORIGAMI inputs, and (3) analyze the results from ORIGAMI. ORIGAMI Automator uses the standard ORIGEN binary concentrations files produced by ORIGAMI, with concentrations available at all time points in each assembly’s life. The GUI plots results such as mass, concentration, activity, and decay heat using a powerful new ORIGEN Post-Processing Utility for SCALE (OPUS) GUI component. This document includes a description and user guide for the GUI, a step-by-step tutorial for a simplified scenario, and appendices that document the file structures used.
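
    The scripting burden described above can be illustrated with a small Python sketch that expands a per-assembly bookkeeping table into one input file per assembly; the CSV columns and the deck template are placeholders, not the actual ORIGAMI input syntax (which is documented in the SCALE manual) or the Automator's implementation.

        import csv
        import io
        from pathlib import Path

        # Hypothetical bookkeeping data; in practice this would come from the
        # reactor site and assembly database described above.
        BOOKKEEPING = io.StringIO(
            "id,enrichment,specific_power,cycle_days\n"
            "A01,4.2,38.5,480\n"
            "A02,3.8,36.0,510\n"
        )

        # Placeholder template standing in for one ORIGAMI input deck.
        TEMPLATE = ("' ORIGAMI deck for assembly {id} (placeholder syntax)\n"
                    "' enrichment {enrichment} wt%, {specific_power} MW/MTU, "
                    "{cycle_days} d per cycle\n")

        out_dir = Path("origami_inputs")
        out_dir.mkdir(exist_ok=True)
        for row in csv.DictReader(BOOKKEEPING):
            (out_dir / f"{row['id']}.inp").write_text(TEMPLATE.format(**row))
        print("wrote", len(list(out_dir.glob("*.inp"))), "input files")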

  7. Mesoscale characterization of local property distributions in heterogeneous electrodes

    Science.gov (United States)

    Hsu, Tim; Epting, William K.; Mahbub, Rubayyat; Nuhfer, Noel T.; Bhattacharya, Sudip; Lei, Yinkai; Miller, Herbert M.; Ohodnicki, Paul R.; Gerdes, Kirk R.; Abernathy, Harry W.; Hackett, Gregory A.; Rollett, Anthony D.; De Graef, Marc; Litster, Shawn; Salvador, Paul A.

    2018-05-01

    The performance of electrochemical devices depends on the three-dimensional (3D) distributions of microstructural features in their electrodes. Several mature methods exist to characterize 3D microstructures over the microscale (tens of microns), which are useful in understanding homogeneous electrodes. However, methods that capture mesoscale (hundreds of microns) volumes at appropriate resolution (tens of nm) are lacking, though they are needed to understand more common, less ideal electrodes. Using serial sectioning with a Xe plasma focused ion beam combined with scanning electron microscopy (Xe PFIB-SEM), two commercial solid oxide fuel cell (SOFC) electrodes are reconstructed over volumes of 126 × 73 × 12.5 and 124 × 110 × 8 μm³ with a resolution on the order of ≈ 50³ nm³. The mesoscale distributions of microscale structural features are quantified and both microscale and mesoscale inhomogeneities are found. We analyze the origin of inhomogeneity over different length scales by comparing experimental and synthetic microstructures, generated with different particle size distributions, with such synthetic microstructures capturing well the high-frequency heterogeneity. Effective medium theory models indicate that significant mesoscale variations in local electrochemical activity are expected throughout such electrodes. These methods offer improved understanding of the performance of complex electrodes in energy conversion devices.

  8. Towards high resolution mapping of 3-D mesoscale dynamics from observations

    Directory of Open Access Journals (Sweden)

    B. Buongiorno Nardelli

    2012-10-01

    Full Text Available The MyOcean R&D project MESCLA (MEsoSCaLe dynamical Analysis through combined model, satellite and in situ data) was devoted to the high resolution 3-D retrieval of tracer and velocity fields in the oceans, based on the combination of in situ and satellite observations and quasi-geostrophic dynamical models. The retrieval techniques were also tested and compared with the output of a primitive equation model, with particular attention to the accuracy of the vertical velocity field as estimated through the Q vector formulation of the omega equation. The project focused on a test case, covering the region where the Gulf Stream separates from the US East Coast. This work demonstrated that innovative methods for the high resolution mapping of 3-D mesoscale dynamics from observations can be used to build the next generations of operational observation-based products.

  9. Semi-automated procedures for shoreline extraction using single RADARSAT-1 SAR image

    Science.gov (United States)

    Al Fugura, A.'kif; Billa, Lawal; Pradhan, Biswajeet

    2011-12-01

    Coastline identification is important for surveying and mapping reasons. The coastline serves as the basic point of reference and is used on nautical charts for navigation purposes. Its delineation has become crucial and more important in the wake of the many recent earthquakes and tsunamis, which have resulted in complete changes and redrawing of some shorelines. In a tropical country like Malaysia, the presence of cloud cover hinders the application of optical remote sensing data. In this study a semi-automated technique and procedures are presented for shoreline delineation from a RADARSAT-1 image. A RADARSAT-1 satellite scene was processed using an enhanced filtering technique to identify and extract the shoreline coast of Kuala Terengganu, Malaysia. RADARSAT imagery has many advantages over optical data because of its ability to penetrate cloud cover and its night sensing capabilities. At first, speckles were removed from the image using a Lee sigma filter to reduce random noise, enhance the image and discriminate the boundary between land and water. The results showed an accurate and improved extraction and delineation of the entire coastline of Kuala Terengganu. The study demonstrated the reliability of the image averaging filter in reducing random noise over the sea surface, especially near the shoreline. It enhanced land-water boundary differentiation, enabling better delineation of the shoreline. Overall, the developed techniques showed the potential of radar imagery for accurate shoreline mapping and will be useful for monitoring shoreline changes during high and low tides as well as shoreline erosion in a tropical country like Malaysia.
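
    As a rough illustration of the processing chain, the Python sketch below applies a basic Lee filter (standing in for the Lee sigma filter used in the study) to a synthetic SAR amplitude image and extracts a land/water boundary by thresholding; the threshold and image statistics are assumptions.

        import numpy as np
        from scipy import ndimage

        def lee_filter(img, size=7):
            """Basic Lee speckle filter: local mean plus a gain that depends on the
            local variance relative to the overall noise variance."""
            mean = ndimage.uniform_filter(img, size)
            mean_sq = ndimage.uniform_filter(img ** 2, size)
            var = np.clip(mean_sq - mean ** 2, 0, None)
            noise = var.mean()
            gain = var / (var + noise)
            return mean + gain * (img - mean)

        # Hypothetical SAR amplitude image; real data would be a calibrated RADARSAT scene.
        sar = np.random.gamma(shape=4.0, scale=25.0, size=(512, 512))
        filtered = lee_filter(sar)

        # Simple land/water separation by thresholding; the shoreline is then the
        # boundary of the binary water mask.
        water = filtered < np.percentile(filtered, 40)      # placeholder threshold
        shoreline = water ^ ndimage.binary_erosion(water)
        print(shoreline.sum(), "boundary pixels")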

  10. Upscale Impact of Mesoscale Disturbances of Tropical Convection on Convectively Coupled Kelvin Waves

    Science.gov (United States)

    Yang, Q.; Majda, A.

    2017-12-01

    Tropical convection associated with convectively coupled Kelvin waves (CCKWs) is typically organized by an eastward-moving synoptic-scale convective envelope with numerous embedded westward-moving mesoscale disturbances. It is of central importance to assess upscale impact of mesoscale disturbances on CCKWs as mesoscale disturbances propagate at various tilt angles and speeds. Here a simple multi-scale model is used to capture this multi-scale structure, where mesoscale fluctuations are directly driven by mesoscale heating and synoptic-scale circulation is forced by mean heating and eddy transfer of momentum and temperature. The two-dimensional version of the multi-scale model drives the synoptic-scale circulation, successfully reproduces key features of flow fields with a front-to-rear tilt and compares well with results from a cloud resolving model. In the scenario with an elevated upright mean heating, the tilted vertical structure of synoptic-scale circulation is still induced by the upscale impact of mesoscale disturbances. In a faster propagation scenario, the upscale impact becomes less important, while the synoptic-scale circulation response to mean heating dominates. In the unrealistic scenario with upward/westward tilted mesoscale heating, positive potential temperature anomalies are induced in the leading edge, which will suppress shallow convection in a moist environment. In its three-dimensional version, results show that upscale impact of mesoscale disturbances that propagate at tilt angles (110°–250°) induces negative lower-tropospheric potential temperature anomalies in the leading edge, providing favorable conditions for shallow convection in a moist environment, while the remaining tilt angle cases have opposite effects. Even in the presence of upright mean heating, the front-to-rear tilted synoptic-scale circulation can still be induced by eddy terms at tilt angles (120°–240°). In the case with fast propagating mesoscale heating, positive

  11. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  12. A fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    International Nuclear Information System (INIS)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-01-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest, quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summed to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, as well as when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed. (orig.)

  13. A catalog of automated analysis methods for enterprise models.

    Science.gov (United States)

    Florez, Hector; Sánchez, Mario; Villalobos, Jorge

    2016-01-01

    Enterprise models are created for documenting and communicating the structure and state of the Business and Information Technology elements of an enterprise. After models are completed, they are mainly used to support analysis. Model analysis is an activity typically based on human skills, and due to the size and complexity of the models, this process can be complicated and omissions or miscalculations are very likely. This situation has fostered research on automated analysis methods for supporting analysts in enterprise analysis processes. By reviewing the literature, we found several analysis methods; nevertheless, they are based on specific situations and different metamodels; thus, some analysis methods might not be applicable to all enterprise models. This paper presents the work of compilation (literature review), classification, structuring, and characterization of automated analysis methods for enterprise models, expressing them in a standardized modeling language. In addition, we have implemented the analysis methods in our modeling tool.

  14. Analysis of Trinity Power Metrics for Automated Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Michalenko, Ashley Christine [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-07-28

    This is a presentation from Los Alamos National Laboratory (LANL) about the analysis of Trinity power metrics for automated monitoring. The following topics are covered: current monitoring efforts, motivation for analysis, tools used, the methodology, work performed during the summer, and future work planned.

  15. An ImageJ-based algorithm for a semi-automated method for microscopic image enhancement and DNA repair foci counting

    International Nuclear Information System (INIS)

    Klokov, D.; Suppiah, R.

    2015-01-01

    Proper evaluation of the health risks of low-dose ionizing radiation exposure heavily relies on the ability to accurately measure very low levels of DNA damage in cells. One of the most sensitive methods for measuring DNA damage levels is the quantification of DNA repair foci that consist of macromolecular aggregates of DNA repair proteins, such as γH2AX and 53BP1, forming around individual DNA double-strand breaks. They can be quantified using immunofluorescence microscopy and are widely used as markers of DNA double-strand breaks. However, this quantification, if performed manually, may be very tedious and prone to inter-individual bias. Low-dose radiation studies are especially sensitive to this potential bias due to the very low magnitude of the anticipated effects. Therefore, we designed and validated an algorithm for the semi-automated processing of microscopic images and quantification of DNA repair foci. The algorithm uses ImageJ, freely available image analysis software that is customizable to individual cellular properties or experimental conditions. We validated the algorithm using immunolabeled 53BP1 and γH2AX in normal human fibroblast AG01522 cells under both normal and irradiated conditions. This method is easy to learn, can be used by untrained personnel, and can help avoid discrepancies in inter-laboratory comparison studies examining the effects of low-dose radiation. (author)
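
    The published ImageJ macro is not reproduced here; the following Python/scipy sketch only illustrates the smooth-threshold-label idea behind foci counting, with an assumed mean-plus-3-sigma threshold and a synthetic focus added to random background.

        import numpy as np
        from scipy import ndimage

        def count_foci(channel, smoothing_sigma=2.0, min_size=4):
            """Count bright foci in a single fluorescence channel (2D array): smooth,
            threshold relative to the background, and label connected components
            above a minimum size."""
            smoothed = ndimage.gaussian_filter(channel.astype(float), smoothing_sigma)
            threshold = smoothed.mean() + 3.0 * smoothed.std()     # assumed criterion
            binary = smoothed > threshold
            labels, n = ndimage.label(binary)
            sizes = ndimage.sum(binary, labels, range(1, n + 1))
            return int(np.sum(sizes >= min_size))

        # Hypothetical gammaH2AX channel of one nucleus (real data: a cropped microscope image).
        img = np.random.poisson(5, size=(128, 128)).astype(float)
        img[40:44, 60:64] += 50                                    # synthetic focus
        print(count_foci(img), "foci detected")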

  16. An ImageJ-based algorithm for a semi-automated method for microscopic image enhancement and DNA repair foci counting

    Energy Technology Data Exchange (ETDEWEB)

    Klokov, D., E-mail: dmitry.klokov@cnl.ca [Canadian Nuclear Laboratories, Chalk River, Ontario (Canada); Suppiah, R. [Queen's Univ., Dept. of Biomedical and Molecular Sciences, Kingston, Ontario (Canada)

    2015-06-15

    Proper evaluation of the health risks of low-dose ionizing radiation exposure heavily relies on the ability to accurately measure very low levels of DNA damage in cells. One of the most sensitive methods for measuring DNA damage levels is the quantification of DNA repair foci that consist of macromolecular aggregates of DNA repair proteins, such as γH2AX and 53BP1, forming around individual DNA double-strand breaks. They can be quantified using immunofluorescence microscopy and are widely used as markers of DNA double-strand breaks. However, this quantification, if performed manually, may be very tedious and prone to inter-individual bias. Low-dose radiation studies are especially sensitive to this potential bias due to the very low magnitude of the anticipated effects. Therefore, we designed and validated an algorithm for the semi-automated processing of microscopic images and quantification of DNA repair foci. The algorithm uses ImageJ, freely available image analysis software that is customizable to individual cellular properties or experimental conditions. We validated the algorithm using immunolabeled 53BP1 and γH2AX in normal human fibroblast AG01522 cells under both normal and irradiated conditions. This method is easy to learn, can be used by untrained personnel, and can help avoid discrepancies in inter-laboratory comparison studies examining the effects of low-dose radiation. (author)

  17. Modeling of mesoscale dispersion effect on the piezoresistivity of carbon nanotube-polymer nanocomposites via 3D computational multiscale micromechanics methods

    International Nuclear Information System (INIS)

    Ren, Xiang; Seidel, Gary D; Chaurasia, Adarsh K; Oliva-Avilés, Andrés I; Ku-Herrera, José J; Avilés, Francis

    2015-01-01

    In uniaxial tension and compression experiments, carbon nanotube (CNT)-polymer nanocomposites have demonstrated exceptional mechanical and coupled electrostatic properties in the form of piezoresistivity. In order to better understand the correlation of the piezoresistive response with the CNT dispersion at the mesoscale, a 3D computational multiscale micromechanics model based on finite element analysis is constructed to predict the effective macroscale piezoresistive response of CNT/polymer nanocomposites. The key factors that may contribute to the overall piezoresistive response, i.e. the nanoscale electrical tunneling effect, the inherent CNT piezoresistivity and the CNT mesoscale network effect are incorporated in the model based on a 3D multiscale mechanical–electrostatic coupled code. The results not only explain how different nanoscale mechanisms influence the overall macroscale piezoresistive response through the mesoscale CNT network, but also give reason and provide bounds for the wide range of gauge factors found in the literature offering insight regarding how control of the mesoscale CNT networks can be used to tailor nanocomposite piezoresistive response. (paper)

  18. Analysis and design of power efficient semi-passive RFID tag

    Energy Technology Data Exchange (ETDEWEB)

    Che Wenyi; Guan Shuo; Wang Xiao; Xiong Tingwen; Xi Jingtian; Tan Xi; Yan Na; Min Hao, E-mail: yanna@fudan.edu.c [State Key Laboratory of ASIC and System, Auto-ID Laboratory, Fudan University, Shanghai 201203 (China)

    2010-07-15

    The analysis and design of a semi-passive radio frequency identification (RFID) tag is presented. By studying the power transmission link of the backscatter RFID system and exploiting a power conversion efficiency model for a multi-stage AC-DC charge pump, the calculation method for the semi-passive tag's read range is proposed. According to different read range limitation factors, an intuitive way to define the specifications of the tag's power budget and backscatter modulation index is given. A test chip is implemented in SMIC 0.18 μm standard CMOS technology under the guidance of theoretical analysis. The main building blocks are the threshold compensated charge pump and a low power wake-up circuit using the power triggering wake-up mode. The proposed semi-passive tag is fully compatible with the EPC C1G2 standard. It has a compact chip size of 0.54 mm², and is adaptable to batteries with a 1.2 to 2.4 V output voltage.

  19. Analysis and design of power efficient semi-passive RFID tag

    International Nuclear Information System (INIS)

    Che Wenyi; Guan Shuo; Wang Xiao; Xiong Tingwen; Xi Jingtian; Tan Xi; Yan Na; Min Hao

    2010-01-01

    The analysis and design of a semi-passive radio frequency identification (RFID) tag is presented. By studying the power transmission link of the backscatter RFID system and exploiting a power conversion efficiency model for a multi-stage AC-DC charge pump, the calculation method for the semi-passive tag's read range is proposed. According to different read range limitation factors, an intuitive way to define the specifications of the tag's power budget and backscatter modulation index is given. A test chip is implemented in SMIC 0.18 μm standard CMOS technology under the guidance of theoretical analysis. The main building blocks are the threshold compensated charge pump and a low power wake-up circuit using the power triggering wake-up mode. The proposed semi-passive tag is fully compatible with the EPC C1G2 standard. It has a compact chip size of 0.54 mm², and is adaptable to batteries with a 1.2 to 2.4 V output voltage.
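
    The papers derive the read range from the backscatter link budget and the measured charge-pump conversion efficiency; a simpler Friis-based forward-link bound can be sketched as below, where the reader EIRP, antenna gain and chip threshold power are illustrative assumptions rather than values from this work.

        import math

        # Illustrative link-budget numbers (assumptions, not values from the paper).
        freq_hz = 915e6
        wavelength = 3e8 / freq_hz
        eirp_w = 4.0            # reader EIRP (roughly the FCC limit, ~36 dBm)
        tag_gain = 1.64         # dipole-like tag antenna gain (linear)
        p_chip_w = 10e-6        # RF power needed at the chip; a semi-passive tag needs
                                # less from the field because the battery supplies the chip

        # Friis-based forward-link range: received power falls off as (lambda / (4*pi*d))^2.
        d_max = (wavelength / (4 * math.pi)) * math.sqrt(eirp_w * tag_gain / p_chip_w)
        print(f"forward-link-limited read range = {d_max:.1f} m")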

  20. Semi-automated separation of the epimeric dehydropyrrolizidine alkaloids lycopsamine and intermedine: preparation of their N-oxides and NMR comparison with diastereoisomeric rinderine and echinatine.

    Science.gov (United States)

    Colegate, Steven M; Gardner, Dale R; Betz, Joseph M; Panter, Kip E

    2014-01-01

    The diversity of structure and, particularly, stereochemical variation of the dehydropyrrolizidine alkaloids can present challenges for analysis and the isolation of pure compounds for the preparation of analytical standards and for toxicology studies. To investigate methods for the separation of gram-scale quantities of the epimeric dehydropyrrolizidine alkaloids lycopsamine and intermedine and to compare their NMR spectroscopic data with those of their heliotridine-based analogues echinatine and rinderine. Lycopsamine and intermedine were extracted, predominantly as their N-oxides and along with their acetylated derivatives, from commercial samples of comfrey (Symphytum officinale) root. Alkaloid enrichment involved liquid-liquid partitioning of the crude methanol extract between dilute aqueous acid and n-butanol, reduction of N-oxides and subsequent continuous liquid-liquid extraction of free base alkaloids into CHCl3 . The alkaloid-rich fraction was further subjected to semi-automated flash chromatography using boronated soda glass beads or boronated quartz sand. Boronated soda glass beads (or quartz sand) chromatography adapted to a Biotage Isolera Flash Chromatography System enabled large-scale separation (at least up to 1-2 g quantities) of lycopsamine and intermedine. The structures were confirmed using one- and two-dimensional (1) H- and (13) C-NMR spectroscopy. Examination of the NMR data for lycopsamine, intermedine and their heliotridine-based analogues echinatine and rinderine allowed for some amendments of literature data and provided useful comparisons for determining relative configurations in monoester dehydropyrrolizidine alkaloids. A similar NMR comparison of lycopsamine and intermedine with their N-oxides showed the effects of N-oxidation on some key chemical shifts. A levorotatory shift in specific rotation from +3.29° to -1.5° was observed for lycopsamine when dissolved in ethanol or methanol respectively. A semi-automated flash

  1. Semi-automated operation of Mars Climate Simulation chamber - MCSC modelled for biological experiments

    Science.gov (United States)

    Tarasashvili, M. V.; Sabashvili, Sh. A.; Tsereteli, S. L.; Aleksidze, N. D.; Dalakishvili, O.

    2017-10-01

    The Mars Climate Simulation Chamber (MCSC) (GEO PAT 12 522/01) is designed for the investigation of the possible past and present habitability of Mars, as well as for the solution of practical tasks necessary for the colonization and Terraformation of the Planet. There are specific tasks such as the experimental investigation of the biological parameters that allow many terrestrial organisms to adapt to the imitated Martian conditions: chemistry of the ground, atmosphere, temperature, radiation, etc. The MCSC is configured for conducting various biological experiments, as well as for the selection of extremophile microorganisms for possible Settlement, Ecopoesis and/or Terraformation purposes and the investigation of their physiological functions. For long-term purposes, it is possible to cultivate genetically modified organisms (e.g., plants) adapted to the Martian conditions for future Martian agriculture to sustain human Mars missions and permanent settlements. The size of the chamber allows preliminary testing of the functionality of space-station mini-models and personal protection devices such as space-suits, covering and building materials and other structures. The reliability of the experimental biotechnological materials can also be tested over a period of years. Complex and thorough research has been performed to acquire the most appropriate technical tools for the accurate engineering of the MCSC and the precise, programmed simulation of Martian environmental conditions. This paper describes the construction and technical details of the equipment of the MCSC, which allows its semi-automated, long-term operation.

  2. A semi-automated method for non-invasive internal organ weight estimation by post-mortem magnetic resonance imaging in fetuses, newborns and children

    International Nuclear Information System (INIS)

    Thayyil, Sudhin; Schievano, Silvia; Robertson, Nicola J.; Jones, Rodney; Chitty, Lyn S.; Sebire, Neil J.; Taylor, Andrew M.

    2009-01-01

    Magnetic resonance (MR) imaging allows minimally invasive autopsy, especially when consent is declined for traditional autopsy. Estimation of individual visceral organ weights is an important component of traditional autopsy. Objective: To examine whether a semi-automated method can be used for non-invasive internal organ weight measurement using post-mortem MR imaging in fetuses, newborns and children. Methods: Phase 1: In vitro scanning of 36 animal organs (heart, liver, kidneys) was performed to check the accuracy of the volume reconstruction methodology. Real volumes were measured by the water displacement method. Phase 2: Sixty-five whole body post-mortem MR scans were performed in fetuses (n = 30), newborns (n = 5) and children (n = 30) at 1.5 T using a 3D TSE T2-weighted sequence. These data were analysed offline using the image processing software Mimics 11.0. Results: Phase 1: Mean differences (S.D.) between estimated and actual volumes were -0.3 (1.5) ml for kidney, -0.7 (1.3) ml for heart, and -1.7 (3.6) ml for liver in the animal experiments. Phase 2: In fetuses, newborns and children, mean differences between estimated and actual weights (S.D.) were -0.6 (4.9) g for liver, -5.1 (1.2) g for spleen, -0.3 (0.6) g for adrenals, 0.4 (1.6) g for thymus, 0.9 (2.5) g for heart, -0.7 (2.4) g for kidneys and 2.7 (14) g for lungs. Excellent correlation was noted between estimated and actual weights (r² = 0.99, p < 0.001). Accuracy was lower when fetuses were less than 20 weeks or less than 300 g. Conclusion: Rapid, accurate and reproducible estimation of solid internal organ weights is feasible using the semi-automated 3D volume reconstruction method.
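
    Once the organ has been segmented, the weight estimate reduces to voxel counting times an assumed tissue density; the sketch below shows that arithmetic only (the study used the Mimics software for the segmentation itself), with a made-up mask and an assumed soft-tissue density of 1.05 g/ml.

        import numpy as np

        def organ_weight_g(mask, voxel_dims_mm, density_g_per_ml=1.05):
            """Estimate organ weight from a binary segmentation mask.
            voxel_dims_mm: (dz, dy, dx) voxel edge lengths in millimetres.
            density_g_per_ml: assumed soft-tissue density (about 1.05 g/ml)."""
            voxel_ml = np.prod(voxel_dims_mm) / 1000.0     # mm^3 -> ml
            volume_ml = mask.sum() * voxel_ml
            return volume_ml * density_g_per_ml

        # Hypothetical liver mask from a post-mortem MR volume segmented offline.
        mask = np.zeros((200, 256, 256), dtype=bool)
        mask[60:140, 80:200, 90:220] = True
        print(f"{organ_weight_g(mask, (1.0, 0.8, 0.8)):.0f} g")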

  3. Critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, P T; McCulloch, J [Glasgow Univ. (UK)

    1983-06-13

    Semi-quantitative analysis (e.g. optical density ratios) of (¹⁴C)2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of ¹⁴C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of ¹⁴C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of (¹⁴C)2-deoxyglucose autoradiograms is undertaken.

  4. Mesoscale wind fluctuations over Danish waters

    DEFF Research Database (Denmark)

    Vincent, Claire Louise

    in generated power are a particular problem for offshore wind farms because the typically high concentration of turbines within a limited geographical area means that fluctuations can be correlated across large numbers of turbines. Furthermore, organised mesoscale structures that often form over water......Mesoscale wind fluctuations affect the large scale integration of wind power because they undermine the day-ahead predictability of wind speed and power production, and because they can result in large fluctuations in power generation that must be balanced using reserve power. Large fluctuations...... that realistic hour-scale wind fluctuations and open cellular convection patterns develop in WRF simulations with 2 km horizontal grid spacing. The atmospheric conditions during one of the case studies are then used to initialise a simplified version of the model that has no large scale weather forcing, topography

  5. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Full Text Available Abstract Background Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases

  6. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis.

    Science.gov (United States)

    Wang, Tao; Shao, Kang; Chu, Qinying; Ren, Yanfei; Mu, Yiming; Qu, Lijia; He, Jie; Jin, Changwen; Xia, Bin

    2009-03-16

    Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and free licensed software tools available to assist these tasks, researchers usually have to use multiple software packages for their studies because software packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform, in which these tasks can be completed within one package. Moreover, with open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. Automics provides high throughput automatic modules with most recently proposed algorithms and powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied to analyses including: feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-Mean) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. The functional ability of Automics is demonstrated with an analysis of a type 2 diabetes metabolic profile. Automics facilitates high throughput 1D NMR spectral processing and high dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can complete spectral processing and data analysis within one software package in most cases. Moreover, with its open source architecture, interested
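
    Automics itself is a standalone package; the snippet below only illustrates, with made-up bucketed spectra and scikit-learn as a stand-in, the kind of normalisation, PCA and PLS-DA-style workflow listed in the abstract.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cross_decomposition import PLSRegression

        # Hypothetical bucketed 1D NMR data: one row per sample, one column per
        # spectral bin; y encodes the group (0 = control, 1 = type 2 diabetes).
        X = np.random.rand(60, 250)                      # placeholder spectra
        y = np.repeat([0, 1], 30).astype(float)

        # Normalise each spectrum to constant total intensity, then mean-centre.
        X = X / X.sum(axis=1, keepdims=True)
        X -= X.mean(axis=0)

        scores = PCA(n_components=2).fit_transform(X)    # unsupervised overview
        pls = PLSRegression(n_components=2).fit(X, y)    # supervised PLS-DA style model
        print(scores.shape, pls.score(X, y))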

  7. Failure mode and effect analysis oriented to risk-reduction interventions in intraoperative electron radiation therapy: The specific impact of patient transportation, automation, and treatment planning availability

    International Nuclear Information System (INIS)

    López-Tarjuelo, Juan; Bouché-Babiloni, Ana; Santos-Serra, Agustín; Morillo-Macías, Virginia; Calvo, Felipe A.; Kubyshin, Yuri

    2014-01-01

    Background and purpose: Industrial companies use failure mode and effect analysis (FMEA) to improve quality. Our objective was to describe an FMEA and subsequent interventions for an automated intraoperative electron radiotherapy (IOERT) procedure with computed tomography simulation, pre-planning, and a fixed conventional linear accelerator. Material and methods: A process map, an FMEA, and a fault tree analysis are reported. The equipment considered was the radiance treatment planning system (TPS), the Elekta Precise linac, and TN-502RDM-H metal–oxide-semiconductor-field-effect transistor in vivo dosimeters. Computerized order-entry and treatment-automation were also analyzed. Results: Fifty-seven potential modes and effects were identified and classified into ‘treatment cancellation’ and ‘delivering an unintended dose’. They were graded from ‘inconvenience’ or ‘suboptimal treatment’ to ‘total cancellation’ or ‘potentially wrong’ or ‘very wrong administered dose’, although these latter effects were never experienced. Risk priority numbers (RPNs) ranged from 3 to 324 and totaled 4804. After interventions such as double checking, interlocking, automation, and structural changes the final total RPN was reduced to 1320. Conclusions: FMEA is crucial for prioritizing risk-reduction interventions. In a semi-surgical procedure like IOERT double checking has the potential to reduce risk and improve quality. Interlocks and automation should also be implemented to increase the safety of the procedure

  8. Automated Asteroseismic Analysis of Solar-type Stars

    DEFF Research Database (Denmark)

    Karoff, Christoffer; Campante, T.L.; Chaplin, W.J.

    2010-01-01

    The rapidly increasing volume of asteroseismic observations on solar-type stars has revealed a need for automated analysis tools. The reason for this is not only that individual analyses of single stars are rather time consuming, but more importantly that these large volumes of observations open...... are calculated in a consistent way. Here we present a set of automated asteroseismic analysis tools. The main engine of this set of tools is an algorithm for modelling the autocovariance spectra of the stellar acoustic spectra, allowing us to measure not only the frequency of maximum power and the large......, radius, luminosity, effective temperature, surface gravity and age based on grid modeling. All the tools take into account the window function of the observations, which means that they work equally well for space-based photometry observations from e.g. the NASA Kepler satellite and ground-based velocity...

  9. Semi-continuous protein fractionating using affinity cross-flow filtration

    NARCIS (Netherlands)

    Borneman, Zandrie; Zhang, W.; van den Boomgaard, Anthonie; Smolders, C.A.

    2002-01-01

    Protein purification by means of downstream processing is increasingly important. At the University of Twente a semi-continuous process is developed for the isolation of BSA out of crude protein mixtures. For this purpose an automated Affinity Cross-Flow Filtration, ACFF, process is developed. This

  10. Automated Image Analysis Corrosion Working Group Update: February 1, 2018

    Energy Technology Data Exchange (ETDEWEB)

    Wendelberger, James G. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-01

    These are slides for the automated image analysis corrosion working group update. The overall goals were to automate the detection and quantification of features in images (faster, more accurate), to establish how to do this (obtain data, analyze data), and to focus on Laser Scanning Confocal Microscope (LCM) data (laser intensity, laser height/depth, optical RGB, optical plus laser RGB).

  11. Mesoscale analysis of failure in quasi-brittle materials: comparison between lattice model and acoustic emission data.

    Science.gov (United States)

    Grégoire, David; Verdon, Laura; Lefort, Vincent; Grassl, Peter; Saliba, Jacqueline; Regoin, Jean-Pierre; Loukili, Ahmed; Pijaudier-Cabot, Gilles

    2015-10-25

    The purpose of this paper is to analyse the development and the evolution of the fracture process zone during fracture and damage in quasi-brittle materials. A model taking into account the material details at the mesoscale is used to describe the failure process at the scale of the heterogeneities. This model is used to compute histograms of the relative distances between damaged points. These numerical results are compared with experimental data, where the damage evolution is monitored using acoustic emissions. Histograms of the relative distances between damage events in the numerical calculations and acoustic events in the experiments exhibit good agreement. It is shown that the mesoscale model provides relevant information from the point of view of both global responses and the local failure process. © 2015 The Authors. International Journal for Numerical and Analytical Methods in Geomechanics published by John Wiley & Sons Ltd.
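
    The comparison described, histograms of relative distances between damaged points in the model and between acoustic events in the experiments, can be sketched as follows. Event coordinates and the binning choice are placeholders, not the study data.

```python
# Sketch: histogram of pairwise (relative) distances between damage/AE event
# locations, as used to compare lattice-model output with acoustic emissions.
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
events = rng.uniform(0, 100.0, size=(500, 3))    # placeholder event coordinates (mm)

distances = pdist(events)                         # all pairwise Euclidean distances
hist, edges = np.histogram(distances, bins=50, density=True)

# The same histogram computed on experimental AE locations can then be compared
# bin by bin with the numerical one.
print(f"{distances.size} pairs, mean relative distance {distances.mean():.1f} mm")
```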

  12. Semi-automated segmentation of a glioblastoma multiforme on brain MR images for radiotherapy planning.

    Science.gov (United States)

    Hori, Daisuke; Katsuragawa, Shigehiko; Murakami, Ryuuji; Hirai, Toshinori

    2010-04-20

    We propose a computerized method for semi-automated segmentation of the gross tumor volume (GTV) of a glioblastoma multiforme (GBM) on brain MR images for radiotherapy planning (RTP). Three-dimensional (3D) MR images of 28 cases with a GBM were used in this study. First, a sphere volume of interest (VOI) including the GBM was selected by clicking a part of the GBM region in the 3D image. Then, the sphere VOI was transformed to a two-dimensional (2D) image by use of a spiral-scanning technique. We employed active contour models (ACM) to delineate an optimal outline of the GBM in the transformed 2D image. After inverse transform of the optimal outline to the 3D space, a morphological filter was applied to smooth the shape of the 3D segmented region. For evaluation of our computerized method, we compared the computer output with manually segmented regions, which were obtained by a therapeutic radiologist using a manual tracking method. In evaluating our segmentation method, we employed the Jaccard similarity coefficient (JSC) and the true segmentation coefficient (TSC) in volumes between the computer output and the manually segmented region. The mean and standard deviation of JSC and TSC were 74.2 ± 9.8% and 84.1 ± 7.1%, respectively. Our segmentation method provided a relatively accurate outline for GBM and would be useful for radiotherapy planning.
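
    The Jaccard similarity coefficient used for the evaluation can be computed directly from the two voxel masks, as in the sketch below; the TSC is not reproduced here because its exact definition is not given in this record, and the toy volumes are placeholders.

```python
# Sketch: Jaccard similarity coefficient between a computed 3D segmentation and
# a manual reference, both given as boolean voxel masks.
import numpy as np

def jaccard(computed: np.ndarray, manual: np.ndarray) -> float:
    computed, manual = computed.astype(bool), manual.astype(bool)
    intersection = np.logical_and(computed, manual).sum()
    union = np.logical_or(computed, manual).sum()
    return intersection / union if union else 1.0

# Toy example: two overlapping spheres on a small voxel grid
z, y, x = np.ogrid[:64, :64, :64]
a = (x - 30) ** 2 + (y - 30) ** 2 + (z - 30) ** 2 < 15 ** 2
b = (x - 34) ** 2 + (y - 30) ** 2 + (z - 30) ** 2 < 15 ** 2
print(f"JSC = {100 * jaccard(a, b):.1f}%")
```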

  13. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence of a particular sequence of images containing a fence crossing event. This resulted in a reduction of 54.8% of images that required further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
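
    A minimal sketch of the kind of background-subtraction rule described: an image is flagged as a candidate event when enough pixels differ from an empty-scene reference frame. The threshold values, file names and helper function are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of a background-subtraction screen for still camera trap images.
import numpy as np
from PIL import Image

def candidate_event(image_path, background_path, diff_thresh=30, frac_thresh=0.02):
    img = np.asarray(Image.open(image_path).convert("L"), dtype=np.int16)
    bg = np.asarray(Image.open(background_path).convert("L"), dtype=np.int16)
    changed = np.abs(img - bg) > diff_thresh      # per-pixel change mask
    return changed.mean() > frac_thresh           # fraction of changed pixels

# e.g. sort a sequence of frames into "likely crossing" vs. "likely empty":
# labels = [candidate_event(f, "empty_reference.jpg") for f in frame_paths]
```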

  14. Lightning characteristics of derecho producing mesoscale convective systems

    Science.gov (United States)

    Bentley, Mace L.; Franks, John R.; Suranovic, Katelyn R.; Barbachem, Brent; Cannon, Declan; Cooper, Stonie R.

    2016-06-01

    Derechos, or widespread, convectively induced wind storms, are a common warm season phenomenon in the Central and Eastern United States. These damaging and severe weather events are known to sweep quickly across large spatial regions of more than 400 km and produce wind speeds exceeding 121 km h-1. Although extensive research concerning derechos and their parent mesoscale convective systems already exists, there have been few investigations of the spatial and temporal distribution of associated cloud-to-ground lightning with these events. This study analyzes twenty warm season (May through August) derecho events between 2003 and 2013 in an effort to discern their lightning characteristics. Data used in the study included cloud-to-ground flash data derived from the National Lightning Detection Network, WSR-88D imagery from the University Corporation for Atmospheric Research, and damaging wind report data obtained from the Storm Prediction Center. A spatial and temporal analysis was conducted by incorporating these data into a geographic information system to determine the distribution and lightning characteristics of the environments of derecho producing mesoscale convective systems. Primary foci of this research include: (1) finding the approximate size of the lightning activity region for individual and combined event(s); (2) determining the intensity of each event by examining the density and polarity of lightning flashes; (3) locating areas of highest lightning flash density; and (4) providing a lightning spatial analysis that outlines the temporal and spatial distribution of flash activity for particularly strong derecho producing thunderstorm episodes.

  15. Semi-Automated Diagnosis, Repair, and Rework of Spacecraft Electronics

    Science.gov (United States)

    Struk, Peter M.; Oeftering, Richard C.; Easton, John W.; Anderson, Eric E.

    2008-01-01

    NASA's Constellation Program for Exploration of the Moon and Mars places human crews in extreme isolation in resource scarce environments. Near Earth, the discontinuation of Space Shuttle flights after 2010 will alter the up- and down-mass capacity for the International Space Station (ISS). NASA is considering new options for logistics support strategies for future missions. Aerospace systems are often composed of replaceable modular blocks that minimize the need for complex service operations in the field. Such a strategy however, implies a robust and responsive logistics infrastructure with relatively low transportation costs. The modular Orbital Replacement Units (ORUs) used for the ISS require relatively large blocks of replacement hardware even though the actual failed component may be three orders of magnitude smaller. The ability to perform in-situ repair of electronics circuits at the component level can dramatically reduce the scale of spares and related logistics cost. This ability also reduces mission risk, increases crew independence and improves the overall supportability of the program. The Component-Level Electronics Assembly Repair (CLEAR) task under the NASA Supportability program was established to demonstrate the practicality of repair by first investigating widely used soldering materials and processes (M&P) performed by modest manual means. The work will result in program guidelines for performing manual repairs along with design guidance for circuit reparability. The next phase of CLEAR recognizes that manual repair has its limitations and some highly integrated devices are extremely difficult to handle and demand semi-automated equipment. Further, electronics repairs require a broad range of diagnostic capability to isolate the faulty components. Finally, repairs must pass functional tests to determine that the repairs are successful and the circuit can be returned to service. To prevent equipment demands from exceeding spacecraft volume

  16. A robust computational solution for automated quantification of a specific binding ratio based on [123I]FP-CIT SPECT images

    International Nuclear Information System (INIS)

    Oliveira, F. P. M.; Tavares, J. M. R. S.; Borges, Faria D.; Campos, Costa D.

    2014-01-01

    The purpose of the current paper is to present a computational solution to accurately quantify the specific to non-specific uptake ratio in [123I]FP-CIT single photon emission computed tomography (SPECT) images and simultaneously measure the spatial dimensions of the basal ganglia, also known as basal nuclei. A statistical analysis based on a reference dataset selected by the user is also automatically performed. The quantification of the specific to non-specific uptake ratio here is based on regions of interest defined after the registration of the image under study with a template image. The computational solution was tested on a dataset of 38 [123I]FP-CIT SPECT images: 28 images were from patients with Parkinson's disease and the remainder from normal patients, and the results of the automated quantification were compared to the ones obtained by three well-known semi-automated quantification methods. The results revealed a high correlation coefficient between the developed automated method and the three semi-automated methods used for comparison (r ≥ 0.975). The solution also showed good robustness against different positions of the patient, as an almost perfect agreement between the specific to non-specific uptake ratios was found (ICC = 1.000). The mean processing time was around 6 seconds per study using a common notebook PC. The solution developed can be useful for clinicians to evaluate [123I]FP-CIT SPECT images due to its accuracy, robustness and speed. Also, the comparison between case studies and the follow-up of patients can be done more accurately and proficiently since the intra- and inter-observer variability of the semi-automated calculation does not exist in automated solutions. The dimensions of the basal ganglia and their automatic comparison with the values of the population selected as reference are also important for professionals in this area.
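
    The specific to non-specific uptake ratio is conventionally computed from mean counts in striatal and reference regions of interest; a minimal sketch is given below. The ROI masks are assumed to come from the template registration described in the record, and the function and array names are illustrative.

```python
# Sketch: specific-to-non-specific uptake ratio, (striatal - reference) / reference,
# from mean counts in regions of interest on a reconstructed SPECT volume.
import numpy as np

def uptake_ratio(spect_volume, striatal_mask, reference_mask):
    specific = spect_volume[striatal_mask].mean()      # e.g. putamen/caudate ROI
    nonspecific = spect_volume[reference_mask].mean()  # e.g. occipital reference ROI
    return (specific - nonspecific) / nonspecific

# ratio_left = uptake_ratio(volume, left_striatum_mask, occipital_mask)
```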

  17. Error Covariance Estimation of Mesoscale Data Assimilation

    National Research Council Canada - National Science Library

    Xu, Qin

    2005-01-01

    The goal of this project is to explore and develop new methods of error covariance estimation that will provide necessary statistical descriptions of prediction and observation errors for mesoscale data assimilation...

  18. Automated multivariate analysis of comprehensive two-dimensional gas chromatograms of petroleum

    DEFF Research Database (Denmark)

    Skov, Søren Furbo

    of separated compounds makes the analysis of GC×GC chromatograms tricky, as there is too much data for manual analysis, and automated analysis is not always trouble-free: manual checking of the results is often necessary. In this work, I will investigate the possibility of another approach to analysis of GC×GC...... impossible to find it. For a special class of models, multi-way models, unique solutions often exist, meaning that the underlying phenomena can be found. I have tested this class of models on GC×GC data from petroleum and conclude that more work is needed before they can be automated. I demonstrate how...

  19. Analysis And Control System For Automated Welding

    Science.gov (United States)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  20. Mesoscale hybrid calibration artifact

    Science.gov (United States)

    Tran, Hy D.; Claudet, Andre A.; Oliver, Andrew D.

    2010-09-07

    A mesoscale calibration artifact, also called a hybrid artifact, suitable for hybrid dimensional measurement, and the method for making the artifact. The hybrid artifact has structural characteristics that make it suitable for dimensional measurement in both vision-based systems and touch-probe-based systems. The hybrid artifact employs the intersection of bulk-micromachined planes to fabricate edges that are sharp to the nanometer level and intersecting planes with crystal-lattice-defined angles.

  1. Developing a semi/automated protocol to post-process large volume, High-resolution airborne thermal infrared (TIR) imagery for urban waste heat mapping

    Science.gov (United States)

    Rahman, Mir Mustafizur

    In collaboration with The City of Calgary 2011 Sustainability Direction and as part of the HEAT (Heat Energy Assessment Technologies) project, the focus of this research is to develop a semi/automated 'protocol' to post-process large volumes of high-resolution (H-res) airborne thermal infrared (TIR) imagery to enable accurate urban waste heat mapping. HEAT is a free GeoWeb service, designed to help Calgary residents improve their home energy efficiency by visualizing the amount and location of waste heat leaving their homes and communities, as easily as clicking on their house in Google Maps. HEAT metrics are derived from 43 flight lines of TABI-1800 (Thermal Airborne Broadband Imager) data acquired on May 13--14, 2012 at night (11:00 pm--5:00 am) over The City of Calgary, Alberta (~825 km²) at a 50 cm spatial resolution and 0.05°C thermal resolution. At present, the only way to generate a large area, high-spatial resolution TIR scene is to acquire separate airborne flight lines and mosaic them together. However, the ambient sensed temperature within and between flight lines naturally changes during acquisition (due to varying atmospheric and local micro-climate conditions), resulting in mosaicked images with different temperatures for the same scene components (e.g. roads, buildings), and mosaic join-lines that arbitrarily bisect many thousands of homes. In combination, these effects result in reduced utility and classification accuracy, including poorly defined HEAT metrics, inaccurate hotspot detection and raw imagery that is difficult to interpret. In an effort to minimize these effects, three new semi/automated post-processing algorithms (the protocol) are described, which are then used to generate a 43 flight line mosaic of TABI-1800 data from which accurate Calgary waste heat maps and HEAT metrics can be generated. These algorithms (presented as four peer-reviewed papers) are: (a) Thermal Urban Road Normalization (TURN)---used to mitigate the microclimatic

  2. New Mesoscale Fluvial Landscapes - Seismic Geomorphology and Exploration

    Science.gov (United States)

    Wilkinson, M. J.

    2013-01-01

    Megafans (100-600 km radius) are very large alluvial fans that cover significant areas on most continents, the surprising finding of recent global surveys. The number of such fans and the patterns of sedimentation on them provide new mesoscale architectures that can now be applied to continental fluvial depositional systems. Megafan-scale reconstructions underground have not yet been attempted. Seismic surveys offer new possibilities in identifying the following prospective situations at potentially unsuspected locations: (i) sand concentration points, (ii) sand-mud continuums at the mesoscale, (iii) paleo-valley forms in these generally unvalleyed landscapes, (iv) stratigraphic traps, and (v) structural traps.

  3. A semi-automated tool for reducing the creation of false closed depressions from a filled LIDAR-derived digital elevation model

    Science.gov (United States)

    Waller, John S.; Doctor, Daniel H.; Terziotti, Silvia

    2015-01-01

    Closed depressions on the land surface can be identified by ‘filling’ a digital elevation model (DEM) and subtracting the filled model from the original DEM. However, automated methods suffer from artificial ‘dams’ where surface streams cross under bridges and through culverts. Removal of these false depressions from an elevation model is difficult due to the lack of bridge and culvert inventories; thus, another method is needed to breach these artificial dams. Here, we present a semi-automated workflow and toolbox to remove falsely detected closed depressions created by artificial dams in a DEM. The approach finds the intersections between transportation routes (e.g., roads) and streams, and then lowers the elevation surface across the roads to stream level allowing flow to be routed under the road. Once the surface is corrected to match the approximate location of the National Hydrologic Dataset stream lines, the procedure is repeated with sequentially smaller flow accumulation thresholds in order to generate stream lines with less contributing area within the watershed. Through multiple iterations, artificial depressions that may arise due to ephemeral flow paths can also be removed. Preliminary results reveal that this new technique provides significant improvements for flow routing across a DEM and minimizes artifacts within the elevation surface. Slight changes in the stream flow lines generally improve the quality of flow routes; however some artificial dams may persist. Problematic areas include extensive road ditches, particularly along divided highways, and where surface flow crosses beneath road intersections. Limitations do exist, and the results partially depend on the quality of data being input. Of 166 manually identified culverts from a previous study by Doctor and Young in 2013, 125 are within 25 m of culverts identified by this tool. After three iterations, 1,735 culverts were identified and cataloged. The result is a reconditioned
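
    A minimal sketch of the fill-and-subtract step used to locate closed depressions is given below, implemented here with morphological reconstruction; the road/stream breaching and iterative flow-accumulation steps described in the record are not reproduced. The depth threshold and array names are illustrative assumptions.

```python
# Sketch: identify closed depressions by filling a DEM (morphological reconstruction
# by erosion) and subtracting the original surface.
import numpy as np
from skimage.morphology import reconstruction

def closed_depressions(dem: np.ndarray, min_depth: float = 0.1) -> np.ndarray:
    seed = dem.copy()
    seed[1:-1, 1:-1] = dem.max()                  # start from a flooded interior
    filled = reconstruction(seed, dem, method="erosion")
    depth = filled - dem                          # > 0 inside closed depressions
    return depth > min_depth                      # boolean mask of depressions

# depressions = closed_depressions(lidar_dem)    # lidar_dem: 2D elevation array (m)
```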

  4. The Effect of Information Analysis Automation Display Content on Human Judgment Performance in Noisy Environments

    OpenAIRE

    Bass, Ellen J.; Baumgart, Leigh A.; Shepley, Kathryn Klein

    2012-01-01

    Displaying both the strategy that information analysis automation employs to make its judgments and variability in the task environment may improve human judgment performance, especially in cases where this variability impacts the judgment performance of the information analysis automation. This work investigated the contribution of providing either information analysis automation strategy information, task environment information, or both, on human judgment performance in a domain where noi...

  5. Prevalence of discordant microscopic changes with automated CBC analysis

    Directory of Open Access Journals (Sweden)

    Fabiano de Jesus Santos

    2014-12-01

    Introduction: The most common cause of diagnostic error is related to errors in laboratory tests as well as errors in the interpretation of results. In order to reduce them, laboratories currently have modern equipment that provides accurate and reliable results. The development of automation has revolutionized laboratory procedures in Brazil and worldwide. Objective: To determine the prevalence of microscopic changes present in blood slides that are concordant or discordant with results obtained using fully automated procedures. Materials and method: From January to July 2013, 1,000 blood slides with hematological parameters were analyzed. Automated analysis was performed on last-generation equipment, whose methodology is based on electrical impedance and which is able to quantify all the formed elements of the blood across 22 parameters. Microscopy was performed by two experts in microscopy simultaneously. Results: The data showed that only 42.70% of results were concordant, compared with 57.30% discordant. The main findings among the discordant results were: changes in red blood cells, 43.70% (n = 250); white blood cells, 38.46% (n = 220); and platelet counts, 17.80% (n = 102). Discussion: The data show that some results are not consistent with the clinical or physiological state of an individual and cannot be explained because they have not been investigated, which may compromise the final diagnosis. Conclusion: It was observed that it is of fundamental importance that qualitative microscopic analysis be performed in parallel with automated analysis in order to obtain reliable results, with a positive impact on prevention, diagnosis, prognosis, and therapeutic follow-up.

  6. Automated Motivic Analysis

    DEFF Research Database (Denmark)

    Lartillot, Olivier

    2016-01-01

    Motivic analysis provides very detailed understanding of musical compositions, but is also particularly difficult to formalize and systematize. A computational automation of the discovery of motivic patterns cannot be reduced to a mere extraction of all possible sequences of descriptions...... for lossless compression. The structural complexity resulting from successive repetitions of patterns can be controlled through a simple modelling of cycles. Generally, motivic patterns cannot always be defined solely as sequences of descriptions in a fixed set of dimensions: throughout the descriptions...... of the successive notes and intervals, various sets of musical parameters may be invoked. In this chapter, a method is presented that allows for these heterogeneous patterns to be discovered. Motivic repetition with local ornamentation is detected by reconstructing, on top of “surface-level” monodic voices, longer...

  7. Wind-Farm Parametrisations in Mesoscale Models

    DEFF Research Database (Denmark)

    Volker, Patrick; Badger, Jake; Hahmann, Andrea N.

    2013-01-01

    In this paper we compare three wind-farm parametrisations for mesoscale models against measurement data from the Horns Rev I offshore wind farm. The parametrisations vary from a simple rotor drag method to more sophisticated models. In addition to (4), we investigated the horizontal resolution dep...

  8. Potential of the semi-automated in vitro gas production technique for the evaluation of sorghum silages (Sorghum bicolor (L.) Moench)

    Directory of Open Access Journals (Sweden)

    Maurício Rogério Martins

    2003-01-01

    The potential of the semi-automated in vitro gas production technique was studied by evaluating silages from four sorghum hybrids (BR700, BR701, BR601 and AG2002). The results of this experiment were compared with those obtained in an apparent digestibility trial. The relationship between dry matter digestibility obtained by the gas production technique after 96 hours of fermentation (DMD) and apparent DM digestibility was represented by the equation: in vivo digestibility (g/kg) = 0.46 x DMD (g/kg) + 361.08 (r² = 0.97). The semi-automated in vitro gas production technique accurately estimated the apparent DM digestibility values of the silages evaluated in this experiment. In addition, it provided further information on the ruminal fermentation kinetics of the silages and on the effective dry matter degradability at different passage rates. The higher gas production rate (%/h) of hybrid BR601 (0.056) relative to BR700 (0.051), BR701 (0.044) and AG2002 (0.045) is correlated with the higher DMD of the material (649, 598, 601 and 593 g/kg, respectively). Thus, the semi-automated in vitro gas production technique was able to identify hybrid BR601, in terms of digestibility and ruminal fermentation kinetics, as the most promising for use in ruminant feeding, demonstrating its potential for the evaluation of sorghum silages.

  9. Cross-Domain Semi-Supervised Learning Using Feature Formulation.

    Science.gov (United States)

    Xingquan Zhu

    2011-12-01

    Semi-Supervised Learning (SSL) traditionally makes use of unlabeled samples by including them in the training set through an automated labeling process. Such a primitive Semi-Supervised Learning (pSSL) approach suffers from a number of disadvantages, including false labeling and an inability to utilize out-of-domain samples. In this paper, we propose a formative Semi-Supervised Learning (fSSL) framework which explores hidden features between labeled and unlabeled samples to achieve semi-supervised learning. fSSL assumes that both labeled and unlabeled samples are generated from some hidden concepts, with labeling information partially observable for some samples. The key to fSSL is to recover the hidden concepts and take them as new features to link labeled and unlabeled samples for semi-supervised learning. Because unlabeled samples are only used to generate new features, and are not explicitly included in the training set as in pSSL, fSSL overcomes the inherent disadvantages of the traditional pSSL methods, especially for samples not within the same domain as the labeled instances. Experimental results and comparisons demonstrate that fSSL significantly outperforms pSSL-based methods for both within-domain and cross-domain semi-supervised learning.
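
    The record does not describe how the hidden concepts are recovered, so the sketch below is an illustration of the general idea only, not the fSSL algorithm: a latent factorisation of the combined labeled and unlabeled data supplies "concept" features, and only the re-expressed labeled samples enter the classifier. All data and model choices are assumptions.

```python
# Illustration only (not the paper's method): derive shared latent-concept features
# from labeled + unlabeled data via NMF, then train a classifier on labeled data.
import numpy as np
from sklearn.decomposition import NMF
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_labeled = rng.random((50, 300))        # labeled samples (placeholder data)
y_labeled = rng.integers(0, 2, 50)
X_unlabeled = rng.random((200, 300))     # unlabeled, possibly out-of-domain samples

# Learn latent "concepts" from labeled and unlabeled data together
nmf = NMF(n_components=10, init="nndsvda", max_iter=500)
nmf.fit(np.vstack([X_labeled, X_unlabeled]))

# Only the labeled samples, re-expressed in concept space, enter the classifier
Z_labeled = nmf.transform(X_labeled)
clf = LogisticRegression(max_iter=1000).fit(Z_labeled, y_labeled)
```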

  10. Automated longitudinal intra-subject analysis (ALISA) for diffusion MRI tractography

    DEFF Research Database (Denmark)

    Aarnink, Saskia H; Vos, Sjoerd B; Leemans, Alexander

    2014-01-01

    the inter-subject and intra-subject automation in this situation are intended for subjects without gross pathology. In this work, we propose such an automated longitudinal intra-subject analysis (dubbed ALISA) approach, and assessed whether ALISA could preserve the same level of reliability as obtained....... The major disadvantage of manual FT segmentations, unfortunately, is that placing regions-of-interest for tract selection can be very labor-intensive and time-consuming. Although there are several methods that can identify specific WM fiber bundles in an automated way, manual FT segmentations across...... multiple subjects performed by a trained rater with neuroanatomical expertise are generally assumed to be more accurate. However, for longitudinal DTI analyses it may still be beneficial to automate the FT segmentation across multiple time points, but then for each individual subject separately. Both...

  11. Mesoscale Climate Evaluation Using Grid Computing

    Science.gov (United States)

    Campos Velho, H. F.; Freitas, S. R.; Souto, R. P.; Charao, A. S.; Ferraz, S.; Roberti, D. R.; Streck, N.; Navaux, P. O.; Maillard, N.; Collischonn, W.; Diniz, G.; Radin, B.

    2012-04-01

    The CLIMARS project is focused on establishing an operational environment for seasonal climate prediction for the Rio Grande do Sul state, Brazil. The dynamical downscaling will be performed with the use of several software platforms and hardware infrastructure to carry out the investigation, on the mesoscale, of the global change impact. Grid computing takes advantage of geographically spread out computer systems, connected by the internet, to enhance the power of computation. Ensemble climate prediction is an appropriate application for processing on grid computing, because the integration of each ensemble member does not depend on information from other ensemble members. The grid processing is employed to compute the 20-year climatology and the long range simulations under ensemble methodology. BRAMS (Brazilian Regional Atmospheric Model) is a mesoscale model developed from a version of RAMS (from the Colorado State University - CSU, USA). The BRAMS model is the tool for carrying out the dynamical downscaling from the IPCC scenarios. Long range BRAMS simulations will provide data for climate (data) analysis, and supply data for numerical integration of different models: (a) regime of the extreme events for temperature and precipitation fields: statistical analysis will be applied to the BRAMS data; (b) CCATT-BRAMS (Coupled Chemistry Aerosol Tracer Transport - BRAMS) is an environmental prediction system that will be used to evaluate whether the new patterns of temperature, rain regime, and wind field have a significant impact on pollutant dispersion in the analyzed regions; (c) MGB-IPH (Portuguese acronym for the Large Basin Model (MGB), developed by the Hydraulic Research Institute (IPH) of the Federal University of Rio Grande do Sul (UFRGS), Brazil) will be employed to simulate the alteration of the river flux under new climate patterns. Important meteorological input variables for the MGB-IPH are the precipitation (most relevant

  12. Multi-sensor in situ observations to resolve the sub-mesoscale features in the stratified Gulf of Finland, Baltic Sea

    Science.gov (United States)

    Lips, Urmas; Kikas, Villu; Liblik, Taavi; Lips, Inga

    2016-05-01

    High-resolution numerical modeling, remote sensing, and in situ data have revealed the significant role of sub-mesoscale features in shaping the distribution pattern of tracers in the ocean's upper layer. However, in situ measurements are difficult to conduct with the required resolution and coverage in time and space to resolve the sub-mesoscale, especially in such relatively shallow basins as the Gulf of Finland, where the typical baroclinic Rossby radius is 2-5 km. To map the multi-scale spatiotemporal variability in the gulf, we initiated continuous measurements with autonomous devices, including a moored profiler and a Ferrybox system, which were complemented by dedicated research-vessel-based surveys. The analysis of the high-resolution data collected in the summers of 2009-2012 revealed pronounced variability at the sub-mesoscale in the presence of mesoscale upwelling/downwelling, fronts, and eddies. The horizontal wavenumber spectra of temperature variance in the surface layer had slopes close to -2 between the lateral scales from 10 to 0.5 km. A similar tendency towards -2 slopes of the horizontal wavenumber spectra of temperature variance was found in the seasonal thermocline between the lateral scales from 10 to 1 km. This suggests that ageostrophic sub-mesoscale processes could contribute considerably to the energy cascade in such a stratified sea basin. We showed that intrusions of water with different salinity, which indicate the occurrence of a layered flow structure, could appear in the process of upwelling/downwelling development and relaxation in response to variable wind forcing. We suggest that the sub-mesoscale processes play a major role in feeding surface blooms in the conditions of coupled coastal upwelling and downwelling events in the Gulf of Finland.
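
    The spectral slope quoted above can be estimated from an along-track temperature transect with a log-log fit over the chosen scale band; a minimal sketch follows. The sampling interval, series length and band limits are placeholders (white noise is used here, so the fitted slope will not be -2 as in the real data).

```python
# Sketch: slope of the horizontal wavenumber spectrum of a temperature transect,
# estimated by a log-log fit over a chosen band of lateral scales.
import numpy as np

dx_km = 0.25                                          # along-track sampling (km)
temp = np.random.default_rng(2).normal(size=4096)     # placeholder temperature series

spec = np.abs(np.fft.rfft(temp - temp.mean())) ** 2   # power spectrum
k = np.fft.rfftfreq(temp.size, d=dx_km)               # wavenumber (cycles per km)

band = (k > 1 / 10.0) & (k < 1 / 0.5)                 # lateral scales 10 km to 0.5 km
slope = np.polyfit(np.log(k[band]), np.log(spec[band]), 1)[0]
print(f"Fitted spectral slope over 10-0.5 km: {slope:.2f}")
```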

  13. Automated quantification of optic nerve axons in primate glaucomatous and normal eyes--method and comparison to semi-automated manual quantification.

    Science.gov (United States)

    Reynaud, Juan; Cull, Grant; Wang, Lin; Fortune, Brad; Gardiner, Stuart; Burgoyne, Claude F; Cioffi, George A

    2012-05-01

    To describe an algorithm and software application (APP) for 100% optic nerve axon counting and to compare its performance with a semi-automated manual (SAM) method in optic nerve cross-section images (images) from normal and experimental glaucoma (EG) nonhuman primate (NHP) eyes. ON cross sections from eight EG eyes from eight NHPs, five EG and five normal eyes from five NHPs, and 12 normal eyes from 12 NHPs were imaged at 100×. Calibration (n = 500) and validation (n = 50) image sets ranging from normal to end-stage damage were assembled. Correlation between APP and SAM axon counts was assessed by Deming regression within the calibration set and a compensation formula was generated to account for the subtle, systematic differences. Then, compensated APP counts for each validation image were compared with the mean and 95% confidence interval of five SAM counts of the validation set performed by a single observer. Calibration set APP counts linearly correlated to SAM counts (APP = 10.77 + 1.03 [SAM]; R² = 0.94, P < 0.0001) in normal to end-stage damage images. In the validation set, compensated APP counts fell within the 95% confidence interval of the SAM counts in 42 of the 50 images and were within 12 axons of the confidence intervals in six of the eight remaining images. Uncompensated axon density maps for the normal and EG eyes of a representative NHP were generated. An APP for 100% ON axon counts has been calibrated and validated relative to SAM counts in normal and EG NHP eyes.

  14. Automated Quantification of Optic Nerve Axons in Primate Glaucomatous and Normal Eyes—Method and Comparison to Semi-Automated Manual Quantification

    Science.gov (United States)

    Reynaud, Juan; Cull, Grant; Wang, Lin; Fortune, Brad; Gardiner, Stuart; Burgoyne, Claude F; Cioffi, George A

    2012-01-01

    Purpose. To describe an algorithm and software application (APP) for 100% optic nerve axon counting and to compare its performance with a semi-automated manual (SAM) method in optic nerve cross-section images (images) from normal and experimental glaucoma (EG) nonhuman primate (NHP) eyes. Methods. ON cross sections from eight EG eyes from eight NHPs, five EG and five normal eyes from five NHPs, and 12 normal eyes from 12 NHPs were imaged at 100×. Calibration (n = 500) and validation (n = 50) image sets ranging from normal to end-stage damage were assembled. Correlation between APP and SAM axon counts was assessed by Deming regression within the calibration set and a compensation formula was generated to account for the subtle, systematic differences. Then, compensated APP counts for each validation image were compared with the mean and 95% confidence interval of five SAM counts of the validation set performed by a single observer. Results. Calibration set APP counts linearly correlated to SAM counts (APP = 10.77 + 1.03 [SAM]; R2 = 0.94, P < 0.0001) in normal to end-stage damage images. In the validation set, compensated APP counts fell within the 95% confidence interval of the SAM counts in 42 of the 50 images and were within 12 axons of the confidence intervals in six of the eight remaining images. Uncompensated axon density maps for the normal and EG eyes of a representative NHP were generated. Conclusions. An APP for 100% ON axon counts has been calibrated and validated relative to SAM counts in normal and EG NHP eyes. PMID:22467571
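
    A Deming-type calibration of automated against manual counts can be sketched with orthogonal distance regression, which allows errors in both variables; the counts below are placeholders, not the study data, and the compensation step simply inverts the fitted line.

```python
# Sketch: Deming-style fit of automated (APP) vs. semi-automated manual (SAM) axon
# counts via orthogonal distance regression, then compensation back to the SAM scale.
import numpy as np
from scipy import odr

sam = np.array([1200.0, 5300.0, 9800.0, 15000.0, 22000.0])   # manual counts (placeholder)
app = 10.77 + 1.03 * sam + np.random.default_rng(3).normal(0, 150, sam.size)

fit = odr.ODR(odr.RealData(sam, app),
              odr.Model(lambda beta, x: beta[0] + beta[1] * x),
              beta0=[0.0, 1.0]).run()
intercept, slope = fit.beta
compensated_app = (app - intercept) / slope      # compensated automated counts
print(f"APP = {intercept:.2f} + {slope:.2f} x SAM")
```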

  15. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures

  16. Evaluation of an automated analysis for pain-related evoked potentials

    Directory of Open Access Journals (Sweden)

    Wulf Michael

    2017-09-01

    This paper presents initial steps towards an automated analysis for pain-related evoked potentials (PREP) to achieve higher objectivity and a non-biased examination, as well as a reduction in the time expended during daily clinical routines. In the manual examination, each epoch of an ensemble of stimulus-locked EEG signals, elicited by electrical stimulation of predominantly intra-epidermal small nerve fibers and recorded over the central electrode (Cz), is inspected for artifacts before the PREP is calculated by averaging the artifact-free epochs. Afterwards, specific peak latencies (such as the P0-, N1- and P1-latencies) are identified as certain extrema in the PREP's waveform. The proposed automated analysis uses Pearson's correlation and low-pass differentiation to perform these tasks. To evaluate the accuracy of the automated analysis, its results for 232 datasets were compared to the results of the manually performed examination. Results of the automated artifact rejection were comparable to the manual examination. Detection of peak latencies was more heterogeneous, indicating some sensitivity of the detected events to the criteria used during data examination.
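
    A minimal sketch of a correlation-based artifact screen of the kind described: epochs that correlate poorly with the ensemble average are rejected before averaging. The threshold is an assumed illustrative value, not the authors' criterion, and the peak-latency detection by low-pass differentiation is not reproduced here.

```python
# Hedged sketch: reject epochs with low Pearson correlation to the ensemble mean,
# then average the remaining (artifact-free) epochs to obtain the PREP estimate.
import numpy as np

def average_clean_epochs(epochs: np.ndarray, min_r: float = 0.2) -> np.ndarray:
    """epochs: (n_epochs, n_samples) stimulus-locked EEG recorded at Cz."""
    template = epochs.mean(axis=0)
    keep = np.array([np.corrcoef(e, template)[0, 1] >= min_r for e in epochs])
    return epochs[keep].mean(axis=0)    # PREP from the retained epochs
```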

  17. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  18. Development and evaluation of new semi-automatic TLD reader software

    International Nuclear Information System (INIS)

    Pathan, M.S.; Pradhan, S.M.; Palani Selvam, T.; Datta, D.

    2018-01-01

    Nowadays, technological advancement is primarily focused on creating a user-friendly environment for operating machines while minimizing human error through the automation of procedures. In the present study, the development and evaluation of new software for the semi-automatic TLD badge reader (TLDBR-7B) is presented. The software provides an interactive interface and is compatible with the latest Windows OS as well as the USB mode of data communication. Important new features of the software are automatic glow curve analysis for identifying any abnormality, an event log register, user-defined limits on TL count and time of temperature stabilization for readout interruption, and automatic reading resumption options.

  19. Mesoscale simulation of concrete spall failure

    Science.gov (United States)

    Knell, S.; Sauer, M.; Millon, O.; Riedel, W.

    2012-05-01

    Although intensively studied, it is still being debated which physical mechanisms are responsible for the increase of dynamic strength and fracture energy of concrete observed at high loading rates, and to what extent structural inertia forces on different scales contribute to the observation. We present a new approach for the three-dimensional mesoscale modelling of dynamic damage and cracking in concrete. Concrete is approximated as a composite of spherical elastic aggregates of mm to cm size embedded in an elastic cement stone matrix. Cracking within the matrix and at aggregate interfaces in the μm range is modelled with adaptively inserted, initially rigid, cohesive interface elements. The model is applied to analyse the dynamic tensile failure observed in Hopkinson-bar spallation experiments with strain rates up to 100/s. The influence of the key mesoscale failure parameters of strength, fracture energy and relative weakening of the ITZ (interfacial transition zone) on macromechanical strength, momentum and energy conservation is numerically investigated.

  20. Automated packing systems: review of industrial implementations

    Science.gov (United States)

    Whelan, Paul F.; Batchelor, Bruce G.

    1993-08-01

    A rich theoretical background to the problems that occur in the automation of material handling can be found in the operations research, production engineering, systems engineering and automation (more specifically, machine vision) literature. This work has contributed towards the design of intelligent handling systems. This paper will review the application of these automated material handling and packing techniques to industrial problems. The discussion will also highlight the systems integration issues involved in these applications. An outline of one such industrial application, the automated placement of shape templates onto leather hides, is also discussed. The purpose of this system is to arrange shape templates on a leather hide in an efficient manner, so as to minimize the leather waste, before they are automatically cut from the hide. These pieces are used in the furniture and car manufacturing industries for the upholstery of high quality leather chairs and car seats. Currently this type of operation is semi-automated. The paper will outline the problems involved in the full automation of such a procedure.

  1. Impacts of Mesoscale Eddies on the Vertical Nitrate Flux in the Gulf Stream Region

    Science.gov (United States)

    Zhang, Shuwen; Curchitser, Enrique N.; Kang, Dujuan; Stock, Charles A.; Dussin, Raphael

    2018-01-01

    The Gulf Stream (GS) region has intense mesoscale variability that can affect the supply of nutrients to the euphotic zone (Zeu). In this study, a recently developed high-resolution coupled physical-biological model is used to conduct a 25-year simulation in the Northwest Atlantic. The Reynolds decomposition method is applied to quantify the nitrate budget and shows that the mesoscale variability is important to the vertical nitrate supply over the GS region. The decomposition, however, cannot isolate eddy effects from those arising from other mesoscale phenomena. This limitation is addressed by analyzing a large sample of eddies detected and tracked from the 25-year simulation. The eddy composite structures indicate that positive nitrate anomalies within Zeu exist in both cyclonic eddies (CEs) and anticyclonic eddies (ACEs) over the GS region, and are even more pronounced in the ACEs. Our analysis further indicates that positive nitrate anomalies mostly originate from enhanced vertical advective flux rather than vertical turbulent diffusion. The eddy-wind interaction-induced Ekman pumping is very likely the mechanism driving the enhanced vertical motions and vertical nitrate transport within ACEs. This study suggests that the ACEs in GS region may play an important role in modulating the oceanic biogeochemical properties by fueling local biomass production through the persistent supply of nitrate.
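
    In the standard form of such a Reynolds decomposition (the notation here is generic and not necessarily the authors'), the vertical advective nitrate flux separates into a mean contribution and an eddy contribution:

```latex
% Generic Reynolds decomposition of the vertical advective nitrate flux:
% overbars denote a time/ensemble mean, primes the mesoscale fluctuations.
w = \overline{w} + w', \qquad N = \overline{N} + N',
\qquad
\overline{wN} \;=\; \overline{w}\,\overline{N} \;+\; \overline{w'N'}
```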

  2. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for both automated mixing, exposure control on a beamline and automated data reduction...... and preliminary analysis is presented. Three mixing systems that have been the corner stones of the development process are presented including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  3. Do mesoscale faults in a young fold belt indicate regional or local stress?

    Science.gov (United States)

    Kokado, Akihiro; Yamaji, Atsushi; Sato, Katsushi

    2017-04-01

    The result of paleostress analyses of mesoscale faults is usually thought of as evidence of a regional stress. On the other hand, the recent advancement of trishear modeling has enabled us to predict the deformation field around fault-propagation folds without the difficulty of assuming the paleo mechanical properties of rocks and sediments. We combined the analysis of observed mesoscale faults and trishear modeling to understand the significance of regional and local stresses for the formation of mesoscale faults. To this end, we conducted 2D trishear inverse modeling with a curved thrust fault to predict the subsurface structure and strain field of an anticline, which has a more or less horizontal axis and shows a map-scale plane strain perpendicular to the axis, in the active fold belt of the Niigata region, central Japan. The anticline is thought to have been formed by fault-propagation folding under WNW-ESE regional compression. Based on the attitudes of strata and the positions of key tephra beds in Lower Pleistocene soft sediments cropping out at the surface, we obtained (1) a fault-propagation fold with the fault tip at a depth of ca. 4 km as the optimal subsurface structure, and (2) the temporal variation of the deformation field during the folding. We assumed that mesoscale faults were activated along the direction of maximum shear strain on the faults to test whether the fault-slip data collected at the surface were consistent with the deformation in some stage(s) of folding. The Wallace-Bott hypothesis was used to estimate the consistency of faults with the regional stress. As a result, the folding and the regional stress explained 27 and 33 of the 45 observed faults, respectively, with 11 faults being consistent with both. Both the folding and the regional stress were inconsistent with the remaining 17 faults, which could be explained by transfer faulting and/or the gravitational spreading of the growing anticline. The lesson we learnt from this work was

  4. A chemical profiling strategy for semi-quantitative analysis of flavonoids in Ginkgo extracts.

    Science.gov (United States)

    Yang, Jing; Wang, An-Qi; Li, Xue-Jing; Fan, Xue; Yin, Shan-Shan; Lan, Ke

    2016-05-10

    Flavonoids analysis in herbal products is challenged by their vast chemical diversity. This work aimed to develop a chemical profiling strategy for the semi-quantification of flavonoids using extracts of Ginkgo biloba L. (EGB) as an example. The strategy was based on the principle that flavonoids in EGB have an almost equivalent molecular absorption coefficient at a fixed wavelength. As a result, the molecular-contents of flavonoids were able to be semi-quantitatively determined by the molecular-concentration calibration curves of common standards and recalculated as the mass-contents with the characterized molecular weight (MW). Twenty batches of EGB were subjected to HPLC-UV/DAD/MS fingerprinting analysis to test the feasibility and reliability of this strategy. The flavonoid peaks were distinguished from the other peaks with principle component analysis and Pearson correlation analysis of the normalized UV spectrometric dataset. Each flavonoid peak was subsequently tentatively identified by the MS data to ascertain their MW. It was highlighted that the flavonoids absorption at Band-II (240-280 nm) was more suitable for the semi-quantification purpose because of the less variation compared to that at Band-I (300-380 nm). The semi-quantification was therefore conducted at 254 nm. Beyond the qualitative comparison results acquired by common chemical profiling techniques, the semi-quantitative approach presented the detailed compositional information of flavonoids in EGB and demonstrated how the adulteration of one batch was achieved. The developed strategy was believed to be useful for the advanced analysis of herbal extracts with a high flavonoid content without laborious identification and isolation of individual components. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. WRF Mesoscale Pre-Run for the Wind Atlas of Mexico

    OpenAIRE

    Hahmann, Andrea N.; Pena Diaz, Alfredo; Hansen, Jens Carsten

    2016-01-01

    This report documents the work performed by DTU Wind Energy for the project “Atlas Eólico Mexicano”, the Wind Atlas of Mexico. This document reports on the methods used in the “Pre-run” of the wind-mapping project for Mexico. The interim mesoscale modeling results were calculated from the output of simulations using the Weather Research and Forecasting (WRF) model. We document the method used to run the mesoscale simulations and to generalize the WRF model wind climatologies. A separate section...

  6. A critical appraisal of semi-quantitative analysis of 2-deoxyglucose autoradiograms

    International Nuclear Information System (INIS)

    Kelly, P.T.; McCulloch, J.

    1983-01-01

    Semi-quantitative analysis (e.g. optical density ratios) of [14C]2-deoxyglucose autoradiograms is widely used in neuroscience research. The authors demonstrate that a fixed ratio of 14C concentrations in the CNS does not yield a constant optical density ratio but is dependent upon the exposure time in the preparation of the autoradiograms and the absolute amounts of 14C from which the concentration ratio is derived. The failure of a fixed glucose utilization ratio to result in a constant optical density ratio represents a major interpretative difficulty in investigations where only semi-quantitative analysis of [14C]2-deoxyglucose autoradiograms is undertaken. (Auth.)

  7. A5: Automated Analysis of Adversarial Android Applications

    Science.gov (United States)

    2014-06-03

    A5: Automated Analysis of Adversarial Android Applications Timothy Vidas, Jiaqi Tan, Jay Nahata, Chaur Lih Tan, Nicolas Christin...detecting, on the device itself, that an application is malicious is much more complex without elevated privileges. In other words, given the...interface via website. Blasing et al. [7] describe another dynamic analysis system for Android. Their system focuses on classifying input applications as

  8. Planning representation for automated exploratory data analysis

    Science.gov (United States)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  9. Automated acquisition and analysis of small angle X-ray scattering data

    International Nuclear Information System (INIS)

    Franke, Daniel; Kikhney, Alexey G.; Svergun, Dmitri I.

    2012-01-01

    Small Angle X-ray Scattering (SAXS) is a powerful tool in the study of biological macromolecules providing information about the shape, conformation, assembly and folding states in solution. Recent advances in robotic fluid handling make it possible to perform automated high throughput experiments including fast screening of solution conditions, measurement of structural responses to ligand binding, changes in temperature or chemical modifications. Here, an approach to full automation of SAXS data acquisition and data analysis is presented, which advances automated experiments to the level of a routine tool suitable for large scale structural studies. The approach links automated sample loading, primary data reduction and further processing, facilitating queuing of multiple samples for subsequent measurement and analysis and providing means of remote experiment control. The system was implemented and comprehensively tested in user operation at the BioSAXS beamlines X33 and P12 of EMBL at the DORIS and PETRA storage rings of DESY, Hamburg, respectively, but is also easily applicable to other SAXS stations due to its modular design.

  10. Automated ultrasonic inspection using PULSDAT

    International Nuclear Information System (INIS)

    Naybour, P.J.

    1992-01-01

    PULSDAT (Portable Ultrasonic Data Acquisition Tool) is a system for recording the data from single probe automated ultrasonic inspections. It is one of a range of instruments and software developed by Nuclear Electric to carry out a wide variety of high quality ultrasonic inspections. These vary from simple semi-automated inspections through to multi-probe, highly automated ones. PULSDAT runs under the control of MIPS software, and collects data which is compatible with the GUIDE data display system. PULSDAT is therefore fully compatible with Nuclear Electric's multi-probe inspection systems and utilises all the reliability and quality assurance of the software. It is a rugged, portable system that can be used in areas of difficult access. The paper discusses the benefits of automated inspection and gives an outline of the main features of PULSDAT. Since April 1990 PULSDAT has been used in several applications within Nuclear Electric and this paper presents two examples: the first is a ferritic set-through nozzle and the second is an austenitic fillet weld. (Author)

  11. Automating risk analysis of software design models.

    Science.gov (United States)

    Frydman, Maxime; Ruiz, Guifré; Heymann, Elisa; César, Eduardo; Miller, Barton P

    2014-01-01

    The growth of the internet and networked systems has exposed software to an increased amount of security threats. One of the responses from software developers to these threats is the introduction of security activities in the software development lifecycle. This paper describes an approach to reduce the need for costly human expertise to perform risk analysis in software, which is common in secure development methodologies, by automating threat modeling. Reducing the dependency on security experts aims at reducing the cost of secure development by allowing non-security-aware developers to apply secure development with little to no additional cost, making secure development more accessible. To automate threat modeling two data structures are introduced, identification trees and mitigation trees, to identify threats in software designs and advise mitigation techniques, while taking into account specification requirements and cost concerns. These are the components of our model for automated threat modeling, AutSEC. We validated AutSEC by implementing it in a tool based on data flow diagrams, from the Microsoft security development methodology, and applying it to VOMS, a grid middleware component, to evaluate our model's performance.
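
    The abstract above names identification trees and mitigation trees as AutSEC's core data structures but does not spell out their representation. Purely as an illustration, and not as AutSEC's actual implementation, the sketch below walks a tiny hypothetical identification tree over one data-flow-diagram element and looks up mitigations for the threats it matches; every property, threat and mitigation named here is invented.

      # Hypothetical sketch of an identification-tree walk over a DFD element;
      # not AutSEC's implementation. All names below are invented.
      from dataclasses import dataclass, field
      from typing import Callable, List, Optional

      @dataclass
      class IdNode:
          """One test in an identification tree; children are visited only on a match."""
          test: Callable[[dict], bool]
          threat: Optional[str] = None
          children: List["IdNode"] = field(default_factory=list)

      def identify(node: IdNode, element: dict, found: List[str]) -> None:
          if node.test(element):
              if node.threat:
                  found.append(node.threat)
              for child in node.children:
                  identify(child, element, found)

      tree = IdNode(
          test=lambda e: e["kind"] == "data_flow",
          children=[
              IdNode(test=lambda e: not e.get("encrypted", False),
                     threat="information disclosure on the wire"),
              IdNode(test=lambda e: e.get("crosses_trust_boundary", False),
                     threat="tampering across a trust boundary"),
          ],
      )
      mitigations = {  # a mitigation "tree" flattened to a lookup table for brevity
          "information disclosure on the wire": "encrypt the channel (e.g. TLS)",
          "tampering across a trust boundary": "authenticate and integrity-protect messages",
      }

      element = {"kind": "data_flow", "encrypted": False, "crosses_trust_boundary": True}
      threats: List[str] = []
      identify(tree, element, threats)
      for t in threats:
          print(t, "->", mitigations[t])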

  12. Automated voxel-based analysis of brain perfusion SPECT for vasospasm after subarachnoid haemorrhage

    International Nuclear Information System (INIS)

    Iwabuchi, S.; Yokouchi, T.; Hayashi, M.; Kimura, H.; Tomiyama, A.; Hirata, Y.; Saito, N.; Harashina, J.; Nakayama, H.; Sato, K.; Aoki, K.; Samejima, H.; Ueda, M.; Terada, H.; Hamazaki, K.

    2008-01-01

    We evaluated regional cerebral blood flow (rCBF) during vasospasm after subarachnoid haemorrhage (SAH) using automated voxel-based analysis of brain perfusion single-photon emission computed tomography (SPECT). Brain perfusion SPECT was performed 7 to 10 days after onset of SAH. Automated voxel-based analysis of SPECT used a Z-score map that was calculated by comparing the patient's data with a control database. In cases where computed tomography (CT) scans detected an ischemic region due to vasospasm, automated voxel-based analysis of brain perfusion SPECT revealed dramatically reduced rCBF (Z-score ≤ -4). No patients with mildly or moderately diminished rCBF (Z-score > -3) progressed to cerebral infarction. Some patients with a Z-score < -4 did not progress to cerebral infarction after active treatment with angioplasty. Three-dimensional images provided detailed anatomical information and helped us to distinguish surgical sequelae from vasospasm. In conclusion, automated voxel-based analysis of brain perfusion SPECT using a Z-score map is helpful in evaluating decreased rCBF due to vasospasm. (author)
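
    As a numerical aside, the Z-score map described above compares each patient voxel with the mean and standard deviation of a normal control database. The toy sketch below (synthetic arrays only, not SPECT data) shows the calculation and the Z ≤ -4 flagging used as the severe-reduction criterion.

      # Toy voxel-wise Z-score map: Z = (patient - control mean) / control SD,
      # flagging voxels at the Z <= -4 level mentioned in the abstract above.
      import numpy as np

      rng = np.random.default_rng(0)
      controls = rng.normal(50.0, 5.0, size=(20, 64, 64, 64))  # 20 synthetic control volumes
      patient = rng.normal(50.0, 5.0, size=(64, 64, 64))
      patient[20:30, 20:30, 20:30] = 25.0                       # simulated hypoperfused region

      z_map = (patient - controls.mean(axis=0)) / controls.std(axis=0, ddof=1)
      print("voxels with Z <= -4:", int((z_map <= -4.0).sum()))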

  13. Upscaling and automation of electrophysiology: toward high throughput screening in ion channel drug discovery

    DEFF Research Database (Denmark)

    Asmild, Margit; Oswald, Nicholas; Krzywkowski, Karen M

    2003-01-01

    by developing two lines of automated patch clamp products, a traditional pipette-based system called Apatchi-1, and a silicon chip-based system QPatch. The degree of automation spans from semi-automation (Apatchi-1) where a trained technician interacts with the system in a limited way, to a complete automation...... (QPatch 96) where the system works continuously and unattended until screening of a full compound library is completed. The performance of the systems range from medium to high throughputs....

  14. Investigating the Potential Impact of the Surface Water and Ocean Topography (SWOT) Altimeter on Ocean Mesoscale Prediction

    Science.gov (United States)

    Carrier, M.; Ngodock, H.; Smith, S. R.; Souopgui, I.

    2016-02-01

    NASA's Surface Water and Ocean Topography (SWOT) satellite, scheduled for launch in 2020, will provide sea surface height anomaly (SSHA) observations with a wider swath width and higher spatial resolution than current satellite altimeters. It is expected that this will help to further constrain ocean models in terms of the mesoscale circulation. In this work, this expectation is investigated by way of twin data assimilation experiments using the Navy Coastal Ocean Model Four Dimensional Variational (NCOM-4DVAR) data assimilation system using a weak constraint formulation. Here, a nature run is created from which SWOT observations are sampled, as well as along-track SSHA observations from simulated Jason-2 tracks. The simulated SWOT data has appropriate spatial coverage, resolution, and noise characteristics based on an observation-simulator program provided by the SWOT science team. The experiment is run for a three-month period during which the analysis is updated every 24 hours and each analysis is used to initialize a 96 hour forecast. The forecasts in each experiment are compared to the available nature run to determine the impact of the assimilated data. It is demonstrated here that the SWOT observations help to constrain the model mesoscale in a more consistent manner than traditional altimeter observations. The findings of this study suggest that data from SWOT may have a substantial impact on improving the ocean model analysis and forecast of mesoscale features and surface ocean transport.
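
    The twin-experiment design above scores each forecast against the nature run. A minimal sketch of that bookkeeping, with toy arrays standing in for SSHA fields and two hypothetical experiments, might look like the following (this is not the NCOM-4DVAR code).

      # Toy summary of twin-experiment skill: RMSE of forecast SSHA versus the
      # nature run at each forecast lead time, for two hypothetical experiments.
      import numpy as np

      rng = np.random.default_rng(1)
      leads = [24, 48, 72, 96]                                       # lead times (hours)
      nature = rng.normal(0.0, 0.10, size=(len(leads), 100, 100))    # SSHA "truth" (m)

      def rmse(forecast, truth):
          return float(np.sqrt(np.mean((forecast - truth) ** 2)))

      for name, err in [("SWOT assimilated", 0.02), ("along-track only", 0.04)]:
          forecast = nature + rng.normal(0.0, err, size=nature.shape)
          print(name, [f"{h} h: {rmse(forecast[i], nature[i]):.3f} m"
                       for i, h in enumerate(leads)])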

  15. An automated method for the layup of fiberglass fabric

    Science.gov (United States)

    Zhu, Siqi

    This dissertation presents an automated composite fabric layup solution based on a new method to deform fiberglass fabric referred to as shifting. A layup system was designed and implemented using a large robotic gantry and a custom end-effector for shifting. Layup tests proved that the system can deposit fabric onto two-dimensional and three-dimensional tooling surfaces accurately and repeatedly while avoiding out-of-plane deformation. A process planning method was developed to generate tool paths for the layup system based on a geometric model of the tooling surface. The approach is analogous to Computer Numerical Control (CNC) machining, where Numerical Control (NC) code from a Computer-Aided Design (CAD) model is generated to drive the milling machine. Layup experiments utilizing the proposed method were conducted to validate the performance. The results show that the process planning software requires minimal time or human intervention and can generate tool paths leading to accurate composite fabric layups. Fiberglass fabric samples processed with shifting deformation were observed for meso-scale deformation. Tow thinning, bending and spacing were observed and measured. Overall, shifting did not create flaws in amounts that would disqualify the method from use in industry. This suggests that shifting is a viable method for use in automated manufacturing. The work of this dissertation provides a new method for the automated layup of broad width composite fabric that is not possible with any available composite automation systems to date.

  16. Intercomparison of state-of-the-art models for wind energy resources with mesoscale models:

    Science.gov (United States)

    Olsen, Bjarke Tobias; Hahmann, Andrea N.; Sempreviva, Anna Maria; Badger, Jake; Joergensen, Hans E.

    2016-04-01

    vertical resolution, model parameterizations, surface roughness length) that could be used to group the various models and interpret the results of the intercomparison. 3. Main body abstract Twenty separate entries were received by the deadline of 31 March 2015. They included simulations done with various versions of the Weather Research and Forecast (WRF) model, but also with six other well-known mesoscale models. The various entries represent an excellent sample of the various models used by the wind energy industry today. The analysis of the submitted time series included comparison to observations, summarized with well-known measures such as biases, RMSE, correlations, and sector-wise statistics, e.g. frequency and Weibull A and k. The comparison also includes the observed and modeled temporal spectra. The various statistics were grouped as a function of the various models, their spatial resolution, forcing data, and the various integration methods. Many statistics have been computed and will be presented in addition to those shown in the Helsinki presentation. 4. Conclusions The analysis of the time series from twenty entries has proven to be an invaluable source of information about the state of the art in wind modeling with mesoscale models. Biases between the simulated and observed wind speeds at hub heights (80-100 m AGL) from the various models are around ±1.0 m/s and fairly independent of the site and do not seem to be directly related to the model horizontal resolution used in the modeling. As probably expected, the wind speeds from the simulations using the various versions of the WRF model cluster close to each other, especially in their description of the wind profile.
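
    The statistics named above (bias, RMSE, correlation, and sector-wise Weibull A and k) are standard quantities; the snippet below illustrates how they might be computed with NumPy and SciPy on a synthetic wind-speed series. It is not the project's evaluation code, and the numbers carry no meaning beyond the illustration.

      # Illustrative bias / RMSE / correlation / Weibull fit on synthetic wind speeds.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      obs = stats.weibull_min.rvs(2.1, scale=8.0, size=8760, random_state=3)  # "observed" wind (m/s)
      mod = np.clip(obs + rng.normal(0.5, 1.5, size=obs.size), 0.1, None)     # "modelled" wind

      bias = float(np.mean(mod - obs))
      rmse = float(np.sqrt(np.mean((mod - obs) ** 2)))
      corr = float(np.corrcoef(mod, obs)[0, 1])
      k, _, A = stats.weibull_min.fit(mod, floc=0.0)   # shape k and scale A, location fixed at 0

      print(f"bias={bias:.2f} m/s  rmse={rmse:.2f} m/s  r={corr:.2f}  "
            f"Weibull A={A:.2f} m/s  k={k:.2f}")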

  17. 14 CFR 1261.413 - Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 5 2010-01-01 2010-01-01 false Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. 1261.413 Section 1261.413 Aeronautics and Space NATIONAL...) § 1261.413 Analysis of costs; automation; prevention of overpayments, delinquencies, or defaults. The...

  18. Development of a semi-automated method for subspecialty case distribution and prediction of intraoperative consultations in surgical pathology

    Directory of Open Access Journals (Sweden)

    Raul S Gonzalez

    2015-01-01

    Full Text Available Background: In many surgical pathology laboratories, operating room schedules are prospectively reviewed to determine specimen distribution to different subspecialty services and to predict the number and nature of potential intraoperative consultations for which prior medical records and slides require review. At our institution, such schedules were manually converted into easily interpretable, surgical pathology-friendly reports to facilitate these activities. This conversion, however, was time-consuming and arguably a non-value-added activity. Objective: Our goal was to develop a semi-automated method of generating these reports that improved their readability while taking less time to perform than the manual method. Materials and Methods: A dynamic Microsoft Excel workbook was developed to automatically convert published operating room schedules into different tabular formats. Based on the surgical procedure descriptions in the schedule, a list of linked keywords and phrases was utilized to sort cases by subspecialty and to predict potential intraoperative consultations. After two trial-and-optimization cycles, the method was incorporated into standard practice. Results: The workbook distributed cases to appropriate subspecialties and accurately predicted intraoperative requests. Users indicated that they spent 1-2 fewer hours per day on this activity than before, and team members preferred the formatting of the newer reports. Comparison of the manual and semi-automatic predictions showed that the mean daily difference in predicted versus actual intraoperative consultations underwent no statistically significant changes before and after implementation for most subspecialties. Conclusions: A well-designed, lean, and simple information technology solution to determine subspecialty case distribution and prediction of intraoperative consultations in surgical pathology is approximately as accurate as the gold standard manual method and requires less
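
    The published tool is an Excel workbook, but the keyword-matching idea it describes translates naturally to a few lines of code. The sketch below is a rough Python analogue in which the subspecialties, keywords and procedure strings are all invented for illustration.

      # Rough illustration of keyword-based case triage; not the published workbook.
      SUBSPECIALTY_KEYWORDS = {
          "GI": ["colectomy", "whipple", "esophagectomy"],
          "Breast": ["mastectomy", "lumpectomy"],
          "Neuro": ["craniotomy", "laminectomy"],
      }
      CONSULT_KEYWORDS = ["frozen section", "margin", "sentinel node"]  # intraoperative consults

      def triage(procedure: str):
          text = procedure.lower()
          service = next((s for s, kws in SUBSPECIALTY_KEYWORDS.items()
                          if any(k in text for k in kws)), "Unassigned")
          consult_likely = any(k in text for k in CONSULT_KEYWORDS)
          return service, consult_likely

      schedule = [
          "Left mastectomy with sentinel node biopsy",
          "Laparoscopic colectomy",
          "Craniotomy for tumor, frozen section requested",
      ]
      for proc in schedule:
          print(proc, "->", triage(proc))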

  19. Towards semi-automated assistance for the treatment of stress disorders

    NARCIS (Netherlands)

    van der Sluis, Frans; van den Broek, Egon; Dijkstra, Ton; Fred, A.; Filipe, J.; Gamboa, H.

    2010-01-01

    People who suffer from a stress disorder have a severe handicap in daily life. In addition, stress disorders are complex and consequently, hard to define and hard to treat. Semi-automatic assistance was envisioned that helps in the treatment of a stress disorder. Speech was considered to provide an

  20. Semi-supervised learning for ordinal Kernel Discriminant Analysis.

    Science.gov (United States)

    Pérez-Ortiz, M; Gutiérrez, P A; Carbonero-Ruz, M; Hervás-Martínez, C

    2016-12-01

    Ordinal classification considers those classification problems where the labels of the variable to predict follow a given order. Naturally, labelled data is scarce or difficult to obtain in this type of problems because, in many cases, ordinal labels are given by a user or expert (e.g. in recommendation systems). Firstly, this paper develops a new strategy for ordinal classification where both labelled and unlabelled data are used in the model construction step (a scheme which is referred to as semi-supervised learning). More specifically, the ordinal version of kernel discriminant learning is extended for this setting considering the neighbourhood information of unlabelled data, which is proposed to be computed in the feature space induced by the kernel function. Secondly, a new method for semi-supervised kernel learning is devised in the context of ordinal classification, which is combined with our developed classification strategy to optimise the kernel parameters. The experiments conducted compare 6 different approaches for semi-supervised learning in the context of ordinal classification in a battery of 30 datasets, showing (1) the good synergy of the ordinal version of discriminant analysis and the use of unlabelled data and (2) the advantage of computing distances in the feature space induced by the kernel function. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Analysis of dicentrics in human lymphocytes exposed to ionizing radiation using the automated system and optical microscope

    International Nuclear Information System (INIS)

    Martinez A, J.

    2016-01-01

    Ionizing radiation is a form of energy that produces ionizations in the molecules it traverses. When high-energy radiation interacts with the structure of human chromosomes, chromosome aberrations arise, mainly of the dicentric type: the union of two damaged chromosomes, represented by two centromeres and an accompanying acentric fragment. There are situations where a population may be affected by the release of radioactive material and it is impossible to determine in a short time the absorbed dose to which each person was exposed. Dicentric analysis of cultured human lymphocytes, read under the optical microscope, is used to estimate doses from exposure to ionizing radiation. The objective of this work is to analyze dicentric chromosomal lesions using the optical microscope in comparison with a semi-automated system, in order to respond promptly to radiological emergencies. For this study, two samples irradiated with 60Co were analyzed, one at the Instituto Nacional de Investigaciones Nucleares (ININ), reaching doses of 2.7 ± 0.1 and 0.85 ± 0.1 Gy, and the other at Walischmiller Engineering GmbH, Markdorf (Germany), reaching doses of 0.84 ± 0.3 and 2.8 ± 0.1 Gy. Lymphocyte cultures were performed following IAEA recommendations, using minimum essential medium (MEM) previously prepared with BrdU, sodium heparin, antibiotic and L-glutamine. Phytohemagglutinin and fetal calf serum were added to the sample, which was incubated at 37 degrees Celsius for 48 hours; colcemid was added three hours before the end of incubation. After culture, KCl was added and slides were prepared by washing with the 3:1 acetic acid fixative solution and staining with Giemsa. Readings of 1000 cells were performed with the optical microscope and with the automated system, according to study protocols and quality standards, to estimate absorbed dose by means of dicentric analysis as defined by ISO 19238. With the automated system similar results of absorbed dose were obtained with respect to
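
    Dose estimation from dicentric yields, as standardized in ISO 19238 and the IAEA cytogenetic dosimetry manuals, typically inverts a linear-quadratic calibration curve Y = c + aD + bD^2. The sketch below shows that inversion with placeholder coefficients; it is not the laboratory's actual calibration curve.

      # Invert a linear-quadratic dicentric yield curve for dose; the coefficients
      # are placeholders, not the laboratory's calibration.
      import math

      def dose_from_yield(y, c=0.001, a=0.03, b=0.06):
          """Positive root of b*D^2 + a*D + (c - y) = 0, i.e. the dose D in Gy."""
          disc = a * a - 4.0 * b * (c - y)
          if disc < 0:
              raise ValueError("observed yield is below background")
          return (-a + math.sqrt(disc)) / (2.0 * b)

      dicentrics, cells = 230, 1000  # e.g. 230 dicentrics scored in 1000 cells
      print(f"estimated dose: {dose_from_yield(dicentrics / cells):.2f} Gy")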

  2. Time-Motion Analysis of Four Automated Systems for the Detection of Chlamydia trachomatis and Neisseria gonorrhoeae by Nucleic Acid Amplification Testing.

    Science.gov (United States)

    Williams, James A; Eddleman, Laura; Pantone, Amy; Martinez, Regina; Young, Stephen; Van Der Pol, Barbara

    2014-08-01

    Next-generation diagnostics for Chlamydia trachomatis and Neisseria gonorrhoeae are available on semi- or fully-automated platforms. These systems require less hands-on time than older platforms and are user friendly. Four automated systems, the ABBOTT m2000 system, Becton Dickinson Viper System with XTR Technology, Gen-Probe Tigris DTS system, and Roche cobas 4800 system, were evaluated for total run time, hands-on time, and walk-away time. All of the systems evaluated in this time-motion study were able to complete a diagnostic test run within an 8-h work shift, instrument setup and operation were straightforward and uncomplicated, and walk-away time ranged from approximately 90 to 270 min in a head-to-head comparison of each system. All of the automated systems provide technical staff with increased time to perform other tasks during the run, offer easy expansion of the diagnostic test menu, and have the ability to increase specimen throughput. © 2013 Society for Laboratory Automation and Screening.

  3. Automated tool for virtual screening and pharmacology-based pathway prediction and analysis

    Directory of Open Access Journals (Sweden)

    Sugandh Kumar

    2017-10-01

    Full Text Available Virtual screening is an effective tool for lead identification in drug discovery. However, there are limited numbers of crystal structures available as compared to the number of biological sequences, which makes Structure-Based Drug Discovery (SBDD) a difficult choice. The current tool is an attempt to automate protein structure modelling and virtual screening followed by pharmacology-based prediction and analysis. Starting from sequence(s), this tool automates protein structure modelling, binding site identification, automated docking, ligand preparation, post-docking analysis and identification of hits in the biological pathways that can be modulated by a group of ligands. This automation helps in the characterization of ligand selectivity and the action of ligands on a complex biological molecular network as well as on individual receptors. The judicious combination of ligands binding different receptors can be used to inhibit selective biological pathways in a disease. This tool also allows the user to systematically investigate network-dependent effects of a drug or drug candidate.

  4. Interscan variation of semi-automated volumetry of subsolid pulmonary nodules

    NARCIS (Netherlands)

    Scholten, Ernst Th; de Jong, Pim A.; Jacobs, Colin; van Ginneken, Bram; van Riel, Sarah; Willemink, Martin J.; Vliegenthart, Rozemarijn; Oudkerk, Matthijs; de Koning, Harry J.; Horeweg, Nanda; Prokop, Mathias; Mali, Willem P. Th. M.; Gietema, Hester A.

    We aimed to test the interscan variation of semi-automatic volumetry of subsolid nodules (SSNs), as growth evaluation is important for SSN management. From a lung cancer screening trial all SSNs that were stable over at least 3 months were included (N = 44). SSNs were quantified on the baseline CT

  5. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    Science.gov (United States)

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
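
    The abstract mentions a discrete wavelet transform with multiresolution analysis; a minimal sketch of that kind of decomposition, using the PyWavelets package on a synthetic phantom region of interest, is given below. It summarises detail-coefficient energy per band and is not the authors' scoring algorithm.

      # Multiresolution wavelet decomposition of a synthetic phantom ROI (PyWavelets);
      # detail-band energy is a crude stand-in for feature-detection scoring.
      import numpy as np
      import pywt

      rng = np.random.default_rng(4)
      roi = rng.normal(100.0, 2.0, size=(128, 128))  # uniform background plus noise
      roi[60:68, 60:68] += 15.0                      # simulated speck group

      coeffs = pywt.wavedec2(roi, "db2", level=3)    # [approx, (H3,V3,D3), ..., (H1,V1,D1)]
      for band, (ch, cv, cd) in enumerate(coeffs[1:], start=1):   # coarse -> fine
          energy = float((ch ** 2).sum() + (cv ** 2).sum() + (cd ** 2).sum())
          print(f"detail band {band} (coarse to fine): energy = {energy:.1f}")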

  6. Capacity analysis of an automated kit transportation system

    NARCIS (Netherlands)

    Zijm, W.H.M.; Adan, I.J.B.F.; Buitenhek, R.; Houtum, van G.J.J.A.N.

    2000-01-01

    In this paper, we present a capacity analysis of an automated transportation system in a flexible assembly factory. The transportation system, together with the workstations, is modeled as a network of queues with multiple job classes. Due to its complex nature, the steady-state behavior of this

  7. Automated Groundwater Screening

    International Nuclear Information System (INIS)

    Taylor, Glenn A.; Collard, Leonard B.

    2005-01-01

    The Automated Intruder Analysis has been extended to include an Automated Ground Water Screening option. This option screens 825 radionuclides while rigorously applying the National Council on Radiation Protection (NCRP) methodology. An extension to that methodology is presented to give a more realistic screening factor for those radionuclides which have significant daughters. The extension has the promise of reducing the number of radionuclides which must be tracked by the customer. By combining the Automated Intruder Analysis with the Automated Groundwater Screening, a consistent set of assumptions and databases is used. A method is proposed to eliminate trigger values by performing rigorous calculation of the screening factor, thereby reducing the number of radionuclides sent to further analysis. Using the same problem definitions as in previous groundwater screenings, the automated groundwater screening found one additional nuclide, Ge-68, which failed the screening. It also found that 18 of the 57 radionuclides contained in NCRP Table 3.1 failed the screening. This report describes the automated groundwater screening computer application.

  8. Development of an Automated Technique for Failure Modes and Effect Analysis

    DEFF Research Database (Denmark)

    Blanke, M.; Borch, Ole; Allasia, G.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design...... methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module...... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As the main result, this technique will provide the design engineer with decision tables for fault handling...

  9. Development of an automated technique for failure modes and effect analysis

    DEFF Research Database (Denmark)

    Blanke, Mogens; Borch, Ole; Bagnoli, F.

    1999-01-01

    Advances in automation have provided integration of monitoring and control functions to enhance the operator's overview and ability to take remedial actions when faults occur. Automation in plant supervision is technically possible with integrated automation systems as platforms, but new design...... methods are needed to cope efficiently with the complexity and to ensure that the functionality of a supervisor is correct and consistent. In particular these methods are expected to significantly improve fault tolerance of the designed systems. The purpose of this work is to develop a software module...... implementing an automated technique for Failure Modes and Effects Analysis (FMEA). This technique is based on the matrix formulation of FMEA for the investigation of failure propagation through a system. As the main result, this technique will provide the design engineer with decision tables for fault handling...
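
    The matrix formulation of FMEA mentioned in both records treats failure propagation as reachability in a component graph. As a generic illustration (component names and couplings invented, not the module described above), a boolean propagation matrix and its transitive closure give every end effect reachable from each fault:

      # Generic failure-propagation sketch: P[i][j] means "a fault in component i
      # propagates to component j"; the transitive closure lists all end effects.
      import numpy as np

      components = ["sensor", "controller", "actuator", "plant"]
      P = np.array([
          [0, 1, 0, 0],   # sensor fault corrupts the controller input
          [0, 0, 1, 0],   # controller fault drives the actuator wrongly
          [0, 0, 0, 1],   # actuator fault disturbs the plant
          [0, 0, 0, 0],
      ], dtype=bool)

      reach = P.copy()
      for _ in components:                     # Warshall-style closure
          reach = reach | ((reach.astype(int) @ P.astype(int)) > 0)

      for i, name in enumerate(components):
          effects = [components[j] for j in np.flatnonzero(reach[i])] or ["none"]
          print(f"fault in {name} -> effects on: {effects}")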

  10. Automated Detection of Thermo-Erosion in High Latitude Ecosystems

    Science.gov (United States)

    Lara, M. J.; Chipman, M. L.; Hu, F.

    2017-12-01

    Detecting permafrost disturbance is of critical importance as the severity of climate change and associated increase in wildfire frequency and magnitude impacts regional to global carbon dynamics. However, it has not been possible to evaluate spatiotemporal patterns of permafrost degradation over large regions of the Arctic, due to limited spatial and temporal coverage of high resolution optical, radar, lidar, or hyperspectral remote sensing products. Here we present the first automated multi-temporal analysis for detecting disturbance in response to permafrost thaw, using meso-scale high-frequency remote sensing products (i.e. the entire Landsat image archive). This approach was developed, tested, and applied in the Noatak National Preserve (26,500 km2) in northwestern Alaska. We identified thermo-erosion (TE) by capturing the indirect spectral signal associated with episodic sediment plumes in adjacent waterbodies following TE disturbance. We isolated this turbidity signal within lakes during summer (mid-summer & late-summer) and annual time-period image composites (1986-2016), using the cloud-based geospatial parallel processing platform, Google Earth Engine™ API. We validated the TE detection algorithm using seven consecutive years of sub-meter high resolution imagery (2009-2015) covering 798 (~33%) of the 2456 total lakes in the Noatak lowlands. Our approach had "good agreement" with sediment pulses and landscape deformation in response to permafrost thaw (overall accuracy and kappa coefficient of 85% and 0.61). We found active TE to impact 10.4% of all lakes, though it was inter-annually variable, with the highest and lowest TE years represented by 1986 (~41.1%) and 2002 (~0.7%), respectively. We estimate thaw slumps, lake erosion, lake drainage, and gully formation to account for 23.3, 61.8, 12.5, and 1.3% of all active TE across the Noatak National Preserve. Preliminary analysis suggests TE may be subject to a hysteresis effect following extreme climatic
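
    The agreement figures quoted above (overall accuracy and a kappa coefficient) come from a confusion matrix between the automated detections and the high-resolution validation imagery. The snippet below shows how those two measures are computed; the counts are invented, not the study's validation data.

      # Overall accuracy and Cohen's kappa from a 2x2 confusion matrix (invented counts).
      import numpy as np

      #                    predicted: no TE, TE
      confusion = np.array([[520,  60],        # reference: no thermo-erosion
                            [ 58, 160]])       # reference: thermo-erosion

      n = confusion.sum()
      observed = np.trace(confusion) / n                              # overall accuracy
      expected = (confusion.sum(0) * confusion.sum(1)).sum() / n**2   # chance agreement
      kappa = (observed - expected) / (1 - expected)
      print(f"overall accuracy = {observed:.2f}, kappa = {kappa:.2f}")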

  11. Explicit simulation of a midlatitude Mesoscale Convective System

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, G.D.; Cotton, W.R. [Colorado State Univ., Fort Collins, CO (United States)

    1996-04-01

    We have explicitly simulated the mesoscale convective system (MCS) observed on 23-24 June 1985 during PRE-STORM, the Preliminary Regional Experiment for Stormscale Operational and Research Meteorology. Stensrud and Maddox (1988), Johnson and Bartels (1992), and Bernstein and Johnson (1994) are among the researchers who have investigated various aspects of this MCS event. We have performed this MCS simulation (and a similar one of a tropical MCS; Alexander and Cotton 1994) in the spirit of the Global Energy and Water Cycle Experiment Cloud Systems Study (GCSS), in which cloud-resolving models are used to assist in the formulation and testing of cloud parameterization schemes for larger-scale models. In this paper, we describe (1) the nature of our 23-24 June MCS simulation and (2) our efforts to date in using our explicit MCS simulations to assist in the development of a GCM parameterization for mesoscale flow branches. The paper is organized as follows. First, we discuss the synoptic situation surrounding the 23-24 June PRE-STORM MCS, followed by a discussion of the model setup and results of our simulation. We then discuss the use of our MCS simulations in developing a GCM parameterization for mesoscale flow branches and summarize our results.

  12. Human-centered automation of testing, surveillance and maintenance

    International Nuclear Information System (INIS)

    Bhatt, S.C.; Sun, B.K.H.

    1991-01-01

    Manual surveillance and testing of instrumentation, control and protection systems at nuclear power plants involves system and human errors which can lead to substantial plant downtime. Frequent manual testing can also contribute significantly to operation and maintenance cost. Automation technology offers potential for prudent applications at the power plant to reduce testing errors and cost. To help address the testing problems and to harness the benefits of automation, input from utilities is obtained on suitable automation approaches. This paper includes lessons from successful past experience at a few plants where some islands of automation exist. The results are summarized as a set of specifications for semi-automatic testing. A human-centered automation methodology is proposed, with guidelines given for the optimal division of tasks between human and computer. Implementation obstacles for significant changes of testing practices are identified and methods acceptable to nuclear power plants for addressing these obstacles have been suggested.

  13. North American Mesoscale Forecast System (NAM) [12 km

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The North American Mesoscale Forecast System (NAM) is one of the major regional weather forecast models run by the National Centers for Environmental Prediction...

  14. Automated Diatom Analysis Applied to Traditional Light Microscopy: A Proof-of-Concept Study

    Science.gov (United States)

    Little, Z. H. L.; Bishop, I.; Spaulding, S. A.; Nelson, H.; Mahoney, C.

    2017-12-01

    Diatom identification and enumeration by high resolution light microscopy is required for many areas of research and water quality assessment. Such analyses, however, are both expertise- and labor-intensive. These challenges motivate the need for an automated process to efficiently and accurately identify and enumerate diatoms. Improvements in particle analysis software have increased the likelihood that diatom enumeration can be automated. VisualSpreadsheet software provides a possible solution for automated particle analysis of high-resolution light microscope diatom images. We applied the software, independent of its complementary FlowCam hardware, to automated analysis of light microscope images containing diatoms. Through numerous trials, we arrived at threshold settings to correctly segment 67% of the total possible diatom valves and fragments from broad fields of view. (183 light microscope images were examined containing 255 diatom particles. Of the 255 diatom particles present, 216 diatom valves and fragments of valves were processed, with 170 properly analyzed and focused upon by the software). Manual analysis of the images yielded 255 particles in 400 seconds, whereas the software yielded a total of 216 particles in 68 seconds, thus highlighting that the software has an approximate five-fold efficiency advantage in particle analysis time. As in past efforts, incomplete or incorrect recognition was found for images with multiple valves in contact or valves with little contrast. The software has potential to be an effective tool in assisting taxonomists with diatom enumeration by completing a large portion of analyses. Benefits and limitations of the approach are presented to allow for development of future work in image analysis and automated enumeration of traditional light microscope images containing diatoms.

  15. Automated image analysis for quantification of filamentous bacteria

    DEFF Research Database (Denmark)

    Fredborg, Marlene; Rosenvinge, Flemming Schønning; Spillum, Erik

    2015-01-01

    in systems relying on colorimetry or turbidometry (such as Vitek-2, Phoenix, MicroScan WalkAway). The objective was to examine an automated image analysis algorithm for quantification of filamentous bacteria using the 3D digital microscopy imaging system, oCelloScope. Results Three E. coli strains displaying...

  16. Development of a robotics system for automated chemical analysis of sediments, sludges, and soils

    International Nuclear Information System (INIS)

    McGrail, B.P.; Dodson, M.G.; Skorpik, J.R.; Strachan, D.M.; Barich, J.J.

    1989-01-01

    Adaptation and use of a high-reliability robot to conduct a standard laboratory procedure for soil chemical analysis are reported. Results from a blind comparative test were used to obtain a quantitative measure of the improvement in precision possible with the automated test method. Results from the automated chemical analysis procedure were compared with values obtained from an EPA-certified lab and with results from a more extensive interlaboratory round robin conducted by the EPA. For several elements, up to fivefold improvement in precision was obtained with the automated test method

  17. 3D neuromelanin-sensitive magnetic resonance imaging with semi-automated volume measurement of the substantia nigra pars compacta for diagnosis of Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Ogisu, Kimihiro; Shirato, Hiroki [Hokkaido University Graduate School of Medicine, Department of Radiology, Hokkaido (Japan); Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Division of Ultrahigh Field MRI, Iwate (Japan); Sakushima, Ken; Yabe, Ichiro; Sasaki, Hidenao [Hokkaido University Hospital, Department of Neurology, Hokkaido (Japan); Terae, Satoshi; Nakanishi, Mitsuhiro [Hokkaido University Hospital, Department of Radiology, Hokkaido (Japan)

    2013-06-15

    Neuromelanin-sensitive MRI has been reported to be used in the diagnosis of Parkinson's disease (PD), which results from loss of dopamine-producing cells in the substantia nigra pars compacta (SNc). In this study, we aimed to apply a 3D turbo field echo (TFE) sequence for neuromelanin-sensitive MRI and to evaluate the diagnostic performance of a semi-automated method for measurement of SNc volume in patients with PD. We examined 18 PD patients and 27 healthy volunteers (control subjects). A 3D TFE technique with off-resonance magnetization transfer pulse was used for neuromelanin-sensitive MRI on a 3T scanner. The SNc volume was semi-automatically measured using a region-growing technique at various thresholds (ranging from 1.66 to 2.48), with the signals measured relative to that for the superior cerebellar peduncle. Receiver operating characteristic (ROC) analysis was performed at all thresholds. Intra-rater reproducibility was evaluated by intraclass correlation coefficient (ICC). The average SNc volume in the PD group was significantly smaller than that in the control group at all the thresholds (P < 0.01, Student's t test). At higher thresholds (>2.0), the area under the curve of ROC (Az) increased (0.88). In addition, we observed balanced sensitivity and specificity (0.83 and 0.85, respectively). At lower thresholds, sensitivity tended to increase but specificity reduced in comparison with that at higher thresholds. ICC was larger than 0.9 when the threshold was over 1.86. Our method can distinguish the PD group from the control group with high sensitivity and specificity, especially for early stages of PD. (orig.)
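
    The volume measurement above relies on region growing at a relative-signal threshold. A simplified two-dimensional sketch of threshold-constrained region growing from a seed point is shown below; it is schematic, not the authors' implementation, and the image, threshold and voxel volume are invented.

      # Simplified 2D threshold-constrained region growing from a seed point.
      from collections import deque
      import numpy as np

      def region_grow(rel_signal, seed, threshold):
          """Boolean mask of pixels connected to `seed` whose signal >= threshold."""
          mask = np.zeros(rel_signal.shape, dtype=bool)
          queue = deque([seed])
          while queue:
              y, x = queue.popleft()
              if mask[y, x] or rel_signal[y, x] < threshold:
                  continue
              mask[y, x] = True
              for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                  ny, nx = y + dy, x + dx
                  if 0 <= ny < rel_signal.shape[0] and 0 <= nx < rel_signal.shape[1]:
                      queue.append((ny, nx))
          return mask

      rng = np.random.default_rng(5)
      img = rng.normal(1.5, 0.1, size=(64, 64))   # signal relative to a reference structure
      img[28:36, 28:36] = 2.3                     # simulated high-signal region
      mask = region_grow(img, seed=(32, 32), threshold=2.0)
      print("volume estimate:", mask.sum() * 0.5, "mm^3  (0.5 mm^3 per voxel, hypothetical)")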

  18. Automated Tracking of Tornado-Producing Mesoscale Convective Systems in the United States

    Science.gov (United States)

    Kuo, K.; Hong, Y.; Clune, T. L.

    2011-12-01

    The great majority of Earth Science events are studied using "snap-shot" observations in time, mainly due to the scarcity of observations with dense temporal coverage and the lack of robust methods amenable to connecting the "snap shots". To enable the studies of these events in the four-dimensional (4D) spatiotemporal space and to demonstrate the utility of this capability, we have applied the neighbor enclosed area tracking (NEAT) method of Inatsu (2009) to three years of high-resolution (in both time and space) NEXRAD-derived and rain-gauge-corrected QE2 precipitation observations and GOES satellite Rapid Scan Operation imagery to track tornado-producing mesoscale convective systems (MCS's). We combine information from the databases of the Tornado History Project (which provides tornado occurrence and trajectory) and the NWS Watch/Warning Archive (which provides severe weather watch/warning locations) to obtain initial estimate of the time and location of a tornado-producing MCS. The NEAT algorithm is then applied to QE2 and GOES data, both forward and backward in time, to identify the entire system as one integral entity from its inception to its eventual dissipation in the 4D spatiotemporal space. For each system so identified, we extract its morphological/structural parameters, such as perimeter length, area, and orientation, from each of the snap shots in time. We also record physical parameters such as minimum and maximum precipitation rates. In addition, we perform areal integral on the precipitation rate field, which in turn enables time integral for the entire MCS throughout its lifecycle to obtain an estimate of the system's precipitation production. We can extend this proof-of-concept prototype to other precipitation producing severe weather events, such as blizzards. Furthermore, the spatiotemporal data collected may be used to discover other data, such as satellite remote sensing observations and model analyses/simulations, which can then be combined
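
    The precipitation-production estimate described above is an areal integral over the tracked system's mask at each snapshot, followed by a time integral over the system's lifecycle. The sketch below shows that double integration on synthetic fields (grid spacing, time step and rates are invented).

      # Areal + time integration of precipitation over a tracked system's mask
      # (synthetic fields; 1 mm of depth over 1 km^2 equals 1e3 m^3 of water).
      import numpy as np

      rng = np.random.default_rng(6)
      n_times, ny, nx = 12, 100, 100
      rate = rng.gamma(2.0, 2.0, size=(n_times, ny, nx))   # precipitation rate (mm/h)
      masks = np.zeros(rate.shape, dtype=bool)
      for t in range(n_times):                             # tracked system drifting eastward
          masks[t, 40:60, 10 + 5 * t:30 + 5 * t] = True

      pixel_area_km2 = 4.0 * 4.0    # e.g. a 4 km grid
      dt_hours = 1.0
      production = float((rate * masks).sum() * pixel_area_km2 * dt_hours)  # mm km^2
      print(f"lifecycle production ≈ {production:.0f} mm km^2 "
            f"(≈ {production * 1e3:.2e} m^3 of water)")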

  19. Data assimilation of a ten-day period during June 1993 over the Southern Great Plains Site using a nested mesoscale model

    Energy Technology Data Exchange (ETDEWEB)

    Dudhia, J.; Guo, Y.R. [National Center for Atmospheric Research, Boulder, CO (United States)

    1996-04-01

    A goal of the Atmospheric Radiation Measurement (ARM) Program has been to obtain a complete representation of physical processes on the scale of a general circulation model (GCM) grid box in order to better parameterize radiative processes in these models. Since an observational network of practical size cannot be used alone to characterize the Cloud and Radiation Testbed (CART) site's 3D structure and time development, data assimilation using the enhanced observations together with a mesoscale model is used to give a full 4D analysis at high resolution. The National Center for Atmospheric Research (NCAR)/Penn State Mesoscale Model (MM5) has been applied over a ten-day continuous period in a triple-nested mode with grid sizes of 60, 20 and 6.67 km. The outer domain covers the United States' 48 contiguous states; the innermost is a 480-km square centered on Lamont, Oklahoma. A simulation has been run with data assimilation using the Mesoscale Analysis and Prediction System (MAPS) 60-km analyses from the Forecast Systems Laboratory (FSL) of the National Oceanic and Atmospheric Administration (NOAA). The nested domains take boundary conditions from and feed back continually to their parent meshes (i.e., they are two-way interactive). As reported last year, this provided a simulation of the basic features of mesoscale events over the CART site during the period 16-26 June 1993 when an Intensive Observation Period (IOP) was under way.

  20. The influence of mesoscale porosity on cortical bone anisotropy. Investigations via asymptotic homogenization

    Science.gov (United States)

    Parnell, William J; Grimal, Quentin

    2008-01-01

    Recently, the mesoscale of cortical bone has been given particular attention in association with novel experimental techniques such as nanoindentation, micro-computed X-ray tomography and quantitative scanning acoustic microscopy (SAM). A need has emerged for reliable mathematical models to interpret the related microscopic and mesoscopic data in terms of effective elastic properties. In this work, a new model of cortical bone elasticity is developed and used to assess the influence of mesoscale porosity on the induced anisotropy of the material. Only the largest pores (Haversian canals and resorption cavities), characteristic of the mesoscale, are considered. The input parameters of the model are derived from typical mesoscale experimental data (e.g. SAM data). We use the method of asymptotic homogenization to determine the local effective elastic properties by modelling the propagation of low-frequency elastic waves through an idealized material that models the local mesostructure. We use a novel solution of the cell problem developed by Parnell & Abrahams. This solution is stable for the physiological range of variation of mesoscopic porosity and elasticity found in bone. Results are computed efficiently (in seconds) and the solutions can be implemented easily by other workers. Parametric studies are performed in order to assess the influence of mesoscopic porosity, the assumptions regarding the material inside the mesoscale pores (drained or undrained bone) and the shape of pores. Results are shown to be in good qualitative agreement with existing schemes and we describe the potential of the scheme for future use in modelling more complex microstructures for cortical bone. In particular, the scheme is shown to be a useful tool with which to predict the qualitative changes in anisotropy due to variations in the structure at the mesoscale. PMID:18628200

  1. Initial development of an automated task analysis profiling system

    International Nuclear Information System (INIS)

    Jorgensen, C.C.

    1984-01-01

    A program for automated task analysis is described. Called TAPS (task analysis profiling system), the program accepts normal English prose and outputs skills, knowledges, attitudes, and abilities (SKAAs) along with specific guidance and recommended ability measurement tests for nuclear power plant operators. A new method for defining SKAAs is presented along with a sample program output

  2. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies using tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to take advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, which provides the means for implementing renewable chemosensors or even for using more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their use in automated flow-based and flow-related approaches and on the future outlook of QD applications in chemical analysis.

  3. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  4. Avaliação da exeqüibilidade, eficácia e segurança do transplante lamelar semi-automatizado de córnea Evaluation of performance, efficacy and safety of semi-automated lamellar keratoplasty

    Directory of Open Access Journals (Sweden)

    Núbia Cristina de Freitas Maia

    2006-12-01

    Full Text Available PURPOSE: To evaluate the feasibility, efficacy and safety of using a microkeratome and an artificial anterior chamber for lamellar keratoplasty (ALTK® system). METHODS: Twenty-one eyes with superficial corneal opacities underwent semi-automated lamellar keratoplasty. In the recipient eyes, keratectomy was performed as in refractive surgery. Donor lamellae were obtained from corneoscleral buttons using the same microkeratome and an artificial anterior chamber. Corneal thickness was measured by ultrasound biomicroscopy. RESULTS: Surgery was successful in 19 eyes. In 80% of the lamellae obtained from donor corneas and in 84.2% of the lamellae in recipient eyes, the diameter varied by no more than 0.5 mm from the intended value. The thicknesses of the lamellae obtained in the recipient eyes and of the donor lamellae were closely matched. Postoperative corrected visual acuity of 20/40 or better was achieved in 52.6% of the eyes. Complications included inadequate lamella diameter, intraoperative perforation of the recipient eye and postoperative corneal ectasia (one case). CONCLUSIONS: Semi-automated lamellar keratoplasty proved feasible, given the reproducibility of lamella thicknesses and diameters; effective, given the improvement in postoperative visual acuity; and safe, given the low rate of surgical complications.

  5. The Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS)

    National Research Council Canada - National Science Library

    Hodur, Richard M; Hong, Xiaodong; Doyle, James D; Pullen, Julie; Cummings, James; Martin, Paul; Rennick, Mary Alice

    2002-01-01

    ... of the Couple Ocean/Atmosphere Mesoscale Prediction System (COAMPS). The goal of this modeling project is to gain predictive skill in simulating the ocean and atmosphere at high resolution on time-scales of hours to several days...

  6. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    International Nuclear Information System (INIS)

    Yung, J; Stefan, W; Reeve, D; Stafford, RJ

    2015-01-01

    Purpose: Phantom measurements allow for the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100 Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, American College of Radiology (ACR) MR Accreditation Program MR phantom testing, and ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allowed for easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens manufacturers. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, a MySQL database and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show regions of interest (ROIs) for analysis. Historical data can be reviewed to compare with previous years' data and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high quality, ACR MRI Accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large datasets

  7. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    Energy Technology Data Exchange (ETDEWEB)

    Yung, J; Stefan, W; Reeve, D; Stafford, RJ [UT MD Anderson Cancer Center, Houston, TX (United States)

    2015-06-15

    Purpose: Phantom measurements allow for the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100 Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, American College of Radiology (ACR) MR Accreditation Program MR phantom testing, and ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allowed for easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens manufacturers. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, a MySQL database and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show regions of interest (ROIs) for analysis. Historical data can be reviewed to compare with previous years' data and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high quality, ACR MRI Accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large datasets
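
    The record above describes storing phantom measurements in a MySQL database behind a web front end for longitudinal review. The stand-alone sketch below captures the same idea with Python's built-in sqlite3 module so it runs without a server; the table layout and metric names are invented, not the published schema.

      # Stand-in for the measurement database idea: insert phantom QC results and
      # pull one metric back in date order for trend review (sqlite3, in memory).
      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.execute("""CREATE TABLE qc (scanner TEXT, test_date TEXT,
                                       metric TEXT, value REAL)""")
      conn.executemany("INSERT INTO qc VALUES (?, ?, ?, ?)", [
          ("MR01", "2015-06-01", "image_uniformity_pct", 93.4),
          ("MR01", "2015-06-01", "ghosting_ratio", 0.004),
          ("MR01", "2015-07-01", "image_uniformity_pct", 92.8),
      ])

      for row in conn.execute("""SELECT test_date, value FROM qc
                                 WHERE scanner = ? AND metric = ?
                                 ORDER BY test_date""",
                              ("MR01", "image_uniformity_pct")):
          print(row)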

  8. Meso-scale effects of tropical deforestation in Amazonia: preparatory LBA modelling studies

    Directory of Open Access Journals (Sweden)

    A. J. Dolman

    1999-08-01

    Full Text Available As part of the preparation for the Large-Scale Biosphere Atmosphere Experiment in Amazonia, a meso-scale modelling study was executed to highlight deficiencies in the current understanding of land surface atmosphere interaction at local to sub-continental scales in the dry season. Meso-scale models were run in 1-D and 3-D mode for the area of Rondonia State, Brazil. The important conclusions are that without calibration it is difficult to model the energy partitioning of pasture; modelling that of forest is easier due to the absence of a strong moisture deficit signal. The simulation of the boundary layer above forest is good; above deforested areas (pasture) it is poor. The models' underestimate of the temperature of the boundary layer is likely to be caused by the neglect of the radiative effects of aerosols caused by biomass burning, but other factors such as lack of sufficient entrainment in the model at the mixed layer top may also contribute. The Andes generate patterns of subsidence and gravity waves, the effects of which are felt far into the Rondonian area. The results show that the picture presented by GCM modelling studies may need to be balanced by an increased understanding of what happens at the meso-scale. The results are used to identify key measurements for the LBA atmospheric meso-scale campaign needed to improve the model simulations. Similar modelling studies are proposed for the wet season in Rondonia, when convection plays a major role. Key words. Atmospheric composition and structure (aerosols and particles; biosphere-atmosphere interactions) · Meteorology and atmospheric dynamics (mesoscale meteorology)

  9. Meso-scale effects of tropical deforestation in Amazonia: preparatory LBA modelling studies

    Directory of Open Access Journals (Sweden)

    A. J. Dolman

    Full Text Available As part of the preparation for the Large-Scale Biosphere Atmosphere Experiment in Amazonia, a meso-scale modelling study was executed to highlight deficiencies in the current understanding of land surface atmosphere interaction at local to sub-continental scales in the dry season. Meso-scale models were run in 1-D and 3-D mode for the area of Rondonia State, Brazil. The important conclusions are that without calibration it is difficult to model the energy partitioning of pasture; modelling that of forest is easier due to the absence of a strong moisture deficit signal. The simulation of the boundary layer above forest is good; above deforested areas (pasture) it is poor. The models' underestimate of the temperature of the boundary layer is likely to be caused by the neglect of the radiative effects of aerosols caused by biomass burning, but other factors such as lack of sufficient entrainment in the model at the mixed layer top may also contribute. The Andes generate patterns of subsidence and gravity waves, the effects of which are felt far into the Rondonian area. The results show that the picture presented by GCM modelling studies may need to be balanced by an increased understanding of what happens at the meso-scale. The results are used to identify key measurements for the LBA atmospheric meso-scale campaign needed to improve the model simulations. Similar modelling studies are proposed for the wet season in Rondonia, when convection plays a major role.

    Key words. Atmospheric composition and structure (aerosols and particles; biosphere-atmosphere interactions) · Meteorology and atmospheric dynamics (mesoscale meteorology)

  10. Automated analysis of autoradiographic imagery

    International Nuclear Information System (INIS)

    Bisignani, W.T.; Greenhouse, S.C.

    1975-01-01

    A research programme is described which has as its objective the automated characterization of neurological tissue regions from autoradiographs by utilizing hybrid-resolution image processing techniques. An experimental system is discussed which includes raw imagery, scanning and digitizing equipment, feature-extraction algorithms, and regional characterization techniques. The parameters extracted by these algorithms are presented as well as the regional characteristics which are obtained by operating on the parameters with statistical sampling techniques. An approach is presented for validating the techniques and initial experimental results are obtained from an analysis of an autoradiograph of a region of the hypothalamus. An extension of these automated techniques to other biomedical research areas is discussed as well as the implications of applying automated techniques to biomedical research problems. (author)

  11. [Morphometry of pulmonary tissue: From manual to high throughput automation].

    Science.gov (United States)

    Sallon, C; Soulet, D; Tremblay, Y

    2017-12-01

    Weibel's research has shown that any alteration of the pulmonary structure has effects on function. This demonstration required a quantitative analysis of lung structures called morphometry. This is possible thanks to stereology, a set of methods based on principles of geometry and statistics. His work has helped to better understand the morphological harmony of the lung, which is essential for its proper functioning. An imbalance leads to pathophysiology such as chronic obstructive pulmonary disease in adults and bronchopulmonary dysplasia in neonates. It is by studying this imbalance that new therapeutic approaches can be developed. These advances are achievable only through morphometric analytical methods, which are increasingly precise and focused, in particular thanks to the high-throughput automation of these methods. This review makes a comparison between an automated method that we developed in the laboratory and semi-manual methods of morphometric analyses. The automation of morphometric measurements is a fundamental asset in the study of pulmonary pathophysiology because it is an assurance of robustness, reproducibility and speed. This tool will thus contribute significantly to the acceleration of the race for the development of new drugs. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.

  12. Experimental demonstration of a semi-brittle origin for crustal strain transients

    Science.gov (United States)

    Reber, J. E.; Lavier, L. L.; Hayman, N. W.

    2015-12-01

    Tectonic motions that give rise to destructive earthquakes and enigmatic transient slip events are commonly explained by friction laws that describe slip on fault surfaces and gouge-filled zones. Friction laws with the added effects of pore fluid pressure, shear heating, and chemical reactions as currently applied do not take into account that over a wide range of pressure and temperature conditions rocks deform following a complex mixed brittle-ductile rheology. In semi-brittle materials, such as polymineralic rocks, elasto-plastic and visco-elastic deformation can be observed simultaneously in different phases of the material. Field observations of such semi-brittle rocks at the mesoscale have shown that for a given range of composition, temperature, and pressure, the formation of fluid-filled brittle fractures and veins can precede and accompany the development of localized ductile flow. We propose that the coexistence of brittle and viscous behavior controls some of the physical characteristics of strain transients and slow slip events. Here we present results from shear experiments on semi-brittle rock analogues investigating the effect of yield stress on fracture propagation and connection, and how this can lead to recurring strain transients. During the experiments we monitor the evolution of fractures and flow as well as the force development in the system. We show that the nature of localized slip and flow in semi-brittle materials depends on the initiation and formation of mode I and II fractures and does not involve frictional behavior, supporting an alternative mechanism for the development of tectonic strain transients.

  13. Automated vessel segmentation using cross-correlation and pooled covariance matrix analysis.

    Science.gov (United States)

    Du, Jiang; Karimi, Afshin; Wu, Yijing; Korosec, Frank R; Grist, Thomas M; Mistretta, Charles A

    2011-04-01

    Time-resolved contrast-enhanced magnetic resonance angiography (CE-MRA) provides contrast dynamics in the vasculature and allows vessel segmentation based on temporal correlation analysis. Here we present an automated vessel segmentation algorithm including automated generation of regions of interest (ROIs), cross-correlation and pooled sample covariance matrix analysis. The dynamic images are divided into multiple equal-sized regions. In each region, ROIs for artery, vein and background are generated using an iterative thresholding algorithm based on the contrast arrival time map and contrast enhancement map. Region-specific multi-feature cross-correlation analysis and pooled covariance matrix analysis are performed to calculate the Mahalanobis distances (MDs), which are used to automatically separate arteries from veins. This segmentation algorithm is applied to a dual-phase dynamic imaging acquisition scheme where low-resolution time-resolved images are acquired during the dynamic phase followed by high-frequency data acquisition at the steady-state phase. The segmented low-resolution arterial and venous images are then combined with the high-frequency data in k-space and inverse Fourier transformed to form the final segmented arterial and venous images. Results from volunteer and patient studies demonstrate the advantages of this automated vessel segmentation and dual phase data acquisition technique. Copyright © 2011 Elsevier Inc. All rights reserved.
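
    The artery/vein separation step described above reduces to comparing each voxel's temporal-feature vector against arterial and venous reference ROIs via Mahalanobis distances computed from a pooled sample covariance matrix. The following Python/NumPy sketch illustrates only that distance-based classification under simplified assumptions (features already extracted, two reference classes); it is not the authors' implementation.

      import numpy as np

      def pooled_covariance(a, b):
          # Pooled sample covariance of two reference ROIs (rows = samples, columns = features).
          n_a, n_b = len(a), len(b)
          return ((n_a - 1) * np.cov(a, rowvar=False) + (n_b - 1) * np.cov(b, rowvar=False)) / (n_a + n_b - 2)

      def classify_voxel(x, artery_feats, vein_feats):
          # Assign a voxel to 'artery' or 'vein' by the smaller Mahalanobis distance (MD).
          a = np.asarray(artery_feats, float)
          v = np.asarray(vein_feats, float)
          cov_inv = np.linalg.pinv(pooled_covariance(a, v))

          def md(mean):
              d = np.asarray(x, float) - mean
              return float(np.sqrt(d @ cov_inv @ d))

          md_a, md_v = md(a.mean(axis=0)), md(v.mean(axis=0))
          return ("artery" if md_a < md_v else "vein", md_a, md_v)

      # Example with made-up two-feature samples (e.g., contrast arrival time, peak enhancement):
      print(classify_voxel([8.0, 95.0],
                           artery_feats=[[7, 100], [8, 110], [7.5, 105], [8.2, 98]],
                           vein_feats=[[14, 80], [15, 85], [13.5, 78], [14.2, 82]]))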

  14. Evaluation and genetic analysis of semi-dwarf mutants in rice (Oryza sativa L.)

    International Nuclear Information System (INIS)

    Awan, M.A.; Cheema, A.A.; Tahir, G.R.

    1984-01-01

    Four semi-dwarf mutants namely DM16-5-1, DM16-5-2, DM-2 and DM107-4 were derived from the local tall basmati cultivar. The mode of reduction of internode length was studied in DM107-4. The reduction in culm length was due to a corresponding but disproportionate reduction in all the internodes. It was inferred that reduction in internode length contributes more towards reduction in height as compared to the reduction in the total number of internodes. The effect of semi-dwarfism on some yield components (panicle characters) was studied in two semi-dwarf mutants viz. DM16-5-1 and DM107-4 compared to Basmati 370. A marginal reduction in the panicle axis, primary branches per panicle, secondary branches per primary branch per panicle, spikelets borne on secondary branches and total number of spikelets per panicle was observed in DM16-5-1, whereas, a significant reduction of these characters was observed in DM107-4. Evaluation of the semi-dwarf mutants with respect to grain yield and harvest index showed that all the mutants possess high yield potential with higher harvest index values compared to the parent cultivar. Genetic analysis for plant height in 4x4 diallel involving semi-dwarf mutants revealed that mutant DM107-4 carries mainly recessive alleles while mutant DM16-5-1 showed some dominance effects as assessed through the estimates of genetic components of variation and Vr,Wr graph analysis. The semi-dwarf mutants have good potential for use as parents in cross-breeding programmes. (author)

  15. Maggot Instructor: Semi-Automated Analysis of Learning and Memory in Drosophila Larvae

    Directory of Open Access Journals (Sweden)

    Urte Tomasiunaite

    2018-06-01

    Full Text Available For several decades, Drosophila has been widely used as a suitable model organism to study the fundamental processes of associative olfactory learning and memory. More recently, the same has become true for the Drosophila larva, which has become a focus for learning and memory studies based on a number of technical advances in the field of anatomical, molecular, and neuronal analyses. Worth mentioning are the ongoing efforts to reconstruct the complete connectome of the larval brain, featuring a total of about 10,000 neurons, and the development of neurogenic tools that allow individual manipulation of each neuron. By contrast, the standardized behavioral assays that are commonly used to analyze learning and memory in Drosophila larvae exhibit no such technical development. Most commonly, a simple assay with Petri dishes and odor containers is used; in this method, the animals must be manually transferred in several steps. The behavioral approach is therefore labor-intensive and limits the capacity to conduct large-scale genetic screenings in small laboratories. To circumvent these limitations, we introduce a training device called the Maggot Instructor. This device allows automatic training of up to 10 groups of larvae in parallel. To achieve this goal, we used fully automated, computer-controlled optogenetic activation of single olfactory neurons in combination with the application of electric shocks. We showed that Drosophila larvae trained with the Maggot Instructor establish an odor-specific memory that is independent of handling and non-associative effects. The Maggot Instructor will allow large collections of genetically modified larvae to be investigated in a short period and with minimal human resources. The Maggot Instructor should therefore help extensive behavioral experiments in Drosophila larvae keep up with current technical advancements. In the longer term, this will lead to a better understanding of

  16. Automated reasoning applications to design validation and sneak function analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Argonne National Laboratory (ANL) is actively involved in the LMFBR Man-Machine Integration (MMI) Safety Program. The objective of this program is to enhance the operational safety and reliability of fast-breeder reactors by optimum integration of men and machines through the application of human factors principles and control engineering to the design, operation, and the control environment. ANL is developing methods to apply automated reasoning and computerization in the validation and sneak function analysis process. This project provides the element definitions and relations necessary for an automated reasoner (AR) to reason about design validation and sneak function analysis. This project also provides a demonstration of this AR application on an Experimental Breeder Reactor-II (EBR-II) system, the Argonne Cooling System

  17. Automated analysis of gastric emptying

    International Nuclear Information System (INIS)

    Abutaleb, A.; Frey, D.; Spicer, K.; Spivey, M.; Buckles, D.

    1986-01-01

    The authors devised a novel method to automate the analysis of nuclear gastric emptying studies. Many previous methods have been used to measure gastric emptying, but they are cumbersome and require continuing operator intervention. Two specific problems that occur are related to patient movement between images and changes in the location of the radioactive material within the stomach. Their method can be used with either dual or single phase studies. For dual phase studies the authors use In-111 labeled water and Tc-99m SC (sulfur colloid) labeled scrambled eggs. For single phase studies either the liquid or solid phase material is used

  18. Spectral structure of mesoscale winds over the water

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Vincent, Claire Louise; Larsen, Søren Ejling

    2013-01-01

    ...spectra show universal characteristics, in agreement with the findings in literature, including the energy amplitude and the −5/3 spectral slope in the mesoscale range transitioning to a slope of −3 for synoptic and planetary scales. The integral time-scale of the local weather is found to be useful to describe the spectral slope transition as well as the limit for application of the Taylor hypothesis. The stability parameter calculated from point measurements, the bulk Richardson number, is found insufficient to represent the various atmospheric structures that have their own spectral behaviours under different stability conditions, such as open cells and gravity waves. For stationary conditions, the mesoscale turbulence is found to bear some characteristics of two-dimensional isotropy, including (1) very minor vertical variation of spectra; (2) similar spectral behaviour for the along- and across...
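
    Written compactly, the reported slopes amount to a piecewise power law for the wind spectrum S(f); the following LaTeX fragment is a schematic restatement of the behaviour quoted above, not a formula taken from the paper:

      S(f) \;\propto\;
      \begin{cases}
        f^{-3},   & \text{synoptic and planetary scales (lowest frequencies)} \\
        f^{-5/3}, & \text{mesoscale range (higher frequencies)}
      \end{cases}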

  19. Diagnostic performance of semi-quantitative and quantitative stress CMR perfusion analysis: a meta-analysis.

    Science.gov (United States)

    van Dijk, R; van Assen, M; Vliegenthart, R; de Bock, G H; van der Harst, P; Oudkerk, M

    2017-11-27

    Stress cardiovascular magnetic resonance (CMR) perfusion imaging is a promising modality for the evaluation of coronary artery disease (CAD) due to high spatial resolution and absence of radiation. Semi-quantitative and quantitative analysis of CMR perfusion are based on signal-intensity curves produced during the first-pass of gadolinium contrast. Multiple semi-quantitative and quantitative parameters have been introduced. Diagnostic performance of these parameters varies extensively among studies and standardized protocols are lacking. This study aims to determine the diagnostic accuracy of semi-quantitative and quantitative CMR perfusion parameters, compared to multiple reference standards. Pubmed, WebOfScience, and Embase were systematically searched using predefined criteria (3272 articles). A check for duplicates was performed (1967 articles). Eligibility and relevance of the articles were determined by two reviewers using pre-defined criteria. The primary data extraction was performed independently by two researchers with the use of a predefined template. Differences in extracted data were resolved by discussion between the two researchers. The quality of the included studies was assessed using the 'Quality Assessment of Diagnostic Accuracy Studies Tool' (QUADAS-2). True positives, false positives, true negatives, and false negatives were extracted or calculated from the articles. The principal summary measures used to assess diagnostic accuracy were sensitivity, specificity, and area under the receiver operating curve (AUC). Data was pooled according to analysis territory, reference standard and perfusion parameter. Twenty-two articles were eligible based on the predefined study eligibility criteria. The pooled diagnostic accuracy for segment-, territory- and patient-based analyses showed good diagnostic performance with sensitivity of 0.88, 0.82, and 0.83, specificity of 0.72, 0.83, and 0.76 and AUC of 0.90, 0.84, and 0.87, respectively. In per territory
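
    For reference, the per-study and pooled summary measures reported above follow from the extracted confusion-matrix counts. The Python sketch below shows one simple way to compute them (naive pooling by aggregating counts across studies); it is illustrative only and is not the meta-analytic pooling model used in the review.

      def sens_spec(tp, fp, tn, fn):
          # Sensitivity and specificity from confusion-matrix counts.
          sensitivity = tp / (tp + fn)
          specificity = tn / (tn + fp)
          return sensitivity, specificity

      def pooled_sens_spec(studies):
          # Naive pooled estimate: aggregate counts over all studies.
          # Each study is a dict with keys 'tp', 'fp', 'tn', 'fn'.
          tp = sum(s["tp"] for s in studies)
          fp = sum(s["fp"] for s in studies)
          tn = sum(s["tn"] for s in studies)
          fn = sum(s["fn"] for s in studies)
          return sens_spec(tp, fp, tn, fn)

      # Example with made-up counts for two hypothetical studies:
      print(pooled_sens_spec([
          {"tp": 80, "fp": 15, "tn": 60, "fn": 10},
          {"tp": 45, "fp": 8, "tn": 70, "fn": 12},
      ]))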

  20. Automated information retrieval system for radioactivation analysis

    International Nuclear Information System (INIS)

    Lambrev, V.G.; Bochkov, P.E.; Gorokhov, S.A.; Nekrasov, V.V.; Tolstikova, L.I.

    1981-01-01

    An automated information retrieval system for radioactivation analysis has been developed. An ES-1022 computer and a problem-oriented software ''The description information search system'' were used for the purpose. Main aspects and sources of forming the system information fund, characteristics of the information retrieval language of the system are reported and examples of question-answer dialogue are given. Two modes can be used: selective information distribution and retrospective search [ru

  1. Automated X-ray image analysis for cargo security: Critical review and future promise.

    Science.gov (United States)

    Rogers, Thomas W; Jaccard, Nicolas; Morton, Edward J; Griffin, Lewis D

    2017-01-01

    We review the relatively immature field of automated image analysis for X-ray cargo imagery. There is increasing demand for automated analysis methods that can assist in the inspection and selection of containers, due to the ever-growing volumes of traded cargo and the increasing concerns that customs- and security-related threats are being smuggled across borders by organised crime and terrorist networks. We split the field into the classical pipeline of image preprocessing and image understanding. Preprocessing includes: image manipulation; quality improvement; Threat Image Projection (TIP); and material discrimination and segmentation. Image understanding includes: Automated Threat Detection (ATD); and Automated Contents Verification (ACV). We identify several gaps in the literature that need to be addressed and propose ideas for future research. Where the current literature is sparse we borrow from the single-view, multi-view, and CT X-ray baggage domains, which have some characteristics in common with X-ray cargo.

  2. Space Environment Automated Alerts and Anomaly Analysis Assistant (SEA^5) for NASA

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop a comprehensive analysis and dissemination system (Space Environment Automated Alerts  & Anomaly Analysis Assistant: SEA5) that will...

  3. SWOT Analysis of Automation for Cash and Accounts Control in Construction

    OpenAIRE

    Mariya Deriy

    2013-01-01

    The possibility of computerizing control over accounting and information-system data on cash and payments in a company's practical activity has been analyzed, provided that the problem of establishing a well-functioning single computer network between the different units of a developing company is solved. The current state of the control organization and the possibility of its automation have been reviewed. A SWOT analysis of control automation to identify its strengths and weaknesses, obstac...

  4. Using historical wafermap data for automated yield analysis

    International Nuclear Information System (INIS)

    Tobin, K.W.; Karnowski, T.P.; Gleason, S.S.; Jensen, D.; Lakhani, F.

    1999-01-01

    To be productive and profitable in a modern semiconductor fabrication environment, large amounts of manufacturing data must be collected, analyzed, and maintained. This includes data collected from in- and off-line wafer inspection systems and from the process equipment itself. This data is increasingly being used to design new processes, control and maintain tools, and to provide the information needed for rapid yield learning and prediction. Because of increasing device complexity, the amount of data being generated is outstripping the yield engineer's ability to effectively monitor and correct unexpected trends and excursions. The 1997 SIA National Technology Roadmap for Semiconductors highlights a need to address these issues through "automated data reduction algorithms to source defects from multiple data sources and to reduce defect sourcing time." SEMATECH and the Oak Ridge National Laboratory have been developing new strategies and technologies for providing the yield engineer with higher levels of assisted data reduction for the purpose of automated yield analysis. In this article, we will discuss the current state of the art and trends in yield management automation. Copyright 1999 American Vacuum Society

  5. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    Science.gov (United States)

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.

  6. Artificial neural networks for automation of Rutherford backscattering spectroscopy experiments and data analysis

    International Nuclear Information System (INIS)

    Barradas, N.P.; Vieira, A.; Patricio, R.

    2002-01-01

    We present an algorithm based on artificial neural networks able to determine optimized experimental conditions for Rutherford backscattering measurements of Ge-implanted Si. The algorithm can be implemented for any other element implanted into a lighter substrate. It is foreseeable that the method developed in this work can be applied to still many other systems. The algorithm presented is a push-button black box, and does not require any human intervention. It is thus suited for automated control of an experimental setup, given an interface to the relevant hardware. Once the experimental conditions are optimized, the algorithm analyzes the final data obtained, and determines the desired parameters. The method is thus also suited for automated analysis of the data. The algorithm presented can be easily extended to other ion beam analysis techniques. Finally, it is suggested how the artificial neural networks required for automated control and analysis of experiments could be automatically generated. This would be suited for automated generation of the required computer code. Thus could RBS be done without experimentalists, data analysts, or programmers, with only technicians to keep the machines running

  7. Extended -Regular Sequence for Automated Analysis of Microarray Images

    Directory of Open Access Journals (Sweden)

    Jin Hee-Jeong

    2006-01-01

    Full Text Available Microarray study enables us to obtain hundreds of thousands of expressions of genes or genotypes at once, and it is an indispensable technology for genome research. The first step is the analysis of scanned microarray images. This is the most important procedure for obtaining biologically reliable data. Currently most microarray image processing systems require burdensome manual block/spot indexing work. Since the amount of experimental data is increasing very quickly, automated microarray image analysis software becomes important. In this paper, we propose two automated methods for analyzing microarray images. First, we propose the extended -regular sequence to index blocks and spots, which enables a novel automatic gridding procedure. Second, we provide a methodology, hierarchical metagrid alignment, to allow reliable and efficient batch processing for a set of microarray images. Experimental results show that the proposed methods are more reliable and convenient than the commercial tools.

  8. Digital image analysis applied to industrial nondestructive evaluation and automated parts assembly

    International Nuclear Information System (INIS)

    Janney, D.H.; Kruger, R.P.

    1979-01-01

    Many ideas of image enhancement and analysis are relevant to the needs of the nondestructive testing engineer. These ideas not only aid the engineer in the performance of his current responsibilities, they also open to him new areas of industrial development and automation which are logical extensions of classical testing problems. The paper begins with a tutorial on the fundamentals of computerized image enhancement as applied to nondestructive testing, then progresses through pattern recognition and automated inspection to automated, or robotic, assembly procedures. It is believed that such procedures are cost-effective in many instances, and are but the logical extension of those techniques now commonly used, but often limited to analysis of data from quality-assurance images. Many references are given in order to help the reader who wishes to pursue a given idea further

  9. Neuromantic - from semi manual to semi automatic reconstruction of neuron morphology

    Directory of Open Access Journals (Sweden)

    Darren Myatt

    2012-03-01

    Full Text Available The ability to create accurate geometric models of neuronal morphology is important for understanding the role of shape in information processing. Despite a significant amount of research on automating neuron reconstructions from image stacks obtained via microscopy, in practice most data are still collected manually. This paper describes Neuromantic, an open source system for three-dimensional digital tracing of neurites. Neuromantic reconstructions are comparable in quality to those of existing commercial and freeware systems while balancing speed and accuracy of manual reconstruction. The combination of semi-automatic tracing, intuitive editing, and ability of visualising large image stacks on standard computing platforms provides a versatile tool that can help address the reconstructions availability bottleneck. Practical considerations for reducing the computational time and space requirements of the extended algorithm are also discussed.

  10. Recipes for correcting the impact of effective mesoscale resolution on the estimation of extreme winds

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Ott, Søren; Badger, Jake

    2012-01-01

    Extreme winds derived from simulations using mesoscale models are underestimated due to the effective spatial and temporal resolutions. This is reflected in the spectral domain as an energy deficit in the mesoscale range. The energy deficit implies smaller spectral moments and thus underestimatio...

  11. Onset of meso-scale turbulence in active nematics

    NARCIS (Netherlands)

    Doostmohammadi, A.; Shendruk, T.N.; Thijssen, K.; Yeomans, J.M.

    2017-01-01

    Meso-scale turbulence is an innate phenomenon, distinct from inertial turbulence, that spontaneously occurs at low Reynolds number in fluidized biological systems. This spatiotemporal disordered flow radically changes nutrient and molecular transport in living fluids and can strongly affect the

  12. Mesoscale cyclogenesis over the western north Pacific Ocean during TPARC

    Directory of Open Access Journals (Sweden)

    Christopher A. Davis

    2013-01-01

    Full Text Available Three cases of mesoscale marine cyclogenesis over the subtropics of the Western Pacific Ocean are investigated. Each case occurred during the THORPEX Pacific Asia Regional Campaign and Tropical Cyclone Structure (TCS-08 field phases in 2008. Each cyclone developed from remnants of disturbances that earlier showed potential for tropical cyclogenesis within the tropics. Two of the cyclones produced gale-force surface winds, and one, designated as a tropical cyclone, resulted in a significant coastal storm over eastern Japan. Development was initiated by a burst of organized mesoscale convection that consolidated and intensified the surface cyclonic circulation over a period of 12–24 h. Upper-tropospheric potential vorticity anomalies modulated the vertical wind shear that, in turn, influenced the periods of cyclone intensification and weakening. Weak baroclinicity associated with vertical shear was also deemed important in organizing mesoscale ascent and the convection outbreaks. The remnant tropical disturbances contributed exceptional water vapour content to higher latitudes that led to strong diabatic heating, and the tropical remnants contributed vorticity that was the seed of the development in the subtropics. Predictability of these events more than three days in advance appears to be minimal.

  13. Discrimination between smiling faces: Human observers vs. automated face analysis.

    Science.gov (United States)

    Del Líbano, Mario; Calvo, Manuel G; Fernández-Martín, Andrés; Recio, Guillermo

    2018-05-11

    This study investigated (a) how prototypical happy faces (with happy eyes and a smile) can be discriminated from blended expressions with a smile but non-happy eyes, depending on type and intensity of the eye expression; and (b) how smile discrimination differs for human perceivers versus automated face analysis, depending on affective valence and morphological facial features. Human observers categorized faces as happy or non-happy, or rated their valence. Automated analysis (FACET software) computed seven expressions (including joy/happiness) and 20 facial action units (AUs). Physical properties (low-level image statistics and visual saliency) of the face stimuli were controlled. Results revealed, first, that some blended expressions (especially, with angry eyes) had lower discrimination thresholds (i.e., they were identified as "non-happy" at lower non-happy eye intensities) than others (especially, with neutral eyes). Second, discrimination sensitivity was better for human perceivers than for automated FACET analysis. As an additional finding, affective valence predicted human discrimination performance, whereas morphological AUs predicted FACET discrimination. FACET can be a valid tool for categorizing prototypical expressions, but is currently more limited than human observers for discrimination of blended expressions. Configural processing facilitates detection of in/congruence(s) across regions, and thus detection of non-genuine smiling faces (due to non-happy eyes). Copyright © 2018 Elsevier B.V. All rights reserved.

  14. Characterizing the Meso-scale Plasma Flows in Earth's Coupled Magnetosphere-Ionosphere-Thermosphere System

    Science.gov (United States)

    Gabrielse, C.; Nishimura, T.; Lyons, L. R.; Gallardo-Lacourt, B.; Deng, Y.; McWilliams, K. A.; Ruohoniemi, J. M.

    2017-12-01

    NASA's Heliophysics Decadal Survey put forth several imperative, Key Science Goals. The second goal communicates the urgent need to "Determine the dynamics and coupling of Earth's magnetosphere, ionosphere, and atmosphere and their response to solar and terrestrial inputs...over a range of spatial and temporal scales." Sun-Earth connections (called Space Weather) have strong societal impacts because extreme events can disturb radio communications and satellite operations. The field's current modeling capabilities of such Space Weather phenomena include large-scale, global responses of the Earth's upper atmosphere to various inputs from the Sun, but the meso-scale (~50-500 km) structures that are much more dynamic and powerful in the coupled system remain uncharacterized. Their influences are thus far poorly understood. We aim to quantify such structures, particularly auroral flows and streamers, in order to create an empirical model of their size, location, speed, and orientation based on activity level (AL index), season, solar cycle (F10.7), interplanetary magnetic field (IMF) inputs, etc. We present a statistical study of meso-scale flow channels in the nightside auroral oval and polar cap using SuperDARN. These results are used to inform global models such as the Global Ionosphere Thermosphere Model (GITM) in order to evaluate the role of meso-scale disturbances on the fully coupled magnetosphere-ionosphere-thermosphere system. Measuring the ionospheric footpoint of magnetospheric fast flows, our analysis technique from the ground also provides a 2D picture of flows and their characteristics during different activity levels that spacecraft alone cannot.

  15. Automated quantification of aligned collagen for human breast carcinoma prognosis

    Directory of Open Access Journals (Sweden)

    Jeremy S Bredfeldt

    2014-01-01

    Full Text Available Background: Mortality in cancer patients is directly attributable to the ability of cancer cells to metastasize to distant sites from the primary tumor. This migration of tumor cells begins with a remodeling of the local tumor microenvironment, including changes to the extracellular matrix and the recruitment of stromal cells, both of which facilitate invasion of tumor cells into the bloodstream. In breast cancer, it has been proposed that the alignment of collagen fibers surrounding tumor epithelial cells can serve as a quantitative image-based biomarker for survival of invasive ductal carcinoma patients. Specific types of collagen alignment have been identified for their prognostic value and now these tumor associated collagen signatures (TACS are central to several clinical specimen imaging trials. Here, we implement the semi-automated acquisition and analysis of this TACS candidate biomarker and demonstrate a protocol that will allow consistent scoring to be performed throughout large patient cohorts. Methods: Using large field of view high resolution microscopy techniques, image processing and supervised learning methods, we are able to quantify and score features of collagen fiber alignment with respect to adjacent tumor-stromal boundaries. Results: Our semi-automated technique produced scores that have statistically significant correlation with scores generated by a panel of three human observers. In addition, our system generated classification scores that accurately predicted survival in a cohort of 196 breast cancer patients. Feature rank analysis reveals that TACS positive fibers are more well-aligned with each other, are of generally lower density, and terminate within or near groups of epithelial cells at larger angles of interaction. Conclusion: These results demonstrate the utility of a supervised learning protocol for streamlining the analysis of collagen alignment with respect to tumor stromal boundaries.
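
    At its core, TACS-type scoring measures the angle between each fiber's orientation and the local tangent of the tumor-stromal boundary. The Python sketch below illustrates only that interaction-angle arithmetic; the threshold value and the example orientations are placeholders for illustration, not the study's trained classifier or parameters.

      import numpy as np

      def interaction_angle(fiber_angle_deg, boundary_tangent_deg):
          # Acute angle (0-90 degrees) between a fiber and the local boundary tangent.
          diff = abs(fiber_angle_deg - boundary_tangent_deg) % 180.0
          return min(diff, 180.0 - diff)

      def fraction_steep_fibers(fiber_angles, boundary_tangents, threshold_deg=60.0):
          # Fraction of fibers meeting a (hypothetical) steep-angle criterion.
          angles = [interaction_angle(f, b) for f, b in zip(fiber_angles, boundary_tangents)]
          return float(np.mean([a >= threshold_deg for a in angles]))

      # Example with made-up fiber and boundary-tangent orientations (degrees):
      print(fraction_steep_fibers([10, 80, 95, 30], [0, 0, 10, 25]))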

  16. Methods and measurement variance for field estimations of coral colony planar area using underwater photographs and semi-automated image segmentation.

    Science.gov (United States)

    Neal, Benjamin P; Lin, Tsung-Han; Winter, Rivah N; Treibitz, Tali; Beijbom, Oscar; Kriegman, David; Kline, David I; Greg Mitchell, B

    2015-08-01

    Size and growth rates for individual colonies are some of the most essential descriptive parameters for understanding coral communities, which are currently experiencing worldwide declines in health and extent. Accurately measuring coral colony size and changes over multiple years can reveal demographic, growth, or mortality patterns often not apparent from short-term observations and can expose environmental stress responses that may take years to manifest. Describing community size structure can reveal population dynamics patterns, such as periods of failed recruitment or patterns of colony fission, which have implications for the future sustainability of these ecosystems. However, rapidly and non-invasively measuring coral colony sizes in situ remains a difficult task, as three-dimensional underwater digital reconstruction methods are currently not practical for large numbers of colonies. Two-dimensional (2D) planar area measurements from projection of underwater photographs are a practical size proxy, although this method presents operational difficulties in obtaining well-controlled photographs in the highly rugose environment of the coral reef, and requires extensive time for image processing. Here, we present and test the measurement variance for a method of making rapid planar area estimates of small to medium-sized coral colonies using a lightweight monopod image-framing system and a custom semi-automated image segmentation analysis program. This method demonstrated a coefficient of variation of 2.26% for repeated measurements in realistic ocean conditions, a level of error appropriate for rapid, inexpensive field studies of coral size structure, inferring change in colony size over time, or measuring bleaching or disease extent of large numbers of individual colonies.
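
    The repeatability figure quoted (2.26%) is a coefficient of variation over repeated measurements of the same colonies. A minimal Python sketch of that statistic for a single colony is shown below; the exact averaging across colonies used in the study is not reproduced, and the numbers are hypothetical.

      import numpy as np

      def coefficient_of_variation(measurements):
          # CV (%) of repeated planar-area measurements of a single colony.
          m = np.asarray(measurements, dtype=float)
          return 100.0 * m.std(ddof=1) / m.mean()

      # Example: three repeated area estimates (cm^2) of one hypothetical colony
      print(round(coefficient_of_variation([152.0, 149.5, 155.1]), 2))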

  17. Automated reasoning applications to design analysis

    International Nuclear Information System (INIS)

    Stratton, R.C.

    1984-01-01

    Given the necessary relationships and definitions of design functions and components, validation of system incarnation (the physical product of design) and sneak function analysis can be achieved via automated reasoners. The relationships and definitions must define the design specification and incarnation functionally. For the design specification, the hierarchical functional representation is based on physics and engineering principles and bounded by design objectives and constraints. The relationships and definitions of the design incarnation are manifested as element functional definitions, state relationship to functions, functional relationship to direction, element connectivity, and functional hierarchical configuration

  18. Magnetic resonance image segmentation using semi-automated software for quantification of knee articular cartilage - initial evaluation of a technique for paired scans

    International Nuclear Information System (INIS)

    Brem, M.H.; Lang, P.K.; Neumann, G.; Schlechtweg, P.M.; Yoshioka, H.; Pappas, G.; Duryea, J.; Schneider, E.; Jackson, R.; Yu, J.; Eaton, C.B.; Hennig, F.F.

    2009-01-01

    Software-based image analysis is important for studies of cartilage changes in knee osteoarthritis (OA). This study describes an evaluation of a semi-automated cartilage segmentation software tool capable of quantifying paired images for potential use in longitudinal studies of knee OA. We describe the methodology behind the analysis and demonstrate its use by determination of test-retest analysis precision of duplicate knee magnetic resonance imaging (MRI) data sets. Test-retest knee MR images of 12 subjects with a range of knee health were evaluated from the Osteoarthritis Initiative (OAI) pilot MR study. Each subject was removed from the magnet between the two scans. The 3D DESS (sagittal, 0.456 mm x 0.365 mm, 0.7 mm slice thickness, TR 16.5 ms, TE 4.7 ms) images were obtained on a 3-T Siemens Trio MR system with a USA Instruments quadrature transmit-receive extremity coil. Segmentation of one 3D-image series was first performed and then the corresponding retest series was segmented by viewing both image series concurrently in two adjacent windows. After manual registration of the series, the first segmentation cartilage outline served as an initial estimate for the second segmentation. We evaluated morphometric measures of the bone and cartilage surface area (tAB and AC), cartilage volume (VC), and mean thickness (ThC.me) for medial/lateral tibia (MT/LT), total femur (F) and patella (P). Test-retest reproducibility was assessed using the root-mean square coefficient of variation (RMS CV%). For the paired analyses, RMS CV % ranged from 0.9% to 1.2% for VC, from 0.3% to 0.7% for AC, from 0.6% to 2.7% for tAB and 0.8% to 1.5% for ThC.me. Paired image analysis improved the measurement precision of cartilage segmentation. Our results are in agreement with other publications supporting the use of paired analysis for longitudinal studies of knee OA. (orig.)

  19. Magnetic resonance image segmentation using semi-automated software for quantification of knee articular cartilage - initial evaluation of a technique for paired scans

    Energy Technology Data Exchange (ETDEWEB)

    Brem, M.H. [Brigham and Women' s Hospital, Department of Radiology, Boston, MA (United States); Friedrich-Alexander-University Erlangen Nurenberg, Division of Orthopaedic and Trauma Surgery, Department of Surgery, Erlangen (Germany); Lang, P.K.; Neumann, G.; Schlechtweg, P.M.; Yoshioka, H.; Pappas, G.; Duryea, J. [Brigham and Women' s Hospital, Department of Radiology, Boston, MA (United States); Schneider, E. [LLC, SciTrials, Rocky River, OH (United States); Cleveland Clinic, Imaging Institute, Cleveland, OH (United States); Jackson, R.; Yu, J. [Ohio State University, Diabetes and Metabolism and Radiology, Department of Endocrinology, Columbus, OH (United States); Eaton, C.B. [Center for Primary Care and Prevention and the Warren Alpert Medical School of Brown University, Memorial Hospital of Rhode Island, Providence, RI (United States); Hennig, F.F. [Friedrich-Alexander-University Erlangen Nurenberg, Division of Orthopaedic and Trauma Surgery, Department of Surgery, Erlangen (Germany)

    2009-05-15

    Software-based image analysis is important for studies of cartilage changes in knee osteoarthritis (OA). This study describes an evaluation of a semi-automated cartilage segmentation software tool capable of quantifying paired images for potential use in longitudinal studies of knee OA. We describe the methodology behind the analysis and demonstrate its use by determination of test-retest analysis precision of duplicate knee magnetic resonance imaging (MRI) data sets. Test-retest knee MR images of 12 subjects with a range of knee health were evaluated from the Osteoarthritis Initiative (OAI) pilot MR study. Each subject was removed from the magnet between the two scans. The 3D DESS (sagittal, 0.456 mm x 0.365 mm, 0.7 mm slice thickness, TR 16.5 ms, TE 4.7 ms) images were obtained on a 3-T Siemens Trio MR system with a USA Instruments quadrature transmit-receive extremity coil. Segmentation of one 3D-image series was first performed and then the corresponding retest series was segmented by viewing both image series concurrently in two adjacent windows. After manual registration of the series, the first segmentation cartilage outline served as an initial estimate for the second segmentation. We evaluated morphometric measures of the bone and cartilage surface area (tAB and AC), cartilage volume (VC), and mean thickness (ThC.me) for medial/lateral tibia (MT/LT), total femur (F) and patella (P). Test-retest reproducibility was assessed using the root-mean square coefficient of variation (RMS CV%). For the paired analyses, RMS CV % ranged from 0.9% to 1.2% for VC, from 0.3% to 0.7% for AC, from 0.6% to 2.7% for tAB and 0.8% to 1.5% for ThC.me. Paired image analysis improved the measurement precision of cartilage segmentation. Our results are in agreement with other publications supporting the use of paired analysis for longitudinal studies of knee OA. (orig.)
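
    The RMS CV% reported in both records above summarizes test-retest precision: each subject contributes a CV from its scan-rescan pair, and the subject-level CVs are combined as a root mean square. The Python sketch below follows that reading with made-up numbers; it is not the analysis code used for the OAI data.

      import numpy as np

      def rms_cv_percent(pairs):
          # Root-mean-square CV (%) over subjects, each with a (test, retest) value pair.
          cvs = []
          for test, retest in pairs:
              m = np.array([test, retest], dtype=float)
              cvs.append(m.std(ddof=1) / m.mean())
          return 100.0 * float(np.sqrt(np.mean(np.square(cvs))))

      # Example: cartilage volumes (mm^3) for three hypothetical subjects
      print(round(rms_cv_percent([(1510, 1498), (1330, 1342), (1625, 1611)]), 2))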

  20. OpenComet: An automated tool for comet assay image analysis

    Directory of Open Access Journals (Sweden)

    Benjamin M. Gyori

    2014-01-01

    Full Text Available Reactive species such as free radicals are constantly generated in vivo and DNA is the most important target of oxidative stress. Oxidative DNA damage is used as a predictive biomarker to monitor the risk of development of many diseases. The comet assay is widely used for measuring oxidative DNA damage at a single cell level. The analysis of comet assay output images, however, poses considerable challenges. Commercial software is costly and restrictive, while free software generally requires laborious manual tagging of cells. This paper presents OpenComet, an open-source software tool providing automated analysis of comet assay images. It uses a novel and robust method for finding comets based on geometric shape attributes and segmenting the comet heads through image intensity profile analysis. Due to automation, OpenComet is more accurate, less prone to human bias, and faster than manual analysis. A live analysis functionality also allows users to analyze images captured directly from a microscope. We have validated OpenComet on both alkaline and neutral comet assay images as well as sample images from existing software packages. Our results show that OpenComet achieves high accuracy with significantly reduced analysis time.
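
    Once a comet's head and tail have been segmented from the intensity profile, standard readouts are percent DNA in the tail and a tail moment. The Python sketch below shows only that final arithmetic from already-segmented intensities, with hypothetical inputs; it is not OpenComet's implementation.

      def comet_metrics(head_intensity, tail_intensity, tail_length):
          # Percent tail DNA and (extent) tail moment from segmented intensities.
          total = head_intensity + tail_intensity
          pct_tail_dna = 100.0 * tail_intensity / total
          tail_moment = tail_length * pct_tail_dna / 100.0
          return pct_tail_dna, tail_moment

      # Example with made-up integrated intensities and a tail length in pixels
      print(comet_metrics(head_intensity=8.0e5, tail_intensity=2.0e5, tail_length=40))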

  1. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  2. Three-dimensional Mesoscale Simulations of Detonation Initiation in Energetic Materials with Density-based Kinetics

    Science.gov (United States)

    Jackson, Thomas; Jost, A. M.; Zhang, Ju; Sridharan, P.; Amadio, G.

    2017-06-01

    In this work we present three-dimensional mesoscale simulations of detonation initiation in energetic materials. We solve the reactive Euler equations, with the energy equation augmented by a power deposition term. The reaction rate at the mesoscale is modelled using a density-based kinetics scheme, adapted from standard Ignition and Growth models. The deposition term is based on previous results of simulations of pore collapse at the microscale, modelled at the mesoscale as hot-spots. We carry out three-dimensional mesoscale simulations of random packs of HMX crystals in a binder, and show that the transition between no-detonation and detonation depends on the number density of the hot-spots, the initial radius of the hot-spot, the post-shock pressure of an imposed shock, and the amplitude of the power deposition term. The trend observed in experiments, of transition at lower imposed-shock pressures for larger pore number densities, is reproduced. Initial attempts to improve the agreement between the simulations and experiments through calibration of various parameters will also be made.
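
    As a rough schematic of the balance described (reactive Euler energy equation augmented by a hot-spot power deposition term), one may write the following LaTeX sketch; the notation and the generic rate closure are assumptions for illustration, not the paper's equations:

      \frac{\partial (\rho E)}{\partial t} + \nabla \cdot \left[ (\rho E + p)\,\mathbf{u} \right]
        \;=\; \rho\, q\, \dot{\lambda} \;+\; \dot{Q}_{\mathrm{dep}},
      \qquad
      \dot{\lambda} \;=\; f(\rho, p, \lambda),

    where E is the total specific energy, q the heat of reaction, \lambda the reaction progress variable evolved by the density-based kinetics, and \dot{Q}_{\mathrm{dep}} the imposed hot-spot power deposition.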

  3. An intercomparison of several diagnostic meteorological processors used in mesoscale air quality modeling

    Energy Technology Data Exchange (ETDEWEB)

    Vimont, J.C. [National Park Service, Lakewood, CO (United States); Scire, J.S. [Sigma Research Corp., Concord, MA (United States)

    1994-12-31

    A major component, and area of uncertainty, in mesoscale air quality modeling is the specification of the meteorological fields which affect the transport and dispersion of pollutants. Various options are available for estimating the wind and mixing depth fields over a mesoscale domain. Estimates of the wind field can be obtained from spatial and temporal interpolation of available observations or from diagnostic meteorological models, which estimate a meteorological field from available data and adjust those fields based on parameterizations of physical processes. A major weakness of these processors is their dependence on spatially and temporally sparse input data, particularly upper air data. These problems are exacerbated in regions of complex terrain and along the shorelines of large bodies of water. Similarly, the estimation of mixing depth is also reliant upon sparse observations and the parameterization of the convective and mechanical processes. The meteorological processors examined in this analysis were developed to drive different Lagrangian puff models. This paper describes the algorithms these processors use to estimate the wind fields and mixing depth fields.

  4. Applications of Automation Methods for Nonlinear Fracture Test Analysis

    Science.gov (United States)

    Allen, Phillip A.; Wells, Douglas N.

    2013-01-01

    Using automated and standardized computer tools to calculate the pertinent test result values has several advantages such as: 1. allowing high-fidelity solutions to complex nonlinear phenomena that would be impractical to express in written equation form, 2. eliminating errors associated with the interpretation and programming of analysis procedures from the text of test standards, 3. lessening the need for expertise in the areas of solid mechanics, fracture mechanics, numerical methods, and/or finite element modeling, to achieve sound results, 4. providing one computer tool and/or one set of solutions for all users for a more "standardized" answer. In summary, this approach allows a non-expert with rudimentary training to get the best practical solution based on the latest understanding with minimum difficulty. Other existing ASTM standards that cover complicated phenomena use standard computer programs: 1. ASTM C1340/C1340M-10 - Standard Practice for Estimation of Heat Gain or Loss Through Ceilings Under Attics Containing Radiant Barriers by Use of a Computer Program 2. ASTM F 2815 - Standard Practice for Chemical Permeation through Protective Clothing Materials: Testing Data Analysis by Use of a Computer Program 3. ASTM E2807 - Standard Specification for 3D Imaging Data Exchange, Version 1.0. The verification, validation, and round-robin processes required of a computer tool closely parallel the methods that are used to ensure the solution validity for equations included in a test standard. The use of automated analysis tools allows the creation and practical implementation of advanced fracture mechanics test standards that capture the physics of a nonlinear fracture mechanics problem without adding undue burden or expense to the user. The presented approach forms a bridge between the equation-based fracture testing standards of today and the next generation of standards solving complex problems through analysis automation.

  5. Semi-on-line analysis for fast and precise monitoring of bioreaction processes

    DEFF Research Database (Denmark)

    Christensen, L.H.; Marcher, J.; Schulze, Ulrik

    1996-01-01

    Monitoring of substrates and products during fermentation processes can be achieved either by on-line, in situ sensors or by semi-on-line analysis consisting of an automatic sampling step followed by an ex situ analysis of the retrieved sample. The potential risk of introducing time delays...

  6. Micro- and meso-scale effects of forested terrain

    DEFF Research Database (Denmark)

    Dellwik, Ebba; Mann, Jakob; Sogachev, Andrey

    2011-01-01

    The height and rotor diameter of modern wind turbines are so extensive that the wind conditions they encounter often are well above the surface layer, where traditionally it is assumed that wind direction and turbulent fluxes are constant with respect to height, if the surface is homogenous. Deviations from the requirement of homogeneity are often the focus of micro-scale studies in forested areas. Yet, to explain the wind climate in the relevant height range for turbines, it is necessary to also account for the length scales that are important parameters for the meso-scale flow. These length scales are the height of the planetary boundary layer and the Monin-Obukhov length, which both are related to the energy balance of the surface. Examples of important micro- and meso-scale effects of forested terrain are shown using data and model results from recent and ongoing experiments. For micro...

  7. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  8. A dorsolateral prefrontal cortex semi-automatic segmenter

    Science.gov (United States)

    Al-Hakim, Ramsey; Fallon, James; Nain, Delphine; Melonakos, John; Tannenbaum, Allen

    2006-03-01

    Structural, functional, and clinical studies in schizophrenia have, for several decades, consistently implicated dysfunction of the prefrontal cortex in the etiology of the disease. Functional and structural imaging studies, combined with clinical, psychometric, and genetic analyses in schizophrenia have confirmed the key roles played by the prefrontal cortex and closely linked "prefrontal system" structures such as the striatum, amygdala, mediodorsal thalamus, substantia nigra-ventral tegmental area, and anterior cingulate cortices. The nodal structure of the prefrontal system circuit is the dorsal lateral prefrontal cortex (DLPFC), or Brodmann area 46, which also appears to be the most commonly studied and cited brain area with respect to schizophrenia. 1, 2, 3, 4 In 1986, Weinberger et al. tied cerebral blood flow in the DLPFC to schizophrenia.1 In 2001, Perlstein et al. demonstrated that DLPFC activation is essential for working memory tasks commonly deficient in schizophrenia. 2 More recently, groups have linked morphological changes due to gene deletion and increased DLPFC glutamate concentration to schizophrenia. 3, 4 Despite the experimental and clinical focus on the DLPFC in structural and functional imaging, the variability of the location of this area, differences in opinion on exactly what constitutes DLPFC, and inherent difficulties in segmenting this highly convoluted cortical region have contributed to a lack of widely used standards for manual or semi-automated segmentation programs. Given these implications, we developed a semi-automatic tool to segment the DLPFC from brain MRI scans in a reproducible way to conduct further morphological and statistical studies. The segmenter is based on expert neuroanatomist rules (Fallon-Kindermann rules), inspired by cytoarchitectonic data and reconstructions presented by Rajkowska and Goldman-Rakic. 5 It is semi-automated to provide essential user interactivity. We present our results and provide details on

  9. Three-dimensional mesoscale heterostructures of ZnO nanowire arrays epitaxially grown on CuGaO2 nanoplates as individual diodes.

    Science.gov (United States)

    Forticaux, Audrey; Hacialioglu, Salih; DeGrave, John P; Dziedzic, Rafal; Jin, Song

    2013-09-24

    We report a three-dimensional (3D) mesoscale heterostructure composed of one-dimensional (1D) nanowire (NW) arrays epitaxially grown on two-dimensional (2D) nanoplates. Specifically, three facile syntheses are developed to assemble vertical ZnO NWs on CuGaO2 (CGO) nanoplates in mild aqueous solution conditions. The key to the successful 3D mesoscale integration is the preferential nucleation and heteroepitaxial growth of ZnO NWs on the CGO nanoplates. Using transmission electron microscopy, heteroepitaxy was found between the basal planes of CGO nanoplates and ZnO NWs, which are their respective (001) crystallographic planes, by the observation of a hexagonal Moiré fringes pattern resulting from the slight mismatch between the c planes of ZnO and CGO. Careful analysis shows that this pattern can be described by a hexagonal supercell with a lattice parameter of almost exactly 11 and 12 times the a lattice constants for ZnO and CGO, respectively. The electrical properties of the individual CGO-ZnO mesoscale heterostructures were measured using a current-sensing atomic force microscopy setup to confirm the rectifying p-n diode behavior expected from the band alignment of p-type CGO and n-type ZnO wide band gap semiconductors. These 3D mesoscale heterostructures represent a new motif in nanoassembly for the integration of nanomaterials into functional devices with potential applications in electronics, photonics, and energy.

  10. 40 CFR 13.19 - Analysis of costs; automation; prevention of overpayments, delinquencies or defaults.

    Science.gov (United States)

    2010-07-01

    40 CFR 13.19, Protection of Environment (2010-07-01 edition): Analysis of costs; automation; prevention of overpayments, delinquencies or defaults. (a) The Administrator may periodically...

  11. Engineering Mathematical Analysis Method for Productivity Rate in Linear Arrangement Serial Structure Automated Flow Assembly Line

    Directory of Open Access Journals (Sweden)

    Tan Chan Sin

    2015-01-01

    Full Text Available Productivity rate (Q), or production rate, is one of the important indicators that industrial engineers use to improve a production or assembly line and its output of finished goods. Mathematical and statistical analysis methods are required to give a visual overview of the failure factors affecting the productivity rate and to guide further improvement within the production line, especially for automated flow lines, which are complicated. A mathematical model of the productivity rate for a linear-arrangement, serial-structure automated flow line with different failure rates and bottleneck machining time parameters forms the basic model for this productivity analysis. This paper presents an engineering mathematical analysis method applied in an automotive company in Malaysia that operates an automated flow assembly line in final assembly to produce motorcycles. The DCAS engineering and mathematical analysis method, which consists of four stages known as data collection, calculation and comparison, analysis, and sustainable improvement, is used to analyze productivity in the automated flow assembly line based on the particular mathematical model. The variety of failure rates that cause loss of productivity and the bottleneck machining time are quantified explicitly, and a sustainable solution for productivity improvement of this final assembly automated flow line is presented.
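
    The abstract above does not reproduce the DCAS model itself. As a rough, hypothetical illustration of the kind of availability-based productivity calculation that such serial-line models build on, the Python sketch below estimates a production rate Q from a bottleneck cycle time, per-station failure rates, and a mean repair time; all names and numbers are invented for illustration.

        import numpy as np

        def productivity_rate(cycle_time_min, failure_rates_per_min, mean_repair_min):
            """Generic availability-based estimate of production rate Q (parts/min).

            Q = (1 / cycle_time) * availability, with availability approximated as
            1 / (1 + mean_repair_time * sum of station failure rates).
            Simplified stand-in, not the DCAS model from the paper.
            """
            availability = 1.0 / (1.0 + mean_repair_min * np.sum(failure_rates_per_min))
            return (1.0 / cycle_time_min) * availability

        # Hypothetical 5-station serial line: bottleneck cycle time 1.2 min,
        # station failure rates in failures/min, mean time to repair 15 min.
        q = productivity_rate(1.2, [1e-4, 2e-4, 5e-5, 3e-4, 1e-4], 15.0)
        print(f"Estimated productivity rate: {q:.3f} parts/min")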

  12. Automated Tracking of Cell Migration with Rapid Data Analysis.

    Science.gov (United States)

    DuChez, Brian J

    2017-09-01

    Cell migration is essential for many biological processes including development, wound healing, and metastasis. However, studying cell migration often requires the time-consuming and labor-intensive task of manually tracking cells. To accelerate the task of obtaining coordinate positions of migrating cells, we have developed a graphical user interface (GUI) capable of automating the tracking of fluorescently labeled nuclei. This GUI provides an intuitive user interface that makes automated tracking accessible to researchers with no image-processing experience or familiarity with particle-tracking approaches. Using this GUI, users can interactively determine a minimum of four parameters to identify fluorescently labeled cells and automate acquisition of cell trajectories. Additional features allow for batch processing of numerous time-lapse images, curation of unwanted tracks, and subsequent statistical analysis of tracked cells. Statistical outputs allow users to evaluate migratory phenotypes, including cell speed, distance, displacement, and persistence, as well as measures of directional movement, such as forward migration index (FMI) and angular displacement. © 2017 by John Wiley & Sons, Inc. Copyright © 2017 John Wiley & Sons, Inc.
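
    The migration statistics named in the abstract (speed, distance, displacement, persistence, forward migration index) can all be derived from the tracked nuclear coordinates. The sketch below is not the published GUI; it is a minimal Python illustration using a hypothetical track.

        import numpy as np

        def migration_stats(xy, dt_min, forward_axis=1):
            """Basic migration metrics from an (N, 2) array of nuclear centroids.

            persistence = net displacement / total path length
            FMI (forward migration index) = net displacement along the chosen
            axis / total path length. Simplified sketch, not the published tool.
            """
            steps = np.diff(xy, axis=0)                 # displacement per frame
            step_len = np.linalg.norm(steps, axis=1)
            path_len = step_len.sum()                   # total distance travelled
            net_vec = xy[-1] - xy[0]
            net_disp = np.linalg.norm(net_vec)
            return {
                "mean_speed": path_len / (dt_min * len(steps)),
                "distance": path_len,
                "displacement": net_disp,
                "persistence": net_disp / path_len if path_len > 0 else 0.0,
                "fmi": net_vec[forward_axis] / path_len if path_len > 0 else 0.0,
            }

        # Hypothetical track: x/y centroid positions (micrometres) at 10-min intervals.
        track = np.array([[0, 0], [3, 1], [5, 4], [9, 6], [12, 10]], float)
        print(migration_stats(track, dt_min=10.0))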

  13. Mesoscale modeling: solving complex flows in biology and biotechnology.

    Science.gov (United States)

    Mills, Zachary Grant; Mao, Wenbin; Alexeev, Alexander

    2013-07-01

    Fluids are involved in practically all physiological activities of living organisms. However, biological and biorelated flows are hard to analyze due to the inherent combination of interdependent effects and processes that occur on a multitude of spatial and temporal scales. Recent advances in mesoscale simulations enable researchers to tackle problems that are central for the understanding of such flows. Furthermore, computational modeling effectively facilitates the development of novel therapeutic approaches. Among other methods, dissipative particle dynamics and the lattice Boltzmann method have become increasingly popular during recent years due to their ability to solve a large variety of problems. In this review, we discuss recent applications of these mesoscale methods to several fluid-related problems in medicine, bioengineering, and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  14. A mini-max principle for drift waves and mesoscale fluctuations

    International Nuclear Information System (INIS)

    Itoh, S-I; Itoh, K

    2011-01-01

    A mini-max principle for the system of the drift waves and mesoscale fluctuations (e.g. zonal flows) is studied. For the system of model equations a Lyapunov function is constructed, which takes the minimum when the stationary state is realized. The dynamical evolution describes the access to the state that is realized. The competition between different mesoscale fluctuations is explained. The origins of irreversibility that cause an approach to the stationary state are discussed. A selection rule among fluctuations is derived, and conditions, under which different kinds of mesoscale fluctuations coexist, are investigated. An analogy of this minimum principle to the principle of 'minimum Helmholtz free energy' in thermal equilibrium is shown.

  15. Automated Reconstruction of Historic Roof Structures from Point Clouds - Development and Examples

    Science.gov (United States)

    Pöchtrager, M.; Styhler-Aydın, G.; Döring-Williams, M.; Pfeifer, N.

    2017-08-01

    The analysis of historic roof constructions is an important task for planning the adaptive reuse of buildings or for maintenance and restoration issues. Current approaches to modeling roof constructions consist of several consecutive operations that need to be done manually or using semi-automatic routines. To increase efficiency and allow the focus to be on analysis rather than on data processing, a set of methods was developed for the fully automated analysis of the roof constructions, including integration of architectural and structural modeling. Terrestrial laser scanning permits high-detail surveying of large-scale structures within a short time. Whereas 3-D laser scan data consist of millions of single points on the object surface, we need a geometric description of structural elements in order to obtain a structural model consisting of beam axis and connections. Preliminary results showed that the developed methods work well for beams in flawless condition with a quadratic cross section and no bending. Deformations or damages such as cracks and cuts on the wooden beams can lead to incomplete representations in the model. Overall, a high degree of automation was achieved.

  16. Environments of Long-Lived Mesoscale Convective Systems Over the Central United States in Convection Permitting Climate Simulations: Long-Lived Mesoscale Convective Systems

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Qing [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA; Houze, Robert A. [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA; Department of Atmospheric Sciences, University of Washington, Seattle WA USA; Leung, L. Ruby [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA; Feng, Zhe [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA

    2017-12-27

    Continental-scale convection-permitting simulations of the warm seasons of 2011 and 2012 reproduce realistic structure and frequency distribution of lifetime and event mean precipitation of mesoscale convective systems (MCSs) over the central United States. Analysis is performed to determine the environmental conditions conducive to generating the longest-lived MCSs and their subsequent interactions. The simulations show that MCSs systematically form over the Great Plains ahead of a trough in the westerlies in combination with an enhanced low-level jet from the Gulf of Mexico. These environmental properties at the time of storm initiation are most prominent for the MCSs that persist for the longest times. Systems reaching 9 h or more in lifetime exhibit feedback to the environment conditions through diabatic heating in the MCS stratiform regions. As a result, the parent synoptic-scale wave is strengthened as a divergent perturbation develops over the MCS at high levels, while a cyclonic circulation perturbation develops in the midlevels of the trough, where the vertical gradient of heating in the MCS region is maximized. The quasi-balanced mesoscale vortex helps to maintain the MCS over a long period of time by feeding dry, cool air into the environment at the rear of the MCS region, so that the MCS can draw in air that increases the evaporative cooling that helps maintain the MCS. At lower levels the south-southeasterly jet of warm moist air from the Gulf is enhanced in the presence of the synoptic-scale wave. That moisture supply is essential to the continued redevelopment of the MCS.

  17. Failure mode and effects analysis of software-based automation systems

    International Nuclear Information System (INIS)

    Haapanen, P.; Helminen, A.

    2002-08-01

    Failure mode and effects analysis (FMEA) is one of the well-known analysis methods having an established position in the traditional reliability analysis. The purpose of FMEA is to identify possible failure modes of the system components, evaluate their influences on system behaviour and propose proper countermeasures to suppress these effects. The generic nature of FMEA has enabled its wide use in various branches of industry reaching from business management to the design of spaceships. The popularity and diverse use of the analysis method has led to multiple interpretations, practices and standards presenting the same analysis method. FMEA is well understood at the systems and hardware levels, where the potential failure modes usually are known and the task is to analyse their effects on system behaviour. Nowadays, more and more system functions are realised on software level, which has aroused the urge to apply the FMEA methodology also on software based systems. Software failure modes generally are unknown - 'software modules do not fail, they only display incorrect behaviour' - and depend on dynamic behaviour of the application. These facts set special requirements on the FMEA of software based systems and make it difficult to realise. In this report the failure mode and effects analysis is studied for the use of reliability analysis of software-based systems. More precisely, the target system of FMEA is defined to be a safety-critical software-based automation application in a nuclear power plant, implemented on an industrial automation system platform. Through a literature study the report tries to clarify the intriguing questions related to the practical use of software failure mode and effects analysis. The study is a part of the research project 'Programmable Automation System Safety Integrity assessment (PASSI)', belonging to the Finnish Nuclear Safety Research Programme (FINNUS, 1999-2002). In the project various safety assessment methods and tools for

  18. Trends and applications of integrated automated ultra-trace sample handling and analysis (T9)

    International Nuclear Information System (INIS)

    Kingston, H.M.S.; Ye Han; Stewart, L.; Link, D.

    2002-01-01

    Full text: Automated analysis, sub-ppt detection limits, and the trend toward speciated analysis (rather than just elemental analysis) force the innovation of sophisticated and integrated sample preparation and analysis techniques. Traditionally, the ability to handle samples at ppt and sub-ppt levels has been limited to clean laboratories and special sample handling techniques and equipment. The world of sample handling has passed a threshold where older or 'old fashioned' traditional techniques no longer provide the ability to see the sample due to the influence of the analytical blank and the fragile nature of the analyte. When samples require decomposition, extraction, separation and manipulation, application of newer more sophisticated sample handling systems are emerging that enable ultra-trace analysis and species manipulation. In addition, new instrumentation has emerged which integrate sample preparation and analysis to enable on-line near real-time analysis. Examples of those newer sample-handling methods will be discussed and current examples provided as alternatives to traditional sample handling. Two new techniques applying ultra-trace microwave energy enhanced sample handling have been developed that permit sample separation and refinement while performing species manipulation during decomposition. A demonstration, that applies to semiconductor materials, will be presented. Next, a new approach to the old problem of sample evaporation without losses will be demonstrated that is capable of retaining all elements and species tested. Both of those methods require microwave energy manipulation in specialized systems and are not accessible through convection, conduction, or other traditional energy applications. A new automated integrated method for handling samples for ultra-trace analysis has been developed. An on-line near real-time measurement system will be described that enables many new automated sample handling and measurement capabilities. This

  19. Video and accelerometer-based motion analysis for automated surgical skills assessment.

    Science.gov (United States)

    Zia, Aneeq; Sharma, Yachna; Bettadapura, Vinay; Sarin, Eric L; Essa, Irfan

    2018-03-01

    Basic surgical skills of suturing and knot tying are an essential part of medical training. Having an automated system for surgical skills assessment could help save experts time and improve training efficiency. There have been some recent attempts at automated surgical skills assessment using either video analysis or acceleration data. In this paper, we present a novel approach for automated assessment of OSATS-like surgical skills and provide an analysis of different features on multi-modal data (video and accelerometer data). We conduct a large study for basic surgical skill assessment on a dataset that contained video and accelerometer data for suturing and knot-tying tasks. We introduce "entropy-based" features-approximate entropy and cross-approximate entropy, which quantify the amount of predictability and regularity of fluctuations in time series data. The proposed features are compared to existing methods of Sequential Motion Texture, Discrete Cosine Transform and Discrete Fourier Transform, for surgical skills assessment. We report average performance of different features across all applicable OSATS-like criteria for suturing and knot-tying tasks. Our analysis shows that the proposed entropy-based features outperform previous state-of-the-art methods using video data, achieving average classification accuracies of 95.1 and 92.2% for suturing and knot tying, respectively. For accelerometer data, our method performs better for suturing achieving 86.8% average accuracy. We also show that fusion of video and acceleration features can improve overall performance for skill assessment. Automated surgical skills assessment can be achieved with high accuracy using the proposed entropy features. Such a system can significantly improve the efficiency of surgical training in medical schools and teaching hospitals.
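
    Approximate entropy, the core "entropy-based" feature described above, quantifies how often patterns of length m that are close within a tolerance r remain close when extended to length m + 1. A minimal Python implementation of the textbook definition (not the authors' feature pipeline) is sketched below.

        import numpy as np

        def approximate_entropy(u, m=2, r=None):
            """Approximate entropy (ApEn) of a 1-D time series.

            ApEn(m, r) = phi(m) - phi(m + 1), where phi(m) is the average log
            fraction of length-m template vectors that stay within tolerance r
            (Chebyshev distance). Textbook formulation, not the authors' code.
            """
            u = np.asarray(u, float)
            if r is None:
                r = 0.2 * u.std()          # commonly used default tolerance

            def phi(m):
                n = len(u) - m + 1
                templates = np.array([u[i:i + m] for i in range(n)])
                # Chebyshev distance between every pair of templates
                dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
                counts = (dist <= r).mean(axis=1)       # includes self-matches
                return np.log(counts).mean()

            return phi(m) - phi(m + 1)

        rng = np.random.default_rng(0)
        regular = np.sin(np.linspace(0, 20 * np.pi, 400))
        noisy = rng.normal(size=400)
        print(approximate_entropy(regular), approximate_entropy(noisy))  # regular << noisy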

  20. Automated striatal uptake analysis of 18F-FDOPA PET images applied to Parkinson's disease patients

    International Nuclear Information System (INIS)

    Chang Icheng; Lue Kunhan; Hsieh Hungjen; Liu Shuhsin; Kao, Chinhao K.

    2011-01-01

    6-[18F]Fluoro-L-DOPA (FDOPA) is a radiopharmaceutical valuable for assessing the presynaptic dopaminergic function when used with positron emission tomography (PET). More specifically, the striatal-to-occipital ratio (SOR) of FDOPA uptake images has been extensively used as a quantitative parameter in these PET studies. Our aim was to develop an easy, automated method capable of performing objective analysis of SOR in FDOPA PET images of Parkinson's disease (PD) patients. Brain images from FDOPA PET studies of 21 patients with PD and 6 healthy subjects were included in our automated striatal analyses. Images of each individual were spatially normalized into an FDOPA template. Subsequently, the image slice with the highest level of basal ganglia activity was chosen among the series of normalized images. Also, the immediate preceding and following slices of the chosen image were then selected. Finally, the summation of these three images was used to quantify and calculate the SOR values. The results obtained by automated analysis were compared with manual analysis by a trained and experienced image processing technologist. The SOR values obtained from the automated analysis had a good agreement and high correlation with manual analysis. The differences in caudate, putamen, and striatum were -0.023, -0.029, and -0.025, respectively; correlation coefficients 0.961, 0.957, and 0.972, respectively. We have successfully developed a method for automated striatal uptake analysis of FDOPA PET images. There was no significant difference between the SOR values obtained from this method and using manual analysis. Yet it is an unbiased time-saving and cost-effective program and easy to implement on a personal computer. (author)
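
    For illustration, the snippet below computes a striatal-to-occipital ratio from mean counts in predefined regions of interest of a summed PET volume. The masks and the exact SOR convention (plain ratio versus background-subtracted ratio) are assumptions; the published method uses spatial normalization to an FDOPA template rather than hand-built masks.

        import numpy as np

        def striatal_to_occipital_ratio(volume, striatum_mask, occipital_mask,
                                        background_subtracted=True):
            """SOR from a summed FDOPA PET image and two boolean ROI masks.

            If background_subtracted is True the ratio is
            (mean striatal - mean occipital) / mean occipital; otherwise it is
            the plain mean ratio. Masks and convention are assumptions, not the
            authors' template-based pipeline.
            """
            s = volume[striatum_mask].mean()
            o = volume[occipital_mask].mean()
            return (s - o) / o if background_subtracted else s / o

        # Hypothetical 3-slice summed image and toy masks.
        img = np.random.default_rng(1).uniform(50, 100, size=(3, 64, 64))
        striatum = np.zeros_like(img, bool); striatum[:, 28:36, 20:30] = True
        occipital = np.zeros_like(img, bool); occipital[:, 50:60, 25:40] = True
        print(striatal_to_occipital_ratio(img, striatum, occipital))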

  1. Semi-supervised consensus clustering for gene expression data analysis

    OpenAIRE

    Wang, Yunli; Pan, Youlian

    2014-01-01

    Background Simple clustering methods such as hierarchical clustering and k-means are widely used for gene expression data analysis; but they are unable to deal with noise and high dimensionality associated with the microarray gene expression data. Consensus clustering appears to improve the robustness and quality of clustering results. Incorporating prior knowledge in clustering process (semi-supervised clustering) has been shown to improve the consistency between the data partitioning and do...

  2. Manual versus Automated Narrative Analysis of Agrammatic Production Patterns: The Northwestern Narrative Language Analysis and Computerized Language Analysis

    Science.gov (United States)

    Hsu, Chien-Ju; Thompson, Cynthia K.

    2018-01-01

    Purpose: The purpose of this study is to compare the outcomes of the manually coded Northwestern Narrative Language Analysis (NNLA) system, which was developed for characterizing agrammatic production patterns, and the automated Computerized Language Analysis (CLAN) system, which has recently been adopted to analyze speech samples of individuals…

  3. Semi-automated delineation of breast cancer tumors and subsequent materialization using three-dimensional printing (rapid prototyping).

    Science.gov (United States)

    Schulz-Wendtland, Rüdiger; Harz, Markus; Meier-Meitinger, Martina; Brehm, Barbara; Wacker, Till; Hahn, Horst K; Wagner, Florian; Wittenberg, Thomas; Beckmann, Matthias W; Uder, Michael; Fasching, Peter A; Emons, Julius

    2017-03-01

    Three-dimensional (3D) printing has become widely available, and a few cases of its use in clinical practice have been described. The aim of this study was to explore facilities for the semi-automated delineation of breast cancer tumors and to assess the feasibility of 3D printing of breast cancer tumors. In a case series of five patients, different 3D imaging methods-magnetic resonance imaging (MRI), digital breast tomosynthesis (DBT), and 3D ultrasound-were used to capture 3D data for breast cancer tumors. The volumes of the breast tumors were calculated to assess the comparability of the breast tumor models, and the MRI information was used to render models on a commercially available 3D printer to materialize the tumors. The tumor volumes calculated from the different 3D methods appeared to be comparable. Tumor models with volumes between 325 mm³ and 7,770 mm³ were printed and compared with the models rendered from MRI. The materialization of the tumors reflected the computer models of them. 3D printing (rapid prototyping) appears to be feasible. Scenarios for the clinical use of the technology might include presenting the model to the surgeon to provide a better understanding of the tumor's spatial characteristics in the breast, in order to improve decision-making in relation to neoadjuvant chemotherapy or surgical approaches. J. Surg. Oncol. 2017;115:238-242. © 2016 Wiley Periodicals, Inc.

  4. A semi-spring and semi-edge combined contact model in CDEM and its application to analysis of Jiweishan landslide

    Directory of Open Access Journals (Sweden)

    Chun Feng

    2014-02-01

    Full Text Available Continuum-based discrete element method (CDEM) is an explicit numerical method used for simulation of progressive failure of a geological body. To improve the efficiency of contact detection and simplify the calculation steps for contact forces, semi-spring and semi-edge are introduced in calculation. Semi-spring is derived from block vertex, and formed by indenting the block vertex into each face (24 semi-springs for a hexahedral element). The formation process of semi-edge is the same as that of semi-spring (24 semi-edges for a hexahedral element). Based on the semi-springs and semi-edges, a new type of combined contact model is presented. According to this model, six contact types could be reduced to two, i.e. the semi-spring target-face contact and the semi-edge target-edge contact. By the combined model, the contact force could be calculated directly (the information of the contact type is not necessary), and the failure judgment could be executed in a straightforward way (each semi-spring and semi-edge owns its characteristic area). The algorithm has been successfully implemented in a C++ program. Some simple numerical cases are presented to show the validity and accuracy of this model. Finally, the failure mode, sliding distance and critical friction angle of the Jiweishan landslide are studied with the combined model.

  5. Assessment of MARMOT. A Mesoscale Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, M. R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, X. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yu, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Teague, M. C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andersson, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplishes this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction comes directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation needs to be conducted. However, new mesoscale data need to be obtained in order to complete this validation.

  6. Study of the air-sea interactions at the mesoscale: the SEMAPHORE experiment

    Directory of Open Access Journals (Sweden)

    L. Eymard

    1996-09-01

    Full Text Available The SEMAPHORE (Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale) experiment has been conducted from June to November 1993 in the Northeast Atlantic between the Azores and Madeira. It was centered on the study of the mesoscale ocean circulation and air-sea interactions. The experimental investigation was achieved at the mesoscale using moorings, floats, and ship hydrological survey, and at a smaller scale by one dedicated ship, two instrumented aircraft, and surface drifting buoys, for one and a half month in October-November (IOP: intense observing period). Observations from meteorological operational satellites as well as spaceborne microwave sensors were used in complement. The main studies undertaken concern the mesoscale ocean, the upper ocean, the atmospheric boundary layer, and the sea surface, and first results are presented for the various topics. From data analysis and model simulations, the main characteristics of the ocean circulation were deduced, showing the close relationship between the Azores front meander and the occurrence of Mediterranean water lenses (meddies), and the shift between the Azores current frontal signature at the surface and within the thermocline. Using drifting buoys and ship data in the upper ocean, the gap between the scales of the atmospheric forcing and the oceanic variability was made evident. A 2 °C decrease and a 40-m deepening of the mixed layer were measured within the IOP, associated with a heating loss of about 100 W m-2. This evolution was shown to be strongly connected to the occurrence of storms at the beginning and the end of October. Above the surface, turbulent measurements from ship and aircraft were analyzed across the surface thermal front, showing a 30% difference in heat fluxes between both sides during a 4-day period, and the respective contributions of the wind and the surface temperature were evaluated. The classical

  7. Study of the air-sea interactions at the mesoscale: the SEMAPHORE experiment

    Science.gov (United States)

    Eymard, L.; Planton, S.; Durand, P.; Le Visage, C.; Le Traon, P. Y.; Prieur, L.; Weill, A.; Hauser, D.; Rolland, J.; Pelon, J.; Baudin, F.; Bénech, B.; Brenguier, J. L.; Caniaux, G.; de Mey, P.; Dombrowski, E.; Druilhet, A.; Dupuis, H.; Ferret, B.; Flamant, C.; Flamant, P.; Hernandez, F.; Jourdan, D.; Katsaros, K.; Lambert, D.; Lefèvre, J. M.; Le Borgne, P.; Le Squere, B.; Marsoin, A.; Roquet, H.; Tournadre, J.; Trouillet, V.; Tychensky, A.; Zakardjian, B.

    1996-09-01

    The SEMAPHORE (Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale) experiment has been conducted from June to November 1993 in the Northeast Atlantic between the Azores and Madeira. It was centered on the study of the mesoscale ocean circulation and air-sea interactions. The experimental investigation was achieved at the mesoscale using moorings, floats, and ship hydrological survey, and at a smaller scale by one dedicated ship, two instrumented aircraft, and surface drifting buoys, for one and a half month in October-November (IOP: intense observing period). Observations from meteorological operational satellites as well as spaceborne microwave sensors were used in complement. The main studies undertaken concern the mesoscale ocean, the upper ocean, the atmospheric boundary layer, and the sea surface, and first results are presented for the various topics. From data analysis and model simulations, the main characteristics of the ocean circulation were deduced, showing the close relationship between the Azores front meander and the occurrence of Mediterranean water lenses (meddies), and the shift between the Azores current frontal signature at the surface and within the thermocline. Using drifting buoys and ship data in the upper ocean, the gap between the scales of the atmospheric forcing and the oceanic variability was made evident. A 2 °C decrease and a 40-m deepening of the mixed layer were measured within the IOP, associated with a heating loss of about 100 W m-2. This evolution was shown to be strongly connected to the occurrence of storms at the beginning and the end of October. Above the surface, turbulent measurements from ship and aircraft were analyzed across the surface thermal front, showing a 30% difference in heat fluxes between both sides during a 4-day period, and the respective contributions of the wind and the surface temperature were evaluated. The classical momentum flux bulk

  8. Study of the air-sea interactions at the mesoscale: the SEMAPHORE experiment

    Directory of Open Access Journals (Sweden)

    L. Eymard

    Full Text Available The SEMAPHORE (Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale) experiment has been conducted from June to November 1993 in the Northeast Atlantic between the Azores and Madeira. It was centered on the study of the mesoscale ocean circulation and air-sea interactions. The experimental investigation was achieved at the mesoscale using moorings, floats, and ship hydrological survey, and at a smaller scale by one dedicated ship, two instrumented aircraft, and surface drifting buoys, for one and a half month in October-November (IOP: intense observing period). Observations from meteorological operational satellites as well as spaceborne microwave sensors were used in complement. The main studies undertaken concern the mesoscale ocean, the upper ocean, the atmospheric boundary layer, and the sea surface, and first results are presented for the various topics. From data analysis and model simulations, the main characteristics of the ocean circulation were deduced, showing the close relationship between the Azores front meander and the occurrence of Mediterranean water lenses (meddies), and the shift between the Azores current frontal signature at the surface and within the thermocline. Using drifting buoys and ship data in the upper ocean, the gap between the scales of the atmospheric forcing and the oceanic variability was made evident. A 2 °C decrease and a 40-m deepening of the mixed layer were measured within the IOP, associated with a heating loss of about 100 W m-2. This evolution was shown to be strongly connected to the occurrence of storms at the beginning and the end of October. Above the surface, turbulent measurements from ship and aircraft were analyzed across the surface thermal front, showing a 30% difference in heat fluxes between both sides during a 4-day period, and the respective contributions of the wind and the surface temperature were evaluated. The

  9. ROBOCAL: An automated NDA [nondestructive analysis] calorimetry and gamma isotopic system

    International Nuclear Information System (INIS)

    Hurd, J.R.; Powell, W.D.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices

  10. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR that has demonstrated ability to provide a valuable DR resource for California.

  11. Improved analysis and visualization of friction loop data: unraveling the energy dissipation of meso-scale stick-slip motion

    Science.gov (United States)

    Kokorian, Jaap; Merlijn van Spengen, W.

    2017-11-01

    In this paper we demonstrate a new method for analyzing and visualizing friction force measurements of meso-scale stick-slip motion, and introduce a method for extracting two separate dissipative energy components. Using a microelectromechanical system tribometer, we execute 2 million reciprocating sliding cycles, during which we measure the static friction force with a resolution of

  12. SU-E-T-497: Semi-Automated in Vivo Radiochromic Film Dosimetry Using a Novel Image Processing Algorithm

    International Nuclear Information System (INIS)

    Reyhan, M; Yue, N

    2014-01-01

    Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5 x 1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2-886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of -0.28 cGy and a 95% confidence interval of (5.5 cGy, -6.1 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic=0.997*Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor. Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize
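
    The threshold-and-erode ROI detection described above can be sketched with standard image-processing primitives. The snippet below uses scipy.ndimage with made-up threshold, erosion depth, and calibration values; it illustrates the idea rather than reproducing the validated algorithm.

        import numpy as np
        from scipy import ndimage

        def film_doses(scan, calibration, threshold=0.6, erosion_iter=5):
            """Detect film pieces by thresholding, shrink each ROI by erosion,
            and map the ROI mean pixel value to dose via a user calibration.

            'scan' is a 2-D grayscale array in [0, 1]; 'calibration' maps a mean
            pixel value to dose in cGy. Threshold, erosion depth and the linear
            calibration below are illustrative, not the validated algorithm.
            """
            mask = scan < threshold                       # film is darker than background
            mask = ndimage.binary_erosion(mask, iterations=erosion_iter)
            labels, n = ndimage.label(mask)
            doses = []
            for lab in range(1, n + 1):
                mean_pv = scan[labels == lab].mean()
                doses.append(calibration(mean_pv))
            return doses

        # Hypothetical linear calibration (dose in cGy as a function of pixel value).
        cal = lambda pv: 900.0 * (0.65 - pv)
        scan = np.ones((200, 300))
        scan[40:90, 50:150] = 0.42                        # one simulated film piece
        print(film_doses(scan, cal))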

  13. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    Energy Technology Data Exchange (ETDEWEB)

    Gwynne, Sarah, E-mail: Sarah.Gwynne2@wales.nhs.uk [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Spezi, Emiliano; Wills, Lucy [Department of Medical Physics, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Nixon, Lisette; Hurt, Chris [Wales Cancer Trials Unit, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom); Joseph, George [Department of Diagnostic Radiology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Evans, Mererid [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Griffiths, Gareth [Wales Cancer Trials Unit, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom); Crosby, Tom [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Staffurth, John [Division of Cancer, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom)

    2012-11-15

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), that can localize areas of maximal discordance was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of a gold standard-observer variation, with JCI being the most useful as a single metric. Additional information is provided by the l-CI and can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.
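
    The conformity indices discussed here reduce to set operations on binary outline masks. The sketch below follows the common formulations of the Jaccard conformity index, geographical miss index, and discordance index; details may differ from the trial's CERR implementation.

        import numpy as np

        def conformity_metrics(investigator, gold):
            """Overlap metrics between two boolean volumes (investigator vs. gold).

            JCI = intersection / union.
            Geographical miss = gold voxels missed by the investigator / gold volume.
            Discordance = investigator voxels outside the gold standard / investigator volume.
            Common formulations; details may differ from the trial's CERR code.
            """
            i, g = investigator.astype(bool), gold.astype(bool)
            inter = np.logical_and(i, g).sum()
            union = np.logical_or(i, g).sum()
            return {
                "jci": inter / union,
                "geographical_miss": np.logical_and(g, ~i).sum() / g.sum(),
                "discordance": np.logical_and(i, ~g).sum() / i.sum(),
            }

        gold = np.zeros((20, 64, 64), bool); gold[5:15, 20:40, 20:40] = True
        obs = np.zeros_like(gold);           obs[6:16, 22:42, 18:38] = True
        print(conformity_metrics(obs, gold))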

  14. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    International Nuclear Information System (INIS)

    Gwynne, Sarah; Spezi, Emiliano; Wills, Lucy; Nixon, Lisette; Hurt, Chris; Joseph, George; Evans, Mererid; Griffiths, Gareth; Crosby, Tom; Staffurth, John

    2012-01-01

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), that can localize areas of maximal discordance was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of a gold standard–observer variation, with JCI being the most useful as a single metric. Additional information is provided by the l-CI and can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.

  15. Assessing the Agreement Between Eo-Based Semi-Automated Landslide Maps with Fuzzy Manual Landslide Delineation

    Science.gov (United States)

    Albrecht, F.; Hölbling, D.; Friedl, B.

    2017-09-01

    Landslide mapping benefits from the ever increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and improved infrastructure for data access. However, there arises the need for improved automated landslide information extraction processes from EO data while the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for the fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding how they affect the levels of agreement in a direct comparison of OBIA-derived landslide maps and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach for high resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.

  16. Silicate:nitrate ratios of upwelled waters control the phytoplankton community sustained by mesoscale eddies in sub-tropical North Atlantic and Pacific

    Directory of Open Access Journals (Sweden)

    T. S. Bibby

    2011-03-01

    Full Text Available Mesoscale eddies in sub-tropical gyres physically perturb the water column and can introduce macronutrients to the euphotic zone, stimulating a biological response in which phytoplankton communities can become dominated by large phytoplankton. Mesoscale eddies may therefore be important in driving export in oligotrophic regions of the modern ocean. However, the character and magnitude of the biological response sustained by eddies is variable. Here we present data from mesoscale eddies in the Sargasso Sea (Atlantic) and the waters off Hawai'i (Pacific), alongside mesoscale events that affected the Bermuda Atlantic Time-Series Study (BATS) over the past decade. From this analysis, we suggest that the phytoplankton community structure sustained by mesoscale eddies is predetermined by the relative abundance of silicate over nitrate (Si*) in the upwelled waters. We present data that demonstrate that mode-water eddies (MWE) in the Sargasso Sea upwell locally formed waters with relatively high Si* to the euphotic zone, and that cyclonic eddies in the Sargasso Sea introduce waters with relatively low Si*, a signature that originated in the iron-limited Southern Ocean. We propose that this phenomenon can explain the observed dominance of the phytoplankton community by large-diatom species in MWE and by small prokaryotic phytoplankton in cyclonic features. In contrast to the Atlantic, North Pacific Intermediate Water (NPIW) with high Si* may influence the cyclonic eddies in waters off Hawai'i, which also appear capable of sustaining diatom populations. These observations suggest that the structure of phytoplankton communities sustained by eddies may be related to the chemical composition of the upwelled waters in addition to the physical nature of the eddy.
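
    Si* is conventionally defined as the silicate concentration minus the nitrate concentration of a water parcel, so that positive values leave silicate available for diatom growth once nitrate is consumed. A minimal sketch with hypothetical concentrations:

        def si_star(silicate, nitrate):
            """Si* = [Si(OH)4] - [NO3-]; positive values favour diatom-dominated
            responses when the water is upwelled into the euphotic zone."""
            return silicate - nitrate

        # Hypothetical end-member concentrations (umol/kg).
        print(si_star(silicate=14.0, nitrate=8.0))    # > 0: diatom-favourable
        print(si_star(silicate=5.0, nitrate=12.0))    # < 0: small phytoplankton favoured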

  17. Does mesoscale matters in decadal changes observed in the northern Canary upwelling system?

    Science.gov (United States)

    Relvas, P.; Luís, J.; Santos, A. M. P.

    2009-04-01

    The Western Iberia constitutes the northern limb of the Canary Current Upwelling System, one of the four Eastern Boundary Upwelling Systems of the world ocean. The strong dynamic link between the atmosphere and the ocean makes these systems highly sensitive to global change, ideal to monitor and investigate its effects. In order to investigate decadal changes of the mesoscale patterns in the Northern Canary upwelling system (off Western Iberia), the field of the satellite-derived sea surface temperature (SST) trends was built at the pixel scale (4x4 km) for the period 1985-2007, based on the monthly mean data from the Advanced Very High Resolution Radiometer (AVHRR) on board NOAA series satellites, provided by the NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory. The time series were limited to the nighttime passes to avoid the solar heating effect and a suite of procedures were followed to guarantee that the temperature trends were not biased towards the seasonally more abundant summer data, when the sky is considerably clear. A robust linear fit was applied to each individual pixel, crossing along the time the same pixel in all the processed monthly mean AVHRR SST images from 1985 until 2007. The field of the SST trends was created upon the slopes of the linear fits applied to each pixel. Monthly mean SST time series from the one degree enhanced International Comprehensive Ocean-Atmosphere Data Set (ICOADS) and from near-shore measurements collected on a daily basis by the Portuguese Meteorological Office (IM) are also used to compare the results and extend the analysis back until 1960. A generalized warming trend is detected in the coastal waters off Western Iberia during the last decades, no matter which data set we analyse. However, significant spatial differences in the warming rates are observed in the satellite-derived SST trends. Remarkably, off the southern part of the Western Iberia the known
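
    Building the trend field described above amounts to fitting a line through each pixel's monthly SST series and mapping the slopes. The sketch below uses an ordinary least-squares fit on synthetic data for brevity, whereas the study applies a robust fit to nighttime AVHRR retrievals.

        import numpy as np

        def sst_trend_field(sst, years):
            """Per-pixel linear SST trend (deg C per decade).

            sst: array of shape (n_months, ny, nx) of monthly mean SST with NaNs
            where cloud cover prevented retrieval; years: decimal year per month.
            Plain least squares here; the study uses a robust fit per pixel.
            """
            n_t, ny, nx = sst.shape
            trends = np.full((ny, nx), np.nan)
            for j in range(ny):
                for i in range(nx):
                    series = sst[:, j, i]
                    ok = np.isfinite(series)
                    if ok.sum() > 24:                       # require 2+ years of data
                        slope, _ = np.polyfit(years[ok], series[ok], 1)
                        trends[j, i] = slope * 10.0         # per decade
            return trends

        years = 1985 + np.arange(276) / 12.0                # monthly, 1985-2007
        sst = 18 + 0.02 * (years - 1985)[:, None, None] + \
              np.random.default_rng(2).normal(0, 0.5, (276, 10, 10))
        print(np.nanmean(sst_trend_field(sst, years)))      # ~0.2 deg C per decade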

  18. Assimilation of Doppler weather radar observations in a mesoscale ...

    Indian Academy of Sciences (India)

    The study uses the Penn State-NCAR (PSU-NCAR) mesoscale model (MM5), version 3.5.6, and investigates the direct assimilation of Doppler weather radar reflectivity data in a variational (3DVAR) data assimilation system. Results presented in this paper are based on.

  19. An Automated Data Analysis Tool for Livestock Market Data

    Science.gov (United States)

    Williams, Galen S.; Raper, Kellie Curry

    2011-01-01

    This article describes an automated data analysis tool that allows Oklahoma Cooperative Extension Service educators to disseminate results in a timely manner. Primary data collected at Oklahoma Quality Beef Network (OQBN) certified calf auctions across the state results in a large amount of data per sale site. Sale summaries for an individual sale…

  20. Specdata: Automated Analysis Software for Broadband Spectra

    Science.gov (United States)

    Oliveira, Jasmine N.; Martin-Drumel, Marie-Aline; McCarthy, Michael C.

    2017-06-01

    With the advancement of chirped-pulse techniques, broadband rotational spectra with a few tens to several hundred GHz of spectral coverage are now routinely recorded. When studying multi-component mixtures that might result, for example, with the use of an electrical discharge, lines of new chemical species are often obscured by those of known compounds, and analysis can be laborious. To address this issue, we have developed SPECdata, an open source, interactive tool which is designed to simplify and greatly accelerate the spectral analysis and discovery. Our software tool combines both automated and manual components that free the user from computation, while giving him/her considerable flexibility to assign, manipulate, interpret and export their analysis. The automated - and key - component of the new software is a database query system that rapidly assigns transitions of known species in an experimental spectrum. For each experiment, the software identifies spectral features, and subsequently assigns them to known molecules within an in-house database (Pickett .cat files, list of frequencies...), or those catalogued in Splatalogue (using automatic on-line queries). With suggested assignments, the control is then handed over to the user who can choose to accept, decline or add additional species. Data visualization, statistical information, and interactive widgets assist the user in making decisions about their data. SPECdata has several other useful features intended to improve the user experience. Exporting a full report of the analysis, or a peak file in which assigned lines are removed are among several options. A user may also save their progress to continue at another time. Additional features of SPECdata help the user to maintain and expand their database for future use. A user-friendly interface allows one to search, upload, edit or update catalog or experiment entries.
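
    The heart of such a query system is matching detected line frequencies against catalog frequencies within a tolerance. The sketch below is a simplified illustration with a hypothetical tolerance and toy catalog, not the SPECdata code itself.

        import numpy as np

        def assign_lines(peaks_mhz, catalog, tol_mhz=0.05):
            """Assign experimental peaks to catalog entries within a tolerance.

            peaks_mhz: 1-D array of detected line frequencies.
            catalog:   dict mapping species name -> array of predicted frequencies
                       (e.g. parsed from Pickett .cat files). Returns assigned and
                       unassigned peak lists. Illustrative only.
            """
            assigned, unassigned = [], []
            for f in peaks_mhz:
                best = None
                for species, freqs in catalog.items():
                    i = np.argmin(np.abs(freqs - f))
                    diff = abs(freqs[i] - f)
                    if diff <= tol_mhz and (best is None or diff < best[2]):
                        best = (f, species, diff)
                (assigned if best else unassigned).append(best or f)
            return assigned, unassigned

        # Toy catalog with made-up species and frequencies (MHz).
        catalog = {"HC3N": np.array([9098.33, 18196.31]), "OCS": np.array([12162.98])}
        peaks = np.array([9098.35, 12163.00, 15000.00])
        print(assign_lines(peaks, catalog))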

  1. IMAGE CONSTRUCTION TO AUTOMATION OF PROJECTIVE TECHNIQUES FOR PSYCHOPHYSIOLOGICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Natalia Pavlova

    2018-04-01

    Full Text Available This article presents an approach to automating the assessment step in the psychological analysis of drawings that a person creates from an available set of templates. Such a solution would make it possible to reveal disturbances of a person's mental state more effectively. In particular, it can be used in work with children, who possess developed figurative thinking but are not yet capable of stating their thoughts and experiences clearly. To automate testing with a projective method, we construct an interactive environment for the visualization of compositions of several images and then analyse

  2. On semi-classical questions related to signal analysis

    KAUST Repository

    Helffer, Bernard

    2011-12-01

    This study explores the reconstruction of a signal using spectral quantities associated with some self-adjoint realization of an h-dependent Schrödinger operator -h²(d²/dx²) - y(x), h>0, when the parameter h tends to 0. Theoretical results in semi-classical analysis are proved. Some numerical results are also presented. We first consider as a toy model the sech² function. Then we study a real signal given by arterial blood pressure measurements. This approach seems to be very promising in signal analysis. Indeed it provides new spectral quantities that can give relevant information on some signals, as is the case for the arterial blood pressure signal. © 2011 - IOS Press and the authors. All rights reserved.
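
    For orientation, the semi-classical representation referred to here is usually written in terms of the negative eigenvalues and L²-normalized eigenfunctions of the operator above; the commonly cited form (to be checked against the paper) is

        \[
          -h^{2}\,\frac{d^{2}\psi_{nh}}{dx^{2}} - y(x)\,\psi_{nh} = -\kappa_{nh}^{2}\,\psi_{nh},
          \qquad
          y_{h}(x) = 4h \sum_{n=1}^{N_{h}} \kappa_{nh}\,\psi_{nh}^{2}(x),
        \]

    where the sum runs over the N_h negative eigenvalues and y_h approaches y as h tends to 0 under suitable assumptions.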

  3. Magnetic saturation in semi-analytical harmonic modeling for electric machine analysis

    NARCIS (Netherlands)

    Sprangers, R.L.J.; Paulides, J.J.H.; Gysen, B.L.J.; Lomonova, E.

    2016-01-01

    A semi-analytical method based on the harmonic modeling (HM) technique is presented for the analysis of the magneto-static field distribution in the slotted structure of rotating electric machines. In contrast to the existing literature, the proposed model does not require the assumption of infinite

  4. Analysis of the Behaviour of Semi Rigid Steel End Plate Connections

    Directory of Open Access Journals (Sweden)

    Bahaz A.

    2018-01-01

    Full Text Available The analysis of steel-framed building structures with full-strength beam-to-column joints is quite standard nowadays. Buildings utilizing such framing systems are widely used in design practice. However, there is a growing recognition of significant benefits in designing joints as partial strength/semi-rigid. The design of joints within this partial strength/semi-rigid approach is becoming more and more popular. This requires the knowledge of the full nonlinear moment-rotation behaviour of the joint, which is also a design parameter. The rotational behaviour of steel semi-rigid connections can be studied using the finite element method for the following three reasons: (i) such models are inexpensive; (ii) they allow the understanding of local effects, which are difficult to measure accurately physically; and (iii) they can be used to generate extensive parametric studies. This paper presents a three-dimensional finite element model using ABAQUS software in order to identify the effect of different parameters on the behaviour of semi-rigid steel beam-to-column end-plate connections. Contact and sliding between different elements, bolt pretension and geometric and material non-linearity are included in this model. A parametric study is conducted using a model of two end-plate configurations: flush and extended end plates. The studied parameters were as follows: bolt type, end-plate thickness, and column web stiffener. Then, the model was calibrated and validated with experimental results taken from the literature and with the model proposed by Eurocode 3. The procedure for determining the moment-rotation curve using finite element analysis is also given together with a brief explanation of how the design moment resistance and the initial rotational stiffness of the joint are obtained.

  5. Automated Analysis of Accountability

    DEFF Research Database (Denmark)

    Bruni, Alessandro; Giustolisi, Rosario; Schürmann, Carsten

    2017-01-01

    that the system can detect the misbehaving parties who caused that failure. Accountability is an intuitively stronger property than verifiability as the latter only rests on the possibility of detecting the failure of a goal. A plethora of accountability and verifiability definitions have been proposed...... in the literature. Those definitions are either very specific to the protocols in question, hence not applicable in other scenarios, or too general and widely applicable but requiring complicated and hard to follow manual proofs. In this paper, we advance formal definitions of verifiability and accountability...... that are amenable to automated verification. Our definitions are general enough to be applied to different classes of protocols and different automated security verification tools. Furthermore, we point out formally the relation between verifiability and accountability. We validate our definitions...

  6. Automated Classification and Analysis of Non-metallic Inclusion Data Sets

    Science.gov (United States)

    Abdulsalam, Mohammad; Zhang, Tongsheng; Tan, Jia; Webler, Bryan A.

    2018-05-01

    The aim of this study is to utilize principal component analysis (PCA), clustering methods, and correlation analysis to condense and examine large, multivariate data sets produced from automated analysis of non-metallic inclusions. Non-metallic inclusions play a major role in defining the properties of steel, and their examination has been greatly aided by automated analysis in scanning electron microscopes equipped with energy dispersive X-ray spectroscopy. The methods were applied to analyze inclusions on two sets of samples: two laboratory-scale samples and four industrial samples from near-finished 4140 alloy steel components with varying machinability. The laboratory samples had well-defined inclusion chemistries, composed of MgO-Al2O3-CaO, spinel (MgO-Al2O3), and calcium aluminate inclusions. The industrial samples contained MnS inclusions as well as (Ca,Mn)S + calcium aluminate oxide inclusions. PCA could be used to reduce inclusion chemistry variables to a 2D plot, which revealed inclusion chemistry groupings in the samples. Clustering methods were used to automatically classify inclusion chemistry measurements into groups, i.e., no user-defined rules were required.
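
    A minimal version of the workflow described above, compressing the per-inclusion composition table with PCA and grouping inclusions with k-means, can be sketched with scikit-learn; the element columns, cluster count, and synthetic data below are hypothetical.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler
        from sklearn.cluster import KMeans

        # Hypothetical automated SEM/EDS output: one row per inclusion, columns are
        # normalized element fractions (e.g. Mg, Al, Ca, Mn, S).
        rng = np.random.default_rng(3)
        spinel = rng.normal([0.30, 0.60, 0.05, 0.02, 0.03], 0.02, (40, 5))
        mns    = rng.normal([0.02, 0.03, 0.05, 0.45, 0.45], 0.02, (40, 5))
        X = np.vstack([spinel, mns])

        # Reduce to two principal components, then cluster in the reduced space.
        scores = PCA(n_components=2).fit_transform(StandardScaler().fit_transform(X))
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
        print(scores[:2], np.bincount(labels))   # 2-D coordinates + cluster sizes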

  7. Observed 3D Structure, Generation, and Dissipation of Mesoscale Eddies in the South China Sea

    Science.gov (United States)

    Zhang, Z.; Tian, J.; Qiu, B.; Zhao, W.

    2016-12-01

    South China Sea (SCS), the largest marginal sea in the western Pacific, abounds with strong mesoscale eddies, as revealed by both satellite and in situ observations. The 3D structure and the generation and dissipation mechanisms of the SCS mesoscale eddies, however, are still not well understood, owing to the lack of well-designed, comprehensive field observations. In order to address these scientific issues, the SCS Mesoscale Eddy Experiment (S-MEE for short) was designed and conducted from October 2013 to June 2014. As part of S-MEE, two bottom-anchored subsurface mooring arrays, one consisting of 10 moorings and the other of 7, were deployed along the historical pathway of mesoscale eddies in the northern SCS. All the moorings were equipped with ADCPs, RCMs, CTDs and temperature chains to make continuous measurements of horizontal current velocity and temperature/salinity over the whole water column. In addition to the moored observations, we also conducted two transects across the center of one anticyclonic eddy (AE) and made high-resolution hydrographic and turbulent mixing measurements. Based on the data collected by S-MEE, we obtained the full-depth 3D structures of one AE and one cyclonic eddy (CE) and revealed their generation and dissipation mechanisms. For the first time we found that eddies in the northern SCS extend from the surface to the sea bottom and display prominent tilted structures in the vertical. The AE was suggested to have been shed from the Kuroshio, which intruded into the SCS through the Luzon Strait in winter, while the generation of the CE was associated with barotropic instability of the Kuroshio. By conducting an eddy energy budget analysis, we further identified the generation of submesoscale motions as the dominant mechanism for eddy dissipation. The findings of this study not only provide new insights into the 3D structure of oceanic eddies, but also contribute to

  8. Skills of different mesoscale models over Indian region during ...

    Indian Academy of Sciences (India)

    tion and prediction of high impact severe weather systems. Such models ... mesoscale models can be run at cloud resolving resolutions (∼1km) ... J. Earth Syst. Sci. 117, No. ..... similar to climate drift, indicating that those error components are ...

  9. Development of Process Automation in the Neutron Activation Analysis Facility in Malaysian Nuclear Agency

    International Nuclear Information System (INIS)

    Yussup, N.; Azman, A.; Ibrahim, M.M.; Rahman, N.A.A.; Che Sohashaari, S.; Atan, M.N.; Hamzah, M.A.; Mokhtar, M.; Khalid, M.A.; Salim, N.A.A.; Hamzah, M.S.

    2018-01-01

    Neutron Activation Analysis (NAA) has been established in the Malaysian Nuclear Agency (Nuclear Malaysia) since the 1980s. Most of the established procedures, from sample registration to analysis, are performed manually. These manual procedures carried out by the NAA laboratory personnel are time consuming and inefficient. Hence, system automation was developed in order to provide an effective method that replaces redundant manual data entries and speeds up the sample analysis and calculation process. This report explains the NAA process in Nuclear Malaysia and describes the automation development in detail, which includes the sample registration software; the automatic sample changer system, consisting of hardware and software; and the sample analysis software. (author)

  10. The diffusion of radioactive gases in the meso-scale (20 km-400 km)

    International Nuclear Information System (INIS)

    Wippermann, F.

    1974-01-01

    The term ''Mesoscale'' refers to distances between 20 km and 400 km from the source; in defining this range, the structure of atmospheric turbulence is taken into account. To arrive at an evaluation of diffusion in the mesoscale, quantitative methods from the microscale (source distances below 20 km) and from the macroscale (source distances beyond 400 km) are extrapolated into the mesoscale. In the first case a table is given from which to read off the minimum factor by which the concentration is reduced in the mesoscale as the source distance increases. To obtain the diffusion for the worst possible case, the existence of a mixing layer topped by a temperature inversion was assumed. For this it was essential, first of all, to determine the source distance x_D beyond which the diffusing gases are completely mixed within the mixing layer of thickness D. To make allowance for all possible thicknesses of this mixing layer, a measurement carried out at ground level only 10 km from the source can be used to calculate the correct concentrations in the mixing layer; the dilution factors are then related to this value. Possible ways of better incorporating certain factors into the diffusion estimate, such as the topography of the earth's surface, the roughness of the terrain, the vertical profiles of wind and exchange coefficients and the effects of non-stability, are given in the last section
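
    The mixing-layer assumption described above corresponds to a standard Gaussian-plume limit: beyond the distance x_D at which the effluent is completely mixed over the layer depth D, the ground-level concentration from a continuous point source of strength Q reduces to the expression below (u is the mean wind speed and σ_y(x) the crosswind dispersion parameter; this is the textbook form, not a formula quoted from the paper).

```latex
% Ground-level concentration once the plume is completely mixed over a capped
% layer of depth D (Q = source strength, u = mean wind speed, \sigma_y(x) =
% crosswind dispersion parameter):
\chi(x, y = 0) \;=\; \frac{Q}{\sqrt{2\pi}\,\sigma_y(x)\,u\,D},
\qquad x \ge x_D .
```

    The 1/D dependence shows why the mixing-layer thickness controls the worst-case mesoscale dilution factor.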

  11. Knowledge Support and Automation for Performance Analysis with PerfExplorer 2.0

    Directory of Open Access Journals (Sweden)

    Kevin A. Huck

    2008-01-01

    Full Text Available The integration of scalable performance analysis in parallel development tools is difficult. The potential size of data sets and the need to compare results from multiple experiments presents a challenge to manage and process the information. Simply to characterize the performance of parallel applications running on potentially hundreds of thousands of processor cores requires new scalable analysis techniques. Furthermore, many exploratory analysis processes are repeatable and could be automated, but are now implemented as manual procedures. In this paper, we will discuss the current version of PerfExplorer, a performance analysis framework which provides dimension reduction, clustering and correlation analysis of individual trials of large dimensions, and can perform relative performance analysis between multiple application executions. PerfExplorer analysis processes can be captured in the form of Python scripts, automating what would otherwise be time-consuming tasks. We will give examples of large-scale analysis results, and discuss the future development of the framework, including the encoding and processing of expert performance rules, and the increasing use of performance metadata.
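
    The sketch below is not the PerfExplorer scripting interface (its Python API is not reproduced here); it is a generic, hedged illustration of the same kind of automated, repeatable analysis -- relative performance comparison across multiple application executions plus per-region correlation with total runtime -- using plain NumPy on hypothetical profile data.

```python
import numpy as np

# Hypothetical per-trial profile matrix: rows = application executions at
# increasing core counts, columns = exclusive time (s) per code region.
core_counts = np.array([128, 256, 512, 1024, 2048])
regions = ["solver", "halo_exchange", "io", "setup"]
times = np.array([
    [412.0, 31.0,  8.0, 5.0],
    [208.0, 35.0,  8.5, 5.1],
    [106.0, 44.0,  9.0, 5.2],
    [ 55.0, 61.0,  9.6, 5.4],
    [ 29.0, 90.0, 10.1, 5.5],
])
total = times.sum(axis=1)

# Relative performance analysis: speedup and parallel efficiency versus the
# smallest run.
speedup = total[0] / total
efficiency = speedup / (core_counts / core_counts[0])
for n, s, e in zip(core_counts, speedup, efficiency):
    print(f"{n:5d} cores: speedup {s:5.2f}, efficiency {e:5.2f}")

# Per-region correlation with total runtime: a negative correlation flags a
# region that grows while the total shrinks, i.e. an emerging scaling bottleneck.
for j, name in enumerate(regions):
    r = np.corrcoef(times[:, j], total)[0, 1]
    print(f"{name:14s} fraction at largest run: {times[-1, j] / total[-1]:5.2f}, "
          f"corr with total runtime: {r:+.2f}")
```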

  12. Preliminary design of mesoscale turbocompressor and rotordynamics tests of rotor bearing system

    Science.gov (United States)

    Hossain, Md Saddam

    2011-12-01

    A mesoscale turbocompressor spinning above 500,000 RPM is an evolutionary technology for micro turbochargers, turbo blowers, turbo compressors, micro gas turbines, auxiliary power units, etc., in the automotive, aerospace, and fuel cell industries. The objectives of this work are: (1) to evaluate different air foil bearings designed for the intended applications, and (2) to design and perform CFD analysis of a micro-compressor. CFD analysis of a shrouded 3-D micro compressor was conducted using Ansys BladeGen as the blade generation tool, ICEM CFD as the mesh generation tool, and CFX as the main solver, for different design and off-design cases and for different numbers of blades. Comprehensive experimental facilities for testing the turbocompressor system have also been designed and are proposed for future work.

  13. PIQMIe: A web server for semi-quantitative proteomics data management and analysis

    NARCIS (Netherlands)

    A. Kuzniar (Arnold); R. Kanaar (Roland)

    2014-01-01

    textabstractWe present the Proteomics Identifications and Quantitations Data Management and Integration Service or PIQMIe that aids in reliable and scalable data management, analysis and visualization of semi-quantitative mass spectrometry based proteomics experiments. PIQMIe readily integrates

  14. Semi-Markov Chains and Hidden Semi-Markov Models toward Applications Their Use in Reliability and DNA Analysis

    CERN Document Server

    Barbu, Vlad

    2008-01-01

    Semi-Markov processes are much more general and better adapted to applications than Markov ones, because the sojourn time in any state can be arbitrarily distributed, as opposed to the geometrically distributed sojourn times of the Markov case. This book is concerned with the estimation of discrete-time semi-Markov and hidden semi-Markov processes
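
    A minimal sketch of the defining property stated above -- arbitrarily distributed sojourn times on top of an embedded Markov chain -- rather than any estimator from the book; the two-state chain, the shifted-Poisson holding-time law and the parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discrete-time semi-Markov chain on states {0, 1}:
#   * the embedded Markov chain chooses the next state,
#   * the sojourn time in each state follows an arbitrary discrete law
#     (here 1 + Poisson), instead of the geometric law forced by a Markov chain.
P = np.array([[0.0, 1.0],
              [1.0, 0.0]])           # embedded transition matrix
sojourn_mean = {0: 3.0, 1: 8.0}      # mean holding times in time steps

def sample_path(n_steps, state=0):
    path = []
    while len(path) < n_steps:
        holding = 1 + rng.poisson(sojourn_mean[state] - 1.0)
        path.extend([state] * holding)
        state = int(rng.choice(2, p=P[state]))
    return np.array(path[:n_steps])

def mean_sojourn(path, s):
    mask = np.r_[0, (path == s).astype(int), 0]
    edges = np.flatnonzero(np.diff(mask))
    return float(np.mean(edges[1::2] - edges[0::2]))   # run lengths of state s

path = sample_path(20_000)
for s in (0, 1):
    print(f"state {s}: empirical mean sojourn {mean_sojourn(path, s):.2f} steps "
          f"(target {sojourn_mean[s]:.1f})")
```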

  15. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    Science.gov (United States)

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The selectivity and sensitivity required for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) were achieved using a narrow-diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.

  16. Automated drumlin shape and volume estimation using high resolution LiDAR imagery (Curvature Based Relief Separation): A test from the Wadena Drumlin Field, Minnesota

    Science.gov (United States)

    Yu, Peter; Eyles, Nick; Sookhan, Shane

    2015-10-01

    Resolving the origin(s) of drumlins and related megaridges in areas of megascale glacial lineations (MSGL) left by paleo-ice sheets is critical to understanding how ancient ice sheets interacted with their sediment beds. MSGL is now linked with fast-flowing ice streams, but there is a broad range of erosional and depositional models. Further progress relies on constraining fluxes of subglacial sediment at the ice sheet base, which in turn depends on morphological data such as landform shape and elongation and, most importantly, landform volume. Past practice in determining shape has employed a broad range of geomorphological methods, from strictly visualisation techniques to more complex semi-automated and automated drumlin extraction methods. This paper reviews and builds on currently available visualisation, semi-automated and automated extraction methods and presents a new Curvature Based Relief Separation (CBRS) technique for drumlin mapping. This uses curvature analysis to generate a base level from which topography can be normalized and drumlin volume can be derived. The methodology is tested using a high-resolution (3 m) LiDAR elevation dataset from the Wadena Drumlin Field, Minnesota, USA, which was constructed by the Wadena Lobe of the Laurentide Ice Sheet ca. 20,000 years ago and which as a whole contains 2000 drumlins across an area of 7500 km². This analysis demonstrates that CBRS provides an objective and robust procedure for automated drumlin extraction. There is strong agreement with manually selected landforms, but the method is also capable of resolving features that were not detectable manually, thereby considerably expanding the known population of streamlined landforms. CBRS provides an effective automatic method for visualisation of large areas of the streamlined beds of former ice sheets and for modelling sediment fluxes below ice sheets.
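
    A simplified, hedged sketch of the curvature-based idea behind CBRS, not the published workflow: on a synthetic 3 m DEM, the Laplacian of the smoothed surface serves as a curvature proxy, convex-upward cells (grown out to the break of slope) are masked as landform, a base level is interpolated from the remaining cells, and the normalised positive relief is integrated to a volume. The kernel size, thresholds and synthetic drumlin are all assumptions.

```python
import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import binary_dilation, gaussian_filter, laplace

# Synthetic 3 m DEM: a gently sloping plain plus one elongated "drumlin".
cell = 3.0                                          # m per pixel
y, x = np.mgrid[0:200, 0:200] * cell
dem = 0.002 * x                                     # regional slope
dem += 12.0 * np.exp(-(((x - 300) / 120) ** 2 + ((y - 300) / 40) ** 2))

# Simplified curvature-based relief separation: Laplacian of the smoothed DEM
# as curvature proxy, flag convex-upward cells as landform, grow the mask to
# the concave break of slope, and keep everything else as base topography.
curv = laplace(gaussian_filter(dem, sigma=5)) / cell ** 2      # ~1/m
landform = binary_dilation(curv < -0.002, iterations=60)
base_mask = ~landform

# Interpolate a base level from the base cells, normalise and integrate volume.
points = np.column_stack([x[base_mask], y[base_mask]])
base = griddata(points, dem[base_mask], (x, y), method="linear")
relief = np.clip(dem - base, 0.0, None)
volume = np.nansum(relief) * cell ** 2

analytic = 12.0 * np.pi * 120.0 * 40.0              # volume of the synthetic bump
print(f"estimated drumlin volume : {volume / 1e3:7.1f} x 10^3 m^3")
print(f"synthetic bump volume    : {analytic / 1e3:7.1f} x 10^3 m^3")
```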

  17. Evaluation of ventricular dysfunction using semi-automatic longitudinal strain analysis of four-chamber cine MR imaging.

    Science.gov (United States)

    Kawakubo, Masateru; Nagao, Michinobu; Kumazawa, Seiji; Yamasaki, Yuzo; Chishaki, Akiko S; Nakamura, Yasuhiko; Honda, Hiroshi; Morishita, Junji

    2016-02-01

    The aim of this study was to evaluate ventricular dysfunction using longitudinal strain analysis in 4-chamber (4CH) cine MR imaging, and to investigate the agreement between semi-automatic and manual measurements in the analysis. Fifty-two consecutive patients with ischemic or non-ischemic cardiomyopathy or repaired tetralogy of Fallot who underwent a cardiac MR examination incorporating cine MR imaging were retrospectively enrolled. The LV and RV longitudinal strain values were obtained both semi-automatically and manually. Receiver operating characteristic (ROC) analysis was performed to determine the optimal cutoff of the minimum longitudinal strain value for the detection of patients with cardiac dysfunction. The correlations between manual and semi-automatic measurements for the LV and RV walls were analyzed by Pearson coefficient analysis. ROC analysis yielded optimal cut-offs of the minimum longitudinal strain value (εL_min) that diagnosed LV and RV dysfunction with high accuracy (LV εL_min = -7.8 %: area under the curve, 0.89; sensitivity, 83 %; specificity, 91 %; RV εL_min = -15.7 %: area under the curve, 0.82; sensitivity, 92 %; specificity, 68 %). Excellent correlations between manual and semi-automatic measurements were observed for the LV and RV free wall (LV, r = 0.97). Semi-automatic longitudinal strain analysis of 4CH cine MR imaging can evaluate LV and RV dysfunction with simple, easy measurements. The strain analysis could have extensive application in cardiac imaging for various clinical cases.
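
    A small, hedged illustration of how such a strain cutoff can be derived from ROC analysis (here via the Youden index) using scikit-learn; the strain values are synthetic, not the study's measurements, and dysfunction is assumed to show less negative (impaired) strain.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(7)

# Synthetic minimum longitudinal strain values (%, negative = contraction):
# dysfunctional ventricles are modelled with less negative (impaired) strain.
strain_normal = rng.normal(-18.0, 3.0, 30)
strain_dysfct = rng.normal(-7.0, 3.5, 22)
values = np.concatenate([strain_normal, strain_dysfct])
labels = np.concatenate([np.zeros(30), np.ones(22)])    # 1 = dysfunction

# Higher (less negative) strain indicates dysfunction, so the strain value
# itself serves as the classification score.
fpr, tpr, thresholds = roc_curve(labels, values)
auc = roc_auc_score(labels, values)
best = int(np.argmax(tpr - fpr))                        # Youden index

print(f"AUC = {auc:.2f}")
print(f"optimal cutoff = {thresholds[best]:.1f} % "
      f"(sensitivity {tpr[best]:.2f}, specificity {1 - fpr[best]:.2f})")
```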

  18. Semi-automated petrographic assessment of coal by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, G.; Jenkins, B.; Ofori, P.; Ferguson, K. [CSIRO Exploration and Mining, Pullenvale, Qld. (Australia)

    2007-04-15

    A new classification method, coal grain analysis, which uses optical imaging techniques for the microscopic characterisation of the individual grains present in coal samples is discussed. This differs from other coal petrography imaging methods in that a mask is used to remove the pixels of mounting resin to obtain compositional information of the maceral (vitrinite, inertinite and liptinite) and mineral abundances on each individual grain within each image. Experiments were conducted to establish the density of individual constituents in order to enable the density of each grain to be determined and the results reported on a mass basis. The grains were sorted into eight grain classes of liberated (single component) and composite grains. By analysing all streams (feed, concentrate and tailings) of the flotation circuit at a coal washing plant, the flotation response of the individual grain classes was tracked. This has implications for flotation process diagnostics and optimisation.

  19. A methodology to aid in the design of naval steels: Linking first principles calculations to mesoscale modeling

    International Nuclear Information System (INIS)

    Spanos, G.; Geltmacher, A.B.; Lewis, A.C.; Bingert, J.F.; Mehl, M.; Papaconstantopoulos, D.; Mishin, Y.; Gupta, A.; Matic, P.

    2007-01-01

    This paper provides a brief overview of a multidisciplinary effort at the Naval Research Laboratory aimed at developing a computationally-based methodology to assist in the design of advanced Naval steels. This program uses multiple computational techniques ranging from the atomistic length scale to continuum response. First-principles electronic structure calculations using density functional theory were employed, semi-empirical angular dependent potentials were developed based on the embedded atom method, and these potentials were used as input into Monte-Carlo and molecular dynamics simulations. Experimental techniques have also been applied to a super-austenitic stainless steel (AL6XN) to provide experimental input, guidance, verification, and enhancements to the models. These experimental methods include optical microscopy, scanning electron microscopy, transmission electron microscopy, electron backscatter diffraction, and serial sectioning in conjunction with computer-based three-dimensional reconstruction and quantitative analyses. The experimental results are also used as critical input into mesoscale finite element models of materials response

  20. Seasonal to Mesoscale Variability of Water Masses in Barrow Canyon,Chukchi Sea

    Science.gov (United States)

    Nobre, C.; Pickart, R. S.; Moore, K.; Ashjian, C. J.; Arrigo, K. R.; Grebmeier, J. M.; Vagle, S.; Itoh, M.; Berchok, C.; Stabeno, P. J.; Kikuchi, T.; Cooper, L. W.; Hartwell, I.; He, J.

    2016-02-01

    Barrow Canyon is one of the primary conduits by which Pacific-origin water exits the Chukchi Sea into the Canada Basin. As such, it is an ideal location to monitor the different water masses through the year. At the same time, the canyon is an energetic environment where mixing and entrainment can occur, modifying the Pacific-origin waters. As part of the Distributed Biological Observatory (DBO) program, a transect across the canyon was occupied 24 times between 2010 and 2013 by international ships of opportunity passing through the region during summer and early fall. Here we present results from an analysis of these sections to determine the seasonal evolution of the water masses and to investigate the nature of the mesoscale variability. The mean state shows the clear presence of six water masses at various times through the summer. The seasonal evolution of these summer water masses is characterized both in depth space and in temperature-salinity (T-S) space. Clear patterns emerge, including the arrival of Alaskan coastal water and its modification in early fall. The primary mesoscale variability is associated with wind-driven upwelling events, which occur predominantly in September. The atmospheric forcing of these events is investigated, as is the oceanic response.

  1. Completely automated measurement facility (PAVICOM) for track-detector data processing

    CERN Document Server

    Aleksandrov, A B; Feinberg, E L; Goncharova, L A; Konovalova, N S; Martynov, A G; Polukhina, N G; Roussetski, A S; Starkov, NI; Tsarev, V A

    2004-01-01

    A review of the technical capabilities and the investigations performed using the completely automated measuring facility (PAVICOM) is presented. This highly efficient facility for track-detector data processing in the field of nuclear and high-energy particle physics has been constructed at the Lebedev Physical Institute. PAVICOM is widely used in Russia for the treatment of experimental data from track detectors (emulsion and solid-state trackers) in high- and low-energy physics, cosmic ray physics, etc., and provides a substantial improvement in the efficiency of experimental studies. In contrast to the semi-automated microscopes widely used until now, PAVICOM is capable of performing completely automated measurements of charged particle tracks in nuclear emulsions and track detectors without laborious visual work. Track images are recorded by CCD cameras and then digitized and converted into files, so that experimental data processing is accelerated by approximately a thousand times. Completely autom...

  2. Coupling a Mesoscale Numerical Weather Prediction Model with Large-Eddy Simulation for Realistic Wind Plant Aerodynamics Simulations (Poster)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C.; Churchfield, M.; Mirocha, J.; Lee, S.; Lundquist, J.; Michalakes, J.; Moriarty, P.; Purkayastha, A.; Sprague, M.; Vanderwende, B.

    2014-06-01

    Wind plant aerodynamics are influenced by a combination of microscale and mesoscale phenomena. Incorporating mesoscale atmospheric forcing (e.g., diurnal cycles and frontal passages) into wind plant simulations can lead to a more accurate representation of microscale flows, aerodynamics, and wind turbine/plant performance. Our goal is to couple a numerical weather prediction model that can represent mesoscale flow [specifically the Weather Research and Forecasting model] with a microscale LES model (OpenFOAM) that can predict microscale turbulence and wake losses.

  3. Accuracy of estimation of graft size for living-related liver transplantation: first results of a semi-automated interactive software for CT-volumetry.

    Directory of Open Access Journals (Sweden)

    Theresa Mokry

    Full Text Available To evaluate the accuracy of estimated graft size for living-related liver transplantation using a semi-automated interactive software for CT-volumetry. Sixteen donors for living-related liver transplantation (11 male; mean age: 38.2±9.6 years) underwent contrast-enhanced CT prior to graft removal. CT-volumetry was performed using a semi-automated interactive software (P) and compared with a manual commercial software (TR). For P, liver volumes were provided either with or without vessels; for TR, liver volumes were always provided with vessels. Intraoperative weight served as the reference standard. Major study goals included analyses of volumes using absolute numbers, linear regression analyses and inter-observer agreement. Minor study goals included the description of the software workflow: degree of manual correction, speed of completion, and overall intuitiveness, rated on five-point Likert scales: 1--markedly lower/faster/higher for P compared with TR, 2--slightly lower/faster/higher for P compared with TR, 3--identical for P and TR, 4--slightly lower/faster/higher for TR compared with P, and 5--markedly lower/faster/higher for TR compared with P. Liver segments II/III, II-IV and V-VIII served as the transplanted liver segments in 6, 3, and 7 donors, respectively. Volumes were 642.9±368.8 ml for TR with vessels, 623.8±349.1 ml for P with vessels, and 605.2±345.8 ml for P without vessels (P<0.01). Regression equations between intraoperative weights and volumes were y = 0.94x+30.1 (R² = 0.92; P<0.001) for TR with vessels, y = 1.00x+12.0 (R² = 0.92; P<0.001) for P with vessels, and y = 1.01x+28.0 (R² = 0.92; P<0.001) for P without vessels. Inter-observer agreement showed a bias of 1.8 ml for TR with vessels, 5.4 ml for P with vessels, and 4.6 ml for P without vessels. For the degree of manual correction, speed of completion and overall intuitiveness, scale values were 2.6±0.8, 2.4±0.5 and 2. CT-volumetry performed with P can accurately predict graft

  4. Meso-scale modelling of the heat conductivity effect on the shock response of a porous material

    Science.gov (United States)

    Resnyansky, A. D.

    2017-06-01

    Understanding the deformation mechanisms of porous materials under shock compression is important for tailoring material properties during the shock manufacturing of advanced materials from substrate powders and for studying the response of porous materials under shock loading. The numerical set-up of the present work considers a set of solid particles separated by air, representing a volume of porous material. The condensed material in the meso-scale set-up is simulated with a viscoelastic, rate-sensitive material model with heat conduction, formulated from the principles of irreversible thermodynamics. The model is implemented in the CTH shock physics code. The meso-scale CTH simulation of the shock loading of the representative volume reveals the mechanism of pore collapse and shows in detail the transition from a high-porosity case, typical of an abnormal Hugoniot response, to a moderate-porosity case, typical of a conventional Hugoniot response. The results of the analysis agree with previous analytical considerations and support hypotheses used in the two-phase approach.

  5. Evaluation of an Automated Analysis Tool for Prostate Cancer Prediction Using Multiparametric Magnetic Resonance Imaging.

    Directory of Open Access Journals (Sweden)

    Matthias C Roethke

    Full Text Available To evaluate the diagnostic performance of an automated analysis tool for the assessment of prostate cancer based on multiparametric magnetic resonance imaging (mpMRI) of the prostate. A fully automated analysis tool was used for a retrospective analysis of mpMRI sets (T2-weighted, T1-weighted dynamic contrast-enhanced, and diffusion-weighted sequences). The software provided a malignancy prediction value for each image pixel, defined as the Malignancy Attention Index (MAI), which can be depicted as a colour map overlay on the original images. The malignancy maps were compared to histopathology derived from a combination of MRI-targeted and systematic transperineal MRI/TRUS-fusion biopsies. In total, mpMRI data of 45 patients were evaluated. With a sensitivity of 85.7% (95% CI, 65.4-95.0), a specificity of 87.5% (95% CI, 69.0-95.7) and a diagnostic accuracy of 86.7% (95% CI, 73.8-93.8) for detection of prostate cancer, the automated analysis results corresponded well with the diagnostic accuracies reported for human readers based on the PI-RADS system in the current literature. The study revealed comparable diagnostic accuracies for the detection of prostate cancer between a user-independent MAI-based automated analysis tool and PI-RADS-scoring-based human reader analysis of mpMRI. Thus, the analysis tool could serve as a detection support system for less experienced readers. The results of the study also suggest the potential of MAI-based analysis for advanced lesion assessments, such as cancer extent and staging prediction.
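
    The reported percentages and intervals are consistent with Wilson score confidence limits on the underlying 2x2 counts; the sketch below recomputes them from counts inferred from the abstract (18/21 cancer-positive and 21/24 cancer-negative patients correctly classified out of 45 -- an assumption, not stated explicitly in the record).

```python
import numpy as np
from scipy.stats import norm

def wilson_ci(successes, n, alpha=0.05):
    """Wilson score interval for a binomial proportion."""
    z = norm.ppf(1 - alpha / 2)
    p = successes / n
    denom = 1 + z ** 2 / n
    centre = (p + z ** 2 / (2 * n)) / denom
    half = z * np.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
    return centre - half, centre + half

tp, fn = 18, 3      # inferred counts (assumption)
tn, fp = 21, 3

for name, k, n in [("sensitivity", tp, tp + fn),
                   ("specificity", tn, tn + fp),
                   ("accuracy",    tp + tn, tp + fn + tn + fp)]:
    lo, hi = wilson_ci(k, n)
    print(f"{name:11s}: {100 * k / n:5.1f}%  (95% CI {100 * lo:.1f}-{100 * hi:.1f})")
```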

  6. A three-dimensional meso-scale modeling for helium bubble growth in metals

    International Nuclear Information System (INIS)

    Suzudo, T.; Kaburaki, H.; Wakai, E.

    2007-01-01

    A three-dimensional meso-scale computer model using a Monte-Carlo simulation method has been proposed to simulate helium bubble growth in metals. The primary merit of this model is that it enables a visual comparison between the microstructures observed by TEM imaging and those obtained by calculation. The modeling is simple enough that the calculation can easily be controlled by tuning parameters. The simulation results are confirmed by the ideal gas law and the capillary relation. Keywords: helium bubble growth, meso-scale modeling, Monte-Carlo simulation, ideal gas law, capillary relation. (authors)
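
    The two consistency checks mentioned -- the ideal gas law and the capillary relation -- combine into a compact estimate of the equilibrium helium content of a bubble: p = 2γ/r and pV = N·k_B·T. The sketch below uses illustrative values for surface tension and temperature (assumptions, not parameters from the paper) and stays in the ideal-gas limit throughout.

```python
import numpy as np

K_B = 1.380649e-23     # Boltzmann constant, J/K
GAMMA = 1.7            # surface tension, J/m^2 (illustrative value for a metal)
TEMP = 900.0           # temperature, K (illustrative)

def helium_bubble(radius_m):
    """Capillary relation p = 2*gamma/r combined with the ideal gas law
    p*V = N*k_B*T (ideal-gas limit only); returns (pressure, atom count)."""
    pressure = 2.0 * GAMMA / radius_m
    volume = 4.0 / 3.0 * np.pi * radius_m ** 3
    return pressure, pressure * volume / (K_B * TEMP)

for r_nm in (1.0, 2.0, 5.0, 10.0):
    p, n = helium_bubble(r_nm * 1e-9)
    print(f"r = {r_nm:4.1f} nm: p = {p / 1e9:5.2f} GPa, N = {n:10,.0f} He atoms")
```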

  7. ASSESSING THE AGREEMENT BETWEEN EO-BASED SEMI-AUTOMATED LANDSLIDE MAPS WITH FUZZY MANUAL LANDSLIDE DELINEATION

    Directory of Open Access Journals (Sweden)

    F. Albrecht

    2017-09-01

    Full Text Available Landslide mapping benefits from the ever-increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and from improved infrastructure for data access. However, there is a need for improved automated landslide information extraction from EO data, while the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding of how they affect the levels of agreement in a direct comparison of OBIA-derived landslide maps and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach for high-resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.

  8. Fluorescence In Situ Hybridization (FISH Signal Analysis Using Automated Generated Projection Images

    Directory of Open Access Journals (Sweden)

    Xingwei Wang

    2012-01-01

    Full Text Available Fluorescence in situ hybridization (FISH) tests provide promising molecular imaging biomarkers to more accurately and reliably detect and diagnose cancers and genetic disorders. Since current manual FISH signal analysis is inefficient and inconsistent, which limits its clinical utility, developing automated FISH image scanning systems and computer-aided detection (CAD) schemes has been attracting research interest. To acquire high-resolution FISH images in a multi-spectral scanning mode, a huge amount of image data, consisting of stacks of multiple three-dimensional (3-D) image slices, is generated from a single specimen. Automated preprocessing of these scanned images to eliminate non-useful and redundant data is important to make automated FISH tests acceptable in clinical applications. In this study, a dual-detector fluorescence image scanning system was applied to scan four specimen slides with FISH-probed chromosome X. A CAD scheme was developed to detect analyzable interphase cells and to map the FISH-probed signals recorded in the multiple imaging slices into 2-D projection images. The CAD scheme was then applied to each projection image to detect analyzable interphase cells using an adaptive multiple-threshold algorithm, identify FISH-probed signals using a top-hat transform, and compute the ratios between normal and abnormal cells. To assess CAD performance, the FISH-probed signals were also independently detected visually by an observer. The Kappa coefficients for agreement between CAD and the observer ranged from 0.69 to 1.0 in detecting/counting FISH signal spots in the four testing samples. The study demonstrated the feasibility of automated FISH signal analysis by applying a CAD scheme to the automatically generated 2-D projection images.
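
    A minimal, hedged sketch of the two preprocessing steps described -- collapsing the 3-D slice stack into a 2-D maximum-intensity projection and enhancing FISH-like spots with a top-hat transform before thresholding and counting -- using scikit-image on synthetic data; it is not the authors' CAD scheme, and the spot sizes and thresholds are assumptions.

```python
import numpy as np
from scipy import ndimage
from skimage.morphology import disk, white_tophat

rng = np.random.default_rng(3)

# Synthetic multi-slice FISH channel: 9 z-slices, 256x256 pixels, with a few
# bright spots spread over different slices plus background noise.
stack = rng.normal(20.0, 3.0, size=(9, 256, 256))
for z, r, c in [(1, 60, 70), (4, 130, 200), (7, 180, 40), (5, 90, 150)]:
    stack[z, r - 2:r + 3, c - 2:c + 3] += 120.0

# Step 1: collapse the 3-D stack into a 2-D maximum-intensity projection.
projection = stack.max(axis=0)

# Step 2: top-hat transform keeps small bright structures (FISH-like spots)
# and suppresses the slowly varying background; then threshold and count.
spots = white_tophat(projection, footprint=disk(5))
labels, n_spots = ndimage.label(spots > 60.0)

print(f"detected FISH-like spots: {n_spots}")
```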

  9. Automated retroillumination photography analysis for objective assessment of Fuchs Corneal Dystrophy severity

    Science.gov (United States)

    Eghrari, Allen O.; Mumtaz, Aisha A.; Garrett, Brian; Rezaei, Mahsa; Akhavan, Mina S.; Riazuddin, S. Amer; Gottsch, John D.

    2016-01-01

    Purpose Retroillumination photography analysis (RPA) is an objective tool for assessment of the number and distribution of guttae in eyes affected with Fuchs Corneal Dystrophy (FCD). Current protocols include manual processing of images; here we assess the validity and interrater reliability of automated analysis across various levels of FCD severity. Methods Retroillumination photographs of 97 FCD-affected corneas were acquired, and total counts of guttae had previously been summed manually. For each cornea, a single image was loaded into ImageJ software. We reduced color variability and subtracted background noise. The reflection of light from each gutta was identified as a local area of maximum intensity and counted automatically. The noise tolerance level was titrated for each cornea by examining a small region of each image with an automated overlay to ensure appropriate coverage of individual guttae. We tested the interrater reliability of automated counts of guttae across a spectrum of clinical and educational experience. Results A set of 97 retroillumination photographs was analyzed. Clinical severity as measured by a modified Krachmer scale ranged from severity level 1 to 5 in the analyzed corneas. Automated counts by an ophthalmologist correlated strongly with Krachmer grading (R2=0.79) and manual counts (R2=0.88). The intraclass correlation coefficient demonstrated strong correlation, at 0.924 (95% CI, 0.870-0.958) among cases analyzed by three students, and 0.869 (95% CI, 0.797-0.918) among cases for which images were analyzed by an ophthalmologist and two students. Conclusions Automated RPA allows for grading of FCD severity with high resolution across a spectrum of disease severity. PMID:27811565
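
    A hedged sketch of the counting principle described above -- subtract the slowly varying background, then count local intensity maxima whose height exceeds a noise-tolerance threshold -- applied to a synthetic retroillumination-like image; it is analogous to, but not a reproduction of, the ImageJ-based procedure, and all parameter values are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.feature import peak_local_max

rng = np.random.default_rng(11)

# Synthetic retroillumination image: smooth illumination gradient plus many
# small bright reflections standing in for guttae.
img = np.fromfunction(lambda r, c: 80 + 0.05 * r + 0.02 * c, (512, 512))
coords_true = rng.integers(20, 492, size=(150, 2))
for r, c in coords_true:
    img[r - 1:r + 2, c - 1:c + 2] += 40.0
img += rng.normal(0.0, 2.0, img.shape)

# Subtract the slowly varying background, then count local maxima whose
# height above background exceeds a "noise tolerance" threshold.
background = gaussian_filter(img, sigma=25)
detail = img - background
peaks = peak_local_max(detail, min_distance=3, threshold_abs=15.0)

print(f"guttae placed: {len(coords_true)}, guttae counted: {len(peaks)}")
```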

  10. Mesoscale eddies in the Subantarctic Front-Southwest Atlantic

    Directory of Open Access Journals (Sweden)

    Pablo D. Glorioso

    2005-12-01

    Full Text Available Satellite and ship observations in the southern southwest Atlantic (SSWA) reveal an intense eddy field and highlight the potential for using continuous real-time satellite altimetry to detect and monitor mesoscale phenomena with a view to understanding the regional circulation. The examples presented suggest that mesoscale eddies are a dominant feature of the circulation and play a fundamental role in the transport of properties along and across the Antarctic Circumpolar Current (ACC). The main ocean current in the SSWA, the Falkland-Malvinas Current (FMC), exhibits numerous embedded eddies south of 50°S which may contribute to the patchiness, transport and mixing of passive scalars by this strong, turbulent current. Large eddies associated with meanders are observed in the ACC fronts, some of them remaining stationary for long periods. Two particular cases are examined using a satellite altimeter in combination with in situ observations, suggesting that cross-frontal eddy transport and strong meandering occur where the ACC flow intensifies along the sub-Antarctic Front (SAF) and the Southern ACC Front (SACCF).

  11. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology.

    Science.gov (United States)

    Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S

    2015-10-01

    The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  12. Comparison of Four Mixed Layer Mesoscale Parameterizations and the Equation for an Arbitrary Tracer

    Science.gov (United States)

    Canuto, V. M.; Dubovikov, M. S.

    2011-01-01

    In this paper we discuss two issues: the inter-comparison of four mixed layer mesoscale parameterizations and the search for the eddy-induced velocity for an arbitrary tracer. It must be stressed that our analysis is limited to mixed layer mesoscales, since we do not treat sub-mesoscales and small-scale turbulent mixing. As for the first item, since three of the four parameterizations are expressed in terms of a stream function and a residual flux of the RMT (residual mean theory) formalism, while the fourth is expressed in terms of vertical and horizontal fluxes, we needed a formalism to connect the two formulations. The standard RMT representation developed for the deep ocean cannot be extended to the mixed layer, since its stream function does not vanish at the ocean's surface. We develop a new RMT representation that satisfies the surface boundary condition. As for the general form of the eddy-induced velocity for an arbitrary tracer, thus far it has been assumed that there is only the one that originates from the curl of the stream function, because the tracer residual flux was assumed to be purely diffusive. We show, however, that in the case of an arbitrary tracer the residual flux also has a skew component that gives rise to an additional bolus velocity. Therefore, instead of only one bolus velocity, there are now two: one coming from the curl of the stream function and the other from the skew part of the residual flux. In the buoyancy case, only one bolus velocity contributes to the mean buoyancy equation, since the residual flux is indeed only diffusive.
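
    In generic RMT notation (an assumption; the paper's own symbols are not reproduced here), the relations behind this discussion can be written compactly: the familiar bolus velocity is the curl of a vector stream function, while a skew component of the residual tracer flux acts as advection by a second bolus-type velocity.

```latex
% Bolus velocity from the vector stream function \Psi, and decomposition of
% the residual tracer flux into a diffusive part (diffusivity tensor K) and a
% skew part (skew vector S) acting on the mean tracer \overline{c}:
\mathbf{u}^{*} = \nabla \times \boldsymbol{\Psi}, \qquad
\mathbf{F}_{R} = -\,\mathsf{K}\,\nabla\overline{c} \;+\; \mathbf{S}\times\nabla\overline{c},
\qquad
\nabla\cdot\bigl(\mathbf{S}\times\nabla\overline{c}\bigr)
  = \bigl(\nabla\times\mathbf{S}\bigr)\cdot\nabla\overline{c} .
```

    The last identity shows that the skew part of the residual flux is equivalent to advection of the mean tracer by the additional velocity ∇ × S, i.e. the second bolus velocity referred to above.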

  13. AMDA: an R package for the automated microarray data analysis

    Directory of Open Access Journals (Sweden)

    Foti Maria

    2006-07-01

    Full Text Available Background Microarrays are routinely used to assess mRNA transcript levels on a genome-wide scale. Large amounts of microarray data are now available in several databases, and new experiments are constantly being performed. In spite of this fact, few and limited tools exist for quickly and easily analyzing the results. Microarray analysis can be challenging for researchers without the necessary training, and it can be time-consuming for service providers with many users. Results To address these problems we have developed an automated microarray data analysis (AMDA) software, which provides scientists with an easy and integrated system for the analysis of Affymetrix microarray experiments. AMDA is free and is available as an R package. It is based on the Bioconductor project, which provides a number of powerful bioinformatics and microarray analysis tools. This automated pipeline integrates different functions available in the R and Bioconductor projects with newly developed functions. AMDA covers all of the steps, performing a full data analysis, including image analysis, quality controls, normalization, selection of differentially expressed genes, clustering, correspondence analysis and functional evaluation. Finally, a LaTeX document is dynamically generated depending on the performed analysis steps. The generated report contains comments and analysis results as well as references to several files for a deeper investigation. Conclusion AMDA is freely available as an R package under the GPL license. The package as well as an example analysis report can be downloaded in the Services/Bioinformatics section of the Genopolis website: http://www.genopolis.it/

  14. Automated magnification calibration in transmission electron microscopy using Fourier analysis of replica images

    International Nuclear Information System (INIS)

    Laak, Jeroen A.W.M. van der; Dijkman, Henry B.P.M.; Pahlplatz, Martin M.M.

    2006-01-01

    The magnification factor in transmission electron microscopy is not very precise, hampering for instance quantitative analysis of specimens. Calibration of the magnification is usually performed interactively using replica specimens, containing line or grating patterns with known spacing. In the present study, a procedure is described for automated magnification calibration using digital images of a line replica. This procedure is based on analysis of the power spectrum of Fourier transformed replica images, and is compared to interactive measurement in the same images. Images were used with magnification ranging from 1,000x to 200,000x. The automated procedure deviated on average 0.10% from interactive measurements. Especially for catalase replicas, the coefficient of variation of automated measurement was considerably smaller (average 0.28%) compared to that of interactive measurement (average 3.5%). In conclusion, calibration of the magnification in digital images from transmission electron microscopy may be performed automatically, using the procedure presented here, with high precision and accuracy
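
    A hedged sketch of the power-spectrum idea: Fourier-transform a (here synthetic) line-replica image, locate the dominant spatial-frequency peak, and convert the measured line period in pixels into a nm-per-pixel calibration using the known replica spacing. The 2160 lines/mm grating value is an illustrative assumption, not a figure taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic replica image: vertical lines with a known period of 24.0 pixels,
# plus noise. The true replica spacing is taken as ~462.9 nm (2160 lines/mm,
# a common calibration grating -- an assumption).
period_px_true = 24.0
spacing_nm = 1e6 / 2160.0
x = np.arange(1024)
row = 100 + 40 * np.sin(2 * np.pi * x / period_px_true)
image = np.tile(row, (512, 1)) + rng.normal(0, 5, (512, 1024))

# Average the power spectrum of each row and locate the dominant peak,
# excluding the zero-frequency (DC) term.
spectrum = np.abs(np.fft.rfft(image, axis=1)) ** 2
mean_spectrum = spectrum.mean(axis=0)
freqs = np.fft.rfftfreq(image.shape[1])          # cycles per pixel
peak = 1 + int(np.argmax(mean_spectrum[1:]))     # skip DC component

period_px = 1.0 / freqs[peak]
nm_per_pixel = spacing_nm / period_px
print(f"measured line period : {period_px:.2f} px (true {period_px_true})")
print(f"calibration          : {nm_per_pixel:.2f} nm/pixel")
```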

  15. Semi-automatic geographic atrophy segmentation for SD-OCT images

    OpenAIRE

    Chen, Qiang; de Sisternes, Luis; Leng, Theodore; Zheng, Luoluo; Kutzscher, Lauren; Rubin, Daniel L.

    2013-01-01

    Geographic atrophy (GA) is a condition that is associated with retinal thinning and loss of the retinal pigment epithelium (RPE) layer. It appears in advanced stages of non-exudative age-related macular degeneration (AMD) and can lead to vision loss. We present a semi-automated GA segmentation algorithm for spectral-domain optical coherence tomography (SD-OCT) images. The method first identifies and segments a surface between the RPE and the choroid to generate retinal projection images in wh...

  16. Preliminary analysis of four numerical models for calculating the mesoscale transport of Kr-85

    Energy Technology Data Exchange (ETDEWEB)

    Pepper, D W; Cooper, R E [Du Pont de Nemours (E.I.) and Co., Aiken, SC (USA). Savannah River Lab.

    1983-01-01

    A performance study of four numerical algorithms for multi-dimensional advection-diffusion prediction on mesoscale grids has been made. Dispersion from point and distributed sources and a simulation of a continuous source are compared with analytical solutions to assess relative accuracy. Model predictions are then compared with actual measurements of Kr-85 emitted from the Savannah River Plant (SRP). The particle-in-cell and method of moments algorithms exhibit superior accuracy in modeling single source releases. For modeling distributed sources, algorithms based on the pseudospectral and finite element interpolation concepts exhibit comparable accuracy. The method of moments is felt to be the best overall performer, although all the models appear to be relatively close in accuracy.
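
    All four schemes discretize the same governing equation, written here in generic notation (the symbols are an assumption, not taken from the report): the advection-diffusion equation for the mean concentration c with wind field u, eddy-diffusivity tensor K and source term S.

```latex
\frac{\partial c}{\partial t} + \nabla\cdot(\mathbf{u}\,c)
  \;=\; \nabla\cdot\bigl(\mathsf{K}\,\nabla c\bigr) + S .
```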

  17. Mesoscale mixing of the Denmark Strait Overflow in the Irminger Basin

    Science.gov (United States)

    Koszalka, Inga M.; Haine, Thomas W. N.; Magaldi, Marcello G.

    2017-04-01

    The Denmark Strait Overflow (DSO) is a major export route for dense waters from the Nordic Seas, forming the lower limb of the Atlantic Meridional Overturning Circulation, an important element of the climate system. Mixing processes along the DSO pathway influence its volume transport and properties, contributing to the variability of the deep overturning circulation. They are poorly sampled by observations, however, which hinders the development of a proper DSO representation in global circulation models. We employ a high-resolution regional ocean model of the Irminger Basin to quantify the impact of mesoscale flows on DSO mixing, focusing on the geographical localization and the time modulation of water property changes. The model reproduces the observed bulk warming of the DSO plume 100-200 km downstream of the Denmark Strait sill. It also reveals that mesoscale variability of the overflow ('DSO-eddies', of 20-30 km extent and a time scale of 2-5 days) modulates water property changes and turbulent mixing, diagnosed with the vertical shear of horizontal velocity and the eddy heat flux divergence. The space-time localization of DSO mixing and warming and the role of coherent mesoscale structures should be explored by turbulence measurements and factored into coarse circulation models.

  18. Automated microfluidic devices integrating solid-phase extraction, fluorescent labeling, and microchip electrophoresis for preterm birth biomarker analysis.

    Science.gov (United States)

    Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T

    2018-01-01

    We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.

  19. Evaluation of meteorological fields generated by a prognostic mesoscale model using data collected during the 1993 GMAQS/COAST field study

    International Nuclear Information System (INIS)

    Lolk, N.K.; Douglas, S.G.

    1996-01-01

    In 1993, the US Interior Department's Minerals Management Service (MMS) sponsored the Gulf of Mexico Air Quality Study (GMAQS). Its purpose was to assess potential impacts of offshore petrochemical development on ozone concentrations in nonattainment areas of the Texas/Louisiana Gulf Coast region, as mandated by the 1990 Clean Air Act Amendments. The GMAQS comprised data collection, data analysis, and applications of an advanced photochemical air quality model, the variable-grid Urban Airshed Model (UAM-V), and a prognostic mesoscale meteorological model (SAIMM -- Systems Applications International Mesoscale Model) to simulate two ozone episodes that were captured during the summer field study. The primary purpose of this paper is to evaluate the SAIMM-simulated meteorological fields using graphical analyses that utilize the comprehensive GMAQS/COAST (Gulf of Mexico Air Quality Study/Coastal Oxidant Assessment for Southeast Texas) database, and to demonstrate the ability of the SAIMM to simulate the day-to-day variations in the evolution and structure of the gulf breeze and the mixed layer

  20. Automated reticle inspection data analysis for wafer fabs

    Science.gov (United States)

    Summers, Derek; Chen, Gong; Reese, Bryan; Hutchinson, Trent; Liesching, Marcus; Ying, Hai; Dover, Russell

    2009-04-01

    To minimize potential wafer yield loss due to mask defects, most wafer fabs implement some form of reticle inspection system to monitor photomask quality in high-volume wafer manufacturing environments. Traditionally, experienced operators review reticle defects found by an inspection tool and then manually classify each defect as 'pass, warn, or fail' based on its size and location. However, in the event that reticle defects are suspected of causing repeating wafer defects on a completed wafer, potential defects on all associated reticles must be manually searched on a layer-by-layer basis in an effort to identify the reticle responsible for the wafer yield loss. This 'problem reticle' search process is a very tedious and time-consuming task and may cause extended manufacturing line-down situations. Oftentimes, process engineers and other team members need to manually investigate several reticle inspection reports to determine whether yield loss can be tied to a specific layer. Because of the very nature of this detailed work, calculation errors may occur, resulting in an incorrect root cause analysis effort. These delays waste valuable resources that could be spent on other more productive activities. This paper examines an automated software solution for converting KLA-Tencor reticle inspection defect maps into a format compatible with KLA-Tencor's Klarity Defect(R) data analysis database. The objective is to use the graphical charting capabilities of Klarity Defect to reveal a clearer understanding of defect trends for individual reticle layers or entire mask sets. Automated analysis features include reticle defect count trend analysis and potentially stacking reticle defect maps for signature analysis against wafer inspection defect data. Other possible benefits include optimizing reticle inspection sample plans in an effort to support "lean manufacturing" initiatives for wafer fabs.