WorldWideScience

Sample records for semi-automated mesoscale analysis

  1. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of the retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for determining the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded with a 3D OCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement, and calculation of central retinal vessel equivalents. The mean vessel recognition rate was 78%, the vessel class designation rate 75%, and reproducibility between 0.78 and 0.91. The mean A/V ratio was 0.84. Application to a healthy norm cohort showed high congruence with previously published manual methods. Processing time per image was one minute. Quantitative geometric assessment of the retinal vasculature may thus be performed in a semi-automated manner using dedicated software tools. By yielding reproducible numerical data within a short processing time, this may add value to purely morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
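
    The abstract does not say which vessel-equivalent formula the tool implements; a common published choice is Knudtson's revised pairing procedure, which iteratively combines the six widest arterioles and venules with coefficients 0.88 and 0.95, respectively. A minimal sketch under that assumption, with invented widths:

```python
import math

def vessel_equivalent(widths, k):
    """Iteratively pair the narrowest with the widest vessel and combine
    their calibres until one trunk equivalent remains (Knudtson-style)."""
    w = sorted(widths)
    while len(w) > 1:
        mid = w.pop(len(w) // 2) if len(w) % 2 == 1 else None  # carry the median
        paired = [k * math.hypot(w[i], w[-(i + 1)]) for i in range(len(w) // 2)]
        w = sorted(paired + ([mid] if mid is not None else []))
    return w[0]

# Illustrative widths (micrometres) of the six widest arterioles and venules
arterioles = [72.0, 65.5, 61.2, 58.9, 55.0, 50.3]
venules = [88.1, 80.4, 77.7, 70.2, 66.8, 60.5]

crae = vessel_equivalent(arterioles, k=0.88)  # central retinal artery equivalent
crve = vessel_equivalent(venules, k=0.95)     # central retinal vein equivalent
print(f"CRAE={crae:.1f}, CRVE={crve:.1f}, A/V ratio={crae / crve:.2f}")
```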

  2. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional ³³P-based detection … of semi-automated codominant analysis for hemizygous AFLP markers in an F2 population was too low, suggesting the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping …

  3. Semi-automated digital image analysis of patellofemoral joint space width from lateral knee radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Grochowski, S.J. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Amrami, K.K. [Mayo Clinic, Department of Radiology, Rochester (United States); Kaufman, K. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Mayo Clinic/Foundation, Biomechanics Laboratory, Department of Orthopedic Surgery, Charlton North L-110L, Rochester (United States)

    2005-10-01

    To design a semi-automated program to measure minimum patellofemoral joint space width (JSW) using standing lateral view radiographs. Lateral patellofemoral knee radiographs were obtained from 35 asymptomatic subjects. The radiographs were analyzed to report both the repeatability of the image analysis program and the reproducibility of JSW measurements within a 2-week period. The results were also compared with manual measurements made by an experienced musculoskeletal radiologist. The image analysis program showed an excellent coefficient of repeatability of 0.18 and 0.23 mm for intra- and inter-observer measurements, respectively. The manual method measured a greater minimum JSW than the automated method. Reproducibility between days was comparable to other published results, but was less satisfactory for both manual and semi-automated measurements. The image analysis program had an inter-day coefficient of repeatability of 1.24 mm, lower than the 1.66 mm of the manual method. A repeatable semi-automated method for measuring the patellofemoral JSW from radiographs has thus been developed. The method is more accurate than manual measurement. However, the between-day variability is higher than the intra-day variability. Further investigation of the protocol for obtaining sequential lateral knee radiographs is needed in order to reduce the between-day variability. (orig.)
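
    The coefficient of repeatability quoted here is the Bland-Altman statistic: 1.96 times the standard deviation of the differences between paired repeat measurements. A minimal sketch with invented JSW values:

```python
import numpy as np

# Repeated minimum JSW measurements (mm) per knee -- invented values
visit1 = np.array([4.1, 3.8, 5.0, 4.6, 3.2, 4.9, 4.4, 3.7])
visit2 = np.array([4.3, 3.7, 4.8, 4.7, 3.1, 5.1, 4.2, 3.9])

diff = visit1 - visit2
bias = diff.mean()              # systematic offset between sessions
cr = 1.96 * diff.std(ddof=1)    # coefficient of repeatability
print(f"bias={bias:.3f} mm, CR={cr:.2f} mm, "
      f"95% limits of agreement=({bias - cr:.2f}, {bias + cr:.2f}) mm")
```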

  4. Semi-automated analysis of three-dimensional track images

    International Nuclear Information System (INIS)

    Meesen, G.; Poffijn, A.

    2001-01-01

    In the past, three-dimensional (3-d) track images in solid state detectors were difficult to obtain. With the introduction of the confocal scanning laser microscope it is now possible to record 3-d track images in a non-destructive way. These 3-d track images can later be used to measure typical track parameters. Preparing the detectors and recording the 3-d images, however, is only the first step. The second step in the process is enhancing the image quality by means of deconvolution techniques to obtain the maximum possible resolution. The third step is extracting the typical track parameters. This can be done on-screen by an experienced operator, but for large sets of data this manual technique is not desirable. This paper presents some techniques to analyse 3-d track data in an automated way by means of image analysis routines. Advanced thresholding techniques guarantee stable results in different recording situations. By using prior knowledge about the track shape, reliable object identification is obtained. In cases of ambiguity, manual intervention is possible.
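
    As a rough illustration of the thresholding and object-identification steps: the paper's own adaptive thresholding and track-shape priors are not spelled out in this excerpt, so plain Otsu thresholding and connected-component labelling stand in for them here, applied to a synthetic confocal stack.

```python
import numpy as np
from scipy import ndimage
from skimage.filters import threshold_otsu

# Synthetic 3-d confocal stack: background noise plus one bright "track"
rng = np.random.default_rng(0)
stack = rng.normal(10, 2, size=(40, 128, 128))
stack[18:26, 60:70, 60:70] += 40          # synthetic etch-pit signal

t = threshold_otsu(stack)                 # simple global threshold stand-in
mask = stack > t

labels, n = ndimage.label(mask)           # connected-component identification
sizes = ndimage.sum_labels(mask, labels, index=range(1, n + 1))
keep = 1 + int(np.argmax(sizes))          # retain the largest object as the track
depth = np.ptp(np.nonzero(labels == keep)[0])   # extent along the optical axis
print(f"{n} candidate objects; track depth = {depth} slices")
```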

  5. Semi-automated volumetric analysis of artificial lymph nodes in a phantom study

    International Nuclear Information System (INIS)

    Fabel, M.; Biederer, J.; Jochens, A.; Bornemann, L.; Soza, G.; Heller, M.; Bolte, H.

    2011-01-01

    Purpose: Quantification of tumour burden in oncology requires accurate and reproducible image evaluation. The current standard is one-dimensional measurement (e.g. RECIST), with inherent disadvantages. Volumetric analysis is discussed as an alternative for therapy monitoring of lung and liver metastases. The aim of this study was to investigate the accuracy of semi-automated volumetric analysis of artificial lymph node metastases in a phantom study. Materials and methods: Fifty artificial lymph nodes were produced in a size range from 10 to 55 mm, some of them enhanced using iodine contrast media. All nodules were placed in an artificial chest phantom (artiCHEST®) within different surrounding tissues. MDCT was performed using different collimations (1–5 mm) at varying reconstruction kernels (B20f, B40f, B60f). Volume and RECIST measurements were performed using Oncology Software (Siemens Healthcare, Forchheim, Germany) and compared to the reference volume and diameter by calculating absolute percentage errors. Results: The software allowed robust volumetric analysis in the phantom setting. Unsatisfactory segmentation results were frequently found for native nodules within surrounding muscle. The absolute percentage error (APE) for volumetric analysis varied between 0.01 and 225%. No significant differences were seen between the reconstruction kernels. The most unsatisfactory segmentation results occurred at the higher slice thicknesses (4 and 5 mm). Contrast-enhanced lymph nodes tended to show better segmentation results. Conclusion: The semi-automated 3D volumetric analysis software tool allows reliable and convenient segmentation of artificial lymph nodes in a phantom setting. Lymph nodes adjacent to tissue of similar density cause segmentation problems. For volumetric analysis of lymph node metastases in clinical routine, a slice thickness of ≤3 mm and a medium-soft reconstruction kernel (e.g. B40f for Siemens scan systems) may be a suitable choice.

  6. Volumetric analysis of pelvic hematomas after blunt trauma using semi-automated seeded region growing segmentation: a method validation study.

    Science.gov (United States)

    Dreizin, David; Bodanapally, Uttam K; Neerchal, Nagaraj; Tirada, Nikki; Patlas, Michael; Herskovits, Edward

    2016-11-01

    Manually segmented traumatic pelvic hematoma volumes are strongly predictive of active bleeding at conventional angiography, but the method is time-intensive, limiting its clinical applicability. We compared volumetric analysis using semi-automated region growing segmentation to manual segmentation and diameter-based size estimates in patients with pelvic hematomas after blunt pelvic trauma. A 14-patient cohort was selected in an anonymous randomized fashion from a dataset of patients with pelvic binders at MDCT, collected retrospectively as part of a HIPAA-compliant, IRB-approved study from January 2008 to December 2013. To evaluate inter-method differences, one reader (R1) performed three volume measurements using the manual technique and three volume measurements using the semi-automated technique. To evaluate interobserver differences for semi-automated segmentation, a second reader (R2) performed three semi-automated measurements. One-way analysis of variance was used to compare differences in mean volumes. Time effort was also compared. Correlation between the two methods as well as two shorthand appraisals (greatest diameter, and the ABC/2 method for estimating ellipsoid volumes) was assessed with Spearman's rho (r). Intraobserver variability was lower for semi-automated compared to manual segmentation, with standard deviations ranging between ±5-32 mL and ±17-84 mL, respectively (p = 0.0003). There was no significant difference in mean volumes between the two readers' semi-automated measurements (p = 0.83); however, means were lower for the semi-automated compared with the manual technique (manual: mean ± SD 309.6 ± 139 mL; R1 semi-auto: 229.6 ± 88.2 mL, p = 0.004; R2 semi-auto: 243.79 ± 99.7 mL, p = 0.021). Despite the differences in means, the correlation between the two methods was very strong and highly significant (r = 0.91). Semi-automated hematoma volumes correlate strongly with manually segmented volumes. Since semi-automated segmentation …
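
    A minimal sketch of the two approaches being compared: a seeded region-growing segmentation with an intensity tolerance (the clinical tool's exact homogeneity criterion is not given, so a fixed tolerance is assumed) and the ABC/2 ellipsoid shorthand, both applied to a synthetic hematoma:

```python
from collections import deque
import numpy as np

def region_grow(vol, seed, tol):
    """Grow from `seed`, accepting 6-connected voxels whose intensity lies
    within `tol` of the seed intensity (one simple homogeneity criterion)."""
    mask = np.zeros(vol.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
            p = (z + dz, y + dy, x + dx)
            if all(0 <= p[i] < vol.shape[i] for i in range(3)) \
                    and not mask[p] and abs(vol[p] - vol[seed]) <= tol:
                mask[p] = True
                queue.append(p)
    return mask

# Synthetic volume: ellipsoidal ~60 HU "hematoma" in a 20 HU background,
# with 0.5 x 0.5 x 1.0 mm voxels
zz, yy, xx = np.mgrid[:60, :80, :80]
vol = np.full((60, 80, 80), 20.0)
vol[((zz - 30) / 15.0) ** 2 + ((yy - 40) / 20.0) ** 2 + ((xx - 40) / 10.0) ** 2 <= 1] = 60.0

voxel_ml = 0.5 * 0.5 * 1.0 / 1000.0
v_grow = region_grow(vol, seed=(30, 40, 40), tol=15.0).sum() * voxel_ml

# ABC/2 shorthand: half the product of the three greatest orthogonal diameters
A, B, C = 30.0, 20.0, 10.0              # diameters in mm for this ellipsoid
v_abc2 = (A * B * C / 2.0) / 1000.0     # mm^3 -> mL
print(f"region growing: {v_grow:.2f} mL, ABC/2: {v_abc2:.2f} mL")
```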

  7. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Walbran, A.C.; Unsworth, C.P.; Gunn, A.J.; Bennet, L.

    2010-01-01

    Perinatal hypoxia plays a key role in the cause of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (first 6-8 h post hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means to identify which infants can benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data which are time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
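
    A sketch of the detection idea: compute a discretized CWT, sum the squared coefficients into a power trace, and threshold it. The wavelet, scales and threshold below are illustrative assumptions, not the authors' settings:

```python
import numpy as np
import pywt

# Synthetic EEG trace: background activity with three sharp transients
fs = 256.0
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(1)
eeg = rng.normal(0, 5, t.size)
for spike_t in (2.0, 5.5, 8.2):
    idx = int(spike_t * fs)
    eeg[idx:idx + 13] += 60 * np.hanning(13)     # ~50 ms spike-like transient

scales = np.arange(4, 32)                        # discretized scale axis
coefs, _ = pywt.cwt(eeg, scales, "morl", sampling_period=1 / fs)
power = (coefs ** 2).sum(axis=0)                 # wavelet power per sample

thresh = power.mean() + 5 * power.std()          # simple global threshold
above = power > thresh
edges = np.flatnonzero(np.diff(above.astype(int)) == 1)   # rising edges
print("detected spike times (s):", np.round(edges / fs, 2))
```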

  8. PCR evaluation : considering transition from manual to semi-automated pavement distress collection and analysis.

    Science.gov (United States)

    2013-07-01

    This study is designed to assist the Ohio Department of Transportation (ODOT) in determining whether transitioning from manual to state-of-the-practice semi-automated pavement distress data collection is feasible and recommended. Statistical and ...

  9. Semi-automated vectorial analysis of anorectal motion by magnetic resonance defecography in healthy subjects and fecal incontinence.

    Science.gov (United States)

    Noelting, J; Bharucha, A E; Lake, D S; Manduca, A; Fletcher, J G; Riederer, S J; Joseph Melton, L; Zinsmeister, A R

    2012-10-01

    Inter-observer variability limits the reproducibility of pelvic floor motion measured by magnetic resonance imaging (MRI). Our aim was to develop a semi-automated program measuring pelvic floor motion in a reproducible and refined manner. Pelvic floor anatomy and motion during voluntary contraction (squeeze) and rectal evacuation were assessed by MRI in 64 women with fecal incontinence (FI) and 64 age-matched controls. A radiologist measured anorectal angles and anorectal junction motion. A semi-automated program did the same and also dissected anorectal motion into perpendicular vectors representing the puborectalis and other pelvic floor muscles, assessed the pubococcygeal angle, and evaluated pelvic rotation. Manual and semi-automated measurements of anorectal junction motion correlated strongly (r = 0.70) in women with FI and controls. This semi-automated program provides a reproducible, efficient, and refined analysis of pelvic floor motion by MRI. Puborectalis injury is independently associated with impaired motion of the puborectalis, but not of other pelvic floor muscles, in controls and women with FI. © 2012 Blackwell Publishing Ltd.

  10. Semi-Automated Digital Image Analysis of Pick's Disease and TDP-43 Proteinopathy.

    Science.gov (United States)

    Irwin, David J; Byrne, Matthew D; McMillan, Corey T; Cooper, Felicia; Arnold, Steven E; Lee, Edward B; Van Deerlin, Vivianna M; Xie, Sharon X; Lee, Virginia M-Y; Grossman, Murray; Trojanowski, John Q

    2016-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data are scant in frontotemporal lobar degeneration (FTLD), which poses an added challenge for study due to its morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including Pick's disease (PiD, n=11), with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions by immunohistochemistry: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG). We used a color deconvolution process to isolate signal from the chromogen and applied both object-detection and intensity-thresholding algorithms to quantify pathological burden. We found that object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators, and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting that digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. © The Author(s) 2015.
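
    A compact illustration of the two-step pipeline (colour deconvolution, then object detection), using scikit-image's built-in H&E-DAB stain separation; the thresholds and the toy image are assumptions for demonstration only:

```python
import numpy as np
from skimage import color, filters, measure, morphology

def inclusion_burden(rgb, min_area=30):
    """Separate the DAB chromogen with colour deconvolution, threshold it,
    and report object count and %-area covered (object-detection style)."""
    dab = color.rgb2hed(rgb)[..., 2]             # H&E-DAB stain separation
    mask = dab > filters.threshold_otsu(dab)     # intensity thresholding
    mask = morphology.remove_small_objects(mask, min_size=min_area)
    labels = measure.label(mask)
    return labels.max(), 100.0 * mask.mean()

# Toy brown-on-pink image standing in for a DAB-stained section
img = np.full((200, 200, 3), [0.9, 0.75, 0.8])
img[40:60, 40:60] = [0.36, 0.22, 0.08]           # mock DAB-positive inclusions
img[120:140, 150:170] = [0.36, 0.22, 0.08]
count, pct = inclusion_burden(img)
print(f"{count} objects, {pct:.1f}% area")
```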

  11. Chemical composition dispersion in bi-metallic nanoparticles: semi-automated analysis using HAADF-STEM

    International Nuclear Information System (INIS)

    Epicier, T.; Sato, K.; Tournus, F.; Konno, T.

    2012-01-01

    We present a method using high-angle annular dark field scanning transmission electron microscopy (HAADF-STEM) to determine the chemical composition of bi-metallic nanoparticles. The method, which can be applied in a semi-automated way, allows large-scale analysis of a statistically meaningful number of particles (several hundred) in a short time. Once a calibration curve has been obtained, e.g. from energy-dispersive X-ray spectroscopy (EDX) measurements on a few particles, the integrated HAADF intensity of each particle can be directly related to its chemical composition. After a theoretical description, the approach is applied to iron–palladium nanoparticles (expected to be nearly stoichiometric) with a mean size of 8.3 nm. An accurate chemical composition histogram is obtained: the Fe content was determined to be 49.0 at.% with a dispersion of 10.4%. HAADF-STEM analysis represents a powerful alternative to tedious single-particle EDX measurements for determining the compositional dispersion in alloy nanoparticles.
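
    The core of the method is a linear calibration from HAADF integrated intensity to composition, anchored by EDX on a few particles and then applied to the whole population. All numbers below are invented for illustration:

```python
import numpy as np

# EDX-measured Fe content (at.%) for a calibration subset of particles,
# with their size-normalised HAADF integrated intensities -- invented values
edx_fe = np.array([42.0, 47.5, 49.0, 51.2, 55.8])
haadf_cal = np.array([1.32, 1.24, 1.21, 1.18, 1.10])

# HAADF (Z-contrast) intensity falls roughly linearly with Fe content in
# this range, Pd being the heavier scatterer, so fit a first-order model
slope, intercept = np.polyfit(haadf_cal, edx_fe, 1)

# Apply the calibration to the full population of integrated intensities
rng = np.random.default_rng(2)
haadf_all = rng.normal(1.21, 0.05, 300)
fe_all = slope * haadf_all + intercept
print(f"Fe = {fe_all.mean():.1f} at.% +/- {fe_all.std(ddof=1):.1f} (n={fe_all.size})")
```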

  12. Semi-automated uranium analysis by a modified Davies-Gray procedure

    International Nuclear Information System (INIS)

    Swanson, G.C.

    1977-01-01

    To determine uranium in fuel materials rapidly and reliably, a semi-automated implementation of the Davies-Gray uranium titration was developed. The Davies-Gray method is essentially a three-step procedure. First, uranium is reduced quantitatively from the +6 to the +4 valence state by an excess of iron(II) in strong phosphoric acid in the absence of nitrite; prior to the reduction, nitrite is destroyed by addition of sulfamic acid. In the second step, iron(II) is selectively oxidized to iron(III) by nitric acid in the presence of a Mo(VI) catalyst. Finally, after dilution to reduce the phosphate concentration, the uranium is titrated back to U(VI) with standard dichromate. The sluggish colorimetric endpoint determination originally used by Davies and Gray is seldom used since New Brunswick Laboratory discovered that addition of vanadium(IV) just prior to titration improves the reaction rate sufficiently to allow a potentiometric endpoint determination. One advantage of the Davies-Gray titration is that it is quite specific for uranium: most common impurity elements do not interfere with the analysis, and in particular high levels of Pu, Th and Fe are tolerated.
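
    The final titration step reduces to simple stoichiometry: each U(IV) ion gives up two electrons while each dichromate ion accepts six, so one mole of dichromate corresponds to three moles of uranium. A worked example with an assumed titre and titrant concentration:

```python
# U(IV) -> U(VI) releases 2 electrons; Cr2O7(2-) accepts 6 electrons,
# so 1 mol dichromate titrates 3 mol uranium.
M_U = 238.029            # g/mol, natural uranium

def uranium_mass_mg(titre_ml, dichromate_molarity):
    mol_dichromate = titre_ml / 1000.0 * dichromate_molarity
    mol_u = 3.0 * mol_dichromate
    return mol_u * M_U * 1000.0

# Illustrative run: 10.25 mL of 0.0140 mol/L K2Cr2O7 to the potentiometric endpoint
print(f"{uranium_mass_mg(10.25, 0.0140):.1f} mg U")   # ~102.5 mg
```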

  13. Analysis of the thoracic aorta using a semi-automated post-processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for thoracic aortic aneurysm (TAA) measurement using ECG-gated dual-source CT angiography (DSCTA). Methods: This retrospective HIPAA-compliant study was approved by our IRB. Transaxial maximum outer-wall-to-outer-wall diameters were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of the right pulmonary artery, proximal aortic arch (PROX) immediately proximal to the innominate artery, distal aortic arch (DIST) immediately distal to the left subclavian artery, and descending aorta (DESC) at the level of the diaphragm. Measurements were performed using a manual method and semi-automated software, and all readers repeated their measurements. Inter-method, intra-observer and inter-observer agreement was evaluated using the intraclass correlation coefficient (ICC) and Bland–Altman plots. The number of cases requiring manual contouring or center-line adjustment for the semi-automated method, and the post-processing time for each method, were recorded. Results: The mean difference between the semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The rate of manual adjustment of center line and contour was highest at the level of the annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, versus 8.26 ± 2.1 min with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of the annulus (60%), while the contour was adjusted at all levels, with the highest and lowest numbers of corrections at the annulus and DESC (75% and 0.07% of cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is …

  14. Comparison of manual and semi-automated delineation of regions of interest for radioligand PET imaging analysis

    International Nuclear Information System (INIS)

    Chow, Tiffany W; Verhoeff, Nicolaas PLG; Takeshita, Shinichiro; Honjo, Kie; Pataky, Christina E; St Jacques, Peggy L; Kusano, Maggie L; Caldwell, Curtis B; Ramirez, Joel; Black, Sandra

    2007-01-01

    As imaging centers produce higher-resolution research scans, the number of man-hours required to process regional data has become a major concern. A comparison of automated vs. manual methodology has not been reported for functional imaging. We explored the validity of using automation to delineate regions of interest on positron emission tomography (PET) scans. The purpose of this study was to ascertain improvements in image processing time and reproducibility of a semi-automated brain region extraction (SABRE) method over manual delineation of regions of interest (ROIs). We compared 2 sets of partial-volume-corrected serotonin 1a receptor binding potentials (BPs) resulting from manual vs. semi-automated methods. BPs were obtained from subjects meeting consensus criteria for frontotemporal degeneration and from age- and gender-matched healthy controls. Two trained raters provided each set of data to conduct comparisons of inter-rater mean image processing time, rank order of BPs for 9 PET scans, intra- and inter-rater intraclass correlation coefficients (ICC), repeatability coefficients (RC), percentages of the average parameter value (RM%), and effect sizes of either method. SABRE saved approximately 3 hours of processing time per PET subject over manual delineation (p < .001). The quality of the SABRE BP results was preserved relative to the rank order of subjects by manual methods. Intra- and inter-rater ICCs were high (>0.8) for both methods. RC and RM% were lower for the manual method across all ROIs, indicating less intra-rater variance across PET subjects' BPs. SABRE demonstrated significant time savings and no significant difference in reproducibility compared with manual methods, justifying its use in serotonin 1a receptor radioligand PET imaging analysis. This implies that semi-automated ROI delineation is a valid methodology for future PET imaging analysis.

  15. Semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma stage III/IV - a feasibility study

    International Nuclear Information System (INIS)

    Fabel, M.; Tengg-Kobligk, H. von; Giesel, F.L.; Delorme, S.; Kauczor, H.-U.; Bornemann, L.; Dicken, V.; Kopp-Schneider, A.; Moser, C.

    2008-01-01

    Therapy monitoring in oncological patient care requires accurate and reliable imaging and post-processing methods. RECIST criteria are the current standard, with inherent disadvantages. The aim of this study was to investigate the feasibility of semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma compared with manual volumetric analysis and RECIST. Multislice CT was performed in 47 patients, covering the chest, abdomen and pelvis. In total, 227 suspicious, enlarged lymph nodes were evaluated retrospectively by two radiologists with regard to diameters (RECIST), manually measured volume by placement of ROIs, and semi-automated volumetric analysis. Volume (ml), quality of segmentation (++/-) and time effort (s) were evaluated. The semi-automated volumetric analysis software tool was rated acceptable to excellent in 81% of all cases (reader 1) and 79% (reader 2). The median time for the entire segmentation process and necessary corrections was shorter with the semi-automated software than with manual segmentation. Bland-Altman plots showed significantly lower interobserver variability for semi-automated volumetric than for RECIST measurements. The study demonstrated the feasibility of volumetric analysis of lymph node metastases. The software allows fast and robust segmentation in up to 80% of all cases. Ease of use and the time needed are acceptable for application in clinical routine. Variability and inter-user bias were reduced to about one third of the values found for RECIST measurements. (orig.)

  16. Semi-Automated Land Cover Layer Updating Process Utilizing Spectral Analysis and GIS Data Fusion

    Science.gov (United States)

    Cohen, L.; Keinan, E.; Yaniv, M.; Tal, Y.; Felus, A.; Regev, R.

    2018-04-01

    Technological improvements in mass data gathering and analysis in recent years have influenced the traditional methods of updating and forming the national topographic database, and have brought a significant increase in the number of use cases and in the demand for detailed geo-information. Processes intended to replace traditional data collection methods have been developed in many National Mapping and Cadastre Agencies, and there has been significant progress in semi-automated methodologies aiming to facilitate the updating of a national topographic geodatabase. Their implementation is expected to allow a considerable reduction in updating costs and operation times. Our previous work focused on automatic building extraction (Keinan, Zilberstein et al., 2015). Before semi-automatic updating methods, it was common for interpreter identification to be as detailed as possible in order to maintain the most reliable database. When semi-automatic updating methodologies are used, the ability to incorporate knowledge based on human insight is limited. Our motivation was therefore to reduce this gap by allowing end users to add their data inputs to the basic geometric database. In this article, we present a simple land cover database updating method which combines insights extracted from the analyzed image with given vector-layer spatial data. The main stages of the procedure are multispectral image segmentation and supervised classification, together with geometric fusion of the given vector data, while keeping manual shape editing to a minimum. All coding was done using open-source software components.
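
    As a stand-in for the supervised-classification stage (the paper's actual classifier and band set are not given in this excerpt), a minimum-distance-to-means classifier over synthetic four-band pixels:

```python
import numpy as np

def minimum_distance_classify(image, training):
    """Assign each multispectral pixel to the land-cover class whose
    training mean (spectral centroid) is nearest -- a minimal stand-in
    for the supervised classification step described above."""
    names = list(training)
    centroids = np.array([training[n].mean(axis=0) for n in names])
    flat = image.reshape(-1, image.shape[-1])
    d = np.linalg.norm(flat[:, None, :] - centroids[None, :, :], axis=2)
    return np.array(names)[d.argmin(axis=1)].reshape(image.shape[:2])

# Synthetic 4-band (B, G, R, NIR) scene and hand-digitised training samples
rng = np.random.default_rng(3)
scene = rng.uniform(0, 1, (50, 50, 4))
training = {
    "water":      rng.normal([0.10, 0.12, 0.10, 0.04], 0.02, (20, 4)),
    "vegetation": rng.normal([0.08, 0.15, 0.10, 0.60], 0.03, (20, 4)),
    "built-up":   rng.normal([0.30, 0.32, 0.35, 0.38], 0.03, (20, 4)),
}
labels = minimum_distance_classify(scene, training)
print({n: int((labels == n).sum()) for n in training})
```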

  17. Semi-automated relative quantification of cell culture contamination with mycoplasma by Photoshop-based image analysis on immunofluorescence preparations.

    Science.gov (United States)

    Kumar, Ashok; Yerneni, Lakshmana K

    2009-01-01

    Mycoplasma contamination in cell culture is a serious setback for the cell culturist. Experiments undertaken using contaminated cell cultures are known to yield unreliable or false results due to various morphological, biochemical and genetic effects. Earlier surveys revealed incidences of mycoplasma contamination in cell cultures ranging from 15 to 80%. Out of a vast array of methods for detecting mycoplasma in cell culture, the cytological methods directly demonstrate the contaminating organism present in association with the cultured cells. In this investigation, we report the adoption of a cytological immunofluorescence assay (IFA) to obtain a semi-automated relative quantification of contamination by employing user-friendly Photoshop-based image analysis. The study, performed on 77 cell cultures randomly collected from various laboratories, revealed mycoplasma contamination in 18 cell cultures by both IFA and Hoechst DNA fluorochrome staining. Photoshop-based image analysis of the IFA-stained slides proved very valuable as a sensitive tool for quantitative assessment of the extent of contamination, both per se and relative to the cellularity of the cultures. The technique could be useful in estimating the efficacy of anti-mycoplasma agents during decontamination measures.

  18. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and for the curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy was developed to curate metabolic networks and facilitate the identification of metabolic pathways that may not be directly inferable from genome annotation alone. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure allows new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358±0.12 [Formula: see text], closely matching the experimentally determined growth rate of M. gallisepticum of 0.244±0.03 [Formula: see text]. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as the first genome-scale reconstruction of M. gallisepticum.
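
    At the heart of the approach is the flux balance analysis step: a linear program that maximizes an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. A toy three-reaction network, unrelated to the authors' M. gallisepticum model:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network:  R1: uptake (-> A),  R2: A -> B,  R3: B -> biomass (objective)
# Stoichiometric matrix S (rows: metabolites A, B; columns: reactions R1-R3)
S = np.array([
    [ 1.0, -1.0,  0.0],   # A: produced by R1, consumed by R2
    [ 0.0,  1.0, -1.0],   # B: produced by R2, consumed by R3
])
bounds = [(0, 10.0), (0, None), (0, None)]   # uptake capped at 10 units

# Maximise flux through R3 at steady state (S v = 0): minimise -v3
res = linprog(c=[0, 0, -1.0], A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x, "growth proxy:", -res.fun)
```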

  19. Feasibility of a semi-automated method for cardiac conduction velocity analysis of high-resolution activation maps

    NARCIS (Netherlands)

    Doshi, Ashish N.; Walton, Richard D.; Krul, Sébastien P.; de Groot, Joris R.; Bernus, Olivier; Efimov, Igor R.; Boukens, Bastiaan J.; Coronel, Ruben

    2015-01-01

    Myocardial conduction velocity is important for the genesis of arrhythmias. In the normal heart, conduction is primarily dependent on fiber direction (anisotropy) and may be discontinuous at sites with tissue heterogeneities (trabeculated or fibrotic tissue). We present a semi-automated method for …

  1. NetFCM: A Semi-Automated Web-Based Method for Flow Cytometry Data Analysis

    DEFF Research Database (Denmark)

    Frederiksen, Juliet Wairimu; Buggert, Marcus; Karlsson, Annika C.

    2014-01-01

    … data analysis has become more complex and labor-intensive than previously. We have therefore developed a semi-automatic gating strategy (NetFCM) that uses clustering and principal component analysis (PCA) together with other statistical methods to mimic manual gating approaches. NetFCM is an online tool for subset identification as well as for quantification of differences between samples. Additionally, NetFCM can classify and cluster samples based on multidimensional data. We tested the method using a data set of peripheral blood mononuclear cells collected from 23 HIV-infected individuals … corresponding to those obtained by manual gating strategies. These data demonstrate that NetFCM has the potential to identify relevant T cell populations by mimicking classical FCM data analysis and to reduce the subjectivity and amount of time associated with such analysis. © 2014 International Society …
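
    A minimal sketch of the PCA-plus-clustering idea on synthetic five-marker events; NetFCM's actual pipeline, gating logic and statistics are more elaborate than this:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

# Synthetic 5-marker flow-cytometry-like events for two latent subsets
rng = np.random.default_rng(4)
events = np.vstack([
    rng.normal([2.0, 0.5, 3.0, 1.0, 0.2], 0.3, (500, 5)),
    rng.normal([0.5, 2.5, 1.0, 3.0, 1.5], 0.3, (300, 5)),
])

scores = PCA(n_components=2).fit_transform(events)   # dimensionality reduction
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
sizes = np.bincount(labels)
print("subset frequencies (%):", np.round(100 * sizes / sizes.sum(), 1))
```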

  2. Fast-FISH Detection and Semi-Automated Image Analysis of Numerical Chromosome Aberrations in Hematological Malignancies

    Directory of Open Access Journals (Sweden)

    Arif Esa

    1998-01-01

    A new fluorescence in situ hybridization (FISH) technique called Fast-FISH, in combination with semi-automated image analysis, was applied to detect numerical aberrations of chromosomes 8 and 12 in interphase nuclei of peripheral blood lymphocytes and bone marrow cells from patients with acute myelogenous leukemia (AML) and chronic lymphocytic leukemia (CLL). Commercially available α-satellite DNA probes specific for the centromere regions of chromosome 8 and chromosome 12, respectively, were used. After application of the Fast-FISH protocol, the microscopic images of the fluorescence-labelled cell nuclei were recorded by the true-color CCD camera Kappa CF 15 MC and evaluated quantitatively by computer analysis on a PC. These results were compared to results obtained from the same type of specimens using the same analysis system but with a standard FISH protocol. In addition, automated spot counting after both FISH techniques was compared to visual spot counting after standard FISH. A total of about 3,000 cell nuclei was evaluated. For quantitative brightness parameters, a good correlation between standard FISH labelling and Fast-FISH was found. Automated spot counting after Fast-FISH coincided within a few percent with automated and visual spot counting after standard FISH. The examples shown indicate the reliability and reproducibility of Fast-FISH and its potential for automated interphase cell diagnostics of numerical chromosome aberrations. Since the Fast-FISH technique requires a hybridization time as low as 1/20 of that of established standard FISH techniques, omitting most of the time-consuming working steps in the protocol, it may contribute considerably to clinical diagnostics. This may be especially interesting in cases where an accurate result is required within a few hours.

  3. Scaling up Ecological Measurements of Coral Reefs Using Semi-Automated Field Image Collection and Analysis

    Directory of Open Access Journals (Sweden)

    Manuel González-Rivero

    2016-01-01

    Ecological measurements in marine settings are often constrained in space and time, with spatial heterogeneity obscuring broader generalisations. While advances in remote sensing, integrative modelling and meta-analysis enable generalisations from field observations, there is an underlying need for high-resolution, standardised and geo-referenced field data. Here, we evaluate a new approach aimed at optimising data collection and analysis to assess broad-scale patterns of coral reef community composition using automatically annotated underwater imagery captured along 2 km transects. We validate this approach by investigating its ability to detect spatial (e.g., across regions) and temporal (e.g., over years) change, and by comparing automated annotation errors to those of multiple human annotators. Our results indicate that change in coral reef benthos can be captured at high resolution both spatially and temporally, with an average error below 5% among key benthic groups. Cover estimation errors using automated annotation varied between 2% and 12%, slightly larger than human errors (which varied between 1% and 7%), but small enough to detect significant changes among dominant groups. Overall, this approach allows rapid collection of in-situ observations at larger spatial scales (km) than previously possible, and provides a pathway to link, calibrate, and validate broader analyses across even larger spatial scales (10–10,000 km²).

  4. Semi-automated petrographic assessment of coal by coal grain analysis

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, G.; Jenkins, B.; Ofori, P.; Ferguson, K. [CSIRO Exploration and Mining, Pullenvale, Qld. (Australia)

    2007-04-15

    A new classification method, coal grain analysis, which uses optical imaging techniques for the microscopic characterisation of the individual grains present in coal samples, is discussed. It differs from other coal petrography imaging methods in that a mask is used to remove the pixels of mounting resin, so that compositional information on the maceral (vitrinite, inertinite and liptinite) and mineral abundances is obtained for each individual grain within each image. Experiments were conducted to establish the density of the individual constituents so that the density of each grain could be determined and the results reported on a mass basis. The grains were sorted into eight grain classes of liberated (single-component) and composite grains. By analysing all streams (feed, concentrate and tailings) of the flotation circuit at a coal washing plant, the flotation response of the individual grain classes was tracked. This has implications for flotation process diagnostics and optimisation.
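
    The mass-basis reporting step can be sketched as follows: per-grain component area fractions are combined with assumed component densities to weight each grain's contribution by mass rather than by area. The densities and grain data below are invented for illustration:

```python
import numpy as np

# Per-grain area fractions (vitrinite, inertinite, liptinite, mineral) from
# imaging, converted to a mass basis via assumed component densities (g/cm3);
# area is taken as a proxy for volume in this sketch
densities = np.array([1.30, 1.45, 1.20, 2.65])
grains = np.array([
    [1.00, 0.00, 0.00, 0.00],     # liberated vitrinite grain
    [0.60, 0.30, 0.10, 0.00],     # composite maceral grain
    [0.20, 0.10, 0.00, 0.70],     # mineral-rich composite grain
])
areas = np.array([140.0, 220.0, 90.0])    # grain areas in pixels

grain_density = grains @ densities        # composition-weighted grain density
grain_mass = areas * grain_density        # mass proportional to area x density
mass_frac = grain_mass / grain_mass.sum()
print("mass fractions per grain:", np.round(mass_frac, 3))
```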

  5. Semi-automated digital measurement as the method of choice for beta cell mass analysis.

    Directory of Open Access Journals (Sweden)

    Violette Coppens

    Pancreas injury by partial duct ligation (PDL) activates beta cell differentiation and proliferation in the adult mouse pancreas but remains controversial regarding the anticipated increase in beta cell volume. Several reports unable to show beta cell volume augmentation in PDL pancreas used automated digital image analysis software. We hypothesized that fully automatic beta cell morphometry without manual micrograph artifact remediation introduces bias and might therefore be responsible for the reported discrepancies and controversy. However, our present results show that standard digital image processing with automatic thresholding is sufficiently robust, albeit less sensitive and less adequate, to demonstrate a significant increase in beta cell volume in PDL versus sham-operated pancreas. We therefore conclude that other confounding factors, such as the quality of surgery, the selection of samples based on the relative abundance of the transcription factor Neurogenin 3 (Ngn3), and tissue processing, give rise to inter-laboratory inconsistencies in beta cell volume quantification in PDL pancreas.

  6. MSCT follow-up in malignant lymphoma. Comparison of manual linear measurements with semi-automated lymph node analysis for therapy response classification

    International Nuclear Information System (INIS)

    Wessling, J.; Puesken, M.; Kohlhase, N.; Persigehl, T.; Mesters, R.; Heindel, W.; Buerke, B.; Koch, R.

    2012-01-01

    Purpose: To assess semi-automated lymph node analysis in comparison with manual measurements for therapy response classification of malignant lymphoma in MSCT. Materials and Methods: MSCT scans of 63 malignant lymphoma patients before and after 2 cycles of chemotherapy (307 target lymph nodes) were evaluated. The long axis diameter (LAD), short axis diameter (SAD) and bi-dimensional WHO measurement were determined manually and semi-automatically. The time for manual and semi-automatic segmentation was recorded. The reference standard response was defined as the mean relative change across all manual and semi-automatic measurements (mean manual/semi-automatic LAD, SAD, and semi-automatic volume). Statistical analysis encompassed the t-test and McNemar's test for clustered data. Results: Response classification per lymph node revealed semi-automated volumetry and bi-dimensional WHO to be significantly more accurate than manual linear measurements. Response classification per patient based on RECIST revealed more patients to be correctly classified by semi-automatic measurements, e.g. 96.0%/92.9% (bi-dimensional WHO/volume) compared with 85.7%/84.1% for manual LAD and SAD, respectively (a mean reduction in misclassified patients of 9.95%). Taking the use of correction tools into account, the time expenditure for lymph node segmentation (29.7 ± 17.4 s) was the same as with the manual approach (29.1 ± 14.5 s). Conclusion: Semi-automatically derived lymph node volume and bi-dimensional WHO significantly reduce the number of misclassified patients in the CT follow-up of malignant lymphoma by at least 10%. However, lymph node volumetry does not outperform bi-dimensional WHO. (orig.)
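
    For context, the response-classification rules being compared look roughly like this. The RECIST and WHO cut-offs are the standard published ones; the volumetric cut-offs shown are the sphere-equivalent translations of the RECIST limits often proposed in the literature, not necessarily this study's reference standard:

```python
def classify_response(baseline, follow_up, kind):
    """Classify change in a linear sum, bi-dimensional product, or volume.
    Cut-offs: RECIST (-30%/+20%), WHO bi-dimensional (-50%/+25%), and
    sphere-equivalent volumetric translations of RECIST (-65%/+73%)."""
    change = 100.0 * (follow_up - baseline) / baseline
    lo, hi = {
        "recist_diameter":   (-30.0, 20.0),
        "who_bidimensional": (-50.0, 25.0),
        "volume":            (-65.0, 73.0),
    }[kind]
    if change <= lo:
        return "partial response", change
    if change >= hi:
        return "progressive disease", change
    return "stable disease", change

# Example: sum of long-axis diameters 84 mm -> 52 mm after two cycles
print(classify_response(84.0, 52.0, "recist_diameter"))
# Example: nodal volume 14.6 mL -> 5.0 mL
print(classify_response(14.6, 5.0, "volume"))
```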

  7. Development and Evaluation of a Semi-automated Segmentation Tool and a Modified Ellipsoid Formula for Volumetric Analysis of the Kidney in Non-contrast T2-Weighted MR Images.

    Science.gov (United States)

    Seuss, Hannes; Janka, Rolf; Prümmer, Marcus; Cavallaro, Alexander; Hammon, Rebecca; Theis, Ragnar; Sandmair, Martin; Amann, Kerstin; Bäuerle, Tobias; Uder, Michael; Hammon, Matthias

    2017-04-01

    Volumetric analysis of the kidney parenchyma provides additional information for the detection and monitoring of various renal diseases. The purposes of the study were therefore to develop and evaluate a semi-automated segmentation tool and a modified ellipsoid formula for volumetric analysis of the kidney in non-contrast T2-weighted magnetic resonance (MR) images. Three readers twice performed semi-automated segmentation of the total kidney volume (TKV) in axial, non-contrast-enhanced T2-weighted MR images of 24 healthy volunteers (48 kidneys). A semi-automated threshold-based segmentation tool was developed to segment the kidney parenchyma. Furthermore, the three readers measured renal dimensions (length, width, depth) and applied different formulas to calculate the TKV. Manual segmentation served as the reference volume. The volumes obtained with the different methods were compared and the time required was recorded. There was no significant difference between the semi-automatically and manually segmented TKV (p = 0.31). The difference in mean volumes was 0.3 ml (95% confidence interval (CI), -10.1 to 10.7 ml). Semi-automated segmentation was significantly faster than manual segmentation, with a mean difference of 188 s (220 vs. 408 s). Semi-automated segmentation of the kidney in non-contrast T2-weighted MR data delivers accurate and reproducible results and was significantly faster than manual segmentation. Applying a modified ellipsoid formula quickly provides an accurate kidney volume.
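
    The ellipsoid shorthand is V = π/6 · length · width · depth. The study's "modified" formula adjusts the coefficient to fit segmented volumes, but the fitted value is not given in this excerpt, so the standard π/6 is shown:

```python
import math

def ellipsoid_volume_ml(length_cm, width_cm, depth_cm, k=math.pi / 6):
    """Ellipsoid formula V = k * L * W * D with k = pi/6 ~ 0.524 by default;
    the study fits a modified k to segmented volumes (value not given here)."""
    return k * length_cm * width_cm * depth_cm   # cm^3 == mL

# Illustrative kidney dimensions measured on T2-weighted images
print(f"{ellipsoid_volume_ml(11.2, 5.4, 4.6):.0f} mL")   # ~146 mL
```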

  8. Semi-automated landmark-based 3D analysis reveals new morphometric characteristics in the trochlear dysplastic femur.

    Science.gov (United States)

    Van Haver, Annemieke; De Roo, Karel; De Beule, Matthieu; Van Cauter, Sofie; Audenaert, Emmanuel; Claessens, Tom; Verdonk, Peter

    2014-11-01

    The authors hypothesise that the trochlear dysplastic distal femur is not characterised by morphological changes to the trochlea alone. The purpose of this study is to describe the morphological characteristics of the trochlear dysplastic femur in and outside the trochlear region with a landmark-based 3D analysis. Arthro-CT scans of 20 trochlear dysplastic and 20 normal knees were used to generate 3D models including the cartilage. To rule out size differences, a set of landmarks was defined on the distal femur to isotropically scale the 3D models to a standard size. A predefined series of landmark-based reference planes was applied to the distal femur. With these landmarks and reference planes, a series of previously described characteristics associated with trochlear dysplasia, as well as a series of morphometric characteristics, was measured. For the previously described characteristics, the analysis replicated highly significant differences between trochlear dysplastic and normal knees. Furthermore, the analysis showed that, when knee size is taken into account, the cut-off values for the trochlear bump and trochlear depth would be 1 mm larger in the largest knees than in the smallest knees. For the morphometric characteristics, the analysis revealed that the trochlear dysplastic femur is also characterised by a 10% smaller intercondylar notch, 6-8% larger posterior condyles (lateral and medial) in the anteroposterior direction, and a 6% larger medial condyle in the proximodistal direction compared with a normal femur. This study shows that knee size is important in the application of absolute metric cut-off values and that the posterior femur also shows a significantly different morphology.

  9. Supervised Semi-Automated Data Analysis Software for Gas Chromatography / Differential Mobility Spectrometry (GC/DMS) Metabolomics Applications.

    Science.gov (United States)

    Peirano, Daniel J; Pasamontes, Alberto; Davis, Cristina E

    2016-09-01

    Modern differential mobility spectrometers (DMS) produce complex and multi-dimensional data streams that allow for near-real-time or post-hoc chemical detection for a variety of applications. An active area of interest for this technology is metabolite monitoring for biological applications, and these data sets regularly have unique technical and data analysis end user requirements. While there are initial publications on how investigators have individually processed and analyzed their DMS metabolomic data, there are no user-ready commercial or open source software packages that are easily used for this purpose. We have created custom software uniquely suited to analyze gas chromatograph / differential mobility spectrometry (GC/DMS) data from biological sources. Here we explain the implementation of the software, describe the user features that are available, and provide an example of how this software functions using a previously-published data set. The software is compatible with many commercial or home-made DMS systems. Because the software is versatile, it can also potentially be used for other similarly structured data sets, such as GC/GC and other IMS modalities.

  10. Technical note on the validation of a semi-automated image analysis software application for estrogen and progesterone receptor detection in breast cancer

    Science.gov (United States)

    2011-01-01

    Background: The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole-slide digitalization supported by dedicated software tools allows quantification of image objects (e.g. cell membranes, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens have shown strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. Methods: The effectiveness of two connected semi-automated image analysis programs (the NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14) for determining ER and PR status in formalin-fixed, paraffin-embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First, the detection algorithm was calibrated to the scores provided by an independent assessor (pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean of 195 cells. Each cell was manually marked and scored according to the Allred system, which combines frequency and intensity scores. The performance of the calibrated algorithm was then tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. Results: Calibration brought the algorithm from the slight or moderate agreement seen at the start of the study to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986). The performance of the application was then tested on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa showed almost perfect agreement (κ = 0.981) between the two methods.
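
    For reference, the Allred system combines a 0-5 proportion score with a 0-3 intensity score into a 0-8 total. A small helper capturing the standard bins:

```python
def allred_score(percent_positive, intensity):
    """Allred total score = proportion score (0-5) + intensity score (0-3).
    Proportion bins: 0, <1%, 1-10%, 11-33%, 34-66%, >66%."""
    if not 0 <= intensity <= 3:
        raise ValueError("intensity score must be 0 (none) to 3 (strong)")
    if percent_positive == 0:
        return 0                      # no staining: total score is 0
    if percent_positive < 1:
        ps = 1
    elif percent_positive <= 10:
        ps = 2
    elif percent_positive <= 33:
        ps = 3
    elif percent_positive <= 66:
        ps = 4
    else:
        ps = 5
    return ps + intensity

# Example: 45% of nuclei positive at moderate (2) intensity -> 4 + 2 = 6
print(allred_score(45.0, 2))
```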

  11. A semi-automated motion-tracking analysis of locomotion speed in the C. elegans transgenics overexpressing beta-amyloid in neurons

    Directory of Open Access Journals (Sweden)

    Kevin Machino

    2014-07-01

    The Multi-Worm Tracker (MWT) is a real-time computer vision system that can simultaneously quantify the motion patterns of multiple worms. MWT provides several behavioral parameters, including accurate real-time locomotion speed, in the nematode Caenorhabditis elegans. Here, we determined the locomotion speed of an Alzheimer's disease (AD) transgenic strain that over-expresses human beta-amyloid 1-42 (Aβ) in neurons. The MWT analysis showed that the AD strain logged a slower average speed than wild-type worms. This result may be consistent with the observation that AD patients with dementia tend to show deficits in physical activities, including frequent falls. The AD strain also showed a reduced ability of the eggs to hatch and slowed hatching of the eggs. Thus, over-expression of Aβ in neurons has negative effects on locomotion and hatchability. This study sheds light on new examples of the detrimental effects that Aβ deposits can exhibit, using C. elegans as a model system. The information gathered from this study indicates that motion tracking analysis is a cost-effective, efficient way to assess the deficits of Aβ over-expression in the C. elegans system.
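
    Locomotion speed from a centroid track reduces to frame-to-frame displacement times the frame rate. A sketch with synthetic tracks (the real MWT computes this, and much more, internally):

```python
import numpy as np

def mean_speed(track_xy_mm, fps):
    """Average instantaneous centroid speed (mm/s) for one tracked worm."""
    steps = np.diff(track_xy_mm, axis=0)          # frame-to-frame displacement
    return np.linalg.norm(steps, axis=1).mean() * fps

# Synthetic centroid tracks (mm) sampled at 20 frames/s; the AD strain is
# given smaller step sizes to mimic the reported slowing
rng = np.random.default_rng(5)
wild_type = np.cumsum(rng.normal(0, 0.015, (200, 2)), axis=0)
ad_strain = np.cumsum(rng.normal(0, 0.008, (200, 2)), axis=0)
print(f"WT: {mean_speed(wild_type, 20):.3f} mm/s")
print(f"AD: {mean_speed(ad_strain, 20):.3f} mm/s")
```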

  12. Validation of a semi-automated multi-component method using protein precipitation LC-MS-MS for the analysis of whole blood samples

    DEFF Research Database (Denmark)

    Slots, Tina

    BACKGROUND: Solid phase extraction (SPE) is one of many multi-component methods, but it can be very time-consuming and labour-intensive. Protein precipitation is, on the other hand, a much simpler and faster sample pre-treatment than SPE, and it also has the ability to cover a wide … post-mortem whole blood sample preparation for toxicological analysis, from the primary sample tube to a 96-deepwell plate ready for injection on the liquid chromatography tandem mass spectrometry (LC-MS/MS) system.

  13. Three-dimensional reconstruction of the human spine from bi-planar radiographs: using multiscale wavelet analysis and spline interpolators for semi-automation

    Science.gov (United States)

    Deschenes, Sylvain; Godbout, Benoit; Branchaud, Dominic; Mitton, David; Pomero, Vincent; Bleau, Andre; Skalli, Wafa; de Guise, Jacques A.

    2003-05-01

    We propose a new, fast stereoradiographic 3D reconstruction method for the spine. User input is limited to a few points passing through the spine on two radiographs and two line segments representing the end plates of the limiting vertebrae. A 3D spline that hints at the positions of the vertebrae in space is then generated. We then use wavelet multi-scale analysis (WMSA) to automatically localize specific features in both the lateral and frontal radiographs. The WMSA provides an elegant spectral investigation that leads to gradient generation and edge extraction. Analysis of the information contained at several scales leads to the detection of (1) two curves enclosing the vertebral bodies' walls and (2) the inter-vertebral spaces along the spine. From these data, we extract four points per vertebra per view, corresponding to the corners of the vertebral bodies. These points delimit a hexahedron in space in which we can match the vertebral body. This hexahedron is then passed through a 3D statistical database built using local and global information generated from a bank of normal and scoliotic spines. Finally, models of the vertebrae are positioned with respect to these landmarks, completing the 3D reconstruction.
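
    The initial user-guided step can be sketched with a smoothing parametric spline through the clicked points; scipy's splprep/splev stand in for the authors' spline implementation, and all coordinates below are illustrative:

```python
import numpy as np
from scipy.interpolate import splprep, splev

# A few user-clicked points along the spine, reconstructed to 3-d (mm)
pts = np.array([
    [0.0, -5.0, 0.0], [4.0, -2.0, 80.0], [6.0, 6.0, 160.0],
    [2.0, 12.0, 240.0], [-3.0, 8.0, 320.0], [-4.0, 0.0, 400.0],
]).T                                          # shape (3, n) for splprep

tck, u = splprep(pts, s=10.0)                 # smoothing cubic spline
curve = np.array(splev(np.linspace(0, 1, 200), tck))   # densely sampled curve

# The dense curve "hints" candidate vertebra positions, e.g. 17 equally
# spaced thoraco-lumbar levels along the parameter axis
levels = np.array(splev(np.linspace(0, 1, 17), tck)).T
print(curve.shape, levels.shape)              # (3, 200) and (17, 3)
```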

  14. Clinical validation of semi-automated software for volumetric and dynamic contrast enhancement analysis of soft tissue venous malformations on magnetic resonance imaging examination

    Energy Technology Data Exchange (ETDEWEB)

    Caty, Veronique [Hopital Maisonneuve-Rosemont, Universite de Montreal, Department of Radiology, Montreal, QC (Canada); Kauffmann, Claude; Giroux, Marie-France; Oliva, Vincent; Therasse, Eric [Centre Hospitalier de l' Universite de Montreal (CHUM), Universite de Montreal and Research Centre, CHUM (CRCHUM), Department of Radiology, Montreal, QC (Canada); Dubois, Josee [Centre Hospitalier Universitaire Sainte-Justine et Universite de Montreal, Department of Radiology, Montreal, QC (Canada); Mansour, Asmaa [Institut de Cardiologie de Montreal, Heart Institute Coordinating Centre, Montreal, QC (Canada); Piche, Nicolas [Object Research System, Montreal, QC (Canada); Soulez, Gilles [Centre Hospitalier de l' Universite de Montreal (CHUM), Universite de Montreal and Research Centre, CHUM (CRCHUM), Department of Radiology, Montreal, QC (Canada); CHUM - Hopital Notre-Dame, Department of Radiology, Montreal, Quebec (Canada)

    2014-02-15

    To evaluate venous malformation (VM) volume and contrast-enhancement analysis on magnetic resonance imaging (MRI) compared with diameter evaluation. Baseline MRI was undertaken in 44 patients, 20 of whom were followed by MRI after sclerotherapy. All patients underwent short-tau inversion recovery (STIR) acquisitions and dynamic contrast assessment. VM diameters were measured in three orthogonal directions to obtain the largest and mean diameters. Volumetric reconstruction of the VM was generated from two orthogonal STIR sequences and fused with acquisitions after contrast medium injection. Reproducibility (intraclass correlation coefficients [ICCs]) of diameter and volume measurements was estimated. VM size variations in diameter and volume after sclerotherapy, and contrast enhancement before sclerotherapy, were compared in patients with clinical success or failure. Inter-observer ICCs were similar for diameter and volume measurements at baseline and follow-up (range 0.87-0.99). Higher percentages of size reduction after sclerotherapy were observed with volume (32.6 ± 30.7%) than with diameter measurements (14.4 ± 21.4%; P = 0.037). Contrast enhancement was estimated at 65.3 ± 27.5% and 84 ± 13% in patients with clinical failure and success, respectively (P = 0.056). Venous malformation volume measurement was as reproducible as diameter measurement and more sensitive in detecting therapeutic responses. Patients with better clinical outcomes tend to have stronger malformation enhancement. (orig.)

  16. Semi-Automated Analysis of Diaphragmatic Motion with Dynamic Magnetic Resonance Imaging in Healthy Controls and Non-Ambulant Subjects with Duchenne Muscular Dystrophy

    Directory of Open Access Journals (Sweden)

    Courtney A. Bishop

    2018-01-01

    Full Text Available Subjects with Duchenne Muscular Dystrophy (DMD) suffer from progressive muscle damage leading to diaphragmatic weakness that ultimately requires ventilation. Emerging treatments have generated interest in better characterizing the natural history of respiratory impairment in DMD and responses to therapy. Dynamic (cine) Magnetic Resonance Imaging (MRI) may provide a more sensitive measure of diaphragm function in DMD than the commonly used spirometry. This study presents an analysis pipeline for measuring parameters of diaphragmatic motion from dynamic MRI and its application to investigate MRI measures of respiratory function in both healthy controls and non-ambulant DMD boys. We scanned 13 non-ambulant DMD boys and 10 age-matched healthy male volunteers at baseline, with a subset (n = 10, 10, 8) of the DMD subjects also assessed 3, 6, and 12 months later. Spirometry-derived metrics including forced vital capacity were recorded. The MRI-derived measures included the lung cross-sectional area (CSA), the anterior, central, and posterior lung lengths in the sagittal imaging plane, and the diaphragm length over the time-course of the dynamic MRI. Regression analyses demonstrated strong linear correlations between lung CSA and the length measures over the respiratory cycle, with a reduction of these correlations in DMD, and diaphragmatic motions that contribute less efficiently to changing lung capacity in DMD. MRI measures of pulmonary function were reduced in DMD, controlling for height differences between the groups: at maximal inhalation, the maximum CSA and the total distance of motion of the diaphragm were 45% and 37% smaller. MRI measures of pulmonary function were correlated with spirometry data and showed relationships with disease progression surrogates of age and months non-ambulatory, suggesting that they provide clinically meaningful information. Changes in the MRI measures over 12 months were consistent with weakening of
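    The regression step described above can be sketched in a few lines; this is a minimal, hypothetical example correlating lung CSA with one length measure over the respiratory cycle (variable names and synthetic values are assumptions):

```python
# Minimal sketch: linear correlation between lung CSA and a length measure
# across dynamic MRI frames (synthetic data; not the study's pipeline).
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
frames = 120                                                   # dynamic MRI time points
posterior_length = rng.uniform(8.0, 14.0, frames)              # cm, hypothetical
csa = 12.0 * posterior_length + rng.normal(0.0, 3.0, frames)   # cm^2, synthetic

fit = linregress(posterior_length, csa)
print(f"r = {fit.rvalue:.2f}, slope = {fit.slope:.1f} cm^2/cm, p = {fit.pvalue:.1e}")
```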

  17. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo

    2012-07-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".
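    As one common way to separate touching particles in binary micrographs (a standard baseline shown for flavor, not the article's multistage, semi-supervised procedure), a distance-transform watershed can be sketched as follows:

```python
# Illustrative separation of overlapped particles via distance-transform watershed
# (a common baseline technique; not the article's multistage procedure).
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def split_touching_particles(binary: np.ndarray) -> np.ndarray:
    """Label a binary particle mask, splitting overlapping blobs."""
    distance = ndi.distance_transform_edt(binary)
    # One marker per local maximum of the distance map (candidate particle centers).
    coords = peak_local_max(distance, min_distance=5, labels=binary)
    markers = np.zeros(distance.shape, dtype=int)
    markers[tuple(coords.T)] = np.arange(1, len(coords) + 1)
    return watershed(-distance, markers, mask=binary)

# labels = split_touching_particles(mask); region properties then give morphology statistics.
```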

  18. Semi-automated Robust Quantification of Lesions (SRQL) Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori L Ito

    2017-05-01

    Full Text Available Quantifying lesions in a reliable manner is fundamental for studying the effects of neuroanatomical changes related to recovery in the post-stroke brain. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This often makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. Thus, we developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.557114) Toolbox that performs several analysis steps: 1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; 2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and 3) an option to perform analyses in both native and standard space to facilitate analyses in either space. Here, we describe the methods implemented in the toolbox.
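    The white matter intensity correction can be sketched conceptually as follows; this is a minimal, hypothetical illustration using nibabel and numpy (the thresholding rule and file names are assumptions, not the toolbox's exact algorithm, which is available at the repository above):

```python
# Conceptual sketch of a white-matter intensity correction on a lesion mask
# (hypothetical rule; see the SRQL repository for the actual implementation).
import nibabel as nib
import numpy as np

t1 = nib.load("patient_T1.nii.gz")            # anatomical image (hypothetical path)
lesion = nib.load("lesion_mask.nii.gz")       # binary lesion mask drawn by a rater

t1_data = t1.get_fdata()
mask = lesion.get_fdata() > 0

# Estimate a healthy white-matter-like intensity range outside the lesion,
# then drop lesion voxels that look like healthy WM.
healthy = t1_data[~mask & (t1_data > 0)]
wm_cutoff = np.percentile(healthy, 75)        # assumed cutoff: bright, WM-like voxels
corrected = mask & (t1_data < wm_cutoff)

nib.save(nib.Nifti1Image(corrected.astype(np.uint8), lesion.affine), "lesion_corrected.nii.gz")
```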

  19. Semi-automated Robust Quantification of Lesions (SRQL) Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori Ito

    2017-02-01

    Full Text Available Quantifying lesions in a robust manner is fundamental for studying the effects of neuroanatomical changes in the post-stroke brain on recovery. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. We developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.267213) Toolbox that performs several analysis steps: 1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; 2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and 3) an option to perform analyses in both native and standard space to facilitate analyses in either space, or comparisons between spaces. Here, we describe the methods implemented in the toolbox and demonstrate the outputs of the SRQL toolbox.

  1. Enhanced detection levels in a semi-automated sandwich ...

    African Journals Online (AJOL)

    A peptide nucleic acid (PNA) signal probe was tested as a replacement for a typical DNA oligonucleotide-based signal probe in a semi-automated sandwich hybridisation assay designed to detect the harmful phytoplankton species Alexandrium tamarense. The PNA probe yielded consistently higher fluorescent signal ...

  2. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Mariangela Iorio

    2013-12-01

    Full Text Available White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with Mild Cognitive Impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of the WMH map, and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student's t-tests and similarity was evaluated using a linear regression model and the Dice Similarity Coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t = -1.79, DF = 29, P = 0.839 for rater 1; t = 1.113, DF = 29, P = 0.2749 for rater 2) and were highly correlated (R² = 0.921, F(1,29) = 155.54, p
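    The histogram-based thresholding step can be illustrated with a small sketch; the mean-plus-k-standard-deviations cutoff below is an assumption for illustration, not the paper's exact segmentation rule (which also includes normalization and manual editing):

```python
# Illustrative intensity-histogram thresholding of a FLAIR image for probable WMH
# (hypothetical cutoff; the published method adds several pre/post-processing steps).
import nibabel as nib
import numpy as np

img = nib.load("flair_normalized.nii.gz")      # hypothetical preprocessed input
flair = img.get_fdata()
brain = flair[flair > 0]                       # non-zero voxels = brain tissue

# Model normal tissue with the histogram's bulk; flag the bright right tail.
mu, sigma = brain.mean(), brain.std()
threshold = mu + 2.5 * sigma                   # assumed k = 2.5
wmh_mask = flair > threshold

voxel_ml = np.prod(img.header.get_zooms()[:3]) / 1000.0   # mm^3 -> ml
print(f"Probable WMH volume: {wmh_mask.sum() * voxel_ml:.1f} ml (before manual editing)")
```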

  3. Refuelling: Swiss station will be semi-automated

    International Nuclear Information System (INIS)

    Fontaine, B.; Ribaux, P.

    1981-01-01

    The first semi-automated LWR refuelling machine in Europe has been supplied to the Leibstadt General Electric BWR in Switzerland. The system relieves operators of the boring and repetitive job of moving and accurately positioning the refuelling machine during fuelling operations and will thus contribute to plant safety. The machine and its mode of operation are described. (author)

  4. Semi-automated scoring of triple-probe FISH in human sperm using confocal microscopy.

    Science.gov (United States)

    Branch, Francesca; Nguyen, GiaLinh; Porter, Nicholas; Young, Heather A; Martenies, Sheena E; McCray, Nathan; Deloid, Glen; Popratiloff, Anastas; Perry, Melissa J

    2017-09-01

    Structural and numerical sperm chromosomal aberrations result from abnormal meiosis and are directly linked to infertility. Any live births that arise from aneuploid conceptuses can result in syndromes such as Klinefelter, Turner, XYY and Edwards. Multi-probe fluorescence in situ hybridization (FISH) is commonly used to study sperm aneuploidy; however, manual FISH scoring in sperm samples is labor-intensive and introduces errors. Automated scoring methods are continuously evolving. One challenging aspect for optimizing automated sperm FISH scoring has been the overlap in excitation and emission of the fluorescent probes used to enumerate the chromosomes of interest. Our objective was to demonstrate the feasibility of combining confocal microscopy and spectral imaging with high-throughput methods for accurately measuring sperm aneuploidy. Our approach used confocal microscopy to analyze numerical chromosomal abnormalities in human sperm using enhanced slide preparation and rigorous semi-automated scoring methods. FISH for chromosomes X, Y, and 18 was conducted to determine sex chromosome disomy in sperm nuclei. Application of online spectral linear unmixing was used for effective separation of four fluorochromes while decreasing data acquisition time. Semi-automated image processing, segmentation, classification, and scoring were performed on 10 slides using custom image processing and analysis software, and results were compared with manual methods. No significant differences in disomy frequencies were seen between the semi-automated and manual methods. Samples treated with pepsin were observed to have reduced background autofluorescence and more uniform distribution of cells. These results demonstrate that semi-automated methods using spectral imaging on a confocal platform are a feasible approach for analyzing numerical chromosomal aberrations in sperm, and are comparable to manual methods. © 2017 International Society for Advancement of Cytometry.
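    Spectral linear unmixing of overlapping fluorochrome emissions reduces, per pixel, to a non-negative least-squares problem; a toy sketch follows (the reference spectra are synthetic placeholders, since real unmixing uses measured single-fluorochrome controls):

```python
# Toy per-pixel spectral unmixing with non-negative least squares
# (synthetic reference spectra; not the scanner's online unmixing implementation).
import numpy as np
from scipy.optimize import nnls

n_channels, n_fluors = 32, 4
rng = np.random.default_rng(0)
# Columns = emission spectra of the 4 fluorochromes sampled in 32 spectral channels.
A = np.abs(rng.normal(size=(n_channels, n_fluors)))

def unmix_pixel(spectrum: np.ndarray) -> np.ndarray:
    """Return the non-negative abundance of each fluorochrome for one pixel."""
    abundances, _residual = nnls(A, spectrum)
    return abundances

true_mix = np.array([0.7, 0.0, 0.2, 0.1])
pixel = A @ true_mix + 0.01 * rng.normal(size=n_channels)   # noisy mixed signal
print(unmix_pixel(pixel).round(2))                          # approximately true_mix
```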

  5. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    Science.gov (United States)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Generated by landscape discontinuities (e.g., sea breezes), mesoscale circulation processes are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large scale, mesoscale, and turbulent scale, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit of mass is used, defined as $\tilde{E} = \frac{1}{2}\langle u_i' u_i' \rangle$, where $u_i'$ represents the three Cartesian components of a mesoscale circulation, the angle brackets denote the grid-scale, horizontal averaging operator in the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for $\tilde{E}$, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of $\tilde{E}$. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes.
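    For reference, the scale decomposition and the MKE definition described above can be written compactly; the symbols follow the abstract, while the double-prime labeling of the turbulent component is an assumption of this sketch:

```latex
% Three-way scale separation of a velocity component (large scale + mesoscale + turbulence)
% and the mean mesoscale kinetic energy per unit mass.
% Angle brackets: grid-scale horizontal average; tilde: large-scale mean;
% prime: mesoscale perturbation; double prime (assumed notation): turbulence.
u_i = \tilde{u}_i + u_i' + u_i''
\qquad
\tilde{E} = \tfrac{1}{2}\,\langle u_i'\, u_i' \rangle
```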

  6. Semi-automated microwave assisted solid-phase peptide synthesis

    DEFF Research Database (Denmark)

    Pedersen, Søren Ljungberg

    with microwaves for SPPS has gained in popularity as it for many syntheses has provided significant improvement in terms of speed, purity, and yields, maybe especially in the synthesis of long and "difficult" peptides. Thus, precise microwave heating has emerged as one new parameter for SPPS, in addition...... to coupling reagents, resins, solvents etc. We have previously reported on microwave heating to promote a range of solid-phase reactions in SPPS. Here we present a new, flexible semi-automated instrument for the application of precise microwave heating in solid-phase synthesis. It combines a slightly modified...... Biotage Initiator microwave instrument, which is available in many laboratories, with a modified semi-automated peptide synthesizer from MultiSynTech. A custom-made reaction vessel is placed permanently in the microwave oven, thus the reactor does not have to be moved between steps. Mixing is achieved...

  7. Semi-automated contour recognition using DICOMautomaton

    International Nuclear Information System (INIS)

    Clark, H; Duzenli, C; Wu, J; Moiseenko, V; Lee, R; Gill, B; Thomas, S

    2014-01-01

    Purpose: A system has been developed which recognizes and classifies Digital Imaging and Communication in Medicine contour data with minimal human intervention. It allows researchers to overcome obstacles which tax analysis and mining systems, including inconsistent naming conventions and differences in data age or resolution. Methods: Lexicographic and geometric analysis is used for recognition. Well-known lexicographic methods implemented include Levenshtein-Damerau, bag-of-characters, Double Metaphone, Soundex, and (word and character)-N-grams. Geometrical implementations include 3D Fourier Descriptors, probability spheres, boolean overlap, simple feature comparison (e.g. eccentricity, volume) and rule-based techniques. Both analyses implement custom, domain-specific modules (e.g. emphasis differentiating left/right organ variants). Contour labels from 60 head and neck patients are used for cross-validation. Results: Mixed-lexicographical methods show an effective improvement in more than 10% of recognition attempts compared with a pure Levenshtein-Damerau approach when withholding 70% of the lexicon. Domain-specific and geometrical techniques further boost performance. Conclusions: DICOMautomaton allows users to recognize contours semi-automatically. As usage increases and the lexicon is filled with additional structures, performance improves, increasing the overall utility of the system.
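    As a flavor of the lexicographic side, a minimal edit-distance matcher for contour labels might look like the following (illustrative only; DICOMautomaton combines many lexicographic and geometric methods, and the lexicon entries here are hypothetical):

```python
# Toy lexicographic matching of DICOM contour labels by edit distance
# (illustration of one technique; not DICOMautomaton's full recognizer).
def levenshtein(a: str, b: str) -> int:
    """Classic dynamic-programming edit distance."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

LEXICON = ["left parotid", "right parotid", "spinal cord", "brainstem"]

def recognize(label: str) -> str:
    """Map a free-form contour label to the closest canonical structure name."""
    return min(LEXICON, key=lambda canon: levenshtein(label.lower(), canon))

print(recognize("L Parotid"))   # -> "left parotid"
```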

  8. New space sensor and mesoscale data analysis

    Science.gov (United States)

    Hickey, John S.

    1987-01-01

    The developed Earth Science and Application Division (ESAD) system/software provides the research scientist with the following capabilities: an extensive database management capability to convert various experiment data types into a standard format; an interactive analysis and display package (AVE80); an interactive imaging/color graphics capability utilizing the Apple III and IBM PC workstations integrated into the ESAD computer system; and local and remote smart-terminal capability which provides color video, graphics, and LaserJet output. Recommendations for updating and enhancing the performance of the ESAD computer system are listed.

  9. Semi-automated quantitative Drosophila wings measurements.

    Science.gov (United States)

    Loh, Sheng Yang Michael; Ogawa, Yoshitaka; Kawana, Sara; Tamura, Koichiro; Lee, Hwee Kuan

    2017-06-28

    Drosophila melanogaster is an important organism used in many fields of biological research such as genetics and developmental biology. Drosophila wings have been widely used to study the genetics of development, morphometrics and evolution. Therefore, there is much interest in quantifying wing structures of Drosophila. Advancements in technology have increased the ease with which images of Drosophila can be acquired. However, such studies have been limited by the slow and tedious process of acquiring phenotypic data. We have developed a system that automatically detects and measures key points and vein segments on a Drosophila wing. Key points are detected by performing image transformations and template matching on Drosophila wing images, while vein segments are detected using an Active Contour algorithm. The accuracy of our key point detection was compared against key point annotations of users. We also performed key point detection using different training data sets of Drosophila wing images. We compared our software with an existing automated image analysis system for Drosophila wings and showed that our system performs better than the state of the art. Vein segments were manually measured and compared against the measurements obtained from our system. Our system was able to detect specific key points and vein segments from Drosophila wing images with high accuracy.
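    Key-point detection by template matching, as described, can be sketched with OpenCV; the file names and the matching-score threshold below are assumptions, not the authors' software:

```python
# Illustrative template matching for a wing landmark (not the authors' system).
import cv2

wing = cv2.imread("wing.png", cv2.IMREAD_GRAYSCALE)            # hypothetical image
template = cv2.imread("vein_junction_template.png", cv2.IMREAD_GRAYSCALE)

# Normalized cross-correlation map; its peak marks the best match for the key point.
scores = cv2.matchTemplate(wing, template, cv2.TM_CCOEFF_NORMED)
_min_v, max_v, _min_loc, max_loc = cv2.minMaxLoc(scores)

if max_v > 0.8:                                                # assumed confidence cutoff
    th, tw = template.shape
    keypoint = (max_loc[0] + tw // 2, max_loc[1] + th // 2)    # center of matched patch
    print(f"Key point at {keypoint}, score {max_v:.2f}")
else:
    print("No confident match; flag for manual review.")
```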

  10. Semi-automated ontology generation and evolution

    Science.gov (United States)

    Stirtzinger, Anthony P.; Anken, Craig S.

    2009-05-01

    Extending the notion of data models or object models, ontology can provide rich semantic definition not only to the meta-data but also to the instance data of domain knowledge, making these semantic definitions available in machine readable form. However, the generation of an effective ontology is a difficult task involving considerable labor and skill. This paper discusses an Ontology Generation and Evolution Processor (OGEP) aimed at automating this process, only requesting user input when un-resolvable ambiguous situations occur. OGEP directly attacks the main barrier which prevents automated (or self learning) ontology generation: the ability to understand the meaning of artifacts and the relationships the artifacts have to the domain space. OGEP leverages existing lexical to ontological mappings in the form of WordNet, and Suggested Upper Merged Ontology (SUMO) integrated with a semantic pattern-based structure referred to as the Semantic Grounding Mechanism (SGM) and implemented as a Corpus Reasoner. The OGEP processing is initiated by a Corpus Parser performing a lexical analysis of the corpus, reading in a document (or corpus) and preparing it for processing by annotating words and phrases. After the Corpus Parser is done, the Corpus Reasoner uses the parts of speech output to determine the semantic meaning of a word or phrase. The Corpus Reasoner is the crux of the OGEP system, analyzing, extrapolating, and evolving data from free text into cohesive semantic relationships. The Semantic Grounding Mechanism provides a basis for identifying and mapping semantic relationships. By blending together the WordNet lexicon and SUMO ontological layout, the SGM is given breadth and depth in its ability to extrapolate semantic relationships between domain entities. The combination of all these components results in an innovative approach to user assisted semantic-based ontology generation. This paper will describe the OGEP technology in the context of the architectural
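    The corpus-parsing stage that OGEP builds on (tokenize, tag parts of speech, ground words in WordNet) can be illustrated generically with NLTK; this is a generic sketch of that kind of lexical analysis, not OGEP itself:

```python
# Generic sketch of a lexical-analysis stage: tokenize, POS-tag, and look up
# WordNet senses as ontology-grounding candidates (not the OGEP implementation).
import nltk
from nltk.corpus import wordnet as wn

# One-time downloads:
# nltk.download("punkt"); nltk.download("averaged_perceptron_tagger"); nltk.download("wordnet")
sentence = "The radar detects mesoscale convective systems over the coastal plain."

tokens = nltk.word_tokenize(sentence)
for word, tag in nltk.pos_tag(tokens):
    if tag.startswith("NN"):                      # nouns are candidate ontology concepts
        senses = wn.synsets(word, pos=wn.NOUN)
        if senses:
            print(f"{word}: {senses[0].definition()}")
```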

  11. Literature classification for semi-automated updating of biological knowledgebases

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Winther, Ole

    2013-01-01

    abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion: We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining...... types of biological data, such as sequence data, are extensively stored in biological databases, functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results: We defined and applied a machine...
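    A minimal version of such an abstract classifier can be sketched with scikit-learn; the toy corpus and pipeline below are illustrative only, and the study's actual features and model may differ:

```python
# Toy relevance classifier for literature triage (illustrative; not the paper's model).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

abstracts = [
    "Mapping of linear B-cell epitopes on the viral glycoprotein ...",
    "T-cell epitope prediction validated by MHC binding assays ...",
    "Annual financial report of the hospital trust ...",
    "Weather patterns over the North Atlantic in 2010 ...",
]
relevant = [1, 1, 0, 0]   # 1 = contains epitope data worth curating

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(abstracts, relevant)

print(clf.predict(["Novel epitopes identified by alanine scanning mutagenesis"]))  # expected: [1]
```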

  12. Application of semi-automated ultrasonography on nutritional support for severe acute pancreatitis.

    Science.gov (United States)

    Li, Ying; Ye, Yu; Yang, Mei; Ruan, Haiying; Yu, Yuan

    2018-04-25

    To evaluate the application value of semi-automated ultrasound guidance of nasogastrojejunal tube placement for patients with severe acute pancreatitis (SAP), as well as the value of nutritional support for standardized treatment in clinical practice. This retrospective research was performed in our hospital, and 34 patients suffering from SAP were enrolled into the study. All identified participants received CT scans in order to make definitive diagnoses. Following this, the patients received semi-automated ultrasound examinations within 1 day after onset, in order to provide enteral nutrition treatment via nasogastrojejunal tube, or freehand nasogastrojejunal tube placement. In terms of statistical analysis, the application value of semi-automated ultrasound guidance of nasogastrojejunal tube placement was evaluated and compared with tube placement without guidance. After catheterization, additional enteral nutrition was provided, and its therapeutic effect on SAP was analyzed further. A total of 34 patients with pancreatitis were identified in this research, 29 cases with necrosis of pancreas parenchyma. After further examinations, 32 cases were SAP and 2 cases were mild acute pancreatitis. When the firm diagnosis was made, additional enteral nutrition (EN) was given; all patient conditions appeared good, and all were satisfied with this kind of nutritional support. According to our clinical experience, when there was 200-250 ml of liquid in the stomach, the success rate of intubation appeared higher. Additionally, a comparison between ultrasound-guided and freehand nasogastrojejunal tube placement was made. According to the statistical results, in terms of the utilization ratio of nutritional support, the ultrasound-guided group performed better than the freehand group within 1 day, after 3 days and after 7 days (7/20 versus 2/14; P groups was not statistically different (P > 0.05). It can

  13. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Ebersberger, Ullrich [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Marcus, Roy P.; Nikolaou, Konstantin; Bamberg, Fabian [University of Munich, Institute of Clinical Radiology, Munich (Germany); Schoepf, U.J.; Gray, J.C.; McQuiston, Andrew D. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Lo, Gladys G. [Hong Kong Sanatorium and Hospital, Department of Diagnostic and Interventional Radiology, Hong Kong (China); Wang, Yining [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Department of Radiology, Beijing (China); Blanke, Philipp [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University Hospital Freiburg, Department of Diagnostic Radiology, Freiburg (Germany); Geyer, Lucas L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University of Munich, Institute of Clinical Radiology, Munich (Germany); Cho, Young Jun [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Konyang University College of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Scheuering, Michael; Canstein, Christian [Siemens Healthcare, CT Division, Forchheim (Germany); Hoffmann, Ellen [Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany)

    2014-01-15

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min, respectively (P < 0.01). There was strong agreement between the two measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01) and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. (orig.)

  14. Semi-automated, occupationally safe immunofluorescence microtip sensor for rapid detection of Mycobacterium cells in sputum.

    Directory of Open Access Journals (Sweden)

    Shinnosuke Inoue

    Full Text Available An occupationally safe (biosafe) sputum liquefaction protocol was developed for use with a semi-automated antibody-based microtip immunofluorescence sensor. The protocol effectively liquefied sputum and inactivated microorganisms including Mycobacterium tuberculosis, while preserving the antibody-binding activity of Mycobacterium cell surface antigens. Sputum was treated with a synergistic chemical-thermal protocol that included moderate concentrations of NaOH and detergent at 60°C for 5 to 10 min. Samples spiked with M. tuberculosis complex cells showed approximately 10(6)-fold inactivation of the pathogen after treatment. Antibody binding was retained post-treatment, as determined by analysis with a microtip immunosensor. The sensor correctly distinguished between Mycobacterium species and other cell types naturally present in biosafe-treated sputum, with a detection limit of 100 CFU/mL for M. tuberculosis, in a 30-minute sample-to-result process. The microtip device was also semi-automated and shown to be compatible with low-cost, LED-powered fluorescence microscopy. The device and biosafe sputum liquefaction method open the door to rapid detection of tuberculosis in settings with limited laboratory infrastructure.

  15. Fast semi-automated lesion demarcation in stroke

    Directory of Open Access Journals (Sweden)

    Bianca de Haan

    2015-01-01

    Full Text Available Lesion–behaviour mapping analyses require the demarcation of the brain lesion on each (usually transverse) slice of the individual stroke patient's brain image. To date, this is generally thought to be most precise when done manually, which is, however, both time-consuming and potentially observer-dependent. Fully automated lesion demarcation methods have been developed to address these issues, but these are often not practicable in acute stroke research where for each patient only a single image modality is available and the available image modality differs over patients. In the current study, we evaluated a semi-automated lesion demarcation approach, the so-called Clusterize algorithm, in acute stroke patients scanned in a range of common image modalities. Our results suggest that, compared to the standard of manual lesion demarcation, the semi-automated Clusterize algorithm is capable of significantly speeding up lesion demarcation in the most commonly used image modalities, without loss of either lesion demarcation precision or lesion demarcation reproducibility. For the three investigated acute datasets (CT, DWI, T2FLAIR), containing a total of 44 patient images obtained in a regular clinical setting at patient admission, the reduction in processing time was on average 17.8 min per patient and this advantage increased with increasing lesion volume (up to 60 min per patient for the largest lesion volumes in our datasets). Additionally, our results suggest that performance of the Clusterize algorithm in a chronic dataset with 11 T1 images was comparable to its performance in the acute datasets. We thus advocate the use of the Clusterize algorithm, integrated into a simple, freely available SPM toolbox, for the precise, reliable and fast preparation of imaging data for lesion–behaviour mapping analyses.

  16. Semi-automated technique for the separation and determination of barium and strontium in surface waters by ion exchange chromatography and atomic emission spectrometry

    International Nuclear Information System (INIS)

    Pierce, F.D.; Brown, H.R.

    1977-01-01

    A semi-automated method for the separation and the analysis of barium and strontium in surface waters by atomic emission spectrometry is described. The method employs a semi-automated separation technique using ion exchange and an automated aspiration-analysis procedure. Forty specimens can be prepared in approximately 90 min and can be analyzed for barium and strontium content in 20 min. The detection limits and sensitivities provided by the described technique are 0.003 mg/l and 0.01 mg/l respectively for barium and 0.00045 mg/l and 0.003 mg/l respectively for strontium

  17. Mesoscale carbon sequestration site screening and CCS infrastructure analysis.

    Science.gov (United States)

    Keating, Gordon N; Middleton, Richard S; Stauffer, Philip H; Viswanathan, Hari S; Letellier, Bruce C; Pasqualini, Donatella; Pawar, Rajesh J; Wolfsberg, Andrew V

    2011-01-01

    We explore carbon capture and sequestration (CCS) at the mesoscale, a level of study between regional carbon accounting and highly detailed reservoir models for individual sites. We develop an approach to CO2 sequestration site screening for industries or energy development policies that involves identification of an appropriate sequestration basin, analysis of geologic formations, definition of surface sites, design of infrastructure, and analysis of CO2 transport and storage costs. Our case study involves carbon management for potential oil shale development in the Piceance-Uinta Basin, CO and UT. This study uses new capabilities of the CO2-PENS model for site screening, including reservoir capacity, injectivity, and cost calculations for simple reservoirs at multiple sites. We couple this with a model of optimized source-sink-network infrastructure (SimCCS) to design pipeline networks and minimize CCS cost for a given industry or region. The CLEARuff dynamical assessment model calculates the CO2 source term for various oil production levels. Nine sites in a 13,300 km2 area have the capacity to store 6.5 GtCO2, corresponding to shale-oil production of 1.3 Mbbl/day for 50 years (about 1/4 of U.S. crude oil production). Our results highlight the complex, nonlinear relationship between the spatial deployment of CCS infrastructure and the oil-shale production rate.

  18. ALLocator: an interactive web platform for the analysis of metabolomic LC-ESI-MS datasets, enabling semi-automated, user-revised compound annotation and mass isotopomer ratio analysis.

    Science.gov (United States)

    Kessler, Nikolas; Walter, Frederik; Persicke, Marcus; Albaum, Stefan P; Kalinowski, Jörn; Goesmann, Alexander; Niehaus, Karsten; Nattkemper, Tim W

    2014-01-01

    Adduct formation, fragmentation events and matrix effects impose special challenges to the identification and quantitation of metabolites in LC-ESI-MS datasets. An important step in compound identification is the deconvolution of mass signals. During this processing step, peaks representing adducts, fragments, and isotopologues of the same analyte are allocated to a distinct group, in order to separate peaks from coeluting compounds. From these peak groups, neutral masses and pseudo spectra are derived and used for metabolite identification via mass decomposition and database matching. Quantitation of metabolites is hampered by matrix effects and nonlinear responses in LC-ESI-MS measurements. A common approach to correct for these effects is the addition of a U-13C-labeled internal standard and the calculation of mass isotopomer ratios for each metabolite. Here we present a new web platform for the analysis of LC-ESI-MS experiments. ALLocator covers the workflow from raw data processing to metabolite identification and mass isotopomer ratio analysis. The integrated processing pipeline for spectra deconvolution "ALLocatorSD" generates pseudo spectra and automatically identifies peaks emerging from the U-13C-labeled internal standard. Information from the latter improves mass decomposition and annotation of neutral losses. ALLocator provides an interactive and dynamic interface to explore and enhance the results in depth. Pseudo spectra of identified metabolites can be stored in user- and method-specific reference lists that can be applied on succeeding datasets. The potential of the software is exemplified in an experiment, in which abundance fold-changes of metabolites of the l-arginine biosynthesis in C. glutamicum type strain ATCC 13032 and l-arginine producing strain ATCC 21831 are compared. Furthermore, the capability for detection and annotation of uncommon large neutral losses is shown by the identification of (γ-)glutamyl dipeptides in the same strains.
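    The mass isotopomer ratio idea can be sketched with a toy calculation; the peak intensities below are invented placeholders, not values from the study:

```python
# Toy mass-isotopomer-ratio quantitation against a U-13C internal standard
# (illustrative numbers; real inputs come from deconvoluted LC-ESI-MS peak groups).

# Peak intensities for one metabolite: unlabeled (M+0) from the sample,
# fully 13C-labeled (M+n) from the internal standard spiked into every sample.
samples = {
    "ATCC 13032": {"M0": 1.8e6, "Mn": 9.0e5},
    "ATCC 21831": {"M0": 7.4e6, "Mn": 8.8e5},
}

# Normalizing M+0 by M+n cancels matrix effects and nonlinear response.
ratios = {name: peaks["M0"] / peaks["Mn"] for name, peaks in samples.items()}
fold_change = ratios["ATCC 21831"] / ratios["ATCC 13032"]
print(f"Abundance fold-change (21831 vs 13032): {fold_change:.1f}x")
```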

  19. ALLocator: an interactive web platform for the analysis of metabolomic LC-ESI-MS datasets, enabling semi-automated, user-revised compound annotation and mass isotopomer ratio analysis.

    Directory of Open Access Journals (Sweden)

    Nikolas Kessler

    Full Text Available Adduct formation, fragmentation events and matrix effects impose special challenges to the identification and quantitation of metabolites in LC-ESI-MS datasets. An important step in compound identification is the deconvolution of mass signals. During this processing step, peaks representing adducts, fragments, and isotopologues of the same analyte are allocated to a distinct group, in order to separate peaks from coeluting compounds. From these peak groups, neutral masses and pseudo spectra are derived and used for metabolite identification via mass decomposition and database matching. Quantitation of metabolites is hampered by matrix effects and nonlinear responses in LC-ESI-MS measurements. A common approach to correct for these effects is the addition of a U-13C-labeled internal standard and the calculation of mass isotopomer ratios for each metabolite. Here we present a new web-platform for the analysis of LC-ESI-MS experiments. ALLocator covers the workflow from raw data processing to metabolite identification and mass isotopomer ratio analysis. The integrated processing pipeline for spectra deconvolution "ALLocatorSD" generates pseudo spectra and automatically identifies peaks emerging from the U-13C-labeled internal standard. Information from the latter improves mass decomposition and annotation of neutral losses. ALLocator provides an interactive and dynamic interface to explore and enhance the results in depth. Pseudo spectra of identified metabolites can be stored in user- and method-specific reference lists that can be applied on succeeding datasets. The potential of the software is exemplified in an experiment, in which abundance fold-changes of metabolites of the l-arginine biosynthesis in C. glutamicum type strain ATCC 13032 and l-arginine producing strain ATCC 21831 are compared. Furthermore, the capability for detection and annotation of uncommon large neutral losses is shown by the identification of (γ-glutamyl dipeptides in

  20. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaborating with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integration of their collection of provided BSs and their corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research for supporting CNs. It contributes to the generation of formal machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the application of desired service behaviour specified by users for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.

  1. Semi-Automated Discovery of Application Session Structure

    Energy Technology Data Exchange (ETDEWEB)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
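    The core statistical idea, deciding whether a new connection belongs to an ongoing session or starts a new one based on inter-arrival gaps, can be sketched as follows; the threshold rule is a simplification of the paper's Poisson-based analysis, and the example log is invented:

```python
# Toy session grouping: under a Poisson model with rate lam, a gap g is "surprising"
# if P(no arrival for g) = exp(-lam * g) is small; surprising gaps end a session.
# (Simplified illustration of the idea, not the paper's algorithm.)
import math

def group_sessions(arrival_times, lam, p_cut=0.01):
    """Split sorted connection start times into sessions."""
    sessions, current = [], [arrival_times[0]]
    for prev, t in zip(arrival_times, arrival_times[1:]):
        if math.exp(-lam * (t - prev)) < p_cut:   # gap too long for the same session
            sessions.append(current)
            current = []
        current.append(t)
    sessions.append(current)
    return sessions

conns = [0.0, 0.4, 1.1, 2.0, 95.0, 95.5, 96.2]    # seconds; hypothetical connection log
print(group_sessions(conns, lam=1.0))              # -> two sessions
```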

  2. Mesoscale Frontogenesis: An Analysis of Two Cold Front Case Studies

    Science.gov (United States)

    1993-01-01

    marked the boundary of warm air or the "warm sector". Further development of this cyclone model by Bjerknes and Solberg (1922) and Bergeron (1928) provided... [Remainder of excerpt is figure-caption and reference residue: vectors represent 25 m s-1; relative humidity greater than 80% indicated by the shaded gray region; frontal zones marked with solid black lines; citation fragment: Zuckerberg, J.T. Schaefer, and G.E. Rasch, 1986: Forecast problems: The meteorological and operational factors, in Mesoscale Meteorology and Forecasting.]

  3. Intelligent, Semi-Automated Procedure Aid (ISAPA) for ISS Flight Control, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop the Intelligent, Semi-Automated Procedure Aid (ISAPA) intended for use by International Space Station (ISS) ground controllers to increase the...

  4. Comparison of semi-automated commercial rep-PCR fingerprinting, spoligotyping, 12-locus MIRU-VNTR typing and single nucleotide polymorphism analysis of the embB gene as molecular typing tools for Mycobacterium bovis.

    Science.gov (United States)

    Armas, Federica; Camperio, Cristina; Coltella, Luana; Selvaggini, Serena; Boniotti, Maria Beatrice; Pacciarini, Maria Lodovica; Di Marco Lo Presti, Vincenzo; Marianelli, Cinzia

    2017-08-04

    Highly discriminatory genotyping strategies are essential in molecular epidemiological studies of tuberculosis. In this study we evaluated, for the first time, the efficacy of the repetitive sequence-based PCR (rep-PCR) DiversiLab Mycobacterium typing kit against spoligotyping, 12-locus mycobacterial interspersed repetitive unit-variable number tandem repeat (MIRU-VNTR) typing and embB single nucleotide polymorphism (SNP) analysis for Mycobacterium bovis typing. A total of 49 M. bovis animal isolates were used. DNA was extracted and genomic DNA was amplified using the DiversiLab Mycobacterium typing kit. The amplified fragments were separated and detected using a microfluidics chip with the Agilent 2100. The resulting rep-PCR-based DNA fingerprints were uploaded to and analysed with the web-based DiversiLab software using Pearson's correlation coefficient. Rep-PCR DiversiLab grouped M. bovis isolates into ten different clusters. Most isolates sharing an identical spoligotype, MIRU-VNTR profile or embB gene polymorphism were grouped into different rep-PCR clusters. Rep-PCR DiversiLab displayed greater discriminatory power than spoligotyping and embB SNP analysis but lower resolution power than the 12-locus MIRU-VNTR analysis. MIRU-VNTR typing was thus confirmed to be superior to the other PCR-based methods tested here. In combination with spoligotyping and 12-locus MIRU-VNTR analysis, rep-PCR improved the discriminatory power for M. bovis typing.

  5. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation (ARL-TR-8284, US Army Research Laboratory, January 2018). ...although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each
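    Processing such multi-run output files typically reduces to splitting the text on a per-run delimiter and parsing each block; the sketch below assumes a hypothetical "RUN" header line, since the actual GTRAJ file format is not reproduced in this record:

```python
# Hypothetical splitter for a multi-simulation output text file.
# The "RUN" delimiter and field layout are assumptions, not the real GTRAJ format.
from pathlib import Path

def split_runs(path: str):
    """Yield (run_header, list_of_lines) for each simulation block in the file."""
    header, block = None, []
    for line in Path(path).read_text().splitlines():
        if line.startswith("RUN"):            # assumed per-simulation delimiter
            if header is not None:
                yield header, block
            header, block = line.strip(), []
        elif header is not None:
            block.append(line)
    if header is not None:
        yield header, block

# for header, lines in split_runs("gtraj_output.txt"):
#     print(header, len(lines), "data lines")
```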

  6. Semi-Automated Diagnosis, Repair, and Rework of Spacecraft Electronics

    Science.gov (United States)

    Struk, Peter M.; Oeftering, Richard C.; Easton, John W.; Anderson, Eric E.

    2008-01-01

    NASA's Constellation Program for Exploration of the Moon and Mars places human crews in extreme isolation in resource-scarce environments. Near Earth, the discontinuation of Space Shuttle flights after 2010 will alter the up- and down-mass capacity for the International Space Station (ISS). NASA is considering new options for logistics support strategies for future missions. Aerospace systems are often composed of replaceable modular blocks that minimize the need for complex service operations in the field. Such a strategy, however, implies a robust and responsive logistics infrastructure with relatively low transportation costs. The modular Orbital Replacement Units (ORUs) used for the ISS require relatively large blocks of replacement hardware even though the actual failed component may be three orders of magnitude smaller. The ability to perform in-situ repair of electronics circuits at the component level can dramatically reduce the scale of spares and related logistics cost. This ability also reduces mission risk, increases crew independence and improves the overall supportability of the program. The Component-Level Electronics Assembly Repair (CLEAR) task under the NASA Supportability program was established to demonstrate the practicality of repair by first investigating widely used soldering materials and processes (M&P) performed by modest manual means. The work will result in program guidelines for performing manual repairs along with design guidance for circuit reparability. The next phase of CLEAR recognizes that manual repair has its limitations and some highly integrated devices are extremely difficult to handle and demand semi-automated equipment. Further, electronics repairs require a broad range of diagnostic capability to isolate the faulty components. Finally, repairs must pass functional tests to determine that the repairs are successful and the circuit can be returned to service. To prevent equipment demands from exceeding spacecraft volume

  7. Suitability of semi-automated tumor response assessment of liver metastases using a dedicated software package

    International Nuclear Information System (INIS)

    Kalkmann, Janine; Ladd, S.C.; Greiff, A. de; Forsting, M.; Stattaus, J.

    2010-01-01

    Purpose: to evaluate the suitability of semi-automated compared to manual tumor response assessment (TRA) of liver metastases. Materials and methods: in total, 32 patients with colorectal cancer and liver metastases were followed by an average of 2.8 contrast-enhanced CT scans. Two observers (O1, O2) measured the longest diameter (LD) of 269 liver metastases manually and semi-automatically using software installed as a thin client on a PACS workstation (LMS-Liver, MEDIAN Technologies). LD and TRA ("progressive", "stable", "partial remission") were performed according to RECIST (Response Evaluation Criteria in Solid Tumors) and analyzed for between-method, interobserver and intraobserver variability. The time needed for evaluation was compared for both methods. Results: all measurements correlated excellently (r ≥ 0.96). Intraobserver (semi-automated), interobserver (manual) and between-method differences (by O1) in LD of 1.4 ± 2.6 mm, 1.9 ± 1.9 mm and 2.1 ± 2.0 mm, respectively, were not significant. Interobserver (semi-automated) and between-method (by O2) differences in LD of 3.0 ± 3.0 mm and 2.6 ± 2.0 mm, respectively, reflected a significant variability (p < 0.01). The interobserver agreement in manual and semi-automated TRA was 91.4%. The intraobserver agreement in semi-automated TRA was 84.5%. Between both methods a TRA agreement of 86.2% was obtained. Semi-automated evaluation (2.7 min) took slightly more time than manual evaluation (2.3 min). Conclusion: semi-automated and manual evaluation of liver metastases yield comparable results in response assessment and require comparable effort. (orig.)
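    Between-method agreement of the kind reported here (mean difference ± SD) is commonly summarized with Bland-Altman limits of agreement; a minimal sketch follows, using synthetic numbers rather than the study data:

```python
# Minimal Bland-Altman-style agreement summary between manual and semi-automated
# longest-diameter measurements (synthetic values for illustration).
import numpy as np

rng = np.random.default_rng(1)
manual = rng.uniform(10, 60, size=50)                  # mm
semi_auto = manual + rng.normal(2.0, 2.0, size=50)     # assumed +2 mm bias, 2 mm noise

diff = manual - semi_auto
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                          # 95% limits of agreement

print(f"Bias {bias:.1f} mm, limits of agreement [{bias - loa:.1f}, {bias + loa:.1f}] mm")
```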

  8. The influence of image setting on intracranial translucency measurement by manual and semi-automated system.

    Science.gov (United States)

    Zhen, Li; Yang, Xin; Ting, Yuen Ha; Chen, Min; Leung, Tak Yeung

    2013-09-01

    To investigate the agreement between a manual and a semi-automated system and the effect of different image settings on intracranial translucency (IT) measurement. A prospective study was conducted on 55 women carrying singleton pregnancies who attended first-trimester Down syndrome screening. IT was measured both manually and by the semi-automated system at the same default image setting. The IT measurements were then repeated with post-processing changes in the image setting, one at a time. The differences in IT measurements between the altered and the original images were assessed. Intracranial translucency was successfully measured on 55 images both manually and by the semi-automated method. There was strong agreement in IT measurements between the two methods, with a mean difference (manual minus semi-automated) of 0.011 mm (95% confidence interval: -0.052 mm to 0.094 mm). There were statistically significant variations in both manual and semi-automated IT measurement after changing the Gain and the Contrast. The greatest changes occurred when the Contrast was reduced to 1 (IT reduced by 0.591 mm in semi-automated; 0.565 mm in manual), followed by when the Gain was increased to 15 (IT reduced by 0.424 mm in semi-automated; 0.524 mm in manual). The image settings may affect IT identification and measurement. Increased Gain and reduced Contrast are the most influential factors and may cause under-measurement of IT. © 2013 John Wiley & Sons, Ltd.

  9. Method for semi-automated microscopy of filtration-enriched circulating tumor cells.

    Science.gov (United States)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-07-14

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However, filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, and ERG-rearrangements were detected by filter-adapted FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45(-) cells, cytomorphological staining, then scanning and analysis of CD45(-) cell phenotypical and cytomorphological characteristics. CD45(-) cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm². The second assay sequentially combined fluorescent staining, automated selection of CD45(-) cells, FISH scanning on CD45(-) cells, then analysis of CD45(-) cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82%, 91%, and 95% of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here
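    The automated CD45(-) selection step reduces to a filter on per-cell features; a toy pandas version follows (column names and the intensity cutoffs are assumptions; only the nuclear-area criterion comes from the abstract):

```python
# Toy selection of candidate CTCs (CD45-negative cells) from per-cell measurements
# (intensity thresholds and column names are assumptions; area criterion per the abstract).
import pandas as pd

cells = pd.DataFrame({
    "dapi_intensity":   [180, 200, 40, 220, 150],
    "cd45_intensity":   [12, 340, 8, 15, 400],
    "nuclear_area_um2": [78.0, 60.0, 30.0, 96.0, 70.0],
})

candidates = cells[
    (cells["dapi_intensity"] > 100)        # DAPI-positive nucleus (assumed cutoff)
    & (cells["cd45_intensity"] < 50)       # CD45-negative (assumed cutoff)
    & (cells["nuclear_area_um2"] > 55.0)   # nuclear area criterion from the abstract
]
print(candidates.index.tolist())           # -> [0, 3]
```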

  10. Method for semi-automated microscopy of filtration-enriched circulating tumor cells

    International Nuclear Information System (INIS)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R.; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-01-01

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However, filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, and ERG-rearrangements were detected by filter-adapted FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45− cells, cytomorphological staining, then scanning and analysis of CD45− cell phenotypical and cytomorphological characteristics. CD45− cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm². The second assay sequentially combined fluorescent staining, automated selection of CD45− cells, FISH scanning on CD45− cells, then analysis of CD45− cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82%, 91%, and 95% of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here

  11. Analysis of Surface Heterogeneity Effects with Mesoscale Terrestrial Modeling Platforms

    Science.gov (United States)

    Simmer, C.

    2015-12-01

    An improved understanding of the full variability in the weather and climate system is crucial for reducing the uncertainty in weather forecasting and climate prediction, and to aid policy makers in developing adaptation and mitigation strategies. A yet unknown part of the uncertainty in predictions from numerical models is caused by neglecting non-resolved land surface heterogeneity and sub-surface dynamics and their potential impact on the state of the atmosphere. At the same time, mesoscale numerical models using finer horizontal grid resolution [O(1) km] can suffer from inconsistencies and neglected scale-dependencies in ABL parameterizations and non-resolved effects of integrated surface-subsurface lateral flow at this scale. Our present knowledge suggests large-eddy simulation (LES) as an eventual solution to overcome the inadequacy of the physical parameterizations of the atmosphere at this transition scale, yet we are constrained by computational resources, memory management, and big data when using LES for regional domains. For the present, there is a need for scale-aware parameterizations not only in the atmosphere but also in the land surface and subsurface model components. In this study, we use the recently developed Terrestrial Systems Modeling Platform (TerrSysMP) as a numerical tool to analyze the uncertainty in the simulation of surface exchange fluxes and boundary layer circulations at grid resolutions of the order of 1 km, and explore the sensitivity of the atmospheric boundary layer evolution and convective rainfall processes to land surface heterogeneity.

  12. Moving from proprietary to open-source solutions for academic research in remote sensing: Example with semi-automated land cover mapping

    OpenAIRE

    Grippa, Taïs

    2017-01-01

    GRASS GIS has recently experienced significant improvements for Object-Based Image Analysis. At ULB, the choice was made to combine GRASS GIS and Python in a semi-automated processing chain for land-cover mapping. The latter proved its ability to be quickly customized to match the requirements of different projects. In order to promote OSGeo software, we decided to make it freely available, allowing anyone interested to review, reuse and/or enhance it for further studies.
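
    As a rough illustration of how such a GRASS GIS/Python chain can be scripted (this is a generic sketch, not the ULB code; map names and parameter values are placeholders, and i.segment.stats is a GRASS addon):

        # Run inside a GRASS GIS session (e.g., `grass ... --exec python chain.py`).
        import grass.script as gs

        # Group the input bands; map names are placeholders.
        gs.run_command("i.group", group="scene", subgroup="scene",
                       input="red,green,blue,nir")

        # Object-based segmentation; threshold controls region merging.
        gs.run_command("i.segment", group="scene", output="segments",
                       threshold=0.05, minsize=20)

        # Per-segment statistics for later classification
        # (i.segment.stats is an addon, installable with g.extension).
        gs.run_command("i.segment.stats", map="segments",
                       rasters="red,green,blue,nir",
                       csvfile="segment_stats.csv")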

  13. Simulation and analysis of the mesoscale circulation in the northwestern Mediterranean Sea

    Directory of Open Access Journals (Sweden)

    V. Echevin

    Full Text Available The large-scale and mesoscale circulation of the northwestern Mediterranean Sea are simulated with an eddy-resolving primitive-equation regional model (RM of 1/16° resolution embedded in a general circulation model (GM of the Mediterranean Sea of 1/8° resolution. The RM is forced by a monthly climatology of heat fluxes, precipitation and wind stress. The GM, which uses the same atmospheric forcing, provides initial and boundary conditions for the RM. Analysis of the RM results shows that several realistic features of the large-scale and mesoscale circulation are evident in this region. The mean cyclonic circulation is in good agreement with observations. Mesoscale variability is intense along the coasts of Sardinia and Corsica, in the Gulf of Lions and in the Catalan Sea. The length scales of the Northern Current meanders along the Provence coast and in the Gulf of Lions’ shelf are in good agreement with observations. Winter Intermediate Water is formed along most of the north-coast shelves, between the Gulf of Genoa and Cape Creus. Advection of this water by the mean cyclonic circulation generates a complex eddy field in the Catalan Sea. Intense anticyclonic eddies are generated northeast of the Balearic Islands. These results are in good agreement with mesoscale activity inferred from satellite altimetric data. This work demonstrates the feasibility of a down-scaling system composed of a general-circulation, a regional and a coastal model, which is one of the goals of the Mediterranean Forecasting System Pilot Project.

    Key words. Oceanography: physical (currents; eddies and mesoscale processes; general circulation)

  14. Expert-driven semi-automated geomorphological mapping for a mountainous area using a laser DTM

    NARCIS (Netherlands)

    van Asselen, S.; Seijmonsbergen, A.C.

    2006-01-01

    In this paper a semi-automated method is presented to recognize and spatially delineate geomorphological units in mountainous forested ecosystems, using statistical information extracted from a 1-m resolution laser digital elevation dataset. The method was applied to a mountainous area in Austria.

  15. Rapid and convenient semi-automated microwave-assisted solid-phase synthesis of arylopeptoids

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Ewald; Boccia, Marcello Massimo; Nielsen, John

    2014-01-01

    A facile and expedient route to the synthesis of arylopeptoid oligomers (N-alkylated aminomethyl benzamides) using semi-automated microwave-assisted solid-phase synthesis is presented. The synthesis was optimized for the incorporation of side chains derived from sterically hindered or unreactive...

  16. Semi-Automated Quantification of Finger Joint Space Narrowing Using Tomosynthesis in Patients with Rheumatoid Arthritis.

    Science.gov (United States)

    Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Kasahara, Hideki; Shimizu, Yuka; Fujimori, Motoshi; Yasojima, Nobutoshi; Ono, Yohei; Kaneda, Takahiko; Koike, Takao

    2017-06-01

    The purpose of the study is to validate the semi-automated method using tomosynthesis images for the assessment of finger joint space narrowing (JSN) in patients with rheumatoid arthritis (RA), using the semi-quantitative scoring method as the reference standard. Twenty patients (14 females and 6 males) with RA were included in this retrospective study. All patients underwent radiography and tomosynthesis of the bilateral hand and wrist. Two rheumatologists and a radiologist independently scored JSN with the two modalities according to the Sharp/van der Heijde score. Two observers independently measured joint space width on tomosynthesis images using an in-house semi-automated method. More joints with JSN were revealed with the tomosynthesis score (243 joints) and the semi-automated method (215 joints) than with radiography (120 joints), and associations between tomosynthesis scores and radiography scores were demonstrated (P < 0.05). Semi-automated joint space width measurements correlated negatively with tomosynthesis scores, with r = -0.606 (P < 0.05). Intra- and inter-observer reproducibility of the semi-automated measurement on tomosynthesis images was in almost perfect agreement, with intra-class correlation coefficient (ICC) values of 0.964 and 0.963, respectively. The semi-automated method using tomosynthesis images provided sensitive, quantitative, and reproducible measurement of finger joint space in patients with RA.
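
    The ICC values quoted above can in principle be reproduced from a targets-by-raters matrix of measurements. Below is a minimal sketch of one common form, ICC(2,1) (two-way random effects, absolute agreement, single rater); the abstract does not state which ICC variant was used, so this choice is an assumption, and the example data are invented.

        import numpy as np

        def icc_2_1(x):
            """ICC(2,1): two-way random effects, absolute agreement, single rater.
            x is an (n_targets, k_raters) array of measurements."""
            n, k = x.shape
            grand = x.mean()
            row_means = x.mean(axis=1)
            col_means = x.mean(axis=0)
            msb = k * ((row_means - grand) ** 2).sum() / (n - 1)   # between targets
            msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # between raters
            resid = x - row_means[:, None] - col_means[None, :] + grand
            mse = (resid ** 2).sum() / ((n - 1) * (k - 1))         # residual
            return (msb - mse) / (msb + (k - 1) * mse + k * (msc - mse) / n)

        # Example: joint space width (mm) for 5 joints, 2 observers (invented).
        widths = np.array([[1.8, 1.9], [2.4, 2.3], [1.1, 1.2],
                           [3.0, 2.9], [2.2, 2.2]])
        print(round(icc_2_1(widths), 3))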

  17. A semi-automated method for measuring thickness and white matter ...

    African Journals Online (AJOL)

    A semi-automated method for measuring thickness and white matter integrity of the corpus callosum. ... and interhemispheric differences. Future research will determine normal values for age and compare CC thickness with peripheral white matter volume loss in large groups of patients, using the semi-automated technique.

  18. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    Science.gov (United States)

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.
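
    Two of the listed parameters, total neurite length and territory, lend themselves to a compact illustration. The sketch below uses scikit-image and SciPy as stand-ins for the ImageJ modules; the toy mask and pixel calibration are invented, and plain pixel counting is a cruder length estimate than NeuronMetrics' own.

        import numpy as np
        from scipy.spatial import ConvexHull
        from skimage.morphology import skeletonize

        # Toy binary image of a neuron mask (True = neurite/cell body).
        mask = np.zeros((64, 64), dtype=bool)
        mask[32, 8:56] = True          # a horizontal "neurite"
        mask[16:48, 30] = True         # a crossing branch

        skeleton = skeletonize(mask)

        # Crude length estimate: skeleton pixel count times pixel size.
        PIXEL_SIZE_UM = 0.5            # placeholder calibration
        total_length_um = skeleton.sum() * PIXEL_SIZE_UM

        # "Territory": area of the convex polygon bounding the skeleton.
        ys, xs = np.nonzero(skeleton)
        hull = ConvexHull(np.column_stack([xs, ys]))
        territory_um2 = hull.volume * PIXEL_SIZE_UM ** 2  # 2-D hull: .volume is area

        print(f"length ~ {total_length_um:.1f} um, "
              f"territory ~ {territory_um2:.1f} um^2")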

  19. Semi-automated scar detection in delayed enhanced cardiac magnetic resonance images

    Science.gov (United States)

    Morisi, Rita; Donini, Bruno; Lanconelli, Nico; Rosengarden, James; Morgan, John; Harden, Stephen; Curzen, Nick

    2015-06-01

    Late enhancement cardiac magnetic resonance imaging (MRI) can precisely delineate myocardial scars. We present a semi-automated method for detecting scars in cardiac MRI. This method has the potential to improve routine clinical practice, since quantification is not currently offered due to time constraints. A first segmentation step was developed for extracting the target regions for potential scar and determining pre-candidate objects. Pattern recognition methods are then applied to the segmented images in order to detect the position of the myocardial scar. The database of late gadolinium enhancement (LE) cardiac MR images consists of 111 blocks of images acquired from 63 patients at the University Hospital Southampton NHS Foundation Trust (UK). At least one scar was present for each patient, and all the scars were manually annotated by an expert. A group of images (around one third of the entire set) was used for training the system, which was subsequently tested on all the remaining images. Four different classifiers were trained (Support Vector Machine (SVM), k-nearest neighbor (KNN), Bayesian and feed-forward neural network) and their performance was evaluated by using Free-response Receiver Operating Characteristic (FROC) analysis. Feature selection was implemented for analyzing the importance of the various features. The segmentation method proposed allowed the region affected by the scar to be extracted correctly in 96% of the blocks of images. The SVM was shown to be the best classifier for our task, and our system reached an overall sensitivity of 80% with less than 7 false positives per patient. The method we present provides an effective tool for detection of scars on cardiac MRI. This may be of value in clinical practice by permitting routine reporting of scar quantification.
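
    The candidate-classification step can be pictured as a standard supervised pipeline. The sketch below trains an SVM on synthetic candidate features and reads off sensitivity at one score threshold, in the spirit of a FROC operating point; the features, labels and threshold are all placeholders, not the study's data.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)

        # Placeholder feature vectors for candidate regions (e.g., intensity,
        # texture, shape descriptors); 1 = true scar, 0 = false positive.
        X = rng.normal(size=(400, 6))
        y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0)
        y = y.astype(int)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5,
                                                  random_state=0)
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
        clf.fit(X_tr, y_tr)

        # Sensitivity at a chosen score threshold, FROC-style.
        scores = clf.predict_proba(X_te)[:, 1]
        thr = 0.5
        sensitivity = ((scores >= thr) & (y_te == 1)).sum() / (y_te == 1).sum()
        print(f"sensitivity at threshold {thr}: {sensitivity:.2f}")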

  20. Application of semi-automated settlement detection for an integrated ...

    African Journals Online (AJOL)

    DRDLR

    The image is displayed as a false colour composite (RGB, 321) with red highlighting vegetation, cyan indicating settlement, and blue as water. 3. ... However, it is still regarded as a relatively new approach for earth observation image analysis ...
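
    Building a false colour composite of this kind amounts to stacking bands 3, 2 and 1 as R, G and B with a contrast stretch. A minimal sketch, with random arrays standing in for the actual bands:

        import numpy as np

        def false_colour_321(band3, band2, band1):
            """Stack bands 3,2,1 as R,G,B with a 2-98 percentile stretch."""
            def stretch(b):
                lo, hi = np.percentile(b, (2, 98))
                return np.clip((b - lo) / (hi - lo), 0, 1)
            return np.dstack([stretch(band3), stretch(band2), stretch(band1)])

        # Placeholder arrays standing in for NIR/red/green bands of a scene.
        rng = np.random.default_rng(1)
        b3, b2, b1 = (rng.random((100, 100)) for _ in range(3))
        rgb = false_colour_321(b3, b2, b1)
        print(rgb.shape)  # (100, 100, 3), ready for e.g. matplotlib imshow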

  1. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.; Julfakyan, Khachatur; Sommer, Christoph; Perez, Jose E.; Contreras, Maria F.; Khashab, Niveen M.; Kosel, Jü rgen; Ravasi, Timothy

    2016-01-01

    This study reports a novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types of nanostructures in live-cell assays.

  2. Semi-automated potentiometric titration method for uranium characterization.

    Science.gov (United States)

    Cristiano, B F G; Delgado, J U; da Silva, J W S; de Barros, P D; de Araújo, R M S; Lopes, R T

    2012-07-01

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. Copyright © 2011 Elsevier Ltd. All rights reserved.

  3. A High Throughput, 384-Well, Semi-Automated, Hepatocyte Intrinsic Clearance Assay for Screening New Molecular Entities in Drug Discovery.

    Science.gov (United States)

    Heinle, Lance; Peterkin, Vincent; de Morais, Sonia M; Jenkins, Gary J; Badagnani, Ilaria

    2015-01-01

    A high throughput, semi-automated clearance screening assay in hepatocytes was developed allowing a scientist to generate data for 96 compounds in one week. The 384-well format assay utilizes a Thermo Multidrop Combi and an optimized LC-MS/MS method. The previously reported LC-MS/MS method reduced the analytical run time by 3-fold, down to 1.2 min injection-to-injection. The Multidrop was able to deliver hepatocytes to 384-well plates with minimal viability loss. Comparison of results from the new 384-well and historical 24-well assays yielded a correlation of 0.95. In addition, results obtained for 25 marketed drugs with various metabolism pathways had a correlation of 0.75 when compared with literature values. Precision was maintained in the new format as 8 compounds tested in ≥39 independent experiments had coefficients of variation ≤21%. The ability to predict in vivo clearances using the new stability assay format was also investigated using 22 marketed drugs and 26 AbbVie compounds. Correction of intrinsic clearance values with binding to hepatocytes (in vitro data) and plasma (in vivo data) resulted in a higher in vitro to in vivo correlation when comparing 22 marketed compounds in human (0.80 vs 0.35) and 26 AbbVie Discovery compounds in rat (0.56 vs 0.17), demonstrating the importance of correcting for binding in clearance studies. This newly developed high throughput, semi-automated clearance assay allows for rapid screening of Discovery compounds to enable Structure Activity Relationship (SAR) analysis based on high quality hepatocyte stability data in sufficient quantity and quality to drive the next round of compound synthesis.
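
    The binding correction mentioned above is commonly applied through the well-stirred liver model; the abstract does not give the exact equations used, so the sketch below is a generic version with typical human scaling factors, not AbbVie's implementation.

        def predicted_hepatic_clearance(clint_ul_min_per_million, fu_plasma, fu_inc,
                                        q_h=20.7, cells_per_g=120e6,
                                        liver_g_per_kg=25.7):
            """Well-stirred model estimate of hepatic clearance (mL/min/kg).

            clint_ul_min_per_million : in vitro CLint (uL/min/10^6 hepatocytes)
            fu_plasma                : fraction unbound in plasma
            fu_inc                   : fraction unbound in the incubation
            q_h, cells_per_g, liver_g_per_kg : typical literature scaling
            factors (human), used here only for illustration.
            """
            # Scale in vitro CLint to whole-body units (mL/min/kg).
            clint_scaled = (clint_ul_min_per_million / 1000.0) \
                           * (cells_per_g / 1e6) * liver_g_per_kg
            # Binding-corrected well-stirred model.
            clint_u = clint_scaled * fu_plasma / fu_inc
            return q_h * clint_u / (q_h + clint_u)

        # Example with placeholder values: CLint 10 uL/min/10^6 cells,
        # fu,plasma 0.05, fu,incubation 0.8.
        print(round(predicted_hepatic_clearance(10.0, 0.05, 0.8), 2))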

  4. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    Science.gov (United States)

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate functional significance of a majority of genome sequences and their long range interactions. As a detailed examination of genome organization and function requires very high quality genome sequence, the objective of this study was to improve reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools proposed an expert mode for a user who can decide on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long insert size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super scaffolds. Finally, a consensus gene annotation was projected on the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes compared to the previous 70%. Unknown sites (N) were reduced from 17.3 to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. Bioinformatics tools developed in this work can be used to improve genome sequence assemblies in
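
    The N50 statistic quoted above (1.3 Mb improved to 3.0 Mb) is straightforward to compute from scaffold lengths; a minimal sketch with toy lengths:

        def n50(lengths):
            """Scaffold length at which 50% of the total assembly size is
            reached when scaffolds are sorted from longest to shortest."""
            total = sum(lengths)
            running = 0
            for length in sorted(lengths, reverse=True):
                running += length
                if running >= total / 2:
                    return length

        # Toy scaffold lengths in bp (not the Musa data).
        print(n50([5_000_000, 3_000_000, 1_200_000, 800_000, 500_000]))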

  5. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    Science.gov (United States)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.
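
    The dosimetry parameters such a program calculates (via Java in the paper) typically include DVH indices of the form Vx and Dx. A generic Python sketch of these two indices follows; the voxel doses are simulated and the index definitions are the conventional ones, not necessarily those encoded in the trial's XML.

        import numpy as np

        def v_x(dose_gy, threshold_gy):
            """Vx: fraction of structure volume receiving >= threshold_gy."""
            return float((dose_gy >= threshold_gy).mean())

        def d_x(dose_gy, volume_percent):
            """Dx: minimum dose received by the hottest volume_percent."""
            return float(np.percentile(dose_gy, 100.0 - volume_percent))

        # Placeholder voxel doses (Gy) sampled inside one structure.
        rng = np.random.default_rng(2)
        doses = rng.normal(loc=60.0, scale=5.0, size=10_000)

        print(f"V50Gy = {v_x(doses, 50.0):.3f}")     # volume fraction >= 50 Gy
        print(f"D95   = {d_x(doses, 95.0):.1f} Gy")  # dose covering 95% of volume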

  6. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel TL using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.

  7. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    International Nuclear Information System (INIS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Xiao, Ying; Lee, Andrew J

    2013-01-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials evaluating the plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of indices, there are minimal differences between the two methods. The evaluation time is reduced from 10–20 min to 2 min by applying the semi-automated plan-quality evaluation program. (note)

  8. A semi-automated method for bone age assessment using cervical vertebral maturation.

    Science.gov (United States)

    Baptista, Roberto S; Quaglio, Camila L; Mourad, Laila M E H; Hummel, Anderson D; Caetano, Cesar Augusto C; Ortolani, Cristina Lúcia F; Pisa, Ivan T

    2012-07-01

    To propose a semi-automated method for pattern classification to predict individuals' stage of growth based on morphologic characteristics that are described in the modified cervical vertebral maturation (CVM) method of Baccetti et al. A total of 188 lateral cephalograms were collected, digitized, evaluated manually, and grouped into cervical stages by two expert examiners. Landmarks were located on each image and measured. Three pattern classifiers based on the Naïve Bayes algorithm were built and assessed using a software program. The classifier with the greatest accuracy according to the weighted kappa test was considered best. The classifier showed a weighted kappa coefficient of 0.861 ± 0.020. If an adjacent estimated pre-stage or poststage value was taken to be acceptable, the classifier would show a weighted kappa coefficient of 0.992 ± 0.019. Results from this study show that the proposed semi-automated pattern classification method can help orthodontists identify the stage of CVM. However, additional studies are needed before this semi-automated classification method for CVM assessment can be implemented in clinical practice.
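
    A minimal sketch of this classification-plus-agreement setup, using scikit-learn's Gaussian Naïve Bayes and a weighted kappa (quadratic weights assumed here; the abstract does not specify the weighting scheme), with invented cephalometric features:

        import numpy as np
        from sklearn.metrics import cohen_kappa_score
        from sklearn.model_selection import train_test_split
        from sklearn.naive_bayes import GaussianNB

        rng = np.random.default_rng(3)

        # Placeholder landmark-derived measurements and expert CVM stages 1-6;
        # random data here, so the resulting kappa will be near zero.
        X = rng.normal(size=(188, 8))
        y = rng.integers(1, 7, size=188)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        clf = GaussianNB().fit(X_tr, y_tr)
        pred = clf.predict(X_te)

        # Weighted kappa, as used to assess stage agreement in the abstract.
        print(cohen_kappa_score(y_te, pred, weights="quadratic"))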

  9. Semi-automated knowledge discovery: identifying and profiling human trafficking

    Science.gov (United States)

    Poelmans, Jonas; Elzinga, Paul; Ignatov, Dmitry I.; Kuznetsov, Sergei O.

    2012-11-01

    We propose an iterative and human-centred knowledge discovery methodology based on formal concept analysis. The proposed approach recognizes the important role of the domain expert in mining real-world enterprise applications and makes use of specific domain knowledge, including human intelligence and domain-specific constraints. Our approach was empirically validated at the Amsterdam-Amstelland police to identify suspects and victims of human trafficking in 266,157 suspicious activity reports. Based on guidelines of the Attorney Generals of the Netherlands, we first defined multiple early warning indicators that were used to index the police reports. Using concept lattices, we revealed numerous unknown human trafficking and loverboy suspects. In-depth investigation by the police resulted in a confirmation of their involvement in illegal activities, resulting in actual arrests being made. Our human-centred approach was embedded into operational policing practice and is now successfully used on a daily basis to cope with the rapidly growing amount of unstructured information.
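
    At the core of formal concept analysis are the two derivation operators between object sets and attribute sets. A toy sketch with an invented report-by-indicator context (the real indicators and reports are, of course, not shown):

        # Toy formal context: reports (objects) x indicator terms (attributes).
        context = {
            "report_1": {"young_victim", "foreign_country", "cash_payment"},
            "report_2": {"young_victim", "cash_payment"},
            "report_3": {"foreign_country"},
            "report_4": {"young_victim", "foreign_country", "cash_payment"},
        }

        def extent(attrs):
            """Objects having all given attributes."""
            return {o for o, a in context.items() if attrs <= a}

        def intent(objs):
            """Attributes shared by all given objects."""
            sets = [context[o] for o in objs]
            return set.intersection(*sets) if sets else set()

        # A formal concept is a pair (extent, intent) closed under both maps.
        attrs = {"young_victim", "cash_payment"}
        objs = extent(attrs)
        print(objs, intent(objs))  # the concept generated by these indicators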

  10. Semi-automated potentiometric titration method for uranium characterization

    Energy Technology Data Exchange (ETDEWEB)

    Cristiano, B.F.G., E-mail: barbara@ird.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Delgado, J.U.; Silva, J.W.S. da; Barros, P.D. de; Araujo, R.M.S. de [Comissao Nacional de Energia Nuclear (CNEN), Instituto de Radioprotecao e Dosimetria (IRD), Avenida Salvador Allende s/n Recreio dos Bandeirantes, PO Box 37750, Rio de Janeiro, 22780-160 RJ (Brazil); Lopes, R.T. [Programa de Engenharia Nuclear (PEN/COPPE), Universidade Federal do Rio de Janeiro (UFRJ), Ilha do Fundao, PO Box 68509, Rio de Janeiro, 21945-970 RJ (Brazil)

    2012-07-15

    The manual version of the potentiometric titration method has been used for certification and characterization of uranium compounds. In order to reduce the analysis time and the influence of the analyst, a semi-automatic version of the method was developed in the Brazilian Nuclear Energy Commission. The method was applied with traceability assured by using a potassium dichromate primary standard. The combined standard uncertainty in determining the total concentration of uranium was around 0.01%, which is suitable for uranium characterization. - Highlights: • We developed a semi-automatic version of the potentiometric titration method. • The method is used for certification and characterization of uranium compounds. • The traceability of the method was assured by a K₂Cr₂O₇ primary standard. • The results of the U₃O₈ reference material analyzed were consistent with the certified value. • The uncertainty obtained, near 0.01%, is useful for characterization purposes.

  11. Semi-automated quantification of living cells with internalized nanostructures

    KAUST Repository

    Margineanu, Michael B.

    2016-01-15

    Background Nanostructures fabricated by different methods have become increasingly important for various applications in biology and medicine, such as agents for medical imaging or cancer therapy. In order to understand their interaction with living cells and their internalization kinetics, several attempts have been made in tagging them. Although methods have been developed to measure the number of nanostructures internalized by the cells, there are only few approaches aimed to measure the number of cells that internalize the nanostructures, and they are usually limited to fixed-cell studies. Flow cytometry can be used for live-cell assays on large populations of cells, however it is a single time point measurement, and does not include any information about cell morphology. To date many of the observations made on internalization events are limited to few time points and cells. Results In this study, we present a method for quantifying cells with internalized magnetic nanowires (NWs). A machine learning-based computational framework, CellCognition, is adapted and used to classify cells with internalized and no internalized NWs, labeled with the fluorogenic pH-dependent dye pHrodo™ Red, and subsequently to determine the percentage of cells with internalized NWs at different time points. In a “proof-of-concept”, we performed a study on human colon carcinoma HCT 116 cells and human epithelial cervical cancer HeLa cells interacting with iron (Fe) and nickel (Ni) NWs. Conclusions This study reports a novel method for the quantification of cells that internalize a specific type of nanostructures. This approach is suitable for high-throughput and real-time data analysis and has the potential to be used to study the interaction of different types of nanostructures in live-cell assays.

  12. Accuracy and reproducibility of aortic annular measurements obtained from echocardiographic 3D manual and semi-automated software analyses in patients referred for transcatheter aortic valve implantation: implication for prosthesis size selection.

    Science.gov (United States)

    Stella, Stefano; Italia, Leonardo; Geremia, Giulia; Rosa, Isabella; Ancona, Francesco; Marini, Claudia; Capogrosso, Cristina; Giglio, Manuela; Montorfano, Matteo; Latib, Azeem; Margonato, Alberto; Colombo, Antonio; Agricola, Eustachio

    2018-02-06

    A 3D transoesophageal echocardiography (3D-TOE) reconstruction tool has recently been introduced. The system automatically configures a geometric model of the aortic root and performs quantitative analysis of these structures. We compared the measurements of the aortic annulus (AA) obtained by semi-automated 3D-TOE quantitative software and by manual analysis vs. multislice computed tomography (MSCT) ones. One hundred and seventy-five patients (mean age 81.3 ± 6.3 years, 77 men) who underwent both MSCT and 3D-TOE for annulus assessment before transcatheter aortic valve implantation were analysed. Hypothetical prosthetic valve sizing was evaluated using the 3D manual and semi-automated measurements, with the manufacturer-recommended CT-based sizing algorithm as gold standard. Good correlation between both 3D-TOE methods and MSCT measurements was found, but the semi-automated analysis demonstrated slightly better correlations for AA major diameter (r = 0.89), perimeter (r = 0.89), and area (r = 0.85) (all P < 0.001). Both 3D methods underestimated the MSCT measurements, but semi-automated measurements showed narrower limits of agreement and less bias than manual measurements for most AA parameters. On average, 3D-TOE semi-automated major diameter, area, and perimeter underestimated the respective MSCT measurements by 7.4%, 3.5%, and 4.4%, respectively, whereas minor diameter was overestimated by 0.3%. Moderate agreement for valve sizing was found for both 3D-TOE techniques: kappa agreement 0.5 for both semi-automated and manual analysis. Interobserver and intraobserver agreements for the AA measurements were excellent for both techniques (intraclass correlation coefficients for all parameters >0.80). The 3D-TOE semi-automated analysis of AA is feasible and reliable and can be used in clinical practice as an alternative to MSCT for AA assessment.

  13. Process analysis of the modelled 3-D mesoscale impact of aircraft emissions on the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J; Ebel, A; Lippert, E; Petry, H [Koeln Univ. (Germany). Inst. fuer Geophysik und Meterorologie

    1998-12-31

    A mesoscale chemistry transport model is applied to study the impact of aircraft emissions on the atmospheric trace gas composition. A special analysis of the simulations is conducted to separate the effects of chemistry, transport, diffusion and cloud processes on the transformation of the exhausts of a subsonic fleet cruising over the North Atlantic. The aircraft induced ozone production strongly depends on the tropopause height and the cruise altitude. Aircraft emissions may undergo an effective downward transport under the influence of stratosphere-troposphere exchange activity. (author) 12 refs.

  14. Process analysis of the modelled 3-D mesoscale impact of aircraft emissions on the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Hendricks, J.; Ebel, A.; Lippert, E.; Petry, H. [Koeln Univ. (Germany). Inst. fuer Geophysik und Meterorologie

    1997-12-31

    A mesoscale chemistry transport model is applied to study the impact of aircraft emissions on the atmospheric trace gas composition. A special analysis of the simulations is conducted to separate the effects of chemistry, transport, diffusion and cloud processes on the transformation of the exhausts of a subsonic fleet cruising over the North Atlantic. The aircraft induced ozone production strongly depends on the tropopause height and the cruise altitude. Aircraft emissions may undergo an effective downward transport under the influence of stratosphere-troposphere exchange activity. (author) 12 refs.

  15. Semi-automated identification of artefact and noise signals in MEG sensors

    International Nuclear Information System (INIS)

    Rettich, E.

    2006-09-01

    The semi-automated solution presented here was tested on real MEG data.

  16. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

    Machine vision is widely used in industry to detect equipment failure and identify defective products at the assembly line. The research work in this thesis combines machine vision and image processing technology to build a digital imaging and processing system for monitoring and measuring lake ice thickness in real time. An ultra-compact USB camera is programmed to acquire and transmit high-resolution imagery for processing with the MATLAB Image Processing Toolbox. The image acquisition and transmission process is fully automated; image analysis is semi-automated and requires limited user input. Potential design changes to the prototype and ideas on fully automating the imaging and processing procedure are presented to conclude this research work.

  17. Evaluation of a semi-automated computer algorithm for measuring total fat and visceral fat content in lambs undergoing in vivo whole body computed tomography.

    Science.gov (United States)

    Rosenblatt, Alana J; Scrivani, Peter V; Boisclair, Yves R; Reeves, Anthony P; Ramos-Nieves, Jose M; Xie, Yiting; Erb, Hollis N

    2017-10-01

    Computed tomography (CT) is a suitable tool for measuring body fat, since it is non-destructive and can be used to differentiate metabolically active visceral fat from total body fat. Whole body analysis of body fat is likely to be more accurate than single CT slice estimates of body fat. The aim of this study was to assess the agreement between semi-automated computer analysis of whole body volumetric CT data and conventional proximate (chemical) analysis of body fat in lambs. Data were collected prospectively from 12 lambs that underwent duplicate whole body CT, followed by slaughter and carcass analysis by dissection and chemical analysis. Agreement between methods for quantification of total and visceral fat was assessed by Bland-Altman plot analysis. The repeatability of CT was assessed for these measures using the mean difference of duplicated measures. When compared to chemical analysis, CT systematically underestimated total and visceral fat contents by more than 10% of the mean fat weight. Therefore, carcass analysis and semi-automated CT computer measurements were not interchangeable for quantifying body fat content without the use of a correction factor. CT acquisition was repeatable, with a mean difference of repeated measures being close to zero. Therefore, uncorrected whole body CT might have an application for assessment of relative changes in fat content, especially in growing lambs. Copyright © 2017 Elsevier Ltd. All rights reserved.
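
    Bland-Altman agreement of the kind used here reduces to the mean difference between methods and its 95% limits of agreement. A minimal sketch with invented fat weights (not the study's data):

        import numpy as np

        def bland_altman(method_a, method_b):
            """Mean bias and 95% limits of agreement between two methods."""
            a = np.asarray(method_a, float)
            b = np.asarray(method_b, float)
            diff = a - b
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, bias - half_width, bias + half_width

        # Placeholder fat weights (kg): CT-derived vs chemical analysis.
        ct = [4.1, 5.0, 3.2, 6.3, 4.8, 5.5, 3.9, 4.4, 5.9, 4.0, 5.2, 3.6]
        chem = [4.6, 5.6, 3.7, 7.0, 5.3, 6.2, 4.3, 5.0, 6.5, 4.6, 5.8, 4.1]

        bias, lower, upper = bland_altman(ct, chem)
        print(f"bias = {bias:.2f} kg, 95% LoA = ({lower:.2f}, {upper:.2f}) kg")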

  18. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    International Nuclear Information System (INIS)

    Lee, Myung Eun; Kim, Jong Hyo; Woo, Bo Yeong; Ko, Micheal D.; Jamshidi, Neema

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  19. A comparison of semi-automated volumetric vs linear measurement of small vestibular schwannomas.

    Science.gov (United States)

    MacKeith, Samuel; Das, Tilak; Graves, Martin; Patterson, Andrew; Donnelly, Neil; Mannion, Richard; Axon, Patrick; Tysome, James

    2018-04-01

    Accurate and precise measurement of vestibular schwannoma (VS) size is key to clinical management decisions. Linear measurements are used in routine clinical practice but are prone to measurement error. This study aims to compare a semi-automated volume segmentation tool against the standard linear method for measuring small VS. This study also examines whether oblique tumour orientation can contribute to linear measurement error. Experimental comparison of observer agreement using two measurement techniques. Tertiary skull base unit. Twenty-four patients with unilateral sporadic small VS were included; outcome measures included intra- and inter-observer agreement and the change in maximum linear dimension following reformatting to correct for oblique orientation of the VS. Intra-observer ICC was higher for semi-automated volumetric than for linear measurements, 0.998 (95% CI 0.994-0.999) vs 0.936 (95% CI 0.856-0.972), p < 0.001, as was inter-observer ICC, 0.989 (95% CI 0.975-0.995) vs 0.946 (95% CI 0.880-0.976), p = 0.0045. The intra-observer %SDD was similar for volumetric and linear measurements, 9.9% vs 11.8%. However, the inter-observer %SDD was greater for volumetric than linear measurements, 20.1% vs 10.6%. Following oblique reformatting to correct tumour angulation, the mean increase in size was 1.14 mm (p = 0.04). Semi-automated volumetric measurements are more repeatable than linear measurements when measuring small VS and should be considered for use in clinical practice. Oblique orientation of VS may contribute to linear measurement error.

  20. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myung Eun; Kim, Jong Hyo [Center for Medical-IT Convergence Technology Research, Advanced Institutes of Convergence Technology, Seoul National University, Suwon (Korea, Republic of); Woo, Bo Yeong [Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon (Korea, Republic of); Ko, Micheal D.; Jamshidi, Neema [Dept. of Radiological Sciences, University of California, Los Angeles, Los Angeles (United States)

    2017-06-15

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability; thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  1. Vessel suppressed chest Computed Tomography for semi-automated volumetric measurements of solid pulmonary nodules.

    Science.gov (United States)

    Milanese, Gianluca; Eberhard, Matthias; Martini, Katharina; Vittoria De Martini, Ilaria; Frauenfelder, Thomas

    2018-04-01

    To evaluate whether vessel-suppressed computed tomography (VSCT) can be reliably used for semi-automated volumetric measurements of solid pulmonary nodules, as compared to standard CT (SCT). MATERIAL AND METHODS: Ninety-three SCT examinations were processed with dedicated software (ClearRead CT, Riverain Technologies, Miamisburg, OH, USA) that subtracts vessels from the lung parenchyma. Semi-automated volumetric measurements of 65 solid nodules were compared between SCT and VSCT. The measurements were repeated by two readers. For each solid nodule, the volumes measured on SCT by Reader 1 and Reader 2 were averaged, and this average volume acted as the standard of reference value. Concordance between measurements was assessed using Lin's Concordance Correlation Coefficient (CCC). Limits of agreement (LoA) between readers and CT datasets were evaluated. Standard of reference nodule volume ranged from 13 to 366 mm³. The mean overestimation between readers was 3 mm³ and 2.9 mm³ on SCT and VSCT, respectively. Semi-automated volumetric measurements on VSCT showed substantial agreement with the standard of reference (Lin's CCC = 0.990 for Reader 1; 0.985 for Reader 2). The upper and lower LoA between readers' measurements were (16.3, -22.4 mm³) and (15.5, -21.4 mm³) for SCT and VSCT, respectively. VSCT datasets are suitable for the measurement of solid nodules, showing an almost perfect concordance between readers and with measurements on SCT. Copyright © 2018 Elsevier B.V. All rights reserved.
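
    Lin's CCC used above can be computed directly from paired measurements; a minimal sketch with invented nodule volumes:

        import numpy as np

        def lins_ccc(x, y):
            """Lin's concordance correlation coefficient for paired data."""
            x = np.asarray(x, float)
            y = np.asarray(y, float)
            mx, my = x.mean(), y.mean()
            vx, vy = x.var(), y.var()   # population variances, as in Lin (1989)
            cov = ((x - mx) * (y - my)).mean()
            return 2 * cov / (vx + vy + (mx - my) ** 2)

        # Placeholder nodule volumes (mm^3): reader vs reference.
        reader = [15, 42, 88, 130, 210, 365]
        reference = [13, 45, 90, 128, 205, 366]
        print(round(lins_ccc(reader, reference), 3))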

  2. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Full Text Available Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that we believe may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.

  3. Evaluation and optimisation of preparative semi-automated electrophoresis systems for Illumina library preparation.

    Science.gov (United States)

    Quail, Michael A; Gu, Yong; Swerdlow, Harold; Mayho, Matthew

    2012-12-01

    Size selection can be a critical step in preparation of next-generation sequencing libraries. Traditional methods employing gel electrophoresis lack reproducibility, are labour intensive, do not scale well and employ hazardous intercalating dyes. In a high-throughput setting, solid-phase reversible immobilisation beads are commonly used for size selection, but result in quite a broad fragment size range. We have evaluated and optimised the use of two semi-automated preparative DNA electrophoresis systems, the Caliper Labchip XT and the Sage Science Pippin Prep, for size selection of Illumina sequencing libraries. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Evaluation of training nurses to perform semi-automated three-dimensional left ventricular ejection fraction using a customised workstation-based training protocol.

    Science.gov (United States)

    Guppy-Coles, Kristyan B; Prasad, Sandhir B; Smith, Kym C; Hillier, Samuel; Lo, Ada; Atherton, John J

    2015-06-01

    We aimed to determine the feasibility of training cardiac nurses to evaluate left ventricular function utilising a semi-automated, workstation-based protocol on three-dimensional echocardiography images. Assessment of left ventricular function by nurses is an attractive concept. Recent developments in three-dimensional echocardiography coupled with border detection assistance have reduced inter- and intra-observer variability and analysis time. This could allow abbreviated training of nurses to assess cardiac function. A comparative, diagnostic accuracy study evaluating left ventricular ejection fraction assessment utilising a semi-automated, workstation-based protocol performed by echocardiography-naïve nurses on previously acquired three-dimensional echocardiography images. Nine cardiac nurses underwent two brief lectures about cardiac anatomy, physiology and three-dimensional left ventricular ejection fraction assessment, before a hands-on demonstration in 20 cases. We then selected 50 cases from our three-dimensional echocardiography library based on optimal image quality with a broad range of left ventricular ejection fractions, which were quantified by two experienced sonographers, with the average used as the comparator for the nurses. Nurses independently measured three-dimensional left ventricular ejection fraction using the AutoLVQ package with semi-automated border detection. The left ventricular ejection fraction range was 25-72% (70% of cases with a reduced left ventricular ejection fraction). The nurses showed excellent agreement with the sonographers. Minimal intra-observer variability was noted on both short-term (same day) and long-term (>2 weeks later) retest. It is feasible to train nurses to measure left ventricular ejection fraction utilising a semi-automated, workstation-based protocol on previously acquired three-dimensional echocardiography images. Further study is needed to determine the feasibility of training nurses to acquire three-dimensional echocardiography images.

  5. Investigating Semi-Automated Cadastral Boundaries Extraction from Airborne Laser Scanned Data

    Directory of Open Access Journals (Sweden)

    Xianghuan Luo

    2017-09-01

    Full Text Available Many developing countries have witnessed the urgent need to accelerate cadastral surveying processes. Previous studies found that large portions of cadastral boundaries coincide with visible physical objects, namely roads, fences, and building walls. This research explores the application of airborne laser scanning (ALS) techniques to cadastral surveys. A semi-automated workflow is developed to extract cadastral boundaries from an ALS point cloud. Firstly, a two-phased workflow was developed that focused on extracting digital representations of physical objects. In the automated extraction phase, after classifying points into semantic components, the outlines of planar objects such as building roofs and road surfaces were generated by an α-shape algorithm, whilst a centerline delineation approach was fitted to the linear object—a fence. Afterwards, the extracted vector lines were edited and refined during the post-refinement phase. Secondly, we quantitatively evaluated the workflow performance by comparing results against an existing cadastral map as reference. It was found that the workflow achieved promising results: around 80% completeness and 60% correctness on average, although the spatial accuracy is still modest. It is argued that the semi-automated extraction workflow could effectively speed up cadastral surveying, with both human resources and equipment costs being reduced.
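
    The α-shape step for outlining planar objects can be sketched from first principles: keep Delaunay triangles whose circumradius is small enough, then take the edges used by exactly one kept triangle as the outline. The implementation below is a generic 2-D version with random points standing in for projected roof points; it is not the study's code.

        import numpy as np
        from scipy.spatial import Delaunay

        def alpha_shape_edges(points, alpha):
            """Boundary edges of the 2-D alpha-shape of a point set."""
            tri = Delaunay(points)
            edge_counts = {}
            for ia, ib, ic in tri.simplices:
                a, b, c = points[ia], points[ib], points[ic]
                la = np.linalg.norm(b - c)
                lb = np.linalg.norm(a - c)
                lc = np.linalg.norm(a - b)
                area = 0.5 * abs((b[0] - a[0]) * (c[1] - a[1])
                                 - (b[1] - a[1]) * (c[0] - a[0]))
                if area == 0:
                    continue
                r = la * lb * lc / (4.0 * area)   # circumradius
                if r < 1.0 / alpha:               # keep "small" triangles only
                    for e in [(ia, ib), (ib, ic), (ic, ia)]:
                        key = tuple(sorted(e))
                        edge_counts[key] = edge_counts.get(key, 0) + 1
            # Edges on the outline belong to exactly one kept triangle.
            return [e for e, n in edge_counts.items() if n == 1]

        # Placeholder projected roof points (x, y) from a classified ALS cloud.
        rng = np.random.default_rng(4)
        pts = rng.random((200, 2)) * 10.0
        print(len(alpha_shape_edges(pts, alpha=0.5)), "boundary edges")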

  6. Quality of Radiomic Features in Glioblastoma Multiforme: Impact of Semi-Automated Tumor Segmentation Software.

    Science.gov (United States)

    Lee, Myungeun; Woo, Boyeong; Kuo, Michael D; Jamshidi, Neema; Kim, Jong Hyo

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  7. A semi-automated algorithm for hypothalamus volumetry in 3 Tesla magnetic resonance images.

    Science.gov (United States)

    Wolff, Julia; Schindler, Stephanie; Lucas, Christian; Binninger, Anne-Sophie; Weinrich, Luise; Schreiber, Jan; Hegerl, Ulrich; Möller, Harald E; Leitzke, Marco; Geyer, Stefan; Schönknecht, Peter

    2018-07-30

    The hypothalamus, a small diencephalic gray matter structure, is part of the limbic system. Volumetric changes of this structure occur in psychiatric diseases, therefore there is increasing interest in precise volumetry. Based on our detailed volumetry algorithm for 7 Tesla magnetic resonance imaging (MRI), we developed a method for 3 Tesla MRI, adopting anatomical landmarks and work in triplanar view. We overlaid T1-weighted MR images with gray matter-tissue probability maps to combine anatomical information with tissue class segmentation. Then, we outlined regions of interest (ROIs) that covered potential hypothalamus voxels. Within these ROIs, seed growing technique helped define the hypothalamic volume using gray matter probabilities from the tissue probability maps. This yielded a semi-automated method with short processing times of 20-40 min per hypothalamus. In the MRIs of ten subjects, reliabilities were determined as intraclass correlations (ICC) and volume overlaps in percent. Three raters achieved very good intra-rater reliabilities (ICC 0.82-0.97) and good inter-rater reliabilities (ICC 0.78 and 0.82). Overlaps of intra- and inter-rater runs were very good (≥ 89.7%). We present a fast, semi-automated method for in vivo hypothalamus volumetry in 3 Tesla MRI. Copyright © 2018 Elsevier B.V. All rights reserved.
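
    The seed-growing step can be pictured as a breadth-first flood fill over the gray matter probability map, constrained to the traced ROI. A minimal 3-D sketch follows; the probability threshold, toy map and 6-connectivity are assumptions, not the published protocol.

        from collections import deque
        import numpy as np

        def region_grow(prob_map, seed, prob_min=0.5, roi_mask=None):
            """Grow a region from a seed voxel, adding 6-connected neighbours
            whose tissue probability exceeds prob_min (and that lie inside an
            optional ROI mask)."""
            shape = prob_map.shape
            grown = np.zeros(shape, dtype=bool)
            grown[seed] = True
            queue = deque([seed])
            offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0),
                       (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                z, y, x = queue.popleft()
                for dz, dy, dx in offsets:
                    n = (z + dz, y + dy, x + dx)
                    if any(c < 0 or c >= s for c, s in zip(n, shape)):
                        continue
                    if grown[n] or prob_map[n] <= prob_min:
                        continue
                    if roi_mask is not None and not roi_mask[n]:
                        continue
                    grown[n] = True
                    queue.append(n)
            return grown

        # Toy probability map with a high-probability blob around the seed.
        prob = np.zeros((20, 20, 20))
        prob[8:12, 8:12, 8:12] = 0.9
        mask = region_grow(prob, seed=(10, 10, 10), prob_min=0.5)
        print(mask.sum(), "voxels grown; volume = voxel count * voxel size")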

  8. A Study on the Cost-Effectiveness of a SemiAutomated Cutting Process at a Garment Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Castro, Mark Daniel

    2017-11-01

    Full Text Available The subject of the study, Company X, has been experiencing variations in the quantity report from the cutting department and the transmittal reports. The management found that these processes are hugely affected by manual labor. To reduce the system's proneness to human error, the management decided to explore the possibility of adopting a semi-automated spreading and cutting process in the system. This research aims to evaluate the pre-sewing processes of Company X and whether introducing automation can be beneficial to the company and the garments industry. The researchers used process mapping tools, descriptive research, and process flowcharts to assess the current and proposed systems, and engineering economics to evaluate the costs and benefits of implementing the semi-automated system. The results showed that with the implementation of the semi-automated system, the company will realize 66.61% more savings per year than with the current system. In terms of cycle time, the semi-automated system eliminated the relaxation of fabric before the cutting process, thereby greatly reducing cycle time. In addition, the researchers found that as long as the company produces more than 4,140 pieces per day, the system will be economically feasible. Unquantifiable benefits of introducing the semi-automated system were also identified. The company can have a cleaner work environment that will lead to more productivity and greater quality of goods. This will lead to a better company image that will encourage more customers to place job orders.

  9. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects pushed forward the development of automated crystallization platforms that are now commonly used. This created an urgent need for adapted and automated equipment for crystal analysis. However, first these crystals have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam represents the bottleneck in the analysis process. In this article, a new method to accelerate this process, by accurately recording the local geometry coordinates for each crystal in the crystallization plate, is presented. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for automated crystal centering in the beam, in situ screening or data collection. Here the preliminary results of such a semi-automated pipeline are reported for two distinct test proteins. (authors)

  10. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    Science.gov (United States)

    2014-01-01

    Background Computational methods for mining of biomedical literature can be useful in augmenting manual searches of the literature using keywords for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER, and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine significance of putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer
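
    A z-score ranking of this kind can be computed, for example, as a two-proportion z-test on a term's occurrence rate in the positive versus negative abstract sets; the abstract does not give the exact formula, so the sketch below is one plausible reading, with invented counts.

        import math

        def proportion_z(pos_hits, pos_total, neg_hits, neg_total):
            """Two-proportion z-score for a term's occurrence rate in positive
            (disease + biofluid) vs negative abstracts."""
            p1 = pos_hits / pos_total
            p2 = neg_hits / neg_total
            p = (pos_hits + neg_hits) / (pos_total + neg_total)
            se = math.sqrt(p * (1 - p) * (1 / pos_total + 1 / neg_total))
            return (p1 - p2) / se

        # Placeholder counts: a candidate marker tagged by ABNER in the
        # positive vs negative breast cancer sets (set sizes from the abstract).
        z = proportion_z(pos_hits=150, pos_total=34_296,
                         neg_hits=2_000, neg_total=2_653_396)
        print(round(z, 1))  # rank candidate biomarkers by descending z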

  11. Feasibility of geometric-intensity-based semi-automated delineation of the tentorium cerebelli from MRI scans.

    Science.gov (United States)

    Penumetcha, Neeraja; Kabadi, Suraj; Jedynak, Bruno; Walcutt, Charles; Gado, Mokhtar H; Wang, Lei; Ratnanather, J Tilak

    2011-04-01

    This paper describes a feasibility study of a method for delineating the tentorium cerebelli in magnetic resonance imaging (MRI) brain scans. The tentorium cerebelli is a thin sheet of dura mater covering the cerebellum and separating it from the posterior part of the temporal lobe and the occipital lobe of the cerebral hemispheres. Cortical structures such as the parahippocampal gyrus can be indistinguishable from the tentorium in magnetization-prepared rapid gradient-echo and T1-weighted MRI scans. Similar intensities in these neighboring regions make it difficult to perform accurate cortical analysis in neuroimaging studies of schizophrenia and Alzheimer's disease. A semi-automated, geometric, intensity-based procedure for delineating the tentorium from a whole-brain scan is described. Initial and final curves are traced within the tentorium. A cost function, based on intensity and Euclidean distance, is computed between the two curves using the Fast Marching method. The initial curve is then evolved to the final curve based on the gradient of the computed costs, generating a series of intermediate curves. These curves are then used to generate a triangulated surface of the tentorium. For 3 scans, surfaces were found to be within 2 voxels of hand segmentations. Copyright © 2009 by the American Society of Neuroimaging.
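
    The cost-and-evolution step can be sketched in a few lines. Assuming the scikit-fmm package as a stand-in for the authors' Fast Marching implementation, the fragment below builds an intensity-weighted speed map, computes a travel-time (cost) map from a hypothetical final curve, and exposes the gradient the initial curve would descend; the geometry and parameters are illustrative only.

```python
# Sketch of the Fast Marching cost computation; scikit-fmm is an assumption,
# not the authors' code, and the "final curve" here is a synthetic line.
import numpy as np
import skfmm  # pip install scikit-fmm

intensity = np.random.rand(64, 64)          # stand-in for an MRI slice
phi = np.ones_like(intensity)
phi[32, 10:50] = -1.0                       # sign change marks the final curve
speed = np.exp(-((intensity - intensity[32, 30]) ** 2) / 0.02)  # intensity term

cost = np.asarray(skfmm.travel_time(phi, speed))  # cost from each voxel to curve
grad_r, grad_c = np.gradient(cost)                # descend this to evolve the
print(cost.shape, float(np.nanmin(cost)))         # initial curve toward the final
```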

  12. Percutaneous biopsy of a metastatic common iliac lymph node using hydrodissection and a semi-automated biopsy gun

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Seong Yoon; Park, Byung Kwan [Dept. of Radiology, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)]

    2017-06-15

    Percutaneous biopsy is a less invasive technique for tissue sampling than laparoscopic biopsy or exploratory laparotomy. However, biopsy of a deep-seated lesion is difficult to perform because of the risk of damage to critical organs. Recently, we successfully performed CT-guided biopsy of a metastatic common iliac lymph node using hydrodissection and a semi-automated biopsy device. The purpose of this case report is to show how to perform hydrodissection and how to use a semi-automated gun for safe biopsy of a metastatic common iliac lymph node.

  13. Intra- and interoperator variability of lobar pulmonary volumes and emphysema scores in patients with chronic obstructive pulmonary disease and emphysema: comparison of manual and semi-automated segmentation techniques.

    Science.gov (United States)

    Molinari, Francesco; Pirronti, Tommaso; Sverzellati, Nicola; Diciotti, Stefano; Amato, Michele; Paolantonio, Guglielmo; Gentile, Luigia; Parapatt, George K; D'Argento, Francesco; Kuhnigk, Jan-Martin

    2013-01-01

    We aimed to compare the intra- and interoperator variability of lobar volumetry and emphysema scores obtained by semi-automated and manual segmentation techniques in lung emphysema patients. In two sessions held three months apart, two operators performed lobar volumetry of unenhanced chest computed tomography examinations of 47 consecutive patients with chronic obstructive pulmonary disease and lung emphysema. Both operators used the manual and semi-automated segmentation techniques. The intra- and interoperator variability of the volumes and emphysema scores obtained by semi-automated segmentation was compared with the variability obtained by manual segmentation of the five pulmonary lobes. The intra- and interoperator variability of the lobar volumes decreased when using semi-automated lobe segmentation (coefficients of repeatability for the first operator: right upper lobe, 147 vs. 96.3; right middle lobe, 137.7 vs. 73.4; right lower lobe, 89.2 vs. 42.4; left upper lobe, 262.2 vs. 54.8; and left lower lobe, 260.5 vs. 56.5; coefficients of repeatability for the second operator: right upper lobe, 61.4 vs. 48.1; right middle lobe, 56 vs. 46.4; right lower lobe, 26.9 vs. 16.7; left upper lobe, 61.4 vs. 27; and left lower lobe, 63.6 vs. 27.5; coefficients of reproducibility in the interoperator analysis: right upper lobe, 191.3 vs. 102.9; right middle lobe, 219.8 vs. 126.5; right lower lobe, 122.6 vs. 90.1; left upper lobe, 166.9 vs. 68.7; and left lower lobe, 168.7 vs. 71.6). The coefficients of repeatability and reproducibility of emphysema scores also decreased when using semi-automated segmentation, with ranges that varied depending on the target lobe and the selected emphysema threshold. Semi-automated segmentation reduces the intra- and interoperator variability of lobar volumetry and provides a more objective tool than the manual technique for quantifying lung volumes and the severity of emphysema.
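
    For reference, a minimal sketch of the repeatability statistic reported above, taken here as 1.96 times the standard deviation of paired differences (a common Bland-Altman convention; the authors' exact formula is not stated), applied to invented lobe volumes:

```python
# Coefficient of repeatability between two measurement sessions (assumed
# Bland-Altman convention); all volumes below are invented examples.
import numpy as np

def coefficient_of_repeatability(session1, session2):
    diffs = np.asarray(session1, float) - np.asarray(session2, float)
    return 1.96 * diffs.std(ddof=1)

manual    = coefficient_of_repeatability([520, 480, 610], [500, 495, 590])
semi_auto = coefficient_of_repeatability([512, 488, 602], [510, 490, 598])
print(f"manual CR = {manual:.1f} mL, semi-automated CR = {semi_auto:.1f} mL")
```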

  14. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    Science.gov (United States)

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem that arises when working with medical forms from different information systems or institutions. Standards like ODM or CDA ensure structural homogenization, but comparing elements from different data models requires semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool that enables a domain expert to perform semi-automated coding of ODM files. For each item, web services can be queried to obtain unique concept codes without leaving the context of the document. Although fully automated coding proved infeasible, we implemented a dialog-based method for efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  15. Semi-automated high-efficiency reflectivity chamber for vacuum UV measurements

    Science.gov (United States)

    Wiley, James; Fleming, Brian; Renninger, Nicholas; Egan, Arika

    2017-08-01

    This paper presents the design and theory of operation for a semi-automated reflectivity chamber for ultraviolet optimized optics. A graphical user interface designed in LabVIEW controls the stages, interfaces with the detector system, takes semi-autonomous measurements, and monitors the system in case of error. Samples and an optical photodiode sit on an optics plate mounted to a rotation stage in the middle of the vacuum chamber. The optics plate rotates the samples and diode between an incident and reflected position to measure the absolute reflectivity of the samples at wavelengths limited by the monochromator operational bandpass of 70 nm to 550 nm. A collimating parabolic mirror on a fine steering tip-tilt motor enables beam steering for detector peak-ups. This chamber is designed to take measurements rapidly and with minimal oversight, increasing lab efficiency for high cadence and high accuracy vacuum UV reflectivity measurements.
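
    The underlying reduction is a ratio of photodiode readings taken in the reflected and incident positions. The snippet below is an illustrative sketch, not the chamber's LabVIEW code; the dark-current handling and all values are assumptions.

```python
# Absolute reflectivity from incident/reflected photodiode counts at one
# monochromator wavelength; dark subtraction is an assumed detail.
def absolute_reflectivity(reflected, incident, dark=0.0):
    """R(lambda) = (I_reflected - I_dark) / (I_incident - I_dark)."""
    return (reflected - dark) / (incident - dark)

print(f"R = {absolute_reflectivity(4.1e5, 1.2e6, dark=2.0e3):.3f}")
```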

  16. Semi-automated extraction and characterization of Stromal Vascular Fraction using a new medical device.

    Science.gov (United States)

    Hanke, Alexander; Prantl, Lukas; Wenzel, Carina; Nerlich, Michael; Brockhoff, Gero; Loibl, Markus; Gehmert, Sebastian

    2016-01-01

    The stem cell rich Stromal Vascular Fraction (SVF) can be harvested by processing lipo-aspirate or fat tissue with an enzymatic digestion followed by centrifugation. To date, neither a standardised extraction method for SVF nor a generally accepted protocol for cell application in patients exists. A novel commercially available semi-automated device for the extraction of SVF promises sterility, consistent results and usability in the clinical routine. The aim of this work was to compare the quantity and quality of the SVF between the new system and an established manual laboratory method. SVF was extracted from lipo-aspirate both by a prototype of the semi-automated UNiStation™ (NeoGenesis, Seoul, Korea) and by hand preparation with common laboratory equipment. Cell composition of the SVF was characterized by multi-parametric flow-cytometry (FACSCanto-II, BD Biosciences). The total cell number (quantity) of the SVF was determined, as well as the percentage of cells expressing the stem cell marker CD34, the leucocyte marker CD45 and the marker CD271 for highly proliferative stem cells (quality). Lipo-aspirate obtained from six patients was processed with both the novel device (d) and the hand preparation (h), which always resulted in a macroscopically visible SVF. However, there was a tendency toward a lower cell yield per gram of used lipo-aspirate with the device (d: 1.1×10⁵ ± 1.1×10⁵ vs. h: 2.0×10⁵ ± 1.7×10⁵; p = 0.06). Noteworthy, the percentage of CD34+ cells was significantly lower when using the device (d: 57.3% ± 23.8% vs. h: 74.1% ± 13.4%; p = 0.02) and CD45+ leukocyte counts tended to be higher when compared to the hand preparation (d: 20.7% ± 15.8% vs. h: 9.8% ± 7.1%; p = 0.07). The percentage of highly proliferative CD271+ cells was similar for both methods (d: 12.9% ± 9.6% vs. h: 13.4% ± 11.6%; p = 0.74) and no differences were found for double positive cells of CD34+/CD45+ (d: 5.9% ± 1.7% vs. h: 1.7% ± 1.1%; p = 0.13), CD34+/CD271+ (d: 24

  17. Smartnotebook: A semi-automated approach to protein sequential NMR resonance assignments

    International Nuclear Information System (INIS)

    Slupsky, Carolyn M.; Boyko, Robert F.; Booth, Valerie K.; Sykes, Brian D.

    2003-01-01

    Complete and accurate NMR spectral assignment is a prerequisite for high-throughput automated structure determination of biological macromolecules. However, completely automated assignment procedures generally encounter difficulties for all but the most ideal data sets. Sources of these problems include difficulty in resolving correlations in crowded spectral regions, as well as complications arising from dynamics, such as weak or missing peaks, or atoms exhibiting more than one peak due to exchange phenomena. Smartnotebook is a semi-automated assignment software package designed to combine the best features of the automated and manual approaches. The software finds and displays potential connections between residues, while the spectroscopist makes decisions on which connection is correct, allowing rapid and robust assignment. In addition, smartnotebook helps the user fit chains of connected residues to the primary sequence of the protein by comparing the experimentally determined chemical shifts with expected shifts derived from a chemical shift database, while providing bookkeeping throughout the assignment procedure

  18. Network analysis of mesoscale optical recordings to assess regional, functional connectivity.

    Science.gov (United States)

    Lim, Diana H; LeDue, Jeffrey M; Murphy, Timothy H

    2015-10-01

    With modern optical imaging methods, it is possible to map structural and functional connectivity. Optical imaging studies that aim to describe large-scale neural connectivity often need to handle large and complex datasets. In order to interpret these datasets, new methods for analyzing structural and functional connectivity are being developed. Recently, network analysis, based on graph theory, has been used to describe and quantify brain connectivity in both experimental and clinical studies. We outline how to apply regional, functional network analysis to mesoscale optical imaging using voltage-sensitive-dye imaging and channelrhodopsin-2 stimulation in a mouse model. We include links to sample datasets and an analysis script. The analyses we employ can be applied to other types of fluorescence wide-field imaging, including genetically encoded calcium indicators, to assess network properties. We discuss the benefits and limitations of using network analysis for interpreting optical imaging data and define network properties that may be used to compare across preparations or other manipulations such as animal models of disease.
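
    As a concrete sketch of this kind of regional network analysis (not the authors' published script), the fragment below thresholds a correlation matrix of regional time courses into a networkx graph and reports two common network properties; the threshold and data are assumptions.

```python
# Build a functional-connectivity graph from regional time courses and
# compute simple graph-theoretic properties; data and threshold are invented.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
signals = rng.standard_normal((10, 500))        # 10 regions x 500 frames
signals[:5] += 0.8 * rng.standard_normal(500)   # a correlated "module"
corr = np.corrcoef(signals)

G = nx.Graph()
G.add_nodes_from(range(corr.shape[0]))
for i in range(corr.shape[0]):
    for j in range(i + 1, corr.shape[0]):
        if corr[i, j] > 0.3:                    # assumed edge threshold
            G.add_edge(i, j, weight=corr[i, j])

print("degree:", dict(G.degree()))
print("mean clustering:", nx.average_clustering(G))
```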

  19. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method, and it performed best on a hydrology-based relief model derived from a multiple-direction flow-routing algorithm. To assess its transferability, the normalized closed contour method was also evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
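
    A toy version of the normalized-closed-contour segmentation step might look as follows; the window size, contour levels, and synthetic terrain are assumptions, and the published ruleset for classifying candidates is omitted.

```python
# Normalize local relief in a moving window, contour it, and keep closed
# contours as LSB candidates; sizes and levels are illustrative guesses.
import numpy as np
from scipy.ndimage import maximum_filter, minimum_filter
from skimage import measure

dtm = np.random.rand(200, 200)                  # stand-in for a real DTM
lo = minimum_filter(dtm, size=25)               # assumed window size
hi = maximum_filter(dtm, size=25)
relief = (dtm - lo) / np.maximum(hi - lo, 1e-9)

candidates = []
for level in (0.3, 0.5, 0.7):                   # assumed contour levels
    for c in measure.find_contours(relief, level):
        if np.allclose(c[0], c[-1]):            # closed contours only
            candidates.append(c)
print(len(candidates), "closed-contour LSB candidates")
```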

  20. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules.

    Science.gov (United States)

    Solernou, Albert; Hanson, Benjamin S; Richardson, Robin A; Welch, Robert; Read, Daniel J; Harlen, Oliver G; Harris, Sarah A

    2018-03-01

    Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.

  1. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    Science.gov (United States)

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
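
    A minimal sketch of the correlation mapping MBE automates, assuming a frames-by-height-by-width image stack and an arbitrary seed pixel: correlate the seed's time course against every pixel to obtain a seed correlation map.

```python
# Seed-pixel correlation map over a mesoscale imaging stack (synthetic data);
# the seed location is an arbitrary assumption.
import numpy as np

stack = np.random.rand(300, 128, 128).astype(np.float32)  # frames x H x W
seed = stack[:, 64, 64] - stack[:, 64, 64].mean()

flat = stack.reshape(stack.shape[0], -1)
flat = flat - flat.mean(axis=0)
corr = (flat * seed[:, None]).sum(0) / (
    np.linalg.norm(flat, axis=0) * np.linalg.norm(seed) + 1e-12)
corr_map = corr.reshape(128, 128)
print(corr_map[64, 64])   # ~1.0 at the seed itself
```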

  2. Semi-automated De-identification of German Content Sensitive Reports for Big Data Analytics.

    Science.gov (United States)

    Seuss, Hannes; Dankerl, Peter; Ihle, Matthias; Grandjean, Andrea; Hammon, Rebecca; Kaestle, Nicola; Fasching, Peter A; Maier, Christian; Christoph, Jan; Sedlmayr, Martin; Uder, Michael; Cavallaro, Alexander; Hammon, Matthias

    2017-07-01

    Purpose: Projects involving collaborations between different institutions require data security via selective de-identification of words or phrases. A semi-automated de-identification tool was developed and evaluated on different types of medical reports, natively and after adapting the algorithm to the text structure. Materials and Methods: A semi-automated de-identification tool was developed and evaluated for its sensitivity and specificity in detecting sensitive content in written reports. Data from 4671 pathology reports (4105 + 566 in two different formats), 2804 medical reports, 1008 operation reports, and 6223 radiology reports of 1167 patients suffering from breast cancer were de-identified. The content was itemized into four categories: direct identifiers (name, address), indirect identifiers (date of birth/operation, medical ID, etc.), medical terms, and filler words. The software was tested natively (without training) in order to establish a baseline. The reports were then manually edited and the model re-trained for the next test set; re-training was applied after manually editing 25, 50, 100, 250, 500 and, if applicable, 1000 reports of each type. Results: In the native test, 61.3% of direct and 80.8% of indirect identifiers were detected. The performance (P) increased to 91.4% (P25), 96.7% (P50), 99.5% (P100), 99.6% (P250), 99.7% (P500) and 100% (P1000) for direct identifiers and to 93.2% (P25), 97.9% (P50), 97.2% (P100), 98.9% (P250), 99.0% (P500) and 99.3% (P1000) for indirect identifiers. Without training, 5.3% of medical terms were falsely flagged as critical data. After training, this rate changed to 4.0% (P25), 3.6% (P50), 4.0% (P100), 3.7% (P250), 4.3% (P500), and 3.1% (P1000). Roughly 0.1% of filler words were falsely flagged. Conclusion: Training of the developed de-identification tool continuously improved its performance. Training with roughly 100 edited

  3. A semi-automated method for rapid detection of ripple events on interictal voltage discharges in the scalp electroencephalogram.

    Science.gov (United States)

    Chu, Catherine J; Chan, Arthur; Song, Dan; Staley, Kevin J; Stufflebeam, Steven M; Kramer, Mark A

    2017-02-01

    High frequency oscillations are emerging as a clinically important indicator of epileptic networks. However, manual detection of these high frequency oscillations is difficult, time consuming, and subjective, especially in the scalp EEG, thus hindering further clinical exploration and application. Semi-automated detection methods augment manual detection by reducing inspection to a subset of time intervals. We propose a new method to detect high frequency oscillations that co-occur with interictal epileptiform discharges. The new method proceeds in two steps. The first step identifies candidate time intervals during which high frequency activity is increased. The second step computes a set of seven features for each candidate interval. These features require that the candidate event contain a high frequency oscillation approximately sinusoidal in shape, with at least three cycles, that co-occurs with a large amplitude discharge. Candidate events that satisfy these features are stored for validation through visual analysis. We evaluate the detector performance in simulation and on ten examples of scalp EEG data, and show that the proposed method successfully detects spike-ripple events, with high positive predictive value, low false positive rate, and high intra-rater reliability. The proposed method is less sensitive than the existing method of visual inspection, but much faster and much more reliable. Accurate and rapid detection of high frequency activity increases the clinical viability of this rhythmic biomarker of epilepsy. The proposed spike-ripple detector rapidly identifies candidate spike-ripple events, thus making clinical analysis of prolonged, multielectrode scalp EEG recordings tractable. Copyright © 2016 Elsevier B.V. All rights reserved.
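
    Step one of the detector can be illustrated as follows (a sketch, not the authors' implementation): band-pass the scalp EEG around an assumed ripple band, take the analytic envelope, and mark intervals of elevated high-frequency activity; the band, filter order, and threshold are assumptions.

```python
# Candidate high-frequency intervals via band-pass + Hilbert envelope;
# sampling rate, band, and threshold are assumed, data are synthetic.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

fs = 500.0                                    # assumed sampling rate (Hz)
eeg = np.random.randn(int(60 * fs))           # one minute of stand-in EEG

b, a = butter(4, [80 / (fs / 2), 200 / (fs / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, eeg)))

above = envelope > envelope.mean() + 3 * envelope.std()   # assumed rule
edges = np.flatnonzero(np.diff(above.astype(int)))
print(f"{len(edges) // 2} candidate intervals to test against the 7 features")
```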

  4. Semi-automated tabulation of the 3D topology and morphology of branching networks using CT: application to the airway tree

    International Nuclear Information System (INIS)

    Sauret, V.; Bailey, A.G.

    1999-01-01

    Detailed information on biological branching networks (optical nerves, airways or blood vessels) is often required to improve the analysis of 3D medical imaging data. A semi-automated algorithm has been developed to obtain the full 3D topology and dimensions (direction cosine, length, diameter, branching and gravity angles) of branching networks using their CT images. It has been tested using CT images of a simple Perspex branching network and applied to the CT images of a human cast of the airway tree. The morphology and topology of the computer derived network were compared with the manually measured dimensions. Good agreement was found. The airways dimensions also compared well with previous values quoted in literature. This algorithm can provide complete data set analysis much more quickly than manual measurements. Its use is limited by the CT resolution which means that very small branches are not visible. New data are presented on the branching angles of the airway tree. (author)

  5. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Woo, B; Kim, J [Seoul National University, Seoul (Korea, Republic of); Jamshidi, N; Kuo, M [UCLA School of Medicine, Los Angeles, CA (United States)

    2015-06-15

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used to segment contrast enhancement, necrosis and edema regions by two independent observers. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficients of variation of imaging features with the deformable model ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, and with the grow cut method from 0.799 to 0.976 and 3.5% to 26.6%, respectively. Coefficients of variation for especially important features previously reported as predictive of patient survival were: 3.4% with the deformable model and 7.4% with the grow cut method for the proportion of contrast-enhanced tumor region; 5.5% with the deformable model and 25.7% with the grow cut method for the proportion of necrosis; and 2.1% with the deformable model and 4.4% with the grow cut method for edge sharpness of tumor on CE-T1WI. Conclusion: Comparison of the two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric brain MRI.

  6. Clinical feasibility of a myocardial signal intensity threshold-based semi-automated cardiac magnetic resonance segmentation method

    Energy Technology Data Exchange (ETDEWEB)

    Varga-Szemes, Akos; Schoepf, U.J.; Suranyi, Pal; De Cecco, Carlo N.; Fox, Mary A. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Muscogiuri, Giuseppe [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Rome "Sapienza", Department of Medical-Surgical Sciences and Translational Medicine, Rome (Italy); Wichmann, Julian L. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University Hospital Frankfurt, Department of Diagnostic and Interventional Radiology, Frankfurt (Germany); Cannao, Paola M. [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); University of Milan, Scuola di Specializzazione in Radiodiagnostica, Milan (Italy); Renker, Matthias [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Kerckhoff Heart and Thorax Center, Bad Nauheim (Germany); Mangold, Stefanie [Medical University of South Carolina, Division of Cardiovascular Imaging, Department of Radiology and Radiological Science, Charleston, SC (United States); Eberhard-Karls University Tuebingen, Department of Diagnostic and Interventional Radiology, Tuebingen (Germany); Ruzsics, Balazs [Royal Liverpool and Broadgreen University Hospitals, Department of Cardiology, Liverpool (United Kingdom)

    2016-05-15

    To assess the accuracy and efficiency of a threshold-based, semi-automated cardiac MRI segmentation algorithm in comparison with conventional contour-based segmentation and aortic flow measurements. Short-axis cine images of 148 patients (55 ± 18 years, 81 men) were used to evaluate left ventricular (LV) volumes and mass (LVM) using conventional and threshold-based segmentations. Phase-contrast images were used to independently measure stroke volume (SV). LV parameters were evaluated by two independent readers. Evaluation times using the conventional and threshold-based methods were 8.4 ± 1.9 and 4.2 ± 1.3 min, respectively (P < 0.0001). LV parameters measured by the conventional and threshold-based methods, respectively, were end-diastolic volume (EDV) 146 ± 59 and 134 ± 53 ml; end-systolic volume (ESV) 64 ± 47 and 59 ± 46 ml; SV 82 ± 29 and 74 ± 28 ml (flow-based 74 ± 30 ml); ejection fraction (EF) 59 ± 16 and 58 ± 17 %; and LVM 141 ± 55 and 159 ± 58 g. Significant differences between the conventional and threshold-based methods were observed in EDV, ESV, and LVM measurements; SV from threshold-based and flow-based measurements were in agreement (P > 0.05) but were significantly different from conventional analysis (P < 0.05). Excellent inter-observer agreement was observed. Threshold-based LV segmentation provides improved accuracy and faster assessment compared to conventional contour-based methods. (orig.)
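
    A hedged sketch of what a signal-intensity-threshold segmentation involves (the paper's exact rule is not given, so Otsu's method stands in): threshold intensities inside a user-drawn LV region of interest to separate the bright blood pool from darker myocardium, then convert voxel counts to volumes.

```python
# Threshold-based LV blood-pool segmentation on one synthetic short-axis
# frame; the ROI, threshold rule (Otsu), and voxel size are assumptions.
import numpy as np
from skimage.filters import threshold_otsu

cine_slice = np.random.rand(256, 256)          # stand-in cine frame
roi = np.zeros_like(cine_slice, dtype=bool)
roi[96:160, 96:160] = True                     # hypothetical LV ROI

t = threshold_otsu(cine_slice[roi])
blood_pool = (cine_slice > t) & roi            # bright blood vs. darker muscle

voxel_volume_ml = 0.0018                       # assumed voxel volume (mL)
print(f"slice blood-pool volume: {blood_pool.sum() * voxel_volume_ml:.2f} mL")
```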

  7. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    Energy Technology Data Exchange (ETDEWEB)

    Gwynne, Sarah, E-mail: Sarah.Gwynne2@wales.nhs.uk [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Spezi, Emiliano; Wills, Lucy [Department of Medical Physics, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Nixon, Lisette; Hurt, Chris [Wales Cancer Trials Unit, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom); Joseph, George [Department of Diagnostic Radiology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Evans, Mererid [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Griffiths, Gareth [Wales Cancer Trials Unit, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom); Crosby, Tom [Department of Clinical Oncology, Velindre Cancer Centre, Cardiff, Wales (United Kingdom); Staffurth, John [Division of Cancer, School of Medicine, Cardiff University, Cardiff, Wales (United Kingdom)

    2012-11-15

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), that can localize areas of maximal discordance was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of a gold standard-observer variation, with JCI being the most useful as a single metric. Additional information is provided by the l-CI and can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.
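
    The headline metric is easy to state concretely. The sketch below computes the Jaccard conformity index on boolean voxel masks, together with one plausible reading of the geographical miss index (the fraction of the gold-standard volume not covered; the trial's precise definition may differ).

```python
# Jaccard conformity index and an assumed geographical miss index on
# synthetic investigator vs. gold-standard GTV masks.
import numpy as np

def jaccard_ci(a, b):
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

def geographical_miss_index(investigator, gold):
    return np.logical_and(~investigator, gold).sum() / gold.sum()

gold = np.zeros((32, 64, 64), bool); gold[10:20, 20:40, 20:40] = True
inv  = np.zeros_like(gold);         inv[11:21, 22:42, 20:40] = True
print(f"JCI = {jaccard_ci(inv, gold):.2f}, "
      f"GMI = {geographical_miss_index(inv, gold):.2f}")
```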

  8. Toward Semi-automated Assessment of Target Volume Delineation in Radiotherapy Trials: The SCOPE 1 Pretrial Test Case

    International Nuclear Information System (INIS)

    Gwynne, Sarah; Spezi, Emiliano; Wills, Lucy; Nixon, Lisette; Hurt, Chris; Joseph, George; Evans, Mererid; Griffiths, Gareth; Crosby, Tom; Staffurth, John

    2012-01-01

    Purpose: To evaluate different conformity indices (CIs) for use in the analysis of outlining consistency within the pretrial quality assurance (Radiotherapy Trials Quality Assurance [RTTQA]) program of a multicenter chemoradiation trial of esophageal cancer and to make recommendations for their use in future trials. Methods and Materials: The National Cancer Research Institute SCOPE 1 trial is an ongoing Cancer Research UK-funded phase II/III randomized controlled trial of chemoradiation with capecitabine and cisplatin with or without cetuximab for esophageal cancer. The pretrial RTTQA program included a detailed radiotherapy protocol, an educational package, and a single mid-esophageal tumor test case that were sent to each investigator to outline. Investigator gross tumor volumes (GTVs) were received from 50 investigators in 34 UK centers, and CERR (Computational Environment for Radiotherapy Research) was used to perform an assessment of each investigator GTV against a predefined gold-standard GTV using different CIs. A new metric, the local conformity index (l-CI), that can localize areas of maximal discordance was developed. Results: The median Jaccard conformity index (JCI) was 0.69 (interquartile range, 0.62-0.70), with 14 of 50 investigators (28%) achieving a JCI of 0.7 or greater. The median geographical miss index was 0.09 (interquartile range, 0.06-0.16), and the mean discordance index was 0.27 (95% confidence interval, 0.25-0.30). The l-CI was highest in the middle section of the volume, where the tumor was bulky and more easily definable, and identified 4 slices where fewer than 20% of investigators achieved an l-CI of 0.7 or greater. Conclusions: The available CIs analyze different aspects of a gold standard–observer variation, with JCI being the most useful as a single metric. Additional information is provided by the l-CI and can focus the efforts of the RTTQA team in these areas, possibly leading to semi-automated outlining assessment.

  9. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    International Nuclear Information System (INIS)

    Lee, M; Woo, B; Kim, J; Jamshidi, N; Kuo, M

    2015-01-01

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from the Cancer Image Archive site. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used to segment contrast enhancement, necrosis and edema regions by two independent observers. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. The inter-observer variability and coefficient of variation (COV) were calculated to evaluate the reproducibility. Results: Inter-observer correlations and coefficients of variation of imaging features with the deformable model ranged from 0.953 to 0.999 and 2.1% to 9.2%, respectively, and with the grow cut method from 0.799 to 0.976 and 3.5% to 26.6%, respectively. Coefficients of variation for especially important features previously reported as predictive of patient survival were: 3.4% with the deformable model and 7.4% with the grow cut method for the proportion of contrast-enhanced tumor region; 5.5% with the deformable model and 25.7% with the grow cut method for the proportion of necrosis; and 2.1% with the deformable model and 4.4% with the grow cut method for edge sharpness of tumor on CE-T1WI. Conclusion: Comparison of the two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric brain MRI.

  10. Fluctuating Finite Element Analysis (FFEA): A continuum mechanics software tool for mesoscale simulation of biomolecules.

    Directory of Open Access Journals (Sweden)

    Albert Solernou

    2018-03-01

    Full Text Available Fluctuating Finite Element Analysis (FFEA) is a software package designed to perform continuum mechanics simulations of proteins and other globular macromolecules. It combines conventional finite element methods with stochastic thermal noise, and is appropriate for simulations of large proteins and protein complexes at the mesoscale (length-scales in the range of 5 nm to 1 μm), where there is currently a paucity of modelling tools. It requires 3D volumetric information as input, which can be low resolution structural information such as cryo-electron tomography (cryo-ET) maps or much higher resolution atomistic co-ordinates from which volumetric information can be extracted. In this article we introduce our open source software package for performing FFEA simulations which we have released under a GPLv3 license. The software package includes a C++ implementation of FFEA, together with tools to assist the user to set up the system from Electron Microscopy Data Bank (EMDB) or Protein Data Bank (PDB) data files. We also provide a PyMOL plugin to perform basic visualisation and additional Python tools for the analysis of FFEA simulation trajectories. This manuscript provides a basic background to the FFEA method, describing the implementation of the core mechanical model and how intermolecular interactions and the solvent environment are included within this framework. We provide prospective FFEA users with a practical overview of how to set up an FFEA simulation with reference to our publicly available online tutorials and manuals that accompany this first release of the package.

  11. Assessment of tobacco smoke effects on neonatal cardiorespiratory control using a semi-automated processing approach.

    Science.gov (United States)

    Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy

    2018-05-10

    A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) exposure on the cardiorespiratory control of newborn lambs. The system consists of several steps, beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiographic and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between the cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early-life challenges such as prematurity or viral infection.
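
    In outline, the feature-extraction step might compute interval-variability metrics per stationary segment, as in the following sketch; the metrics (SDNN- and RMSSD-like) and all peak times are assumptions rather than the authors' exact feature set.

```python
# Variability of RR and breath-to-breath intervals from detected peak times;
# distributions below are invented stand-ins for newborn-lamb recordings.
import numpy as np

def interval_variability(peak_times_s):
    intervals = np.diff(peak_times_s)
    sd = intervals.std(ddof=1)                         # SDNN-like
    rmssd = np.sqrt(np.mean(np.diff(intervals) ** 2))  # RMSSD-like
    return sd, rmssd

r_peaks = np.cumsum(np.random.normal(0.35, 0.02, 200))  # heartbeats (s)
breaths = np.cumsum(np.random.normal(1.50, 0.30, 60))   # breath onsets (s)
print("heart rate variability:", interval_variability(r_peaks))
print("respiratory rate variability:", interval_variability(breaths))
```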

  12. Semi-automated segmentation of a glioblastoma multiforme on brain MR images for radiotherapy planning.

    Science.gov (United States)

    Hori, Daisuke; Katsuragawa, Shigehiko; Murakami, Ryuuji; Hirai, Toshinori

    2010-04-20

    We propose a computerized method for semi-automated segmentation of the gross tumor volume (GTV) of a glioblastoma multiforme (GBM) on brain MR images for radiotherapy planning (RTP). Three-dimensional (3D) MR images of 28 cases with a GBM were used in this study. First, a spherical volume of interest (VOI) including the GBM was selected by clicking a part of the GBM region in the 3D image. The sphere VOI was then transformed to a two-dimensional (2D) image by use of a spiral-scanning technique. We employed active contour models (ACM) to delineate an optimal outline of the GBM in the transformed 2D image. After inverse transform of the optimal outline to the 3D space, a morphological filter was applied to smooth the shape of the 3D segmented region. For evaluation of our computerized method, we compared the computer output with manually segmented regions obtained by a therapeutic radiologist using a manual tracking method. In evaluating our segmentation method, we employed the Jaccard similarity coefficient (JSC) and the true segmentation coefficient (TSC), computed on volumes between the computer output and the manually segmented region. The mean and standard deviation of JSC and TSC were 74.2 ± 9.8% and 84.1 ± 7.1%, respectively. Our segmentation method provided a relatively accurate outline for the GBM and would be useful for radiotherapy planning.
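
    For illustration, skimage's snake model can stand in for the ACM step on the transformed 2D image; this is not the authors' implementation, and the image, initial contour, and parameters are assumptions.

```python
# Active-contour delineation of a synthetic blob, standing in for the GBM
# outline step; alpha/beta/gamma values are illustrative.
import numpy as np
from skimage.filters import gaussian
from skimage.segmentation import active_contour

img = np.zeros((200, 200)); img[60:140, 60:140] = 1.0
img = gaussian(img, sigma=3)

s = np.linspace(0, 2 * np.pi, 200)
init = np.column_stack([100 + 70 * np.sin(s), 100 + 70 * np.cos(s)])  # (row, col)
snake = active_contour(img, init, alpha=0.015, beta=10, gamma=0.001)
print(snake.shape)   # optimized outline, to be inverse-transformed to 3D
```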

  13. Semi-automated operation of Mars Climate Simulation chamber - MCSC modelled for biological experiments

    Science.gov (United States)

    Tarasashvili, M. V.; Sabashvili, Sh. A.; Tsereteli, S. L.; Aleksidze, N. D.; Dalakishvili, O.

    2017-10-01

    The Mars Climate Simulation Chamber (MCSC) (GEO PAT 12 522/01) is designed for the investigation of the possible past and present habitability of Mars, as well as for the solution of practical tasks necessary for the colonization and Terraformation of the Planet. Specific tasks include the experimental investigation of the biological parameters that allow many terrestrial organisms to adapt to the imitated Martian conditions: chemistry of the ground, atmosphere, temperature, radiation, etc. The MCSC is designed for conducting various biological experiments, as well as for the selection of extremophile microorganisms for possible Settlement, Ecopoesis and/or Terraformation purposes and the investigation of their physiological functions. For long-term purposes, it is possible to cultivate genetically modified organisms (e.g., plants) adapted to the Martian conditions for future Martian agriculture to sustain human Mars missions and permanent settlements. The size of the chamber allows preliminary testing of the functionality of space-station mini-models and personal protection devices such as space-suits, covering and building materials and other structures. The reliability of the experimental biotechnological materials can also be tested over a period of years. Complex and thorough research has been performed to acquire the most appropriate technical tools for the accurate engineering of the MCSC and precise, programmed simulation of Martian environmental conditions. This paper describes the construction and technical details of the equipment of the MCSC, which allows its semi-automated, long-term operation.

  14. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contained a fence crossing event. This resulted in a 54.8% reduction in images requiring further human operator characterization while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
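
    In its simplest form, the background-subtraction-plus-histogram idea reduces to counting how many pixels differ appreciably from an empty-scene reference. The sketch below is illustrative; the program's actual rules and thresholds are more elaborate.

```python
# Flag a still camera-trap image whose difference histogram against an
# empty-scene background suggests a possible crossing event (assumed rule).
import numpy as np

def likely_event(image, background, diff_thresh=25, frac_thresh=0.01):
    diff = np.abs(image.astype(int) - background.astype(int))
    return (diff > diff_thresh).mean() > frac_thresh

background = np.full((480, 640), 120, np.uint8)          # empty scene
frame = background.copy(); frame[200:300, 300:420] = 60  # synthetic "animal"
print(likely_event(frame, background))                   # True -> keep for review
```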

  15. Semi-automated 86Y purification using a three-column system

    International Nuclear Information System (INIS)

    Park, Luke S.; Szajek, Lawrence P.; Wong, Karen J.; Plascjak, Paul S.; Garmestani, Kayhan; Googins, Shawn; Eckelman, William C.; Carrasquillo, Jorge A.; Paik, Chang H.

    2004-01-01

    The separation of 86Y from 86Sr was optimized by a semi-automated purification system involving the passage of the target sample through three sequential columns. The target material was dissolved in 4 N HNO3 and loaded onto a Sr-selective (Sr-Spec) column to retain the 86Sr. The yttrium was eluted with 4 N HNO3 onto the second, Y-selective (RE-Spec) column with quantitative retention. The RE-Spec column was eluted with a stepwise decreasing concentration of HNO3 to wash out potential metallic impurities to a waste container. The eluate was then pumped onto an Aminex A5 column with 0.1 N HCl and finally with 3 N HCl to collect the radioyttrium in 0.6-0.8 mL with a >80% recovery. This method enabled us to decontaminate Sr by a factor of 250,000 and to label 30 μg of DOTA-Biotin with a >95% yield.

  16. A semi-automated methodology for finding lipid-related GO terms.

    Science.gov (United States)

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R; Wong, Limsoon

    2014-01-01

    Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, no method yet exists to systematically find a class of GO terms sharing a common property with high accuracy (e.g., involving human curation). We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. The terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. http://compbio.ddns.comp.nus.edu.sg/~lipidgo. © The Author(s) 2014. Published by Oxford University Press.

  17. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    Directory of Open Access Journals (Sweden)

    Jingshan Huang

    Full Text Available As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.

  18. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    Science.gov (United States)

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A; Natale, Darren A; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.

  19. OMIT: Dynamic, Semi-Automated Ontology Development for the microRNA Domain

    Science.gov (United States)

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M.; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A.; Natale, Darren A.; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology. PMID:25025130

  20. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study

    Energy Technology Data Exchange (ETDEWEB)

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Seol, Hae Young [Korea University Guro Hospital, Department of Radiology, Seoul (Korea, Republic of); Noh, Kyoung Jin [Soonchunhyang University, Department of Electronic Engineering, Asan (Korea, Republic of); Shim, Hackjoon [Toshiba Medical Systems Korea Co., Seoul (Korea, Republic of)

    2017-05-15

    We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), of brain tumors were generated by commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates their perfusion parameters. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP. (orig.)
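
    The validation statistic is straightforward to reproduce. Below is a minimal sketch of Lin's concordance correlation coefficient on invented paired values; only the formula, not the data, reflects the study.

```python
# Lin's concordance correlation coefficient between semi-automated and
# manual measurements; the paired values below are invented.
import numpy as np

def lin_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

semi_auto = [41.2, 35.7, 52.9, 44.1]   # hypothetical AF values
manual    = [41.0, 36.1, 52.5, 44.3]
print(f"rho_c = {lin_ccc(semi_auto, manual):.4f}")
```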

  1. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    Science.gov (United States)

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), of brain tumors were generated by commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates their perfusion parameters. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.

  2. Wavelet Scale Analysis of Mesoscale Convective Systems for Detecting Deep Convection From Infrared Imagery

    Science.gov (United States)

    Klein, Cornelia; Belušić, Danijel; Taylor, Christopher M.

    2018-03-01

    Mesoscale convective systems (MCSs) are frequently associated with rainfall extremes and are expected to intensify further under global warming. However, despite the significant impact of such extreme events, the dominant processes favoring their occurrence are still under debate. Meteosat geostationary satellites provide unique long-term subhourly records of cloud top temperatures, making it possible to track changes in MCS structure that could be linked to rainfall intensification. Focusing on West Africa, we show that Meteosat cloud top temperatures are a useful proxy for rainfall intensities derived from snapshots of the Tropical Rainfall Measuring Mission 2A25 product: MCSs larger than 15,000 km² at a temperature threshold of -40°C are found to produce 91% of all extreme rainfall occurrences in the study region, with 80% of the storms producing extreme rain when their minimum temperature drops below -80°C. Furthermore, we present a new method based on the 2-D continuous wavelet transform to explore the relationship between cloud top temperature and rainfall intensity for subcloud features at different length scales. The method shows great potential for separating convective and stratiform cloud parts when combining information on temperature and scale, improving on the common approach of using a temperature threshold only. We find that below -80°C, every fifth pixel is associated with deep convection; this frequency doubles for subcloud features smaller than 35 km. Scale analysis of subcloud features can thus help to better exploit cloud top temperature data sets, which provide much more spatiotemporal detail of MCS characteristics than available rainfall data sets alone.
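
    A 2-D continuous wavelet transform of this kind can be approximated with a Mexican-hat mother wavelet, which is the negated Laplacian of a Gaussian; the sketch below rests on that assumption (the authors' exact transform, normalization, and pixel scale may differ) and uses SciPy's LoG filter on a synthetic cloud-top temperature field.

```python
import numpy as np
from scipy.ndimage import gaussian_laplace

def mexican_hat_cwt2d(field, scales_km, pixel_km=3.0):
    """2-D Mexican-hat wavelet coefficients of a cloud-top temperature field.

    The Mexican-hat wavelet is the (negated) Laplacian of a Gaussian, so the
    transform at each scale reduces to a LoG filter; the sigma**2 factor keeps
    coefficients comparable across scales.
    """
    out = {}
    for s_km in scales_km:
        sigma = s_km / pixel_km                     # wavelet scale in pixels
        out[s_km] = -(sigma ** 2) * gaussian_laplace(field, sigma)
    return out

# Toy example: a cold convective core embedded in a warmer anvil
y, x = np.mgrid[0:128, 0:128]
tb = -40.0 - 45.0 * np.exp(-((x - 64) ** 2 + (y - 64) ** 2) / (2 * 10 ** 2))
coeffs = mexican_hat_cwt2d(tb, scales_km=[15, 35, 90])
print({s: float(np.abs(c).max()) for s, c in coeffs.items()})
```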

  3. Applicability Of A Semi-Automated Clinical Chemistry Analyzer In Determining The Antioxidant Concentrations Of Selected Plants

    OpenAIRE

    Allan L. Hilario; Phylis C. Rio; Geraldine Susan C. Tengco; Danilo M. Menorca

    2017-01-01

    Plants are rich sources of antioxidants that are protective against diseases associated with oxidative stress. There is a need for a high-throughput screening method for determining the antioxidant concentration in plants; such a method would significantly simplify and speed up most antioxidant assays. This paper aimed at comparing the applicability of a semi-automated clinical chemistry analyzer (Pointe Scientific, MI, USA) with the traditional standard curve method and...

  4. Methods for semi-automated indexing for high precision information retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65%, with means of 41%, 31%, and 40% for the three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in the three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contribute substantially to increased indexing speed and accuracy.

  5. Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf

    Science.gov (United States)

    Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.

    2017-12-01

    We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and the National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize the seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor, and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found strong correlations between backscatter intensity and sediment texture. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slopes with high backscatter correspond to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. Our method worked well
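
    As a rough illustration of the combined unsupervised/supervised workflow described above, one might cluster slope-backscatter feature vectors first and then train a supervised classifier on an analyst-labeled subset. The sketch below uses scikit-learn with synthetic grids; the actual USGS/NOAA processing chain, class count, and classifier choice are assumptions, not the study's.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Stand-ins for co-registered grids: bathymetric slope (deg) and backscatter (dB)
slope = rng.gamma(2.0, 0.15, size=(200, 200))
backscatter = rng.normal(-25, 6, size=(200, 200))
features = np.column_stack([slope.ravel(), backscatter.ravel()])

# Step 1 (unsupervised): cluster the feature space into candidate classes
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(features)

# Step 2 (supervised): train on a manually labeled subset, predict the rest
labeled_idx = rng.choice(features.shape[0], size=500, replace=False)
labels = clusters[labeled_idx]          # stand-in for analyst-assigned classes
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[labeled_idx], labels)
class_map = clf.predict(features).reshape(slope.shape)
```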

  6. Semi-automated procedures for shoreline extraction using single RADARSAT-1 SAR image

    Science.gov (United States)

    Al Fugura, A.'kif; Billa, Lawal; Pradhan, Biswajeet

    2011-12-01

    Coastline identification is important for surveying and mapping purposes. The coastline serves as a basic point of reference and is used on nautical charts for navigation. Its delineation has become even more important in the wake of the many recent earthquakes and tsunamis, which have completely changed and redrawn some shorelines. In a tropical country like Malaysia, the presence of cloud cover hinders the application of optical remote sensing data. In this study, a semi-automated technique and procedures are presented for shoreline delineation from RADARSAT-1 imagery. A scene of RADARSAT-1 satellite imagery was processed using an enhanced filtering technique to identify and extract the shoreline of Kuala Terengganu, Malaysia. RADARSAT imagery has many advantages over optical data because of its ability to penetrate cloud cover and its night sensing capability. First, speckle was removed from the image using a Lee sigma filter, which reduces random noise, enhances the image, and helps discriminate the boundary between land and water. The results showed an accurate and improved extraction and delineation of the entire coastline of Kuala Terengganu. The study demonstrated the reliability of the image averaging filter in reducing random noise over the sea surface, especially near the shoreline: it enhanced land-water boundary differentiation, enabling better delineation of the shoreline. Overall, the developed techniques show the potential of radar imagery for accurate shoreline mapping and will be useful for monitoring shoreline changes during high and low tides, as well as shoreline erosion, in a tropical country like Malaysia.
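
    A Lee sigma filter of the kind used above averages only those neighbors whose values fall within a band around the center pixel, so edges such as the land-water boundary are preserved while speckle is suppressed. A simplified single-pass sketch; the band multiplier, speckle coefficient of variation, and window size are illustrative choices, not the study's parameters.

```python
import numpy as np
from scipy.ndimage import generic_filter

def sigma_filter(window, sigma_mult=2.0, cv=0.28):
    """Average only neighbors within +/- k*sigma of the center pixel.

    cv is an assumed a priori speckle coefficient of variation; pixels
    outside the band are excluded from the average.
    """
    center = window[window.size // 2]
    band = sigma_mult * cv * center
    keep = window[np.abs(window - center) <= band]
    return keep.mean()

# Stand-in for a speckled SAR amplitude image
speckled = np.random.default_rng(1).gamma(4.0, 25.0, size=(256, 256))
despeckled = generic_filter(speckled, sigma_filter, size=7)
```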

  7. Interobserver agreement of semi-automated and manual measurements of functional MRI metrics of treatment response in hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Bonekamp, David; Bonekamp, Susanne; Halappa, Vivek Gowdra; Geschwind, Jean-Francois H.; Eng, John; Corona-Villalobos, Celia Pamela; Pawlik, Timothy M.; Kamel, Ihab R.

    2014-01-01

    Purpose: To assess the interobserver agreement in 50 patients with hepatocellular carcinoma (HCC) before and 1 month after intra-arterial therapy (IAT) using two semi-automated methods and a manual approach for the following functional, volumetric, and morphologic parameters: (1) apparent diffusion coefficient (ADC), (2) arterial phase enhancement (AE), (3) portal venous phase enhancement (VE), (4) tumor volume, and assessment according to (5) the Response Evaluation Criteria in Solid Tumors (RECIST) and (6) the European Association for the Study of the Liver (EASL). Materials and methods: This HIPAA-compliant retrospective study had institutional review board approval. The requirement for patient informed consent was waived. Tumor ADC, AE, VE, volume, RECIST, and EASL in 50 index lesions were measured by three observers. Interobserver reproducibility was evaluated using intraclass correlation coefficients (ICC). P < 0.05 was considered to indicate a significant difference. Results: Semi-automated volumetric measurements of functional parameters (ADC, AE, and VE) before and after IAT, as well as change in tumor ADC, AE, or VE, had better interobserver agreement (ICC = 0.830–0.974) than manual ROI-based axial measurements (ICC = 0.157–0.799). Semi-automated measurements of tumor volume and size in the axial plane before and after IAT had better interobserver agreement (ICC = 0.854–0.996) than manual size measurements (ICC = 0.543–0.596), and interobserver agreement for change in tumor RECIST size was also higher using semi-automated measurements (ICC = 0.655) than manual measurements (ICC = 0.169). EASL measurements of tumor enhancement in the axial plane before and after IAT (ICC = 0.758–0.809) and changes in EASL after IAT (ICC = 0.653) had good interobserver agreement. Conclusion: Semi-automated measurements of functional changes assessed by ADC and VE based on whole-lesion segmentation demonstrated better reproducibility than manual ROI-based measurements.
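
    The intraclass correlation used above has several forms; for a fixed panel of observers rating the same lesions, a common choice is ICC(2,1) (two-way random effects, absolute agreement, single measures). A self-contained sketch with illustrative numbers, not study data:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    ratings: (n_targets, k_raters) array.
    """
    Y = np.asarray(ratings, float)
    n, k = Y.shape
    grand = Y.mean()
    row_means = Y.mean(axis=1)
    col_means = Y.mean(axis=0)
    # Mean squares from the two-way ANOVA decomposition
    msr = k * ((row_means - grand) ** 2).sum() / (n - 1)   # rows (targets)
    msc = n * ((col_means - grand) ** 2).sum() / (k - 1)   # columns (raters)
    sse = ((Y - row_means[:, None] - col_means[None, :] + grand) ** 2).sum()
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Three observers measuring ADC in five lesions (illustrative numbers)
adc = np.array([[1.10, 1.12, 1.09],
                [0.95, 0.97, 0.96],
                [1.30, 1.28, 1.31],
                [1.05, 1.08, 1.04],
                [1.20, 1.19, 1.22]])
print(f"ICC(2,1) = {icc_2_1(adc):.3f}")
```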

  8. Geostatistical Analysis of Mesoscale Spatial Variability and Error in SeaWiFS and MODIS/Aqua Global Ocean Color Data

    Science.gov (United States)

    Glover, David M.; Doney, Scott C.; Oestreich, William K.; Tullo, Alisdair W.

    2018-01-01

    Mesoscale (10-300 km, weeks to months) physical variability strongly modulates the structure and dynamics of planktonic marine ecosystems via both turbulent advection and environmental impacts upon biological rates. Using structure function analysis (geostatistics), we quantify the mesoscale biological signals within global 13-year SeaWiFS (1998-2010) and 8-year MODIS/Aqua (2003-2010) chlorophyll a ocean color data (Level-3, 9 km resolution). We present geographical distributions, seasonality, and interannual variability of key geostatistical parameters: unresolved variability or noise, resolved variability, and spatial range. Resolved variability is nearly identical for both instruments, indicating that geostatistical techniques isolate a robust measure of biophysical mesoscale variability largely independent of measurement platform. In contrast, unresolved variability in MODIS/Aqua is substantially lower than in SeaWiFS, especially in oligotrophic waters where previous analysis identified a problem for the SeaWiFS instrument likely due to sensor noise characteristics. Both records exhibit a statistically significant relationship between resolved mesoscale variability and the low-pass filtered chlorophyll field horizontal gradient magnitude, consistent with physical stirring acting on the large-scale gradient as an important factor supporting observed mesoscale variability. Comparable horizontal length scales for variability are found from tracer-based scaling arguments and geostatistical decorrelation. Regional variations between these length scales may reflect scale dependence of biological mechanisms that also create variability directly at the mesoscale, for example, enhanced net phytoplankton growth in coastal and frontal upwelling and convective mixing regions. Global estimates of mesoscale biophysical variability provide an improved basis for evaluating higher resolution, coupled ecosystem-ocean general circulation models, and data assimilation.
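
    Structure function analysis of the kind described above estimates, for each spatial lag, the mean squared difference between observations; the small-lag intercept (nugget) separates unresolved noise from resolved mesoscale variance. A minimal 1-D sketch on a synthetic along-track chlorophyll record; the study's 2-D binning and model-fitting details are omitted.

```python
import numpy as np

def structure_function(track, max_lag):
    """Second-order structure function along a 1-D transect.

    D(h) = mean of (z(x+h) - z(x))^2; the semivariogram is D(h)/2.
    The h -> 0 intercept (nugget) estimates unresolved noise, the sill
    the resolved variance, and the range the decorrelation scale.
    """
    z = np.asarray(track, float)
    lags = np.arange(1, max_lag + 1)
    d = np.array([np.mean((z[h:] - z[:-h]) ** 2) for h in lags])
    return lags, d / 2.0

# Synthetic transect: correlated "mesoscale" signal plus white sensor noise
rng = np.random.default_rng(0)
chl = np.cumsum(rng.normal(0, 0.02, 2000))   # red-noise mesoscale signal
chl += rng.normal(0, 0.05, 2000)             # white noise (sets the nugget)
lags, gamma = structure_function(chl, max_lag=50)
```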

  9. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publicly available. Digital elevation models (DEMs) of the transient glacier surface preserved in each imagery timestamp can be derived and then differenced to calculate glacier volume and mass change, improving regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure-from-motion (SfM) photogrammetry software to semi-automate DEM creation, and to orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms, using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, to airborne LiDAR data, and to DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on: first, applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State; and second, evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co-registration.
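
    The record names Python and OpenCV for the batch preprocessing stage (rotation, cropping of fiducials, histogram equalization). A condensed sketch of that stage under assumed folder names, a fixed crop margin, and a fixed rotation; the real pipeline's fiducial detection, rotation angles, and Photoscan hand-off are not reproduced here.

```python
import glob
import cv2

def preprocess(path, crop=200, out_size=(8000, 8000)):
    """Rotate/crop/equalize one scanned aerial frame (simplified sketch).

    crop trims a fixed margin to drop fiducial marks; a real pipeline would
    locate fiducials explicitly and may deskew before cropping.
    """
    img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    img = cv2.rotate(img, cv2.ROTATE_90_CLOCKWISE)   # fix scan orientation
    img = img[crop:-crop, crop:-crop]                # remove fiducial margin
    img = cv2.resize(img, out_size)                  # uniform frame size
    return cv2.equalizeHist(img)                     # histogram equalization

for path in glob.glob("scans/*.tif"):                # hypothetical folder
    cv2.imwrite(path.replace("scans", "preprocessed"), preprocess(path))
```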

  10. Semi-automated scoring of pulmonary emphysema from X-ray CT: Trainee reproducibility and accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Owrangi, Amir M., E-mail: aowrangi@robarts.ca [Imaging Research Laboratories, Robarts Research Institute, London (Canada); Entwistle, Brandon, E-mail: Brandon.Entwistle@londonhospitals.ca; Lu, Andrew, E-mail: Andrew.Lu@londonhospitals.ca; Chiu, Jack, E-mail: Jack.Chiu@londonhospitals.ca; Hussain, Nabil, E-mail: Nabil.Hussain@londonhospitals.ca; Etemad-Rezai, Roya, E-mail: Roya.EtemadRezai@lhsc.on.ca; Parraga, Grace, E-mail: gparraga@robarts.ca [Imaging Research Laboratories, Robarts Research Institute, London (Canada); Graduate Program in Biomedical Engineering, Department of Medical Imaging, Department of Medical Biophysics, The University of Western Ontario, London (Canada)

    2013-11-01

    Objective: We developed a semi-automated tool to quantify emphysema from thoracic X-ray multi-detector (64-slice) computed tomography (CT) for training purposes and multi-reader studies. Materials and Methods: Thoracic X-ray CT was acquired in 93 ex-smokers, who were evaluated by six trainees with little or no expertise (trainees) and a single experienced thoracic radiologist (expert). A graphic user interface (GUI) was developed for emphysema quantification based on the percentile of lung where a score of 0 = no abnormalities, 1 = 1–25%, 2 = 26–50%, 3 = 51–75% and 4 = 76–100% for each lung side/slice. Trainees blinded to subject characteristics scored randomized images twice; accuracy was determined by comparison to expert scores, density histogram 15th percentile (HU15), relative area at −950 HU (RA950), low attenuation clusters at −950 HU (LAC950), −856 HU (LAC856) and the diffusing capacity for carbon monoxide (DLCO%pred). Intra- and inter-observer reproducibility was evaluated using coefficients-of-variation (COV), intra-class (ICC) and Pearson correlations. Results: Trainee–expert correlations were significant (r = 0.85–0.97, p < 0.0001) and a significant trainee bias (0.15 ± 0.22) was observed. Emphysema score was correlated with RA950 (r = 0.88, p < 0.0001), HU15 (r = −0.77, p < 0.0001), LAC950 (r = 0.76, p < 0.0001), LAC856 (r = 0.74, p = 0.0001) and DLCO%pred (r = −0.71, p < 0.0001). Intra-observer reproducibility (COV = 4–27%; ICC = 0.75–0.94) was moderate to high for trainees; intra- and inter-observer COV were negatively and non-linearly correlated with emphysema score. Conclusion: We developed a GUI for rapid and interactive emphysema scoring that allows for comparison of multiple readers with clinical and radiological standards.
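
    Two of the quantitative references above, RA950 and HU15, are simple functionals of the lung density histogram: the percentage of lung voxels below −950 HU and the 15th-percentile HU value. A minimal sketch; the voxel values are simulated, and no lung segmentation step is shown.

```python
import numpy as np

def emphysema_metrics(lung_hu):
    """RA950 (% of lung voxels below -950 HU) and HU15 (15th percentile)."""
    lung_hu = np.asarray(lung_hu, float)
    ra950 = 100.0 * np.mean(lung_hu < -950.0)
    hu15 = np.percentile(lung_hu, 15)
    return ra950, hu15

# Stand-in for segmented-lung voxel values from one CT volume
rng = np.random.default_rng(2)
voxels = rng.normal(-860, 60, size=500_000)
ra950, hu15 = emphysema_metrics(voxels)
print(f"RA950 = {ra950:.1f}%, HU15 = {hu15:.0f} HU")
```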

  11. Automated concept and relationship extraction for the semi-automated ontology management (SEAM) system.

    Science.gov (United States)

    Doing-Harris, Kristina; Livnat, Yarden; Meystre, Stephane

    2015-01-01

    We develop medical-specialty specific ontologies that contain the settled science and common term usage. We leverage current practices in information and relationship extraction to streamline the ontology development process. Our system combines different text types with information and relationship extraction techniques in a low overhead modifiable system. Our SEmi-Automated ontology Maintenance (SEAM) system features a natural language processing pipeline for information extraction. Synonym and hierarchical groups are identified using corpus-based semantics and lexico-syntactic patterns. The semantic vectors we use are term frequency by inverse document frequency and context vectors. Clinical documents contain the terms we want in an ontology. They also contain idiosyncratic usage and are unlikely to contain the linguistic constructs associated with synonym and hierarchy identification. By including both clinical and biomedical texts, SEAM can recommend terms from those appearing in both document types. The set of recommended terms is then used to filter the synonyms and hierarchical relationships extracted from the biomedical corpus. We demonstrate the generality of the system across three use cases: ontologies for acute changes in mental status, Medically Unexplained Syndromes, and echocardiogram summary statements. Across the three use cases, we held the number of recommended terms relatively constant by changing SEAM's parameters. Experts seem to find more than 300 recommended terms to be overwhelming. The approval rate of recommended terms increased as the number and specificity of clinical documents in the corpus increased. It was 60% when there were 199 clinical documents that were not specific to the ontology domain and 90% when there were 2879 documents very specific to the target domain. We found that fewer than 100 recommended synonym groups were also preferred. Approval rates for synonym recommendations remained low, varying from 43% to 25% as the
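
    SEAM's corpus-based synonym candidates rest on the idea that terms with similar context vectors are related. A toy sketch of that idea with TF-IDF vectors and cosine similarity; the tiny corpus, threshold, and term set are invented for illustration and are far smaller than anything SEAM processes.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Candidate terms represented by the contexts they occur in (toy corpus)
contexts = {
    "delirium":                "acute onset fluctuating confusion inattention",
    "acute confusional state": "acute onset confusion fluctuating inattention",
    "dementia":                "chronic progressive memory decline",
}
terms = list(contexts)
vectors = TfidfVectorizer().fit_transform(contexts.values())
sim = cosine_similarity(vectors)

# Terms whose context vectors are close become synonym candidates
for i in range(len(terms)):
    for j in range(i + 1, len(terms)):
        if sim[i, j] > 0.5:
            print(f"synonym candidate: {terms[i]!r} ~ {terms[j]!r} ({sim[i, j]:.2f})")
```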

  12. Preliminary clinical evaluation of semi-automated nailfold capillaroscopy in the assessment of patients with Raynaud's phenomenon.

    Science.gov (United States)

    Murray, Andrea K; Feng, Kaiyan; Moore, Tonia L; Allen, Phillip D; Taylor, Christopher J; Herrick, Ariane L

    2011-08-01

    Nailfold capillaroscopy is well established in screening patients with Raynaud's phenomenon for underlying SSc-spectrum disorders, by identifying abnormal capillaries. Our aim was to compare semi-automatic feature measurement from newly developed software with manual measurements, and to determine the degree to which semi-automated data allow disease group classification. Images from 46 healthy controls, 21 patients with PRP, and 49 with SSc were preprocessed, and semi-automated measurements of intercapillary distance and of capillary width, tortuosity, and derangement were performed. These were compared with manual measurements. Features were used to classify images into the three subject groups. Comparison of automatic and manual measures for distance, width, tortuosity, and derangement gave correlations of r=0.583, 0.624, 0.495 (p<0.001), and 0.195 (p=0.040), respectively. For automatic measures, correlations were found between width and intercapillary distance, r=0.374, and between width and tortuosity, r=0.573 (p<0.001). Significant differences between subject groups were found for all features (p<0.002). Overall, 75% of images were correctly matched to their clinical classification using semi-automated features, compared with 71% for manual measurements. Semi-automatic and manual measurements of distance, width, and tortuosity showed moderate (but statistically significant) correlations; the correlation for derangement was weaker. Semi-automatic measurements are faster than manual measurements, identify differences between groups, and are as good as manual measurements for between-group classification. © 2011 John Wiley & Sons Ltd.

  13. Applicability Of A Semi-Automated Clinical Chemistry Analyzer In Determining The Antioxidant Concentrations Of Selected Plants

    Directory of Open Access Journals (Sweden)

    Allan L. Hilario

    2017-07-01

    Plants are rich sources of antioxidants that are protective against diseases associated with oxidative stress. There is a need for a high-throughput screening method for determining the antioxidant concentration in plants; such a method would significantly simplify and speed up most antioxidant assays. This paper aimed at comparing the applicability of a semi-automated clinical chemistry analyzer (Pointe Scientific, MI, USA) with the traditional standard-curve method using a Vis spectrophotometer in performing the DPPH assay for antioxidant screening. Samples of crude aqueous leaf extract of kulitis (Amaranthus viridis Linn.) and chayote (Sechium edule Linn.) were screened for Total Antioxidant Concentration (TAC) using the two methods. Results, presented as mean ± SD in μg/dl, were compared using the unpaired Student's t-test (P < 0.05). All runs were done in triplicate. The mean TAC of A. viridis was 646.0 ± 45.5 μg/dl using the clinical chemistry analyzer and 581.9 ± 19.4 μg/dl using the standard curve-spectrophotometer method. On the other hand, the mean TAC of S. edule was 660.2 ± 35.9 μg/dl using the semi-automated clinical chemistry analyzer and 672.3 ± 20.9 μg/dl using the spectrophotometer. No significant differences were observed between the readings of the two methods for either A. viridis (P > 0.05) or S. edule (P > 0.05). This implies that the clinical chemistry analyzer can be an alternative method for conducting the DPPH assay to determine TAC in plants. This study presented the applicability of a semi-automated clinical chemistry analyzer in performing the DPPH assay; further validation can be conducted by performing other antioxidant assays using this equipment.
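
    The standard-curve arm of the comparison above maps DPPH absorbance (typically read at 517 nm) back to concentration through a linear fit to known standards. A minimal sketch with hypothetical calibrator values, not the paper's data:

```python
import numpy as np

# Hypothetical standard curve: absorbance vs. antioxidant standards (ug/dl)
std_conc = np.array([0.0, 200.0, 400.0, 600.0, 800.0])
std_abs  = np.array([0.900, 0.720, 0.545, 0.370, 0.190])

# DPPH absorbance falls roughly linearly as antioxidant concentration rises,
# so a first-degree fit maps sample absorbance back to concentration.
slope, intercept = np.polyfit(std_conc, std_abs, 1)

def tac_from_absorbance(a517):
    """Total antioxidant concentration (ug/dl) from DPPH absorbance."""
    return (a517 - intercept) / slope

triplicate = np.array([0.335, 0.329, 0.341])    # one extract, three runs
tac = tac_from_absorbance(triplicate)
print(f"TAC = {tac.mean():.1f} +/- {tac.std(ddof=1):.1f} ug/dl")
```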

  14. Semi-automated preparation of the dopamine transporter ligand [18F]FECNT for human PET imaging studies

    International Nuclear Information System (INIS)

    Voll, Ronald J.; McConathy, Jonathan; Waldrep, Michael S.; Crowe, Ronald J.; Goodman, Mark M.

    2005-01-01

    The fluorine-18 labeled dopamine transporter (DAT) ligand 2β-carbomethoxy-3β-(4-chlorophenyl)-8-(2-fluoroethyl)nortropane (FECNT) has shown promising properties as an in vivo DAT imaging agent in human and monkey PET studies. A semi-automated synthesis has been developed to reliably produce [18F]FECNT in a 16% decay-corrected yield. This method utilizes a new [18F]fluoroalkylating agent and provides high-purity [18F]FECNT in a formulation suitable for human use.

  15. Preliminary analysis of four numerical models for calculating the mesoscale transport of Kr-85

    Energy Technology Data Exchange (ETDEWEB)

    Pepper, D W; Cooper, R E [Du Pont de Nemours (E.I.) and Co., Aiken, SC (USA). Savannah River Lab.]

    1983-01-01

    A performance study of four numerical algorithms for multi-dimensional advection-diffusion prediction on mesoscale grids has been made. Dispersion from point and distributed sources and a simulation of a continuous source are compared with analytical solutions to assess relative accuracy. Model predictions are then compared with actual measurements of Kr-85 emitted from the Savannah River Plant (SRP). The particle-in-cell and method of moments algorithms exhibit superior accuracy in modeling single source releases. For modeling distributed sources, algorithms based on the pseudospectral and finite element interpolation concepts exhibit comparable accuracy. The method of moments is felt to be the best overall performer, although all the models appear to be relatively close in accuracy.
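
    All four algorithms compared in this record solve the advection-diffusion equation ∂c/∂t + u ∂c/∂x = K ∂²c/∂x²; a baseline first-order upwind/centered scheme makes concrete the numerical diffusion those schemes were designed to reduce. A 1-D periodic sketch with illustrative parameters:

```python
import numpy as np

def advect_diffuse(c, u, K, dx, dt, steps):
    """1-D upwind advection + centered diffusion of a tracer (e.g., Kr-85).

    Stability requires u*dt/dx <= 1 and K*dt/dx**2 <= 0.5. First-order
    upwind is itself diffusive; particle-in-cell, moments, pseudospectral,
    and finite element schemes all aim to reduce that artificial spreading.
    """
    c = c.copy()
    for _ in range(steps):
        adv = -u * (c - np.roll(c, 1)) / dx                       # upwind, u > 0
        dif = K * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx ** 2
        c += dt * (adv + dif)
    return c

x = np.linspace(0, 100e3, 500)                      # 100 km periodic domain (m)
c0 = np.exp(-((x - 20e3) ** 2) / (2 * 2e3 ** 2))    # point-like release
c = advect_diffuse(c0, u=5.0, K=50.0, dx=x[1] - x[0], dt=10.0, steps=3000)
```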

  16. Maggot Instructor: Semi-Automated Analysis of Learning and Memory in Drosophila Larvae

    Directory of Open Access Journals (Sweden)

    Urte Tomasiunaite

    2018-06-01

    For several decades, Drosophila has been widely used as a suitable model organism to study the fundamental processes of associative olfactory learning and memory. More recently, this has also become true for the Drosophila larva, which has become a focus for learning and memory studies thanks to a number of technical advances in anatomical, molecular, and neuronal analyses. Noteworthy are the ongoing efforts to reconstruct the complete connectome of the larval brain, featuring a total of about 10,000 neurons, and the development of neurogenic tools that allow individual manipulation of each neuron. By contrast, the standardized behavioral assays commonly used to analyze learning and memory in Drosophila larvae have seen no such technical development. Most commonly, a simple assay with Petri dishes and odor containers is used, in which the animals must be manually transferred in several steps. This behavioral approach is therefore labor-intensive and limits the capacity to conduct large-scale genetic screenings in small laboratories. To circumvent these limitations, we introduce a training device called the Maggot Instructor. This device allows automatic training of up to 10 groups of larvae in parallel. To achieve this, we used fully automated, computer-controlled optogenetic activation of single olfactory neurons in combination with the application of electric shocks. We showed that Drosophila larvae trained with the Maggot Instructor establish an odor-specific memory, which is independent of handling and non-associative effects. The Maggot Instructor will make it possible to investigate large collections of genetically modified larvae in a short period and with minimal human resources, and should therefore help extensive behavioral experiments in Drosophila larvae keep up with current technical advancements. In the longer term, this will lead to a better understanding of how learning and memory are organized at the cellular, synaptic, and molecular levels in Drosophila larvae.

  17. Semi-automated separation of the epimeric dehydropyrrolizidine alkaloids lycopsamine and intermedine: preparation of their N-oxides and NMR comparison with diastereoisomeric rinderine and echinatine.

    Science.gov (United States)

    Colegate, Steven M; Gardner, Dale R; Betz, Joseph M; Panter, Kip E

    2014-01-01

    The diversity of structure and, particularly, stereochemical variation of the dehydropyrrolizidine alkaloids can present challenges for analysis and for the isolation of pure compounds for the preparation of analytical standards and for toxicology studies. Our aims were to investigate methods for the separation of gram-scale quantities of the epimeric dehydropyrrolizidine alkaloids lycopsamine and intermedine and to compare their NMR spectroscopic data with those of their heliotridine-based analogues echinatine and rinderine. Lycopsamine and intermedine were extracted, predominantly as their N-oxides and along with their acetylated derivatives, from commercial samples of comfrey (Symphytum officinale) root. Alkaloid enrichment involved liquid-liquid partitioning of the crude methanol extract between dilute aqueous acid and n-butanol, reduction of N-oxides, and subsequent continuous liquid-liquid extraction of free base alkaloids into CHCl3. The alkaloid-rich fraction was further subjected to semi-automated flash chromatography using boronated soda glass beads or boronated quartz sand. Boronated soda glass bead (or quartz sand) chromatography adapted to a Biotage Isolera Flash Chromatography System enabled large-scale separation (at least up to 1-2 g quantities) of lycopsamine and intermedine. The structures were confirmed using one- and two-dimensional ¹H- and ¹³C-NMR spectroscopy. Examination of the NMR data for lycopsamine, intermedine, and their heliotridine-based analogues echinatine and rinderine allowed for some amendments of literature data and provided useful comparisons for determining relative configurations in monoester dehydropyrrolizidine alkaloids. A similar NMR comparison of lycopsamine and intermedine with their N-oxides showed the effects of N-oxidation on some key chemical shifts. A levorotatory shift in specific rotation from +3.29° to -1.5° was observed for lycopsamine when dissolved in ethanol or methanol, respectively. A semi-automated flash

  18. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    Science.gov (United States)

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can be used to measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods that required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) was achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.
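
    Isotope dilution quantitation, as used above, normalizes each analyte's peak area to that of its stable-isotope-labeled internal standard before applying a calibration curve. A minimal sketch with hypothetical calibrator values:

```python
import numpy as np

# Hypothetical calibrators: analyte concentration (ng/mL) vs. peak-area ratio
# of analyte to its stable-isotope-labeled internal standard
conc  = np.array([0.1, 0.5, 1.0, 5.0, 10.0, 25.0])
ratio = np.array([0.021, 0.103, 0.198, 1.010, 1.995, 5.050])

slope, intercept = np.polyfit(conc, ratio, 1)    # linear calibration

def quantify(sample_ratio):
    """Urinary metabolite concentration from an analyte/IS area ratio."""
    return (sample_ratio - intercept) / slope

print(f"{quantify(0.430):.2f} ng/mL")
```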

  19. Variability of mesoscale features in the Mediterranean Sea from XBT data analysis

    Directory of Open Access Journals (Sweden)

    G. Fusco

    2003-01-01

    During the period 1998-2000, the Mediterranean Forecasting System Pilot Project, aiming to build a forecasting system for the physical state of the sea, was carried out. A ship-of-opportunity programme sampled the Mediterranean upper-ocean thermal structure by means of eXpendable Bathy-Thermographs (XBTs) along seven tracks, from September 1999 to May 2000. The tracks were designed to detect some of the main circulation features, such as the stream of surface Atlantic water flowing from the Alboran Sea to the eastern Levantine Basin. The cyclonic gyres in the Liguro-Provençal Basin, the southern Adriatic and Ionian Seas, and the anticyclonic gyres in the Levantine Basin were also features to be detected. The monitoring system confirmed a long-term persistence of structures (at least during the entire observing period) that were previously thought to be transient features. In particular, the anticyclonic Shikmona and Ierapetra Gyres were observed in the Levantine Basin during the monitoring period. In order to identify the major changes in the thermal structures and their dynamical implications, the XBT data are compared with historical measurements collected in the 1980s and 1990s. The results indicate that some thermal features are being restored to the situation that existed in the 1980s, after the changes induced by the so-called "Eastern Mediterranean Transient". Key words: Oceanography: physical (eddies and mesoscale processes; general circulation; instruments and techniques)

  20. An ImageJ-based algorithm for a semi-automated method for microscopic image enhancement and DNA repair foci counting

    International Nuclear Information System (INIS)

    Klokov, D.; Suppiah, R.

    2015-01-01

    Proper evaluation of the health risks of low-dose ionizing radiation exposure relies heavily on the ability to accurately measure very low levels of DNA damage in cells. One of the most sensitive methods for measuring DNA damage levels is the quantification of DNA repair foci, which consist of macromolecular aggregates of DNA repair proteins, such as γH2AX and 53BP1, forming around individual DNA double-strand breaks. They can be quantified using immunofluorescence microscopy and are widely used as markers of DNA double-strand breaks. However, this quantification, if performed manually, may be very tedious and prone to inter-individual bias. Low-dose radiation studies are especially sensitive to this potential bias because of the very low magnitude of the anticipated effects. We therefore designed and validated an algorithm for the semi-automated processing of microscopic images and quantification of DNA repair foci. The algorithm uses ImageJ, a freely available image analysis software package that is customizable to individual cellular properties or experimental conditions. We validated the algorithm using immunolabeled 53BP1 and γH2AX in normal human fibroblast AG01522 cells under both normal and irradiated conditions. This method is easy to learn, can be used by non-trained personnel, and can help avoid discrepancies in inter-laboratory comparison studies examining the effects of low-dose radiation. (author)

  1. An ImageJ-based algorithm for a semi-automated method for microscopic image enhancement and DNA repair foci counting

    Energy Technology Data Exchange (ETDEWEB)

    Klokov, D., E-mail: dmitry.klokov@cnl.ca [Canadian Nuclear Laboratories, Chalk River, Ontario (Canada)]; Suppiah, R. [Queen's Univ., Dept. of Biomedical and Molecular Sciences, Kingston, Ontario (Canada)]

    2015-06-15

    Proper evaluation of the health risks of low-dose ionizing radiation exposure relies heavily on the ability to accurately measure very low levels of DNA damage in cells. One of the most sensitive methods for measuring DNA damage levels is the quantification of DNA repair foci, which consist of macromolecular aggregates of DNA repair proteins, such as γH2AX and 53BP1, forming around individual DNA double-strand breaks. They can be quantified using immunofluorescence microscopy and are widely used as markers of DNA double-strand breaks. However, this quantification, if performed manually, may be very tedious and prone to inter-individual bias. Low-dose radiation studies are especially sensitive to this potential bias because of the very low magnitude of the anticipated effects. We therefore designed and validated an algorithm for the semi-automated processing of microscopic images and quantification of DNA repair foci. The algorithm uses ImageJ, a freely available image analysis software package that is customizable to individual cellular properties or experimental conditions. We validated the algorithm using immunolabeled 53BP1 and γH2AX in normal human fibroblast AG01522 cells under both normal and irradiated conditions. This method is easy to learn, can be used by non-trained personnel, and can help avoid discrepancies in inter-laboratory comparison studies examining the effects of low-dose radiation. (author)
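
    The enhance/threshold/count pattern that the ImageJ algorithm automates can also be prototyped outside ImageJ. A hedged Python sketch with SciPy on a synthetic nucleus image; the published macro's exact enhancement steps, threshold choice, and size cutoffs are not reproduced here.

```python
import numpy as np
from scipy import ndimage

def count_foci(channel, background_sigma=10, min_area=4):
    """Count bright foci in one fluorescence channel (gamma-H2AX or 53BP1).

    Background is flattened with a coarse Gaussian estimate, the image is
    thresholded, and connected components above a size cutoff are counted.
    """
    img = channel.astype(float)
    img -= ndimage.gaussian_filter(img, background_sigma)   # flatten background
    mask = img > img.mean() + 3 * img.std()                 # global threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1))      # pixels per blob
    return int(np.sum(sizes >= min_area))

# Synthetic nucleus with five planted foci
rng = np.random.default_rng(3)
frame = rng.normal(100, 5, (256, 256))
for cy, cx in [(40, 40), (60, 180), (128, 128), (200, 70), (220, 210)]:
    frame[cy - 2:cy + 3, cx - 2:cx + 3] += 80
print(count_foci(frame))   # expect 5 with these settings
```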

  2. Analysis and simulation of mesoscale convective systems accompanying heavy rainfall: The goyang case

    Science.gov (United States)

    Choi, Hyun-Young; Ha, Ji-Hyun; Lee, Dong-Kyou; Kuo, Ying-Hwa

    2011-05-01

    We investigated a torrential rainfall case with a daily rainfall amount of 379 mm and a maximum hourly rain rate of 77.5 mm that took place on 12 July 2006 at Goyang, in the middle-western part of the Korean Peninsula. The heavy rainfall was responsible for flash flooding and was highly localized. High-resolution Doppler radar data from 5 radar sites located over central Korea were analyzed. Numerical simulations using the Weather Research and Forecasting (WRF) model were also performed to complement the high-resolution observations and to further investigate the thermodynamic structure and development of the convective system. The grid nudging method using the Global Final (FNL) Analyses data was applied to the coarse model domain (30 km) in order to provide more realistic initial and boundary conditions for the nested model domains (10 km, 3.3 km). The mesoscale convective system (MCS) that caused the flash flooding was initiated by the strong low-level jet (LLJ) at the frontal region of high equivalent potential temperature (θe) near the west coast over the Yellow Sea. Ascent of the warm, moist air was induced dynamically by the LLJ. The convective cells were triggered by small thermal perturbations and developed abruptly with the warm θe inflow. Within the MCS, several convective cells responsible for the rainfall peak at Goyang developed simultaneously with neighboring cells and interacted with each other. Moist absolutely unstable layers (MAULs) were seen in the lower troposphere, with the very moist environment adding to the instability for the development of the MCS.

  3. Microbialite Biosignature Analysis by Mesoscale X-ray Fluorescence (μXRF) Mapping.

    Science.gov (United States)

    Tice, Michael M; Quezergue, Kimbra; Pope, Michael C

    2017-11-01

    As part of its biosignature detection package, the Mars 2020 rover will carry PIXL, the Planetary Instrument for X-ray Lithochemistry, a spatially resolved X-ray fluorescence (μXRF) spectrometer. Understanding the types of biosignatures detectable by μXRF and the rock types μXRF is most effective at analyzing is therefore an important goal in preparation for in situ Mars 2020 science and sample selection. We tested mesoscale chemical mapping for biosignature interpretation in microbialites. In particular, we used μXRF to identify spatial distributions and associations between various elements ("fluorescence microfacies") to infer the physical, biological, and chemical processes that produced the observed compositional distributions. As a test case, elemental distributions from μXRF scans of stromatolites from the Mesoarchean Nsuze Group (2.98 Ga) were analyzed. We included five fluorescence microfacies: laminated dolostone, laminated chert, clotted dolostone and chert, stromatolite clast breccia, and cavity fill. Laminated dolostone was formed primarily by microbial mats that trapped and bound loose sediment and likely precipitated carbonate mud at a shallow depth below the mat surface. Laminated chert was produced by the secondary silicification of microbial mats. Clotted dolostone and chert grew as cauliform, cryptically laminated mounds similar to younger thrombolites and was likely formed by a combination of mat growth and patchy precipitation of early-formed carbonate. Stromatolite clast breccias formed as lag deposits filling erosional scours and interstromatolite spaces. Cavities were filled by microquartz, Mn-rich dolomite, and partially dolomitized calcite. Overall, we concluded that μXRF is effective for inferring genetic processes and identifying biosignatures in compositionally heterogeneous rocks. Key Words: Stromatolites-Biosignatures-Spectroscopy-Archean. Astrobiology 17, 1161-1172.

  4. Semi-automated non-invasive diagnostics method for melanoma differentiation from nevi and pigmented basal cell carcinomas

    Science.gov (United States)

    Lihacova, I.; Bolocko, K.; Lihachev, A.

    2017-12-01

    The incidence of skin cancer is still increasing, mostly in industrialized countries with light-skinned populations. Late tumour detection is the main reason for the high mortality associated with skin cancer. The accessibility of early skin cancer diagnostics in Latvia is limited by several factors, such as the high cost of dermatology services, long queues for state-funded oncologist examinations, and the inaccessibility of oncologists in countryside regions; this is a real clinical problem. New strategies and guidelines for early skin cancer detection and post-surgical follow-up intend to realize full-body examination (FBE) by primary care physicians (general practitioners, interns) in combination with classical dermoscopy. To implement this approach, a semi-automated method was established. The developed software analyses a combination of three optical density images, at 540 nm, 650 nm, and 950 nm, of pigmented skin malformations and classifies them into three groups: nevi, pigmented basal cell carcinomas, or melanomas.

  5. Serum bactericidal assay for the evaluation of typhoid vaccine using a semi-automated colony-counting method.

    Science.gov (United States)

    Jang, Mi Seon; Sahastrabuddhe, Sushant; Yun, Cheol-Heui; Han, Seung Hyun; Yang, Jae Seung

    2016-08-01

    Typhoid fever, mainly caused by Salmonella enterica serovar Typhi (S. Typhi), is a life-threatening disease, mostly in developing countries. Enzyme-linked immunosorbent assay (ELISA) is widely used to quantify antibodies against S. Typhi in serum but does not provide information about functional antibody titers. Although the serum bactericidal assay (SBA) using an agar plate is often used to measure functional antibody titers against various bacterial pathogens in clinical specimens, it has rarely been used for typhoid vaccines because it is time-consuming and labor-intensive. In the present study, we established an improved SBA against S. Typhi using a semi-automated colony-counting system with a square agar plate harboring 24 samples. The semi-automated SBA efficiently measured bactericidal titers of sera from individuals immunized with S. Typhi Vi polysaccharide vaccines. The assay responded specifically to S. Typhi Ty2 but not to other, irrelevant enteric bacteria, including Vibrio cholerae and Shigella flexneri. Baby rabbit complement was a more appropriate source for the SBA against S. Typhi than complement from adult rabbit, guinea pig, or human. We also examined the correlation between SBA and ELISA for measuring antibody responses against S. Typhi using pre- and post-vaccination sera from 18 human volunteers. The SBA titer showed a good correlation with anti-Vi IgG quantity in serum, with a Spearman correlation coefficient of 0.737 (P < 0.05), indicating that the semi-automated SBA can be used to measure functional antibody titers against S. Typhi in sera from human subjects immunized with typhoid vaccines. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  6. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years

  7. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    Directory of Open Access Journals (Sweden)

    Kottawattage S. A. Kottawatta

    2017-11-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared with C. jejuni. High proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  8. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples.

    Science.gov (United States)

    Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-11-29

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared with C. jejuni. High proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  9. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka : Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    NARCIS (Netherlands)

    Kottawatta, Kottawattage S A; van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-01-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on

  10. Terminal digit bias is not an issue for properly trained healthcare personnel using manual or semi-automated devices - biomed 2010.

    Science.gov (United States)

    Butler, Kenneth R; Minor, Deborah S; Benghuzzi, Hamed A; Tucci, Michelle

    2010-01-01

    The objective of this study was to evaluate terminal digit preference in blood pressure (BP) measurements taken from a sample of clinics at a large academic health sciences center. We hypothesized that terminal digit preference would occur more frequently in BP measurements taken with manual mercury sphygmomanometry than in those obtained with semi-automated instruments. A total of 1,393 BP measures were obtained in 16 ambulatory and inpatient sites by personnel using both mercury (n=1,286) and semi-automated (n=107) devices. For the semi-automated devices, a trained observer repeated the patient's BP following American Heart Association recommendations, using a similar device with a known calibration history. At least two recorded systolic and diastolic blood pressures (the average of two or more readings for each) were obtained for all manual mercury readings. Data were evaluated using descriptive statistics and chi-square tests as appropriate (SPSS software, 17.0). Overall, zero and other terminal digit preference was observed more frequently in systolic measurements taken with manual instruments (χ² = 883.21, df = 9, p < 0.0001), while all end digits obtained by clinic staff using semi-automated devices were more evenly distributed (χ² = 8.23, df = 9, p = 0.511 for systolic and χ² = 10.48, df = 9, p = 0.313 for diastolic). In addition to zero digit bias in mercury readings, even numbers were reported with significantly higher frequency than odd numbers. There was no detectable digit preference when examining semi-automated measurements by clinic staff or by device type for either systolic or diastolic BP measures. These findings demonstrate that terminal digit preference was more likely to occur with manual mercury sphygmomanometry. This phenomenon was most likely the result of mercury column graduations in 2 mm Hg increments, which produce a higher-than-expected frequency of even digits.
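
    The uniformity test behind these χ² statistics compares observed terminal-digit counts against an expected n/10 per digit (df = 9). A short sketch that reproduces the qualitative contrast with simulated readings:

```python
import numpy as np
from scipy.stats import chisquare

def terminal_digit_test(readings):
    """Chi-square test of uniformity for the last digit of BP readings.

    Under no digit preference each terminal digit 0-9 is equally likely,
    so expected counts are n/10 and the test has df = 9.
    """
    digits = np.asarray(readings, int) % 10
    observed = np.bincount(digits, minlength=10)
    return chisquare(observed)   # uniform expected frequencies by default

rng = np.random.default_rng(4)
# Simulated manual readings rounded to the nearest even number (biased)...
manual = (rng.normal(128, 12, 1000) / 2).round().astype(int) * 2
print(terminal_digit_test(manual))   # large chi-square, tiny p-value
# ...vs. unrounded semi-automated readings (no preference expected)
auto = rng.normal(128, 12, 1000).round().astype(int)
print(terminal_digit_test(auto))
```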

  11. Meso-Scale Finite Element Analysis of Mechanical Behavior of 3D Braided Composites Subjected to Biaxial Tension Loadings

    Science.gov (United States)

    Zhang, Chao; Curiel-Sosa, Jose L.; Bui, Tinh Quoc

    2018-04-01

    In many engineering applications, 3D braided composites are designed for primary load-bearing structures, and they are frequently subjected to multi-axial loading conditions during service. In this paper, a unit-cell based finite element model is developed for assessment of the mechanical behavior of 3D braided composites under different biaxial tension loadings. To predict the damage initiation and evolution of braiding yarns and matrix in the unit cell, we propose an anisotropic damage model based on Murakami damage theory in conjunction with the Hashin failure criteria and maximum stress criteria. To attain exact stress ratios, a force-loading mode of periodic boundary conditions, which has not been attempted before, is applied to the unit-cell model to impose the biaxial tension loadings. The biaxial mechanical behaviors, such as the stress distribution, tensile modulus, and tensile strength, are analyzed and discussed. The damage development of 3D braided composites under typical biaxial tension loadings is simulated, and the damage mechanisms are revealed in the simulation process. The present study provides a new reference for the meso-scale finite element analysis (FEA) of the multi-axial mechanical behavior of textile composites.
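
    For context on the failure criteria named above: Hashin-type criteria define per-mode failure indices from the yarn-level stress state, with onset at an index of 1. The sketch below shows one common 3-D form for the two tensile modes only; the paper's exact formulation, strength values, and the Murakami damage evolution are not reproduced.

```python
def hashin_tensile_indices(s11, s22, s33, t12, t13, t23, Xt, Yt, S12, S23):
    """Failure indices (>= 1 means onset) for tensile fiber and matrix modes.

    One common 3-D Hashin form; stresses in the yarn material frame (MPa),
    Xt/Yt longitudinal/transverse tensile strengths, S12/S23 shear strengths.
    """
    # Fiber tension (applies when s11 > 0)
    fiber_t = (max(s11, 0.0) / Xt) ** 2 + (t12 ** 2 + t13 ** 2) / S12 ** 2
    # Matrix tension (applies when s22 + s33 > 0)
    matrix_t = ((max(s22 + s33, 0.0) / Yt) ** 2
                + (t23 ** 2 - s22 * s33) / S23 ** 2
                + (t12 ** 2 + t13 ** 2) / S12 ** 2)
    return fiber_t, matrix_t

# Illustrative yarn-level stresses and strengths under biaxial tension
print(hashin_tensile_indices(s11=1800.0, s22=30.0, s33=5.0, t12=40.0,
                             t13=10.0, t23=8.0,
                             Xt=2500.0, Yt=60.0, S12=90.0, S23=50.0))
```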

  12. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    International Nuclear Information System (INIS)

    Yung, J; Stefan, W; Reeve, D; Stafford, RJ

    2015-01-01

    Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, the American College of Radiology (ACR) MR Accreditation Program phantom testing, and the ACR MRI quality control (QC) program documents outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allows easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5 T and 3.0 T field strengths from GE and Siemens. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, a MySQL database, and the Python programming language for the front and back ends. Results: Processing time for each image is <2 seconds. Figures are produced to show the regions of interest (ROIs) used for analysis. Historical data can be reviewed to compare against previous years and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high-quality, ACR-accredited MRI programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large datasets.

  13. TU-F-CAMPUS-I-05: Semi-Automated, Open Source MRI Quality Assurance and Quality Control Program for Multi-Unit Institution

    Energy Technology Data Exchange (ETDEWEB)

    Yung, J; Stefan, W; Reeve, D; Stafford, RJ [UT MD Anderson Cancer Center, Houston, TX (United States)

    2015-06-15

    Purpose: Phantom measurements allow the performance of magnetic resonance (MR) systems to be evaluated. The American Association of Physicists in Medicine (AAPM) Report No. 100, Acceptance Testing and Quality Assurance Procedures for MR Imaging Facilities, the American College of Radiology (ACR) MR Accreditation Program MR phantom testing, and the ACR MRI quality control (QC) program documents help to outline specific tests for establishing system performance baselines as well as system stability over time. Analyzing and processing tests from multiple systems can be time-consuming for medical physicists. Besides determining whether tests are within predetermined limits or criteria, monitoring longitudinal trends can also help prevent costly downtime of systems during clinical operation. In this work, a semi-automated QC program was developed to analyze and record measurements in a database that allowed for easy access to historical data. Methods: Image analysis was performed on 27 different MR systems of 1.5T and 3.0T field strengths from GE and Siemens manufacturers. Recommended measurements involved the ACR MRI Accreditation Phantom, spherical homogeneous phantoms, and a phantom with a uniform hole pattern. Measurements assessed geometric accuracy and linearity, position accuracy, image uniformity, signal, noise, ghosting, transmit gain, center frequency, and magnetic field drift. The program was designed with open source tools, employing Linux, Apache, a MySQL database and the Python programming language for the front and back end. Results: Processing time for each image is <2 seconds. Figures are produced to show regions of interest (ROIs) for analysis. Historical data can be reviewed to compare previous year data and to inspect for trends. Conclusion: An MRI quality assurance and QC program is necessary for maintaining high quality, ACR MRI Accredited MR programs. A reviewable database of phantom measurements assists medical physicists with processing and monitoring of large datasets.
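
    The abstract does not publish the analysis code; the following minimal Python sketch shows one standard ACR-style uniformity metric of the kind such a QC program computes, assuming a homogeneous-phantom ROI supplied as a NumPy array and using local averaging to suppress noise.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def percent_image_uniformity(roi, window=5):
    """ACR-style percent integral uniformity over a large central ROI of a
    homogeneous-phantom image: PIU = 100 * (1 - (Smax - Smin)/(Smax + Smin)),
    with Smax/Smin taken from locally averaged signal to suppress noise."""
    smoothed = uniform_filter(np.asarray(roi, float), size=window)
    s_max, s_min = smoothed.max(), smoothed.min()
    return 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))
```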

  14. Methods and measurement variance for field estimations of coral colony planar area using underwater photographs and semi-automated image segmentation.

    Science.gov (United States)

    Neal, Benjamin P; Lin, Tsung-Han; Winter, Rivah N; Treibitz, Tali; Beijbom, Oscar; Kriegman, David; Kline, David I; Greg Mitchell, B

    2015-08-01

    Size and growth rates for individual colonies are some of the most essential descriptive parameters for understanding coral communities, which are currently experiencing worldwide declines in health and extent. Accurately measuring coral colony size and changes over multiple years can reveal demographic, growth, or mortality patterns often not apparent from short-term observations and can expose environmental stress responses that may take years to manifest. Describing community size structure can reveal population dynamics patterns, such as periods of failed recruitment or patterns of colony fission, which have implications for the future sustainability of these ecosystems. However, rapidly and non-invasively measuring coral colony sizes in situ remains a difficult task, as three-dimensional underwater digital reconstruction methods are currently not practical for large numbers of colonies. Two-dimensional (2D) planar area measurements from projection of underwater photographs are a practical size proxy, although this method presents operational difficulties in obtaining well-controlled photographs in the highly rugose environment of the coral reef, and requires extensive time for image processing. Here, we present and test the measurement variance for a method of making rapid planar area estimates of small to medium-sized coral colonies using a lightweight monopod image-framing system and a custom semi-automated image segmentation analysis program. This method demonstrated a coefficient of variation of 2.26% for repeated measurements in realistic ocean conditions, a level of error appropriate for rapid, inexpensive field studies of coral size structure, inferring change in colony size over time, or measuring bleaching or disease extent of large numbers of individual colonies.
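
    A minimal sketch of the repeatability statistic reported above (coefficient of variation of repeated planar-area estimates of one colony); the example area values are hypothetical.

```python
import numpy as np

def coefficient_of_variation(areas):
    """CV% of repeated planar-area measurements of a single colony."""
    areas = np.asarray(areas, dtype=float)
    return 100.0 * areas.std(ddof=1) / areas.mean()

# hypothetical repeated estimates (cm^2) of one colony
print(coefficient_of_variation([412.0, 405.5, 419.8]))
```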

  15. 3D neuromelanin-sensitive magnetic resonance imaging with semi-automated volume measurement of the substantia nigra pars compacta for diagnosis of Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Ogisu, Kimihiro; Shirato, Hiroki [Hokkaido University Graduate School of Medicine, Department of Radiology, Hokkaido (Japan); Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Division of Ultrahigh Field MRI, Iwate (Japan); Sakushima, Ken; Yabe, Ichiro; Sasaki, Hidenao [Hokkaido University Hospital, Department of Neurology, Hokkaido (Japan); Terae, Satoshi; Nakanishi, Mitsuhiro [Hokkaido University Hospital, Department of Radiology, Hokkaido (Japan)

    2013-06-15

    Neuromelanin-sensitive MRI has been reported to be useful in the diagnosis of Parkinson's disease (PD), which results from loss of dopamine-producing cells in the substantia nigra pars compacta (SNc). In this study, we aimed to apply a 3D turbo field echo (TFE) sequence for neuromelanin-sensitive MRI and to evaluate the diagnostic performance of a semi-automated method for measurement of SNc volume in patients with PD. We examined 18 PD patients and 27 healthy volunteers (control subjects). A 3D TFE technique with an off-resonance magnetization transfer pulse was used for neuromelanin-sensitive MRI on a 3T scanner. The SNc volume was semi-automatically measured using a region-growing technique at various thresholds (ranging from 1.66 to 2.48), with the signals measured relative to that of the superior cerebellar peduncle. Receiver operating characteristic (ROC) analysis was performed at all thresholds. Intra-rater reproducibility was evaluated by the intraclass correlation coefficient (ICC). The average SNc volume in the PD group was significantly smaller than that in the control group at all thresholds (P < 0.01, Student's t test). At higher thresholds (>2.0), the area under the ROC curve (Az) increased (0.88). In addition, we observed balanced sensitivity and specificity (0.83 and 0.85, respectively). At lower thresholds, sensitivity tended to increase but specificity decreased in comparison with higher thresholds. The ICC was larger than 0.9 when the threshold was over 1.86. Our method can distinguish the PD group from the control group with high sensitivity and specificity, especially in early-stage PD. (orig.)
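
    A minimal sketch of a seeded region-growing step of the kind described: voxels are joined while their signal, normalized to a reference-region signal (here the superior cerebellar peduncle), exceeds the chosen threshold. The array layout, 6-connectivity and helper names are assumptions, not the authors' code.

```python
import numpy as np
from collections import deque

def grow_region(vol, seed, ref_signal, threshold, spacing_mm):
    """Seeded region growing over 6-connected voxels whose signal,
    normalized by a reference-region signal, exceeds `threshold`.
    Returns the segmented volume in mm^3."""
    mask = np.zeros(vol.shape, dtype=bool)
    mask[seed] = True
    queue = deque([seed])
    offsets = [(1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)]
    while queue:
        z, y, x = queue.popleft()
        for dz, dy, dx in offsets:
            n = (z + dz, y + dy, x + dx)
            if all(0 <= n[i] < vol.shape[i] for i in range(3)) and not mask[n]:
                if vol[n] / ref_signal >= threshold:
                    mask[n] = True
                    queue.append(n)
    return mask.sum() * np.prod(spacing_mm)  # voxel count times voxel volume
```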

  16. A semi-automated tool for reducing the creation of false closed depressions from a filled LIDAR-derived digital elevation model

    Science.gov (United States)

    Waller, John S.; Doctor, Daniel H.; Terziotti, Silvia

    2015-01-01

    Closed depressions on the land surface can be identified by ‘filling’ a digital elevation model (DEM) and subtracting the filled model from the original DEM. However, automated methods suffer from artificial ‘dams’ where surface streams cross under bridges and through culverts. Removal of these false depressions from an elevation model is difficult due to the lack of bridge and culvert inventories; thus, another method is needed to breach these artificial dams. Here, we present a semi-automated workflow and toolbox to remove falsely detected closed depressions created by artificial dams in a DEM. The approach finds the intersections between transportation routes (e.g., roads) and streams, and then lowers the elevation surface across the roads to stream level, allowing flow to be routed under the road. Once the surface is corrected to match the approximate location of the National Hydrologic Dataset stream lines, the procedure is repeated with sequentially smaller flow accumulation thresholds in order to generate stream lines with less contributing area within the watershed. Through multiple iterations, artificial depressions that may arise due to ephemeral flow paths can also be removed. Preliminary results reveal that this new technique provides significant improvements for flow routing across a DEM and minimizes artifacts within the elevation surface. Slight changes in the stream flow lines generally improve the quality of flow routes; however, some artificial dams may persist. Problematic areas include extensive road ditches, particularly along divided highways, and locations where surface flow crosses beneath road intersections. Limitations do exist, and the results partially depend on the quality of the input data. Of 166 manually identified culverts from a previous study by Doctor and Young in 2013, 125 are within 25 m of culverts identified by this tool. After three iterations, 1,735 culverts were identified and cataloged. The result is a reconditioned DEM.
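
    A minimal NumPy sketch of the core carving idea (lowering the surface where road and modeled stream rasters intersect so flow can route "under" the road); the mask inputs and window radius are assumptions, not the authors' toolbox.

```python
import numpy as np

def breach_at_crossings(dem, road_mask, stream_mask, radius=2):
    """Lower road cells to the local stream minimum wherever road and
    stream rasters intersect, breaching the artificial dam."""
    out = dem.copy()
    crossings = np.argwhere(road_mask & stream_mask)
    for r, c in crossings:
        r0, r1 = max(r - radius, 0), r + radius + 1
        c0, c1 = max(c - radius, 0), c + radius + 1
        window = out[r0:r1, c0:c1]                      # view into `out`
        stream_vals = window[stream_mask[r0:r1, c0:c1]]
        window[road_mask[r0:r1, c0:c1]] = stream_vals.min()
    return out
```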

  17. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    Science.gov (United States)

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation. Copyright © 2015 Elsevier Ltd. All rights reserved.
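
    The paper's morphing is landmark-based; as a simple stand-in (the authors may use a more flexible transform), the sketch below fits a least-squares affine map from source to target landmarks and applies it to the source mesh nodes.

```python
import numpy as np

def fit_affine(source_pts, target_pts):
    """Least-squares affine map taking source landmarks (N x 3) onto
    target landmarks (N x 3), N >= 4. Returns a callable for mesh nodes."""
    n = source_pts.shape[0]
    A = np.hstack([source_pts, np.ones((n, 1))])            # N x 4 design matrix
    coeffs, *_ = np.linalg.lstsq(A, target_pts, rcond=None)  # 4 x 3 transform
    return lambda pts: np.hstack([pts, np.ones((len(pts), 1))]) @ coeffs

# morph = fit_affine(src_landmarks, tgt_landmarks)
# morphed_nodes = morph(source_mesh_nodes)  # then refine by mesh mapping
```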

  18. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    Science.gov (United States)

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

    Historically, marine ecologists have lacked efficient tools that are capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m² of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling, and better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.

  19. Semi-automated delineation of breast cancer tumors and subsequent materialization using three-dimensional printing (rapid prototyping).

    Science.gov (United States)

    Schulz-Wendtland, Rüdiger; Harz, Markus; Meier-Meitinger, Martina; Brehm, Barbara; Wacker, Till; Hahn, Horst K; Wagner, Florian; Wittenberg, Thomas; Beckmann, Matthias W; Uder, Michael; Fasching, Peter A; Emons, Julius

    2017-03-01

    Three-dimensional (3D) printing has become widely available, and a few cases of its use in clinical practice have been described. The aim of this study was to explore facilities for the semi-automated delineation of breast cancer tumors and to assess the feasibility of 3D printing of breast cancer tumors. In a case series of five patients, different 3D imaging methods - magnetic resonance imaging (MRI), digital breast tomosynthesis (DBT), and 3D ultrasound - were used to capture 3D data for breast cancer tumors. The volumes of the breast tumors were calculated to assess the comparability of the breast tumor models, and the MRI information was used to render models on a commercially available 3D printer to materialize the tumors. The tumor volumes calculated from the different 3D methods appeared to be comparable. Tumor models with volumes between 325 mm³ and 7,770 mm³ were printed and compared with the models rendered from MRI. The materialization of the tumors reflected the computer models of them. 3D printing (rapid prototyping) appears to be feasible. Scenarios for the clinical use of the technology might include presenting the model to the surgeon to provide a better understanding of the tumor's spatial characteristics in the breast, in order to improve decision-making in relation to neoadjuvant chemotherapy or surgical approaches. J. Surg. Oncol. 2017;115:238-242. © 2016 Wiley Periodicals, Inc.

  20. Semi-automated preparation of an 11C-labelled antibiotic - [N-methyl-11C]erythromycin A lactobionate

    International Nuclear Information System (INIS)

    Pike, V.W.; Palmer, A.J.; Horlock, P.L.; Liss, R.H.

    1984-01-01

    A fast semi-automated method is described for labelling the antibiotic, erythromycin A (1), with the short-lived positron-emitting radionuclide 11C (t1/2 = 20.4 min), in order to permit the non-invasive study of its tissue uptake in vivo. Labelling was achieved by the fast reductive methylation of N-demethylerythromycin A (2) with [11C]formaldehyde, itself prepared from cyclotron-produced [11C]carbon dioxide. Rapid chemical and radiochemical purification of the [N-methyl-11C]erythromycin A (3) was achieved by HPLC and verified by TLC with autoradiography. The purified material was formulated for human i.v. injection as a sterile apyrogenic solution of the lactobionate salt. The preparation takes 42 min from the end of radionuclide production and from [11C]carbon dioxide produces [N-methyl-11C]erythromycin A lactobionate in 4-12% radiochemical yield, corrected for radioactive decay. (author)
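
    A small worked example of the decay correction mentioned in the final sentence, using the 20.4-min half-life of 11C; the activity values are hypothetical.

```python
T_HALF_C11_MIN = 20.4  # half-life of carbon-11 in minutes

def decay_corrected_yield_pct(product_activity, start_activity, elapsed_min):
    """Radiochemical yield (%) corrected for 11C decay, i.e. referencing the
    product activity back to the end of radionuclide production."""
    corrected = product_activity * 2.0 ** (elapsed_min / T_HALF_C11_MIN)
    return 100.0 * corrected / start_activity

# hypothetical activities (GBq) for the 42-min preparation in the abstract
print(decay_corrected_yield_pct(0.9, 37.0, 42.0))  # about 10%, within 4-12%
```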

  1. Improved analysis and visualization of friction loop data: unraveling the energy dissipation of meso-scale stick-slip motion

    Science.gov (United States)

    Kokorian, Jaap; Merlijn van Spengen, W.

    2017-11-01

    In this paper we demonstrate a new method for analyzing and visualizing friction force measurements of meso-scale stick-slip motion, and introduce a method for extracting two separate dissipative energy components. Using a microelectromechanical system tribometer, we execute 2 million reciprocating sliding cycles, during which we measure the static friction force with a resolution of
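
    A minimal sketch of one way to quantify dissipated energy per cycle from a measured friction loop (the enclosed area, i.e. the closed contour integral of F dx); this illustrates the general quantity only, not the authors' two-component decomposition.

```python
import numpy as np

def loop_energy(position, force):
    """Dissipated energy per reciprocating cycle = area enclosed by the
    friction loop, evaluated as the closed contour integral of F dx."""
    x = np.append(position, position[0])  # close the loop
    f = np.append(force, force[0])
    return abs(np.trapz(f, x))
```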

  2. Analysis of mesoscale factors at the onset of deep convection on hailstorm days in Southern France and their relation to the synoptic patterns

    Science.gov (United States)

    Sanchez, Jose Luis; Wu, Xueke; Gascón, Estibaliz; López, Laura; Melcón, Pablo; García-Ortega, Eduardo; Berthet, Claude; Dessens, Jean; Merino, Andrés

    2013-04-01

    Storms and the weather phenomena associated with intense precipitation, lightning, strong winds or hail are among the most common and dangerous weather risks in many European countries. Obtaining a reliable forecast of their occurrence remains an open problem. The question is: how is it possible to improve the reliability of forecasts? Southwestern France is frequently affected by hailstorms, which produce severe damage to crops and property. Considerable efforts have been made to improve the forecasting of hailfall in this area. First of all, improving this type of forecast requires a good "ground truth" of the hail days and the zones affected by hailfall. Fortunately, ANELFA has deployed thousands of hailpad stations in Southern France and has processed the point hailfall data recorded during each hail season at these stations. This paper presents a methodology to improve the forecast of the occurrence of hailfall according to the synoptic environment and mesoscale factors in the study area. One hundred hail days occurring in the period 2000-2010 were selected following spatial and severity criteria. The mesoscale model WRF was applied to all cases to study the synoptic environment of mean geopotential and temperature fields at 500 hPa. Three nested domains were defined following a two-way nesting strategy, with horizontal spatial resolutions of 36, 12 and 4 km, and 30 vertical terrain-following σ-levels. Then, using Principal Component Analysis in T-mode, 4 mesoscale configurations were defined for the fields of convective instability (CI), water vapor flux divergence, and wind flow and humidity in the lower layer (850 hPa), and several clusters were obtained using K-means clustering. Finally, we calculated several characteristic values of four hail forecast parameters: Convective Available Potential Energy (CAPE), Storm Relative Helicity between 0 and 3 km (SRH0-3), Energy-Helicity Index (EHI) and
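
    A minimal sketch of the pattern-classification step described (dimension reduction followed by K-means over daily fields); this is a simplified stand-in for T-mode PCA, and the matrix layout and parameter choices are assumptions.

```python
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

def classify_patterns(fields, n_components=4, n_clusters=4):
    """T-mode-style pattern classification sketch. `fields` is a
    (n_days, n_gridpoints) matrix, each row a flattened 850 hPa field;
    days are reduced with PCA, then grouped into types with K-means."""
    scores = PCA(n_components=n_components).fit_transform(fields)
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(scores)
```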

  3. Modeling mesoscale eddies

    Science.gov (United States)

    Canuto, V. M.; Dubovikov, M. S.

    Mesoscale eddies are not resolved in coarse resolution ocean models and must be modeled. They affect both mean momentum and scalars. At present, no generally accepted model exists for the former; in the latter case, mesoscales are modeled with a bolus velocity u∗ to represent a sink of mean potential energy. However, comparison of u∗(model) vs. u∗ (eddy resolving code, [J. Phys. Ocean. 29 (1999) 2442]) has shown that u∗(model) is incomplete and that additional terms, "unrelated to thickness source or sinks", are required. Thus far, no form of the additional terms has been suggested. To describe mesoscale eddies, we employ the Navier-Stokes and scalar equations and a turbulence model to treat the non-linear interactions. We then show that the problem reduces to an eigenvalue problem for the mesoscale Bernoulli potential. The solution, which we derive in analytic form, is used to construct the momentum and thickness fluxes. In the latter case, the bolus velocity u∗ is found to contain two types of terms: the first type entails the gradient of the mean potential vorticity and represents a positive contribution to the production of mesoscale potential energy; the second type of terms, which is new, entails the velocity of the mean flow and represents a negative contribution to the production of mesoscale potential energy, or equivalently, a backscatter process whereby a fraction of the mesoscale potential energy is returned to the original reservoir of mean potential energy. This type of terms satisfies the physical description of the additional terms given by [J. Phys. Ocean. 29 (1999) 2442]. The mesoscale flux that enters the momentum equations is also contributed by two types of terms of the same physical nature as those entering the thickness flux. The potential vorticity flux is also shown to contain two types of terms: the first is of the gradient-type while the other terms entail the velocity of the mean flow. An expression is derived for the mesoscale
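
    For orientation only, a LaTeX sketch of the widely used Gent-McWilliams closure, which contains just the gradient-type contribution; the model discussed here derives additional terms involving the mean-flow velocity beyond this form. Symbols (not taken from the abstract): κ is the eddy transfer coefficient, b̄ the mean buoyancy, S the isopycnal slope.

```latex
% Standard Gent--McWilliams bolus velocity (gradient-type contribution only):
\mathbf{u}^{*} = -\,\partial_{z}\!\left(\kappa\,\mathbf{S}\right), \qquad
w^{*} = \nabla_{H}\cdot\left(\kappa\,\mathbf{S}\right), \qquad
\mathbf{S} = -\,\frac{\nabla_{H}\bar{b}}{\partial_{z}\bar{b}}
```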

  4. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digest and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol had performance equal to or better than a manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. SAIDE: A Semi-Automated Interface for Hydrogen/Deuterium Exchange Mass Spectrometry.

    Science.gov (United States)

    Villar, Maria T; Miller, Danny E; Fenton, Aron W; Artigues, Antonio

    2010-01-01

    Deuterium/hydrogen exchange in combination with mass spectrometry (DH MS) is a sensitive technique for detection of changes in protein conformation and dynamics. Since temperature, pH and timing control are the key elements for reliable and efficient measurement of hydrogen/deuterium content in proteins and peptides, we have developed a small, semi-automated deuterium-exchange interface that couples the HPLC pumps to a mass spectrometer. This interface is relatively inexpensive to build, and provides efficient temperature and timing control in all stages of enzyme digestion, HPLC separation and mass analysis of the resulting peptides. We have tested this system with a series of standard tryptic peptides reconstituted in a solvent containing increasing concentrations of deuterium. Our results demonstrate that the use of this interface results in minimal loss of deuterium due to back exchange during HPLC desalting and separation. For peptides reconstituted in a buffer containing 100% deuterium, and assuming that all amide linkages have exchanged hydrogen for deuterium, the maximum loss of deuterium content is only 17% of the label, indicating the loss of only one deuterium atom per peptide.

  6. Interscan variation of semi-automated volumetry of subsolid pulmonary nodules

    International Nuclear Information System (INIS)

    Scholten, Ernst T.; Jong, Pim A. de; Willemink, Martin J.; Mali, Willem P.T.M.; Gietema, Hester A.; Jacobs, Colin; Riel, Sarah van; Ginneken, Bram van; Vliegenthart, Rozemarijn; Oudkerk, Matthijs; Koning, Harry J. de; Horeweg, Nanda; Prokop, Mathias

    2015-01-01

    We aimed to test the interscan variation of semi-automatic volumetry of subsolid nodules (SSNs), as growth evaluation is important for SSN management. From a lung cancer screening trial all SSNs that were stable over at least 3 months were included (N = 44). SSNs were quantified on the baseline CT by two observers using semi-automatic volumetry software for effective diameter, volume, and mass. One observer also measured the SSNs on the second CT 3 months later. Interscan variation was evaluated using Bland-Altman plots. Observer agreement was calculated as intraclass correlation coefficient (ICC). Data are presented as mean (± standard deviation) or median and interquartile range (IQR). A Mann-Whitney U test was used for the analysis of the influence of adjustments on the measurements. Semi-automatic measurements were feasible in all 44 SSNs. The interscan limits of agreement ranged from -12.0 % to 9.7 % for diameter, -35.4 % to 28.6 % for volume and -27.6 % to 30.8 % for mass. Agreement between observers was good with intraclass correlation coefficients of 0.978, 0.957, and 0.968 for diameter, volume, and mass, respectively. Our data suggest that when using our software an increase in mass of 30 % can be regarded as significant growth. (orig.)

  7. Interscan variation of semi-automated volumetry of subsolid pulmonary nodules

    Energy Technology Data Exchange (ETDEWEB)

    Scholten, Ernst T. [University Medical Center, Department of Radiology, Utrecht (Netherlands); Kennemer Gasthuis, Department of Radiology, Haarlem (Netherlands); Jong, Pim A. de; Willemink, Martin J.; Mali, Willem P.T.M.; Gietema, Hester A. [University Medical Center, Department of Radiology, Utrecht (Netherlands); Jacobs, Colin; Riel, Sarah van [Radboud University Medical Center, Diagnostic Image Analysis Group, Nijmegen (Netherlands); Ginneken, Bram van [Radboud University Medical Center, Diagnostic Image Analysis Group, Nijmegen (Netherlands); Fraunhofer MEVIS, Bremen (Germany); Vliegenthart, Rozemarijn [University of Groningen, University Medical Center Groningen, Department of Radiology, Groningen (Netherlands); University of Groningen, University Medical Centre Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Oudkerk, Matthijs [University of Groningen, University Medical Centre Groningen, Center for Medical Imaging-North East Netherlands, Groningen (Netherlands); Koning, Harry J. de [Erasmus Medical Center, Department of Public Health, Rotterdam (Netherlands); Horeweg, Nanda [Erasmus Medical Center, Department of Public Health, Rotterdam (Netherlands); Erasmus Medical Center, Department of Pulmonology, Rotterdam (Netherlands); Prokop, Mathias [Radboud University Medical Center, Department of Radiology, Nijmegen (Netherlands)

    2015-04-01

    We aimed to test the interscan variation of semi-automatic volumetry of subsolid nodules (SSNs), as growth evaluation is important for SSN management. From a lung cancer screening trial all SSNs that were stable over at least 3 months were included (N = 44). SSNs were quantified on the baseline CT by two observers using semi-automatic volumetry software for effective diameter, volume, and mass. One observer also measured the SSNs on the second CT 3 months later. Interscan variation was evaluated using Bland-Altman plots. Observer agreement was calculated as intraclass correlation coefficient (ICC). Data are presented as mean (± standard deviation) or median and interquartile range (IQR). A Mann-Whitney U test was used for the analysis of the influence of adjustments on the measurements. Semi-automatic measurements were feasible in all 44 SSNs. The interscan limits of agreement ranged from -12.0 % to 9.7 % for diameter, -35.4 % to 28.6 % for volume and -27.6 % to 30.8 % for mass. Agreement between observers was good with intraclass correlation coefficients of 0.978, 0.957, and 0.968 for diameter, volume, and mass, respectively. Our data suggest that when using our software an increase in mass of 30 % can be regarded as significant growth. (orig.)
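
    A minimal sketch of the Bland-Altman limits of agreement used above, expressed as percent differences of paired test-retest volumetry; not the authors' code.

```python
import numpy as np

def limits_of_agreement(scan1, scan2):
    """Bland-Altman 95% limits of agreement for paired measurements,
    expressed as percent differences relative to the pair mean."""
    scan1, scan2 = np.asarray(scan1, float), np.asarray(scan2, float)
    pct_diff = 100.0 * (scan2 - scan1) / ((scan1 + scan2) / 2.0)
    mean, sd = pct_diff.mean(), pct_diff.std(ddof=1)
    return mean - 1.96 * sd, mean + 1.96 * sd
```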

  8. Semi-automated measurement of anatomical structures using statistical and morphological priors

    Science.gov (United States)

    Ashton, Edward A.; Du, Tong

    2004-05-01

    Rapid, accurate and reproducible delineation and measurement of arbitrary anatomical structures in medical images is a widely held goal, with important applications in both clinical diagnostics and, perhaps more significantly, pharmaceutical trial evaluation. This process requires the ability first to localize a structure within the body, and then to find a best approximation of the structure's boundaries within a given scan. Structures that are tortuous and small in cross section, such as the hippocampus in the brain or the abdominal aorta, present a particular challenge. Their apparent shape and position can change significantly from slice to slice, and accurate prior shape models for such structures are often difficult to form. In this work, we have developed a system that makes use of both a user-defined shape model and a statistical maximum likelihood classifier to identify and measure structures of this sort in MRI and CT images. Experiments show that this system can reduce analysis time by 75% or more with respect to manual tracing with no loss of precision or accuracy.
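
    A minimal sketch of a per-voxel Gaussian maximum-likelihood classifier of the kind named (the shape-model component is omitted); the class means, variances and priors are assumed inputs, not the paper's parameters.

```python
import numpy as np

def ml_classify(voxels, means, variances, priors=None):
    """Assign each voxel intensity to the tissue class with the highest
    Gaussian (log-)likelihood; `means`/`variances`/`priors` have shape (K,)."""
    voxels = np.asarray(voxels, float)[..., None]  # broadcast over K classes
    log_lik = -0.5 * (np.log(2 * np.pi * variances)
                      + (voxels - means) ** 2 / variances)
    if priors is not None:
        log_lik += np.log(priors)  # turns ML into MAP labeling
    return np.argmax(log_lik, axis=-1)
```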

  9. Differential phenotyping of Brucella species using a newly developed semi-automated metabolic system

    Directory of Open Access Journals (Sweden)

    Appel Bernd

    2010-10-01

    Background: A commercial biotyping system (Taxa Profile™, Merlin Diagnostika) testing the metabolization of various substrates by bacteria was used to determine whether a set of phenotypic features would allow the identification of members of the genus Brucella and their differentiation into species and biovars. Results: A total of 191 different amines, amides, amino acids, other organic acids and heterocyclic and aromatic substrates (Taxa Profile™ A), 191 different mono-, di-, tri- and polysaccharides and sugar derivates (Taxa Profile™ C) and 95 amino peptidase- and protease-reactions, 76 glycosidase-, phosphatase- and other esterase-reactions, and 17 classic reactions (Taxa Profile™ E) were tested with the 23 reference strains representing the currently known species and biovars of Brucella and a collection of 60 field isolates. Based on specific and stable reactions, a 96-well "Brucella identification and typing" plate (Micronaut™) was designed and re-tested on 113 Brucella isolates and a number of closely related bacteria. Brucella species and biovars revealed characteristic metabolic profiles and each strain showed an individual pattern. Due to their typical metabolic profiles, a differentiation of Brucella isolates to the species level could be achieved. The separation of B. canis from B. suis bv 3, however, failed. At the biovar level, B. abortus bv 4, 5, 7 and B. suis bv 1-5 could be discriminated with a specificity of 100%. B. melitensis isolates clustered in a very homogeneous group and could not be resolved according to their assigned biovars. Conclusions: The comprehensive testing of metabolic activity allows cluster analysis within the genus Brucella. The biotyping system developed for the identification of Brucella and differentiation of its species and biovars may replace or at least complement time-consuming tube testing, especially in the case of atypical strains. An easy-to-handle identification software facilitates the
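
    A minimal sketch of cluster analysis over binary metabolic profiles, as named in the conclusions; the Jaccard distance and average linkage are illustrative choices, not necessarily those used with the Taxa Profile™ data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def cluster_profiles(profiles, n_clusters):
    """Hierarchical clustering of binary metabolic profiles
    (isolates x substrates) by Jaccard distance between reaction patterns."""
    d = pdist(np.asarray(profiles, bool), metric="jaccard")
    tree = linkage(d, method="average")
    return fcluster(tree, t=n_clusters, criterion="maxclust")
```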

  10. Magnetic resonance image segmentation using semi-automated software for quantification of knee articular cartilage - initial evaluation of a technique for paired scans

    International Nuclear Information System (INIS)

    Brem, M.H.; Lang, P.K.; Neumann, G.; Schlechtweg, P.M.; Yoshioka, H.; Pappas, G.; Duryea, J.; Schneider, E.; Jackson, R.; Yu, J.; Eaton, C.B.; Hennig, F.F.

    2009-01-01

    Software-based image analysis is important for studies of cartilage changes in knee osteoarthritis (OA). This study describes an evaluation of a semi-automated cartilage segmentation software tool capable of quantifying paired images for potential use in longitudinal studies of knee OA. We describe the methodology behind the analysis and demonstrate its use by determination of test-retest analysis precision of duplicate knee magnetic resonance imaging (MRI) data sets. Test-retest knee MR images of 12 subjects with a range of knee health were evaluated from the Osteoarthritis Initiative (OAI) pilot MR study. Each subject was removed from the magnet between the two scans. The 3D DESS (sagittal, 0.456 mm x 0.365 mm, 0.7 mm slice thickness, TR 16.5 ms, TE 4.7 ms) images were obtained on a 3-T Siemens Trio MR system with a USA Instruments quadrature transmit-receive extremity coil. Segmentation of one 3D-image series was first performed and then the corresponding retest series was segmented by viewing both image series concurrently in two adjacent windows. After manual registration of the series, the first segmentation cartilage outline served as an initial estimate for the second segmentation. We evaluated morphometric measures of the bone and cartilage surface area (tAB and AC), cartilage volume (VC), and mean thickness (ThC.me) for medial/lateral tibia (MT/LT), total femur (F) and patella (P). Test-retest reproducibility was assessed using the root-mean square coefficient of variation (RMS CV%). For the paired analyses, RMS CV % ranged from 0.9% to 1.2% for VC, from 0.3% to 0.7% for AC, from 0.6% to 2.7% for tAB and 0.8% to 1.5% for ThC.me. Paired image analysis improved the measurement precision of cartilage segmentation. Our results are in agreement with other publications supporting the use of paired analysis for longitudinal studies of knee OA. (orig.)

  11. Magnetic resonance image segmentation using semi-automated software for quantification of knee articular cartilage - initial evaluation of a technique for paired scans

    Energy Technology Data Exchange (ETDEWEB)

    Brem, M.H. [Brigham and Women' s Hospital, Department of Radiology, Boston, MA (United States); Friedrich-Alexander-University Erlangen Nurenberg, Division of Orthopaedic and Trauma Surgery, Department of Surgery, Erlangen (Germany); Lang, P.K.; Neumann, G.; Schlechtweg, P.M.; Yoshioka, H.; Pappas, G.; Duryea, J. [Brigham and Women' s Hospital, Department of Radiology, Boston, MA (United States); Schneider, E. [LLC, SciTrials, Rocky River, OH (United States); Cleveland Clinic, Imaging Institute, Cleveland, OH (United States); Jackson, R.; Yu, J. [Ohio State University, Diabetes and Metabolism and Radiology, Department of Endocrinology, Columbus, OH (United States); Eaton, C.B. [Center for Primary Care and Prevention and the Warren Alpert Medical School of Brown University, Memorial Hospital of Rhode Island, Providence, RI (United States); Hennig, F.F. [Friedrich-Alexander-University Erlangen Nurenberg, Division of Orthopaedic and Trauma Surgery, Department of Surgery, Erlangen (Germany)

    2009-05-15

    Software-based image analysis is important for studies of cartilage changes in knee osteoarthritis (OA). This study describes an evaluation of a semi-automated cartilage segmentation software tool capable of quantifying paired images for potential use in longitudinal studies of knee OA. We describe the methodology behind the analysis and demonstrate its use by determination of test-retest analysis precision of duplicate knee magnetic resonance imaging (MRI) data sets. Test-retest knee MR images of 12 subjects with a range of knee health were evaluated from the Osteoarthritis Initiative (OAI) pilot MR study. Each subject was removed from the magnet between the two scans. The 3D DESS (sagittal, 0.456 mm x 0.365 mm, 0.7 mm slice thickness, TR 16.5 ms, TE 4.7 ms) images were obtained on a 3-T Siemens Trio MR system with a USA Instruments quadrature transmit-receive extremity coil. Segmentation of one 3D-image series was first performed and then the corresponding retest series was segmented by viewing both image series concurrently in two adjacent windows. After manual registration of the series, the first segmentation cartilage outline served as an initial estimate for the second segmentation. We evaluated morphometric measures of the bone and cartilage surface area (tAB and AC), cartilage volume (VC), and mean thickness (ThC.me) for medial/lateral tibia (MT/LT), total femur (F) and patella (P). Test-retest reproducibility was assessed using the root-mean square coefficient of variation (RMS CV%). For the paired analyses, RMS CV % ranged from 0.9% to 1.2% for VC, from 0.3% to 0.7% for AC, from 0.6% to 2.7% for tAB and 0.8% to 1.5% for ThC.me. Paired image analysis improved the measurement precision of cartilage segmentation. Our results are in agreement with other publications supporting the use of paired analysis for longitudinal studies of knee OA. (orig.)
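
    A minimal sketch of the test-retest precision statistic reported (root-mean-square coefficient of variation across subjects); not the authors' code.

```python
import numpy as np

def rms_cv_percent(test, retest):
    """RMS CV% across subjects for duplicate (test-retest) measurements."""
    test, retest = np.asarray(test, float), np.asarray(retest, float)
    pair = np.stack([test, retest], axis=1)        # (n_subjects, 2)
    cv = pair.std(axis=1, ddof=1) / pair.mean(axis=1)
    return 100.0 * np.sqrt(np.mean(cv ** 2))
```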

  12. Semi-automated segmentation of the sigmoid and descending colon for radiotherapy planning using the fast marching method

    International Nuclear Information System (INIS)

    Losnegaard, Are; Hodneland, Erlend; Lundervold, Arvid; Hysing, Liv Bolstad; Muren, Ludvig Paul

    2010-01-01

    A fast and accurate segmentation of organs at risk, such as the healthy colon, would be of benefit for planning of radiotherapy, in particular in an adaptive scenario. For the treatment of pelvic tumours, a great challenge is the segmentation of the most adjacent and sensitive parts of the gastrointestinal tract, the sigmoid and descending colon. We propose a semi-automated method to segment these bowel parts using the fast marching (FM) method. Standard 3D computed tomography (CT) image data obtained from routine radiotherapy planning were used. Our pre-processing steps distinguish the intestine, muscles and air from connective tissue. The core part of our method separates the sigmoid and descending colon from the muscles and other segments of the intestine. This is done by utilizing the ability of the FM method to compute a specified minimal energy functional integrated along a path, and thereby extracting the colon centre line between user-defined control points in the sigmoid and descending colon. Further, we reconstruct the tube-shaped geometry of the sigmoid and descending colon by fitting ellipsoids to points on the path and by adding adjacent voxels that are likely voxels belonging to these bowel parts. Our results were compared to manually outlined sigmoid and descending colon, and evaluated using the Dice coefficient (DC). Tests on 11 patients gave an average DC of 0.83 (±0.07) with little user interaction. We conclude that the proposed method makes it possible to fast and accurately segment the sigmoid and descending colon from routine CT image data.
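
    The fast marching method extracts a path minimizing an integrated cost between user-defined control points; as a discrete stand-in (not the FM solver itself), the sketch below uses scikit-image's minimal-cost path routine on a cost volume assumed to be low inside the colon lumen.

```python
import numpy as np
from skimage.graph import route_through_array

def centre_line(cost_volume, start, end):
    """Minimal-cost path between two user-placed control points.
    `cost_volume` must be positive and low inside the colon lumen
    (e.g. derived from the pre-processed CT intensities)."""
    path, total_cost = route_through_array(
        cost_volume, start, end, fully_connected=True)
    return np.array(path), total_cost
```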

  13. Development of a semi-automated method for subspecialty case distribution and prediction of intraoperative consultations in surgical pathology

    Directory of Open Access Journals (Sweden)

    Raul S Gonzalez

    2015-01-01

    Background: In many surgical pathology laboratories, operating room schedules are prospectively reviewed to determine specimen distribution to different subspecialty services and to predict the number and nature of potential intraoperative consultations for which prior medical records and slides require review. At our institution, such schedules were manually converted into easily interpretable, surgical pathology-friendly reports to facilitate these activities. This conversion, however, was time-consuming and arguably a non-value-added activity. Objective: Our goal was to develop a semi-automated method of generating these reports that improved their readability while taking less time to perform than the manual method. Materials and Methods: A dynamic Microsoft Excel workbook was developed to automatically convert published operating room schedules into different tabular formats. Based on the surgical procedure descriptions in the schedule, a list of linked keywords and phrases was utilized to sort cases by subspecialty and to predict potential intraoperative consultations. After two trial-and-optimization cycles, the method was incorporated into standard practice. Results: The workbook distributed cases to appropriate subspecialties and accurately predicted intraoperative requests. Users indicated that they spent 1-2 h fewer per day on this activity than before, and team members preferred the formatting of the newer reports. Comparison of the manual and semi-automatic predictions showed that the mean daily difference in predicted versus actual intraoperative consultations underwent no statistically significant changes before and after implementation for most subspecialties. Conclusions: A well-designed, lean, and simple information technology solution to determine subspecialty case distribution and prediction of intraoperative consultations in surgical pathology is approximately as accurate as the gold standard manual method and requires less time.
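
    A minimal sketch of the keyword-matching logic described; the real workbook is Excel-based and its keyword lists are institution-specific, so the map below is hypothetical.

```python
# Hypothetical keyword map; real lists are institution-specific.
SUBSPECIALTY_KEYWORDS = {
    "thoracic": ["lobectomy", "wedge resection", "mediastin"],
    "gi": ["colectomy", "whipple", "esophag"],
    "gyn": ["hysterectomy", "oophorectomy"],
}
FROZEN_KEYWORDS = ["frozen", "margin", "sentinel node"]

def triage(procedure_description):
    """Assign a case to a subspecialty and flag a likely intraoperative
    consultation from the free-text procedure description."""
    text = procedure_description.lower()
    service = next((s for s, kws in SUBSPECIALTY_KEYWORDS.items()
                    if any(k in text for k in kws)), "general")
    frozen_expected = any(k in text for k in FROZEN_KEYWORDS)
    return service, frozen_expected
```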

  14. SEMI-AUTOMATED APPROACH FOR MAPPING URBAN TREES FROM INTEGRATED AERIAL LiDAR POINT CLOUD AND DIGITAL IMAGERY DATASETS

    Directory of Open Access Journals (Sweden)

    M. A. Dogon-Yaro

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from such detailed, up-to-date data sources. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial cost, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to the predominant studies on tree extraction, mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper reveals that the integrated datasets are a suitable technology and viable source of information for urban tree management. As a conclusion, therefore, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.
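
    A minimal sketch of one common fusion rule for this kind of extraction (vegetation from image NDVI combined with height above ground from a LiDAR-derived normalized DSM); the thresholds and band inputs are assumptions, not the paper's exact workflow.

```python
import numpy as np

def tree_mask(nir, red, ndsm, ndvi_min=0.3, height_min=2.0):
    """Candidate urban-tree mask: vegetated (NDVI from the multispectral
    image) AND elevated above ground (normalized DSM from LiDAR, metres)."""
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)  # avoid divide-by-zero
    return (ndvi >= ndvi_min) & (ndsm >= height_min)
```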

  15. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    International Nuclear Information System (INIS)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc; Hart, A. John

    2013-01-01

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to their relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as restricted heating and cooling rates, narrow the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  16. MO-D-213-07: RadShield: Semi-Automated Calculation of Air Kerma Rate and Barrier Thickness

    International Nuclear Information System (INIS)

    DeLorenzo, M; Wu, D; Rutel, I; Yang, K

    2015-01-01

    Purpose: To develop the first Java-based semi-automated calculation program intended to aid professional radiation shielding design. Air-kerma rate and barrier thickness calculations are performed by implementing the NCRP Report 147 formalism in a Graphical User Interface (GUI). The ultimate aim of this newly created software package is to reduce errors and improve radiographic and fluoroscopic room designs over manual approaches. Methods: Floor plans are first imported as images into the RadShield software program. These plans serve as templates for drawing barriers, occupied regions and x-ray tube locations. We have implemented sub-GUIs that allow the specification, for regions and equipment, of occupancy factors, design goals, numbers of patients, primary beam directions, source-to-patient distances and workload distributions. Once the user enters the above parameters, the program automatically calculates the air-kerma rate at sampled points beyond all barriers. For each sample point, a corresponding minimum barrier thickness is calculated to meet the design goal. RadShield allows control over preshielding, sample point location and material types. Results: A functional GUI package was developed and tested. Examination of sample walls and source distributions yields a maximum percent difference of less than 0.1% between hand-calculated air-kerma rates and RadShield. Conclusion: The initial results demonstrated that RadShield calculates air-kerma rates and required barrier thicknesses with reliable accuracy and can be used to make radiation shielding design more efficient and accurate. This newly developed approach differs from conventional calculation methods in that it finds air-kerma rates and thickness requirements for many points outside the barriers, stores the information, and selects the largest value needed to comply with NCRP Report 147 design goals. Floor plans, parameters, designs and reports can be saved and accessed later for modification and recalculation.

  17. MO-D-213-07: RadShield: Semi-Automated Calculation of Air Kerma Rate and Barrier Thickness

    Energy Technology Data Exchange (ETDEWEB)

    DeLorenzo, M [Oklahoma University Health Sciences Center, Oklahoma City, OK (United States); Wu, D [University of Oklahoma Health Sciences Center, Oklahoma City, Ok (United States); Rutel, I [University of Oklahoma Health Science Center, Oklahoma City, OK (United States); Yang, K [Massachusetts General Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: To develop the first Java-based semi-automated calculation program intended to aid professional radiation shielding design. Air-kerma rate and barrier thickness calculations are performed by implementing the NCRP Report 147 formalism in a Graphical User Interface (GUI). The ultimate aim of this newly created software package is to reduce errors and improve radiographic and fluoroscopic room designs over manual approaches. Methods: Floor plans are first imported as images into the RadShield software program. These plans serve as templates for drawing barriers, occupied regions and x-ray tube locations. We have implemented sub-GUIs that allow the specification, for regions and equipment, of occupancy factors, design goals, numbers of patients, primary beam directions, source-to-patient distances and workload distributions. Once the user enters the above parameters, the program automatically calculates the air-kerma rate at sampled points beyond all barriers. For each sample point, a corresponding minimum barrier thickness is calculated to meet the design goal. RadShield allows control over preshielding, sample point location and material types. Results: A functional GUI package was developed and tested. Examination of sample walls and source distributions yields a maximum percent difference of less than 0.1% between hand-calculated air-kerma rates and RadShield. Conclusion: The initial results demonstrated that RadShield calculates air-kerma rates and required barrier thicknesses with reliable accuracy and can be used to make radiation shielding design more efficient and accurate. This newly developed approach differs from conventional calculation methods in that it finds air-kerma rates and thickness requirements for many points outside the barriers, stores the information, and selects the largest value needed to comply with NCRP Report 147 design goals. Floor plans, parameters, designs and reports can be saved and accessed later for modification and recalculation.
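
    A minimal sketch of the NCRP Report 147 quantities involved: the transmission goal B for a barrier and the Archer-model inversion for thickness. The fitting parameters alpha, beta, gamma are material- and workload-specific values taken from published tables, not computed here.

```python
import math

def required_transmission(P, d, N, T, K1):
    """NCRP 147-style transmission goal: design goal P (mGy/wk), distance
    d (m), N patients/wk, occupancy factor T, and K1 = unshielded air kerma
    per patient at 1 m (mGy)."""
    return P * d ** 2 / (K1 * N * T)

def archer_thickness(B, alpha, beta, gamma):
    """Invert the Archer broad-beam transmission model
    B(x) = [(1 + beta/alpha) * exp(alpha*gamma*x) - beta/alpha]^(-1/gamma)
    for barrier thickness x (units of 1/alpha, typically mm)."""
    return (1.0 / (alpha * gamma)) * math.log(
        (B ** (-gamma) + beta / alpha) / (1.0 + beta / alpha))
```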

  18. Semi-automated algorithm for localization of dermal/epidermal junction in reflectance confocal microscopy images of human skin

    Science.gov (United States)

    Kurugol, Sila; Dy, Jennifer G.; Rajadhyaksha, Milind; Gossage, Kirk W.; Weissmann, Jesse; Brooks, Dana H.

    2011-03-01

    The examination of the dermis/epidermis junction (DEJ) is clinically important for skin cancer diagnosis. Reflectance confocal microscopy (RCM) is an emerging tool for detection of skin cancers in vivo. However, visual localization of the DEJ in RCM images, with high accuracy and repeatability, is challenging, especially in fair skin, due to low contrast, heterogeneous structure and high inter- and intra-subject variability. We recently proposed a semi-automated algorithm to localize the DEJ in z-stacks of RCM images of fair skin, based on feature segmentation and classification. Here we extend the algorithm to dark skin. The extended algorithm first decides the skin type and then applies the appropriate DEJ localization method. In dark skin, strong backscatter from the pigment melanin causes the basal cells above the DEJ to appear with high contrast. To locate those high contrast regions, the algorithm operates on small tiles (regions) and finds the peaks of the smoothed average intensity depth profile of each tile. However, for some tiles, due to heterogeneity, multiple peaks in the depth profile exist and the strongest peak might not be the basal layer peak. To select the correct peak, basal cells are represented with a vector of texture features. The peak with most similar features to this feature vector is selected. The results show that the algorithm detected the skin types correctly for all 17 stacks tested (8 fair, 9 dark). The DEJ detection algorithm achieved an average distance from the ground truth DEJ surface of around 4.7 μm for dark skin and around 7-14 μm for fair skin.
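
    A minimal sketch of the per-tile depth-profile step described (smooth the average intensity versus depth, then find candidate peaks); the tile layout and smoothing width are assumptions, and the texture-feature peak selection is not shown.

```python
from scipy.signal import find_peaks
from scipy.ndimage import gaussian_filter1d

def candidate_basal_peaks(stack, tile, sigma=2.0):
    """For one tile (y0, y1, x0, x1) of an RCM z-stack (z, y, x), compute the
    smoothed average-intensity depth profile and return candidate peak depths;
    the correct basal-layer peak is then chosen via texture features."""
    y0, y1, x0, x1 = tile
    profile = stack[:, y0:y1, x0:x1].mean(axis=(1, 2))
    smoothed = gaussian_filter1d(profile, sigma)
    peaks, _ = find_peaks(smoothed)
    return peaks, smoothed
```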

  19. Robofurnace: A semi-automated laboratory chemical vapor deposition system for high-throughput nanomaterial synthesis and process discovery

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, C. Ryan; Westrick, William; Koehler, Jeremy; Brieland-Shoultz, Anna; Anagnostopoulos-Politis, Ilias; Cruz-Gonzalez, Tizoc [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Hart, A. John, E-mail: ajhart@mit.edu [Department of Mechanical Engineering, University of Michigan, Ann Arbor, Michigan 48109 (United States); Department of Mechanical Engineering, Massachusetts Institute of Technology, Cambridge, Massachusetts 02139 (United States)

    2013-11-15

    Laboratory research and development on new materials, such as nanostructured thin films, often utilizes manual equipment such as tube furnaces due to their relatively low cost and ease of setup. However, these systems can be prone to inconsistent outcomes due to variations in standard operating procedures, and limitations in performance, such as restricted heating and cooling rates, narrow the parameter space that can be explored. Perhaps more importantly, maximization of research throughput, and the successful and efficient translation of materials processing knowledge to production-scale systems, rely on the attainment of consistent outcomes. In response to this need, we present a semi-automated lab-scale chemical vapor deposition (CVD) furnace system, called “Robofurnace.” Robofurnace is an automated CVD system built around a standard tube furnace, which automates sample insertion and removal and uses motion of the furnace to achieve rapid heating and cooling. The system has a 10-sample magazine and motorized transfer arm, which isolates the samples from the lab atmosphere and enables highly repeatable placement of the sample within the tube. The system is designed to enable continuous operation of the CVD reactor, with asynchronous loading/unloading of samples. To demonstrate its performance, Robofurnace is used to develop a rapid CVD recipe for carbon nanotube (CNT) forest growth, achieving a 10-fold improvement in CNT forest mass density compared to a benchmark recipe using a manual tube furnace. In the long run, multiple systems like Robofurnace may be linked to share data among laboratories by methods such as Twitter. Our hope is that Robofurnace and similar automation will enable machine learning to optimize and discover relationships in complex material synthesis processes.

  20. Semi-Automated Approach for Mapping Urban Trees from Integrated Aerial LiDAR Point Cloud and Digital Imagery Datasets

    Science.gov (United States)

    Dogon-Yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.

    2016-09-01

    Mapping of trees plays an important role in modern urban spatial data management, as many benefits and applications derive from such detailed, up-to-date data sources. Timely and accurate acquisition of information on the condition of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building up strategies for sustainable development. The conventional techniques used for extracting trees include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work and high financial cost, which can be overcome by means of integrated LiDAR and digital image datasets. Compared to the predominant studies on tree extraction, mainly in purely forested areas, this study concentrates on urban areas, which have a high structural complexity with a multitude of different objects. This paper presents a workflow for a semi-automated approach to extracting urban trees from integrated processing of airborne LiDAR point cloud and multispectral digital image datasets over the city of Istanbul, Turkey. The paper reveals that the integrated datasets are a suitable technology and viable source of information for urban tree management. As a conclusion, therefore, the extracted information provides a snapshot of the location, composition and extent of trees in the study area, useful to city planners and other decision makers in order to understand how much canopy cover exists, identify new planting, removal, or reforestation opportunities, and determine which locations have the greatest need or potential to maximize the benefits of return on investment. It can also help track trends or changes to the urban trees over time and inform future management decisions.

  1. Mesoscale analysis of failure in quasi-brittle materials: comparison between lattice model and acoustic emission data.

    Science.gov (United States)

    Grégoire, David; Verdon, Laura; Lefort, Vincent; Grassl, Peter; Saliba, Jacqueline; Regoin, Jean-Pierre; Loukili, Ahmed; Pijaudier-Cabot, Gilles

    2015-10-25

    The purpose of this paper is to analyse the development and the evolution of the fracture process zone during fracture and damage in quasi-brittle materials. A model taking into account the material details at the mesoscale is used to describe the failure process at the scale of the heterogeneities. This model is used to compute histograms of the relative distances between damaged points. These numerical results are compared with experimental data, where the damage evolution is monitored using acoustic emissions. Histograms of the relative distances between damage events in the numerical calculations and acoustic events in the experiments exhibit good agreement. It is shown that the mesoscale model provides relevant information from the point of view of both global responses and the local failure process. © 2015 The Authors. International Journal for Numerical and Analytical Methods in Geomechanics published by John Wiley & Sons Ltd.

  2. Semi-automated segmentation and quantification of adipose tissue in calf and thigh by MRI: a preliminary study in patients with monogenic metabolic syndrome

    International Nuclear Information System (INIS)

    Al-Attar, Salam A; Pollex, Rebecca L; Robinson, John F; Miskie, Brooke A; Walcarius, Rhonda; Rutt, Brian K; Hegele, Robert A

    2006-01-01

    With the growing prevalence of obesity and metabolic syndrome, reliable quantitative imaging methods for adipose tissue are required. Monogenic forms of the metabolic syndrome include Dunnigan-variety familial partial lipodystrophy subtypes 2 and 3 (FPLD2 and FPLD3), which are characterized by the loss of subcutaneous fat in the extremities. Through magnetic resonance imaging (MRI) of FPLD patients, we have developed a method of quantifying the core FPLD anthropometric phenotype, namely adipose tissue in the mid-calf and mid-thigh regions. Four female subjects, including an FPLD2 subject (LMNA R482Q), an FPLD3 subject (PPARG F388L), and two control subjects, were selected for MRI and analysis. MRI scans of subjects were performed on a 1.5T GE MR Medical system, with 17 transaxial slices comprising a 51 mm section obtained in both the mid-calf and mid-thigh regions. Using ImageJ 1.34n software, analysis of raw MR images involved the creation of a connectedness map of the subcutaneous adipose tissue contours within the lower limb segment from a user-defined seed point. Quantification of the adipose tissue was then obtained by thresholding the connectedness map and counting the voxels (volumetric pixels) present within the specified region. MR images revealed significant differences in the amounts of subcutaneous adipose tissue in the lower limb segments of the FPLD3 and FPLD2 subjects: respectively, mid-calf, 15.5% and 0%, and mid-thigh, 25.0% and 13.3%. In comparison, the older and younger healthy controls had values, respectively, of mid-calf, 32.5% and 26.2%, and mid-thigh, 52.2% and 36.1%. The FPLD2 patient had significantly reduced subcutaneous adipose tissue compared to the FPLD3 patient. Thus, semi-automated quantification of adipose tissue of the lower extremity can detect differences between individuals of various lipodystrophy genotypes and represents a potentially useful tool for extended quantitative phenotypic analysis of other genetic metabolic disorders.
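
    The voxel-counting step described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' ImageJ workflow; the function and argument names are hypothetical:

        from scipy import ndimage

        def adipose_volume(image, seed, threshold, voxel_volume_ml):
            # Voxels above the intensity threshold are candidate adipose tissue.
            mask = image > threshold
            # Label connected components; keep the one containing the seed point.
            labels, _ = ndimage.label(mask)
            if labels[seed] == 0:
                raise ValueError("seed does not lie in a thresholded region")
            region = labels == labels[seed]
            # Volume follows from the voxel count and the volume of one voxel.
            return int(region.sum()) * voxel_volume_ml

    The percentage figures quoted in the abstract would then correspond to this connected-region volume expressed relative to the total limb-segment volume.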

  3. Mesoscale Connections Summer 2017

    Energy Technology Data Exchange (ETDEWEB)

    Kippen, Karen Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Bourke, Mark Andrew M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-06-21

    Our challenge derives from the fact that in metals or explosives grains, interfaces and defects control engineering performance in ways that are neither amenable to continuum codes (which fail to rigorously describe the heterogeneities derived from microstructure) nor computationally tractable to first principles atomistic calculations. This is a region called the mesoscale, which stands at the frontier of our desire to translate fundamental science insights into confidence in aging system performance over the range of extreme conditions relevant in a nuclear weapon. For dynamic problems, the phenomena of interest can require extremely good temporal resolutions. A shock wave traveling at 1000 m/s (or 1 mm/μs) passes through a grain with a diameter of 1 micron in a nanosecond (10⁻⁹ s). Thus, to observe the mesoscale phenomena—such as dislocations or phase transformations—as the shock passes, temporal resolution better than picoseconds (10⁻¹² s) may be needed. As we anticipate the science challenges over the next decade, experimental insights on material performance at the micron spatial scale with picosecond temporal resolution—at the mesoscale—are a clear challenge. This is a challenge fit for Los Alamos in partnership with our sister labs and academia. Mesoscale Connections will draw attention to our progress as we tackle the mesoscale challenge. We hope you like it and encourage suggestions of content you are interested in.

  4. Fast and accurate semi-automated segmentation method of spinal cord MR images at 3T applied to the construction of a cervical spinal cord template.

    Directory of Open Access Journals (Sweden)

    Mohamed-Mounir El Mendili

    Full Text Available To design a fast and accurate semi-automated segmentation method for spinal cord 3T MR images and to construct a template of the cervical spinal cord. A semi-automated double threshold-based method (DTbM) was proposed enabling both cross-sectional and volumetric measures from 3D T2-weighted turbo spin echo MR scans of the spinal cord at 3T. Eighty-two healthy subjects, 10 patients with amyotrophic lateral sclerosis, 10 with spinal muscular atrophy and 10 with spinal cord injuries were studied. DTbM was compared with active surface method (ASM), threshold-based method (TbM) and manual outlining (ground truth). Accuracy of segmentations was scored visually by a radiologist in cervical and thoracic cord regions. Accuracy was also quantified at the cervical and thoracic levels as well as at C2 vertebral level. To construct a cervical template from healthy subjects' images (n=59), a standardization pipeline was designed leading to well-centered straight spinal cord images and accurate probability tissue map. Visual scoring showed better performance for DTbM than for ASM. Mean Dice similarity coefficient (DSC) was 95.71% for DTbM and 90.78% for ASM at the cervical level and 94.27% for DTbM and 89.93% for ASM at the thoracic level. Finally, at C2 vertebral level, mean DSC was 97.98% for DTbM compared with 98.02% for TbM and 96.76% for ASM. DTbM showed similar accuracy compared with TbM, but with the advantage of limited manual interaction. A semi-automated segmentation method with limited manual intervention was introduced and validated on 3T images, enabling the construction of a cervical spinal cord template.

  5. Fast and accurate semi-automated segmentation method of spinal cord MR images at 3T applied to the construction of a cervical spinal cord template.

    Science.gov (United States)

    El Mendili, Mohamed-Mounir; Chen, Raphaël; Tiret, Brice; Villard, Noémie; Trunet, Stéphanie; Pélégrini-Issac, Mélanie; Lehéricy, Stéphane; Pradat, Pierre-François; Benali, Habib

    2015-01-01

    To design a fast and accurate semi-automated segmentation method for spinal cord 3T MR images and to construct a template of the cervical spinal cord. A semi-automated double threshold-based method (DTbM) was proposed enabling both cross-sectional and volumetric measures from 3D T2-weighted turbo spin echo MR scans of the spinal cord at 3T. Eighty-two healthy subjects, 10 patients with amyotrophic lateral sclerosis, 10 with spinal muscular atrophy and 10 with spinal cord injuries were studied. DTbM was compared with active surface method (ASM), threshold-based method (TbM) and manual outlining (ground truth). Accuracy of segmentations was scored visually by a radiologist in cervical and thoracic cord regions. Accuracy was also quantified at the cervical and thoracic levels as well as at C2 vertebral level. To construct a cervical template from healthy subjects' images (n=59), a standardization pipeline was designed leading to well-centered straight spinal cord images and accurate probability tissue map. Visual scoring showed better performance for DTbM than for ASM. Mean Dice similarity coefficient (DSC) was 95.71% for DTbM and 90.78% for ASM at the cervical level and 94.27% for DTbM and 89.93% for ASM at the thoracic level. Finally, at C2 vertebral level, mean DSC was 97.98% for DTbM compared with 98.02% for TbM and 96.76% for ASM. DTbM showed similar accuracy compared with TbM, but with the advantage of limited manual interaction. A semi-automated segmentation method with limited manual intervention was introduced and validated on 3T images, enabling the construction of a cervical spinal cord template.
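
    The Dice similarity coefficient used above to score segmentation accuracy is simple to compute from two binary masks; a minimal Python sketch with hypothetical names, not code from the study:

        import numpy as np

        def dice_coefficient(mask_a, mask_b):
            # DSC = 2|A ∩ B| / (|A| + |B|), in [0, 1]; often reported as a percentage.
            a = np.asarray(mask_a, dtype=bool)
            b = np.asarray(mask_b, dtype=bool)
            intersection = np.logical_and(a, b).sum()
            return 2.0 * intersection / (a.sum() + b.sum())

    A DSC of 95.71%, as reported for DTbM at the cervical level, corresponds to dice_coefficient(...) = 0.9571 against the manual ground-truth mask.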

  6. On the Development and Use of Four-Dimensional Data Assimilation in Limited-Area Mesoscale Models Used for Meteorological Analysis.

    Science.gov (United States)

    Stauffer, David R.

    1990-01-01

    The application of dynamic relationships to the analysis problem for the atmosphere is extended to use a full-physics limited-area mesoscale model as the dynamic constraint. A four-dimensional data assimilation (FDDA) scheme based on Newtonian relaxation or "nudging" is developed and evaluated in the Penn State/National Center for Atmospheric Research (PSU/NCAR) mesoscale model, which is used here as a dynamic-analysis tool. The thesis is to determine what assimilation strategies and what meteorological fields (mass, wind or both) have the greatest positive impact on the 72-h numerical simulations (dynamic analyses) of two mid-latitude, real-data cases. The basic FDDA methodology is tested in a 10-layer version of the model with a bulk-aerodynamic (single-layer) representation of the planetary boundary layer (PBL), and refined in a 15-layer version of the model by considering the effects of data assimilation within a multi-layer PBL scheme. As designed, the model solution can be relaxed toward either gridded analyses ("analysis nudging"), or toward the actual observations ("obs nudging"). The data used for assimilation include standard 12-hourly rawinsonde data, and also 3-hourly mesoalpha-scale surface data which are applied within the model's multi-layer PBL. Continuous assimilation of standard-resolution rawinsonde data into the 10-layer model successfully reduced large-scale amplitude and phase errors while the model realistically simulated mesoscale structures poorly defined or absent in the rawinsonde analyses and in the model simulations without FDDA. Nudging the model fields directly toward the rawinsonde observations generally produced results comparable to nudging toward gridded analyses. This obs-nudging technique is especially attractive for the assimilation of high-frequency, asynoptic data. Assimilation of 3-hourly surface wind and moisture data into the 15-layer FDDA system was most effective for improving the simulated precipitation fields because a
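
    The nudging term itself is a linear relaxation added to the model tendency. A minimal Python sketch of the idea for a scalar variable, assuming a forward-Euler step (illustrative only, not the PSU/NCAR implementation):

        def nudged_step(x, tendency, x_obs, g, dt):
            # dx/dt = F(x) + G * (x_obs - x); G (1/s) sets the relaxation time scale.
            return x + dt * (tendency(x) + g * (x_obs - x))

        # Relax a scalar toward an observed value with a one-hour e-folding time.
        x, x_obs, g = 10.0, 12.0, 1.0 / 3600.0
        for _ in range(24):                    # 24 steps of 5 minutes each
            x = nudged_step(x, lambda s: 0.0, x_obs, g, dt=300.0)

    In "analysis nudging" x_obs comes from gridded analyses, while in "obs nudging" the relaxation is toward individual observations, typically weighted by their distance from the grid point.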

  7. Mesoscale hybrid calibration artifact

    Science.gov (United States)

    Tran, Hy D.; Claudet, Andre A.; Oliver, Andrew D.

    2010-09-07

    A mesoscale calibration artifact, also called a hybrid artifact, suitable for hybrid dimensional measurement, and the method for making the artifact. The hybrid artifact has structural characteristics that make it suitable for dimensional measurement in both vision-based systems and touch-probe-based systems. The hybrid artifact employs the intersection of bulk-micromachined planes to fabricate edges that are sharp to the nanometer level and intersecting planes with crystal-lattice-defined angles.

  8. Reduced fractal model for quantitative analysis of averaged micromotions in mesoscale: Characterization of blow-like signals

    International Nuclear Information System (INIS)

    Nigmatullin, Raoul R.; Toboev, Vyacheslav A.; Lino, Paolo; Maione, Guido

    2015-01-01

    Highlights: •A new approach describes fractal-branched systems with long-range fluctuations. •A reduced fractal model is proposed. •The approach is used to characterize blow-like signals. •The approach is tested on data from different fields. -- Abstract: It has been shown that many micromotions in the mesoscale region are averaged in accordance with their self-similar (geometrical/dynamical) structure. This distinctive feature helps to reduce a wide set of different micromotions describing relaxation/exchange processes to an averaged collective motion, expressed mathematically in a rather general form. This reduction opens new perspectives in the description of different blow-like signals (BLS) in many complex systems. The main characteristic of these signals is a finite duration also when the generalized reduced function is used for their quantitative fitting. As an example, we quantitatively describe available signals generated by people with bronchial asthma, songs of queen bees, and car engine valves operating in the idling regime. We develop a special treatment procedure based on the eigen-coordinates (ECs) method that allows one to justify the generalized reduced fractal model (RFM) for the description of BLS that can propagate in different complex systems. The obtained describing function is based on the self-similar properties of the different considered micromotions. This kind of cooperative model is proposed here for the first time. In spite of the fact that the nature of the dynamic processes that take place in fractal structure on a mesoscale level is not well understood, the parameters of the RFM fitting function can be used for construction of calibration curves, affected by various external/random factors. Then, the calculated set of the fitting parameters of these calibration curves can characterize BLS of different complex systems affected by those factors. Though the method to construct and analyze the calibration curves goes beyond the scope

  9. Delayed shear enhancement in mesoscale atmospheric dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Moran, M.D. [Atmospheric Environment Service, Ontario (Canada); Pielke, R.A. [Colorado State Univ., Fort Collins, CO (United States)

    1994-12-31

    Mesoscale atmospheric dispersion (MAD) is more complicated than smaller-scale dispersion because the mean wind field can no longer be considered steady or horizontally homogeneous over mesoscale time and space scales. Wind shear also plays a much more important role on the mesoscale: horizontal dispersion can be enhanced and often dominated by vertical wind shear on these scales through the interaction of horizontal differential advection and vertical mixing. Just over 30 years ago, Pasquill suggested that this interaction need not be simultaneous and that the combination of differential horizontal advection with delayed or subsequent vertical mixing could maintain effective horizontal diffusion in spite of temporal or spatial reductions in boundary-layer turbulence intensity. This two-step mechanism has not received much attention since then, but a recent analysis of observations from and numerical simulations of two mesoscale tracer experiments suggests that delayed shear enhancement can play an important role in MAD. This paper presents an overview of this analysis, with particular emphasis on the influence of resolvable vertical shear on MAD in these two case studies and the contributions made by delayed shear enhancement.

  10. Development, implementation and outcome analysis of semi-automated alerts for metformin dose adjustment in hospitalized patients with renal impairment.

    Science.gov (United States)

    Niedrig, David; Krattinger, Regina; Jödicke, Annika; Gött, Carmen; Bucklar, Guido; Russmann, Stefan

    2016-10-01

    Overdosing of the oral antidiabetic metformin in impaired renal function is an important contributory cause to life-threatening lactic acidosis. The presented project aimed to quantify and prevent this avoidable medication error in clinical practice. We developed and implemented an algorithm into a hospital's clinical information system that prospectively identifies metformin prescriptions if the estimated glomerular filtration rate is below 60 mL/min. Resulting real-time electronic alerts are sent to clinical pharmacologists and pharmacists, who validate cases in electronic medical records and contact prescribing physicians with recommendations if necessary. The screening algorithm has been used in routine clinical practice for 3 years and generated 2145 automated alerts (about 2 per day). Validated expert recommendations regarding metformin therapy, i.e., dose reduction or stop, were issued for 381 patients (about 3 per week). Follow-up was available for 257 cases, and prescribers' compliance with recommendations was 79%. Furthermore, during 3 years, we identified eight local cases of lactic acidosis associated with metformin therapy in renal impairment that could not be prevented, e.g., because metformin overdosing had occurred before hospitalization. Automated sensitive screening followed by specific expert evaluation and personal recommendations can prevent metformin overdosing in renal impairment with high efficiency and efficacy. Repeated cases of metformin-associated lactic acidosis in renal impairment underline the clinical relevance of this medication error. Our locally developed and customized alert system is a successful proof of concept for a proactive clinical drug safety program that is now expanded to other clinically and economically relevant medication errors. Copyright © 2016 John Wiley & Sons, Ltd.
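
    The screening rule is straightforward to express in code. A minimal Python sketch with hypothetical record layouts (the production system runs inside the hospital's clinical information system, not as a standalone script):

        def metformin_alerts(prescriptions, latest_egfr, threshold=60.0):
            # Flag active metformin prescriptions whose patient's latest
            # eGFR (mL/min) is below the threshold, for expert review.
            alerts = []
            for rx in prescriptions:
                if rx["drug"].lower() != "metformin":
                    continue
                egfr = latest_egfr.get(rx["patient_id"])
                if egfr is not None and egfr < threshold:
                    alerts.append({"patient_id": rx["patient_id"], "egfr": egfr})
            return alerts

    In the deployed system each alert is then validated against the electronic medical record by a clinical pharmacologist or pharmacist before any recommendation reaches the prescriber.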

  11. Quantification of filamentation by uropathogenic Escherichia coli during experimental bladder cell infection by using semi-automated image analysis

    DEFF Research Database (Denmark)

    Klein, Kasper; Palarasah, Yaseelan; Kolmos, Hans Jørn

    2015-01-01

    in focus-stacked microscopy images. Used in combination with a flow-chamber based in vitro cystitis model, we study the factors involved in filament formation by uropathogenic E. coli (UPEC) during infection. The influence of substratum surface, intracellular proliferation and flow media on UPEC...... rod-shaped cells. Evidence has emerged over the past decade suggesting that this morphological transformation is controlled and reversible and provides selective advantages under certain growth conditions, such as during infection in humans. In order to identify the factors which induce filamentation...... filamentation is evaluated. We show that reversible UPEC filamentation during cystitis is not dependent on intracellular infection, which previous studies have suggested. Instead, we find that filamentation can be induced by contact with surfaces, both biological and artificial. Lastly our data indicate...

  12. Integrated biochemical, molecular genetic, and bioacoustical analysis of mesoscale variability of the euphausiid Nematoscelis difficilis in the California Current

    Science.gov (United States)

    Bucklin, Ann; Wiebe, Peter H.; Smolenack, Sara B.; Copley, Nancy J.; Clarke, M. Elizabeth

    2002-03-01

    Integrated assessment of the euphausiid Nematoscelis difficilis (Crustacea; Euphausiacea) and the zooplankton assemblage of the California Current was designed to investigate individual, population, and community responses to mesoscale variability in biological and physical characters of the ocean. Zooplankton samples and observational data were collected along a cross-shelf transect of the California Current in association with the California Cooperative Fisheries Investigations (CalCOFI) Survey during October 1996. The transect crossed three domains defined by temperature and salinity: nearshore, mid-Current, and offshore. Individual N. difficilis differed in physiological condition along the transect, with higher size-corrected concentrations of four central metabolic enzymes (citrate synthetase, hexokinase, lactate dehydrogenase (LDH), and phosphoglucose isomerase (PGI)) for euphausiids collected in nearshore waters than in mid-Current and offshore waters. There was little variation in the DNA sequences of the genes encoding PGI and LDH (all DNA changes were either silent or heterozygous base substitutions), suggesting that differences in enzyme concentration did not result from underlying molecular genetic variation. The population genetic makeup of N. difficilis varied from sample to sample based on haplotype frequencies of mitochondrial cytochrome oxidase I (mtCOI; P=0.029). There were significant differences between pooled nearshore and offshore samples, based on allele frequencies at two sites of common substitutions in the mtCOI sequence (P=0.020 and 0.026). Silhouette and bioacoustical backscattering measurements of the zooplankton assemblage of the top 100 m showed marked diel vertical migration of the scattering layer, of which euphausiids were a small but significant fraction. The biochemical and molecular assays are used as indices of complex physiological (i.e., growth and condition) and genetic (i.e., mortality) processes; the bioacoustical

  13. A semi-automated method for non-invasive internal organ weight estimation by post-mortem magnetic resonance imaging in fetuses, newborns and children

    International Nuclear Information System (INIS)

    Thayyil, Sudhin; Schievano, Silvia; Robertson, Nicola J.; Jones, Rodney; Chitty, Lyn S.; Sebire, Neil J.; Taylor, Andrew M.

    2009-01-01

    Magnetic resonance (MR) imaging allows minimally invasive autopsy, especially when consent is declined for traditional autopsy. Estimation of individual visceral organ weights is an important component of traditional autopsy. Objective: To examine whether a semi-automated method can be used for non-invasive internal organ weight measurement using post-mortem MR imaging in fetuses, newborns and children. Methods: Phase 1: In vitro scanning of 36 animal organs (heart, liver, kidneys) was performed to check the accuracy of the volume reconstruction methodology. Real volumes were measured by the water displacement method. Phase 2: Sixty-five whole body post-mortem MR scans were performed in fetuses (n = 30), newborns (n = 5) and children (n = 30) at 1.5 T using a 3D TSE T2-weighted sequence. These data were analysed offline using the image processing software Mimics 11.0. Results: Phase 1: Mean differences (S.D.) between estimated and actual volumes were -0.3 (1.5) ml for kidney, -0.7 (1.3) ml for heart, and -1.7 (3.6) ml for liver in the animal experiments. Phase 2: In fetuses, newborns and children, mean differences (S.D.) between estimated and actual weights were -0.6 (4.9) g for liver, -5.1 (1.2) g for spleen, -0.3 (0.6) g for adrenals, 0.4 (1.6) g for thymus, 0.9 (2.5) g for heart, -0.7 (2.4) g for kidneys and 2.7 (14) g for lungs. Excellent correlation was noted between estimated and actual weights (r² = 0.99, p < 0.001). Accuracy was lower when fetuses were less than 20 weeks or less than 300 g. Conclusion: Rapid, accurate and reproducible estimation of solid internal organ weights is feasible using the semi-automated 3D volume reconstruction method.
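
    The weight-estimation step reduces to counting voxels and applying a density. A minimal Python sketch under the assumption of a nominal soft-tissue density of about 1.05 g/ml; the density value and all names are illustrative, not taken from the paper:

        import numpy as np

        def organ_weight_g(mask, voxel_dims_mm, density_g_per_ml=1.05):
            # Voxel volume in ml: product of the voxel edge lengths (1000 mm^3 = 1 ml).
            voxel_ml = float(np.prod(voxel_dims_mm)) / 1000.0
            volume_ml = int(mask.sum()) * voxel_ml
            # Weight follows from the assumed tissue density (g/ml).
            return volume_ml * density_g_per_ml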

  14. A semi-automated 2D/3D marker-based registration algorithm modelling prostate shrinkage during radiotherapy for prostate cancer

    International Nuclear Information System (INIS)

    Budiharto, Tom; Slagmolen, Pieter; Hermans, Jeroen; Maes, Frederik; Verstraete, Jan; Heuvel, Frank Van den; Depuydt, Tom; Oyen, Raymond; Haustermans, Karin

    2009-01-01

    Background and purpose: Currently, most available patient alignment tools based on implanted markers use manual marker matching and rigid registration transformations to measure the needed translational shifts. To quantify the particular effect of prostate gland shrinkage, implanted gold markers were tracked during a course of radiotherapy, including an isotropic scaling factor to model prostate shrinkage. Materials and methods: Eight patients with prostate cancer had gold markers implanted transrectally and seven were treated with (neo)adjuvant androgen deprivation therapy. After patient alignment to skin tattoos, orthogonal electronic portal images (EPIs) were taken. A semi-automated 2D/3D marker-based registration was performed to calculate the necessary couch shifts. The registration consists of a rigid transformation combined with an isotropic scaling to model prostate shrinkage. Results: The inclusion of an isotropic shrinkage model in the registration algorithm cancelled the corresponding increase in registration error. The mean scaling factor was 0.89 ± 0.09. For all but two patients, a decrease of the isotropic scaling factor during treatment was observed. However, there was almost no difference in the translation offset between the manual matching of the EPIs to the digitally reconstructed radiographs and the semi-automated 2D/3D registration. A decrease in the intermarker distance was found, correlating with prostate shrinkage rather than with random marker migration. Conclusions: Inclusion of shrinkage in the registration process reduces registration errors during a course of radiotherapy. Nevertheless, this did not lead to a clinically significant change in the proposed table translations when compared to translations obtained with manual marker matching without a scaling correction.
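
    The isotropic scaling factor can be understood as the ratio of current to baseline inter-marker distances. A minimal Python sketch under that simplification (the study estimates the factor jointly within a 2D/3D registration, which this does not reproduce):

        import numpy as np
        from itertools import combinations

        def isotropic_scale(baseline_pts, current_pts):
            # Mean ratio of pairwise inter-marker distances (Nx3 arrays of
            # marker coordinates); values below 1 indicate shrinkage.
            ratios = []
            for i, j in combinations(range(len(baseline_pts)), 2):
                d0 = np.linalg.norm(baseline_pts[i] - baseline_pts[j])
                d1 = np.linalg.norm(current_pts[i] - current_pts[j])
                ratios.append(d1 / d0)
            return float(np.mean(ratios))

    A mean factor of 0.89, as reported, would correspond to inter-marker distances contracting to roughly 89% of their baseline values.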

  15. Accuracy of Estimation of Graft Size for Living-Related Liver Transplantation: First Results of a Semi-Automated Interactive Software for CT-Volumetry

    Science.gov (United States)

    Mokry, Theresa; Bellemann, Nadine; Müller, Dirk; Lorenzo Bermejo, Justo; Klauß, Miriam; Stampfl, Ulrike; Radeleff, Boris; Schemmer, Peter; Kauczor, Hans-Ulrich; Sommer, Christof-Matthias

    2014-01-01

    Objectives To evaluate accuracy of estimated graft size for living-related liver transplantation using a semi-automated interactive software for CT-volumetry. Materials and Methods Sixteen donors for living-related liver transplantation (11 male; mean age: 38.2±9.6 years) underwent contrast-enhanced CT prior to graft removal. CT-volumetry was performed using a semi-automated interactive software (P), and compared with a manual commercial software (TR). For P, liver volumes were provided either with or without vessels. For TR, liver volumes were provided always with vessels. Intraoperative weight served as reference standard. Major study goals included analyses of volumes using absolute numbers, linear regression analyses and inter-observer agreements. Minor study goals included the description of the software workflow: degree of manual correction, speed for completion, and overall intuitiveness using five-point Likert scales: 1–markedly lower/faster/higher for P compared with TR, 2–slightly lower/faster/higher for P compared with TR, 3–identical for P and TR, 4–slightly lower/faster/higher for TR compared with P, and 5–markedly lower/faster/higher for TR compared with P. Results Liver segments II/III, II–IV and V–VIII served in 6, 3, and 7 donors as transplanted liver segments. Volumes were 642.9±368.8 ml for TR with vessels, 623.8±349.1 ml for P with vessels, and 605.2±345.8 ml for P without vessels (P<0.01). CT-volumetry performed with P can accurately predict graft size for living-related liver transplantation while improving workflow compared with TR. PMID:25330198

  16. Accuracy and Feasibility of Estimated Tumour Volumetry in Primary Gastric Gastrointestinal Stromal Tumours: Validation Using Semi-automated Technique in 127 Patients

    Science.gov (United States)

    Tirumani, Sree Harsha; Shinagare, Atul B.; O’Neill, Ailbhe C.; Nishino, Mizuki; Rosenthal, Michael H.; Ramaiya, Nikhil H.

    2015-01-01

    Objective To validate estimated tumour volumetry in primary gastric gastrointestinal stromal tumours (GISTs) using semi-automated volumetry. Materials and Methods In this IRB-approved retrospective study, we measured the three longest diameters in the x, y, z axes on CTs of primary gastric GISTs in 127 consecutive patients (52 women, 75 men, mean age: 61 years) at our institute between 2000 and 2013. Segmented volumes (Vsegmented) were obtained using commercial software by two radiologists. Estimated volumes (V1–V6) were obtained using formulae for spheres and ellipsoids. Intra- and inter-observer agreement of Vsegmented and agreement of V1–6 with Vsegmented were analysed with concordance correlation coefficients (CCC) and Bland-Altman plots. Results Median Vsegmented and V1–V6 were 75.9 cm³, 124.9 cm³, 111.6 cm³, 94.0 cm³, 94.4 cm³, 61.7 cm³ and 80.3 cm³ respectively. There was strong intra- and inter-observer agreement for Vsegmented. Agreement with Vsegmented was highest for V6 (scalene ellipsoid, x≠y≠z), with CCC of 0.96 [95%CI: 0.95–0.97]. Mean relative difference was smallest for V6 (0.6%), while it was −19.1% for V5, +14.5% for V4, +17.9% for V3, +32.6% for V2 and +47% for V1. Conclusion Ellipsoidal approximations of volume using three measured axes may be used to closely estimate Vsegmented when semi-automated techniques are unavailable. PMID:25991487
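
    The best-performing estimate (V6) is presumably the standard scalene-ellipsoid formula applied to the three measured diameters; a minimal Python sketch of the standard formulae, which the abstract itself does not spell out:

        import math

        def ellipsoid_volume(x, y, z):
            # Scalene ellipsoid from three orthogonal diameters: V = (pi/6) * x * y * z.
            return math.pi / 6.0 * x * y * z

        def sphere_volume(d):
            # Sphere estimate from a single diameter: V = (pi/6) * d**3.
            return math.pi / 6.0 * d ** 3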

  17. A mobile, high-throughput semi-automated system for testing cognition in large non-primate animal models of Huntington disease.

    Science.gov (United States)

    McBride, Sebastian D; Perentos, Nicholas; Morton, A Jennifer

    2016-05-30

    For reasons of cost and ethical concerns, models of neurodegenerative disorders such as Huntington disease (HD) are currently being developed in farm animals, as an alternative to non-human primates. Developing reliable methods of testing cognitive function is essential to determining the usefulness of such models. Nevertheless, cognitive testing of farm animal species presents a unique set of challenges. The primary aims of this study were to develop and validate a mobile operant system suitable for high-throughput cognitive testing of sheep. We designed a semi-automated testing system with the capability of presenting stimuli (visual, auditory) and reward at six spatial locations. Fourteen normal sheep were used to validate the system using a two-choice visual discrimination task (2CVDT). Four stages of training devised to acclimatise animals to the system are also presented. All sheep progressed rapidly through the training stages, over eight sessions. All sheep learned the 2CVDT and performed at least one reversal stage. The mean number of trials the sheep took to reach criterion was 13.9±1.5 for the first acquisition learning and 19.1±1.8 for the reversal learning. This is the first mobile semi-automated operant system developed for testing cognitive function in sheep. We have designed and validated an automated operant behavioural testing system suitable for high-throughput cognitive testing in sheep and other medium-sized quadrupeds, such as pigs and dogs. Sheep performance in the two-choice visual discrimination task was very similar to that reported for non-human primates and strongly supports the use of farm animals as pre-clinical models for the study of neurodegenerative diseases. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Mesoscale wind fluctuations over Danish waters

    Energy Technology Data Exchange (ETDEWEB)

    Vincent, C.L.

    2010-12-15

    Mesoscale wind fluctuations affect the large scale integration of wind power because they undermine the day-ahead predictability of wind speed and power production, and because they can result in large fluctuations in power generation that must be balanced using reserve power. Large fluctuations in generated power are a particular problem for offshore wind farms because the typically high concentration of turbines within a limited geographical area means that fluctuations can be correlated across large numbers of turbines. Furthermore, organised mesoscale structures that often form over water, such as convective rolls and cellular convection, have length scales of tens of kilometers, and can cause large wind fluctuations on a time scale of around an hour. This thesis is an exploration of the predictability of mesoscale wind fluctuations using observations from the world's first two large offshore wind farms - Horns Rev I in the North Sea, and Nysted in the Baltic Sea. The thesis begins with a climatological analysis of wind fluctuations on time scales of 1-10 hours at the two sites. A novel method for calculating conditional climatologies of spectral information is proposed, based on binning and averaging the time axis of the Hilbert spectrum. Results reveal clear patterns between wind fluctuations and locally observed meteorological conditions. The analysis is expanded by classifying wind fluctuations on time scales of 1-3 hours according to synoptic patterns, satellite pictures and wind classes. Results indicate that cold air outbreaks and open cellular convection are a significant contributor to mesoscale wind variability at Horns Rev. The predictability of mesoscale wind fluctuations is tested by implementing standard statistical models that relate local wind variability to parameters based on a large scale weather analysis. The models show some skill, but only achieve a 15% improvement on a persistence forecast. The possibility of explicitly modelling

  19. A semi-automated measuring system of brain diffusion and perfusion magnetic resonance imaging abnormalities in patients with multiple sclerosis based on the integration of coregistration and tissue segmentation procedures

    International Nuclear Information System (INIS)

    Revenaz, Alfredo; Ruggeri, Massimiliano; Laganà, Marcella; Bergsland, Niels; Groppo, Elisabetta; Rovaris, Marco; Fainardi, Enrico

    2016-01-01

    Diffusion-weighted imaging (DWI) and perfusion-weighted imaging (PWI) abnormalities in patients with multiple sclerosis (MS) are currently measured by a complex combination of separate procedures. Therefore, the purpose of this study was to provide a reliable method for reducing analysis complexity and obtaining reproducible results. We implemented a semi-automated measuring system in which different well-known software components for magnetic resonance imaging (MRI) analysis are integrated to obtain reliable measurements of DWI and PWI disturbances in MS. We generated the Diffusion/Perfusion Project (DPP) Suite, in which a series of external software programs are managed and harmonically and hierarchically incorporated by in-house developed Matlab software to perform the following processes: 1) image pre-processing, including imaging data anonymization and conversion from DICOM to Nifti format; 2) co-registration of 2D and 3D non-enhanced and Gd-enhanced T1-weighted images in fluid-attenuated inversion recovery (FLAIR) space; 3) lesion segmentation and classification, in which FLAIR lesions are first segmented and then categorized according to their presumed evolution; 4) co-registration of segmented FLAIR lesions in T1 space to obtain the FLAIR lesion mask in the T1 space; 5) normal-appearing tissue segmentation, in which the T1 lesion mask is used to segment basal ganglia/thalami, normal-appearing grey matter (NAGM) and normal-appearing white matter (NAWM); 6) DWI and PWI map generation; 7) co-registration of basal ganglia/thalami, NAGM, NAWM, DWI and PWI maps in the previously segmented FLAIR space; 8) data analysis. All these steps are automatic, except for lesion segmentation and classification. We developed a promising method to limit misclassifications and user errors, providing clinical researchers with a practical and reproducible tool to measure DWI and PWI changes in MS.

  20. Accuracy of estimation of graft size for living-related liver transplantation: first results of a semi-automated interactive software for CT-volumetry.

    Directory of Open Access Journals (Sweden)

    Theresa Mokry

    Full Text Available To evaluate accuracy of estimated graft size for living-related liver transplantation using a semi-automated interactive software for CT-volumetry. Sixteen donors for living-related liver transplantation (11 male; mean age: 38.2±9.6 years) underwent contrast-enhanced CT prior to graft removal. CT-volumetry was performed using a semi-automated interactive software (P), and compared with a manual commercial software (TR). For P, liver volumes were provided either with or without vessels. For TR, liver volumes were provided always with vessels. Intraoperative weight served as reference standard. Major study goals included analyses of volumes using absolute numbers, linear regression analyses and inter-observer agreements. Minor study goals included the description of the software workflow: degree of manual correction, speed for completion, and overall intuitiveness using five-point Likert scales: 1--markedly lower/faster/higher for P compared with TR, 2--slightly lower/faster/higher for P compared with TR, 3--identical for P and TR, 4--slightly lower/faster/higher for TR compared with P, and 5--markedly lower/faster/higher for TR compared with P. Liver segments II/III, II-IV and V-VIII served in 6, 3, and 7 donors as transplanted liver segments. Volumes were 642.9±368.8 ml for TR with vessels, 623.8±349.1 ml for P with vessels, and 605.2±345.8 ml for P without vessels (P<0.01). Regression equations between intraoperative weights and volumes were y = 0.94x+30.1 (R2 = 0.92; P<0.001) for TR with vessels, y = 1.00x+12.0 (R2 = 0.92; P<0.001) for P with vessels, and y = 1.01x+28.0 (R2 = 0.92; P<0.001) for P without vessels. Inter-observer agreement showed a bias of 1.8 ml for TR with vessels, 5.4 ml for P with vessels, and 4.6 ml for P without vessels. For the degree of manual correction, speed for completion and overall intuitiveness, scale values were 2.6±0.8, 2.4±0.5 and 2. CT-volumetry performed with P can accurately predict graft size for living-related liver transplantation while improving workflow compared with TR.

  1. Unifying Inference of Meso-Scale Structures in Networks.

    Science.gov (United States)

    Tunç, Birkan; Verma, Ragini

    2015-01-01

    Networks are among the most prevalent formal representations in scientific studies, employed to depict interactions between objects such as molecules, neuronal clusters, or social groups. Studies performed at meso-scale that involve grouping of objects based on their distinctive interaction patterns form one of the main lines of investigation in network science. In a social network, for instance, meso-scale structures can correspond to isolated social groupings or groups of individuals that serve as a communication core. Currently, the research on different meso-scale structures such as community and core-periphery structures has been conducted via independent approaches, which precludes the possibility of an algorithmic design that can handle multiple meso-scale structures and decide which structure explains the observed data better. In this study, we propose a unified formulation for the algorithmic detection and analysis of different meso-scale structures. This facilitates the investigation of hybrid structures that capture the interplay between multiple meso-scale structures and statistical comparison of competing structures, all of which have been hitherto unavailable. We demonstrate the applicability of the methodology in analyzing the human brain network, by determining the dominant organizational structure (communities) of the brain, as well as its auxiliary characteristics (core-periphery).

  2. Unifying Inference of Meso-Scale Structures in Networks.

    Directory of Open Access Journals (Sweden)

    Birkan Tunç

    Full Text Available Networks are among the most prevalent formal representations in scientific studies, employed to depict interactions between objects such as molecules, neuronal clusters, or social groups. Studies performed at meso-scale that involve grouping of objects based on their distinctive interaction patterns form one of the main lines of investigation in network science. In a social network, for instance, meso-scale structures can correspond to isolated social groupings or groups of individuals that serve as a communication core. Currently, the research on different meso-scale structures such as community and core-periphery structures has been conducted via independent approaches, which precludes the possibility of an algorithmic design that can handle multiple meso-scale structures and decide which structure explains the observed data better. In this study, we propose a unified formulation for the algorithmic detection and analysis of different meso-scale structures. This facilitates the investigation of hybrid structures that capture the interplay between multiple meso-scale structures and statistical comparison of competing structures, all of which have been hitherto unavailable. We demonstrate the applicability of the methodology in analyzing the human brain network, by determining the dominant organizational structure (communities) of the brain, as well as its auxiliary characteristics (core-periphery).

  3. Can a semi-automated surface matching and principal axis-based algorithm accurately quantify femoral shaft fracture alignment in six degrees of freedom?

    Science.gov (United States)

    Crookshank, Meghan C; Beek, Maarten; Singh, Devin; Schemitsch, Emil H; Whyne, Cari M

    2013-07-01

    Accurate alignment of femoral shaft fractures treated with intramedullary nailing remains a challenge for orthopaedic surgeons. The aim of this study is to develop and validate a cone-beam CT-based, semi-automated algorithm to quantify the malalignment in six degrees of freedom (6DOF) using a surface matching and principal axes-based approach. Complex comminuted diaphyseal fractures were created in nine cadaveric femora and cone-beam CT images were acquired (27 cases total). Scans were cropped and segmented using intensity-based thresholding, producing superior, inferior and comminution volumes. Cylinders were fit to estimate the long axes of the superior and inferior fragments. The angle and distance between the two cylindrical axes were calculated to determine flexion/extension and varus/valgus angulation and medial/lateral and anterior/posterior translations, respectively. Both surfaces were unwrapped about the cylindrical axes. Three methods of matching the unwrapped surface for determination of periaxial rotation were compared based on minimizing the distance between features. The calculated corrections were compared to the input malalignment conditions. All 6DOF were calculated to within current clinical tolerances for all but two cases. This algorithm yielded accurate quantification of malalignment of femoral shaft fractures for fracture gaps up to 60 mm, based on a single CBCT image of the fractured limb. Copyright © 2012 IPEM. Published by Elsevier Ltd. All rights reserved.
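
    The angulation component of the 6DOF estimate reduces to the angle between the two fitted cylinder axes. A minimal Python sketch, assuming direction vectors from the cylinder fits; names are hypothetical, not from the published algorithm:

        import numpy as np

        def angulation_deg(axis_superior, axis_inferior):
            # Angle between fragment long axes; the absolute value folds the
            # arbitrary sign of fitted axis directions into [0, 90] degrees.
            a = axis_superior / np.linalg.norm(axis_superior)
            b = axis_inferior / np.linalg.norm(axis_inferior)
            cos_theta = np.clip(abs(np.dot(a, b)), 0.0, 1.0)
            return np.degrees(np.arccos(cos_theta))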

  4. A semi-automated approach for mapping geomorphology of El Bardawil Lake, Northern Sinai, Egypt, using integrated remote sensing and GIS techniques

    Directory of Open Access Journals (Sweden)

    Nabil Sayed Embabi

    2014-06-01

    Full Text Available Among the coastal lakes of the Mediterranean northern coast of Egypt, Bardawil Lake is a unique lagoon, as it is fed only by seawater. The lagoon is composed of two main basins, and several other internal small basins interconnected to one another. Although the general geomorphologic characteristics are treated in some regional studies, we used a semi-automated approach based on a wide variety of digital image processing techniques for mapping the major geomorphological landforms of the lake on a medium scale of 1:250,000. The approach is based primarily on data fusion of a Landsat ETM+ image, and validated by other ancillary spatial data (e.g. topographic maps, Google images and GPS in situ data). Interpretations of high-resolution space images in Google Earth and of the large-scale topographic maps (1:25,000) in particular revealed new microforms and some detailed geomorphologic aspects with the aid of GPS measurements. Small sand barriers, submerged sand dunes, tidal channels, fans and flats, and micro-lagoons are the recurrent forms in the lake. The approach used in this study could be widely applied to study the low-lying coastal lands along the Nile Delta. However, it is concluded from geological data and geomorphologic aspects that Bardawil Lake is of tectonic origin; it was much deeper than it is currently, and has been filled with sediments mostly since the Flandrian transgression (∼8–6 ka BP).

  5. Semi-automated CCTV surveillance: the effects of system confidence, system accuracy and task complexity on operator vigilance, reliance and workload.

    Science.gov (United States)

    Dadashi, N; Stedmon, A W; Pridmore, T P

    2013-09-01

    Recent advances in computer vision technology have led to the development of various automatic surveillance systems; however, their effectiveness is adversely affected by many factors and they are not completely reliable. This study investigated the potential of a semi-automated surveillance system to reduce CCTV operator workload in both detection and tracking activities. A further focus of interest was the degree of user reliance on the automated system. A simulated prototype was developed which mimicked an automated system that provided different levels of system confidence information. Dependent variable measures were taken for secondary task performance, reliance and subjective workload. When the automatic component of a semi-automatic CCTV surveillance system provided reliable system confidence information to operators, workload significantly decreased and spare mental capacity significantly increased. Providing feedback about system confidence and accuracy appears to be one important way of making the status of the automated component of the surveillance system more 'visible' to users and hence more effective to use. Copyright © 2012 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  6. Comparison of semi-automated center-dot and fully automated endothelial cell analyses from specular microscopy images.

    Science.gov (United States)

    Maruoka, Sachiko; Nakakura, Shunsuke; Matsuo, Naoko; Yoshitomi, Kayo; Katakami, Chikako; Tabuchi, Hitoshi; Chikama, Taiichiro; Kiuchi, Yoshiaki

    2017-10-30

    To evaluate two specular microscopy analysis methods across different endothelial cell densities (ECDs). Endothelial images of one eye from each of 45 patients were taken by using three different specular microscopes (three replicates each). To determine the consistency of the center-dot method, we compared SP-6000 and SP-2000P images. CME-530 and SP-6000 images were compared to assess the consistency of the fully automated method. The SP-6000 images from the two methods were compared. Intraclass correlation coefficients (ICCs) for the three measurements were calculated, and parametric multiple comparisons tests and Bland-Altman analysis were performed. The ECD mean value was 2425 ± 883 (range 516–3707) cells/mm². ICC values were > 0.9 for all three microscopes for ECD, but the coefficients of variation (CVs) were 0.3–0.6. For ECD measurements, Bland-Altman analysis revealed that the mean difference was 42 cells/mm² between the SP-2000P and SP-6000 for the center-dot method; 57 cells/mm² between the SP-6000 measurements from both methods; and −5 cells/mm² between the SP-6000 and CME-530 for the fully automated method (95% limits of agreement: −201 to 284 cells/mm², −410 to 522 cells/mm², and −327 to 318 cells/mm², respectively). For CV measurements, the mean differences were −3%, −12%, and 13% (95% limits of agreement: −18 to 11%, −26 to 2%, and −5 to 32%, respectively). Despite using three replicate measurements, the precision of the center-dot method with the SP-2000P and SP-6000 software was only ±10% for ECD data and was even worse for the fully automated method. Japan Clinical Trials Register (http://www.umin.ac.jp/ctr/index/htm9) number UMIN 000015236.
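
    The Bland-Altman quantities reported above (bias and 95% limits of agreement) are easy to reproduce for paired measurements; a minimal Python sketch, not the study's analysis code:

        import numpy as np

        def bland_altman(a, b):
            # Bias and 95% limits of agreement for paired measurements a, b.
            diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, bias - 1.96 * sd, bias + 1.96 * sd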

  7. Acoustic Characterization of Mesoscale Objects

    Energy Technology Data Exchange (ETDEWEB)

    Chinn, D; Huber, R; Chambers, D; Cole, G; Balogun, O; Spicer, J; Murray, T

    2007-03-13

    This report describes the science and engineering performed to provide state-of-the-art acoustic capabilities for nondestructively characterizing mesoscale (millimeter-sized) objects, allowing micrometer resolution over the object's entire volume. Materials and structures used in mesoscale objects necessitate the use of (1) GHz acoustic frequencies and (2) non-contacting laser generation and detection of acoustic waves. This effort demonstrated that acoustic methods at gigahertz frequencies have the necessary penetration depth and spatial resolution to effectively detect density discontinuities, gaps, and delaminations. A prototype laser-based ultrasonic system was designed and built. The system uses a micro-chip laser for excitation of broadband ultrasonic waves with frequency components reaching 1.0 GHz, and a path-stabilized Michelson interferometer for detection. The proof-of-concept for mesoscale characterization is demonstrated by imaging a micro-fabricated etched pattern in a 70 μm thick silicon wafer.

  8. A Large-Sample Test of a Semi-Automated Clavicle Search Engine to Assist Skeletal Identification by Radiograph Comparison.

    Science.gov (United States)

    D'Alonzo, Susan S; Guyomarc'h, Pierre; Byrd, John E; Stephan, Carl N

    2017-01-01

    In 2014, a morphometric capability to search chest radiograph databases by quantified clavicle shape was published to assist skeletal identification. Here, we extend the validation tests conducted by increasing the search universe 18-fold, from 409 to 7361 individuals, to determine whether there is any associated decrease in performance under these more challenging circumstances. The number of trials and analysts was also increased, from 17 to 30 skeletons and from two to four examiners, respectively. Elliptical Fourier analysis was conducted on clavicles from each skeleton by each analyst (shadowgrams trimmed from scratch in every instance) and compared to the search universe. Correctly matching individuals were found in shortlists of 10% of the sample 70% of the time. This rate is similar to, although slightly lower than, the rate previously found for much smaller samples (80%). Accuracy and reliability are thereby maintained, even when the comparison system is challenged by much larger search universes. © 2016 American Academy of Forensic Sciences.

  9. SU-E-T-497: Semi-Automated in Vivo Radiochromic Film Dosimetry Using a Novel Image Processing Algorithm

    International Nuclear Information System (INIS)

    Reyhan, M; Yue, N

    2014-01-01

    Purpose: To validate an automated image processing algorithm designed to detect the center of radiochromic film used for in vivo film dosimetry against the current gold standard of manual selection. Methods: An image processing algorithm was developed to automatically select the region of interest (ROI) in *.tiff images that contain multiple pieces of radiochromic film (0.5×1.3 cm²). After a user has linked a calibration file to the processing algorithm and selected a *.tiff file for processing, an ROI is automatically detected for all films by a combination of thresholding and erosion, which removes edges and any additional markings for orientation. Calibration is applied to the mean pixel values from the ROIs and a *.tiff image is output displaying the original image with an overlay of the ROIs and the measured doses. Validation of the algorithm was determined by comparing in vivo dose determined using the current gold standard (manually drawn ROIs) versus automated ROIs for n=420 scanned films. Bland-Altman analysis, paired t-test, and linear regression were performed to demonstrate agreement between the processes. Results: The measured doses ranged from 0.2–886.6 cGy. Bland-Altman analysis of the two techniques (automatic minus manual) revealed a bias of −0.28 cGy and a 95% confidence interval of (−6.1 cGy, 5.5 cGy). These values demonstrate excellent agreement between the two techniques. Paired t-test results showed no statistical differences between the two techniques, p=0.98. Linear regression with a forced zero intercept demonstrated that Automatic=0.997*Manual, with a Pearson correlation coefficient of 0.999. The minimal differences between the two techniques may be explained by the fact that the hand-drawn ROIs were not identical to the automatically selected ones. The average processing time was 6.7 seconds in Matlab on an Intel Core 2 Duo processor. Conclusion: An automated image processing algorithm has been developed and validated, which will help minimize
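
    The thresholding-and-erosion idea can be sketched with SciPy. This is an illustrative reconstruction, not the validated Matlab algorithm; the threshold and iteration count are hypothetical:

        from scipy import ndimage

        def film_rois(image, threshold, erode_iter=3):
            # Exposed film is darker than background, so threshold below.
            mask = image < threshold
            # Erosion strips film edges and orientation markings from the mask.
            mask = ndimage.binary_erosion(mask, iterations=erode_iter)
            # Label the remaining connected regions, one per piece of film.
            labels, n = ndimage.label(mask)
            # Mean pixel value per ROI, to be mapped to dose via calibration.
            means = ndimage.mean(image, labels=labels, index=range(1, n + 1))
            return labels, means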

  10. Semi-automated limit-dilution assay and clonal expansion of all T-cell precursors of cytotoxic lymphocytes

    International Nuclear Information System (INIS)

    Wilson, A.; Chen, W.-F.; Scollay, R.; Shortman, K.

    1982-01-01

    A limit-dilution microculture system is described, where almost all precursor T cells of the cytotoxic lineage (CTL-p) develop into extended clones of cytotoxic T cells (CTL), which are then detected with a new radio-autographic 111In-release assay. The principle is to polyclonally activate all T cells with concanavalin A, to expand the resultant clones over an 8-9 day period in cultures saturated with growth factors, then to detect all clones with cytotoxic function by phytohaemagglutinin mediated lysis of P815 tumour cells. The key variables for obtaining high cloning efficiency are the use of flat-bottomed 96-well culture trays, the use of appropriately irradiated spleen filler cells, and the inclusion of a T-cell growth factor supplement. Cultures are set up at input levels of around one T cell per well. Forty percent of T cells then form CTL clones readily detected by the cytotoxic assay. The lytic activity of the average clone is equivalent to 3000 CTL, but clone size appears to be much larger. The precursor cells are predominantly if not entirely from the Lyt 2+ T-cell subclass and almost all cells of this subclass form cytolytic clones. Analysis of the frequency of positive cultures shows a good fit to the expected Poisson distribution, with no evidence of the CTL-p frequency estimates being distorted by helper or suppressor effects. (Auth.)
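
    The Poisson analysis mentioned above follows the standard single-hit model: the fraction of negative wells is F0 = exp(-f*n) for precursor frequency f and n cells per well, so f = -ln(F0)/n. A minimal Python sketch of this textbook estimate, not code from the paper:

        import math

        def precursor_frequency(negative_wells, total_wells, cells_per_well):
            # Single-hit Poisson: F0 = exp(-f*n)  =>  f = -ln(F0) / n.
            f0 = negative_wells / total_wells
            return -math.log(f0) / cells_per_well

        # Example: 67 of 100 wells negative at ~1 cell/well gives f ≈ 0.40,
        # consistent with the 40% cloning efficiency reported above.
        print(precursor_frequency(67, 100, 1))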

  11. Milestones on a Shoestring: A Cost-Effective, Semi-automated Implementation of the New ACGME Requirements for Radiology.

    Science.gov (United States)

    Schmitt, J Eric; Scanlon, Mary H; Servaes, Sabah; Levin, Dayna; Cook, Tessa S

    2015-10-01

    The advent of the ACGME's Next Accreditation System represents a significant new challenge for residencies and fellowships, owing to its requirements for more complex and detailed information. We developed a system of online assessment tools to provide comprehensive coverage of the twelve ACGME Milestones and digitized them using freely available cloud-based productivity tools. These tools include a combination of point-of-care procedural assessments, electronic quizzes, online modules, and other data entry forms. Using free statistical analytic tools, we also developed an automated system for management, processing, and data reporting. After one year of use, our Milestones project has resulted in the submission of over 20,000 individual data points. The use of automated statistical methods to generate resident-specific profiles has allowed for dynamic reports of individual residents' progress. These profiles both summarize data and also allow program directors access to more granular information as needed. Informatics-driven strategies for data assessment and processing represent feasible solutions to Milestones assessment and analysis, reducing the potential administrative burden for program directors, residents, and staff. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.

  12. Semi-automated limit-dilution assay and clonal expansion of all T-cell precursors of cytotoxic lymphocytes

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, A.; Chen, W.F.; Scollay, R.; Shortman, K. (Walter and Eliza Hall Inst. of Medical Research, Parkville (Australia))

    1982-08-13

    A limit-dilution microculture system is described, where almost all precursor T cells of the cytotoxic lineage (CTL-p) develop into extended clones of cytotoxic T cells (CTL), which are then detected with a new radio-autographic 111In-release assay. The principle is to polyclonally activate all T cells with concanavalin A, to expand the resultant clones over an 8-9 day period in cultures saturated with growth factors, then to detect all clones with cytotoxic function by phytohaemagglutinin mediated lysis of P815 tumour cells. The key variables for obtaining high cloning efficiency are the use of flat-bottomed 96-well culture trays, the use of appropriately irradiated spleen filler cells, and the inclusion of a T-cell growth factor supplement. Cultures are set up at input levels of around one T cell per well. Forty percent of T cells then form CTL clones readily detected by the cytotoxic assay. The lytic activity of the average clone is equivalent to 3000 CTL, but clone size appears to be much larger. The precursor cells are predominantly if not entirely from the Lyt 2+ T-cell subclass and almost all cells of this subclass form cytolytic clones. Analysis of the frequency of positive cultures shows a good fit to the expected Poisson distribution, with no evidence of the CTL-p frequency estimates being distorted by helper or suppressor effects.

  13. ASSESSING THE AGREEMENT BETWEEN EO-BASED SEMI-AUTOMATED LANDSLIDE MAPS WITH FUZZY MANUAL LANDSLIDE DELINEATION

    Directory of Open Access Journals (Sweden)

    F. Albrecht

    2017-09-01

    Full Text Available Landslide mapping benefits from the ever increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and improved infrastructure for data access. However, improved automated landslide information extraction processes from EO data are needed, while the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for the fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding of how they affect the levels of agreement in a direct comparison of OBIA-derived landslide maps and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach for high resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.
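
    The comparison step can be illustrated as follows. This is a generic sketch, not the authors' procedure: a crisp OBIA mask is compared against a fuzzy manual reference by thresholding the membership map at several levels and computing a standard agreement metric (here, the Dice coefficient) at each level.

```python
# Illustrative sketch: agreement between a crisp OBIA mask and a fuzzy
# manual reference, evaluated at several membership thresholds.
import numpy as np

rng = np.random.default_rng(0)
fuzzy_reference = rng.random((100, 100))   # manual delineation, membership in [0, 1]
obia_mask = fuzzy_reference > 0.5          # synthetic stand-in for the OBIA result

for t in (0.25, 0.5, 0.75):
    ref = fuzzy_reference >= t
    dice = 2 * np.logical_and(ref, obia_mask).sum() / (ref.sum() + obia_mask.sum())
    print(f"membership threshold {t:.2f}: Dice = {dice:.2f}")
```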

  14. Integrity of chromatin and replicating DNA in nuclei released from fission yeast by semi-automated grinding in liquid nitrogen

    Science.gov (United States)

    2011-01-01

    Background Studies of nuclear function in many organisms, especially those with tough cell walls, are limited by lack of availability of simple, economical methods for large-scale preparation of clean, undamaged nuclei. Findings Here we present a useful method for nuclear isolation from the important model organism, the fission yeast, Schizosaccharomyces pombe. To preserve in vivo molecular configurations, we flash-froze the yeast cells in liquid nitrogen. Then we broke their tough cell walls, without damaging their nuclei, by grinding in a precision-controlled motorized mortar-and-pestle apparatus. The cryo-ground cells were resuspended and thawed in a buffer designed to preserve nuclear morphology, and the nuclei were enriched by differential centrifugation. The washed nuclei were free from contaminating nucleases and have proven well-suited as starting material for genome-wide chromatin analysis and for preparation of fragile DNA replication intermediates. Conclusions We have developed a simple, reproducible, economical procedure for large-scale preparation of endogenous-nuclease-free, morphologically intact nuclei from fission yeast. With appropriate modifications, this procedure may well prove useful for isolation of nuclei from other organisms with, or without, tough cell walls. PMID:22088094

  15. Integrity of chromatin and replicating DNA in nuclei released from fission yeast by semi-automated grinding in liquid nitrogen.

    Science.gov (United States)

    Givens, Robert M; Mesner, Larry D; Hamlin, Joyce L; Buck, Michael J; Huberman, Joel A

    2011-11-16

    Studies of nuclear function in many organisms, especially those with tough cell walls, are limited by lack of availability of simple, economical methods for large-scale preparation of clean, undamaged nuclei. Here we present a useful method for nuclear isolation from the important model organism, the fission yeast, Schizosaccharomyces pombe. To preserve in vivo molecular configurations, we flash-froze the yeast cells in liquid nitrogen. Then we broke their tough cell walls, without damaging their nuclei, by grinding in a precision-controlled motorized mortar-and-pestle apparatus. The cryo-ground cells were resuspended and thawed in a buffer designed to preserve nuclear morphology, and the nuclei were enriched by differential centrifugation. The washed nuclei were free from contaminating nucleases and have proven well-suited as starting material for genome-wide chromatin analysis and for preparation of fragile DNA replication intermediates. We have developed a simple, reproducible, economical procedure for large-scale preparation of endogenous-nuclease-free, morphologically intact nuclei from fission yeast. With appropriate modifications, this procedure may well prove useful for isolation of nuclei from other organisms with, or without, tough cell walls.

  16. Integrity of chromatin and replicating DNA in nuclei released from fission yeast by semi-automated grinding in liquid nitrogen

    Directory of Open Access Journals (Sweden)

    Givens Robert M

    2011-11-01

    Full Text Available Abstract Background Studies of nuclear function in many organisms, especially those with tough cell walls, are limited by lack of availability of simple, economical methods for large-scale preparation of clean, undamaged nuclei. Findings Here we present a useful method for nuclear isolation from the important model organism, the fission yeast, Schizosaccharomyces pombe. To preserve in vivo molecular configurations, we flash-froze the yeast cells in liquid nitrogen. Then we broke their tough cell walls, without damaging their nuclei, by grinding in a precision-controlled motorized mortar-and-pestle apparatus. The cryo-ground cells were resuspended and thawed in a buffer designed to preserve nuclear morphology, and the nuclei were enriched by differential centrifugation. The washed nuclei were free from contaminating nucleases and have proven well-suited as starting material for genome-wide chromatin analysis and for preparation of fragile DNA replication intermediates. Conclusions We have developed a simple, reproducible, economical procedure for large-scale preparation of endogenous-nuclease-free, morphologically intact nuclei from fission yeast. With appropriate modifications, this procedure may well prove useful for isolation of nuclei from other organisms with, or without, tough cell walls.

  17. Assessing the Agreement Between Eo-Based Semi-Automated Landslide Maps with Fuzzy Manual Landslide Delineation

    Science.gov (United States)

    Albrecht, F.; Hölbling, D.; Friedl, B.

    2017-09-01

    Landslide mapping benefits from the ever increasing availability of Earth Observation (EO) data resulting from programmes like the Copernicus Sentinel missions and improved infrastructure for data access. However, improved automated landslide information extraction processes from EO data are needed, while the dominant method is still manual delineation. Object-based image analysis (OBIA) provides the means for the fast and efficient extraction of landslide information. To prove its quality, automated results are often compared to manually delineated landslide maps. Although there is awareness of the uncertainties inherent in manual delineations, there is a lack of understanding of how they affect the levels of agreement in a direct comparison of OBIA-derived landslide maps and manually derived landslide maps. In order to provide an improved reference, we present a fuzzy approach for the manual delineation of landslides on optical satellite images, thereby making the inherent uncertainties of the delineation explicit. The fuzzy manual delineation and the OBIA classification are compared by accuracy metrics accepted in the remote sensing community. We have tested this approach for high resolution (HR) satellite images of three large landslides in Austria and Italy. We were able to show that the deviation of the OBIA result from the manual delineation can mainly be attributed to the uncertainty inherent in the manual delineation process, a relevant issue for the design of validation processes for OBIA-derived landslide maps.

  18. Solid recovered fuels in the cement industry--semi-automated sample preparation unit as a means for facilitated practical application.

    Science.gov (United States)

    Aldrian, Alexia; Sarc, Renato; Pomberger, Roland; Lorber, Karl E; Sipple, Ernst-Michael

    2016-03-01

    One of the challenges for the cement industry is the quality assurance of alternative fuel (e.g., solid recovered fuel, SRF) in co-incineration plants--especially for inhomogeneous alternative fuels with large particle sizes (d95 ⩾ 100 mm), which will gain even more importance in the substitution of conventional fuels due to low production costs. Existing standards for sampling and sample preparation do not cover the challenges resulting from these kinds of materials. A possible approach to ensure quality monitoring is shown in the present contribution. For this, a specially manufactured, automated comminution and sample divider device was installed at a cement plant in Rohožnik. In order to prove its practical suitability with methods according to current standards, the sampling and sample preparation process was validated for alternative fuel with a grain size >30 mm (i.e., d95 ≈ 100 mm), so-called 'Hotdisc SRF'. Therefore, series of samples were taken and analysed. A comparison of the analysis results with the yearly average values obtained through a reference investigation route showed good accordance. Further investigations during the validation process also showed that segregation or enrichment of material throughout the comminution plant does not occur. The results also demonstrate that compliance with legal standards regarding the minimum sample amount is not sufficient for inhomogeneous and coarse-particle-size alternative fuels. Instead, higher sample amounts after the first particle-size reduction step are strongly recommended in order to obtain a representative laboratory sample. © The Author(s) 2016.

  19. Developing a semi/automated protocol to post-process large volume, High-resolution airborne thermal infrared (TIR) imagery for urban waste heat mapping

    Science.gov (United States)

    Rahman, Mir Mustafizur

    In collaboration with The City of Calgary 2011 Sustainability Direction and as part of the HEAT (Heat Energy Assessment Technologies) project, the focus of this research is to develop a semi-automated 'protocol' to post-process large volumes of high-resolution (H-res) airborne thermal infrared (TIR) imagery to enable accurate urban waste heat mapping. HEAT is a free GeoWeb service, designed to help Calgary residents improve their home energy efficiency by visualizing the amount and location of waste heat leaving their homes and communities, as easily as clicking on their house in Google Maps. HEAT metrics are derived from 43 flight lines of TABI-1800 (Thermal Airborne Broadband Imager) data acquired on May 13-14, 2012 at night (11:00 pm-5:00 am) over The City of Calgary, Alberta (~825 km²) at a 50 cm spatial resolution and 0.05 °C thermal resolution. At present, the only way to generate a large-area, high-spatial-resolution TIR scene is to acquire separate airborne flight lines and mosaic them together. However, the ambient sensed temperature within, and between, flight lines naturally changes during acquisition (due to varying atmospheric and local micro-climate conditions), resulting in mosaicked images with different temperatures for the same scene components (e.g. roads, buildings), and mosaic join-lines arbitrarily bisect many thousands of homes. In combination these effects result in reduced utility and classification accuracy, including poorly defined HEAT metrics, inaccurate hotspot detection and raw imagery that is difficult to interpret. In an effort to minimize these effects, three new semi-automated post-processing algorithms (the protocol) are described, which are then used to generate a 43-flight-line mosaic of TABI-1800 data from which accurate Calgary waste heat maps and HEAT metrics can be generated. These algorithms (presented as four peer-reviewed papers) are: (a) Thermal Urban Road Normalization (TURN), used to mitigate the microclimatic
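
    The abstract names TURN but does not specify its mechanics. As a heavily hedged illustration of the general idea of between-flight-line normalization, the sketch below estimates each line's temperature offset from a pseudo-invariant class (road pixels) shared with a reference line and subtracts it; this is an assumption about the approach, not the published algorithm.

```python
# Hedged sketch of between-flight-line temperature normalization using a
# pseudo-invariant class (roads); NOT the actual TURN algorithm, whose
# details are not given in the abstract.
import numpy as np

def normalize_line(line_temps_c, line_road_temps_c, ref_road_temps_c):
    """Shift one flight line so its road pixels match the reference line's."""
    offset = np.mean(line_road_temps_c) - np.mean(ref_road_temps_c)
    return line_temps_c - offset

line = np.array([12.1, 13.4, 11.8])   # sensed temperatures in one line, deg C
roads_line = np.array([10.2, 10.5])   # road pixels in this line
roads_ref = np.array([9.1, 9.3])      # road pixels in the reference line
print(normalize_line(line, roads_line, roads_ref))
```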

  20. Abdominal adipose tissue quantification on water-suppressed and non-water-suppressed MRI at 3T using semi-automated FCM clustering algorithm

    Science.gov (United States)

    Valaparla, Sunil K.; Peng, Qi; Gao, Feng; Clarke, Geoffrey D.

    2014-03-01

    Accurate measurements of human body fat distribution are desirable because excessive body fat is associated with impaired insulin sensitivity, type 2 diabetes mellitus (T2DM) and cardiovascular disease. In this study, we hypothesized that the performance of water suppressed (WS) MRI is superior to non-water suppressed (NWS) MRI for volumetric assessment of abdominal subcutaneous (SAT), intramuscular (IMAT), visceral (VAT), and total (TAT) adipose tissues. We acquired T1-weighted images on a 3T MRI system (TIM Trio, Siemens), which was analyzed using semi-automated segmentation software that employs a fuzzy c-means (FCM) clustering algorithm. Sixteen contiguous axial slices, centered at the L4-L5 level of the abdomen, were acquired in eight T2DM subjects with water suppression (WS) and without (NWS). Histograms from WS images show improved separation of non-fatty tissue pixels from fatty tissue pixels, compared to NWS images. Paired t-tests of WS versus NWS showed a statistically significant lower volume of lipid in the WS images for VAT (145.3 cc less, p=0.006) and IMAT (305 cc less, p1), but not SAT (14.1 cc more, NS). WS measurements of TAT also resulted in lower fat volumes (436.1 cc less, p=0.002). There is strong correlation between WS and NWS quantification methods for SAT measurements (r=0.999), but poorer correlation for VAT studies (r=0.845). These results suggest that NWS pulse sequences may overestimate adipose tissue volumes and that WS pulse sequences are more desirable due to the higher contrast generated between fatty and non-fatty tissues.
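
    The abstract identifies the clustering method only as fuzzy c-means (FCM). A minimal, self-contained FCM sketch on 1-D pixel intensities illustrates the technique; the toy bimodal histogram and all parameters are assumptions, not the study's software or data.

```python
# Minimal fuzzy c-means (FCM) on 1-D intensities: alternate membership
# and center updates until (approximate) convergence.
import numpy as np

def fcm_1d(x, c=2, m=2.0, iters=100):
    centers = np.array([x.min(), x.max()], dtype=float)   # deterministic init
    for _ in range(iters):
        d = np.abs(x[None, :] - centers[:, None]) + 1e-12          # (c, n) distances
        u = 1.0 / ((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1))).sum(axis=1)
        centers = (u**m @ x) / (u**m).sum(axis=1)                  # weighted means
    return centers, u

rng = np.random.default_rng(0)
# Toy bimodal intensity histogram standing in for non-fat vs fat pixels
x = np.concatenate([rng.normal(100, 10, 500), rng.normal(400, 20, 200)])
centers, u = fcm_1d(x)
fat = u[np.argmax(centers)] > 0.5        # pixels assigned to the brighter class
print(centers.round(1), int(fat.sum()), "of", x.size, "pixels classified as fat")
```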

  1. A new assay for cytotoxic lymphocytes, based on a radioautographic readout of 111In release, suitable for rapid, semi-automated assessment of limit-dilution cultures

    International Nuclear Information System (INIS)

    Shortman, K.; Wilson, A.

    1981-01-01

    A new assay for cytotoxic T lymphocytes is described, of general application, but particularly suitable for rapid, semi-automated assessment of multiple microculture tests. Target cells are labelled with high efficiency and to high specific activity with the oxine chelate of 111In. After a 3-4 h incubation of test cells with 5 × 10³ labelled target cells in V wells of microtitre trays, samples of the supernatant are spotted on paper (5 μl) or transferred to soft-plastic U wells (25-50 μl) and the 111In release assessed by radioautography. Overnight exposure of X-ray film with intensifying screens at -70 °C gives an image which is an intense dark spot for maximum release, a barely visible darkening with the low spontaneous release, and a definite positive with 10% specific lysis. The degree of film darkening, which can be quantitated by microdensitometry, shows a linear relationship with cytotoxic T lymphocyte dose up to the 40% lysis level. The labelling intensity and sensitivity can be adjusted over a wide range, allowing a single batch of the short half-life isotope to serve for 2 weeks. The 96 assays from a single tray are developed simultaneously on a single small sheet of film. Many trays can be processed together, and handling is rapid if 96-channel automatic pipettors are used. The method allows rapid visual scanning for positive and negative limit-dilution cultures in cytotoxic T cell precursor frequency and specificity studies. In addition, in conjunction with an automated densitometer designed to scan microtitre trays, the method provides an efficient alternative to isotope counting in routine cytotoxic assays. (Auth.)
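
    For reference, the "% specific lysis" quoted above is conventionally computed from the experimental, spontaneous and maximum release values that the abstract refers to. This is the standard isotope-release formula, not text from the paper:

```latex
% Conventional specific-release calculation (E: experimental release,
% S: spontaneous release, M: maximum release); standard for
% isotope-release assays, not quoted from the paper.
\[
  \%\ \text{specific lysis} \;=\; 100 \times \frac{E - S}{M - S}
\]
```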

  2. MetMatch: A Semi-Automated Software Tool for the Comparison and Alignment of LC-HRMS Data from Different Metabolomics Experiments

    Directory of Open Access Journals (Sweden)

    Stefan Koch

    2016-11-01

    Full Text Available Due to its unsurpassed sensitivity and selectivity, LC-HRMS is one of the major analytical techniques in metabolomics research. However, limited stability of experimental and instrument parameters may cause shifts and drifts of retention time and mass accuracy or the formation of different ion species, thus complicating conclusive interpretation of the raw data, especially when generated in different analytical batches. Here, a novel software tool for the semi-automated alignment of different measurement sequences is presented. The tool is implemented in the Java programming language, it features an intuitive user interface and its main goal is to facilitate the comparison of data obtained from different metabolomics experiments. Based on a feature list (i.e., processed LC-HRMS chromatograms with mass-to-charge ratio (m/z values and retention times that serves as a reference, the tool recognizes both m/z and retention time shifts of single or multiple analytical datafiles/batches of interest. MetMatch is also designed to account for differently formed ion species of detected metabolites. Corresponding ions and metabolites are matched and chromatographic peak areas, m/z values and retention times are combined into a single data matrix. The convenient user interface allows for easy manipulation of processing results and graphical illustration of the raw data as well as the automatically matched ions and metabolites. The software tool is exemplified with LC-HRMS data from untargeted metabolomics experiments investigating phenylalanine-derived metabolites in wheat and T-2 toxin/HT-2 toxin detoxification products in barley.
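
    The core matching step such a tool performs can be illustrated with a short sketch (this is not MetMatch's actual implementation): features from two runs are paired when they agree within an m/z tolerance in ppm and a retention-time window.

```python
# Illustrative feature matching by m/z tolerance (ppm) and RT window;
# a generic sketch, not MetMatch's implementation.
def match_features(ref, other, ppm_tol=5.0, rt_tol=0.2):
    """ref, other: lists of (mz, rt) tuples; returns matched index pairs."""
    pairs = []
    for i, (mz_r, rt_r) in enumerate(ref):
        for j, (mz_o, rt_o) in enumerate(other):
            ppm = abs(mz_o - mz_r) / mz_r * 1e6
            if ppm <= ppm_tol and abs(rt_o - rt_r) <= rt_tol:
                pairs.append((i, j))
    return pairs

print(match_features([(181.0707, 5.31)], [(181.0710, 5.38), (182.0812, 5.31)]))
# -> [(0, 0)]: 1.7 ppm and 0.07 min apart; the second feature is rejected
```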

  3. Automated quantification of optic nerve axons in primate glaucomatous and normal eyes--method and comparison to semi-automated manual quantification.

    Science.gov (United States)

    Reynaud, Juan; Cull, Grant; Wang, Lin; Fortune, Brad; Gardiner, Stuart; Burgoyne, Claude F; Cioffi, George A

    2012-05-01

    To describe an algorithm and software application (APP) for 100% optic nerve axon counting and to compare its performance with a semi-automated manual (SAM) method in optic nerve cross-section images (images) from normal and experimental glaucoma (EG) nonhuman primate (NHP) eyes. ON cross sections from eight EG eyes from eight NHPs, five EG and five normal eyes from five NHPs, and 12 normal eyes from 12 NHPs were imaged at 100×. Calibration (n = 500) and validation (n = 50) image sets ranging from normal to end-stage damage were assembled. Correlation between APP and SAM axon counts was assessed by Deming regression within the calibration set and a compensation formula was generated to account for the subtle, systematic differences. Then, compensated APP counts for each validation image were compared with the mean and 95% confidence interval of five SAM counts of the validation set performed by a single observer. Calibration set APP counts linearly correlated to SAM counts (APP = 10.77 + 1.03 [SAM]; R² = 0.94, P < 0.0001) in normal to end-stage damage images. In the validation set, compensated APP counts fell within the 95% confidence interval of the SAM counts in 42 of the 50 images and were within 12 axons of the confidence intervals in six of the eight remaining images. Uncompensated axon density maps for the normal and EG eyes of a representative NHP were generated. An APP for 100% ON axon counts has been calibrated and validated relative to SAM counts in normal and EG NHP eyes.
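
    The published calibration (APP = 10.77 + 1.03 × SAM) suggests a simple compensation step. The sketch below assumes compensation means inverting the Deming fit, which the abstract implies but does not state explicitly:

```python
# Inverting the published Deming fit APP = 10.77 + 1.03 * SAM to map an
# automated count back to a SAM-equivalent count; assuming this is what
# "compensated APP counts" means.
def compensate(app_count, intercept=10.77, slope=1.03):
    return (app_count - intercept) / slope

print(round(compensate(52000)))   # automated count -> SAM-equivalent (~50474)
```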

  4. Automated Quantification of Optic Nerve Axons in Primate Glaucomatous and Normal Eyes—Method and Comparison to Semi-Automated Manual Quantification

    Science.gov (United States)

    Reynaud, Juan; Cull, Grant; Wang, Lin; Fortune, Brad; Gardiner, Stuart; Burgoyne, Claude F; Cioffi, George A

    2012-01-01

    Purpose. To describe an algorithm and software application (APP) for 100% optic nerve axon counting and to compare its performance with a semi-automated manual (SAM) method in optic nerve cross-section images (images) from normal and experimental glaucoma (EG) nonhuman primate (NHP) eyes. Methods. ON cross sections from eight EG eyes from eight NHPs, five EG and five normal eyes from five NHPs, and 12 normal eyes from 12 NHPs were imaged at 100×. Calibration (n = 500) and validation (n = 50) image sets ranging from normal to end-stage damage were assembled. Correlation between APP and SAM axon counts was assessed by Deming regression within the calibration set and a compensation formula was generated to account for the subtle, systematic differences. Then, compensated APP counts for each validation image were compared with the mean and 95% confidence interval of five SAM counts of the validation set performed by a single observer. Results. Calibration set APP counts linearly correlated to SAM counts (APP = 10.77 + 1.03 [SAM]; R2 = 0.94, P < 0.0001) in normal to end-stage damage images. In the validation set, compensated APP counts fell within the 95% confidence interval of the SAM counts in 42 of the 50 images and were within 12 axons of the confidence intervals in six of the eight remaining images. Uncompensated axon density maps for the normal and EG eyes of a representative NHP were generated. Conclusions. An APP for 100% ON axon counts has been calibrated and validated relative to SAM counts in normal and EG NHP eyes. PMID:22467571

  5. Mesoscale modeling of smoke transport from equatorial Southeast Asian Maritime Continent to the Philippines: First comparison of ensemble analysis with in situ observations

    Science.gov (United States)

    Ge, Cui; Wang, Jun; Reid, Jeffrey S.; Posselt, Derek J.; Xian, Peng; Hyer, Edward

    2017-05-01

    Atmospheric transport of smoke from the equatorial Southeast Asian Maritime Continent (Indonesia, Singapore, and Malaysia) to the Philippines was recently verified by the first-ever measurement of aerosol composition in the region of the Sulu Sea from a research vessel named Vasco. However, numerical modeling of such transport can have large uncertainties due to the lack of observations for parameterization schemes and for describing fire emission and meteorology in this region. These uncertainties are analyzed here, for the first time, with an ensemble of 24 Weather Research and Forecasting model with Chemistry (WRF-Chem) simulations. The ensemble reproduces the time series of surface non-sea-salt PM2.5 concentrations observed from the Vasco vessel during 17-30 September 2011 and overall agrees with satellite (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) and Moderate Resolution Imaging Spectroradiometer (MODIS)) and Aerosol Robotic Network (AERONET) data. The difference in meteorology between the National Centers for Environmental Prediction (NCEP) Final (FNL) analysis and the European Centre for Medium-Range Weather Forecasts (ECMWF) ERA renders the biggest spread in the ensemble (up to 20 μg m⁻³ or 200% in surface PM2.5), with FNL showing systematically superior results. The second biggest uncertainty is from fire emissions; the 2-day maximum Fire Locating and Modelling of Burning Emissions (FLAMBE) emission is superior to the instantaneous one. While the Grell-Devenyi (G3) and Betts-Miller-Janjić cumulus schemes only produce a difference of 3 μg m⁻³ in surface PM2.5 over the Sulu Sea, the ensemble mean agrees best with the Climate Prediction Center (CPC) MORPHing (CMORPH) spatial distribution of precipitation. The simulation with FNL-G3, 2-day maximum FLAMBE, and 800 m injection height outperforms other ensemble members. Finally, the global transport model (Navy Aerosol Analysis and Prediction System (NAAPS)) outperforms all WRF

  6. Intense mesoscale variability in the Sardinia Sea

    Science.gov (United States)

    Russo, Aniello; Borrione, Ines; Falchetti, Silvia; Knoll, Michaela; Fiekas, Heinz-Volker; Heywood, Karen; Oddo, Paolo; Onken, Reiner

    2015-04-01

    From 6 to 25 June 2014, the REP14-MED sea trial was conducted by CMRE, supported by 20 partners from six different nations. The at-sea activities were carried out onboard the research vessels Alliance (NATO) and Planet (German Ministry of Defense), covering a marine area of about 110 × 110 km² to the west of the Sardinian coast. More than 300 CTD casts, typically spaced at 10 km, were collected; both ships continuously recorded vertical profiles of currents by means of their ADCPs, and a ScanFish® and a CTD chain were towed for almost three days by Alliance and Planet, respectively, following parallel routes. Twelve gliders from different manufacturers (Slocum, SeaGliderTM and SeaExplorer) continuously sampled the study area following zonal tracks spaced at 10 km. In addition, six moorings, 17 surface drifters and one ARVOR float were deployed. From a first analysis of the observations, several mesoscale features were identified in the survey area, in particular: (i) a warm-core anticyclonic eddy in the southern part of the domain, about 50 km in diameter and with the strongest signal at about 50-m depth; (ii) another warm-core anticyclonic eddy of comparable dimensions in the central part of the domain, but extending to greater depth than the former; and (iii) a small (less than 15 km in diameter) cold-core cyclonic eddy of Winter Intermediate Water in the depth range between 170 m and 370 m. All three eddies showed intensified currents, up to 50 cm s⁻¹. The huge high-resolution observational data set and the variety of observation techniques enabled the mesoscale features and their variability to be tracked for almost three weeks. In order to obtain a deeper understanding of the mesoscale dynamics and interactions, assimilation studies with an ocean circulation model are underway.

  7. A semi-automated magnetic capture probe based DNA extraction and real-time PCR method applied in the Swedish surveillance of Echinococcus multilocularis in red fox (Vulpes vulpes) faecal samples.

    Science.gov (United States)

    Isaksson, Mats; Hagström, Åsa; Armua-Fernandez, Maria Teresa; Wahlström, Helene; Ågren, Erik Olof; Miller, Andrea; Holmberg, Anders; Lukacs, Morten; Casulli, Adriano; Deplazes, Peter; Juremalm, Mikael

    2014-12-19

    Following the first finding of Echinococcus multilocularis in Sweden in 2011, 2985 red foxes (Vulpes vulpes) were analysed by the segmental sedimentation and counting technique. This is a labour-intensive method and requires handling of the whole carcass of the fox, resulting in a costly analysis. In an effort to reduce the cost of labour and sample handling, an alternative method has been developed. The method is sensitive and partially automated for detection of E. multilocularis in faecal samples. The method has been used in the Swedish E. multilocularis monitoring program for 2012-2013 on more than 2000 faecal samples. We describe a new semi-automated magnetic capture probe DNA extraction method and real-time hydrolysis probe polymerase chain reaction assay (MC-PCR) for the detection of E. multilocularis DNA in faecal samples from red fox. The diagnostic sensitivity was determined by validating the new method against the sedimentation and counting technique in fox samples collected in Switzerland, where E. multilocularis is highly endemic. Of 177 foxes analysed by the sedimentation and counting technique, E. multilocularis was detected in 93 animals. Eighty-two (88%, 95% CI: 79.8-93.9) of these were positive in the MC-PCR. In foxes with more than 100 worms, the MC-PCR was positive in 44 out of 46 (95.7%) cases. The two MC-PCR-negative samples originated from foxes with only immature E. multilocularis worms. In foxes with 100 worms or less (n = 47), 38 (80.9%) were positive in the MC-PCR. The diagnostic specificity of the MC-PCR was evaluated using fox scats collected within the Swedish screening. Of 2158 samples analysed, two were positive. This implies that the specificity is at least 99.9% (CI: 99.7-100). The MC-PCR proved to have a high sensitivity and a very high specificity. The test is partially automated and can also be performed manually if desired. The test is well suited for nationwide E. multilocularis surveillance programs where sampling
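
    The sensitivity figure and its confidence interval follow from the counts given in the abstract (82 MC-PCR positives among 93 sedimentation-positive foxes). The sketch below uses the exact Clopper-Pearson interval, which closely reproduces the quoted 79.8-93.9%; the abstract does not state which interval method the authors used.

```python
# Diagnostic sensitivity with an exact (Clopper-Pearson) 95% CI,
# computed from the counts reported in the abstract.
from scipy.stats import beta

x, n = 82, 93                       # MC-PCR positives / sedimentation positives
sens = x / n
lo = beta.ppf(0.025, x, n - x + 1)  # lower Clopper-Pearson bound
hi = beta.ppf(0.975, x + 1, n - x)  # upper Clopper-Pearson bound
print(f"sensitivity = {sens:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
```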

  8. SU-D-16A-02: A Novel Methodology for Accurate, Semi-Automated Delineation of Oral Mucosa for Radiation Therapy Dose-Response Studies

    International Nuclear Information System (INIS)

    Dean, J; Welsh, L; Gulliford, S; Harrington, K; Nutting, C

    2014-01-01

    Purpose: The significant morbidity caused by radiation-induced acute oral mucositis means that studies aiming to elucidate dose-response relationships in this tissue are a high priority. However, there is currently no standardized method for delineating the mucosal structures within the oral cavity. This report describes the development of a methodology to delineate the oral mucosa accurately on CT scans in a semi-automated manner. Methods: An oral mucosa atlas for automated segmentation was constructed using the RayStation Atlas-Based Segmentation (ABS) module. A radiation oncologist manually delineated the full surface of the oral mucosa on a planning CT scan of a patient receiving radiotherapy (RT) to the head and neck region. A 3mm fixed annulus was added to incorporate the mucosal wall thickness. This structure was saved as an atlas template. ABS followed by model-based segmentation was performed on four further patients sequentially, adding each patient to the atlas. Manual editing of the automatically segmented structure was performed. A dose comparison between these contours and previously used oral cavity volume contours was performed. Results: The new approach was successful in delineating the mucosa, as assessed by an experienced radiation oncologist, when applied to a new series of patients receiving head and neck RT. Reductions in the mean doses obtained when using the new delineation approach, compared with the previously used technique, were demonstrated for all patients (median: 36.0%, range: 25.6% – 39.6%) and were of a magnitude that might be expected to be clinically significant. Differences in the maximum dose that might reasonably be expected to be clinically significant were observed for two patients. Conclusion: The method developed provides a means of obtaining the dose distribution delivered to the oral mucosa more accurately than has previously been achieved. This will enable the acquisition of high quality dosimetric data for use in

  9. Feasibility of a semi-automated contrast-oriented algorithm for tumor segmentation in retrospectively gated PET images: phantom and clinical validation

    Science.gov (United States)

    Carles, Montserrat; Fechter, Tobias; Nemer, Ursula; Nanko, Norbert; Mix, Michael; Nestle, Ursula; Schaefer, Andrea

    2015-12-01

    PET/CT plays an important role in radiotherapy planning for lung tumors. Several segmentation algorithms have been proposed for PET tumor segmentation. However, most of them do not take into account respiratory motion and are not well validated. The aim of this work was to evaluate a semi-automated contrast-oriented algorithm (COA) for PET tumor segmentation adapted to retrospectively gated (4D) images. The evaluation involved a wide set of 4D-PET/CT acquisitions of dynamic experimental phantoms and lung cancer patients. In addition, segmentation accuracy of 4D-COA was compared with four other state-of-the-art algorithms. In phantom evaluation, the physical properties of the objects defined the gold standard. In clinical evaluation, the ground truth was estimated by the STAPLE (Simultaneous Truth and Performance Level Estimation) consensus of three manual PET contours by experts. Algorithm evaluation with phantoms resulted in: (i) no statistically significant diameter differences for different targets and movements (Δφ = 0.3 ± 1.6 mm); (ii) reproducibility for heterogeneous and irregular targets independent of user initial interaction; and (iii) good segmentation agreement for irregular targets compared to manual CT delineation in terms of Dice Similarity Coefficient (DSC = 0.66 ± 0.04), Positive Predictive Value (PPV = 0.81 ± 0.06) and Sensitivity (Sen. = 0.49 ± 0.05). In clinical evaluation, the segmented volume was in reasonable agreement with the consensus volume (difference in volume (%Vol) = 40 ± 30, DSC = 0.71 ± 0.07 and PPV = 0.90 ± 0.13). High accuracy in target tracking position (ΔME) was obtained for experimental and clinical data (ΔME_exp = 0 ± 3 mm; ΔME_clin = 0.3 ± 1.4 mm). In the comparison with other lung segmentation methods, 4D-COA has shown the highest volume accuracy in both experimental and clinical data. In conclusion, the accuracy in volume
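
    The three overlap metrics used in the evaluation are all derived from the voxel-wise confusion between segmentation and reference. A generic sketch (not the authors' code):

```python
# DSC, PPV and sensitivity from binary masks; generic overlap metrics,
# not the authors' implementation.
import numpy as np

def overlap_metrics(seg, ref):
    tp = np.logical_and(seg, ref).sum()
    dsc = 2 * tp / (seg.sum() + ref.sum())   # Dice Similarity Coefficient
    ppv = tp / seg.sum()                     # Positive Predictive Value
    sen = tp / ref.sum()                     # Sensitivity
    return dsc, ppv, sen

seg = np.zeros((10, 10), bool); seg[2:7, 2:7] = True   # toy segmentation
ref = np.zeros((10, 10), bool); ref[3:8, 3:8] = True   # toy reference
print(overlap_metrics(seg, ref))                       # (0.64, 0.64, 0.64)
```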

  10. Mesoscale Modeling, Forecasting and Remote Sensing Research.

    Science.gov (United States)

    remote sensing, cyclonic-scale diagnostic studies and mesoscale numerical modeling and forecasting are summarized. Mechanisms involved in the release of potential instability are discussed and simulated quantitatively, giving particular attention to the convective formulation. The basic mesoscale model is documented, including the equations, boundary conditions, finite differences and initialization through an idealized frontal zone. Results of tests including a three-dimensional test with real data, tests of convective/mesoscale interaction and tests with a detailed

  11. Wake modelling combining mesoscale and microscale models

    DEFF Research Database (Denmark)

    Badger, Jake; Volker, Patrick; Prospathospoulos, J.

    2013-01-01

    In this paper the basis for introducing thrust information from microscale wake models into mesoscale model wake parameterizations will be described. A classification system for the different types of mesoscale wake parameterizations is suggested and outlined. Four different mesoscale wake paramet...

  12. Parameterization of Mixed Layer and Deep-Ocean Mesoscales Including Nonlinearity

    Science.gov (United States)

    Canuto, V. M.; Cheng, Y.; Dubovikov, M. S.; Howard, A. M.; Leboissetier, A.

    2018-01-01

    In 2011, Chelton et al. carried out a comprehensive census of mesoscales using altimetry data and reached the following conclusions: "essentially all of the observed mesoscale features are nonlinear" and "mesoscales do not move with the mean velocity but with their own drift velocity," which is "the most germane of all the nonlinear metrics." Accounting for these results in a mesoscale parameterization presents conceptual and practical challenges, since linear analysis is no longer usable and one needs a model of nonlinearity. A mesoscale parameterization is presented that has the following features: 1) it is based on the solutions of the nonlinear mesoscale dynamical equations, 2) it describes arbitrary tracers, 3) it includes adiabatic (A) and diabatic (D) regimes, 4) the eddy-induced velocity is the sum of a Gent and McWilliams (GM) term plus a new term representing the difference between drift and mean velocities, 5) the new term lowers the transfer of mean potential energy to mesoscales, 6) the isopycnal slopes are not as flat as in the GM case, 7) deep-ocean stratification is enhanced compared to previous parameterizations, where being more weakly stratified allowed a large heat uptake that is not observed, 8) the strength of the Deacon cell is reduced. The numerical results are from a stand-alone ocean code with Coordinated Ocean-Ice Reference Experiment I (CORE-I) normal-year forcing.
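
    Feature 4 can be written schematically. The sketch below assumes the conventional GM form of the eddy-induced streamfunction and renders the new contribution as an additive drift-minus-mean term; it is a reading of the abstract's structure, not the authors' exact equations:

```latex
% Schematic of feature (4): kappa is the eddy diffusivity, S the
% isopycnal slope, u_d the eddy drift velocity and \bar{u} the mean
% velocity. A sketch under stated assumptions, not the paper's equations.
\[
  \psi^{*}_{\mathrm{GM}} = \kappa\, S , \qquad
  \mathbf{u}^{+} \;=\; \mathbf{u}^{*}_{\mathrm{GM}}
    \;+\; \bigl(\mathbf{u}_{d} - \overline{\mathbf{u}}\bigr)
\]
```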

  13. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
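
    How bagged decision trees yield a loss distribution rather than a point estimate can be sketched by querying each tree in the ensemble separately. The data below are synthetic; BT-FLEMO's real predictors (water depth, duration, building characteristics, etc.) and training data are not reproduced here.

```python
# Sketch: per-tree predictions of a bagging ensemble as an empirical
# loss distribution. Synthetic data, illustrative only.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 3, (200, 2))                                 # e.g. depth, duration
y = 0.2 * X[:, 0] + 0.05 * X[:, 1] + rng.normal(0, 0.02, 200)   # relative loss

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100).fit(X, y)
per_tree = np.array([t.predict([[1.5, 1.0]]) for t in model.estimators_]).ravel()
print(f"median loss {np.median(per_tree):.3f}, "
      f"90% band {np.percentile(per_tree, 5):.3f}-{np.percentile(per_tree, 95):.3f}")
```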

  14. Potencial da técnica in vitro semi-automática de produção de gases para avaliação de silagens de sorgo (Sorghum bicolor (L. Moench

    Directory of Open Access Journals (Sweden)

    Maurício Rogério Martins

    2003-01-01

    Full Text Available The potential of the semi-automated in vitro gas production technique was studied through the evaluation of silages from four sorghum hybrids (BR700, BR701, BR601 and AG2002). The results of this experiment were compared with those obtained in an apparent digestibility experiment. The relationship between the dry matter digestibility obtained by the gas production technique after 96 hours of fermentation (DMD) and the apparent DM digestibility was represented by the equation: in vivo digestibility (g/kg) = 0.46 × DMD (g/kg) + 361.08 (r² = 0.97). The semi-automated in vitro gas production technique precisely estimated the apparent DM digestibility values of the silages evaluated in this experiment. In addition, it provided further information on the ruminal fermentation kinetics of the silages and the effective degradability of dry matter at different passage rates. The superior gas production rate (%/h) of hybrid BR601 (0.056) relative to BR700 (0.051), BR701 (0.044) and AG2002 (0.045) is correlated with the higher DMD of the material (649, 598, 601 and 593 g/kg, respectively). Thus, the semi-automated in vitro gas production technique was able to select hybrid BR601, in terms of digestibility and ruminal fermentation kinetics, as the most promising for use in ruminant feeding, thereby demonstrating its potential for the evaluation of sorghum silages.
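
    Applying the reported regression to the BR601 silage makes the conversion concrete (a worked example using only numbers from the abstract):

```python
# Applying the regression reported in the abstract to the BR601 silage:
# in vivo digestibility (g/kg) = 0.46 * DMD_gas (g/kg) + 361.08
dmd_gas = 649                         # 96-h gas-technique DMD for BR601, g/kg
in_vivo = 0.46 * dmd_gas + 361.08
print(f"predicted in vivo DM digestibility: {in_vivo:.0f} g/kg")   # ~660 g/kg
```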

  15. Meso-scale on-road vehicle emission inventory approach: a study on Dhaka City of Bangladesh supporting the 'cause-effect' analysis of the transport system.

    Science.gov (United States)

    Iqbal, Asif; Allan, Andrew; Zito, Rocco

    2016-03-01

    The study aims to develop an emission inventory (EI) approach and conduct an inventory for vehicular sources in Dhaka City, Bangladesh. A meso-scale modelling approach was adopted for the inventory; the factors that influence the emissions and the magnitude of emission variation were identified and reported, which is an innovative way of accounting for emissions compared with conventional inventory approaches. Two techniques for the emission inventory were applied, viz. (i) a combined top-down and bottom-up approach that considered the total vehicle population and the average diurnal on-road vehicle speed profile in the city and (ii) a bottom-up approach that accounted for road link-specific emissions of the city, considering diurnal traffic volume and speed profiles of the respective roads. For the bottom-up approach, road link-specific detailed data were obtained through a field survey in 2012, in which mid-block traffic counts of the day, vehicle speed profiles, road network and congestion data were principally collected. The emission variances for changes in transport system characteristics (such as change in fuel type, AC usage pattern, increased speed and reduced congestion/stopping) were predicted and analysed in this study; congestion-influenced average speed of the vehicles and fuel types in the vehicles were identified as the major stressors. The study performance was considered reasonable when compared with the limited number of similar studies conducted earlier. Given the increasing trend of private vehicles each year, coupled with increasing traffic congestion, the city is under threat of increased vehicular emissions unless a good management strategy is implemented. Although the inventory was conducted for Dhaka and the result may be important locally, the approach adopted in this research is innovative and can be followed in studies of other urban transport systems.
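
    A generic bottom-up link calculation of the kind described multiplies hourly traffic volume, link length and a speed-dependent emission factor. The factors in the sketch below are placeholders, not values from the study:

```python
# Generic bottom-up link emission calculation: volume x length x
# speed-dependent emission factor. EF values are illustrative only.
def link_emissions_g(volume_veh_h, length_km, speed_kmh, ef_curve):
    return volume_veh_h * length_km * ef_curve(speed_kmh)

ef_co = lambda v: 20.0 if v < 20 else 8.0     # g/veh-km; congested vs free-flow
print(link_emissions_g(1200, 2.5, 15, ef_co), "g CO on this link in one hour")
```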

  16. Meso-scale wind variability. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, S.; Larsen, X.; Vincent, C.; Soerensen, P.; Pinson, P.; Trombe, P.-J.; Madsen, H.; Cutululis, N.

    2011-11-15

    The project aimed to characterize mesoscale meteorological phenomena for the North Sea and the inner Danish waters, and additionally to improve the predictability and quality of the power production from offshore wind farms. The mesoscale meteorology has been characterized with respect to the physical processes, climatology, spectral characteristics and correlation properties, based on measurements from wind farms, satellite data (SAR) and mesoscale numerical modeling (WRF). The ability of the WRF model to characterize and predict relevant mesoscale phenomena has been demonstrated. Additionally, the application of statistical forecasting, using a Markov-switching approach that can be related to the meteorological conditions, to analyze and make short-term predictions of the power production from offshore wind farms has been documented. Two PhD studies have been conducted in connection with the project. The project was a cooperative project between Risoe DTU, IMM DTU, DONG Energy, Vattenfall and VESTAS. It is registered as Energinet.dk project no. 2007-1-7141. (Author)

  17. Error Covariance Estimation of Mesoscale Data Assimilation

    National Research Council Canada - National Science Library

    Xu, Qin

    2005-01-01

    The goal of this project is to explore and develop new methods of error covariance estimation that will provide necessary statistical descriptions of prediction and observation errors for mesoscale data assimilation...

  18. Improvement of Hydrological Simulations by Applying Daily Precipitation Interpolation Schemes in Meso-Scale Catchments

    Directory of Open Access Journals (Sweden)

    Mateusz Szcześniak

    2015-02-01

    Full Text Available Ground-based precipitation data are still the dominant input type for hydrological models. Spatial variability in precipitation can be represented by spatially interpolating gauge data using various techniques. In this study, the effect of daily precipitation interpolation methods on discharge simulations using the semi-distributed SWAT (Soil and Water Assessment Tool) model over a 30-year period is examined. The study was carried out in 11 meso-scale (119–3935 km²) sub-catchments lying in the Sulejów reservoir catchment in central Poland. Four methods were tested: the default SWAT method (Def) based on the Nearest Neighbour technique, Thiessen Polygons (TP), Inverse Distance Weighted (IDW) and Ordinary Kriging (OK). The evaluation of methods was performed using a semi-automated calibration program SUFI-2 (Sequential Uncertainty Fitting Procedure Version 2) with two objective functions: Nash-Sutcliffe Efficiency (NSE) and the adjusted R² coefficient (bR²). The results show that: (1) the most complex OK method outperformed other methods in terms of NSE; and (2) OK, IDW, and TP outperformed Def in terms of bR². The median difference in daily/monthly NSE between OK and Def/TP/IDW calculated across all catchments ranged between 0.05 and 0.15, while the median difference between TP/IDW/OK and Def ranged between 0.05 and 0.07. The differences between pairs of interpolation methods were, however, spatially variable and a part of this variability was attributed to catchment properties: catchments characterised by low station density and low coefficient of variation of daily flows experienced more pronounced improvement resulting from using interpolation methods. Methods providing higher precipitation estimates often resulted in a better model performance. The implication from this study is that appropriate consideration of spatial precipitation variability (often neglected by model users) that can be achieved using relatively simple interpolation methods can
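
    IDW, one of the compared schemes, weights each gauge by an inverse power of its distance to the target point. A minimal sketch with the common power parameter p = 2 (the study's exact settings are not given in the abstract):

```python
# Inverse Distance Weighted (IDW) interpolation of daily gauge
# precipitation to a target point; power parameter p = 2 assumed.
import numpy as np

def idw(target_xy, gauge_xy, gauge_p, power=2.0):
    d = np.linalg.norm(gauge_xy - target_xy, axis=1)
    if np.any(d == 0):                 # target coincides with a gauge
        return gauge_p[np.argmin(d)]
    w = d ** -power
    return np.sum(w * gauge_p) / np.sum(w)

gauges = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])   # gauge coordinates, km
precip = np.array([4.0, 8.0, 2.0])                          # mm/day at each gauge
print(f"{idw(np.array([3.0, 3.0]), gauges, precip):.2f} mm/day")
```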

  19. A Mesoscale Analysis of Column-Integrated Aerosol Properties in Northern India During the TIGERZ 2008 Pre-Monsoon Period and a Comparison to MODIS Retrievals

    Science.gov (United States)

    Giles, D. M.; Holben, B. N.; Tripathi, S. N.; Eck, T. F.; Newcomb, W. W.; Slutsker, I.; Dickerson, R. R.; Thompson, A. M.; Wang, S.-H.; Singh, R. P.; et al.

    2010-01-01

    The Indo-Gangetic Plain (IGP) of the northern Indian subcontinent produces anthropogenic pollution from urban, industrial and rural combustion sources nearly continuously and is affected by convection-induced winds driving desert and alluvial dust into the atmosphere during the pre-monsoon period. Within the IGP, the NASA Aerosol Robotic Network (AERONET) project initiated the TIGERZ measurement campaign in May 2008 with an intensive operational period from May 1 to June 23, 2008. Mesoscale spatial variability of aerosol optical depth (AOD, τ) measurements at 500 nm was assessed at sites around Kanpur, India, with averages ranging from 0.31 to 0.89 for spatial variability study (SVS) deployments. Sites located downwind from the city of Kanpur indicated slightly higher average aerosol optical depth (Δτ₅₀₀ = 0.03-0.09). In addition, SVS AOD area-averages were compared to the long-term Kanpur AERONET site data: four SVS area-averages were within ±1σ of the climatological mean of the Kanpur site, while one SVS was within 2σ below climatology. For a SVS case using AERONET inversions, the 440-870 nm Ångström exponent of approximately 0.38, the 440-870 nm absorption Ångström exponent (AAE) of 1.15-1.53, and the sphericity parameter near zero suggested the occurrence of large, strongly absorbing, non-spherical aerosols over Kanpur (e.g., mixed black carbon and dust) as well as stronger absorption downwind of Kanpur. Furthermore, the 3 km and 10 km Terra and Aqua MODIS C005 aerosol retrieval algorithms at τ₅₅₀ were compared to the TIGERZ data set. Although MODIS retrievals at higher quality levels were comparable to the MODIS retrieval uncertainty, the total number of MODIS matchups (N) was reduced at subsequent quality levels (N=25, QA≥0; N=9, QA≥1; N=6, QA≥2; N=1, QA=3) over Kanpur during the pre-monsoon, primarily due to the semi-bright surface, complex aerosol mixture and cloud-contaminated pixels. The TIGERZ 2008 data set provided a unique
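
    The Ångström exponent quoted above is defined from AODs at two wavelengths; for the 440-870 nm pair:

```latex
% Angstrom exponent from the 440 and 870 nm AODs.
\[
  \alpha_{440\text{--}870}
    = -\,\frac{\ln\left(\tau_{440}/\tau_{870}\right)}
              {\ln\left(440/870\right)}
\]
% Illustrative numbers (not from the study): tau_440 = 0.90 and
% tau_870 = 0.70 give alpha ~ 0.37, of the order of the ~0.38 reported.
```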

  20. Mesoscale Climate Evaluation Using Grid Computing

    Science.gov (United States)

    Campos Velho, H. F.; Freitas, S. R.; Souto, R. P.; Charao, A. S.; Ferraz, S.; Roberti, D. R.; Streck, N.; Navaux, P. O.; Maillard, N.; Collischonn, W.; Diniz, G.; Radin, B.

    2012-04-01

    The CLIMARS project is focused on establishing an operational environment for seasonal climate prediction for the Rio Grande do Sul state, Brazil. The dynamical downscaling will be performed with the use of several software platforms and hardware infrastructure to carry out the investigation, on the mesoscale, of the global-change impact. Grid computing takes advantage of geographically spread-out computer systems, connected by the internet, to enhance the power of computation. Ensemble climate prediction is an appropriate application for processing on grid computing, because the integration of each ensemble member does not depend on information from other ensemble members. The grid processing is employed to compute the 20-year climatology and the long-range simulations under the ensemble methodology. BRAMS (Brazilian Regional Atmospheric Model) is a mesoscale model developed from a version of RAMS (from the Colorado State University - CSU, USA). The BRAMS model is the tool for carrying out the dynamical downscaling from the IPCC scenarios. Long-range BRAMS simulations will provide data for climate (data) analysis and supply data for the numerical integration of different models: (a) regime of extreme events for temperature and precipitation fields: statistical analysis will be applied to the BRAMS data; (b) CCATT-BRAMS (Coupled Chemistry Aerosol Tracer Transport - BRAMS) is an environmental prediction system that will be used to evaluate whether the new patterns of temperature, rain regime, and wind field have a significant impact on pollutant dispersion in the analyzed regions; (c) MGB-IPH (MGB is the Portuguese acronym for the Large Basin Model, developed by the Hydraulic Research Institute (IPH) of the Federal University of Rio Grande do Sul (UFRGS), Brazil) will be employed to simulate the alteration of river flows under the new climate patterns. Important meteorological input variables for the MGB-IPH are the precipitation (most relevant

  1. The integration of the terrestrial and airborne laser scanning technologies in the semi-automated process of retrieving selected trees and forest stand parameters

    Directory of Open Access Journals (Sweden)

    Piotr Wezyk

    2012-10-01

    quantity of the timber stock or another selected parameter of the stand and the tree. One promising remote sensing technology is LiDAR, collecting 3D point cloud data. The TLS technology is very precise and fast, but limited to relatively small areas such as forest inventory plots. ALS is more focused on data collection over large areas. Both technologies are complementary; therefore, fusion of the two information sources is needed to increase the accuracy of tree parameters and to extend the results to large forest areas with statistical models. The article presents a method for registration and transformation of the TLS and ALS point clouds into one coordinate system. The objective of the data fusion was the semi-automatic extraction of selected tree parameters (height, DBH, basal area, crown density, crown base, 2D and 3D area of the tree crown) in the TR2 transect of the Niepolomice Forest (Krakow, Poland). The results showed that there is great potential for improving the estimates of height and of crown density or crown base.

  2. Laser guidance of mesoscale particles

    Science.gov (United States)

    Underdown, Frank Hartman, Jr.

    Mesoscale particles are guided and trapped in hollow optical fibers using radiation-pressure forces. Laser light from a 0.4 W, 780 nm diode laser is guided in a low-loss fiber mode and used to generate the guidance forces. Laser scattering and absorption forces propel particles along the fiber, and polarization gradient forces attract them to the fiber's axial center. Using two counter-propagating laser beams inside the fiber, particles can be trapped in three dimensions. Measuring the spring constant of the trap gives the gradient force. This dissertation describes Rayleigh and Mie scattering models for calculating guidance forces. Calculated forces as a function of particle size and composition (i.e. dielectric, semiconductor, and metals) will be presented. For example, under typical experimental conditions 100 nm Au particles are guided by a 2 × 10⁻¹⁴ N propulsive force in a water-filled fiber. In comparison, the measured force, obtained from the particle's velocity and Stokes' law, is 7.98 × 10⁻¹⁴ N.
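
    The force-from-velocity step in the last sentence is plain Stokes drag, F = 6πηrv. A minimal sketch; the viscosity is for water near room temperature and the velocity is back-calculated to reproduce the reported force, so it is illustrative rather than measured data:

```python
# Stokes-drag force on a 100 nm (diameter) Au particle in water.
import math

eta = 8.9e-4     # Pa*s, dynamic viscosity of water near 25 C
r = 50e-9        # m, particle radius
v = 9.5e-5       # m/s, assumed particle velocity (illustrative)
F = 6 * math.pi * eta * r * v
print(f"F = {F:.2e} N")   # ~7.97e-14 N, matching the reported value
```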

  3. Contribution of mesoscale eddies to Black Sea ventilation

    Science.gov (United States)

    Capet, Arthur; Mason, Evan; Pascual, Ananda; Grégoire, Marilaure

    2017-04-01

    The shoaling of the Black Sea oxycline is one of the most urgent environmental issues in the Black Sea. The permanent oxycline derives directly from the Black Sea's permanent stratification and has shoaled alarmingly in recent decades, due to a shifting balance between oxygen consumption and ventilation processes (Capet et al. 2016). Understanding this balance is thus of the utmost importance and requires quantifying (1) the export of nutrients and organic materials from the shelf regions to the open sea and (2) the ventilation processes. Since these two processes are influenced by mesoscale features, it is critical to understand the role of the semi-permanent mesoscale structures in horizontal (center/periphery) and vertical (diapycnal and isopycnal) exchanges. A useful insight can be obtained by merging observations from satellite altimetry and in situ profilers (ARGO). In such composite analyses, eddies are first automatically identified and tracked from altimeter data (Mason et al. 2014, py-eddy-tracker). Vertical ARGO profiles are then expressed in terms of their position relative to eddy centers and radii. The derived statistics indicate how consistently mesoscale eddies alter the vertical structure, and provide a deeper understanding of the associated horizontal and vertical fluxes. However, this data-based approach is limited in the Black Sea due to the lower quality of gridded altimetric products in the vicinity of the coast, where semi-permanent mesoscale structures prevail. To complement the difficult analysis of this sparse dataset, a compositing methodology is also applied to model outputs from the 5 km GHER-BHAMBI Black Sea implementation (CMEMS BS-MFC). Characteristic biogeochemical anomalies associated with eddies in the model are analyzed per se and compared to the observation-based analysis. Capet, A., Stanev, E. V., Beckers, J.-M., Murray, J. W., and Grégoire, M.: Decline of the Black Sea oxygen inventory, Biogeosciences, 13, 1287-1297, doi:10

  4. Thermally forced mesoscale atmospheric flow over complex terrain in Southern Italy

    International Nuclear Information System (INIS)

    Baldi, M.; Colacino, M.; Dalu, G. A.; Piervitali, E.; Ye, Z.

    1998-01-01

    In this paper the Authors discuss some results concerning the analysis of the local atmospheric flow over the southern part of Italy, the peninsula of Calabria, using a mesoscale numerical model. Our study is focused on two different but related topics: a detailed analysis of the meteorology and climate of the region based on a data collection, reported in Colacino et al., 'Elementi di Climatologia della Calabria', edited by A. Guerrini, in the series P. S., 'Clima, Ambiente e Territorio nel Mezzogiorno' (CNR, Rome) 1997, pp. 218, and an analysis of the results based on the simulated flow produced using a mesoscale numerical model. The Colorado State University mesoscale numerical model has been applied to study several different climatic situations of particular interest for the region, as discussed in this paper

  5. Thermally forced mesoscale atmospheric flow over complex terrain in Southern Italy

    Energy Technology Data Exchange (ETDEWEB)

    Baldi, M.; Colacino, M.; Dalu, G. A.; Piervitali, E.; Ye, Z. [CNR, Rome (Italy). Ist. di Fisica dell`Atmosfera

    1998-07-01

    In this paper the Authors discuss some results concerning the analysis of the local atmospheric flow over the southern part of Italy, the peninsula of Calabria, using a mesoscale numerical model. Our study is focused on two different but related topics: a detailed analysis of the meteorology and climate of the region based on a data collection, reported in Colacino et al., 'Elementi di Climatologia della Calabria', edited by A. Guerrini, in the series P. S., 'Clima, Ambiente e Territorio nel Mezzogiorno' (CNR, Rome) 1997, pp. 218, and an analysis of the results based on the simulated flow produced using a mesoscale numerical model. The Colorado State University mesoscale numerical model has been applied to study several different climatic situations of particular interest for the region, as discussed in this paper.

  6. Cardiopulmonary resuscitation with semi-automated external defibrillator: assessment of the teaching-learning process

    Directory of Open Access Journals (Sweden)

    Ana Maria Kazue Miyadahira

    2008-09-01

    Full Text Available Studies show that survival after cardiac arrest decreases by 10% for each minute of delay in defibrillation, and that the survival rate is 98% when defibrillation is achieved within 30 seconds. In responding to a cardiac arrest, it is essential that training include the use of semi-automated external defibrillators (AED). The objective of this study was to compare the psychomotor skills and theoretical knowledge of laypersons in the cardiopulmonary resuscitation (CPR) technique using the AED, before and after training. The sample consisted of 40 administrative employees of a public institution who received laboratory training in the CPR technique using the AED. The significant increase in correct answers on the items of the instrument assessing psychomotor skill and theoretical knowledge after training indicates that the participants' performance in carrying out CPR with the AED improved.

  7. The evaluation of a deformable image registration segmentation technique for semi-automating internal target volume (ITV) production from 4DCT images of lung stereotactic body radiotherapy (SBRT) patients

    International Nuclear Information System (INIS)

    Speight, Richard; Sykes, Jonathan; Lindsay, Rebecca; Franks, Kevin; Thwaites, David

    2011-01-01

    Purpose: To evaluate a deformable image registration (DIR) segmentation technique for semi-automating ITV production from 4DCT for lung patients, in terms of accuracy and efficiency. Methods: Twenty-five stereotactic body radiotherapy lung patients were selected in this retrospective study. ITVs were manually delineated by an oncologist and semi-automatically produced by propagating the GTV manually delineated on the mid-ventilation phase to all other phases using two different DIR algorithms, using commercial software. The two ITVs produced by DIR were compared to the manually delineated ITV using the dice similarity coefficient (DSC), mean distance to agreement and normalised DSC. DIR-produced ITVs were assessed for their clinical suitability and the time savings were estimated. Results: Eighteen out of 25 ITVs had normalised DSC > 1, indicating agreement with the manually produced ITV within 1 mm uncertainty. Four of the other seven ITVs were deemed clinically acceptable and three would require a small amount of editing. In general, ITVs produced by DIR were smoother than those produced by manual delineation. It was estimated that using this technique would save clinicians on average 28 min/patient. Conclusions: ABAS, the commercial DIR software used, was found to be a useful tool in the production of ITVs for lung patients. The ITVs produced are either immediately clinically acceptable or require minimal editing. This approach represents a significant time saving for clinicians.
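
    For reference, the dice similarity coefficient used above to score DIR-produced ITVs against the manual ones reduces to a short computation on binary masks. This is a generic sketch with hypothetical voxel-mask inputs, not the commercial software's implementation.

    ```python
    import numpy as np

    def dice_similarity(mask_a, mask_b):
        """Dice similarity coefficient between two binary volumes.

        mask_a, mask_b: boolean arrays of the same shape (e.g. voxelized ITVs).
        Returns 2|A∩B| / (|A| + |B|), i.e. 1.0 for identical contours.
        """
        a = np.asarray(mask_a, dtype=bool)
        b = np.asarray(mask_b, dtype=bool)
        intersection = np.logical_and(a, b).sum()
        denom = a.sum() + b.sum()
        return 2.0 * intersection / denom if denom else np.nan
    ```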

  8. A comparison of six software packages for evaluation of solid lung nodules using semi-automated volumetry: What is the minimum increase in size to detect growth in repeated CT examinations

    International Nuclear Information System (INIS)

    Hoop, Bartjan de; Gietema, Hester; Prokop, Mathias; Ginneken, Bram van; Zanen, Pieter; Groenewegen, Gerard

    2009-01-01

    We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In cases of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five of the six software packages was significantly less for nodules ≥8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages. (orig.)
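
    The variability metric reported here (the upper limit of the 95% confidence interval of the Bland-Altman plot for a zero-growth repeat scan) can be sketched as follows; the relative-difference formulation is an assumption consistent with the abstract, not code from the study.

    ```python
    import numpy as np

    def volumetry_variability(vol_scan1, vol_scan2):
        """Upper 95% limit of agreement for relative volume differences.

        vol_scan1, vol_scan2: arrays of nodule volumes from the two repeat CTs.
        Differences are expressed relative to the mean volume, Bland-Altman
        style; apparent growth beyond the returned percentage cannot be
        explained by measurement variability alone.
        """
        v1, v2 = np.asarray(vol_scan1, float), np.asarray(vol_scan2, float)
        rel_diff = 100.0 * (v2 - v1) / ((v1 + v2) / 2.0)
        return np.mean(rel_diff) + 1.96 * np.std(rel_diff, ddof=1)
    ```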

  9. Wind-Farm Parametrisations in Mesoscale Models

    DEFF Research Database (Denmark)

    Volker, Patrick; Badger, Jake; Hahmann, Andrea N.

    2013-01-01

    In this paper we compare three wind-farm parametrisations for mesoscale models against measurement data from the Horns Rev I offshore wind-farm. The parametrisations vary from a simple rotor drag method to more sophisticated models. In addition to (4), we investigated the horizontal resolution dep...

  10. Optimized Heart Sampling and Systematic Evaluation of Cardiac Therapies in Mouse Models of Ischemic Injury: Assessment of Cardiac Remodeling and Semi-Automated Quantification of Myocardial Infarct Size.

    Science.gov (United States)

    Valente, Mariana; Araújo, Ana; Esteves, Tiago; Laundos, Tiago L; Freire, Ana G; Quelhas, Pedro; Pinto-do-Ó, Perpétua; Nascimento, Diana S

    2015-12-02

    Cardiac therapies are commonly tested preclinically in small-animal models of myocardial infarction. Following functional evaluation, post-mortem histological analysis is essential to assess morphological and molecular alterations underlying the effectiveness of treatment. However, non-methodical and inadequate sampling of the left ventricle often leads to misinterpretations and variability, making direct study comparisons unreliable. Protocols are provided for representative sampling of the ischemic mouse heart followed by morphometric analysis of the left ventricle. Extending the use of this sampling to other types of in situ analysis is also illustrated through the assessment of neovascularization and cellular engraftment in a cell-based therapy setting. This is of interest to the general cardiovascular research community as it details methods for standardization and simplification of histo-morphometric evaluation of emergent heart therapies. © 2015 John Wiley & Sons, Inc.

  11. A three-dimensional image processing program for accurate, rapid, and semi-automated segmentation of neuronal somata with dense neurite outgrowth

    Science.gov (United States)

    Ross, James D.; Cullen, D. Kacy; Harris, James P.; LaPlaca, Michelle C.; DeWeerth, Stephen P.

    2015-01-01

    Three-dimensional (3-D) image analysis techniques provide a powerful means to rapidly and accurately assess complex morphological and functional interactions between neural cells. Current software-based identification methods of neural cells generally fall into two applications: (1) segmentation of cell nuclei in high-density constructs or (2) tracing of cell neurites in single cell investigations. We have developed novel methodologies to permit the systematic identification of populations of neuronal somata possessing rich morphological detail and dense neurite arborization throughout thick tissue or 3-D in vitro constructs. The image analysis incorporates several novel automated features for the discrimination of neurites and somata by initially classifying features in 2-D and merging these classifications into 3-D objects; the 3-D reconstructions automatically identify and adjust for over and under segmentation errors. Additionally, the platform provides for software-assisted error corrections to further minimize error. These features attain very accurate cell boundary identifications to handle a wide range of morphological complexities. We validated these tools using confocal z-stacks from thick 3-D neural constructs where neuronal somata had varying degrees of neurite arborization and complexity, achieving an accuracy of ≥95%. We demonstrated the robustness of these algorithms in a more complex arena through the automated segmentation of neural cells in ex vivo brain slices. These novel methods surpass previous techniques by improving the robustness and accuracy by: (1) the ability to process neurites and somata, (2) bidirectional segmentation correction, and (3) validation via software-assisted user input. This 3-D image analysis platform provides valuable tools for the unbiased analysis of neural tissue or tissue surrogates within a 3-D context, appropriate for the study of multi-dimensional cell-cell and cell-extracellular matrix interactions.

  12. Simple quasi-analytical holonomic homogenization model for the non-linear analysis of in-plane loaded masonry panels: Part 1, meso-scale

    Science.gov (United States)

    Milani, G.; Bertolesi, E.

    2017-07-01

    A simple quasi-analytical holonomic homogenization approach for the non-linear analysis of in-plane loaded masonry walls is presented. The elementary cell (REV) is discretized with 24 triangular elastic constant-stress elements (bricks) and non-linear interfaces (mortar). A holonomic behavior with softening is assumed for the mortar. It is shown how the mechanical problem in the unit cell is characterized by very few displacement variables and how the homogenized stress-strain behavior can be evaluated semi-analytically.

  13. Development of an unbiased, semi-automated approach for classifying plasma cell immunophenotype following multicolor flow cytometry of bone marrow aspirates.

    Science.gov (United States)

    Post, Steven R; Post, Ginell R; Nikolic, Dejan; Owens, Rebecca; Insuasti-Beltran, Giovanni

    2018-03-24

    Despite increased usage of multiparameter flow cytometry (MFC) to assess diagnosis, prognosis, and therapeutic efficacy (minimal residual disease, MRD) in plasma cell neoplasms (PCNs), standardization of methodology and data analysis is suboptimal. We investigated the utility of using the mean and median fluorescence intensities (FI) obtained from MFC to objectively describe parameters that distinguish plasma cell (PC) phenotypes. In this retrospective study, flow cytometry results from bone marrow aspirate specimens from 570 patients referred to the Myeloma Institute at UAMS were evaluated. Mean and median FI data were obtained from 8-color MFC of non-neoplastic, malignant, and mixed PC populations using antibodies to CD38, CD138, CD19, CD20, CD27, CD45, CD56, and CD81. Of 570 cases, 252 showed only non-neoplastic PCs, 168 showed only malignant PCs, and 150 showed mixed PC populations. Statistical analysis of median FI data for each CD marker showed no difference in expression intensity on non-neoplastic and malignant PCs between pure and mixed PC populations. ROC analysis of the median FI of CD expression in non-neoplastic and malignant PCs was used to develop an algorithm to convert quantitative FI values to qualitative assessments including "negative," "positive," "dim," and "heterogeneous" expression. FI data derived from 8-color MFC can be used to define marker expression on PCs. Translation of FI data from Infinicyt software to an Excel worksheet streamlines workflow and eliminates transcriptional errors when generating flow reports. © 2018 International Clinical Cytometry Society.
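
    A minimal sketch of the final classification step, converting a marker's median FI into "negative"/"dim"/"positive"/"heterogeneous" calls. The cut-off values and the spread-based heterogeneity rule below are invented placeholders; the study's actual thresholds were derived from ROC analysis of the 570-patient cohort.

    ```python
    import numpy as np

    # Hypothetical ROC-derived cut-offs in arbitrary fluorescence units:
    # (upper bound of "negative", upper bound of "dim") per marker.
    CUTOFFS = {"CD19": (200.0, 1500.0), "CD56": (150.0, 1200.0)}

    def classify_expression(median_fi, fi_values, marker):
        """Map a median fluorescence intensity to a qualitative call.

        'heterogeneous' is flagged when the per-event spread is wide
        relative to the cut-offs (an assumed, illustrative rule).
        """
        neg_max, dim_max = CUTOFFS[marker]
        spread = np.percentile(fi_values, 90) - np.percentile(fi_values, 10)
        if spread > dim_max:      # broad distribution spanning the cut-offs
            return "heterogeneous"
        if median_fi < neg_max:
            return "negative"
        if median_fi < dim_max:
            return "dim"
        return "positive"
    ```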

  14. Multisource multibeam backscatter data: developing a strategy for the production of benthic habitat maps using semi-automated seafloor classification methods

    Science.gov (United States)

    Lacharité, Myriam; Brown, Craig J.; Gazzola, Vicki

    2018-06-01

    The establishment of multibeam echosounders (MBES) as a mainstream tool in ocean mapping has facilitated integrative approaches towards nautical charting, benthic habitat mapping, and seafloor geotechnical surveys. The bathymetric and backscatter information generated by MBES enables marine scientists to present highly accurate bathymetric data with a spatial resolution closely matching that of terrestrial mapping, and can generate customized thematic seafloor maps to meet multiple ocean management needs. However, when a variety of MBES systems are used, the creation of objective habitat maps can be hindered by the lack of backscatter calibration, due, for example, to system-specific settings, yielding relative rather than absolute values. Here, we describe an approach using object-based image analysis to combine 4 non-overlapping and uncalibrated (backscatter) MBES coverages to form a seamless habitat map on St. Anns Bank (Atlantic Canada), a marine protected area hosting a diversity of benthic habitats. The benthoscape map was produced by analysing each coverage independently with supervised classification (k-nearest neighbor) of image-objects based on a common suite of 7 benthoscapes (determined with 4214 ground-truthing photographs at 61 stations, and characterized with backscatter, bathymetry, and bathymetric position index). Manual re-classification based on uncertainty in membership values to individual classes—especially at the boundaries between coverages—was used to build the final benthoscape map. Given the costs and scarcity of MBES surveys in offshore marine ecosystems—particularly in large ecosystems in need of adequate conservation strategies, such as in Canadian waters—developing approaches to synthesize multiple datasets to meet management needs is warranted.
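
    A toy version of the per-coverage supervised classification step, using scikit-learn's k-nearest-neighbour classifier. The synthetic feature arrays stand in for per-object backscatter, bathymetry and bathymetric position index values, and the 0.6 probability cut-off for flagging objects for manual review is an arbitrary assumption.

    ```python
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(0)

    # Hypothetical per-object features for one MBES coverage: mean backscatter,
    # mean depth and bathymetric position index, one row per image-object.
    train_X = rng.normal(size=(61, 3))        # objects at ground-truth stations
    train_y = rng.integers(0, 7, size=61)     # 7 benthoscape classes
    unlabelled_X = rng.normal(size=(500, 3))  # remaining image-objects

    # Each coverage is classified independently, so uncalibrated backscatter
    # only needs to be internally consistent within one survey.
    knn = KNeighborsClassifier(n_neighbors=5).fit(train_X, train_y)
    benthoscape = knn.predict(unlabelled_X)

    # Analogue of the paper's manual re-classification step: objects with a
    # low maximum class-membership probability are flagged for review.
    probs = knn.predict_proba(unlabelled_X)
    needs_review = probs.max(axis=1) < 0.6
    ```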

  15. Lightning characteristics of derecho producing mesoscale convective systems

    Science.gov (United States)

    Bentley, Mace L.; Franks, John R.; Suranovic, Katelyn R.; Barbachem, Brent; Cannon, Declan; Cooper, Stonie R.

    2016-06-01

    Derechos, or widespread, convectively induced wind storms, are a common warm season phenomenon in the Central and Eastern United States. These damaging and severe weather events are known to sweep quickly across large spatial regions of more than 400 km and produce wind speeds exceeding 121 km h-1. Although extensive research concerning derechos and their parent mesoscale convective systems already exists, there have been few investigations of the spatial and temporal distribution of cloud-to-ground lightning associated with these events. This study analyzes twenty warm season (May through August) derecho events between 2003 and 2013 in an effort to discern their lightning characteristics. Data used in the study included cloud-to-ground flash data derived from the National Lightning Detection Network, WSR-88D imagery from the University Corporation for Atmospheric Research, and damaging wind report data obtained from the Storm Prediction Center. A spatial and temporal analysis was conducted by incorporating these data into a geographic information system to determine the distribution and lightning characteristics of the environments of derecho producing mesoscale convective systems. Primary foci of this research include: (1) finding the approximate size of the lightning activity region for individual and combined event(s); (2) determining the intensity of each event by examining the density and polarity of lightning flashes; (3) locating areas of highest lightning flash density; and (4) providing a lightning spatial analysis that outlines the temporal and spatial distribution of flash activity for particularly strong derecho producing thunderstorm episodes.
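
    A sketch of the flash-density part of such a GIS analysis: binning NLDN cloud-to-ground flash locations onto a regular grid. The 0.2° cell size is an arbitrary choice, not the study's.

    ```python
    import numpy as np

    def flash_density(lons, lats, grid_res_deg=0.2):
        """Cloud-to-ground flash density on a regular lon/lat grid.

        lons, lats: arrays of flash locations for one derecho event.
        Returns (density, lon_edges, lat_edges); density is flashes per cell
        and can be further normalized by cell area and event duration.
        """
        lon_edges = np.arange(lons.min(), lons.max() + grid_res_deg, grid_res_deg)
        lat_edges = np.arange(lats.min(), lats.max() + grid_res_deg, grid_res_deg)
        density, _, _ = np.histogram2d(lons, lats, bins=[lon_edges, lat_edges])
        return density, lon_edges, lat_edges
    ```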

  16. Reliability of a semi-automated 3D-CT measuring method for tunnel diameters after anterior cruciate ligament reconstruction: A comparison between soft-tissue single-bundle allograft vs. autograft.

    Science.gov (United States)

    Robbrecht, Cedric; Claes, Steven; Cromheecke, Michiel; Mahieu, Peter; Kakavelakis, Kyriakos; Victor, Jan; Bellemans, Johan; Verdonk, Peter

    2014-10-01

    Post-operative widening of tibial and/or femoral bone tunnels is a common observation after ACL reconstruction, especially with soft-tissue grafts. There are no studies comparing tunnel widening in hamstring autografts versus tibialis anterior allografts. The goal of this study was to observe the difference in tunnel widening after the use of allograft vs. autograft for ACL reconstruction, by measuring it with a novel 3-D computed tomography based method. Thirty-five ACL-deficient subjects were included, underwent anatomic single-bundle ACL reconstruction and were evaluated at one year after surgery with the use of 3-D CT imaging. Three independent observers semi-automatically delineated femoral and tibial tunnel outlines, after which a best-fit cylinder was derived and the tunnel diameter was determined. Finally, intra- and inter-observer reliability of this novel measurement protocol was defined. In femoral tunnels, the intra-observer ICC was 0.973 (95% CI: 0.922-0.991) and the inter-observer ICC was 0.992 (95% CI: 0.982-0.996). In tibial tunnels, the intra-observer ICC was 0.955 (95% CI: 0.875-0.985). The combined inter-observer ICC was 0.970 (95% CI: 0.917-0.987). Tunnel widening was significantly higher in allografts compared to autografts, in the tibial tunnels (p=0.013) as well as in the femoral tunnels (p=0.007). To our knowledge, this novel, semi-automated 3D-computed tomography image processing method has been shown to yield highly reproducible results for the measurement of bone tunnel diameter and area. This series showed a significantly higher amount of tunnel widening in the allograft group at one-year follow-up. Level II, Prospective comparative study. Copyright © 2014 Elsevier B.V. All rights reserved.
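
    The intra- and inter-observer reliabilities quoted above are intraclass correlation coefficients. A minimal NumPy sketch of the two-way random, absolute-agreement ICC(2,1) of Shrout and Fleiss follows; the abstract does not state which ICC form was used, so this particular choice is an assumption.

    ```python
    import numpy as np

    def icc_2_1(ratings):
        """Two-way random, absolute-agreement ICC(2,1) (Shrout & Fleiss).

        ratings: (n_subjects, k_raters) array, e.g. tunnel diameters measured
        by three observers. A minimal sketch without confidence intervals.
        """
        x = np.asarray(ratings, float)
        n, k = x.shape
        grand = x.mean()
        ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)  # subjects
        ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)  # raters
        ss_e = ((x - x.mean(axis=1, keepdims=True)
                   - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
        ms_e = ss_e / ((n - 1) * (k - 1))                           # residual
        return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)
    ```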

  17. A semi-automated LC-MS/MS method for the determination of LCI699, a steroid 11β-hydroxylase inhibitor, in human plasma.

    Science.gov (United States)

    Li, Wenkui; Luo, Suyi; Rebello, Sam; Flarakos, Jimmy; Tse, Francis L S

    2014-06-01

    A novel liquid chromatographic method with tandem mass spectrometric detection (LC-MS/MS) for the determination of LCI699 was developed and validated with dynamic ranges of 0.0500-50.0 ng/mL and 1.00-1,000 ng/mL using 0.0500 mL and 0.100 mL, respectively, of human plasma. LCI699 and the internal standard, [M+6]LCI699, were extracted from fortified human plasma via protein precipitation. After transfer or dilution of the supernatant followed by solvent evaporation and/or reconstitution, the extract was injected onto the LC-MS/MS system. Optimal chromatographic separation was achieved on an ACE C18 (50 mm × 4.6 mm, 3 μm) column with 30% aqueous methanol (containing 0.5% acetic acid and 0.05% TFA) as the mobile phase run isocratically at a flow rate of 1.0 mL/min. The total analysis cycle time is approximately 3.5 min per injection. The addition of an ion-pair reagent, TFA (0.05%, v/v), to the mobile phases significantly improved the chromatographic retention and resolution of the analyte on a silica-based reversed-phase column. Although addition of TFA to the mobile phase suppresses the ESI signals of the analyte due to its ion-pairing characteristics in the gas phase of the MS source, this negative impact was effectively alleviated by adding 0.5% acetic acid to the mobile phase. The current method was validated for sensitivity, selectivity, linearity, reproducibility, stability and recovery. For the low curve range (0.0500-50.0 ng/mL), the accuracy and precision for the LLOQs (0.0500 ng/mL) were -13.0 to 2.0% bias and 3.4-19.2% CV, respectively. For other QC samples (0.100, 6.00, 20.0 and 40.0 ng/mL), the precision ranged from 1.2 to 9.0% and from 3.8 to 8.8% CV, respectively, in the intra-day and inter-day evaluations. The accuracy ranged from -11.3 to 8.0% and -7.2 to 1.6% bias, respectively, in the intra-day and inter-day batches. For the high curve range (1.00-1,000 ng/mL), the accuracy and precision for the LLOQs (1.00 ng/mL) were 1.0-15.0% bias and 7.4-9.2% CV
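
    The accuracy and precision figures above follow the usual bioanalytical definitions (% bias relative to the nominal concentration, and % CV across replicates); a trivial sketch:

    ```python
    import numpy as np

    def accuracy_precision(measured, nominal):
        """Accuracy (% bias) and precision (% CV) for one QC level.

        measured: replicate concentrations back-calculated from the curve;
        nominal: the spiked concentration (e.g. 0.0500 ng/mL at the LLOQ).
        """
        m = np.asarray(measured, float)
        bias_pct = 100.0 * (m.mean() - nominal) / nominal
        cv_pct = 100.0 * m.std(ddof=1) / m.mean()
        return bias_pct, cv_pct
    ```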

  18. Mesoscale simulation of concrete spall failure

    Science.gov (United States)

    Knell, S.; Sauer, M.; Millon, O.; Riedel, W.

    2012-05-01

    Although intensively studied, it is still being debated which physical mechanisms are responsible for the increase of dynamic strength and fracture energy of concrete observed at high loading rates, and to what extent structural inertia forces on different scales contribute to the observation. We present a new approach for the three-dimensional mesoscale modelling of dynamic damage and cracking in concrete. Concrete is approximated as a composite of spherical elastic aggregates of mm to cm size embedded in an elastic cement stone matrix. Cracking within the matrix and at aggregate interfaces in the μm range is modelled with adaptively inserted, initially rigid, cohesive interface elements. The model is applied to analyse the dynamic tensile failure observed in Hopkinson-bar spallation experiments with strain rates up to 100/s. The influence of the key mesoscale failure parameters of strength, fracture energy and relative weakening of the interfacial transition zone (ITZ) on macroscopic strength, momentum and energy conservation is numerically investigated.

  19. Mesoscale wind fluctuations over Danish waters

    DEFF Research Database (Denmark)

    Vincent, Claire Louise

    Mesoscale wind fluctuations affect the large-scale integration of wind power because they undermine the day-ahead predictability of wind speed and power production, and because they can result in large fluctuations in power generation that must be balanced using reserve power. Large fluctuations in generated power are a particular problem for offshore wind farms because the typically high concentration of turbines within a limited geographical area means that fluctuations can be correlated across large numbers of turbines. Furthermore, organised mesoscale structures that often form over water... It is shown that realistic hour-scale wind fluctuations and open cellular convection patterns develop in WRF simulations with 2 km horizontal grid spacing. The atmospheric conditions during one of the case studies are then used to initialise a simplified version of the model that has no large-scale weather forcing, topography...

  20. Optogenetic stimulation of a meso-scale human cortical model

    Science.gov (United States)

    Selvaraj, Prashanth; Szeri, Andrew; Sleigh, Jamie; Kirsch, Heidi

    2015-03-01

    Neurological phenomena like sleep and seizures depend not only on the activity of individual neurons, but on the dynamics of neuron populations as well. Meso-scale models of cortical activity provide a means to study neural dynamics at the level of neuron populations. Additionally, they offer a safe and economical way to test the effects and efficacy of stimulation techniques on the dynamics of the cortex. Here, we use a physiologically relevant meso-scale model of the cortex to study the hypersynchronous activity of neuron populations during epileptic seizures. The model consists of a set of stochastic, highly non-linear partial differential equations. Next, we use optogenetic stimulation to control seizures in a hyperexcited cortex, and to induce seizures in a normally functioning cortex. The high spatial and temporal resolution this method offers makes a strong case for the use of optogenetics in treating meso-scale cortical disorders such as epileptic seizures. We use bifurcation analysis to investigate the effect of optogenetic stimulation in the meso-scale model, and its efficacy in suppressing the non-linear dynamics of seizures.

  1. RCrane: semi-automated RNA model building.

    Science.gov (United States)

    Keating, Kevin S; Pyle, Anna Marie

    2012-08-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  2. Semi Automated Ferrous Material Scouring System (SAFMSS)

    Science.gov (United States)

    2016-03-14

    ...Mapping System; GPR: Ground Penetrating Radar; IP: Intellectual Property; MEC: Munitions and Explosives of Concern; MPPEH: Material Potentially Presenting... ...the development phase and, well aware of the complexity, components and parameters of the system, developed an estimate of $800,000 for the price of... ...compared the weekly rental fee and the purchase price of excavators. Table 5 below lists the purchase price and weekly rental fee for three different excavators.

  3. RCrane: semi-automated RNA model building

    International Nuclear Information System (INIS)

    Keating, Kevin S.; Pyle, Anna Marie

    2012-01-01

    RNA crystals typically diffract to much lower resolutions than protein crystals. This low-resolution diffraction results in unclear density maps, which cause considerable difficulties during the model-building process. These difficulties are exacerbated by the lack of computational tools for RNA modeling. Here, RCrane, a tool for the partially automated building of RNA into electron-density maps of low or intermediate resolution, is presented. This tool works within Coot, a common program for macromolecular model building. RCrane helps crystallographers to place phosphates and bases into electron density and then automatically predicts and builds the detailed all-atom structure of the traced nucleotides. RCrane then allows the crystallographer to review the newly built structure and select alternative backbone conformations where desired. This tool can also be used to automatically correct the backbone structure of previously built nucleotides. These automated corrections can fix incorrect sugar puckers, steric clashes and other structural problems.

  4. On Improving 4-km Mesoscale Model Simulations

    Science.gov (United States)

    Deng, Aijun; Stauffer, David R.

    2006-03-01

    A previous study showed that use of analysis-nudging four-dimensional data assimilation (FDDA) and improved physics in the fifth-generation Pennsylvania State University National Center for Atmospheric Research Mesoscale Model (MM5) produced the best overall performance on a 12-km-domain simulation, based on the 18-19 September 1983 Cross-Appalachian Tracer Experiment (CAPTEX) case. However, reducing the simulated grid length to 4 km had detrimental effects. The primary cause was likely the explicit representation of convection accompanying a cold-frontal system. Because no convective parameterization scheme (CPS) was used, the convective updrafts were forced on coarser-than-realistic scales, and the rainfall and the atmospheric response to the convection were too strong. The evaporative cooling and downdrafts were too vigorous, causing widespread disruption of the low-level winds and spurious advection of the simulated tracer. In this study, a series of experiments was designed to address this general problem involving 4-km model precipitation and gridpoint storms and associated model sensitivities to the use of FDDA, planetary boundary layer (PBL) turbulence physics, grid-explicit microphysics, a CPS, and enhanced horizontal diffusion. Some of the conclusions include the following: 1) Enhanced parameterized vertical mixing in the turbulent kinetic energy (TKE) turbulence scheme has shown marked improvements in the simulated fields. 2) Use of a CPS on the 4-km grid improved the precipitation and low-level wind results. 3) Use of the Hong and Pan Medium-Range Forecast PBL scheme showed larger model errors within the PBL and a clear tendency to predict much deeper PBL heights than the TKE scheme. 4) Combining observation-nudging FDDA with a CPS produced the best overall simulations. 5) Finer horizontal resolution does not always produce better simulations, especially in convectively unstable environments, and a new CPS suitable for 4-km resolution is needed.
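
    For readers unfamiliar with FDDA, nudging adds a Newtonian relaxation term that pulls the model state toward a reference field at every time step. A one-line sketch follows; the coefficient value is a typical order of magnitude, not one taken from this study.

    ```python
    def nudge(u_model, u_ref, g_per_s, dt_s):
        """One Newtonian-relaxation (nudging) step toward a reference field.

        The model tendency gains a term -G * (u_model - u_ref), where
        G (s^-1) sets the relaxation strength; typical FDDA values are
        O(1e-4) s^-1. u_ref is a gridded analysis (analysis nudging) or an
        observation innovation spread over nearby points (obs nudging).
        """
        return u_model - g_per_s * (u_model - u_ref) * dt_s

    # e.g. relax a 10 m/s model wind toward an 8 m/s analysis value
    u_new = nudge(10.0, 8.0, g_per_s=3e-4, dt_s=60.0)
    ```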

  5. Uniaxial flicker analysis of the psychophysical Stiles-Crawford effects

    NARCIS (Netherlands)

    Lochocki, Benjamin; Vohnsen, Brian

    2017-01-01

    Purpose: We report on a semi-automated system for frequency analysis of the Stiles-Crawford effect of the first kind (SCE-I) using flicker methodology designed to gain insight into the temporal dynamics of the perceived visibility for alternating pupil entrance points. We describe the system and its

  6. Development of a parameterization scheme of mesoscale convective systems

    International Nuclear Information System (INIS)

    Cotton, W.R.

    1994-01-01

    The goal of this research is to develop a parameterization scheme of mesoscale convective systems (MCS) including diabatic heating, moisture and momentum transports, cloud formation, and precipitation. The approach is to: perform explicit cloud-resolving simulations of MCSs; perform statistical analyses of the simulated MCSs to assist in fabricating a parameterization, calibrating coefficients, etc.; and test the parameterization scheme against independent field data measurements and in numerical weather prediction (NWP) models emulating general circulation model (GCM) grid resolution. Thus far we have formulated, calibrated, implemented and tested a deep convective engine against explicit Florida sea breeze convection and in coarse-grid regional simulations of mid-latitude and tropical MCSs. Several explicit simulations of MCSs have been completed, and several others are in progress. Analysis code is being written and run on the explicitly simulated data.

  7. Mesoscale modeling of amorphous metals by shear transformation zone dynamics

    International Nuclear Information System (INIS)

    Homer, Eric R.; Schuh, Christopher A.

    2009-01-01

    A new mesoscale modeling technique for the thermo-mechanical behavior of metallic glasses is proposed. The modeling framework considers the shear transformation zone (STZ) as the fundamental unit of deformation, and coarse-grains an amorphous collection of atoms into an ensemble of STZs on a mesh. By employing finite element analysis and a kinetic Monte Carlo algorithm, the modeling technique is capable of simulating glass processing and deformation on time and length scales greater than those usually attainable by atomistic modeling. A thorough explanation of the framework is presented, along with a specific two-dimensional implementation for a model metallic glass. The model is shown to capture the basic behaviors of metallic glasses, including high-temperature homogeneous flow following the expected constitutive law, and low-temperature strain localization into shear bands. Details of the effects of processing and thermal history on the glass structure and properties are also discussed.
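
    The kinetic Monte Carlo core of such an STZ-dynamics framework can be sketched in a few lines with the standard residence-time algorithm. The barrier values and attempt frequency below are illustrative placeholders, and the coupling to the finite-element stress solution (which sets the barriers in the actual model) is omitted.

    ```python
    import numpy as np

    def kmc_step(barriers_ev, temperature_k, rng, attempt_freq_hz=1e13):
        """Select one shear-transformation event and advance the clock.

        barriers_ev: activation barrier of each candidate STZ (eV); in the
        full model these come from the ensemble's finite-element stress state.
        Returns (index of chosen event, time increment in seconds).
        """
        k_b = 8.617e-5  # Boltzmann constant, eV/K
        rates = attempt_freq_hz * np.exp(-np.asarray(barriers_ev) / (k_b * temperature_k))
        total = rates.sum()
        chosen = np.searchsorted(np.cumsum(rates) / total, rng.random())
        dt = -np.log(rng.random()) / total   # exponentially distributed waiting time
        return chosen, dt

    rng = np.random.default_rng(1)
    event, dt = kmc_step([0.9, 1.1, 0.85], temperature_k=300.0, rng=rng)
    ```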

  8. Dynamics of Clouds and Mesoscale Circulations over the Maritime Continent

    Science.gov (United States)

    Jin, Y.; Wang, S.; Xian, P.; Reid, J. S.; Nachamkin, J.

    2010-12-01

    In recent decades Southeast Asia (SEA) has seen rapid economic growth as well as increased biomass burning, resulting in high air pollution levels and reduced air quality. At the same time clouds often prevent accurate air-quality monitoring and analysis using satellite observations. The Seven SouthEast Asian Studies (7SEAS) field campaign currently underway over SEA provides an unprecedented opportunity to study the complex interplay between aerosol and clouds. 7SEAS is a comprehensive interdisciplinary atmospheric sciences program through international partnership of NASA, NRL, ONR and seven local institutions including those from Indonesia, Malaysia, the Philippines, Singapore, Taiwan, Thailand, and Vietnam. While the original goal of 7SEAS is to isolate the impacts of aerosol particles on weather and the environment, it is recognized that better understanding of SEA meteorological conditions, especially those associated with cloud formation and evolution, is critical to the success of the campaign. In this study we attempt to gain more insight into the dynamic and physical processes associated with low level clouds and atmospheric circulation at the regional scale over SEA, using the Navy's Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS®), a regional forecast model in operation at FNMOC since 1998. This effort comprises two main components. First, multiple years of COAMPS operational forecasts over SEA are analyzed for basic climatology of atmospheric features. Second, mesoscale circulation and cloud properties are simulated at relatively higher resolution (15-km) for selected periods in the Gulf of Tonkin and adjacent coastal areas. Simulation results are compared to MODIS cloud observations and local soundings obtained during 7SEAS for model verifications. Atmospheric boundary layer processes are examined in relation to spatial and temporal variations of cloud fields. The current work serves as an important step toward improving our

  9. Mesoscale Eddies in the Solomon Sea

    Science.gov (United States)

    Hristova, H. G.; Kessler, W. S.; McWilliams, J. C.; Molemaker, M. J.

    2011-12-01

    Water mass transformation in the strong equatorward flows through the Solomon Sea influences the properties of the Equatorial Undercurrent and subsequent cold tongue upwelling. High eddy activity in the interior Solomon Sea seen in altimetric sea surface height (SSH) and in several models may provide a mechanism for these transformations. We investigate these effects using a mesoscale (4-km resolution) sigma-coordinate (ROMS) model of the Solomon Sea nested in a basin solution, forced by a repeating seasonal cycle, and evaluated against observational data. The model generates a vigorous upper layer eddy field; some of these are apparently shed as the New Guinea Coastal Undercurrent threads through the complex topography of the region, others are independent of the strong western boundary current. We diagnose the scales and vertical structure of the eddies in different parts of the Solomon Sea to illuminate their generation processes and propagation characteristics, and compare these to observed eddy statistics. Hypotheses tested are that the Solomon Sea mesoscale eddies are generated locally by baroclinic instability, that the eddies are shed as the South Equatorial Current passes around and through the Solomon Island chain, that eddies are generated by the New Guinea Coastal Undercurrent, or that eddies occurring outside of the Solomon Sea propagate into the Solomon Sea. These different mechanisms have different implications for the resulting mixing and property fluxes. They also provide different interpretations for SSH signals observed from satellites (e.g., that will be observed by the upcoming SWOT satellite).

  10. The South China Sea Mesoscale Eddy Experiment (S-MEE) and Its Primary Findings

    Science.gov (United States)

    Zhang, Z.; Tian, J.; Zhao, W.; Qiu, B.

    2016-02-01

    The South China Sea (SCS), the largest marginal sea in the northwestern Pacific, has strong eddy activity, as revealed by both satellite and in situ observations. The 3D structures of the SCS mesoscale eddies and their lifecycles, including the generation and dissipation processes, are, however, still not well understood because of the lack of well-designed field observations. In order to address these two scientific issues (3D structure and lifecycle of SCS mesoscale eddies), the SCS Mesoscale Eddy Experiment (S-MEE for short) was designed and conducted in the period from October 2013 to June 2014. As part of S-MEE, two bottom-anchored subsurface mooring arrays, one consisting of 10 moorings and the other of 7 moorings, were deployed along the historical pathway of the mesoscale eddies in the northern SCS. All the moorings were equipped with ADCPs, RCMs, CTDs and temperature chains to make continuous measurements of horizontal current velocity and temperature/salinity in the whole water column. During the S-MEE, a total of 5 distinct mesoscale eddies were observed to cross the mooring arrays, among which one anticyclonic and cyclonic eddy pair was fully captured by the arrays. In addition to the moored observations, we also conducted two transects across the center of the anticyclonic eddy and made high-resolution hydrographic and turbulent mixing measurements. Based on the data collected by the S-MEE and concurrent satellite-derived observations, we constructed the full-depth 3D structure of the eddy pair and analyzed its generation and dissipation mechanisms. We found that the eddies extend from the surface to the sea bottom and display prominently tilted structures in the vertical. By conducting an eddy energy budget analysis, we further identified that the generation of submesoscale motions constitutes the dominant mechanism for oceanic eddy dissipation.

  11. Optical 3D printing: bridging the gaps in the mesoscale

    Science.gov (United States)

    Jonušauskas, Linas; Juodkazis, Saulius; Malinauskas, Mangirdas

    2018-05-01

    Over the last decade, optical 3D printing has proved itself to be a flexible and capable approach in fabricating an increasing variety of functional structures. One of the main reasons why this technology has become so prominent is the fact that it allows the creation of objects in the mesoscale, where structure dimensions range from nanometers to centimeters. At this scale, the size and spatial configuration of produced single features start to influence the characteristics of the whole object, enabling an array of new, exotic and otherwise unachievable properties and structures (i.e. metamaterials). Here, we present the advantages of this technology in creating mesoscale structures in comparison to subtractive manufacturing techniques and to other branches of 3D printing. Differences between stereolithography, sintering, laser-induced forward transfer and femtosecond laser 3D multi-photon polymerization are highlighted. Attention is given to the discussion of applicable light sources, as well as to an ongoing analysis of the light–matter interaction mechanisms, as they determine the processable materials, required technological steps and the fidelity of feature sizes in fabricated patterns and workpieces. Optical 3D printing-enabled functional structures in micromechanics, medicine, microfluidics, micro-optics and photonics are discussed, with an emphasis on how this particular technology benefits advances in those fields. 4D printing, achieved by varying both the architecture and spatial material composition of the 3D structure, feature-size reduction via stimulated emission depletion-inspired nanolithography or thermal post-treatment, as well as plasmonic nanoparticle-polymer nanocomposites, are presented among examples of the newest trends in the development of this technology. Finally, an outlook is given, examining further scientific frontiers in the field as well as possibilities and challenges in transferring laboratory-level know-how to industrial

  12. Mesoscale Effects on Carbon Export: A Global Perspective

    Science.gov (United States)

    Harrison, Cheryl S.; Long, Matthew C.; Lovenduski, Nicole S.; Moore, Jefferson K.

    2018-04-01

    Carbon export from the surface to the deep ocean is a primary control on global carbon budgets and is mediated by plankton that are sensitive to physical forcing. Earth system models generally do not resolve ocean mesoscale circulation (O(10-100) km), scales that strongly affect transport of nutrients and plankton. The role of mesoscale circulation in modulating export is evaluated by comparing global ocean simulations conducted at 1° and 0.1° horizontal resolution. Mesoscale resolution produces only a small reduction in globally integrated export production, but regional differences in export production can be large (±50%), with compensating effects in different ocean basins. With mesoscale resolution, improved representation of coastal jets blocks off-shelf transport, leading to lower export in regions where shelf-derived nutrients fuel production. Export is further reduced in these regions by resolution of mesoscale turbulence, which restricts the spatial area of production. Maximum mixed layer depths are narrower and deeper across the Subantarctic at higher resolution, driving locally stronger nutrient entrainment and enhanced summer export production. In energetic regions with seasonal blooms, such as the Subantarctic and North Pacific, internally generated mesoscale variability drives substantial interannual variation in local export production. These results suggest that biogeochemical tracer dynamics show different sensitivities to transport biases than temperature and salinity, which should be considered in the formulation and validation of physical parameterizations. Efforts to compare estimates of export production from observations and models should account for the large variability in space and time expected for regions strongly affected by mesoscale circulation.

  13. Extreme gust wind estimation using mesoscale modeling

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Kruger, Andries

    2014-01-01

    ..., surface turbulence characteristics. In this study, we follow a theory that is different from the local gust concept described above. In this theory, the gust at the surface is non-local; it is produced by the deflection of air parcels flowing in the boundary layer and brought down to the surface through turbulent eddies. This process is modeled using the mesoscale Weather Research and Forecasting (WRF) model. The gust at the surface is calculated as the largest winds over a layer where the averaged turbulence kinetic energy is greater than the averaged buoyancy force. The experiments have been... Measurements from the Danish site Høvsøre help us to understand the limitation of the traditional method. Good agreement was found between the extreme gust atlases for South Africa and the existing map made from a limited number of measurements across the country. Our study supports the non-local gust theory. While...
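
    A schematic version of the non-local gust diagnostic described above: grow a layer upward from the surface while its mean turbulence kinetic energy exceeds its mean buoyancy, then take the largest wind within that layer. This is a simplified reading of the method, not the WRF implementation.

    ```python
    import numpy as np

    def surface_gust(wind_speed, tke, buoyancy, z):
        """Non-local gust estimate from a model column (a sketch of the idea).

        wind_speed, tke, buoyancy: profiles on model levels z (lowest first).
        The gust is the maximum wind in the deepest surface-based layer whose
        mean TKE exceeds its mean buoyancy force, i.e. where eddies are
        energetic enough to bring fast parcels down to the surface.
        """
        top = 0
        for i in range(1, len(z)):
            if tke[: i + 1].mean() > buoyancy[: i + 1].mean():
                top = i
            else:
                break
        return np.max(wind_speed[: top + 1])
    ```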

  14. Mesoscale Modelling of the Response of Aluminas

    International Nuclear Information System (INIS)

    Bourne, N. K.

    2006-01-01

    The response of polycrystalline alumina to shock is not well addressed. Several of the operating mechanisms are only hypothesized, which results in models that are empirical. A similar state of affairs in reactive flow modelling led to the development of mesoscale representations of the flow to illuminate operating mechanisms. In this spirit, a similar effort is undertaken for a polycrystalline alumina. Simulations are conducted to observe operating mechanisms at the micron scale. A method is then developed to extend the simulations to match the response at the continuum level, where measurements are made. The approach is validated by comparison with continuum experiments. The method and results are presented, and some of the operating mechanisms are illuminated by the observed response.

  15. From Quanta to the Continuum: Opportunities for Mesoscale Science

    Energy Technology Data Exchange (ETDEWEB)

    Crabtree, George [Argonne National Lab. (ANL), Argonne, IL (United States); Sarrao, John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Alivisatos, Paul [Univ. of California, Berkeley, CA (United States); Barletta, William [MIT (Massachusetts Inst. of Technology), Cambridge, MA (United States); Bates, Frank [Univ. of Minnesota, Minneapolis, MN (United States); Brown, Gordon [Stanford Univ., CA (United States); French, Roger [Case Western Reserve Univ., Cleveland, OH (United States); Greene, Laura [Univ. of Illinois, Urbana, IL (United States); Hemminger, John [Univ. of California, Irvine, CA (United States); Kastner, Marc [MIT (Massachusetts Inst. of Technology), Cambridge, MA (United States); Kay, Bruce [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Lewis, Jennifer [Univ. of Illinois, Urbana, IL (United States); Ratner, Mark [Northwestern Univ., Evanston, IL (United States); Rollett, Anthony [Carnegie Mellon Univ., Pittsburgh, PA (United States); Rubloff, Gary [University of Maryland, College Park, MD (United States); Spence, John [Arizona State Univ., Mesa, AZ (United States); Tobias, Douglas [Univ. of California, Irvine, CA (United States); Tranquada, John [Brookhaven National Lab. (BNL), Upton, NY (United States)

    2012-09-01

    This report explores the opportunity and defines the research agenda for mesoscale science—discovering, understanding, and controlling interactions among disparate systems and phenomena to reach the full potential of materials complexity and functionality. The ability to predict and control mesoscale phenomena and architectures is essential if atomic and molecular knowledge is to blossom into a next generation of technology opportunities, societal benefits, and scientific advances. The body of this report outlines the need, the opportunities, the challenges, and the benefits of mastering mesoscale science.

  16. North American Mesoscale Forecast System (NAM) [12 km

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The North American Mesoscale Forecast System (NAM) is one of the major regional weather forecast models run by the National Centers for Environmental Prediction...

  17. Assimilation of Doppler weather radar observations in a mesoscale ...

    Indian Academy of Sciences (India)

    ... Research (PSU–NCAR) mesoscale model (MM5) version 3.5.6. The variational data assimilation ... investigation of the direct assimilation of radar reflectivity data in the 3DVAR system. The present ... Results presented in this paper are based on ...

  18. The Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS)

    National Research Council Canada - National Science Library

    Hodur, Richard M; Hong, Xiaodong; Doyle, James D; Pullen, Julie; Cummings, James; Martin, Paul; Rennick, Mary Alice

    2002-01-01

    ... of the Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS). The goal of this modeling project is to gain predictive skill in simulating the ocean and atmosphere at high resolution on time-scales of hours to several days...

  19. Towards high resolution mapping of 3-D mesoscale dynamics from observations

    Directory of Open Access Journals (Sweden)

    B. Buongiorno Nardelli

    2012-10-01

    Full Text Available The MyOcean R&D project MESCLA (MEsoSCaLe dynamical Analysis through combined model, satellite and in situ data) was devoted to the high resolution 3-D retrieval of tracer and velocity fields in the oceans, based on the combination of in situ and satellite observations and quasi-geostrophic dynamical models. The retrieval techniques were also tested and compared with the output of a primitive equation model, with particular attention to the accuracy of the vertical velocity field as estimated through the Q vector formulation of the omega equation. The project focused on a test case, covering the region where the Gulf Stream separates from the US East Coast. This work demonstrated that innovative methods for the high resolution mapping of 3-D mesoscale dynamics from observations can be used to build the next generations of operational observation-based products.
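
    The Q-vector form of the quasi-geostrophic omega equation referred to above can be written (in height coordinates, with buoyancy b, geostrophic velocity u_g, buoyancy frequency N and Coriolis parameter f0) as:

    ```latex
    N^2 \nabla_h^2 w + f_0^2 \,\frac{\partial^2 w}{\partial z^2} = 2\,\nabla_h \cdot \mathbf{Q},
    \qquad
    \mathbf{Q} = \left( -\frac{\partial \mathbf{u}_g}{\partial x} \cdot \nabla_h b,\;
                        -\frac{\partial \mathbf{u}_g}{\partial y} \cdot \nabla_h b \right)
    ```

    Given geostrophic velocity and buoyancy estimated from the combined observations, the elliptic operator on the left is inverted for the vertical velocity w. This is the standard height-coordinate form, stated here for orientation rather than quoted from the project documentation.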

  20. Cycloidal meandering of a mesoscale anticyclonic eddy

    Science.gov (United States)

    Kizner, Ziv; Shteinbuch-Fridman, Biana; Makarov, Viacheslav; Rabinovich, Michael

    2017-08-01

    By applying a theoretical approach, we propose a hypothetical scenario that might explain some features of the movement of a long-lived mesoscale anticyclone observed during 1990 in the Bay of Biscay [R. D. Pingree and B. Le Cann, "Three anticyclonic slope water oceanic eddies (SWODDIES) in the southern Bay of Biscay in 1990," Deep-Sea Res., Part A 39, 1147 (1992)]. In the remote-sensing infrared images, at the initial stage of observations, the anticyclone was accompanied by two cyclonic eddies, so the entire structure appeared as a tripole. However, at later stages, only the anticyclone was seen in the images, traveling generally west. Unusual for an individual eddy were the high speed of its motion (relative to the expected planetary beta-drift) and the presence of almost cycloidal meanders in its trajectory. Although surface satellites seem to have quickly disappeared, we hypothesize that subsurface satellites continued to exist, and the coherence of the three vortices persisted for a long time. A significant perturbation of the central symmetry in the mutual arrangement of three eddies constituting a tripole can make reasonably fast cycloidal drift possible. This hypothesis is tested with two-layer contour-dynamics f-plane simulations and with finite-difference beta-plane simulations. In the latter case, the interplay of the planetary beta-effect and that due to the sloping bottom is considered.

  1. Mesoscale simulations of hydrodynamic squirmer interactions.

    Science.gov (United States)

    Götze, Ingo O; Gompper, Gerhard

    2010-10-01

    The swimming behavior of self-propelled microorganisms is studied by particle-based mesoscale simulations. The simulation technique includes both hydrodynamics and thermal fluctuations that are both essential for the dynamics of microswimmers. The swimmers are modeled as squirmers, i.e., spherical objects with a prescribed tangential surface velocity, where the focus of thrust generation can be tuned from pushers to pullers. For passive squirmers (colloids), we show that the velocity autocorrelation function agrees quantitatively with the Boussinesq approximation. Single active squirmers show a persistent random-walk behavior, determined by forward motion, lateral diffusion, and orientational fluctuations, in agreement with theoretical predictions. For pairs of squirmers, which are initially swimming in parallel, we find an attraction for pushers and a repulsion for pullers, as expected. The hydrodynamic force between squirmer pairs is calculated as a function of the center-to-center distance d_cm and is found to be consistent with a logarithmic distance dependence for d_cm less than about two sphere diameters; here, the force is considerably stronger than expected from the far-field expansion. The dependence of the force strength on the asymmetry of the polar surface velocity is obtained. During the collision process, thermal fluctuations turn out to be very important and to strongly affect the postcollision velocity directions of both squirmers.
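
    For context, the squirmer's prescribed tangential surface velocity is commonly truncated to the first two modes, with the second-mode coefficient setting the pusher/puller asymmetry mentioned above:

    ```latex
    u_\theta(\theta) = B_1 \sin\theta \,\bigl(1 + \beta \cos\theta\bigr),
    \qquad \beta = B_2 / B_1, \qquad U = \tfrac{2}{3} B_1
    ```

    Here β < 0 corresponds to a pusher, β > 0 to a puller, and the free swimming speed U depends only on the first mode.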

  2. Mobile Disdrometer Observations of Nocturnal Mesoscale Convective Systems During PECAN

    Science.gov (United States)

    Bodine, D. J.; Rasmussen, K. L.

    2015-12-01

    Understanding microphysical processes in nocturnal mesoscale convective systems (MCSs) is an important objective of the Plains Elevated Convection At Night (PECAN) experiment, which occurred from 1 June - 15 July 2015 in the central Great Plains region of the United States. Observations of MCSs were collected using a large array of mobile and fixed instrumentation, including ground-based radars, soundings, PECAN Integrated Sounding Arrays (PISAs), and aircraft. In addition to these observations, three mobile Parsivel disdrometers were deployed to obtain drop-size distribution (DSD) measurements to further explore microphysical processes in convective and stratiform regions of nocturnal MCSs. Disdrometers were deployed within close range of a multiple frequency network of mobile and fixed dual-polarization radars (5 - 30 km range), and near mobile sounding units and PISAs. Using mobile disdrometer and multiple-wavelength, dual-polarization radar data, microphysical properties of convective and stratiform regions of MCSs are investigated. The analysis will also examine coordinated Range-Height Indicator (RHI) scans over the disdrometers to elucidate vertical DSD structure. Analysis of dense observations obtained during PECAN in combination with mobile disdrometer DSD measurements contributes to a greater understanding of the structural characteristics and evolution of nocturnal MCSs.

  3. Impacts of Mesoscale Eddies on the Vertical Nitrate Flux in the Gulf Stream Region

    Science.gov (United States)

    Zhang, Shuwen; Curchitser, Enrique N.; Kang, Dujuan; Stock, Charles A.; Dussin, Raphael

    2018-01-01

    The Gulf Stream (GS) region has intense mesoscale variability that can affect the supply of nutrients to the euphotic zone (Zeu). In this study, a recently developed high-resolution coupled physical-biological model is used to conduct a 25-year simulation in the Northwest Atlantic. The Reynolds decomposition method is applied to quantify the nitrate budget and shows that the mesoscale variability is important to the vertical nitrate supply over the GS region. The decomposition, however, cannot isolate eddy effects from those arising from other mesoscale phenomena. This limitation is addressed by analyzing a large sample of eddies detected and tracked from the 25-year simulation. The eddy composite structures indicate that positive nitrate anomalies within Zeu exist in both cyclonic eddies (CEs) and anticyclonic eddies (ACEs) over the GS region, and are even more pronounced in the ACEs. Our analysis further indicates that positive nitrate anomalies mostly originate from enhanced vertical advective flux rather than vertical turbulent diffusion. The eddy-wind interaction-induced Ekman pumping is very likely the mechanism driving the enhanced vertical motions and vertical nitrate transport within ACEs. This study suggests that the ACEs in GS region may play an important role in modulating the oceanic biogeochemical properties by fueling local biomass production through the persistent supply of nitrate.
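
    The Reynolds decomposition applied to the vertical nitrate flux separates a time-mean advective part from the eddy (mesoscale) covariance term:

    ```latex
    w = \overline{w} + w', \quad N = \overline{N} + N'
    \;\;\Rightarrow\;\;
    \overline{wN} = \overline{w}\,\overline{N} + \overline{w'N'}
    ```

    where overbars denote the time mean and primes the deviations; the covariance term carries the contribution of mesoscale (and faster) fluctuations to the vertical nitrate supply quantified in the budget analysis above.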

  4. Impact of SLA assimilation in the Sicily Channel Regional Model: model skills and mesoscale features

    Directory of Open Access Journals (Sweden)

    A. Olita

    2012-07-01

    Full Text Available The impact of the assimilation of MyOcean sea level anomaly along-track data on the analyses of the Sicily Channel Regional Model was studied. The numerical model has a resolution of 1/32° and is capable of reproducing mesoscale and sub-mesoscale features. The impact of the SLA assimilation is studied by comparing a simulation (SIM), which does not assimilate data, with an analysis (AN) assimilating SLA along-track multi-mission data produced in the framework of the MyOcean project. The quality of the analysis was evaluated by computing the RMSE of the misfits between analysis background and observations (sea level before assimilation). A qualitative evaluation of the ability of the analyses to reproduce mesoscale structures is accomplished by comparing model results with ocean colour and SST satellite data, able to detect such features on the ocean surface. CTD profiles allowed us to evaluate the impact of the SLA assimilation along the water column. We found a significant improvement for the AN solution in terms of SLA RMSE with respect to SIM (the averaged RMSE of AN SLA misfits over 2 years is about 0.5 cm smaller than SIM). Comparison with CTD data shows a questionable improvement produced by the assimilation process in terms of vertical features: AN is better in temperature, while for salinity it gets worse than SIM at the surface. This suggests that a better a-priori description of the vertical error covariances would be desirable. The qualitative comparison of the simulation and analyses with synoptic satellite independent data proves that SLA assimilation allows some dynamical features (above all the circulation in the Ionian portion of the domain) and mesoscale structures otherwise misplaced or neglected by SIM to be correctly reproduced. Such mesoscale changes also imply that the eddy momentum fluxes (i.e. Reynolds stresses) show major changes in the Ionian area. Changes in Reynolds stresses reflect a different pumping of eastward momentum from the eddy to

  5. The terminological performance of the Information Science descriptors of the SIBi/USP Controlled Vocabulary in manual, automatic and semi-automatic indexing processes

    Directory of Open Access Journals (Sweden)

    Vania Mara Alves Lima

    Full Text Available The terminological performance of the descriptors of the SIBi/USP Controlled Vocabulary that represent the domain of Information Science was evaluated in manual, automatic and semi-automatic indexing processes. It was concluded that, in order to adequately represent the content of the indexed corpus, the current Information Science descriptors of the SIBi/USP Controlled Vocabulary must be expanded and contextualized by means of terminological definitions, so as to meet the information needs of their users.

  6. Implementation of meso-scale radioactive dispersion model for GPU

    Energy Technology Data Exchange (ETDEWEB)

    Sunarko [National Nuclear Energy Agency of Indonesia (BATAN), Jakarta (Indonesia). Nuclear Energy Assessment Center; Suud, Zaki [Bandung Institute of Technology (ITB), Bandung (Indonesia). Physics Dept.

    2017-05-15

    The Lagrangian Particle Dispersion Method (LPDM) is applied to model atmospheric dispersion of radioactive material on a meso-scale of a few tens of kilometers for site study purposes. Empirical relationships are used to determine the dispersion coefficient for various atmospheric stabilities. A diagnostic 3-D wind field is solved based on data from one meteorological station using the mass-conservation principle. Particles representing the radioactive pollutant are dispersed in the wind field from a point source. Time-integrated air concentration is calculated using a kernel density estimator (KDE) in the lowest layer of the atmosphere. Parallel code is developed for a GTX-660Ti GPU with a total of 1 344 scalar processors using CUDA. A test of a 1-hour release shows that linear speedup is achieved starting at 28 800 particles per hour (pph), reaching about 20x at 144 000 pph. Another test simulating a 6-hour release with 36 000 pph resulted in a speedup of about 60x. Statistical analysis reveals that the resulting grid doses are nearly identical in the CPU and GPU versions of the code.
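
    A serial sketch of the KDE step, turning particle positions in the lowest layer into a gridded concentration; the Gaussian kernel, names and units are our assumptions, and the GPU version would parallelize the loop over particles:

        import numpy as np

        def kde_concentration(x_p, y_p, q_p, x_grid, y_grid, h):
            """Gaussian-kernel estimate of concentration on a 2D grid.

            x_p, y_p : particle positions (m); q_p : activity per particle (Bq)
            h        : kernel bandwidth (m)
            """
            X, Y = np.meshgrid(x_grid, y_grid)
            conc = np.zeros_like(X, dtype=float)
            norm = 1.0 / (2.0 * np.pi * h**2)
            for xp, yp, qp in zip(x_p, y_p, q_p):
                r2 = (X - xp)**2 + (Y - yp)**2
                conc += qp * norm * np.exp(-0.5 * r2 / h**2)
            return conc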

  7. Investigating Mesoscale Convective Systems and their Predictability Using Machine Learning

    Science.gov (United States)

    Daher, H.; Duffy, D.; Bowen, M. K.

    2016-12-01

    A mesoscale convective system (MCS) is a thunderstorm region that lasts several hours, forms near weather fronts, and can often develop into tornadoes. Here we seek to answer the question of whether these tornadoes are "predictable" by looking for a defining characteristic(s) separating MCSs that evolve into tornadoes from those that do not. Using NASA's Modern Era Retrospective-analysis for Research and Applications 2 reanalysis data (M2R12K), we apply several state-of-the-art machine learning techniques to investigate this question. The spatial region examined in this experiment is Tornado Alley in the United States during the peak tornado months. A database containing selected variables from M2R12K is created using PostgreSQL. This database is then analyzed using machine learning methods such as Symbolic Aggregate approXimation (SAX) and DBSCAN (an unsupervised density-based data clustering algorithm). The incentive behind using these methods is to mathematically define an MCS so that association rule mining techniques can be used to uncover a signal or teleconnection that will help us forecast which MCSs will result in tornadoes, and therefore give society more time to prepare and in turn reduce casualties and destruction.
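
    A small illustration of the DBSCAN step on synthetic points standing in for convective grid cells (the feature construction is hypothetical; the study's actual features come from M2R12K):

        import numpy as np
        from sklearn.cluster import DBSCAN

        rng = np.random.default_rng(0)
        # Synthetic (lat, lon) points imitating two convective clusters plus noise.
        cluster_a = rng.normal([35.0, -98.0], 0.3, size=(200, 2))
        cluster_b = rng.normal([37.5, -95.5], 0.3, size=(150, 2))
        noise = rng.uniform([30.0, -105.0], [40.0, -90.0], size=(40, 2))
        points = np.vstack([cluster_a, cluster_b, noise])

        labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(points)
        n_systems = labels.max() + 1   # label -1 marks unclustered noise
        print(f"identified {n_systems} candidate MCS regions")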

  8. Mesoscale inversion of carbon sources and sinks

    International Nuclear Information System (INIS)

    Lauvaux, T.

    2008-01-01

    Inverse methods at large scales are used to infer the spatial variability of carbon sources and sinks over the continents, but their uncertainties remain large. Atmospheric concentrations integrate the surface flux variability, but atmospheric transport models at low resolution are not able to simulate properly the local atmospheric dynamics at the measurement sites. However, the inverse estimates are more representative of the large spatial heterogeneity of the ecosystems compared to direct flux measurements. Top-down and bottom-up methods that aim at quantifying the carbon exchanges between the surface and the atmosphere correspond to different scales and are not easily comparable. During this PhD, a mesoscale inverse system was developed to correct carbon fluxes at 8 km resolution. The high-resolution transport model MesoNH was used to simulate accurately the variability of the atmospheric concentrations, which allowed us to reduce the uncertainty of the retrieved fluxes. All the measurements used here were observed during the intensive regional campaign CERES of May and June 2005, during which several instrumented towers measured CO2 concentrations and fluxes in the southwest of France. Airborne measurements allowed us to observe concentrations at high altitude, but also CO2 surface fluxes over large parts of the domain. First, the capacity of the inverse system to correct the CO2 fluxes was estimated using pseudo-data experiments. The largest fraction of the concentration variability was attributed to regional surface fluxes over an area of about 300 km around the site locations, depending on the meteorological conditions. Second, an ensemble of simulations allowed us to define the spatial and temporal structures of the transport errors. Finally, the inverse fluxes at 8 km resolution were compared to direct flux measurements. The inverse system was validated in space and time and showed an improvement of the first-guess fluxes from a vegetation model
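
    The generic linear Bayesian update underlying such flux inversions can be sketched as follows (toy matrices, not the CERES system configuration):

        # Posterior flux estimate x_a = x_b + K (y - H x_b), K = B H^T (H B H^T + R)^-1
        import numpy as np

        def invert_fluxes(x_b, B, H, R, y):
            """x_b: prior fluxes; B: prior covariance; H: transport operator;
            R: observation-error covariance; y: observed concentrations."""
            S = H @ B @ H.T + R
            K = B @ H.T @ np.linalg.inv(S)
            x_a = x_b + K @ (y - H @ x_b)
            A = B - K @ H @ B   # posterior error covariance (uncertainty reduction)
            return x_a, A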

  9. Vertical Transport by Coastal Mesoscale Convective Systems

    Science.gov (United States)

    Lombardo, K.; Kading, T.

    2016-12-01

    This work is part of an ongoing investigation of coastal mesoscale convective systems (MCSs), including changes in vertical transport of boundary layer air by storms moving from inland to offshore. The density of a storm's cold pool versus that of the offshore marine atmospheric boundary layer (MABL), in part, determines the ability of the storm to successfully cross the coast, the mechanism driving storm propagation, and the ability of the storm to lift air from the boundary layer aloft. The ability of an MCS to overturn boundary layer air can be especially important over the eastern US seaboard, where warm season coastal MCSs are relatively common and where large coastal population centers generate concentrated regions of pollution. Recent work numerically simulating idealized MCSs in a coastal environment has provided some insight into the physical mechanisms governing MCS coastal crossing success and the impact on vertical transport of boundary layer air. Storms are simulated using a cloud resolving model initialized with atmospheric conditions representative of a Mid-Atlantic environment. Simulations are run in 2-D at 250 m horizontal resolution with a vertical resolution stretched from 100 m in the boundary layer to 250 m aloft. The left half of the 800 km domain is configured to represent land, while the right half is assigned as water. Sensitivity experiments are conducted to quantify the influence of varying MABL structure on MCS coastal crossing success and air transport, with MABL values representative of those observed over the western Mid-Atlantic during warm season. Preliminary results indicate that when the density of the cold pool is much greater than the MABL, the storm successfully crosses the coastline, with lifting of surface parcels, which ascend through the troposphere. When the density of the cold pool is similar to that of the MABL, parcels within the MABL remain at low levels, though parcels above the MABL ascend through the troposphere.

  10. Mesoscale modeling of solute precipitation and radiation damage

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Yongfeng [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, Daniel [Idaho National Lab. (INL), Idaho Falls, ID (United States); Ke, Huibin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Univ. of Wisconsin, Madison, WI (United States); Bai, Xianming [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hales, Jason [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-09-01

    This report summarizes the low-length-scale effort during FY 2014 in developing mesoscale capabilities for microstructure evolution in reactor pressure vessels. During operation, reactor pressure vessels are subject to hardening and embrittlement caused by irradiation-induced defect accumulation and irradiation-enhanced solute precipitation. Both defect production and solute precipitation start from the atomic scale, and manifest their eventual effects as degradation in engineering-scale properties. To predict the property degradation, multiscale modeling and simulation are needed to deal with the microstructure evolution and to link the microstructure features to material properties. In this report, the development of mesoscale capabilities for defect accumulation and solute precipitation is summarized. Atomic-scale efforts that supply information for the mesoscale capabilities are also included.

  11. Low-level wind response to mesoscale pressure systems

    Science.gov (United States)

    Garratt, J. R.; Physick, W. L.

    1983-09-01

    Observations are presented which show a strong correlation between low-level wind behaviour (e.g., rotation near the surface) and the passage of mesoscale pressure systems. The latter are associated with frontal transition zones, are dominated by a pressure-jump line and a mesoscale high pressure area, and produce locally large horizontal pressure gradients. The wind observations are simulated by specifying a time sequence of the perturbation pressure gradient and subsequently solving the vertically-integrated momentum equations with appropriate initial conditions. Very good agreement is found between observed and calculated winds; in particular, (i) a 360° rotation of the wind on passage of the mesoscale high; (ii) wind-shift lines produced dynamically by the pressure-jump line; (iii) a rapid linear increase in wind speed on passage of the pressure jump.
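
    A toy version of this kind of calculation: integrating slab (vertically-integrated) momentum equations du/dt = -Px/rho + f v, dv/dt = -Py/rho - f u with a prescribed pressure-gradient pulse; the forcing shape and all numbers are illustrative, not those of the paper:

        import numpy as np

        f = 1.0e-4            # Coriolis parameter (s^-1)
        rho = 1.2             # air density (kg m^-3)
        dt, nsteps = 60.0, 360   # 6 hours at 1-min steps
        u, v = 0.0, 0.0

        for step in range(nsteps):
            t = step * dt
            # pressure-jump pulse: strong gradient for the first hour, then zero,
            # after which the wind rotates inertially
            Px = 2.0e-3 if t < 3600.0 else 0.0   # Pa m^-1
            Py = 0.0
            u += dt * (-Px / rho + f * v)
            v += dt * (-Py / rho - f * u)

        print(f"wind after 6 h: u={u:.1f} m/s, v={v:.1f} m/s")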

  12. Intercomparison of state-of-the-art models for wind energy resources with mesoscale models

    Science.gov (United States)

    Olsen, Bjarke Tobias; Hahmann, Andrea N.; Sempreviva, Anna Maria; Badger, Jake; Joergensen, Hans E.

    2016-04-01

    …vertical resolution, model parameterizations, surface roughness length) that could be used to group the various models and interpret the results of the intercomparison. 3. Main body abstract: Twenty separate entries were received by the deadline of 31 March 2015. They included simulations done with various versions of the Weather Research and Forecasting (WRF) model, but also with six other well-known mesoscale models. The various entries represent an excellent sample of the models used by the wind energy industry today. The analysis of the submitted time series included comparison to observations, summarized with well-known measures such as biases, RMSEs, correlations, and sector-wise statistics, e.g., frequency and Weibull A and k. The comparison also includes the observed and modeled temporal spectra. The various statistics were grouped as a function of the models, their spatial resolution, forcing data, and the various integration methods. Many statistics have been computed and will be presented in addition to those shown in the Helsinki presentation. 4. Conclusions: The analysis of the time series from the twenty entries has proven to be an invaluable source of information about the state of the art in wind modeling with mesoscale models. Biases between the simulated and observed wind speeds at hub heights (80-100 m AGL) from the various models are around ±1.0 m/s, are fairly independent of the site, and do not seem to be directly related to the horizontal resolution used in the modeling. As probably expected, the wind speeds from the simulations using the various versions of the WRF model cluster close to each other, especially in their description of the wind profile.
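
    A sketch of the verification statistics named above for one site; the function name and the use of scipy's weibull_min are illustrative choices, not the project's actual tooling:

        import numpy as np
        from scipy import stats

        def wind_skill(model, obs):
            """Bias, RMSE, correlation and a Weibull A/k fit of the observed speeds."""
            model, obs = np.asarray(model), np.asarray(obs)
            bias = np.mean(model - obs)
            rmse = np.sqrt(np.mean((model - obs) ** 2))
            corr = np.corrcoef(model, obs)[0, 1]
            # Weibull fit: k (shape) and A (scale), location pinned at zero as is
            # conventional for wind speed.
            k, _, A = stats.weibull_min.fit(obs[obs > 0], floc=0)
            return bias, rmse, corr, A, k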

  13. Construction of a pathological risk model of occult lymph node metastases for prognostication by semi-automated image analysis of tumor budding in early-stage oral squamous cell carcinoma

    DEFF Research Database (Denmark)

    Pedersen, Nicklas Juel; Jensen, David Hebbelstrup; Lelkaitis, Giedrius

    2017-01-01

    It is challenging to identify at diagnosis those patients with early oral squamous cell carcinoma (OSCC) who have a poor prognosis and those who have a high risk of harboring occult lymph node metastases. The aim of this study was to develop a standardized and objective digital scoring method...

  14. Multiscale Modeling of Mesoscale and Interfacial Phenomena

    Science.gov (United States)

    Petsev, Nikolai Dimitrov

    With rapidly emerging technologies that feature interfaces modified at the nanoscale, traditional macroscopic models are pushed to their limits to explain phenomena where molecular processes can play a key role. Often, such problems appear to defy explanation when treated with coarse-grained continuum models alone, yet remain prohibitively expensive from a molecular simulation perspective. A prominent example is surface nanobubbles: nanoscopic gaseous domains typically found on hydrophobic surfaces that have puzzled researchers for over two decades due to their unusually long lifetimes. We show how an entirely macroscopic, non-equilibrium model explains many of their anomalous properties, including their stability and abnormally small gas-side contact angles. From this purely transport perspective, we investigate how factors such as temperature and saturation affect nanobubbles, providing numerous experimentally testable predictions. However, recent work also emphasizes the relevance of molecular-scale phenomena that cannot be described in terms of bulk phases or pristine interfaces. This is true for nanobubbles as well, whose nanoscale heights may require molecular detail to capture the relevant physics, in particular near the bubble three-phase contact line. Therefore, there is a clear need for general ways to link molecular granularity and behavior with large-scale continuum models in the treatment of many interfacial problems. In light of this, we have developed a general set of simulation strategies that couple mesoscale particle-based continuum models to molecular regions simulated through conventional molecular dynamics (MD). In addition, we derived a transport model for binary mixtures that opens the possibility for a wide range of applications in biological and drug delivery problems, and is readily reconciled with our hybrid MD-continuum techniques. Approaches that couple multiple length scales for fluid mixtures are largely absent in the literature, and

  15. New Mesoscale Fluvial Landscapes - Seismic Geomorphology and Exploration

    Science.gov (United States)

    Wilkinson, M. J.

    2013-01-01

    Megafans (100-600 km radius) are very large alluvial fans that cover significant areas on most continents, the surprising finding of recent global surveys. The number of such fans and the patterns of sedimentation on them provide new mesoscale architectures that can now be applied to continental fluvial depositional systems. Megafan-scale reconstructions in the subsurface have not yet been attempted. Seismic surveys offer new possibilities for identifying the following prospective situations at potentially unsuspected locations: (i) sand concentration points, (ii) sand-mud continuums at the mesoscale, (iii) paleo-valley forms in these generally unvalleyed landscapes, (iv) stratigraphic traps, and (v) structural traps.

  16. Land surface sensitivity of mesoscale convective systems

    Science.gov (United States)

    Tournay, Robert C.

    Mesoscale convective systems (MCSs) are important contributors to the hydrologic cycle in many regions of the world as well as major sources of severe weather. MCSs continue to challenge forecasters and researchers alike, owing to difficulties in understanding system initiation, propagation, and demise. One distinct type of MCS is that formed from individual convective cells initiated primarily by daytime heating over high terrain. This work is aimed at improving our understanding of the land surface sensitivity of this class of MCS in the contiguous United States. First, a climatology of mesoscale convective systems originating in the Rocky Mountains and adjacent high plains, from Wyoming southward to New Mexico, is developed through a combination of objective and subjective methods. This class of MCS is most important, in terms of total warm season precipitation, in the 500 to 1300 m elevations of the Great Plains (GP) to the east, from eastern Colorado to central Nebraska and northwest Kansas. Examining MCSs by longevity, comparing short-lasting MCSs with longer-lasting systems (15 hrs or more), reveals that longer-lasting systems tend to form further south and follow a longer, more southerly track. The environment into which the MCS is moving showed differences across commonly used variables in convection forecasting, with some variables showing more favorable conditions throughout (convective inhibition, 0-6 km shear and 250 hPa wind speed) ahead of longer-lasting MCSs. Other variables, such as convective available potential energy, showed improving conditions through time for longer-lasting MCSs. Some variables showed no difference across MCS longevity (precipitable water and large-scale vertical motion). From subsets of this MCS climatology, three regions of origin were chosen based on the presence of ridgelines extending eastward from the Rocky Mountains known to be foci for convection initiation and subsequent MCS formation: southern Wyoming (Cheyenne Ridge), Colorado (Palmer Divide) and

  17. Skills of different mesoscale models over Indian region during ...

    Indian Academy of Sciences (India)

    …tion and prediction of high impact severe weather systems. Such models … mesoscale models can be run at cloud-resolving resolutions (∼1 km) … similar to climate drift, indicating that those error components are …

  18. Mesoscale meteorological model based on radioactive explosion cloud simulation

    International Nuclear Information System (INIS)

    Zheng Yi; Zhang Yan; Ying Chuntong

    2008-01-01

    In order to simulate the movement and concentration distribution of radioactive clouds from nuclear explosions and dirty bombs, the mesoscale meteorological model RAMS was used. Particle size, size-activity distribution and gravitational fallout within the cloud were considered. The results show that the model can simulate the 'mushroom' cloud of an explosion. Three-dimensional flow and radioactive concentration fields were obtained. (authors)

  19. Role of land state in a high resolution mesoscale model

    Indian Academy of Sciences (India)

    Land surface characteristics; high resolution mesoscale model; Uttarakhand … to predict realistic location, timing, amount, intensity and distribution of rainfall … region embedded within two low pressure centers over the Arabian Sea and Bay of Bengal.

  20. Modeling Air-Quality in Complex Terrain Using Mesoscale and ...

    African Journals Online (AJOL)

    Air quality in complex terrain (Colorado River Valley/Grand Canyon area, Southwest U.S.) is modeled using a higher-order closure mesoscale model and a higher-order closure dispersion model. Non-reactive tracers were released in the Colorado River valley, during winter and summer 1992, to study the ...

  1. Mesoscale characterization of local property distributions in heterogeneous electrodes

    Science.gov (United States)

    Hsu, Tim; Epting, William K.; Mahbub, Rubayyat; Nuhfer, Noel T.; Bhattacharya, Sudip; Lei, Yinkai; Miller, Herbert M.; Ohodnicki, Paul R.; Gerdes, Kirk R.; Abernathy, Harry W.; Hackett, Gregory A.; Rollett, Anthony D.; De Graef, Marc; Litster, Shawn; Salvador, Paul A.

    2018-05-01

    The performance of electrochemical devices depends on the three-dimensional (3D) distributions of microstructural features in their electrodes. Several mature methods exist to characterize 3D microstructures over the microscale (tens of microns), which are useful in understanding homogeneous electrodes. However, methods that capture mesoscale (hundreds of microns) volumes at appropriate resolution (tens of nm) are lacking, though they are needed to understand more common, less ideal electrodes. Using serial sectioning with a Xe plasma focused ion beam combined with scanning electron microscopy (Xe PFIB-SEM), two commercial solid oxide fuel cell (SOFC) electrodes are reconstructed over volumes of 126 × 73 × 12.5 and 124 × 110 × 8 μm³ with a resolution on the order of ≈ 50³ nm³. The mesoscale distributions of microscale structural features are quantified and both microscale and mesoscale inhomogeneities are found. We analyze the origin of inhomogeneity over different length scales by comparing experimental and synthetic microstructures, generated with different particle size distributions, with such synthetic microstructures capturing well the high-frequency heterogeneity. Effective medium theory models indicate that significant mesoscale variations in local electrochemical activity are expected throughout such electrodes. These methods offer improved understanding of the performance of complex electrodes in energy conversion devices.

  2. Onset of meso-scale turbulence in active nematics

    NARCIS (Netherlands)

    Doostmohammadi, A.; Shendruk, T.N.; Thijssen, K.; Yeomans, J.M.

    2017-01-01

    Meso-scale turbulence is an innate phenomenon, distinct from inertial turbulence, that spontaneously occurs at low Reynolds number in fluidized biological systems. This spatiotemporal disordered flow radically changes nutrient and molecular transport in living fluids and can strongly affect the

  3. Calculation of extreme wind atlases using mesoscale modeling. Final report

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Badger, Jake

    This is the final report of the project PSO-10240 "Calculation of extreme wind atlases using mesoscale modeling". The overall objective is to improve the estimation of extreme winds by developing and applying new methodologies to confront the many weaknesses in the current methodologies as explai...

  4. Probabilistic flood damage modelling at the meso-scale

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during recent years, they are still not standard practice in flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied to 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models, and on the other hand by comparing the results with official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. The significant advantage of the probabilistic flood loss estimation model BT-FLEMO is thus that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
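
    A stand-in for the bagging-decision-tree idea behind BT-FLEMO, where the spread across trees yields a rough predictive distribution of damage; the features and data below are synthetic, not ATKIS land-use units:

        import numpy as np
        from sklearn.ensemble import BaggingRegressor
        from sklearn.tree import DecisionTreeRegressor

        rng = np.random.default_rng(1)
        # Hypothetical predictors, e.g. water depth, duration, building value.
        X = rng.uniform(0, 1, size=(500, 3))
        y = 0.4 * X[:, 0] + 0.2 * X[:, 1] * X[:, 2] + rng.normal(0, 0.05, 500)

        model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100, random_state=0)
        model.fit(X, y)

        x_new = np.array([[0.7, 0.3, 0.8]])
        per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
        # Central estimate plus an uncertainty band from the ensemble spread.
        print(per_tree.mean(), np.percentile(per_tree, [5, 95]))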

  5. Initializing a Mesoscale Boundary-Layer Model with Radiosonde Observations

    Science.gov (United States)

    Berri, Guillermo J.; Bertossa, Germán

    2018-01-01

    A mesoscale boundary-layer model is used to simulate low-level regional wind fields over the La Plata River of South America, a region characterized by a strong daily cycle of land-river surface-temperature contrast and low-level circulations of the sea-land breeze type. The initial and boundary conditions are defined from a limited number of local observations, and the upper boundary condition is taken from the only radiosonde observations available in the region. The study considers 14 different upper boundary conditions defined from the radiosonde data at standard levels, significant levels, the level of the inversion base, and interpolated levels at fixed heights, all of them within the first 1500 m. The period of analysis is 1994-2008, during which eight daily observations from 13 weather stations of the region are used to validate the 24-h surface-wind forecast. The model errors are defined as the root-mean-square of the relative error in wind-direction frequency distribution and mean wind speed per wind sector. Wind-direction errors are greater than wind-speed errors and show significant dispersion among the different upper boundary conditions, not present in wind speed, revealing a sensitivity to the initialization method. The wind-direction errors show a well-defined daily cycle, not evident in wind speed, with the minimum at noon and the maximum at dusk, but no systematic deterioration with time. The errors grow with the height of the upper boundary condition level, in particular for wind direction, and are double those obtained when the upper boundary condition is defined from the lower levels. The conclusion is that defining the model upper boundary condition from radiosonde data closer to the ground minimizes the low-level wind-field errors throughout the region.

  6. Do mesoscale faults in a young fold belt indicate regional or local stress?

    Science.gov (United States)

    Kokado, Akihiro; Yamaji, Atsushi; Sato, Katsushi

    2017-04-01

    The results of paleostress analyses of mesoscale faults are usually thought of as evidence of a regional stress. On the other hand, the recent advancement of trishear modeling has enabled us to predict the deformation field around fault-propagation folds without the difficulty of assuming the paleo-mechanical properties of rocks and sediments. We combined the analysis of observed mesoscale faults with trishear modeling to understand the significance of regional and local stresses for the formation of mesoscale faults. To this end, we conducted 2D trishear inverse modeling with a curved thrust fault to predict the subsurface structure and strain field of an anticline, which has a more or less horizontal axis and shows a map-scale plane strain perpendicular to the axis, in the active fold belt of the Niigata region, central Japan. The anticline is thought to have been formed by fault-propagation folding under WNW-ESE regional compression. Based on the attitudes of strata and the positions of key tephra beds in Lower Pleistocene soft sediments cropping out at the surface, we obtained (1) a fault-propagation fold with the fault tip at a depth of ca. 4 km as the optimal subsurface structure, and (2) the temporal variation of the deformation field during the folding. We assumed that mesoscale faults were activated along the direction of maximum shear strain on the faults to test whether the fault-slip data collected at the surface were consistent with the deformation in some stage(s) of folding. The Wallace-Bott hypothesis was used to estimate the consistency of faults with the regional stress. As a result, the folding and the regional stress explained 27 and 33 of the 45 observed faults, respectively, with 11 faults being consistent with both. Both the folding and the regional stress were inconsistent with the remaining 17 faults, which could be explained by transfer faulting and/or the gravitational spreading of the growing anticline. The lesson we learnt from this work was
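
    A minimal numeric sketch of the Wallace-Bott criterion invoked above: for an assumed stress tensor, the predicted slip direction on a fault plane is that of the maximum resolved shear stress, and the misfit is the angle to the observed striation (the function names are ours):

        import numpy as np

        def predicted_slip(sigma, n):
            """Unit slip vector on a plane with unit normal n under stress tensor sigma.
            Assumes the resolved shear stress is nonzero."""
            t = sigma @ n                 # traction on the plane
            t_n = (t @ n) * n             # normal component
            t_s = t - t_n                 # shear component lies in the plane
            return t_s / np.linalg.norm(t_s)

        def misfit_angle(sigma, n, slip_obs):
            """Angle (deg) between observed striae and the Wallace-Bott prediction."""
            s = predicted_slip(sigma, n)
            cosang = np.clip(s @ slip_obs, -1.0, 1.0)
            return np.degrees(np.arccos(cosang))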

  7. Characterizing the Meso-scale Plasma Flows in Earth's Coupled Magnetosphere-Ionosphere-Thermosphere System

    Science.gov (United States)

    Gabrielse, C.; Nishimura, T.; Lyons, L. R.; Gallardo-Lacourt, B.; Deng, Y.; McWilliams, K. A.; Ruohoniemi, J. M.

    2017-12-01

    NASA's Heliophysics Decadal Survey put forth several imperative Key Science Goals. The second goal communicates the urgent need to "Determine the dynamics and coupling of Earth's magnetosphere, ionosphere, and atmosphere and their response to solar and terrestrial inputs...over a range of spatial and temporal scales." Sun-Earth connections (called Space Weather) have strong societal impacts because extreme events can disturb radio communications and satellite operations. The field's current modeling capabilities of such Space Weather phenomena include large-scale, global responses of the Earth's upper atmosphere to various inputs from the Sun, but the meso-scale (∼50-500 km) structures that are much more dynamic and powerful in the coupled system remain uncharacterized. Their influences are thus far poorly understood. We aim to quantify such structures, particularly auroral flows and streamers, in order to create an empirical model of their size, location, speed, and orientation based on activity level (AL index), season, solar cycle (F10.7), interplanetary magnetic field (IMF) inputs, etc. We present a statistical study of meso-scale flow channels in the nightside auroral oval and polar cap using SuperDARN. These results are used to inform global models such as the Global Ionosphere Thermosphere Model (GITM) in order to evaluate the role of meso-scale disturbances on the fully coupled magnetosphere-ionosphere-thermosphere system. Measuring the ionospheric footpoint of magnetospheric fast flows, our analysis technique from the ground also provides a 2D picture of flows and their characteristics during different activity levels that spacecraft alone cannot.

  8. Comparison of Four Mixed Layer Mesoscale Parameterizations and the Equation for an Arbitrary Tracer

    Science.gov (United States)

    Canuto, V. M.; Dubovikov, M. S.

    2011-01-01

    In this paper we discuss two issues: the inter-comparison of four mixed-layer mesoscale parameterizations and the search for the eddy-induced velocity for an arbitrary tracer. It must be stressed that our analysis is limited to mixed-layer mesoscales, since we do not treat sub-mesoscales and small-scale turbulent mixing. As for the first item, since three of the four parameterizations are expressed in terms of a stream function and a residual flux of the RMT (residual mean theory) formalism, while the fourth is expressed in terms of vertical and horizontal fluxes, we needed a formalism to connect the two formulations. The standard RMT representation developed for the deep ocean cannot be extended to the mixed layer, since its stream function does not vanish at the ocean's surface. We develop a new RMT representation that satisfies the surface boundary condition. As for the general form of the eddy-induced velocity for an arbitrary tracer, thus far it has been assumed that there is only the one that originates from the curl of the stream function. This is because it was assumed that the tracer residual flux is purely diffusive. We show, however, that in the case of an arbitrary tracer the residual flux also has a skew component that gives rise to an additional bolus velocity. Therefore, instead of only one bolus velocity there are now two: one coming from the curl of the stream function and the other from the skew part of the residual flux. In the buoyancy case, only one bolus velocity contributes to the mean buoyancy equation, since the residual flux is indeed purely diffusive.
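
    For reference, the standard textbook decomposition behind this argument can be written as follows, with c̄ the mean tracer, K a diffusivity and Ψ a vector stream function; the skew part acts as advection by a bolus velocity u* (this is the generic form, not the authors' specific closure):

        \overline{\mathbf{u}'c'} \;=\; -K\,\nabla\bar{c} \;+\; \boldsymbol{\Psi}\times\nabla\bar{c},
        \qquad
        \nabla\cdot\big(\boldsymbol{\Psi}\times\nabla\bar{c}\big) \;=\; \mathbf{u}^{*}\cdot\nabla\bar{c},
        \qquad
        \mathbf{u}^{*} \;=\; \nabla\times\boldsymbol{\Psi}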

  9. Investigating the Potential Impact of the Surface Water and Ocean Topography (SWOT) Altimeter on Ocean Mesoscale Prediction

    Science.gov (United States)

    Carrier, M.; Ngodock, H.; Smith, S. R.; Souopgui, I.

    2016-02-01

    NASA's Surface Water and Ocean Topography (SWOT) satellite, scheduled for launch in 2020, will provide sea surface height anomaly (SSHA) observations with a wider swath and higher spatial resolution than current satellite altimeters. It is expected that this will help to further constrain ocean models in terms of the mesoscale circulation. In this work, this expectation is investigated by way of twin data assimilation experiments using the Navy Coastal Ocean Model Four-Dimensional Variational (NCOM-4DVAR) data assimilation system in a weak-constraint formulation. Here, a nature run is created from which SWOT observations are sampled, as well as along-track SSHA observations from simulated Jason-2 tracks. The simulated SWOT data have appropriate spatial coverage, resolution, and noise characteristics based on an observation-simulator program provided by the SWOT science team. The experiment is run for a three-month period during which the analysis is updated every 24 hours and each analysis is used to initialize a 96-hour forecast. The forecasts in each experiment are compared to the nature run to determine the impact of the assimilated data. It is demonstrated here that the SWOT observations help to constrain the model mesoscale in a more consistent manner than traditional altimeter observations. The findings of this study suggest that data from SWOT may have a substantial impact on improving the ocean model analysis and forecast of mesoscale features and surface ocean transport.

  10. Laser polishing of 3D printed mesoscale components

    International Nuclear Information System (INIS)

    Bhaduri, Debajyoti; Penchev, Pavel; Batal, Afif; Dimov, Stefan; Soo, Sein Leung; Sten, Stella; Harrysson, Urban; Zhang, Zhenxue; Dong, Hanshan

    2017-01-01

    Highlights: • Process optimisation for laser polishing novel 3D printed SS316L parts. • Evaluating the effects of key polishing parameters on SS316L surface roughness. • Detailed spectroscopic analysis of oxide layer formation due to laser polishing. • Comparative surface integrity analysis of SS parts polished in air and argon. • A maximum reduction in roughness of over 94% achieved at optimised polishing settings. - Abstract: Laser polishing of various engineered materials such as glass, silica, steel, nickel and titanium alloys has attracted considerable interest in the last 20 years due to its superior flexibility, operating speed and capability for localised surface treatment compared to conventional mechanical-based methods. The paper initially reports results from process optimisation experiments aimed at investigating the influence of laser fluence and pulse overlap parameters on resulting workpiece surface roughness following laser polishing of planar 3D printed stainless steel (SS316L) specimens. A maximum reduction in roughness of over 94% (from ∼3.8 to ∼0.2 μm S_a) was achieved at the optimised settings (fluence of 9 J/cm² and overlap factors of 95% and 88–91% along the beam scanning and step-over directions respectively). Subsequent analysis using both X-ray photoelectron spectroscopy (XPS) and glow discharge optical emission spectroscopy (GDOES) confirmed the presence of surface oxide layers (predominantly consisting of Fe and Cr phases) up to a depth of ∼0.5 μm when laser polishing was performed under normal atmospheric conditions. Conversely, formation of oxide layers was negligible when operating in an inert argon gas environment. The microhardness of the polished specimens was primarily influenced by the input thermal energy, with greater sub-surface hardness (up to ∼60%) recorded in the samples processed with higher energy density. Additionally, all of the polished surfaces were free of the scratch marks, pits, holes, lumps

  12. Observed 3D Structure, Generation, and Dissipation of Mesoscale Eddies in the South China Sea

    Science.gov (United States)

    Zhang, Z.; Tian, J.; Qiu, B.; Zhao, W.

    2016-12-01

    The South China Sea (SCS), the largest marginal sea in the western Pacific, is abundant with strong mesoscale eddies, as revealed by both satellite and in situ observations. The 3D structure, generation, and dissipation mechanisms of the SCS mesoscale eddies, however, are still not well understood at present due to the lack of well-designed and comprehensive field observations. In order to address the above scientific issues, the SCS Mesoscale Eddy Experiment (S-MEE for short) was designed and conducted in the period from October 2013 to June 2014. As part of S-MEE, two bottom-anchored subsurface mooring arrays, one consisting of 10 moorings and the other of 7 moorings, were deployed along the historical pathway of the mesoscale eddies in the northern SCS. All the moorings were equipped with ADCPs, RCMs, CTDs and temperature chains to make continuous measurements of horizontal current velocity and temperature/salinity in the whole water column. In addition to the moored observations, we also conducted two transects across the center of one anticyclonic eddy (AE) and made high-resolution hydrographic and turbulent mixing measurements. Based on the data collected by the S-MEE, we obtained the full-depth 3D structures of one AE and one cyclonic eddy (CE) and revealed their generation and dissipation mechanisms. For the first time we found that the eddies in the northern SCS extend from the surface to the sea bottom and display prominently tilted structures in the vertical. The AE is suggested to have been shed from the Kuroshio, which intruded into the SCS through the Luzon Strait in winter, while the generation of the CE was associated with the barotropic instability of the Kuroshio. By conducting an eddy energy budget analysis, we further identified that the generation of submesoscale motions constitutes the dominant mechanism for eddy dissipation. The findings of this study not only provide new insights into the 3D structure of oceanic eddies, but also contribute to

  13. Micro- and meso-scale effects of forested terrain

    DEFF Research Database (Denmark)

    Dellwik, Ebba; Mann, Jakob; Sogachev, Andrey

    2011-01-01

    The height and rotor diameter of modern wind turbines are so extensive that the wind conditions they encounter often are well above the surface layer, where traditionally it is assumed that wind direction and turbulent fluxes are constant with respect to height if the surface is homogeneous. Deviations from the requirement of homogeneity are often the focus of micro-scale studies in forested areas. Yet, to explain the wind climate in the relevant height range for turbines, it is necessary to also account for the length scales that are important parameters for the meso-scale flow. These length scales are the height of the planetary boundary layer and the Monin-Obukhov length, which both are related to the energy balance of the surface. Examples of important micro- and meso-scale effects of forested terrain are shown using data and model results from recent and ongoing experiments. For micro…

  14. Spectral structure of mesoscale winds over the water

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Vincent, Claire Louise; Larsen, Søren Ejling

    2013-01-01

    …spectra show universal characteristics, in agreement with the findings in the literature, including the energy amplitude and the −5/3 spectral slope in the mesoscale range transitioning to a slope of −3 for synoptic and planetary scales. The integral time-scale of the local weather is found to be useful to describe the spectral slope transition as well as the limit for application of the Taylor hypothesis. The stability parameter calculated from point measurements, the bulk Richardson number, is found insufficient to represent the various atmospheric structures that have their own spectral behaviours under different stability conditions, such as open cells and gravity waves. For stationary conditions, the mesoscale turbulence is found to bear some characteristics of two-dimensional isotropy, including (1) very minor vertical variation of spectra; (2) similar spectral behaviour for the along- and across…

  15. Assessment of MARMOT. A Mesoscale Fuel Performance Code

    Energy Technology Data Exchange (ETDEWEB)

    Tonks, M. R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Schwen, D. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Zhang, Y. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Chakraborty, P. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bai, X. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Fromm, B. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Yu, J. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Teague, M. C. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Andersson, D. A. [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-04-01

    MARMOT is the mesoscale fuel performance code under development as part of the US DOE Nuclear Energy Advanced Modeling and Simulation Program. In this report, we provide a high-level summary of MARMOT, its capabilities, and its current state of validation. The purpose of MARMOT is to predict the coevolution of microstructure and material properties of nuclear fuel and cladding. It accomplishes this using the phase field method coupled to solid mechanics and heat conduction. MARMOT is based on the Multiphysics Object-Oriented Simulation Environment (MOOSE), and much of its basic capability in the areas of the phase field method, mechanics, and heat conduction comes directly from MOOSE modules. However, additional capability specific to fuel and cladding is available in MARMOT. While some validation of MARMOT has been completed in the areas of fission gas behavior and grain growth, much more validation needs to be conducted, and new mesoscale data need to be obtained in order to complete it.
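
    For a flavor of the phase-field machinery such codes are built on, a toy explicit Allen-Cahn relaxation of a diffuse interface is sketched below; this is illustrative only (MARMOT itself is a MOOSE-based finite-element code, and all numbers here are arbitrary):

        import numpy as np

        L, kappa = 1.0, 0.01          # mobility and gradient-energy coefficient
        dx, dt = 0.1, 1.0e-3
        eta = np.tanh(np.linspace(-5, 5, 101))   # diffuse interface profile

        for _ in range(1000):
            # np.roll implies periodic boundaries; fine for a toy demonstration
            lap = (np.roll(eta, 1) - 2 * eta + np.roll(eta, -1)) / dx**2
            dfdeta = eta**3 - eta                # derivative of the double-well free energy
            eta += -L * dt * (dfdeta - kappa * lap)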

  16. Mesoscale modeling: solving complex flows in biology and biotechnology.

    Science.gov (United States)

    Mills, Zachary Grant; Mao, Wenbin; Alexeev, Alexander

    2013-07-01

    Fluids are involved in practically all physiological activities of living organisms. However, biological and biorelated flows are hard to analyze due to the inherent combination of interdependent effects and processes that occur on a multitude of spatial and temporal scales. Recent advances in mesoscale simulations enable researchers to tackle problems that are central for the understanding of such flows. Furthermore, computational modeling effectively facilitates the development of novel therapeutic approaches. Among other methods, dissipative particle dynamics and the lattice Boltzmann method have become increasingly popular during recent years due to their ability to solve a large variety of problems. In this review, we discuss recent applications of these mesoscale methods to several fluid-related problems in medicine, bioengineering, and biotechnology. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Multi-sensor in situ observations to resolve the sub-mesoscale features in the stratified Gulf of Finland, Baltic Sea

    Science.gov (United States)

    Lips, Urmas; Kikas, Villu; Liblik, Taavi; Lips, Inga

    2016-05-01

    High-resolution numerical modeling, remote sensing, and in situ data have revealed the significant role of sub-mesoscale features in shaping the distribution pattern of tracers in the ocean's upper layer. However, in situ measurements are difficult to conduct with the resolution and coverage in time and space required to resolve the sub-mesoscale, especially in relatively shallow basins such as the Gulf of Finland, where the typical baroclinic Rossby radius is 2-5 km. To map the multi-scale spatiotemporal variability in the gulf, we initiated continuous measurements with autonomous devices, including a moored profiler and a Ferrybox system, complemented by dedicated research-vessel-based surveys. The analysis of the high-resolution data collected in the summers of 2009-2012 revealed pronounced variability at the sub-mesoscale in the presence of mesoscale upwelling/downwelling, fronts, and eddies. The horizontal wavenumber spectra of temperature variance in the surface layer had slopes close to -2 between lateral scales of 10 to 0.5 km. A similar tendency towards -2 slopes of the horizontal wavenumber spectra of temperature variance was found in the seasonal thermocline between lateral scales of 10 to 1 km. This suggests that ageostrophic sub-mesoscale processes could contribute considerably to the energy cascade in such a stratified sea basin. We showed that intrusions of water with different salinity, which indicate the occurrence of a layered flow structure, can appear during upwelling/downwelling development and relaxation in response to variable wind forcing. We suggest that sub-mesoscale processes play a major role in feeding surface blooms under conditions of coupled coastal upwelling and downwelling events in the Gulf of Finland.
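
    A minimal sketch of how such a wavenumber-spectrum slope can be estimated from a uniformly sampled transect, assuming a plain periodogram and a log-log fit (windowing and detrending details are omitted):

        import numpy as np

        def spectral_slope(signal, dx, k_min, k_max):
            """Slope of the power spectrum of a 1D transect sampled every dx (km)."""
            signal = np.asarray(signal) - np.mean(signal)
            spec = np.abs(np.fft.rfft(signal)) ** 2
            k = np.fft.rfftfreq(len(signal), d=dx)   # cycles per km
            sel = (k >= k_min) & (k <= k_max)
            slope, _ = np.polyfit(np.log(k[sel]), np.log(spec[sel]), 1)
            return slope   # a value near -2 would match the result quoted above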

  18. Parameterization of phase change of water in a mesoscale model

    Energy Technology Data Exchange (ETDEWEB)

    Levkov, L; Eppel, D; Grassl, H

    1987-01-01

    A parameterization scheme for the phase change of water is proposed for use in the 3-D numerical nonhydrostatic model GESIMA. The microphysical formulation follows the so-called bulk technique. With this procedure the net production rates in the balance equations for water and potential temperature are given for both the liquid and ice phases. Convectively stable as well as convectively unstable mesoscale systems are considered.

  19. Maps of mesoscale wind variability over the North Sea region

    DEFF Research Database (Denmark)

    Vincent, Claire Louise; Hahmann, Andrea N.; Badger, Jake

    Mesoscale wind fluctuations affect the operation of wind farms, particularly as the number of geographically concentrated wind farms in the North Sea increases (Akhmatov et al. 2007). The frequency and intensity of wind fluctuations could be considered as a new siting criterion, together with exi… for a 1-year period. The model was run with a horizontal grid spacing of 2 km. The variability maps are created by integrating the average 24-hour spectra at every grid point over different time-scales.

  20. Mesoscale modeling of metal-loaded high explosives

    Energy Technology Data Exchange (ETDEWEB)

    Bdzil, John Bohdan [Los Alamos National Laboratory; Lieberthal, Brandon [UNIV OF ILLINOIS; Srewart, Donald S [UNIV OF ILLINOIS

    2010-01-01

    We describe a 3D approach to modeling multi-phase blast explosives, which are primarily condensed explosive by volume with embedded inert particles. The embedded particles are uniform in size and placed on a regular lattice. The asymptotic theory of detonation shock dynamics governs the detonation shock propagation in the explosive. Mesoscale hydrodynamic simulations are used to show how the particles are compressed, deformed, and accelerated by the high-speed flow of detonation products.

  1. Does mesoscale matter in decadal changes observed in the northern Canary upwelling system?

    Science.gov (United States)

    Relvas, P.; Luís, J.; Santos, A. M. P.

    2009-04-01

    Western Iberia constitutes the northern limb of the Canary Current Upwelling System, one of the four Eastern Boundary Upwelling Systems of the world ocean. The strong dynamic link between the atmosphere and the ocean makes these systems highly sensitive to global change and ideal for monitoring and investigating its effects. In order to investigate decadal changes of the mesoscale patterns in the northern Canary upwelling system (off Western Iberia), the field of satellite-derived sea surface temperature (SST) trends was built at the pixel scale (4×4 km) for the period 1985-2007, based on the monthly mean data from the Advanced Very High Resolution Radiometer (AVHRR) on board the NOAA series satellites, provided by the NASA Physical Oceanography Distributed Active Archive Center (PO.DAAC) at the Jet Propulsion Laboratory. The time series were limited to the nighttime passes to avoid the solar heating effect, and a suite of procedures was followed to guarantee that the temperature trends were not biased towards the seasonally more abundant summer data, when the sky is considerably clearer. A robust linear fit was applied to each individual pixel, following the same pixel through time across all the processed monthly mean AVHRR SST images from 1985 until 2007. The field of SST trends was created from the slopes of the linear fits applied to each pixel. Monthly mean SST time series from the one-degree enhanced International Comprehensive Ocean-Atmosphere Data Set (ICOADS) and from near-shore measurements collected on a daily basis by the Portuguese Meteorological Office (IM) are also used to compare the results and extend the analysis back to 1960. A generalized warming trend is detected in the coastal waters off Western Iberia during the last decades, no matter which data set we analyse. However, significant spatial differences in the warming rates are observed in the satellite-derived SST trends. Remarkably, off the southern part of Western Iberia the known
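
    A simplified sketch of the per-pixel trend computation, with ordinary least squares standing in for the robust fit used in the study and hypothetical array names:

        import numpy as np

        def pixel_trends(sst, t_years):
            """Slope (deg C per year) for each pixel of sst[time, y, x], NaN-masked."""
            nt, ny, nx = sst.shape
            slopes = np.full((ny, nx), np.nan)
            for j in range(ny):
                for i in range(nx):
                    series = sst[:, j, i]
                    ok = np.isfinite(series)            # skip cloud-masked months
                    if ok.sum() > 24:                   # demand at least two years of data
                        slopes[j, i] = np.polyfit(t_years[ok], series[ok], 1)[0]
            return slopes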

  2. Explicit simulation of a midlatitude Mesoscale Convective System

    Energy Technology Data Exchange (ETDEWEB)

    Alexander, G.D.; Cotton, W.R. [Colorado State Univ., Fort Collins, CO (United States)

    1996-04-01

    We have explicitly simulated the mesoscale convective system (MCS) observed on 23-24 June 1985 during PRE-STORM, the Preliminary Regional Experiment for the Stormscale Operational and Research Meteorology Program. Stensrud and Maddox (1988), Johnson and Bartels (1992), and Bernstein and Johnson (1994) are among the researchers who have investigated various aspects of this MCS event. We performed this MCS simulation (and a similar one of a tropical MCS; Alexander and Cotton 1994) in the spirit of the Global Energy and Water Cycle Experiment Cloud Systems Study (GCSS), in which cloud-resolving models are used to assist in the formulation and testing of cloud parameterization schemes for larger-scale models. In this paper, we describe (1) the nature of our 23-24 June MCS simulation and (2) our efforts to date in using our explicit MCS simulations to assist in the development of a GCM parameterization for mesoscale flow branches. The paper is organized as follows. First, we discuss the synoptic situation surrounding the 23-24 June PRE-STORM MCS, followed by a discussion of the model setup and the results of our simulation. We then discuss the use of our MCS simulations in developing a GCM parameterization for mesoscale flow branches and summarize our results.

  3. Mesoscale cyclogenesis over the western north Pacific Ocean during TPARC

    Directory of Open Access Journals (Sweden)

    Christopher A. Davis

    2013-01-01

    Full Text Available Three cases of mesoscale marine cyclogenesis over the subtropics of the Western Pacific Ocean are investigated. Each case occurred during the THORPEX Pacific Asia Regional Campaign and Tropical Cyclone Structure (TCS-08 field phases in 2008. Each cyclone developed from remnants of disturbances that earlier showed potential for tropical cyclogenesis within the tropics. Two of the cyclones produced gale-force surface winds, and one, designated as a tropical cyclone, resulted in a significant coastal storm over eastern Japan. Development was initiated by a burst of organized mesoscale convection that consolidated and intensified the surface cyclonic circulation over a period of 12–24 h. Upper-tropospheric potential vorticity anomalies modulated the vertical wind shear that, in turn, influenced the periods of cyclone intensification and weakening. Weak baroclinicity associated with vertical shear was also deemed important in organizing mesoscale ascent and the convection outbreaks. The remnant tropical disturbances contributed exceptional water vapour content to higher latitudes that led to strong diabatic heating, and the tropical remnants contributed vorticity that was the seed of the development in the subtropics. Predictability of these events more than three days in advance appears to be minimal.

  4. An intercomparison of several diagnostic meteorological processors used in mesoscale air quality modeling

    Energy Technology Data Exchange (ETDEWEB)

    Vimont, J.C. [National Park Service, Lakewood, CO (United States); Scire, J.S. [Sigma Research Corp., Concord, MA (United States)

    1994-12-31

    A major component, and an area of uncertainty, in mesoscale air quality modeling is the specification of the meteorological fields which affect the transport and dispersion of pollutants. Various options are available for estimating the wind and mixing depth fields over a mesoscale domain. Estimates of the wind field can be obtained from spatial and temporal interpolation of available observations or from diagnostic meteorological models, which estimate a meteorological field from available data and adjust those fields based on parameterizations of physical processes. A major weakness of these processors is their dependence on spatially and temporally sparse input data, particularly upper-air data. These problems are exacerbated in regions of complex terrain and along the shorelines of large bodies of water. Similarly, the estimation of mixing depth is reliant upon sparse observations and the parameterization of convective and mechanical processes. The meteorological processors examined in this analysis were developed to drive different Lagrangian puff models. This paper describes the algorithms these processors use to estimate the wind and mixing depth fields.

  5. Seasonal to Mesoscale Variability of Water Masses in Barrow Canyon, Chukchi Sea

    Science.gov (United States)

    Nobre, C.; Pickart, R. S.; Moore, K.; Ashjian, C. J.; Arrigo, K. R.; Grebmeier, J. M.; Vagle, S.; Itoh, M.; Berchok, C.; Stabeno, P. J.; Kikuchi, T.; Cooper, L. W.; Hartwell, I.; He, J.

    2016-02-01

    Barrow Canyon is one of the primary conduits by which Pacific-origin water exits the Chukchi Sea into the Canada Basin. As such, it is an ideal location to monitor the different water masses through the year. At the same time, the canyon is an energetic environment where mixing and entrainment can occur, modifying the Pacific-origin waters. As part of the Distributed Biological Observatory (DBO) program, a transect across the canyon was occupied 24 times between 2010 and 2013 by international ships of opportunity passing through the region during summer and early fall. Here we present results from an analysis of these sections to determine the seasonal evolution of the water masses and to investigate the nature of the mesoscale variability. The mean state shows six water masses present at various times through the summer. The seasonal evolution of these summer water masses is characterized both in depth space and in temperature-salinity (T-S) space. Clear patterns emerge, including the arrival of Alaskan coastal water and its modification in early fall. The primary mesoscale variability is associated with wind-driven upwelling events which occur predominantly in September. The atmospheric forcing of these events is investigated, as is the oceanic response.

  6. Environments of Long-Lived Mesoscale Convective Systems Over the Central United States in Convection Permitting Climate Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Qing [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA]; Houze, Robert A. [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA; Department of Atmospheric Sciences, University of Washington, Seattle WA USA]; Leung, L. Ruby [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA]; Feng, Zhe [Atmospheric Sciences and Global Change Division, Pacific Northwest National Laboratory, Richland WA USA]

    2017-12-27

    Continental-scale convection-permitting simulations of the warm seasons of 2011 and 2012 reproduce realistic structure and frequency distribution of lifetime and event mean precipitation of mesoscale convective systems (MCSs) over the central United States. Analysis is performed to determine the environmental conditions conducive to generating the longest-lived MCSs and their subsequent interactions. The simulations show that MCSs systematically form over the Great Plains ahead of a trough in the westerlies in combination with an enhanced low-level jet from the Gulf of Mexico. These environmental properties at the time of storm initiation are most prominent for the MCSs that persist for the longest times. Systems reaching 9 h or more in lifetime feed back on the environmental conditions through diabatic heating in the MCS stratiform regions. As a result, the parent synoptic-scale wave is strengthened as a divergent perturbation develops over the MCS at high levels, while a cyclonic circulation perturbation develops in the midlevels of the trough, where the vertical gradient of heating in the MCS region is maximized. The quasi-balanced mesoscale vortex helps to maintain the MCS over a long period of time by feeding dry, cool air into the rear of the MCS region, where it enhances the evaporative cooling that helps sustain the system. At lower levels the south-southeasterly jet of warm moist air from the Gulf is enhanced in the presence of the synoptic-scale wave. That moisture supply is essential to the continued redevelopment of the MCS.

  7. Mesoscale variability in the Bransfield Strait region (Antarctica) during Austral summer

    Directory of Open Access Journals (Sweden)

    M. A. García

    1994-08-01

    Full Text Available The Bransfield Strait is one of the best-known areas of Antarctica's oceanic surroundings. In spite of this, the study of the mesoscale variability of its local circulation has been addressed only recently. This paper focuses on the mesoscale structure of local physical oceanographic conditions in the Bransfield Strait during the Austral summer as derived from the BIOANTAR 93 cruise and auxiliary remote sensing data. Moreover, data recovered from moored current meters allow identification of transient mesoscale phenomena.

  8. Mesoscale Model Data Preparation and Execution: A New Method Utilizing the Internet

    National Research Council Canada - National Science Library

    Kirby, Stephen

    2002-01-01

    In order to streamline and simplify the methodologies required to obtain and process the requisite meteorological data for mesoscale meteorological models such as the Battlescale Forecast Model (BFM...

  9. North Pacific Mesoscale Coupled Air-Ocean Simulations Compared with Observations

    Energy Technology Data Exchange (ETDEWEB)

    Cerovecki, Ivana [Univ. of California, San Diego, CA (United States). Scripps Inst. of Oceanography; McClean, Julie [Univ. of California, San Diego, CA (United States). Scripps Inst. of Oceanography; Koracin, Darko [Desert Research Inst. (DRI), Reno, NV (United States). Division of Atmospheric Sciences

    2014-11-14

    The overall objective of this study was to improve the representation of regional ocean circulation in the North Pacific by using high-resolution atmospheric forcing that accurately represents mesoscale processes in an ocean-atmosphere regional (North Pacific) model configuration. The goal was to assess the importance of an accurate representation of mesoscale processes in the atmosphere and the ocean for the large-scale circulation. This is an important question, as mesoscale processes in the atmosphere, which are resolved by high-resolution mesoscale atmospheric models such as the Weather Research and Forecasting (WRF) model, are absent from commonly used atmospheric forcing products such as the CORE forcing employed in, e.g., the Community Climate System Model (CCSM).

  10. Experimental Study on Meso-Scale Milling Process Using Nanofluid Minimum Quantity Lubrication

    International Nuclear Information System (INIS)

    Lee, P. H.; Nam, T. S.; Li, Cheng Jun; Lee, S. W.

    2010-01-01

    This paper presents the characteristics of micro- and meso-scale milling processes in which compressed cold air, minimum quantity lubrication (MQL) and MoS2 nanofluid MQL are used. For process characterization, micro- and meso-scale milling experiments are conducted using a desktop meso-scale machine tool system, and the surface roughness is measured. The experimental results show that the use of compressed cold air and nanofluid MQL in the micro- and meso-scale milling processes is effective in improving the surface finish.

  11. Data assimilation of a ten-day period during June 1993 over the Southern Great Plains Site using a nested mesoscale model

    Energy Technology Data Exchange (ETDEWEB)

    Dudhia, J.; Guo, Y.R. [National Center for Atmospheric Research, Boulder, CO (United States)

    1996-04-01

    A goal of the Atmospheric Radiation Measurement (ARM) Program has been to obtain a complete representation of physical processes on the scale of a general circulation model (GCM) grid box in order to better parameterize radiative processes in these models. Since an observational network of practical size cannot be used alone to characterize the Cloud and Radiation Testbed (CART) site's 3D structure and time development, data assimilation using the enhanced observations together with a mesoscale model is used to give a full 4D analysis at high resolution. The National Center for Atmospheric Research (NCAR)/Penn State Mesoscale Model (MM5) has been applied over a ten-day continuous period in a triple-nested mode with grid sizes of 60, 20 and 6.67 km. The outer domain covers the 48 contiguous United States; the innermost is a 480-km square centered on Lamont, Oklahoma. A simulation has been run with data assimilation using the Mesoscale Analysis and Prediction System (MAPS) 60-km analyses from the Forecast Systems Laboratory (FSL) of the National Oceanic and Atmospheric Administration (NOAA). The nested domains take boundary conditions from and feed back continually to their parent meshes (i.e., they are two-way interactive). As reported last year, this provided a simulation of the basic features of mesoscale events over the CART site during the period 16-26 June 1993 when an Intensive Observation Period (IOP) was under way.

  12. Preliminary design of mesoscale turbocompressor and rotordynamics tests of rotor bearing system

    Science.gov (United States)

    Hossain, Md Saddam

    2011-12-01

    A mesoscale turbocompressor spinning above 500,000 RPM is an evolutionary technology for micro turbochargers, turbo blowers, turbo compressors, micro gas turbines, auxiliary power units, etc., for the automotive, aerospace, and fuel cell industries. The objectives of this work are: (1) to evaluate different air foil bearings designed for the intended applications, and (2) to design and perform CFD analysis of a micro-compressor. CFD analysis of a shrouded 3-D micro compressor was conducted using Ansys BladeGen as the blade generation tool, ICEM CFD as the mesh generation tool, and CFX as the main solver for different design and off-design cases and also for different numbers of blades. Comprehensive experimental facilities for testing the turbocompressor system have also been designed and proposed for future work.

  13. Evaluation of performance, efficacy and safety of semi-automated lamellar keratoplasty

    Directory of Open Access Journals (Sweden)

    Núbia Cristina de Freitas Maia

    2006-12-01

    Full Text Available PURPOSE: To evaluate the feasibility, efficacy and safety of a microkeratome and an artificial anterior chamber for lamellar keratoplasty (ALTK® system). METHODS: Twenty-one eyes with superficial corneal opacities underwent semi-automated lamellar keratoplasty. In recipient eyes, keratectomy was performed as in refractive surgery. Donor lamellae were obtained from corneoscleral buttons using the same microkeratome and an artificial anterior chamber. Corneal thickness was measured by ultrasound biomicroscopy. RESULTS: Surgery was successful in 19 eyes. In 80% of the lamellae cut from donor corneas and in 84.2% of those cut in recipient eyes, the diameter varied by no more than 0.5 mm from the intended value. Recipient and donor lamellar thicknesses were highly similar. Postoperative corrected visual acuity of 20/40 or better was achieved in 52.6% of the eyes. Complications included inadequate lamellar diameter, intraoperative perforation of a recipient eye, and postoperative corneal ectasia (one case). CONCLUSIONS: Semi-automated lamellar keratoplasty proved feasible, given the reproducibility of lamellar thicknesses and diameters; effective, given the improvement in postoperative visual acuity; and safe, given the low rate of surgical complications.

  14. Towards a Spatial Understanding of Trade-Offs in Sustainable Development: A Meso-Scale Analysis of the Nexus between Land Use, Poverty, and Environment in the Lao PDR

    Science.gov (United States)

    Messerli, Peter; Bader, Christoph; Hett, Cornelia; Epprecht, Michael; Heinimann, Andreas

    2015-01-01

    In land systems, equitably managing trade-offs between planetary boundaries and human development needs represents a grand challenge in sustainability oriented initiatives. Informing such initiatives requires knowledge about the nexus between land use, poverty, and environment. This paper presents results from Lao PDR, where we combined nationwide spatial data on land use types and the environmental state of landscapes with village-level poverty indicators. Our analysis reveals two general but contrasting trends. First, landscapes with paddy or permanent agriculture allow a greater number of people to live in less poverty but come at the price of a decrease in natural vegetation cover. Second, people practising extensive swidden agriculture and living in intact environments are often better off than people in degraded paddy or permanent agriculture. As poverty rates within different landscape types vary more than between landscape types, we cannot stipulate a land use–poverty–environment nexus. However, the distinct spatial patterns or configurations of these rates point to other important factors at play. Drawing on ethnicity as a proximate factor for endogenous development potentials and accessibility as a proximate factor for external influences, we further explore these linkages. Ethnicity is strongly related to poverty in all land use types almost independently of accessibility, implying that social distance outweighs geographic or physical distance. In turn, accessibility, almost a precondition for poverty alleviation, is mainly beneficial to ethnic majority groups and people living in paddy or permanent agriculture. These groups are able to translate improved accessibility into poverty alleviation. Our results show that the concurrence of external influences with local—highly contextual—development potentials is key to shaping outcomes of the land use–poverty–environment nexus. By addressing such leverage points, these findings help guide more

  15. Three-dimensional mesoscale heterostructures of ZnO nanowire arrays epitaxially grown on CuGaO2 nanoplates as individual diodes.

    Science.gov (United States)

    Forticaux, Audrey; Hacialioglu, Salih; DeGrave, John P; Dziedzic, Rafal; Jin, Song

    2013-09-24

    We report a three-dimensional (3D) mesoscale heterostructure composed of one-dimensional (1D) nanowire (NW) arrays epitaxially grown on two-dimensional (2D) nanoplates. Specifically, three facile syntheses are developed to assemble vertical ZnO NWs on CuGaO2 (CGO) nanoplates in mild aqueous solution conditions. The key to the successful 3D mesoscale integration is the preferential nucleation and heteroepitaxial growth of ZnO NWs on the CGO nanoplates. Using transmission electron microscopy, heteroepitaxy was found between the basal planes of CGO nanoplates and ZnO NWs, which are their respective (001) crystallographic planes, by the observation of a hexagonal Moiré fringe pattern resulting from the slight mismatch between the c planes of ZnO and CGO. Careful analysis shows that this pattern can be described by a hexagonal supercell with a lattice parameter of almost exactly 11 and 12 times the a lattice constants for ZnO and CGO, respectively. The electrical properties of the individual CGO-ZnO mesoscale heterostructures were measured using a current-sensing atomic force microscopy setup to confirm the rectifying p-n diode behavior expected from the band alignment of p-type CGO and n-type ZnO wide band gap semiconductors. These 3D mesoscale heterostructures represent a new motif in nanoassembly for the integration of nanomaterials into functional devices with potential applications in electronics, photonics, and energy.
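
    The reported 11:12 coincidence relation can be checked with simple arithmetic. A short sketch, assuming nominal room-temperature lattice constants from the general literature rather than values quoted in the paper:

```python
# Coincidence-site check for the ZnO/CuGaO2 Moire supercell.
# Lattice constants below are nominal literature values (assumed,
# not taken from this paper): wurtzite ZnO a ~ 3.25 A, delafossite
# CuGaO2 a ~ 2.98 A.
a_zno = 3.25                       # angstrom
a_cgo = 2.98                       # angstrom
super_zno = 11 * a_zno             # 35.75 A
super_cgo = 12 * a_cgo             # 35.76 A
mismatch = abs(super_zno - super_cgo) / super_cgo
print(f"{super_zno:.2f} A vs {super_cgo:.2f} A -> {100 * mismatch:.3f}% mismatch")
```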

  16. Modeling of mesoscale dispersion effect on the piezoresistivity of carbon nanotube-polymer nanocomposites via 3D computational multiscale micromechanics methods

    International Nuclear Information System (INIS)

    Ren, Xiang; Seidel, Gary D; Chaurasia, Adarsh K; Oliva-Avilés, Andrés I; Ku-Herrera, José J; Avilés, Francis

    2015-01-01

    In uniaxial tension and compression experiments, carbon nanotube (CNT)-polymer nanocomposites have demonstrated exceptional mechanical and coupled electrostatic properties in the form of piezoresistivity. In order to better understand the correlation of the piezoresistive response with the CNT dispersion at the mesoscale, a 3D computational multiscale micromechanics model based on finite element analysis is constructed to predict the effective macroscale piezoresistive response of CNT/polymer nanocomposites. The key factors that may contribute to the overall piezoresistive response, i.e. the nanoscale electrical tunneling effect, the inherent CNT piezoresistivity and the CNT mesoscale network effect are incorporated in the model based on a 3D multiscale mechanical–electrostatic coupled code. The results not only explain how different nanoscale mechanisms influence the overall macroscale piezoresistive response through the mesoscale CNT network, but also give reason and provide bounds for the wide range of gauge factors found in the literature offering insight regarding how control of the mesoscale CNT networks can be used to tailor nanocomposite piezoresistive response. (paper)

  17. Mesoscale simulations of confined Nafion thin films

    Science.gov (United States)

    Vanya, P.; Sharman, J.; Elliott, J. A.

    2017-12-01

    The morphology and transport properties of thin films of the ionomer Nafion, with thicknesses on the order of the bulk cluster size, have been investigated as a model system to explain the anomalous behaviour of catalyst/electrode-polymer interfaces in membrane electrode assemblies. We have employed dissipative particle dynamics (DPD) to investigate the interaction of water and fluorocarbon chains, with carbon and quartz as confining materials, for a wide range of operational water contents and film thicknesses. We found confinement-induced clustering of water perpendicular to the thin film. Hydrophobic carbon forms a water depletion zone near the film interface, whereas hydrophilic quartz results in a zone with excess water. There are, on average, oscillating water-rich and fluorocarbon-rich regions, in agreement with experimental results from neutron reflectometry. Water diffusivity shows increasing directional anisotropy of up to 30% with decreasing film thickness, depending on the hydrophilicity of the confining material. A percolation analysis revealed significant differences in water clustering and connectivity with the confining material. These findings indicate the fundamentally different nature of ionomer thin films, compared to membranes, and suggest explanations for increased ionic resistances observed in the catalyst layer.
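
    A percolation analysis of this kind reduces to labelling connected water clusters on a binarized occupancy grid and testing whether any cluster spans the film. A minimal sketch, assuming a random stand-in for the DPD output and an arbitrary occupancy threshold:

```python
import numpy as np
from scipy import ndimage

# Stand-in for DPD output: a 3D grid of local water volume fraction
# (randomized here; the grid shape and 0.6 threshold are assumptions).
rng = np.random.default_rng(0)
water_frac = rng.random((32, 32, 16))       # last axis = film normal
occupied = water_frac > 0.6

# Label face-connected water clusters.
labels, n_clusters = ndimage.label(occupied)

# A cluster percolates through the film if it touches both surfaces.
bottom = set(np.unique(labels[:, :, 0])) - {0}
top = set(np.unique(labels[:, :, -1])) - {0}
spanning = bottom & top
print(f"{n_clusters} clusters, {len(spanning)} span the film thickness")
```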

  18. Mesoscale meteorological measurements characterizing complex flows

    International Nuclear Information System (INIS)

    Hubbe, J.M.; Allwine, K.J.

    1993-09-01

    Meteorological measurements are an integral and essential component of any emergency response system for addressing accidental releases from nuclear facilities. An important element of the US Department of Energy's (DOE's) Atmospheric Studies in Complex Terrain (ASCOT) program is the refinement and use of state-of-the-art meteorological instrumentation. ASCOT is currently making use of ground-based remote wind sensing instruments such as Doppler acoustic sounders (sodars). These instruments are capable of continuously and reliably measuring winds up to several hundred meters above the ground, unattended. Two sodars are currently measuring the winds, as part of ASCOT's Front Range Study, in the vicinity of DOE's Rocky Flats Plant (RFP) near Boulder, Colorado. A brief description of ASCOT's ongoing Front Range Study is given, followed by a case study analysis that demonstrates the utility of the meteorological measurement equipment and the complexity of flow phenomena that are experienced near RFP. These complex flow phenomena can significantly influence the transport of released material and consequently need to be identified for accurate assessments of the consequences of a release.

  19. Towards a generalization procedure for WRF mesoscale wind climatologies

    DEFF Research Database (Denmark)

    Hahmann, Andrea N.; Casso, P.; Campmany, E.

    We present a method for generalizing wind climatologies generated from mesoscale model output (e.g., the Weather Research and Forecasting (WRF) model). The generalization procedure is based on the wind atlas framework of WAsP and KAMM/WAsP and has been used extensively in wind resource assessment at DTU Wind... generalized wind climatologies estimated by the microscale model WAsP and the methodology presented here. For the Danish wind measurements the mean absolute error in the 'raw' wind speeds is 9.2%, while the mean absolute error in the generalized wind speeds is 4.1%. The generalization procedure has been...

  20. A Reanalysis System for the Generation of Mesoscale Climatographies

    DEFF Research Database (Denmark)

    Hahmann, Andrea N.; Rostkier-Edelstein, Dorita; Warner, Thomas T.

    2010-01-01

    ), wherein Newtonian relaxation terms in the prognostic equations continually nudge the model solution toward surface and upper-air observations. When applied to a mesoscale climatography, the system is called Climate-FDDA (CFDDA). Here, the CFDDA system is used for downscaling eastern Mediterranean...... the frequency distributions of atmospheric states in addition to time means. The verification of the monthly rainfall climatography shows that CFDDA captures most of the observed spatial and interannual variability, although the model tends to underestimate rainfall amounts over the sea. The frequency...
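
    The Newtonian relaxation (nudging) described above adds a term proportional to the observation-minus-model difference to each prognostic equation, schematically dq/dt = F(q) + G(q_obs - q). A toy single-variable sketch follows; the placeholder physics F and the coefficient G are arbitrary choices for illustration, not CFDDA's actual settings:

```python
def nudged_step(q, f_tendency, q_obs, g_nudge, dt):
    """One forward-Euler step with a Newtonian relaxation term that
    pulls the model state toward an observed value."""
    return q + dt * (f_tendency(q) + g_nudge * (q_obs - q))

f = lambda q: -1.0e-5 * q      # placeholder 'model physics' tendency
q, q_obs = 10.0, 4.0           # initial model state and observation
g, dt = 3.0e-4, 60.0           # nudging coefficient (1/s), time step (s)
for _ in range(720):           # integrate 12 h
    q = nudged_step(q, f, q_obs, g, dt)
print(round(q, 2))             # relaxed close to q_obs
```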

  1. LBM estimation of thermal conductivity in meso-scale modelling

    International Nuclear Information System (INIS)

    Grucelski, A

    2016-01-01

    Recently, there has been growing engineering interest in more rigorous prediction of effective transport coefficients for multicomponent, geometrically complex materials. We present the main assumptions and constituents of a meso-scale model for the simulation of coal or biomass devolatilisation with the lattice Boltzmann method. Estimated values of the thermal conductivity coefficient of coal (solids), pyrolytic gases and the air matrix are presented for a non-steady state, accounting for chemical reactions in the fluid flow and heat transfer. (paper)
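
    To give a flavour of how a transport coefficient emerges from a lattice Boltzmann calculation, the sketch below runs a minimal one-dimensional two-velocity (D1Q2) diffusion model and recovers the textbook relation alpha = cs^2 (tau - 1/2) in lattice units. This toy model is an assumption for illustration only and is far simpler than the reactive meso-scale model of the paper:

```python
import numpy as np

# Minimal D1Q2 lattice Boltzmann diffusion solver (lattice units).
# Theory: thermal diffusivity alpha = cs^2 * (tau - 1/2), with
# cs^2 = 1 for the two-velocity D1Q2 lattice.
nx, tau, steps = 400, 0.8, 300
x = np.arange(nx)
T0 = np.exp(-0.5 * ((x - nx / 2) / 8.0) ** 2)   # initial Gaussian pulse
f = np.stack([0.5 * T0, 0.5 * T0])              # populations for c = +1, -1

def variance(T):
    p = T / T.sum()
    mu = p @ x
    return p @ (x - mu) ** 2

var0 = variance(f.sum(axis=0))
for _ in range(steps):
    T = f.sum(axis=0)
    feq = 0.5 * np.stack([T, T])                # equilibrium distribution
    f += (feq - f) / tau                        # BGK collision
    f[0] = np.roll(f[0], 1)                     # stream right
    f[1] = np.roll(f[1], -1)                    # stream left

alpha_measured = (variance(f.sum(axis=0)) - var0) / (2 * steps)
print(alpha_measured, tau - 0.5)                # both ~0.3
```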

  2. Adaptation of Mesoscale Weather Models to Local Forecasting

    Science.gov (United States)

    Manobianco, John T.; Taylor, Gregory E.; Case, Jonathan L.; Dianic, Allan V.; Wheeler, Mark W.; Zack, John W.; Nutter, Paul A.

    2003-01-01

    Methodologies have been developed for (1) configuring mesoscale numerical weather-prediction models for execution on high-performance computer workstations to make short-range weather forecasts for the vicinity of the Kennedy Space Center (KSC) and the Cape Canaveral Air Force Station (CCAFS) and (2) evaluating the performances of the models as configured. These methodologies have been implemented as part of a continuing effort to improve weather forecasting in support of operations of the U.S. space program. The models, methodologies, and results of the evaluations also have potential value for commercial users who could benefit from tailoring their operations and/or marketing strategies based on accurate predictions of local weather. More specifically, the purpose of developing the methodologies for configuring the models to run on computers at KSC and CCAFS is to provide accurate forecasts of winds, temperature, and such specific thunderstorm-related phenomena as lightning and precipitation. The purpose of developing the evaluation methodologies is to maximize the utility of the models by providing users with assessments of the capabilities and limitations of the models. The models used in this effort thus far include the Mesoscale Atmospheric Simulation System (MASS), the Regional Atmospheric Modeling System (RAMS), and the National Centers for Environmental Prediction Eta Model (Eta for short). The configuration of the MASS and RAMS is designed to run the models at very high spatial resolution and incorporate local data to resolve fine-scale weather features. Model preprocessors were modified to incorporate surface, ship, buoy, and rawinsonde data as well as data from local wind towers, wind profilers, and conventional or Doppler radars. The overall evaluation of the MASS, Eta, and RAMS was designed to assess the utility of these mesoscale models for satisfying the weather-forecasting needs of the U.S. space program. The evaluation methodology includes

  3. Down-scaling wind energy resource from mesoscale to local scale by nesting and data assimilation with a CFD model

    International Nuclear Information System (INIS)

    Duraisamy Jothiprakasam, Venkatesh

    2014-01-01

    procedure is carried out with either sonic or cup anemometer measurements. First, a detailed analysis of the results obtained with the mesoscale-CFD coupling, with or without data assimilation, is shown for two main wind directions, including a sensitivity study of the parameters involved in the coupling and in the nudging. The last part of the work is devoted to the estimate of the wind potential using clustering. A comparison of the annual mean wind speed with measurements that do not enter the assimilation process and with the WAsP model is presented. The improvement provided by the data assimilation on the distribution of differences with measurements is shown for wind speed and direction in different configurations. (author) [fr]

  4. Recipes for correcting the impact of effective mesoscale resolution on the estimation of extreme winds

    DEFF Research Database (Denmark)

    Larsén, Xiaoli Guo; Ott, Søren; Badger, Jake

    2012-01-01

    Extreme winds derived from simulations using mesoscale models are underestimated due to the effective spatial and temporal resolutions. This is reflected in the spectral domain as an energy deficit in the mesoscale range. The energy deficit implies smaller spectral moments and thus underestimation...

  5. Numerical simulation of terrain-induced mesoscale circulation in the Chiang Mai area, Thailand

    Science.gov (United States)

    Sathitkunarat, Surachai; Wongwises, Prungchan; Pan-Aram, Rudklao; Zhang, Meigen

    2008-11-01

    The regional atmospheric modeling system (RAMS) was applied to Chiang Mai province, a mountainous area in Thailand, to study terrain-induced mesoscale circulations. Eight cases in wet and dry seasons under different weather conditions were analyzed to show thermal and dynamic impacts on local circulations. This is the first application of RAMS in Thailand, and in particular the first to investigate the effect of the mountainous terrain on the simulated meteorological fields. Analysis of model results indicates that the model can reproduce major features of the local circulation and diurnal variations in temperature. For evaluating the model performance, model results were compared with observed wind speed, wind direction, and temperature monitored at a meteorological tower. The comparison shows that the modeled values are generally in good agreement with observations and that the model captured many of the observed features.

  6. The Karlsruhe Atmospheric Mesoscale Model KAMM

    Energy Technology Data Exchange (ETDEWEB)

    Adrian, G. [Forschungszentrum Karlsruhe GmbH Umwelt und Technik (Germany). Inst. fuer Meteorologie und Klimaforschung]|[Karlsruhe Univ. (T.H.). (Germany). Inst. fuer Meteorologie und Klimaforschung

    1998-01-01

    The applications of the KAMM model range from real-time simulations, through the analysis of mesoscale phenomena and the development of parameterizations, to descriptive climatology. Over time, the desire to change essential parts of the original model concept emerged, requiring substantial reprogramming; it was therefore decided to redesign the dynamic core of KAMM completely and to include parallelization of the code in the programming from the outset. The paper describes the basics of the new model core. (orig./KW)

  7. Origin of the pre-tropical storm Debby (2006) African easterly wave-mesoscale convective system

    Science.gov (United States)

    Lin, Yuh-Lang; Liu, Liping; Tang, Guoqing; Spinks, James; Jones, Wilson

    2013-05-01

    The origins of the pre-Debby (2006) mesoscale convective system (MCS) and African easterly wave (AEW) and their precursors were traced back to the southwest Arabian Peninsula, the Asir Mountains (AS), and the Ethiopian Highlands (EH) in the vicinity of the ITCZ using satellite imagery, GFS analysis data, and the ARW model. The sources of the convective cloud clusters and vorticity perturbations were attributed to the cyclonic convergence of the northeasterly Shamal wind and the Somali jet, especially when the Mediterranean High shifted toward the east and the Indian Ocean high strengthened and its associated Somali jet penetrated farther to the north. The cyclonic vorticity perturbations were strengthened by vorticity stretching associated with convective cloud clusters in the genesis region, the southwest Arabian Peninsula. A conceptual model was proposed to explain the genesis of convective cloud clusters and cyclonic vorticity perturbations preceding the pre-Debby (2006) AEW-MCS system.

  8. A Semi-Automated Workflow Solution for Data Set Publication

    Directory of Open Access Journals (Sweden)

    Suresh Vannan

    2016-03-01

    Full Text Available To address the need for published data, considerable effort has gone into formalizing the process of data publication. From funding agencies to publishers, data publication has rapidly become a requirement. Digital Object Identifiers (DOI) and data citations have enhanced the integration and availability of data. The challenge facing data publishers now is to deal with the increased number of publishable data products and, most importantly, the difficulties of publishing diverse data products into an online archive. The Oak Ridge National Laboratory Distributed Active Archive Center (ORNL DAAC), a NASA-funded data center, faces these challenges as it deals with data products created by individual investigators. This paper summarizes the challenges of curating data and provides a summary of a workflow solution that ORNL DAAC research and technical staff have created to deal with publication of the diverse data products. The workflow solution presented here is generic and can be applied to data from any scientific domain and data located at any data center.

  9. Semi-automated reviewing station for IAEA optical surveillance data

    International Nuclear Information System (INIS)

    Darnell, R.A.; Sonnier, C.S.

    1987-01-01

    A study is underway on the use of computer vision technology to assist in the visual inspection of optical surveillance data. The IAEA currently uses optical surveillance as one of its principal Containment and Surveillance (C/S) measures. The review process is a very time-consuming and tedious task, due to the large amount of optical surveillance data to be reviewed. For some time, the IAEA has identified as one of its principal needs an automated optical surveillance data reviewing station that assists the reviewer in identifying activities of safeguards interest, such as the movement of a very large spent fuel cask. The present development reviewing station consists of commercially available digital image processing hardware controlled by a personal computer. The areas under study include change detection, target discrimination, tracking, and classification. Several algorithms are being evaluated in each of these areas using recorded video tape of safeguards-relevant scenes. The computer vision techniques and the current status of the studies are discussed.
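
    Of the areas listed, change detection is the simplest to illustrate. A minimal frame-differencing sketch follows; the thresholds and synthetic frames are placeholders, not the algorithms under evaluation by the Agency:

```python
import numpy as np

def scene_changed(prev, curr, diff_thresh=25, min_pixels=500):
    """Flag a frame pair for review when enough pixels differ between
    two 8-bit grayscale frames (thresholds are illustrative)."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    mask = diff > diff_thresh
    return mask, int(mask.sum()) >= min_pixels

rng = np.random.default_rng(1)
frame_a = rng.integers(0, 256, (240, 320), dtype=np.uint8)
frame_b = frame_a.copy()
frame_b[100:140, 150:220] = 255        # simulated large-object movement
mask, flagged = scene_changed(frame_a, frame_b)
print(flagged)                         # True -> bring to the reviewer's attention
```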

  10. Semi-automated categorization of open-ended questions

    Directory of Open Access Journals (Sweden)

    Matthias Schonlau

    2016-08-01

    Full Text Available Text data from open-ended questions in surveys are difficult to analyze and are frequently ignored. Yet open-ended questions are important because they do not constrain respondents’ answer choices. Where open-ended questions are necessary, sometimes multiple human coders hand-code answers into one of several categories. At the same time, computer scientists have made impressive advances in text mining that may allow automation of such coding. Automated algorithms do not achieve an overall accuracy high enough to entirely replace humans. We categorize open-ended questions soliciting narrative responses using text mining for easy-to-categorize answers and humans for the remainder using expected accuracies to guide the choice of the threshold delineating between “easy” and “hard”. Employing multinomial boosting avoids the common practice of converting machine learning “confidence scores” into pseudo-probabilities. This approach is illustrated with examples from open-ended questions related to respondents’ advice to a patient in a hypothetical dilemma, a follow-up probe related to respondents’ perception of disclosure/privacy risk, and from a question on reasons for quitting smoking from a follow-up survey from the Ontario Smoker’s Helpline. Targeting 80% combined accuracy, we found that 54%-80% of the data could be categorized automatically in research surveys.
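
    The human-machine division of labour described above can be expressed in a few lines: answers whose top predicted category clears a confidence threshold are coded automatically, and the rest are queued for human coders. The sketch below uses a probabilistic scikit-learn classifier purely to make the routing logic concrete; the paper's multinomial boosting with expected accuracies, which avoids pseudo-probabilities, would replace it in practice:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy labelled answers; real studies use thousands of hand-coded texts.
texts = ["quit for my kids", "doctor told me to stop", "cost too much",
         "kids asked me to", "health scare last year", "too expensive now"]
labels = ["family", "health", "cost", "family", "health", "cost"]

vec = TfidfVectorizer()
X = vec.fit_transform(texts)
clf = LogisticRegression(max_iter=1000).fit(X, labels)

def route(answers, threshold=0.8):
    """Auto-code confident predictions; queue the rest for human coders."""
    proba = clf.predict_proba(vec.transform(answers))
    auto, manual = [], []
    for text, p in zip(answers, proba):
        if p.max() >= threshold:
            auto.append((text, clf.classes_[p.argmax()]))
        else:
            manual.append(text)
    return auto, manual

auto, manual = route(["stopped because of my children", "just felt like it"])
print(len(auto), "auto-coded;", len(manual), "sent to human coders")
```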

  11. A semi-automated vascular access system for preclinical models

    International Nuclear Information System (INIS)

    Berry-Pusey, B N; David, J; Taschereau, R; Silverman, R W; Williams, D; Ladno, W; Stout, D; Chatziioannou, A; Chang, Y C; Prince, S W; Chu, K; Tsao, T C

    2013-01-01

    Murine models are used extensively in biological and translational research. For many of these studies it is necessary to access the vasculature for the injection of biologically active agents. Among the possible methods for accessing the mouse vasculature, tail vein injections are a routine but critical step for many experimental protocols. To perform successful tail vein injections, a high skill set and experience is required, leaving most scientists ill-suited to perform this task. This can lead to a high variability between injections, which can impact experimental results. To allow more scientists to perform tail vein injections and to decrease the variability between injections, a vascular access system (VAS) that semi-automatically inserts a needle into the tail vein of a mouse was developed. The VAS uses near infrared light, image processing techniques, computer controlled motors, and a pressure feedback system to insert the needle and to validate its proper placement within the vein. The VAS was tested by injecting a commonly used radiolabeled probe (FDG) into the tail veins of five mice. These mice were then imaged using micro-positron emission tomography to measure the percentage of the injected probe remaining in the tail. These studies showed that, on average, the VAS leaves 3.4% of the injected probe in the tail. With these preliminary results, the VAS system demonstrates the potential for improving the accuracy of tail vein injections in mice. (paper)

  12. Semi-automated tracking of behaviour of Betta splendens

    DEFF Research Database (Denmark)

    Durey, Maëlle; Paulsen, Rasmus Reinhold; Matessi, Giuliano

    2010-01-01

    In this paper, a novel software system for animal behaviour tracking is described. It is used for tracking fish filmed in aquariums using a low-quality acquisition system. The tracking is based on a multiscale template matching technique that finds both the position and the orientation of the tracked fish. The template is matched in the background-subtracted frames, where the background is estimated using a median-based approach. The system is very stable and has been used in a large behavioural study designed to investigate the use of the behavioural pattern known as mate choice copying in Betta splendens.
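
    Compressed to its essentials, the pipeline is: estimate the background as the per-pixel median of a frame stack, subtract it, then search the foreground with a rotated template to recover position and orientation. A sketch under those assumptions (synthetic frames, a crude template, and a coarse angle grid stand in for the real system):

```python
import numpy as np
from scipy import ndimage

def track(frames, template, angles=range(0, 360, 30)):
    """Median-background subtraction followed by rotated template
    matching; returns (row, col, angle) per frame. Illustrative only."""
    background = np.median(frames, axis=0)          # static-scene estimate
    tracks = []
    for frame in frames:
        fg = np.abs(frame - background)             # foreground energy
        best_score, best_pose = -np.inf, None
        for ang in angles:
            t = ndimage.rotate(template, ang, reshape=False)
            score = ndimage.correlate(fg, t - t.mean(), mode="constant")
            r, c = np.unravel_index(score.argmax(), score.shape)
            if score[r, c] > best_score:
                best_score, best_pose = score[r, c], (r, c, ang)
        tracks.append(best_pose)
    return tracks

rng = np.random.default_rng(2)
frames = 0.1 * rng.random((5, 64, 64))              # noisy 'aquarium' frames
for k in range(5):                                  # fish swims to the right
    frames[k, 20:24, 10 + 8 * k:21 + 8 * k] += 1.0
template = np.zeros((11, 11))
template[4:7, :] = 1.0                              # crude elongated fish shape
print(track(frames, template)[0])
```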

  13. Semi-automated x-ray gauging process control system

    International Nuclear Information System (INIS)

    Draut, C.F.; Homan, D.A.

    1976-01-01

    An x-ray gauging method was developed and a production gauging system was subsequently fabricated to control the quality of precision-manufactured components. The gauging system measures, via x-ray absorption, the density of pressed finely divided solids held in a dissimilar container. The presence of two dissimilar materials necessitated a ''two scan'' technique: first the x-ray attenuation (absorption) of the empty container prior to loading, and then the attenuation of the loaded container, are measured; that is, four variables. The system provided greatly improved product control via timely data feedback and increased product quality assurance via 100 percent inspection of product. In addition, it reduced labor costs, product cost, and the possibilities for human error.
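
    The two-scan principle follows directly from the Beer-Lambert law: the empty-container scan measures the container attenuation, which cancels in the ratio with the loaded-container scan, leaving only the attenuation of the pressed powder. A sketch with invented numbers:

```python
import numpy as np

# Beer-Lambert: I = I0 * exp(-mu * rho * t) per material layer. The
# container term is common to both scans, so it cancels in the ratio
# I_empty / I_loaded. All numbers below are made up for illustration.
I_empty = 6.2e5        # counts transmitted through the empty container
I_loaded = 2.9e5       # counts transmitted through the loaded container
mu = 0.15              # mass attenuation coefficient of the fill (cm^2/g)
fill_depth = 1.2       # x-ray path length through the fill (cm)

areal_density = np.log(I_empty / I_loaded) / mu      # g/cm^2
density = areal_density / fill_depth                 # g/cm^3
print(f"density ~ {density:.2f} g/cm^3")
```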

  14. Application of semi-automated settlement detection for an integrated ...

    African Journals Online (AJOL)

    Complete, accurate and up-to-date topographic data is of vast importance as it is widely required by different government agencies, non-governmental organisations, the private sector as well as the general public for urban mapping, rural development and environmental management, to mention but a few applications.

  15. Influence of mesoscale eddies on the distribution of nitrous oxide in the eastern tropical South Pacific

    Science.gov (United States)

    Arévalo-Martínez, Damian L.; Kock, Annette; Löscher, Carolin R.; Schmitz, Ruth A.; Stramma, Lothar; Bange, Hermann W.

    2016-02-01

    Recent observations in the eastern tropical South Pacific (ETSP) have shown the key role of meso- and submesoscale processes (e.g. eddies) in shaping its hydrographic and biogeochemical properties. Off Peru, elevated primary production from coastal upwelling in combination with sluggish ventilation of subsurface waters fuels a prominent oxygen minimum zone (OMZ). Given that nitrous oxide (N2O) production-consumption processes in the water column are sensitive to oxygen (O2) concentrations, the ETSP is a region of particular interest to investigate its source-sink dynamics. To date, no detailed surveys linking mesoscale processes and N2O distributions as well as their relevance to nitrogen (N) cycling are available. In this study, we present the first measurements of N2O across three mesoscale eddies (two mode water or anticyclonic and one cyclonic) which were identified, tracked, and sampled during two surveys carried out in the ETSP in November-December 2012. A two-peak structure was observed for N2O, wherein the two maxima coincide with the upper and lower boundaries of the OMZ, indicating active nitrification and partial denitrification. This was further supported by the abundances of the key gene for nitrification, ammonium monooxygenase (amoA), and the gene marker for N2O production during denitrification, nitrite reductase (nirS). Conversely, we found strong N2O depletion in the core of the OMZ, coinciding with a deficit of nitrate (NO3-), thus suggesting active denitrification. N2O depletion within the OMZ's core was substantially higher in the centre of mode water eddies, supporting the view that eddy activity enhances N-loss processes off Peru, in particular near the shelf break where nutrient-rich, productive waters from upwelling are trapped before being transported offshore. Analysis of eddies during their propagation towards the open ocean showed that, in general, "ageing" of mesoscale eddies tends to decrease N2O concentrations through the water column in response to the

  16. Analysis of new bone, cartilage, and fibrosis tissue in healing murine allografts using whole slide imaging and a new automated histomorphometric algorithm

    OpenAIRE

    Zhang, Longze; Chang, Martin; Beck, Christopher A; Schwarz, Edward M; Boyce, Brendan F

    2016-01-01

    Histomorphometric analysis of histologic sections of normal and diseased bone samples, such as healing allografts and fractures, is widely used in bone research. However, the utility of traditional semi-automated methods is limited because they are labor-intensive and can have high interobserver variability depending upon the parameters being assessed, and primary data cannot be re-analyzed automatically. Automated histomorphometry has long been recognized as a solution for these issues, and ...

  17. Mesoscale eddies are oases for higher trophic marine life

    KAUST Repository

    Godø, Olav R.; Samuelsen, Annette; Macaulay, Gavin J.; Patel, Ruben; Hjøllo, Solfrid Sætre; Horne, John; Kaartvedt, Stein; Johannessen, Johnny A.

    2012-01-01

    Mesoscale eddies stimulate biological production in the ocean, but knowledge of energy transfers to higher trophic levels within eddies remains fragmented and not quantified. Increasing the knowledge base is constrained by the inability of traditional sampling methods to adequately sample biological processes at the spatio-temporal scales at which they occur. By combining satellite and acoustic observations over spatial scales of 10s of km horizontally and 100s of m vertically, supported by hydrographical and biological sampling, we show that anticyclonic eddies shape distribution and density of marine life from the surface to bathyal depths. Fish feed along density structures of eddies, demonstrating that eddies catalyze energy transfer across trophic levels. Eddies create attractive pelagic habitats, analogous to oases in the desert, for higher trophic level aquatic organisms through enhanced 3-D motion that accumulates and redistributes biomass, contributing to overall bioproduction in the ocean. Integrating multidisciplinary observation methodologies promoted a new understanding of biophysical interaction in mesoscale eddies. Our findings emphasize the impact of eddies on the patchiness of biomass in the sea and demonstrate that they provide rich feeding habitat for higher trophic marine life. © 2012 Godø et al.

  18. Rotational and divergent kinetic energy in the mesoscale model ALADIN

    Directory of Open Access Journals (Sweden)

    V. Blažica

    2013-03-01

    Full Text Available Kinetic energy spectra from the mesoscale numerical weather prediction (NWP model ALADIN with horizontal resolution 4.4 km are split into divergent and rotational components which are then compared at horizontal scales below 300 km and various vertical levels. It is shown that about 50% of kinetic energy in the free troposphere in ALADIN is divergent energy. The percentage increases towards 70% near the surface and in the upper troposphere towards 100 hPa. The maximal percentage of divergent energy is found at stratospheric levels around 100 hPa and at scales below 100 km which are not represented by the global models. At all levels, the divergent energy spectra are characterised by shallower slopes than the rotational energy spectra, and the difference increases as horizontal scales become larger. A very similar vertical distribution of divergent energy is obtained by using the standard ALADIN approach for the computation of spectra based on the extension zone and by applying detrending approach commonly used in mesoscale NWP community.
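
    For a doubly periodic domain, the rotational/divergent split follows from a spectral Helmholtz decomposition: the divergent part of the velocity is the projection of its Fourier transform onto the wavevector, and the rotational part is the remainder. A minimal numpy sketch; ALADIN itself works on a limited area using an extension zone or detrending, which this periodic toy example sidesteps:

```python
import numpy as np

def ke_split(u, v, d=1.0):
    """Split the kinetic energy of a doubly periodic 2D flow into
    divergent and rotational parts by spectral projection."""
    ny, nx = u.shape
    kx = 2 * np.pi * np.fft.fftfreq(nx, d)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d)
    KX, KY = np.meshgrid(kx, ky)
    k2 = KX**2 + KY**2
    k2[0, 0] = 1.0                       # avoid 0/0 at the mean mode
    uh, vh = np.fft.fft2(u), np.fft.fft2(v)
    proj = (KX * uh + KY * vh) / k2      # longitudinal (divergent) part
    ud = np.fft.ifft2(KX * proj).real
    vd = np.fft.ifft2(KY * proj).real
    ur, vr = u - ud, v - vd              # remainder: rotational + mean flow
    ke = lambda a, b: 0.5 * float(np.mean(a**2 + b**2))
    return ke(ud, vd), ke(ur, vr)

# Purely rotational test field (Taylor-Green): all KE should be rotational.
x = np.linspace(0, 2 * np.pi, 128, endpoint=False)
X, Y = np.meshgrid(x, x)
u = np.cos(X) * np.sin(Y)
v = -np.sin(X) * np.cos(Y)
print(ke_split(u, v))   # ~ (0.0, 0.25)
```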

  19. On the Nature of the Mesoscale Variability in Denmark Strait

    Science.gov (United States)

    Pickart, Robert; von Appen, Wilken; Mastropole, Dana; Valdimarsson, Hedinn; Vage, Kjetil; Jonsson, Steingrimur; Jochumsen, Kerstin; Girton, James

    2017-04-01

    The dense overflow through Denmark Strait is the largest contributor to the lower limb of the Atlantic Meridional Overturning Circulation. As such, it is important to understand the sources of water feeding the overflow and how the water negotiates the sill as it passes into the Irminger Sea. Here we use a large collection of shipboard hydrographic transects occupied across the strait, together with 6-years of mooring data from the sill, to investigate the water masses and mesoscale variability of the overflow water. Two dominant types of mesoscale features were identified, referred to as a "bolus" and a "pulse". The former is a large lens of weakly stratified water corresponding to a slight increase in along-strait velocity. The latter is a thin layer with greater stratification and strongly enhanced along-strait flow. The boluses, which are often noted in the historical literature, are associated with cyclonic circulation, while pulses, which have not been previously identified, are associated with anti-cyclonic circulation. Both features result in increased transport of overflow water. It is argued that these fluctuations at the sill trigger energetic variability downstream in the Deep Western Boundary Current.

  20. Mesoscale eddies are oases for higher trophic marine life.

    Directory of Open Access Journals (Sweden)

    Olav R Godø

    Full Text Available Mesoscale eddies stimulate biological production in the ocean, but knowledge of energy transfers to higher trophic levels within eddies remains fragmented and not quantified. Increasing the knowledge base is constrained by the inability of traditional sampling methods to adequately sample biological processes at the spatio-temporal scales at which they occur. By combining satellite and acoustic observations over spatial scales of 10s of km horizontally and 100s of m vertically, supported by hydrographical and biological sampling, we show that anticyclonic eddies shape distribution and density of marine life from the surface to bathyal depths. Fish feed along density structures of eddies, demonstrating that eddies catalyze energy transfer across trophic levels. Eddies create attractive pelagic habitats, analogous to oases in the desert, for higher trophic level aquatic organisms through enhanced 3-D motion that accumulates and redistributes biomass, contributing to overall bioproduction in the ocean. Integrating multidisciplinary observation methodologies promoted a new understanding of biophysical interaction in mesoscale eddies. Our findings emphasize the impact of eddies on the patchiness of biomass in the sea and demonstrate that they provide rich feeding habitat for higher trophic marine life.

  1. Derivation and precision of mean field electrodynamics with mesoscale fluctuations

    Science.gov (United States)

    Zhou, Hongzhe; Blackman, Eric G.

    2018-06-01

    Mean field electrodynamics (MFE) facilitates practical modelling of secular, large scale properties of astrophysical or laboratory systems with fluctuations. Practitioners commonly assume wide scale separation between mean and fluctuating quantities, to justify equality of ensemble and spatial or temporal averages. Often however, real systems do not exhibit such scale separation. This raises two questions: (I) What are the appropriate generalized equations of MFE in the presence of mesoscale fluctuations? (II) How precise are theoretical predictions from MFE? We address both by first deriving the equations of MFE for different types of averaging, along with mesoscale correction terms that depend on the ratio of averaging scale to variation scale of the mean. We then show that even if these terms are small, predictions of MFE can still have a significant precision error. This error has an intrinsic contribution from the dynamo input parameters and a filtering contribution from differences in the way observations and theory are projected through the measurement kernel. Minimizing the sum of these contributions can produce an optimal scale of averaging that makes the theory maximally precise. The precision error is important to quantify when comparing to observations because it quantifies the resolution of predictive power. We exemplify these principles for galactic dynamos, comment on broader implications, and identify possibilities for further work.
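
    The central difficulty is easy to demonstrate numerically: when the fluctuation scale is comparable to the averaging scale, the average of the fluctuating part does not vanish, so the usual Reynolds averaging rules pick up correction terms. A short sketch with arbitrarily chosen scales:

```python
import numpy as np

# A field with large-scale variation plus a mesoscale fluctuation whose
# wavelength is comparable to the averaging window (scales are arbitrary).
x = np.linspace(0.0, 100.0, 4000)
large_scale = np.sin(2 * np.pi * x / 100.0)
meso = 0.5 * np.sin(2 * np.pi * x / 8.0)
b = large_scale + meso

L = 10.0                                   # averaging window length
win = int(round(L / (x[1] - x[0])))
box = np.ones(win) / win
b_bar = np.convolve(b, box, mode="same")   # spatial (running) average
b_prime = b - b_bar

# With wide scale separation <b'> would vanish; here it does not:
residual = np.convolve(b_prime, box, mode="same")
print(float(np.max(np.abs(residual))))     # clearly nonzero
```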

  2. Use of ground-based wind profiles in mesoscale forecasting

    Science.gov (United States)

    Schlatter, Thomas W.

    1985-01-01

    A brief review is presented of recent uses of ground-based wind profile data in mesoscale forecasting. Some of the applications are in real time, and some are after the fact. Not all of the work mentioned here has been published yet, but references are given wherever possible. As Gage and Balsley (1978) point out, sensitive Doppler radars have been used to examine tropospheric wind profiles since the 1970's. It was not until the early 1980's, however, that the potential contribution of these instruments to operational forecasting and numerical weather prediction became apparent. Profiler winds and radiosonde winds compare favorably, usually within a few m/s in speed and 10 degrees in direction (see Hogg et al., 1983), but the obvious advantage of the profiler is its frequent (hourly or more often) sampling of the same volume. The rawinsonde balloon is launched only twice a day and drifts with the wind. In this paper, I will: (1) mention two operational uses of data from a wind profiling system developed jointly by the Wave Propagation and Aeronomy Laboratories of NOAA; (2) describe a number of displays of these same data on a workstation for mesoscale forecasting developed by the Program for Regional Observing and Forecasting Services (PROFS); and (3) explain some interesting diagnostic calculations performed by meteorologists of the Wave Propagation Laboratory.

  3. Mesoscale eddies are oases for higher trophic marine life

    KAUST Repository

    Godø, Olav R.

    2012-01-17

    Mesoscale eddies stimulate biological production in the ocean, but knowledge of energy transfers to higher trophic levels within eddies remains fragmented and not quantified. Increasing the knowledge base is constrained by the inability of traditional sampling methods to adequately sample biological processes at the spatio-temporal scales at which they occur. By combining satellite and acoustic observations over spatial scales of 10s of km horizontally and 100s of m vertically, supported by hydrographical and biological sampling, we show that anticyclonic eddies shape distribution and density of marine life from the surface to bathyal depths. Fish feed along density structures of eddies, demonstrating that eddies catalyze energy transfer across trophic levels. Eddies create attractive pelagic habitats, analogous to oases in the desert, for higher trophic level aquatic organisms through enhanced 3-D motion that accumulates and redistributes biomass, contributing to overall bioproduction in the ocean. Integrating multidisciplinary observation methodologies promoted a new understanding of biophysical interaction in mesoscale eddies. Our findings emphasize the impact of eddies on the patchiness of biomass in the sea and demonstrate that they provide rich feeding habitat for higher trophic marine life. © 2012 Godø et al.

  4. Mesoscale eddies in the Subantarctic Front-Southwest Atlantic

    Directory of Open Access Journals (Sweden)

    Pablo D. Glorioso

    2005-12-01

    Full Text Available Satellite and ship observations in the southern southwest Atlantic (SSWA reveal an intense eddy field and highlight the potential for using continuous real-time satellite altimetry to detect and monitor mesoscale phenomena with a view to understanding the regional circulation. The examples presented suggest that mesoscale eddies are a dominant feature of the circulation and play a fundamental role in the transport of properties along and across the Antarctic Circumpolar Current (ACC. The main ocean current in the SSWA, the Falkland-Malvinas Current (FMC, exhibits numerous embedded eddies south of 50°S which may contribute to the patchiness, transport and mixing of passive scalars by this strong, turbulent current. Large eddies associated with meanders are observed in the ACC fronts, some of them remaining stationary for long periods. Two particular cases are examined using a satellite altimeter in combination with in situ observations, suggesting that cross-frontal eddy transport and strong meandering occur where the ACC flow intensifies along the sub-Antarctic Front (SAF and the Southern ACC Front (SACCF.

  5. Integrated analysis of water quality in a mesoscale lowland basin

    Directory of Open Access Journals (Sweden)

    A. Habeck

    2005-01-01

    Full Text Available This article describes a modelling study of nitrogen transport from diffuse sources in the Nuthe catchment, which represents a typical lowland region of north-eastern Germany. Building on a hydrological validation performed in advance using the ecohydrological model SWIM, the nitrogen flows were simulated over a 20-year period (1981-2000). The relatively good quality of the input data, particularly for the years 1993 to 2000, enabled the nitrogen flows to be reproduced sufficiently well, although modelling nutrient flows is always associated with a great deal of uncertainty. Subsequently, scenario calculations were carried out in order to investigate how nitrogen transport from the catchment could be further reduced. The scenario results with the greatest reduction of nitrogen washoff are briefly presented in the paper.

  6. Silicate:nitrate ratios of upwelled waters control the phytoplankton community sustained by mesoscale eddies in sub-tropical North Atlantic and Pacific

    Directory of Open Access Journals (Sweden)

    T. S. Bibby

    2011-03-01

    Full Text Available Mesoscale eddies in sub-tropical gyres physically perturb the water column and can introduce macronutrients to the euphotic zone, stimulating a biological response in which phytoplankton communities can become dominated by large phytoplankton. Mesoscale eddies may therefore be important in driving export in oligotrophic regions of the modern ocean. However, the character and magnitude of the biological response sustained by eddies is variable. Here we present data from mesoscale eddies in the Sargasso Sea (Atlantic) and the waters off Hawai'i (Pacific), alongside mesoscale events that affected the Bermuda Atlantic Time-Series Study (BATS) over the past decade. From this analysis, we suggest that the phytoplankton community structure sustained by mesoscale eddies is predetermined by the relative abundance of silicate over nitrate (Si*) in the upwelled waters. We present data that demonstrate that mode-water eddies (MWE) in the Sargasso Sea upwell locally formed waters with relatively high Si* to the euphotic zone, and that cyclonic eddies in the Sargasso Sea introduce waters with relatively low Si*, a signature that originated in the iron-limited Southern Ocean. We propose that this phenomenon can explain the observed dominance of the phytoplankton community by large-diatom species in MWE and by small prokaryotic phytoplankton in cyclonic features. In contrast to the Atlantic, North Pacific Intermediate Water (NPIW) with high Si* may influence the cyclonic eddies in waters off Hawai'i, which also appear capable of sustaining diatom populations. These observations suggest that the structure of phytoplankton communities sustained by eddies may be related to the chemical composition of the upwelled waters in addition to the physical nature of the eddy.
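
    The Si* tracer invoked here is commonly defined in the nutrient-tracer literature as the silicate concentration minus the nitrate concentration, so its sign in upwelled water indicates whether silicate or nitrate runs out first as diatoms grow. A trivial illustration with invented concentrations:

```python
# Si* = [Si(OH)4] - [NO3-] in umol/kg; concentrations below are invented.
def si_star(silicate, nitrate):
    return silicate - nitrate

print(si_star(12.0, 8.0))    # > 0: silicate-replete, can favour diatoms
print(si_star(5.0, 14.0))    # < 0: silicate-poor, Southern-Ocean-type signature
```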

  7. Study of the air-sea interactions at the mesoscale: the SEMAPHORE experiment

    Directory of Open Access Journals (Sweden)

    L. Eymard

    1996-09-01

    Full Text Available The SEMAPHORE (Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale) experiment was conducted from June to November 1993 in the Northeast Atlantic between the Azores and Madeira. It was centered on the study of the mesoscale ocean circulation and air-sea interactions. The experimental investigation was achieved at the mesoscale using moorings, floats, and a ship hydrological survey, and at a smaller scale by one dedicated ship, two instrumented aircraft, and surface drifting buoys, for one and a half months in October-November (IOP: intense observing period). Observations from meteorological operational satellites as well as spaceborne microwave sensors were used in complement. The main studies undertaken concern the mesoscale ocean, the upper ocean, the atmospheric boundary layer, and the sea surface, and first results are presented for the various topics. From data analysis and model simulations, the main characteristics of the ocean circulation were deduced, showing the close relationship between the Azores front meander and the occurrence of Mediterranean water lenses (meddies), and the shift between the Azores current frontal signature at the surface and within the thermocline. Using drifting buoys and ship data in the upper ocean, the gap between the scales of the atmospheric forcing and the oceanic variability was made evident. A 2 °C decrease and a 40-m deepening of the mixed layer were measured within the IOP, associated with a heating loss of about 100 W m-2. This evolution was shown to be strongly connected to the occurrence of storms at the beginning and the end of October. Above the surface, turbulent measurements from ship and aircraft were analyzed across the surface thermal front, showing a 30% difference in heat fluxes between both sides during a 4-day period, and the respective contributions of the wind and the surface temperature were evaluated. The classical

  8. Study of the air-sea interactions at the mesoscale: the SEMAPHORE experiment

    Science.gov (United States)

    Eymard, L.; Planton, S.; Durand, P.; Le Visage, C.; Le Traon, P. Y.; Prieur, L.; Weill, A.; Hauser, D.; Rolland, J.; Pelon, J.; Baudin, F.; Bénech, B.; Brenguier, J. L.; Caniaux, G.; de Mey, P.; Dombrowski, E.; Druilhet, A.; Dupuis, H.; Ferret, B.; Flamant, C.; Flamant, P.; Hernandez, F.; Jourdan, D.; Katsaros, K.; Lambert, D.; Lefèvre, J. M.; Le Borgne, P.; Le Squere, B.; Marsoin, A.; Roquet, H.; Tournadre, J.; Trouillet, V.; Tychensky, A.; Zakardjian, B.

    1996-09-01

    The SEMAPHORE (Structure des Echanges Mer-Atmosphère, Propriétés des Hétérogénéités Océaniques: Recherche Expérimentale) experiment was conducted from June to November 1993 in the Northeast Atlantic between the Azores and Madeira. It was centered on the study of the mesoscale ocean circulation and air-sea interactions. The experimental investigation was achieved at the mesoscale using moorings, floats, and a ship hydrological survey, and at a smaller scale by one dedicated ship, two instrumented aircraft, and surface drifting buoys, for one and a half months in October-November (IOP: intense observing period). Observations from meteorological operational satellites as well as spaceborne microwave sensors were used in complement. The main studies undertaken concern the mesoscale ocean, the upper ocean, the atmospheric boundary layer, and the sea surface, and first results are presented for the various topics. From data analysis and model simulations, the main characteristics of the ocean circulation were deduced, showing the close relationship between the Azores front meander and the occurrence of Mediterranean water lenses (meddies), and the shift between the Azores current frontal signature at the surface and within the thermocline. Using drifting buoys and ship data in the upper ocean, the gap between the scales of the atmospheric forcing and the oceanic variability was made evident. A 2 °C decrease and a 40-m deepening of the mixed layer were measured within the IOP, associated with a heating loss of about 100 W m-2. This evolution was shown to be strongly connected to the occurrence of storms at the beginning and the end of October. Above the surface, turbulent measurements from ship and aircraft were analyzed across the surface thermal front, showing a 30% difference in heat fluxes between both sides during a 4-day period, and the respective contributions of the wind and the surface temperature were evaluated. The classical momentum flux bulk
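
    The abstract breaks off at the mention of the classical bulk formulae; for orientation, a minimal sketch of the bulk aerodynamic flux estimates they refer to is given below, with order-of-magnitude textbook transfer coefficients rather than the coefficients actually derived from the SEMAPHORE measurements:

        # Sketch of the classical bulk aerodynamic air-sea fluxes. The neutral
        # transfer coefficients (~1.2e-3) are textbook magnitudes, not values
        # fitted to the SEMAPHORE data.
        RHO = 1.2        # air density, kg m-3
        CP = 1004.0      # specific heat of air, J kg-1 K-1
        LV = 2.5e6       # latent heat of vaporization, J kg-1
        CD = CH = CE = 1.2e-3

        def momentum_flux(u10):                # wind stress tau, N m-2
            return RHO * CD * u10 ** 2

        def sensible_heat(u10, t_sea, t_air):  # W m-2
            return RHO * CP * CH * u10 * (t_sea - t_air)

        def latent_heat(u10, q_sea, q_air):    # W m-2, q in kg/kg
            return RHO * LV * CE * u10 * (q_sea - q_air)

        # 10 m s-1 wind, 1 K sea-air contrast, 1 g/kg humidity contrast:
        print(momentum_flux(10.0), sensible_heat(10.0, 292.0, 291.0),
              latent_heat(10.0, 0.014, 0.013))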

  10. On the influence of temporal and spatial resolution of aircraft emission inventories for mesoscale modeling of pollutant dispersion

    Energy Technology Data Exchange (ETDEWEB)

    Franzkowiak, V.; Petry, H.; Ebel, A. [Cologne Univ. (Germany). Inst. for Geophysics and Meteorology

    1997-12-31

    The sensitivity of a mesoscale chemistry transport model to the temporal and spatial resolution of aircraft emission inventories is evaluated. A statistical analysis of air traffic in the North-Atlantic flight corridor is carried out, showing a highly variable, finely structured spatial distribution and a pronounced daily variation. Sensitivity studies comparing different emission scenarios reveal that both the transport and the subsequent chemical formation of secondary products depend strongly on the time and location of the emissions. Introducing a pronounced daily variation leads to a 30% higher ozone production in comparison to uniformly distributed emissions. (author) 9 refs.

  12. Diurnal and seasonal variations in surface methane at a tropical coastal station: Role of mesoscale meteorology.

    Science.gov (United States)

    Kavitha, M; Nair, Prabha R; Girach, I A; Aneesh, S; Sijikumar, S; Renju, R

    2018-08-01

    In view of the large uncertainties in methane (CH4) emission estimates and the large spatial gaps in its measurements, studies of near-surface CH4 on a regional basis are highly relevant. This paper presents first-time observational results on the impact of mesoscale meteorology on the temporal variations of near-surface CH4 at a tropical coastal station in India. It is based on in-situ measurements conducted from January 2014 to August 2016 using an on-line CH4 analyzer working on the principle of gas chromatography. The diurnal variation shows a daytime low (1898-1925 ppbv) and a nighttime high (1936-2022 ppbv) extending until the early morning hours. These changes are closely associated with the mesoscale circulations, namely the Sea Breeze (SB) and Land Breeze (LB), as obtained through meteorological observations, WRF simulations of the circulations, and the diurnal variation of boundary layer height observed by a Microwave Radiometer Profiler. The diurnal enhancement always coincides with the onset of the LB. Several cases of different LB onset timings were examined and the results are presented. The CH4 mixing ratio also exhibits a significant seasonal pattern, with a maximum in winter and a minimum in the pre-monsoon/monsoon, along with significant inter-annual variations; this pattern is also reflected in the diurnal cycles and is associated with changing synoptic meteorology. The paper also presents an analysis of in-situ measured near-surface CH4 together with column-averaged and upper-tropospheric CH4 retrieved by the Atmospheric Infrared Sounder (AIRS) onboard the Earth Observing System (EOS)/Aqua satellite, which gives insight into the vertical distribution of CH4 over the location. An attempt is also made to estimate the instantaneous radiative forcing for the measured CH4 mixing ratio. Copyright © 2018 Elsevier B.V. All rights reserved.
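
    The instantaneous radiative forcing mentioned in the last sentence can be estimated with the simplified expression of Myhre et al. (1998) adopted in the IPCC TAR; the sketch below assumes that formulation (including the CH4-N2O overlap term), which may differ from the scheme the authors actually used:

        import math

        # Sketch: CH4 radiative forcing from the Myhre et al. (1998) simplified
        # expression (mixing ratios in ppbv). Whether the paper used exactly
        # this formulation is an assumption.
        def _overlap(m, n):
            # CH4-N2O band overlap term
            return 0.47 * math.log(1.0 + 2.01e-5 * (m * n) ** 0.75
                                   + 5.31e-15 * m * (m * n) ** 1.52)

        def ch4_forcing(m, m0=722.0, n0=270.0):
            # forcing (W m-2) of CH4 at m ppbv relative to pre-industrial m0
            return (0.036 * (math.sqrt(m) - math.sqrt(m0))
                    - (_overlap(m, n0) - _overlap(m0, n0)))

        # nighttime maximum observed at the site (~2022 ppbv):
        print(round(ch4_forcing(2022.0), 3), "W m-2")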

  13. Deep drivers of mesoscale circulation in the central Rockall Trough

    Science.gov (United States)

    Sherwin, T. J.; Alyenik, D.; Dumont, E.; Inall, M.

    2014-11-01

    Mesoscale variability in the central Rockall Trough between about 56 and 58° N has been investigated using a combination of ship-borne, underwater glider and gridded satellite altimeter measurements. Altimeter observations show that mesoscale features such as eddies and large-scale circulation cells are ubiquitous phenomena. They have horizontal length scales of order 100 km with vertical scales of over 1000 m and are associated with mean current speeds (over the upper 1000 m) of 15 ± 7 cm s-1. Monthly area-averaged surface Eddy Kinetic Energy (EKE) has substantial inter-annual variability, which at times can dominate a mean seasonal signal that varies from a maximum in May (74 cm2 s-2) to a minimum in October (52 cm2 s-2) and has increased gradually since 1992 at about 1.1 cm2 s-2 per year. A five-month glider mission in the Trough showed that much of this energy comes from features that are located over 1000 m below the surface in the deep cold waters of the Trough (possibly from eddies associated with the North Atlantic Current). The surface currents from altimeters had similar magnitude to the drift currents averaged over 1000 m from the glider in the stratified autumn, but were half the deep-water speed during late winter. Although the mesoscale features move in an apparently random manner, they may also be quasi-trapped by submarine topography such as seamounts. Occasionally anti-cyclonic and cyclonic cells combine to cause a coherent westward deflection of the European slope current that warms the Rockall side of the Trough. Such deflections contribute to the inter-annual variability in the observed temperature and salinity that are monitored in the upper 800 m of the Trough. By combining glider and altimeter measurements it is shown that altimeter measurements fail to observe a 15 cm s-1 northward-flowing slope current on the eastern side and a small persistent southward current on the western side. There is much to be gained from the synergy between satellite
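
    Surface EKE figures of the kind quoted above are conventionally derived from gridded sea level anomaly (SLA) maps via geostrophy; a minimal sketch under simple assumptions (uniform latitude-longitude grid, centered differences, no special equatorial treatment):

        import numpy as np

        # Sketch: surface eddy kinetic energy from a gridded SLA field (metres)
        # via geostrophy: u' = -(g/f) dSLA/dy, v' = (g/f) dSLA/dx.
        G, OMEGA = 9.81, 7.2921e-5

        def surface_eke(sla, lat, lon):
            f = 2.0 * OMEGA * np.sin(np.deg2rad(lat))[:, None]   # Coriolis
            dy = 111.0e3 * np.gradient(lat)                      # metres per step
            dx = 111.0e3 * np.gradient(lon) * np.cos(np.deg2rad(lat))[:, None]
            u = -(G / f) * (np.gradient(sla, axis=0) / dy[:, None])
            v = (G / f) * (np.gradient(sla, axis=1) / dx)
            return 0.5 * (u ** 2 + v ** 2)   # m2 s-2; multiply by 1e4 for cm2 s-2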

  14. Mesoscale Benchmark Demonstration Problem 1: Mesoscale Simulations of Intra-granular Fission Gas Bubbles in UO2 under Post-irradiation Thermal Annealing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan; Hu, Shenyang Y.; Montgomery, Robert; Gao, Fei; Sun, Xin; Tonks, Michael; Biner, Bullent; Millet, Paul; Tikare, Veena; Radhakrishnan, Balasubramaniam; Andersson, David

    2012-04-11

    A study was conducted to evaluate the capabilities of different numerical methods used to represent microstructure behavior at the mesoscale for irradiated material using an idealized benchmark problem. The purpose of the mesoscale benchmark problem was to provide a common basis to assess several mesoscale methods with the objective of identifying the strengths and areas of improvement in the predictive modeling of microstructure evolution. In this work, mesoscale models (phase-field, Potts, and kinetic Monte Carlo) developed by PNNL, INL, SNL, and ORNL were used to calculate the evolution kinetics of intra-granular fission gas bubbles in UO2 fuel under post-irradiation thermal annealing conditions. The benchmark problem was constructed to include important mechanisms governing the kinetics of intra-granular fission gas bubble behavior, such as the atomic diffusion of Xe atoms, U vacancies, and O vacancies, the effect of vacancy capture and emission from defects, and the elastic interaction of non-equilibrium gas bubbles. An idealized set of assumptions was imposed on the benchmark problem to simplify the mechanisms considered. The capability and numerical efficiency of the different models are compared against selected experimental and simulation results. These comparisons find that the phase-field methods, by the nature of the free energy formulation, are able to represent a larger subset of the mechanisms influencing intra-granular bubble growth and coarsening in the idealized benchmark problem than the Potts and kinetic Monte Carlo methods. It is recognized that the mesoscale benchmark problem as formulated does not specifically highlight the strengths of the discrete particle modeling used in the Potts and kinetic Monte Carlo methods. Future efforts are recommended to construct increasingly complex mesoscale benchmark problems to further verify and validate the predictive capabilities of mesoscale modeling.

  15. Green's Kernels and meso-scale approximations in perforated domains

    CERN Document Server

    Maz'ya, Vladimir; Nieves, Michael

    2013-01-01

    There is a wide range of applications in physics and structural mechanics involving domains with singular perturbations of the boundary. Examples include perforated domains and bodies with defects of different types. The accurate direct numerical treatment of such problems remains a challenge. Asymptotic approximations offer an alternative, efficient solution. Green’s function is considered here as the main object of study rather than as a tool for generating solutions of specific boundary value problems. The uniformity of the asymptotic approximations is the principal point of attention. We also show substantial links between Green’s functions and solutions of boundary value problems for meso-scale structures. Such systems involve a large number of small inclusions, so that a small parameter, the relative size of an inclusion, may compete with a large parameter, represented as an overall number of inclusions. The main focus of the present text is on two topics: (a) asymptotics of Green’s kernels in domai...

  16. Design of a mesoscale continuous flow route towards lithiated methoxyallene.

    Science.gov (United States)

    Seghers, Sofie; Heugebaert, Thomas S A; Moens, Matthias; Sonck, Jolien; Thybaut, Joris; Stevens, Chris Victor

    2018-05-11

    The unique nucleophilic properties of lithiated methoxyallene allow for C-C bond formation with a wide variety of electrophiles, thus introducing an allenic group for further functionalization. This approach has yielded a tremendously broad range of (hetero)cyclic scaffolds, including API precursors. To date, however, its valorization at scale is hampered by the batch synthesis protocol, which suffers from serious safety issues. Hence, the attractive heat and mass transfer properties of flow technology were exploited to establish a mesoscale continuous flow route towards lithiated methoxyallene. An excellent conversion of 94% was obtained, corresponding to a methoxyallene throughput of 8.2 g/h. The process is characterized by short reaction times, mild reaction conditions and a stoichiometric use of reagents. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Advanced mesoscale forecasts of icing events for Gaspe wind farms

    International Nuclear Information System (INIS)

    Gayraud, A.; Benoit, R.; Camion, A.

    2009-01-01

    Atmospheric icing includes every event which causes ice accumulations of various shapes on different structures. In terms of its effects on wind farms, atmospheric icing can decrease the aerodynamic performance, cause structure overloading, and add vibrations leading to failure and breakage. This presentation discussed advanced mesoscale forecasts of icing events for Gaspe wind farms. The context of the study was outlined with particular reference to atmospheric icing, effects on wind farms, and forecast objectives. The presentation also described the models and results of the study. These included MC2, a compressible community model, as well as a Milbrandt and Yau condensation scheme. The study provided good estimates of the duration of events as well as reliable precipitation categories. tabs., figs.

  18. A three-dimensional viscous topography mesoscale model

    Energy Technology Data Exchange (ETDEWEB)

    Eichhorn, J; Flender, M; Kandlbinder, T; Panhans, W G; Trautmann, T; Zdunkowski, W G [Mainz Univ. (Germany). Inst. fuer Physik der Atmosphaere; Cui, K; Ries, R; Siebert, J; Wedi, N

    1997-11-01

    This study describes the theoretical foundation and applications of a newly designed mesoscale model named CLIMM (climate model Mainz). In contrast to terrain following coordinates, a cartesian grid is used to keep the finite difference equations as simple as possible. The method of viscous topography is applied to the flow part of the model. Since the topography intersects the cartesian grid cells, the new concept of boundary weight factors is introduced for the solution of Poisson's equation. A three-dimensional radiosity model was implemented to handle radiative transfer at the ground. The model is applied to study thermally induced circulations and gravity waves at an idealized mountain. Furthermore, CLIMM was used to simulate typical wind and temperature distributions for the city of Mainz and its rural surroundings. It was found that the model in all cases produced realistic results. (orig.) 38 refs.

  19. Mesoscale simulations of shockwave energy dissipation via chemical reactions.

    Science.gov (United States)

    Antillon, Edwin; Strachan, Alejandro

    2015-02-28

    We use a particle-based mesoscale model that incorporates chemical reactions at a coarse-grained level to study the response of materials that undergo volume-reducing chemical reactions under shockwave-loading conditions. We find that such chemical reactions can attenuate the shockwave and characterize how the parameters of the chemical model affect this behavior. The simulations show that the magnitude of the volume collapse and velocity at which the chemistry propagates are critical to weaken the shock, whereas the energetics in the reactions play only a minor role. Shock loading results in transient states where the material is away from local equilibrium and, interestingly, chemical reactions can nucleate under such non-equilibrium states. Thus, the timescales for equilibration between the various degrees of freedom in the material affect the shock-induced chemistry and its ability to attenuate the propagating shock.

  20. Dynamics of premixed hydrogen/air flames in mesoscale channels

    Energy Technology Data Exchange (ETDEWEB)

    Pizza, Gianmarco [Paul Scherrer Institute, Combustion Research, CH-5232, Villigen PSI (Switzerland); Aerothermochemistry and Combustion Systems Laboratory, Swiss Federal Institute of Technology, CH-8092, Zurich (Switzerland); Frouzakis, Christos E.; Boulouchos, Konstantinos [Aerothermochemistry and Combustion Systems Laboratory, Swiss Federal Institute of Technology, CH-8092, Zurich (Switzerland); Mantzaras, John [Paul Scherrer Institute, Combustion Research, CH-5232, Villigen PSI (Switzerland); Tomboulides, Ananias G. [Department of Engineering and Management of Energy Resources, University of Western Macedonia, 50100 Kozani (Greece)

    2008-10-15

    Direct numerical simulation with detailed chemistry and transport is used to study the stabilization and dynamics of lean (φ = 0.5) premixed hydrogen/air atmospheric-pressure flames in mesoscale planar channels. Channel heights of h = 2, 4, and 7 mm, and inflow velocities in the range 0.3 ≤ U_IN ≤ 1100 cm/s are investigated. Six different burning modes are identified: mild combustion, ignition/extinction, closed steady symmetric flames, open steady symmetric flames, oscillating and, finally, asymmetric flames. Chaotic behavior of cellular flame structures is observed for certain values of U_IN. Stability maps delineating the regions of the different flame types are finally constructed. (author)

  1. Assimilation of Aircraft Observations in High-Resolution Mesoscale Modeling

    Directory of Open Access Journals (Sweden)

    Brian P. Reen

    2018-01-01

    Full Text Available Aircraft-based observations are a promising source of above-surface observations for assimilation into mesoscale model simulations. The Tropospheric Airborne Meteorological Data Reporting (TAMDAR) observations have potential advantages over some other aircraft observations, including the presence of water vapor observations. The impact of assimilating TAMDAR observations via observation nudging in Weather Research and Forecasting model simulations with 1 km horizontal grid spacing is evaluated using five cases centered over California. Overall, the impact of assimilating the observations is mixed: the layer with the greatest benefit is above the surface in the lowest 1000 m above ground level, and the variable showing the most consistent benefit is temperature. Varying the nudging configuration demonstrates the sensitivity of the results to details of the assimilation, but does not clearly demonstrate the superiority of a specific configuration.

  2. Modification of inertial oscillations by the mesoscale eddy field

    Science.gov (United States)

    Elipot, Shane; Lumpkin, Rick; Prieto, Germán

    2010-09-01

    The modification of near-surface near-inertial oscillations (NIOs) by the geostrophic vorticity is studied globally from an observational standpoint. Surface drifters are used to estimate NIO characteristics. Despite its spatial resolution limits, altimetry is used to estimate the geostrophic vorticity. Three characteristics of NIOs are considered: the relative frequency shift with respect to the local inertial frequency; the near-inertial variance; and the inverse excess bandwidth, which is interpreted as a decay time scale. The geostrophic mesoscale flow shifts the frequency of NIOs by approximately half its vorticity. Equatorward of 30°N and S, this effect is added to a global pattern of blue shift of NIOs. While the global pattern of near-inertial variance is interpretable in terms of wind forcing, it is also observed that the geostrophic vorticity organizes the near-inertial variance; it is maximum for near-zero values of the Laplacian of the vorticity and decreases for nonzero values, albeit not as much for positive as for negative values. Because the Laplacian of vorticity and vorticity are anticorrelated in the altimeter data set, overall, more near-inertial variance is found in anticyclonic vorticity regions than in cyclonic regions. While this is compatible with anticyclones trapping NIOs, the organization of near-inertial variance by the Laplacian of vorticity is also in very good agreement with previous theoretical and numerical predictions. The inverse bandwidth is a decreasing function of the gradient of vorticity, which acts like the gradient of planetary vorticity to increase the decay of NIOs from the ocean surface. Because the altimetry data set captures the largest vorticity gradients in energetic mesoscale regions, it is also observed that NIOs decay faster in large geostrophic eddy kinetic energy regions.
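
    The half-vorticity shift described above is the standard effective-frequency result f_eff = f + ζ/2; a minimal sketch of applying it to altimeter-derived geostrophic velocities (uniform grid spacing in metres assumed):

        import numpy as np

        # Sketch: effective near-inertial frequency f_eff = f + zeta/2, with
        # zeta = dv/dx - du/dy the geostrophic relative vorticity estimated
        # from altimeter velocities u, v on a grid with spacings dx, dy (m).
        OMEGA = 7.2921e-5

        def effective_inertial_frequency(u, v, lat_deg, dx, dy):
            f = 2.0 * OMEGA * np.sin(np.deg2rad(lat_deg))
            zeta = np.gradient(v, dx, axis=1) - np.gradient(u, dy, axis=0)
            # anticyclonic zeta (opposite sign to f) lowers f_eff, which is
            # why anticyclones can trap near-inertial waves
            return f + 0.5 * zeta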

  3. Meso-scale modeling of irradiated concrete in test reactor

    International Nuclear Information System (INIS)

    Giorla, A.; Vaitová, M.; Le Pape, Y.; Štemberk, P.

    2015-01-01

    Highlights: • A meso-scale finite element model for irradiated concrete is developed. • Neutron radiation-induced volumetric expansion is a predominant degradation mode. • Confrontation with expansion and damage obtained from experiments is successful. • Effects of paste shrinkage, creep and ductility are discussed. - Abstract: A numerical model accounting for the effects of neutron irradiation on concrete at the mesoscale is detailed in this paper. Irradiation experiments in a test reactor (Elleuch et al., 1972), i.e., in accelerated conditions, are simulated. Concrete is considered as a two-phase material made of elastic inclusions (aggregate) subjected to thermal and irradiation-induced swelling and embedded in a cementitious matrix subjected to shrinkage and thermal expansion. The role of the hardened cement paste in the post-peak regime (brittle-ductile transition with decreasing loading rate) and creep effects are investigated. Radiation-induced volumetric expansion (RIVE) of the aggregate causes the development and propagation of damage around the aggregate, which further develops into bridging cracks across the hardened cement paste between the individual aggregate particles. The development of damage is aggravated when shrinkage occurs simultaneously with RIVE during the irradiation experiment. The post-irradiation expansion derived from the simulation is well correlated with the experimental data, and the obtained damage levels are fully consistent with previous estimations based on a micromechanical interpretation of the experimental post-irradiation elastic properties (Le Pape et al., 2015). The proposed modeling opens new perspectives for the interpretation of test reactor experiments with regard to the actual operation of light water reactors.

  4. Meso-scale modeling of irradiated concrete in test reactor

    Energy Technology Data Exchange (ETDEWEB)

    Giorla, A. [Oak Ridge National Laboratory, One Bethel Valley Road, Oak Ridge, TN 37831 (United States); Vaitová, M. [Czech Technical University, Thakurova 7, 166 29 Praha 6 (Czech Republic); Le Pape, Y., E-mail: lepapeym@ornl.gov [Oak Ridge National Laboratory, One Bethel Valley Road, Oak Ridge, TN 37831 (United States); Štemberk, P. [Czech Technical University, Thakurova 7, 166 29 Praha 6 (Czech Republic)

    2015-12-15

    Highlights: • A meso-scale finite element model for irradiated concrete is developed. • Neutron radiation-induced volumetric expansion is a predominant degradation mode. • Confrontation with expansion and damage obtained from experiments is successful. • Effects of paste shrinkage, creep and ductility are discussed. - Abstract: A numerical model accounting for the effects of neutron irradiation on concrete at the mesoscale is detailed in this paper. Irradiation experiments in a test reactor (Elleuch et al., 1972), i.e., in accelerated conditions, are simulated. Concrete is considered as a two-phase material made of elastic inclusions (aggregate) subjected to thermal and irradiation-induced swelling and embedded in a cementitious matrix subjected to shrinkage and thermal expansion. The role of the hardened cement paste in the post-peak regime (brittle-ductile transition with decreasing loading rate) and creep effects are investigated. Radiation-induced volumetric expansion (RIVE) of the aggregate causes the development and propagation of damage around the aggregate, which further develops into bridging cracks across the hardened cement paste between the individual aggregate particles. The development of damage is aggravated when shrinkage occurs simultaneously with RIVE during the irradiation experiment. The post-irradiation expansion derived from the simulation is well correlated with the experimental data, and the obtained damage levels are fully consistent with previous estimations based on a micromechanical interpretation of the experimental post-irradiation elastic properties (Le Pape et al., 2015). The proposed modeling opens new perspectives for the interpretation of test reactor experiments with regard to the actual operation of light water reactors.

  5. Air Pollutant Distribution and Mesoscale Circulation Systems During Escompte

    Science.gov (United States)

    Kottmeier, Ch.; Kalthoff, N.; Corsmeier, U.; Robin, D.; Thürauf, J.; Hofherr, T.; Hasel, M.

    The distribution of pollutants observed with a Dornier 128 instrumented aircraft and from AIRMARAIX ground stations during one day of the Escompte experiment (June 25, 2001) is analysed in relation to the mesoscale wind systems and vertical mixing from aircraft and radiosonde data. The ESCOMPTE experiment (http://medias.obs-mip.fr/escompte) was carried out in June and July 2001 in the urban area of Marseille and its rural surroundings to investigate periods with photosmog conditions. The overall aim is to produce an appropriate high-quality 3-D data set which includes emission, meteorological, and chemical data. The data is used for the validation of mesoscale models and for chemical and meteorological process studies. The evolution of photosmog episodes with high ozone concentrations depends on both chemical transformation processes and meteorological conditions. As Marseille is situated between the Mediterranean Sea in the south and mountainous sites in the north, under weak large-scale flow the meteorological conditions are dominated by thermally driven circulation systems which strongly influence the horizontal transport of air pollutants. Additionally, vertical exchange processes like mountain venting and slope winds may contribute to the temporal evolution of the trace gas concentration of the city plume in the atmospheric boundary layer and are particularly studied by the Dornier flight measurements. Therefore the experiment was designed to measure both the chemical species and meteorological parameters with high resolution in space and time by surface stations, aircraft and vertical profiling systems like radiosondes, sodars and lidars. Results are shown (a) on the evolution of the wind field and the ozone concentrations during June 25, when an ozone maximum developed about 60 km in the lee of Marseille, and (b) on the vertical transport of air pollutants between the boundary layer and the free troposphere.

  6. Coherent mesoscale eddies in the North Atlantic subtropical gyre: 3-D structure and transport with application to the salinity maximum

    Science.gov (United States)

    Amores, Angel; Melnichenko, Oleg; Maximenko, Nikolai

    2017-01-01

    The mean vertical structure and transport properties of mesoscale eddies are investigated in the North Atlantic subtropical gyre by combining historical records of Argo temperature/salinity profiles and satellite sea level anomaly data in the framework of the eddy tracking technique. The study area is characterized by a low eddy kinetic energy and a sea surface salinity maximum. Although eddies have a relatively weak signal at the surface (amplitudes around 3-7 cm), the eddy composites reveal a clear deep signal that penetrates down to at least 1200 m depth. The analysis also reveals that the vertical structure of the eddy composites is strongly affected by the background stratification. The horizontal patterns of temperature/salinity anomalies can be reconstructed by a linear combination of a monopole, related to the elevation/depression of the isopycnals in the eddy core, and a dipole, associated with the horizontal advection of the background gradient by the eddy rotation. A common feature of all the eddy composites reconstructed is the phase coherence between the eddy temperature/salinity and velocity anomalies in the upper ~300 m layer, resulting in transient eddy transports of heat and salt. As an application, a box model of the near-surface layer is used to estimate the role of mesoscale eddies in maintaining a quasi-steady-state distribution of salinity in the North Atlantic subtropical salinity maximum. The results show that mesoscale eddies are able to provide between 4 and 21% of the salt flux out of the area required to compensate for the local excess of evaporation over precipitation.

  7. A Study of Mesoscale Gravity Waves over the North Atlantic with Satellite Observations and a Mesoscale Model

    Science.gov (United States)

    Wu, Dong L.; Zhang, Fuqing

    2004-01-01

    Satellite microwave data are used to study gravity wave properties and variabilities over the northeastern United States and the North Atlantic in the December-January periods. The gravity waves in this region, found in many winters, can reach the stratopause with growing amplitude. The Advanced Microwave Sounding Unit-A (AMSU-A) observations show that the wave occurrences are correlated well with the intensity and location of the tropospheric baroclinic jet front systems. To further investigate the cause(s) and properties of the North Atlantic gravity waves, we focus on a series of wave events during 19-21 January 2003 and compare AMSU-A observations to simulations from a mesoscale model (MM5). The simulated gravity waves compare qualitatively well with the satellite observations in terms of wave structures, timing, and overall morphology. Excitation mechanisms of these large-amplitude waves in the troposphere are complex and subject to further investigations.

  8. MIZEX. A Program for Mesoscale Air-Ice-Ocean Interaction Experiments in Arctic Marginal Ice Zones. II. A Science Plan for a Summer Marginal Ice Zone Experiment in the Fram Strait/Greenland Sea: 1984.

    Science.gov (United States)

    1983-05-01

    size and thickness characteris- tics. N’ore complete analysis will require combin- ing ice data with data obtained by the oceano - graphic... sol concentration and microwave brightness tem- perature. A long-range aircraft and a light aircraft Hying from Spitzbergen will study mesoscale

  9. Tools and Methods for Visualization of Mesoscale Ocean Eddies

    Science.gov (United States)

    Bemis, K. G.; Liu, L.; Silver, D.; Kang, D.; Curchitser, E.

    2017-12-01

    Mesoscale ocean eddies form in the Gulf Stream and transport heat and nutrients across the ocean basin. The internal structure of these three-dimensional eddies and the kinematics with which they move are critical to a full understanding of their transport capacity. A series of visualization tools have been developed to extract, characterize, and track ocean eddies from 3D modeling results, to visually show the ocean eddy story by applying various illustrative visualization techniques, and to interactively view results stored on a server from a conventional browser. In this work, we apply a feature-based method to track instances of ocean eddies through the time steps of a high-resolution multidecadal regional ocean model and generate a series of eddy paths which reflect the life cycle of individual eddy instances. The basic method uses the Okubo-Weiss parameter to define eddy cores but could be adapted to alternative specifications of an eddy. Stored results include pixel-lists for each eddy instance, tracking metadata for eddy paths, and physical and geometric properties. In the simplest view, isosurfaces are used to display eddies along an eddy path. Individual eddies can then be selected and viewed independently or an eddy path can be viewed in the context of all eddy paths (longer than a specified duration) and the ocean basin. To tell the story of mesoscale ocean eddies, we combined illustrative visualization techniques, including visual effectiveness enhancement, focus+context, and smart visibility, with the extracted volume features to explore eddy characteristics at multiple scales from ocean basin to individual eddy. An evaluation by domain experts indicates that combining our feature-based techniques with illustrative visualization techniques provides an insight into the role eddies play in ocean circulation. A web-based GUI is under development to facilitate easy viewing of stored results. The GUI provides the user control to choose amongst available
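
    The Okubo-Weiss parameter used to define eddy cores weighs strain against vorticity, W = s_n^2 + s_s^2 - ζ^2; a minimal sketch with the common W < -0.2·σ_W core criterion, a literature convention assumed here rather than the authors' stated threshold:

        import numpy as np

        # Sketch: Okubo-Weiss parameter from a 2-D velocity field; strongly
        # negative W marks vorticity-dominated (eddy-core) cells.
        def okubo_weiss(u, v, dx, dy):
            du_dx = np.gradient(u, dx, axis=1)
            du_dy = np.gradient(u, dy, axis=0)
            dv_dx = np.gradient(v, dx, axis=1)
            dv_dy = np.gradient(v, dy, axis=0)
            sn = du_dx - dv_dy      # normal strain
            ss = dv_dx + du_dy      # shear strain
            zeta = dv_dx - du_dy    # relative vorticity
            return sn ** 2 + ss ** 2 - zeta ** 2

        def eddy_core_mask(u, v, dx, dy):
            w = okubo_weiss(u, v, dx, dy)
            return w < -0.2 * np.std(w)   # common, but assumed, threshold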

  10. Modulating Effects of Mesoscale Oceanic Eddies on Sea Surface Temperature Response to Tropical Cyclones Over the Western North Pacific

    Science.gov (United States)

    Ma, Zhanhong; Fei, Jianfang; Huang, Xiaogang; Cheng, Xiaoping

    2018-01-01

    The impact of mesoscale oceanic eddies on the temporal and spatial characteristics of sea surface temperature (SST) response to tropical cyclones is investigated in this study based on composite analysis of cyclone-eddy interactions over the western North Pacific. The occurrence times of maximum cooling, the recovery time, and the spatial patterns of the SST response are specially evaluated. The influence of cold-core eddies (CCEs) makes the mean occurrence time of maximum SST cooling about half a day later than in eddy-free conditions, while warm-core eddies (WCEs) have little effect in this respect. The recovery of the SST cooling also takes longer in the presence of CCEs, an effect that is overall more pronounced for stronger or slower tropical cyclones. The effect of WCEs on the recovery time is again not significant. The modulation of the maximum SST decrease by WCEs for category 2-5 storms is found to be remarkable in the subtropical region but not evident in the tropical region, while the role of CCEs is remarkable in both regions. The CCEs are observed to change the spatial characteristics of the SST response, with enhanced SST decrease initially on the right side of the storm track. During the recovery period the strengthened SST cooling associated with CCEs propagates leftward gradually, a behavior similar to both the westward propagation of the eddies and the recovery of the cold wake. These results underscore the importance of resolving mesoscale oceanic eddies in coupled numerical models to improve the prediction of storm-induced SST response.

  11. Meso-scale modelling of the heat conductivity effect on the shock response of a porous material

    Science.gov (United States)

    Resnyansky, A. D.

    2017-06-01

    Understanding the deformation mechanisms of porous materials under shock compression is important for tailoring material properties during the shock manufacturing of advanced materials from substrate powders and for studying the response of porous materials under shock loading. The numerical set-up of the present work considers a set of solid particles separated by air, representing a volume of porous material. Condensed material in the meso-scale set-up is simulated with a viscoelastic rate-sensitive material model with heat conduction, formulated from the principles of irreversible thermodynamics. The model is implemented in the CTH shock physics code. The meso-scale CTH simulation of the shock loading of the representative volume reveals the mechanism of pore collapse and shows in detail the transition from a high-porosity case, typical of an abnormal Hugoniot response, to a moderate-porosity case, typical of a conventional Hugoniot response. Results of the analysis agree with previous analytical considerations and support hypotheses used in the two-phase approach.

  12. Diatoms as a fingerprint of sub-catchment contributions to meso-scale catchment runoff

    Science.gov (United States)

    Klaus, Julian; Wetzel, Carlos E.; Martinez-Carreras, Nuria; Ector, Luc; Pfister, Laurent

    2014-05-01

    In recent years, calls were made for new eco-hydrological approaches to improve understanding of hydrological processes. Recently, diatoms, one of the most common and diverse algal groups that can be easily transported by flowing water due to their small size (~10-200 µm), were used to detect the onset and cessation of surface runoff to small headwater streams and to constrain isotopic and hydro-chemical hydrograph separation methods. While the method showed its potential in the hillslope-riparian zone-stream continuum of headwater catchments, the behavior of diatoms and their use for hydrological process research in meso-scale catchments remain uncertain. Diatoms can be a valuable support for isotope and hydro-chemical tracer methods when these become ambiguous with increasing scale. Distribution and abundance of diatom species are controlled by various environmental factors (pH, soil type, moisture conditions, exposure to sunlight, etc.). We therefore hypothesize that species abundance and composition can be used as a proxy for source areas. This presentation evaluates the potential for diatoms to trace source areas in the nested meso-scale Attert River basin (250 km2, Luxembourg, Europe). We sampled diatom populations in streamwater during one flood event in Fall 2011 in 6 sub-catchments and at the basin outlet - 17 to 28 samples/catchment for the different sampling locations. Diatoms were classified and counted in every individual sample. In total, more than 400 diatom species were detected. Ordination analysis revealed a clear distinction between communities sampled in different sub-catchments. The species composition at the catchment outlet reflects a mixing of the diatom compositions originating from the different sub-catchments. These data suggest that diatoms can indeed reflect the geographic origin of stream water at the catchment outlet. The centroids of the ordination analysis might be linked to the physiographic characteristics (geology and land use) of the

  13. Mesoscale surface equivalent temperature (TE) for East Central USA

    Science.gov (United States)

    Younger, Keri; Mahmood, Rezaul; Goodrich, Gregory; Pielke, Roger A.; Durkee, Joshua

    2018-04-01

    The purpose of this research is to investigate near-surface mesoscale equivalent temperatures (TE) in Kentucky (located in east central USA) and potential land cover influences. TE is a measure of the moist enthalpy composed of the dry bulb temperature, T, and absolute humidity. Kentucky presents a unique opportunity to perform a study of this kind because of the observational infrastructure provided by the Kentucky Mesonet (www.kymesonet.org). This network maintains 69 research-grade, in-situ weather and climate observing stations across the Commonwealth. Equivalent temperatures were calculated utilizing high-quality observations from 33 of these stations. In addition, the Kentucky Mesonet offers higher spatial and temporal resolution than previous research on this topic. As expected, the differences (TE - T) were greatest in the summer (smallest in the winter), with an average of 35 °C (5 °C). In general, the differences were found to be the largest in the western climate division. This is attributed to agricultural land use and poorly drained land. These differences are smaller during periods of drought, signifying less influence of moisture.
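
    The equivalent temperature in question is the moist-enthalpy temperature, commonly written TE = T + (Lv/cp)·q; a minimal sketch with rounded constants and the specific humidity supplied directly:

        # Sketch: equivalent temperature TE = T + (Lv/cp) * q, the temperature
        # the air would have if all its water vapour condensed.
        LV = 2.5e6    # latent heat of vaporization, J kg-1
        CP = 1004.0   # specific heat of dry air, J kg-1 K-1

        def equivalent_temperature(t_k, q_kg_kg):
            return t_k + (LV / CP) * q_kg_kg

        # a humid summer afternoon: T = 30 degC, q = 15 g/kg
        t, q = 303.15, 0.015
        print(round(equivalent_temperature(t, q) - t, 1), "K (TE - T)")

    With these illustrative numbers TE - T is about 37 K, consistent in magnitude with the ~35 °C summer differences reported above.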

  14. Condensate localization by mesoscale disorder in high-Tc superconductors

    International Nuclear Information System (INIS)

    Kumar, N.

    1994-06-01

    We propose and solve approximately a phenomenological model for Anderson localization of the macroscopic wavefunction for an inhomogeneous superconductor quench-disordered on the mesoscale of the order of the coherence length ξ0. Our treatment is based on the non-linear Schroedinger equation resulting from the Ginzburg-Landau free-energy functional having a spatially random coefficient representing spatial disorder of the pairing interaction. Linearization of the equation, valid close to the critical temperature Tc, or to the upper critical field Hc2(Tc), maps it to the Anderson localization problem with Tc identified with the mobility edge. For the highly anisotropic high-Tc materials and thin (2D) films in the quantum Hall geometry, we predict windows of re-entrant superconductivity centered at integrally spaced temperature values. Our model treatment also provides a possible explanation for the critical current Jc⊥ becoming non-zero on cooling before Jc∥ does in some high-Tc superconductors. (author). 18 refs
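
    In LaTeX form, the model described above plausibly reads as follows (a reconstruction from the abstract, not the paper's own notation):

        F[\psi] = \int \mathrm{d}^3 r \left[ \frac{\hbar^2}{2m^*}\,|\nabla\psi|^2
                  + a(\mathbf{r})\,|\psi|^2 + \frac{b}{2}\,|\psi|^4 \right],
        \qquad a(\mathbf{r}) = \bar{a}(T) + \delta a(\mathbf{r}),

        % stationarity gives the non-linear Schroedinger equation
        -\frac{\hbar^2}{2m^*}\,\nabla^2\psi + a(\mathbf{r})\,\psi + b\,|\psi|^2\psi = 0,

        % dropping the cubic term close to T_c leaves a random-potential
        % eigenproblem, with -\bar{a}(T) playing the role of the energy:
        -\frac{\hbar^2}{2m^*}\,\nabla^2\psi + \delta a(\mathbf{r})\,\psi = -\bar{a}(T)\,\psi .

    Since ā(T) passes through zero at the mean-field Tc, sweeping the temperature scans the eigenvalue through the spectrum of the random potential, which is how Tc comes to play the role of a mobility edge.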

  15. 2D mesoscale colloidal crystal patterns on polymer substrates

    Science.gov (United States)

    Bredikhin, Vladimir; Bityurin, Nikita

    2018-05-01

    The development of nanosphere lithography relies on the ability to deposit 2D colloidal crystals comprising micro- and nano-sized elements on substrates of different materials. One of the most difficult problems here is the deposition of coatings on hydrophobic substrates, e.g. polymers, from aqueous colloidal solutions. We use UV photooxidation for substrate hydrophilization. We demonstrate a new method of producing a two-dimensional ordered array of polymer microparticles (polystyrene microspheres ∼1 μm in diameter) on a polymer substrate (PMMA). We show that implementation of the new deposition technique for directed self-assembly of microspheres on a UV-irradiated surface provides an opportunity to obtain coatings on a hydrophilized PMMA surface of large area (∼5 cm2). UV irradiation of the surface through masks allows creating 2D patterns consisting of mesoscale elements formed by the deposited self-assembled microparticles, owing to the fact that the colloidal particles are deposited only on the irradiated area, leaving the non-irradiated sections intact.

  16. Modeling of Mesoscale Variability in Biofilm Shear Behavior.

    Directory of Open Access Journals (Sweden)

    Pallab Barai

    Full Text Available Formation of bacterial colonies as biofilm on the surface/interface of various objects has the potential to impact not only human health and disease but also energy and environmental considerations. Biofilms can be regarded as soft materials, and comprehension of their shear response to external forces is a key element of their fundamental understanding. A mesoscale model is presented in this article based on digitization of a biofilm microstructure. Its response under externally applied shear load is analyzed. Strain-stiffening-type behavior is readily observed under high strain loads due to the unfolding of chains within the soft polymeric substrate. Sustained shear loading of the biofilm network results in strain localization along the diagonal direction. Rupture of the soft polymeric matrix can potentially reduce the intercellular interaction between the bacterial cells. Evolution of stiffness within the biofilm network under shear reveals two regimes: (a) an initial increase in stiffness due to strain stiffening of the polymer matrix, and (b) an eventual reduction in stiffness because of tearing of the polymeric substrate.

  17. MICRO-SEISMOMETERS VIA ADVANCED MESO-SCALE FABRICATION

    Energy Technology Data Exchange (ETDEWEB)

    Garcia, Caesar A; Onaran, Guclu; Avenson, Brad; Hall, Neal

    2014-11-07

    The Department of Energy (DOE) and the National Nuclear Security Administration (NNSA) seek revolutionary sensing innovations for the monitoring of nuclear detonations. Performance specifications are to be consistent with those obtainable by only an elite few products available today, but with orders of magnitude reduction in size, weight, power, and cost. The proposed commercial innovation calls upon several technologies including the combination of meso-scale fabrication and assembly, photonics-based displacement/motion detection methods, and the use of digital control electronics. Early Phase II development has demonstrated a verified and repeatable sub-2 ng noise floor from 3 Hz to 100 Hz, compact integration of 3-axis prototypes, and robust deployment exercises. Ongoing developments are focusing on low-frequency challenges, low power consumption, ultra-miniature size, and low cross-axis sensitivity. We are also addressing the rigorous set of specifications required for repeatable and reliable long-term explosion monitoring, including thermal stability, reduced recovery time from mass re-centering and large mechanical shocks, sensitivity stability, and transportability. Successful implementation will result in small, hand-held demonstration units with the ability to address national security needs of the DOE/NNSA. Additional applications envisioned include military/defense, scientific instrumentation, oil and gas exploration, inertial navigation, and civil infrastructure monitoring.

  18. Flame dynamics of a meso-scale heat recirculating combustor

    Energy Technology Data Exchange (ETDEWEB)

    Vijayan, V.; Gupta, A.K. [Department of Mechanical Engineering, University of Maryland, College Park, MD 20742 (United States)

    2010-12-15

    The dynamics of a premixed propane-air flame in a meso-scale ceramic combustor has been examined here. The flame characteristics in the combustor were examined by measuring the acoustic emissions and preheat temperatures together with high-speed cinematography. For the small-scale combustor, the volume-to-surface-area ratio is small and hence the walls have a significant effect on the global flame structure, flame location and flame dynamics. In addition to the flame-wall thermal coupling there is a coupling between flame and acoustics in the case of confined flames. Flame-wall thermal interactions lead to low-frequency flame fluctuations (∼100 Hz) depending upon the thermal response of the wall. However, the flame-acoustic interactions can result in a wide range of flame fluctuations ranging from a few hundred Hz to a few kHz. Wall temperature distribution is one of the factors that control the amount of reactant preheating, which in turn affects the location of flame stabilization. Acoustic emission signals and high-speed flame imaging confirmed that for the present case flame-acoustic interactions have the more significant effect on flame dynamics. Based on the acoustic emissions, five different flame regimes have been identified: whistling/harmonic mode, rich instability mode, lean instability mode, silent mode and pulsating flame mode. (author)

  19. Understanding Mesoscale Land-Atmosphere Interactions in Arctic Region

    Science.gov (United States)

    Hong, X.; Wang, S.; Nachamkin, J. E.

    2017-12-01

    Land-atmosphere interactions in the Arctic region are examined using the U.S. Navy Coupled Ocean/Atmosphere Mesoscale Prediction System (COAMPS) with the Noah Land Surface Model (LSM). Initial land surface variables in COAMPS are interpolated from the real-time NASA Land Information System (LIS). The model simulations are configured for three nested grids with 27, 9, and 3 km horizontal resolutions. The simulation period is set for October 2015 with a 12-h data assimilation update cycle and a 24-h integration length. The results are compared with those simulated without the LSM and evaluated with observations from the ONR Sea State R/V Sikuliaq cruise and the North Slope of Alaska (NSA). The simulation with the LSM resolves complex soil and vegetation types over the surface, which the simulation without the LSM lacks. The results show substantial differences in surface heat fluxes between the bulk surface scheme and the LSM, which may have an important impact on sea ice evolution over the Arctic region. Evaluations against station data show that surface air temperature and relative humidity have smaller biases in the simulation using the LSM. The diurnal variation of land surface temperature, which is essential to land-atmosphere exchange processes, is also better captured with the LSM.

  20. Prediction of shock initiation thresholds and ignition probability of polymer-bonded explosives using mesoscale simulations

    Science.gov (United States)

    Kim, Seokpum; Wei, Yaochi; Horie, Yasuyuki; Zhou, Min

    2018-05-01

    The design of new materials requires establishment of macroscopic measures of material performance as functions of microstructure. Traditionally, this process has been an empirical endeavor. An approach to computationally predict the probabilistic ignition thresholds of polymer-bonded explosives (PBXs) using mesoscale simulations is developed. The simulations explicitly account for microstructure, constituent properties, and interfacial responses and capture the processes responsible for the development of hotspots and damage. The specific mechanisms tracked include viscoelasticity, viscoplasticity, fracture, post-fracture contact, frictional heating, and heat conduction. The probabilistic analysis uses sets of statistically similar microstructure samples to directly mimic relevant experiments for quantification of statistical variations of material behavior due to inherent material heterogeneities. The particular thresholds and ignition probabilities predicted are expressed in James-type and Walker-Wasley-type relations, leading to the establishment of explicit analytical expressions for the ignition probability as a function of loading. Specifically, the ignition thresholds corresponding to any given level of ignition probability and ignition probability maps are predicted for PBX 9404 for the loading regime of U_p = 200-1200 m/s, where U_p is the particle speed. The predicted results are in good agreement with available experimental measurements. A parametric study also shows that binder properties can significantly affect the macroscopic ignition behavior of PBXs. The capability to computationally predict the macroscopic engineering material response relations out of material microstructures and basic constituent and interfacial properties lends itself to the design of new materials as well as the analysis of existing materials.
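
    As a toy illustration of what a Walker-Wasley-type threshold with a probabilistic spread looks like, the sketch below uses a power-flux-times-duration loading variable U_p^2·τ and a logistic spread; the coefficients are invented for illustration and are not the paper's fitted James or Walker-Wasley constants:

        import math

        # Toy sketch: Walker-Wasley-type loading variable e = U_p^2 * tau with
        # a logistic ignition probability around the 50% level. E50 and WIDTH
        # are made-up illustrative numbers, not fitted PBX 9404 constants.
        E50, WIDTH = 0.6, 0.15    # (m/s)^2 * s

        def ignition_probability(u_p, tau):
            e = u_p ** 2 * tau
            return 1.0 / (1.0 + math.exp(-(e - E50) / WIDTH))

        for u_p in (400.0, 800.0, 1200.0):    # particle speeds, m/s
            print(u_p, round(ignition_probability(u_p, 1.0e-6), 3))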

  1. Meteorology, Macrophysics, Microphysics, Microwaves, and Mesoscale Modeling of Mediterranean Mountain Storms: The M8 Laboratory

    Science.gov (United States)

    Starr, David O. (Technical Monitor); Smith, Eric A.

    2002-01-01

    Comprehensive understanding of the microphysical nature of Mediterranean storms can be accomplished by a combination of in situ meteorological data analysis and radar-passive microwave data analysis, effectively integrated with numerical modeling studies at various scales, from the synoptic scale down through the mesoscale, the cloud macrophysical scale, and ultimately the cloud microphysical scale. The microphysical properties of severe storms, and the controls on them, are intrinsically related to the meteorological processes under which the storms have evolved, processes which eventually select and control the dominant microphysical properties themselves. This involves intense convective development, stratiform decay, orographic lifting, and sloped frontal lifting processes, as well as the associated vertical motions and thermodynamical instabilities governing physical processes that affect details of the size distributions and fall rates of the various types of hydrometeors found within the storm environment. For hazardous Mediterranean storms, highlighted in this study by three mountain storms producing damaging floods in northern Italy between 1992 and 2000, developing a comprehensive microphysical interpretation requires an understanding of the multiple phases of storm evolution and the heterogeneous nature of precipitation fields within a storm domain. This involves convective development, stratiform transition and decay, orographic lifting, and sloped frontal lifting processes, along with the vertical motions and thermodynamical instabilities governing physical processes that determine details of the liquid/ice water contents, size distributions, and fall rates of the various modes of hydrometeors found within hazardous storm environments.

  2. The effect of network resolution on data assimilation in a mesoscale model

    International Nuclear Information System (INIS)

    Dudhia, J.

    1994-01-01

    One goal of the Atmospheric Radiation Measurement (ARM) Program is to characterize meteorological fields over wide areas (200-km square) in order to better parameterize sub-grid-scale variability in the general circulation models used for climate studies. Such detailed knowledge over these areas is impossible to obtain with current observational methods alone, but the synthesis of a dataset by combining observations with a mesoscale numerical model is feasible. Current data assimilation techniques allow observed data to be incorporated while a model is running, thus constraining the model to fit the data while keeping the data dynamically consistent with the model atmosphere. This interaction may therefore be regarded as a dynamical analysis technique. The technique used for data assimilation here is the nudging method (Stauffer and Seaman 1990; Kuo and Guo 1989), specifically observational nudging, in which data at observational sites are gradually forced into the model without the need for a gridded analysis. This method is particularly appropriate for asynoptic data covering meso-β-scales, such as will be available at the Cloud and Radiation Testbed (CART) sites, and makes it possible to incorporate the wide variety of data coming from these sites.
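
    As a minimal illustration of the nudging (Newtonian relaxation) idea described above, the sketch below relaxes a scalar model state toward an observed value while the model dynamics run; the coefficient G, the toy tendency function, and all values are illustrative assumptions.

```python
# Minimal Newtonian-relaxation ("nudging") sketch: state tendency is the
# model dynamics plus a term pulling the state toward the observation.
dt = 60.0          # model time step (s)
G = 1.0 / 3600.0   # nudging coefficient (1/s), i.e. a one-hour relaxation time
x = 280.0          # model state, e.g. a temperature (K)
x_obs = 283.0      # observed value valid over the assimilation window

def tendency(x):
    # placeholder for the full model dynamics F(x); assumed weak relaxation
    return -1.0e-5 * (x - 281.0)

for _ in range(int(6 * 3600 / dt)):            # six hours of assimilation
    x += dt * (tendency(x) + G * (x_obs - x))  # dynamics + nudging term

print(f"nudged state after 6 h: {x:.2f} K")
```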

  3. Observations of Coastally Transitioning West African Mesoscale Convective Systems during NAMMA

    Directory of Open Access Journals (Sweden)

    Bradley W. Klotz

    2012-01-01

    Observations from the NASA 10 cm polarimetric Doppler weather radar (NPOL) were used to examine the structure, development, and oceanic transition of West African Mesoscale Convective Systems (MCSs) during the NASA African Monsoon Multidisciplinary Analysis (NAMMA) to determine possible indicators leading to downstream tropical cyclogenesis. Characteristics examined from the NPOL data include echo-top heights, maximum radar reflectivity, height of maximum radar reflectivity, and convective and stratiform coverage areas. Atmospheric radiosondes launched during NAMMA were used to investigate the environmental stability characteristics that the MCSs encountered while over land and ocean, respectively. Strengths of African Easterly Waves (AEWs) were examined along with the MCSs in order to improve the analysis of MCS characteristics. Mean structural and environmental characteristics were calculated for systems that produced TCs and for those that did not, in order to determine differences between the two types. Echo-top heights were similar between the two types, but maximum reflectivity and the height and coverage of intense convection (>50 dBZ) are all larger for the TC-producing cases. Striking differences in environmental conditions related to future TC formation include a stronger African Easterly Jet, increased moisture, especially at middle and upper levels, and increased stability as the MCSs transition across the coast.

  4. Application of a mesoscale forecasting model (NMM) coupled to the CALMET to develop forecast meteorology to use with the CALPUFF air dispersion model

    International Nuclear Information System (INIS)

    Radonjic, Z.; Telenta, B.; Kirklady, J.; Chambers, D.; Kleb, H.

    2006-01-01

    An air quality assessment was undertaken as part of the Environmental Assessment for the Port Hope Area Initiative. The assessment predicted potential effects associated with the remediation efforts for historic low-level radioactive wastes and construction of Long-Term Waste Management Facilities (LTWMFs) for both the Port Hope and Port Granby Projects. A necessary element of air dispersion modelling is the development of suitable meteorological data. For the Port Hope and Port Granby Projects, a meteorological station was installed in close proximity to the location of the recommended LTWMF in Port Hope. The recommended location for the Port Granby LTWMF is approximately 10 km west of the Port Hope LTWMF. Concerns were raised regarding the applicability of data collected for the Port Hope meteorological station to the Port Granby Site. To address this concern, a new method for processing meteorological data, which coupled mesoscale meteorological forecasting data the U.S. EPA CALMET meteorological data processor, was applied. This methodology is possible because a new and advanced mesoscale forecasting modelling system enables extensive numerical calculations on personal computers. As a result of this advancement, mesoscale forecasting systems can now be coupled with the CALMET meteorological data processor and the CALPUFF air dispersion modelling system to facilitate wind field estimations and air dispersion analysis. (author)

  5. Seasonal and mesoscale variability of oceanic transport of anthropogenic CO2

    Directory of Open Access Journals (Sweden)

    J.-C. Dutay

    2009-11-01

    Estimates of the ocean's large-scale transport of anthropogenic CO2 are based on one-time hydrographic sections, but the temporal variability of this transport has not been investigated. The aim of this study is to evaluate how seasonal and mesoscale variability affect data-based estimates of anthropogenic CO2 transport. To diagnose this variability, we made a global anthropogenic CO2 simulation using an eddy-permitting version of the coupled ocean sea-ice model ORCA-LIM. As for heat transport, the seasonally varying transport of anthropogenic CO2 is largest within 20° of the equator and shows secondary maxima in the subtropics. Ekman transport generally drives most of the seasonal variability, but the contribution of the vertical shear becomes important near the equator and in the Southern Ocean. Mesoscale variability contributes to the annual-mean transport of both heat and anthropogenic CO2, with strong poleward transport in the Southern Ocean and equatorward transport in the tropics. This "rectified" eddy transport is largely baroclinic in the tropics and barotropic in the Southern Ocean due to a larger contribution from standing eddies. Our analysis revealed that most previous hydrographic estimates of the meridional transport of anthropogenic CO2 are severely biased because they neglect temporal fluctuations due to non-Ekman velocity variations. In each of the three major ocean basins, this bias is largest near the equator and in the high southern latitudes. In the subtropical North Atlantic, where most of the hydrographic-based estimates have been focused, this uncertainty represents up to 20% and 30% of the total meridional transport of heat and CO2. Generally though, outside the tropics and Southern Ocean, there are only small variations in meridional transport due to seasonal variations in tracer fields and time variations in eddy transport. For the North Atlantic, eddy variability accounts for up to 10% and 15% of the total transport of heat and anthropogenic CO2, respectively.

  6. Mesoscale energetics and flows induced by sea-land and mountain-valley contrasts

    Directory of Open Access Journals (Sweden)

    S. Federico

    2000-02-01

    We study the relative importance of sea-land and mountain-valley thermal contrasts in determining the development of thermally forced mesoscale circulations (TFMCs) over a mountainous peninsula. We first analyse the energetics of the problem and then use this theory to interpret numerical simulations over Calabria, a mountainous peninsula in southern Italy. The CSU 3-D nonlinear numerical model is utilised to simulate the dynamics and thermodynamics of the atmospheric fields over Calabria. Results show the importance of orography in determining the pattern of the flow and the local climate in a region as complex as Calabria. Analysis of the results shows that the energetics due to sea-land interactions are most efficient when the peninsula is flat. The importance of the energy due to the sea-land contrast decreases as the mountain height of the peninsula increases. The energy stored over the mountain gains in importance, until the energy released by the readjustment of the warm mountain air prevails over the energy released by the inland penetration of the sea breeze front. For instance, our results show that over a peninsula 100 km wide the energy over the mountain and the energy in the sea-land contrast are of the same order when the height of the mountain is about 700 m, for a 1500 m convective boundary layer (CBL) depth. Over the Calabrian peninsula, the energy released by the hot air in the CBL of the mountain prevails over the energy released by the inland penetration of the sea air. Calabria is about 1500 m high and about 50 km wide, and the CBL is of the order of 1500 m; the energy over the mountain is about four times larger than the energy contained in the sea-land contrast. Furthermore, the energetics increase with the patch width of the peninsula, and when its half width is much less than the Rossby radius, the MAPE of the sea breeze is negligible. When its half width is much larger than the Rossby radius, the breezes from the two

  7. Mesoscale Characterization of Fracture Properties of Steel Fiber-Reinforced Concrete Using a Lattice–Particle Model

    Directory of Open Access Journals (Sweden)

    Francisco Montero-Chacón

    2017-02-01

    This work presents a lattice–particle model for the analysis of steel fiber-reinforced concrete (SFRC). In this approach, fibers are explicitly modeled and connected to the concrete matrix lattice via interface elements. The interface behavior was calibrated by means of pullout tests, and a range for the bond properties is proposed. The model was validated with analytical and experimental results under uniaxial tension and compression, demonstrating the ability of the model to correctly describe the effect of fiber volume fraction and distribution on the fracture properties of SFRC. The lattice–particle model was integrated into a hierarchical homogenization-based scheme in which macroscopic material parameters are obtained from mesoscale simulations. Moreover, a representative volume element (RVE) analysis was carried out, and the results show that such an RVE does exist in the post-peak regime and until localization takes place. Finally, the multiscale upscaling strategy was successfully validated with three-point bending tests.

  10. An overview of mesoscale aerosol processes, comparisons, and validation studies from DRAGON networks

    Science.gov (United States)

    Holben, Brent N.; Kim, Jhoon; Sano, Itaru; Mukai, Sonoyo; Eck, Thomas F.; Giles, David M.; Schafer, Joel S.; Sinyuk, Aliaksandr; Slutsker, Ilya; Smirnov, Alexander; Sorokin, Mikhail; Anderson, Bruce E.; Che, Huizheng; Choi, Myungje; Crawford, James H.; Ferrare, Richard A.; Garay, Michael J.; Jeong, Ukkyo; Kim, Mijin; Kim, Woogyung; Knox, Nichola; Li, Zhengqiang; Lim, Hwee S.; Liu, Yang; Maring, Hal; Nakata, Makiko; Pickering, Kenneth E.; Piketh, Stuart; Redemann, Jens; Reid, Jeffrey S.; Salinas, Santo; Seo, Sora; Tan, Fuyi; Tripathi, Sachchida N.; Toon, Owen B.; Xiao, Qingyang

    2018-01-01

    Over the past 24 years, the AErosol RObotic NETwork (AERONET) program has provided highly accurate remote-sensing characterization of aerosol optical and physical properties for an increasingly extensive geographic distribution including all continents and many oceanic island and coastal sites. The measurements and retrievals from the AERONET global network have addressed satellite and model validation needs very well, but there have been challenges in making comparisons to similar parameters from in situ surface and airborne measurements. Additionally, with improved spatial and temporal satellite remote sensing of aerosols, there is a need for higher spatial-resolution ground-based remote-sensing networks. An effort to address these needs resulted in a number of field campaign networks called Distributed Regional Aerosol Gridded Observation Networks (DRAGONs) that were designed to provide a database for in situ and remote-sensing comparison and analysis of local to mesoscale variability in aerosol properties. This paper describes the DRAGON deployments that will continue to contribute to the growing body of research related to meso- and microscale aerosol features and processes. The research presented in this special issue illustrates the diversity of topics that has resulted from the application of data from these networks.

  11. Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO.

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Merz, Bruno; Schröter, Kai

    2017-04-01

    Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study. © 2016 Society for Risk Analysis.
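
    The bagging mechanism behind such a probability distribution can be sketched in a few lines: each tree in the ensemble is fit on a bootstrap sample, and the spread of per-tree predictions for a new case yields an empirical loss distribution. The features, data, and model settings below are synthetic illustrations, not the calibrated BT-FLEMO.

```python
# Bagged regression trees producing a distribution of loss estimates.
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
# synthetic predictors: water depth (m), building area (m2), precaution score
X = np.column_stack([rng.uniform(0, 3, 500),
                     rng.uniform(50, 300, 500),
                     rng.integers(0, 5, 500)])
y = 5000 * X[:, 0] + 10 * X[:, 1] + rng.normal(0, 2000, 500)  # synthetic loss

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=100).fit(X, y)

x_new = np.array([[1.5, 120.0, 2]])
# one prediction per tree -> an empirical distribution of the estimated loss
losses = np.array([t.predict(x_new)[0] for t in model.estimators_])
print(f"median loss {np.median(losses):.0f}, 5-95% range "
      f"{np.percentile(losses, 5):.0f}-{np.percentile(losses, 95):.0f}")
```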

  12. An avenue of eddies: Quantifying the biophysical properties of mesoscale eddies in the Tasman Sea

    Science.gov (United States)

    Everett, J. D.; Baird, M. E.; Oke, P. R.; Suthers, I. M.

    2012-08-01

    The Tasman Sea is unique - characterised by a strong seasonal western boundary current that breaks down into a complicated field of mesoscale eddies almost immediately after separating from the coast. Through a 16-year analysis of Tasman Sea eddies, we identify a region along the southeast Australian coast, which we name ‘Eddy Avenue’, where eddies have higher sea level anomalies, faster rotation and greater sea surface temperature and chlorophyll a anomalies. The density of cyclonic and anticyclonic eddies within Eddy Avenue is 23% and 16% higher, respectively, than in the broader Tasman Sea. We find that Eddy Avenue cyclonic and anticyclonic eddies have more strongly differentiated biological properties than those of the broader Tasman Sea: larger anticyclonic eddies formed from Coral Sea water depress chlorophyll a concentrations, while coastal cyclonic eddies entrain nutrient-rich shelf waters. Cyclonic eddies within Eddy Avenue have almost double the chlorophyll a (0.35 mg m-3) of anticyclonic eddies (0.18 mg m-3). The average chlorophyll a concentration for cyclonic eddies is 16% higher in Eddy Avenue, and 28% lower for anticyclonic eddies, when compared to the Tasman Sea. With a strengthening East Australian Current, the propagation of these eddies will have significant implications for heat transport and for the entrainment and connectivity of plankton and larval fish populations.

  13. Impact of different parameterization schemes on simulation of mesoscale convective system over south-east India

    Science.gov (United States)

    Madhulatha, A.; Rajeevan, M.

    2018-02-01

    The main objective of the present paper is to examine the role of various parameterization schemes in simulating the evolution of a mesoscale convective system (MCS) that occurred over south-east India. Using the Weather Research and Forecasting (WRF) model, numerical experiments are conducted with various planetary boundary layer, microphysics, and cumulus parameterization schemes. The performance of the different schemes is evaluated by examining the boundary layer, reflectivity, and precipitation features of the MCS against ground-based and satellite observations. Among the various physical parameterization schemes, the Mellor-Yamada-Janjic (MYJ) boundary layer scheme is able to produce a deep boundary layer by simulating the warm temperatures necessary for storm initiation; the Thompson (THM) microphysics scheme is able to simulate the reflectivity through a reasonable distribution of the different hydrometeors during the various stages of the system; and the Betts-Miller-Janjic (BMJ) cumulus scheme is able to capture the precipitation through a proper representation of the convective instability associated with the MCS. The present analysis suggests that MYJ, a local turbulent-kinetic-energy boundary layer scheme that accounts for strong vertical mixing; THM, a six-class hybrid-moment microphysics scheme that considers number concentration along with the mixing ratio of rain hydrometeors; and BMJ, a closure cumulus scheme that adjusts thermodynamic profiles based on climatological profiles, might have contributed to the better performance of the respective model simulations. A numerical simulation carried out using the above combination of schemes captures storm initiation, propagation, surface variations, thermodynamic structure, and precipitation features reasonably well. This study clearly demonstrates that the simulation of MCS characteristics is highly sensitive to the choice of parameterization schemes.
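
    For readers who want to reproduce such sensitivity experiments, the scheme combination is normally selected through WRF's namelist.input. The sketch below patches the physics section with the f90nml package; the option indices (Thompson mp_physics=8, MYJ bl_pbl_physics=2, BMJ cu_physics=2, and the Eta surface layer conventionally paired with MYJ) follow the usual WRF convention but are assumptions that should be checked against the model version in use.

```python
# Patch WRF physics options in namelist.input using the f90nml package.
import f90nml

physics_patch = {
    "physics": {
        "mp_physics": 8,         # Thompson microphysics (assumed index)
        "bl_pbl_physics": 2,     # Mellor-Yamada-Janjic PBL (assumed index)
        "sf_sfclay_physics": 2,  # Eta similarity surface layer, paired with MYJ
        "cu_physics": 2,         # Betts-Miller-Janjic cumulus (assumed index)
    }
}

# read namelist.input, apply the changes, write namelist.patched
f90nml.patch("namelist.input", physics_patch, "namelist.patched")
```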

  14. Modelling study of mesoscale cyclogenesis over Ross Sea, Antarctica, on February 18, 1988

    Energy Technology Data Exchange (ETDEWEB)

    Stortini, M.; Morelli, S.; Marchesi, S. [Modena e Reggio Emilia Univ., Modena (Italy). Dipt. di Scienze dell' Ingegneria, Sez. Osservatorio Geofisico

    2000-04-01

    This paper examines the development of a summer event of mesoscale cyclogenesis off the coast of Victoria Land in the presence of katabatic winds, by means of numerical simulations. These refer to the period from 00 UTC 17 February to 00 UTC 19 February 1988 and were performed using the hydrostatic ETA (1993 version) limited-area model with a resolution of 55 km x 55 km x 17 levels. The ETA model reproduces the katabatic winds from Terra Nova Bay and a trough over the southwestern Ross Sea. A cyclonic vortex is simulated in the trough, even though it is weaker than the one present in the analysis initialized by the European Centre for Medium-Range Weather Forecasts (Reading, United Kingdom). Idealized simulations with varied surface conditions were also performed. In particular, an ice-covered ocean acts to weaken the atmospheric phenomena, while a no-mountain simulation emphasizes the influence of the orography and the cold winds from the coast of Victoria Land on the mesocyclonic activity.

  15. Mesoscale storm and dry period parameters from hourly precipitation data: program documentation

    Energy Technology Data Exchange (ETDEWEB)

    Thorp, J.M.

    1984-09-01

    Wet deposition of airborne chemical pollutants occurs primarily through precipitation. Precipitation rate, amount, duration, and location are important meteorological factors to be considered when attempting to understand the relationship of precipitation to pollutant deposition. The Pacific Northwest Laboratory (PNL) has conducted studies and experiments in numerous locations to collect data that can be incorporated into theories and models that attempt to describe the complex relationship between precipitation occurrence and chemical wet deposition. Model development often requires the use of average rather than random conditions as input. To provide mean values of storm parameters, the task, Climatological Analysis of Mesoscale Storms, was created as a facet of the Environmental Protection Agency's related-service project, Precipitation Scavenging Module Development. Within this task, computer programs have been developed at PNL which incorporate hourly precipitation data from National Weather Service stations to calculate mean values and frequency distributions of precipitation periods and of the interspersed dry periods. These programs have been written with a degree of flexibility that will allow user modification for applications to different, but similar, analyses. This report describes in detail the rationale and operation of the two computer programs which produce the tables of average and frequency distributions of storm and dry-period parameters from the precipitation data. A listing of the programs and examples of the generated output are included in the appendices. 3 references, 3 figures, 6 tables.
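
    The core segmentation such programs perform can be sketched briefly: scan the hourly record, collect the lengths of consecutive wet and dry runs, and summarize them. The threshold (any nonzero hourly total counts as wet) and the short series below are illustrative assumptions, not PNL's actual criteria.

```python
# Segment an hourly precipitation record into storm and dry periods.
import numpy as np

precip = np.array([0, 0, 0.2, 0.5, 0.1, 0, 0, 0, 0, 0.3,
                   0.4, 0, 0, 0, 0, 0, 1.2, 0.8, 0, 0], float)  # mm per hour

def run_lengths(mask):
    """Durations (in hours) of consecutive True runs in a boolean series."""
    runs, count = [], 0
    for wet in mask:
        if wet:
            count += 1
        elif count:
            runs.append(count)
            count = 0
    if count:
        runs.append(count)
    return np.array(runs)

storms = run_lengths(precip > 0)
dries = run_lengths(precip == 0)
print(f"{len(storms)} storms, mean duration {storms.mean():.1f} h; "
      f"mean dry period {dries.mean():.1f} h")
```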

  17. Tropical Atlantic Hurricanes, Easterly Waves, and West African Mesoscale Convective Systems

    Directory of Open Access Journals (Sweden)

    Yves K. Kouadio

    2010-01-01

    The relationship between tropical Atlantic hurricanes (Hs), atmospheric easterly waves (AEWs), and West African mesoscale convective systems (MCSs) is investigated, pointing out the atmospheric conditions over West Africa before hurricane formation. The analysis was performed for two periods, June–November in 2004 and 2005, during which 12 hurricanes (seven in 2004, five in 2005) were selected. Using the AEW signature in the 700 hPa vorticity, a backward trajectory was performed to the African coast, starting from the date and position of each hurricane when and where it was catalogued as a tropical depression. At this step, using the Meteosat-7 satellite dataset, we selected all the MCSs around this time and region and tracked them from their initiation until their dissipation. This procedure allowed us to relate each of the selected Hs with AEWs and a succession of MCSs that occurred a few times over West Africa before initiation of the hurricane. Finally, a dipole in sea surface temperature (SST) was observed, with a positive SST anomaly within the region of H generation and a negative SST anomaly within the Gulf of Guinea. This SST anomaly dipole could contribute to enhancing the continental convergence associated with the monsoon that impacts West African MCS formation.

  18. Upscale Impact of Mesoscale Disturbances of Tropical Convection on Convectively Coupled Kelvin Waves

    Science.gov (United States)

    Yang, Q.; Majda, A.

    2017-12-01

    Tropical convection associated with convectively coupled Kelvin waves (CCKWs) is typically organized by an eastward-moving synoptic-scale convective envelope with numerous embedded westward-moving mesoscale disturbances. It is of central importance to assess the upscale impact of mesoscale disturbances on CCKWs, as mesoscale disturbances propagate at various tilt angles and speeds. Here a simple multi-scale model is used to capture this multi-scale structure, where mesoscale fluctuations are directly driven by mesoscale heating and synoptic-scale circulation is forced by mean heating and eddy transfer of momentum and temperature. The two-dimensional version of the multi-scale model successfully reproduces key features of the flow fields, with a front-to-rear tilt, and compares well with results from a cloud-resolving model. In the scenario with an elevated upright mean heating, the tilted vertical structure of the synoptic-scale circulation is still induced by the upscale impact of mesoscale disturbances. In a faster propagation scenario, the upscale impact becomes less important, while the synoptic-scale circulation response to mean heating dominates. In the unrealistic scenario with upward/westward-tilted mesoscale heating, positive potential temperature anomalies are induced in the leading edge, which will suppress shallow convection in a moist environment. In the three-dimensional version, results show that the upscale impact of mesoscale disturbances propagating at tilt angles of 110°-250° induces negative lower-tropospheric potential temperature anomalies in the leading edge, providing favorable conditions for shallow convection in a moist environment, while the remaining tilt-angle cases have opposite effects. Even in the presence of upright mean heating, the front-to-rear-tilted synoptic-scale circulation can still be induced by the eddy terms at tilt angles of 120°-240°. In the case with fast propagating mesoscale heating, positive

  19. Tropical continental downdraft characteristics: mesoscale systems versus unorganized convection

    Science.gov (United States)

    Schiro, Kathleen A.; Neelin, J. David

    2018-02-01

    Downdrafts and cold pool characteristics for strong mesoscale convective systems (MCSs) and isolated, unorganized deep precipitating convection are analyzed using multi-instrument data from the DOE Atmospheric Radiation Measurement (ARM) GoAmazon2014/5 campaign. Increases in column water vapor (CWV) are observed leading convection, with higher CWV preceding MCSs than for isolated cells. For both MCSs and isolated cells, increases in wind speed, decreases in surface moisture and temperature, and increases in relative humidity occur coincidentally with system passages. Composites of vertical velocity data and radar reflectivity from a radar wind profiler show that the downdrafts associated with the sharpest decreases in surface equivalent potential temperature (θe) have a probability of occurrence that increases with decreasing height below the freezing level. Both MCSs and unorganized convection show similar mean downdraft magnitudes and probabilities with height. Mixing computations suggest that, on average, air originating at heights greater than 3 km must undergo substantial mixing, particularly in the case of isolated cells, to match the observed cold pool θe, implying a low typical origin level. Precipitation conditionally averaged on decreases in surface equivalent potential temperature (Δθe) exhibits a strong relationship because the most negative Δθe values are associated with a high probability of precipitation. The more physically motivated conditional average of Δθe on precipitation shows that decreases in θe level off with increasing precipitation rate, bounded by the maximum difference between surface θe and its minimum in the profile aloft. Robustness of these statistics observed across scales and regions suggests their potential use as model diagnostic tools for the improvement of downdraft parameterizations in climate models.
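
    The two conditional statistics described above reduce to a simple operation: bin one variable by the other and average within bins. The sketch below conditions Δθe on precipitation rate using synthetic stand-in data; the bin edges and the functional form used to mimic the observed level-off are illustrative assumptions.

```python
# Conditional average of surface theta-e drops binned by precipitation rate.
import numpy as np

rng = np.random.default_rng(1)
precip = rng.gamma(shape=0.5, scale=10.0, size=2000)                  # mm/h
dtheta_e = -8 * (1 - np.exp(-precip / 15)) + rng.normal(0, 1, 2000)   # K

bins = np.array([0, 1, 2, 5, 10, 20, 50])
idx = np.digitize(precip, bins)
for i in range(1, len(bins)):
    sel = idx == i
    if sel.any():
        print(f"{bins[i-1]:>4.0f}-{bins[i]:<4.0f} mm/h: "
              f"mean d(theta_e) = {dtheta_e[sel].mean():+.2f} K (n={sel.sum()})")
```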

  20. Observations of near-inertial kinetic energy inside mesoscale eddies.

    Science.gov (United States)

    Garcia Gomez, B. I.; Pallas Sanz, E.; Candela, J.

    2016-02-01

    Near-inertial oscillations (NIOs), generated by the wind stress on the surface mixed layer, are the inertia-gravity waves with the lowest frequency and the highest kinetic energy. NIOs are important because they drive vertical mixing in the interior ocean during wave-breaking events. Although the interaction between NIOs and mesoscale eddies has been reported by several authors, these studies are mostly analytical and numerical, and only a few observational studies have attempted to show the differences in near-inertial kinetic energy (KEi) between anticyclonic and cyclonic eddies. In this work the spatial structure of the KEi inside mesoscale eddies is computed using daily satellite altimetry and observations of horizontal velocity from 30 moorings equipped with acoustic Doppler current profilers in the western Gulf of Mexico. Consistent with theory, the obtained four-year KEi composites show twice as much KEi inside anticyclonic eddies as inside cyclonic ones. Vertical cross-sections of the KEi composites show that the KEi is mainly located near the surface and at the edge of cyclonic eddies (positive vorticity), whereas the KEi in anticyclonic eddies (negative vorticity) is maximum in the eddy's center and near the base of the eddy, where the NIOs become more inertial, are trapped, and are amplified. A relative maximum in the upper anticyclonic eddy is also observed. Cyclonic eddies present a maximum of KEi near the surface at 70 m, while the maximum of KEi in anticyclonic eddies occurs between 800 and 1000 m. The dependence of the distribution and magnitude of the KEi on eddy characteristics such as radius, vorticity, and amplitude is also shown.

  1. An Observational Study of the Mesoscale Mistral Dynamics

    Science.gov (United States)

    Guenard, Vincent; Drobinski, Philippe; Caccia, Jean-Luc; Campistron, Bernard; Bench, Bruno

    2005-05-01

    We investigate the mesoscale dynamics of the mistral through the wind profiler observations of the MAP (autumn 1999) and ESCOMPTE (summer 2001) field campaigns. We show that the mistral wind field can dramatically change on a time scale of less than 3 hours. Transitions from a deep to a shallow mistral are often observed at any season when the lower layers are stable. The variability, mainly attributed in summer to the mistral/land-sea breeze interactions on a 10-km scale, is highlighted by observations from the wind profiler network set up during ESCOMPTE. The interpretations of the dynamical mistral structure are performed through comparisons with existing basic theories. The linear theory of R. B. Smith [Advances in Geophysics, Vol. 31, 1989, Academic Press, 1-41] and the shallow water theory [Schär, C. and Smith, R. B.: 1993a, J. Atmos. Sci. 50, 1373-1400] give some complementary explanations for the deep-to-shallow transition, especially for the MAP mistral event. The wave breaking process induces a low-level jet (LLJ) downstream of the Alps that degenerates into a mountain wake, which in turn provokes the cessation of the mistral downstream of the Alps. Both theories indicate that the flow splits around the Alps and results in a persistent LLJ at the exit of the Rhône valley. The LLJ is strengthened by the channelling effect of the Rhône valley, which is more efficient for north-easterly than northerly upstream winds despite the north-south valley axis. Summer moderate and weak mistral episodes are influenced by land-sea breezes and convection over land, which induce a very complex interaction that cannot be accurately described by the previous theories.

  2. Comparison of methods for the identification of mesoscale wind speed fluctuations

    Directory of Open Access Journals (Sweden)

    Anna Rieke Mehrens

    2017-06-01

    Mesoscale wind speed fluctuations influence the characteristics of offshore wind energy. These recurring wind speed changes on time scales between tens of minutes and six hours lead to power output fluctuations. In order to investigate the meteorological conditions associated with mesoscale wind speed fluctuations, a measure is needed to detect these situations in wind speed time series. Previous studies used the empirical Hilbert-Huang Transform to determine the energy in the mesoscale frequency range, or calculated the standard deviation of a band-pass-filtered wind speed time series. The aim of this paper is to introduce newly developed empirical mesoscale fluctuation measures and to compare them with existing measures with regard to their sensitivity to recurring wind speed changes. One of the methods is based on the Hilbert-Huang Transform, two on the Fast Fourier Transform, and one on wind speed increments. It is found that, despite the varying complexity of the methods, all of them can identify days with highly variable mesoscale wind speeds equally well.
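
    The band-pass measure mentioned above is straightforward to sketch: filter the wind speed series to the ten-minute-to-six-hour band and take the standard deviation of the result as a fluctuation index. The sampling interval, filter order, and synthetic series below are illustrative assumptions.

```python
# Band-pass standard deviation as a mesoscale wind-fluctuation index.
import numpy as np
from scipy.signal import butter, filtfilt

dt = 60.0                                  # 1-min samples (s)
t = np.arange(0, 2 * 86400, dt)            # two days of data
wind = (8 + 2 * np.sin(2 * np.pi * t / (3 * 3600))   # 3-h mesoscale mode
        + 0.5 * np.random.default_rng(2).normal(size=t.size))

fs = 1.0 / dt                              # sampling frequency (Hz)
low, high = 1 / (6 * 3600), 1 / (10 * 60)  # pass band: 6 h .. 10 min
b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
meso = filtfilt(b, a, wind)                # zero-phase band-pass filter

print(f"mesoscale fluctuation index (std of band-passed series): {meso.std():.2f} m/s")
```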

  3. Toward the use of a mesoscale model at a very high resolution

    Energy Technology Data Exchange (ETDEWEB)

    Gasset, N.; Benoit, R.; Masson, C. [Canada Research Chair on Nordic Environment Aerodynamics of Wind Turbines, Ottawa, ON (Canada)

    2008-07-01

    This presentation described a new compressible mesoscale model designed to obtain wind speed data for potential wind power resource development. Microscale modelling and computational fluid dynamics (CFD) are used to study the mean properties of the surface layer of the atmospheric boundary layer (ABL). Mesoscale models study the temporal evolution of synoptic-scale to mesoscale atmospheric phenomena and support environmental modelling. Mesoscale modelling is essential for wind energy applications and large-scale resource evaluation, and can be compared with microscale models in order to validate input data and determine boundary conditions. The compressible community mesoscale model (MC2) comprised a numerical weather prediction (NWP) model with semi-implicit semi-Lagrangian (SISL) dynamics and compressible Euler equation solutions. Physical parameterizations included radiation, microphysics, thermal stratification, turbulence, and convection. The turbulence diffusion feature included unsteady Reynolds-averaged Navier-Stokes equations, transport equations for turbulent kinetic energy, and mixing lengths. Operating modes included 3-D weather data, and surface and ground properties, as well as one-way self-nesting abilities. The validation framework for the model included simulation of a set of realistic cases and theoretical cases including full dynamics and physics. Theoretical cases included manually imposed initial and boundary conditions and minimalist physics. Further research is being conducted to refine operating modes and boundary conditions. tabs., figs.

  4. A mesoscale chemical transport model (MEDIUM) nested in a global chemical transport model (MEDIANTE)

    Energy Technology Data Exchange (ETDEWEB)

    Claveau, J; Ramaroson, R [Office National d` Etudes et de Recherches Aerospatiales (ONERA), 92 - Chatillon (France)

    1998-12-31

    The lower stratosphere and upper troposphere (UT-LS) are frequently subject to mesoscale or local-scale exchange of air masses occurring along discontinuities. This exchange (e.g. downward) can constitute one of the most important sources of ozone from the stratosphere down to the middle troposphere, where strong mixing dilutes the air mass and competes with the non-linear chemistry. The distribution of chemical species in the troposphere and the lower stratosphere depends upon various source emissions, e.g. from the polluted boundary layer or from aircraft emissions. Global models, as well as chemical transport models, describe the climatological state of the atmosphere and are not able to describe stratosphere-troposphere exchange correctly. Mesoscale models go further in the description of smaller scales and can reasonably include rather detailed chemistry. They can be used to assess the budget of NO{sub x} from aircraft emissions in a mesoscale domain. (author) 4 refs.

  5. EMMA model: an advanced operational mesoscale air quality model for urban and regional environments

    International Nuclear Information System (INIS)

    Jose, R.S.; Rodriguez, M.A.; Cortes, E.; Gonzalez, R.M.

    1999-01-01

    Mesoscale air quality models are an important tool to forecast and analyse the air quality in regional and urban areas. In recent years an increased interest has been shown by decision makers in these types of software tools. The complexity of such models has grown exponentially with the increase of computer power; nowadays, medium workstations can successfully run operational versions of these modelling systems. This paper presents a complex mesoscale air quality model which has been installed in the Environmental Office of the Madrid community (Spain) in order to accurately forecast ozone, nitrogen dioxide and sulphur dioxide air concentrations in a 3D domain centred on Madrid city. It describes the challenging scientific matters to be solved in order to develop an operational version of the atmospheric mesoscale numerical pollution model for urban and regional areas (ANA). Some encouraging results have been achieved in the attempts to improve the accuracy of the predictions made by the version already installed. (Author)

  7. The influence of mesoscale porosity on cortical bone anisotropy. Investigations via asymptotic homogenization

    Science.gov (United States)

    Parnell, William J; Grimal, Quentin

    2008-01-01

    Recently, the mesoscale of cortical bone has been given particular attention in association with novel experimental techniques such as nanoindentation, micro-computed X-ray tomography and quantitative scanning acoustic microscopy (SAM). A need has emerged for reliable mathematical models to interpret the related microscopic and mesoscopic data in terms of effective elastic properties. In this work, a new model of cortical bone elasticity is developed and used to assess the influence of mesoscale porosity on the induced anisotropy of the material. Only the largest pores (Haversian canals and resorption cavities), characteristic of the mesoscale, are considered. The input parameters of the model are derived from typical mesoscale experimental data (e.g. SAM data). We use the method of asymptotic homogenization to determine the local effective elastic properties by modelling the propagation of low-frequency elastic waves through an idealized material that models the local mesostructure. We use a novel solution of the cell problem developed by Parnell & Abrahams. This solution is stable for the physiological range of variation of mesoscopic porosity and elasticity found in bone. Results are computed efficiently (in seconds) and the solutions can be implemented easily by other workers. Parametric studies are performed in order to assess the influence of mesoscopic porosity, the assumptions regarding the material inside the mesoscale pores (drained or undrained bone) and the shape of pores. Results are shown to be in good qualitative agreement with existing schemes and we describe the potential of the scheme for future use in modelling more complex microstructures for cortical bone. In particular, the scheme is shown to be a useful tool with which to predict the qualitative changes in anisotropy due to variations in the structure at the mesoscale. PMID:18628200
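
    The paper's asymptotic homogenization is beyond a short snippet, but the qualitative effect of mesoscale porosity on effective stiffness can be bracketed with elementary Voigt-Reuss bounds, a deliberately simpler stand-in technique rather than the scheme used above. The moduli and porosities below are rough assumed values.

```python
# Voigt (upper) and Reuss (lower) bounds on effective stiffness vs porosity.
E_matrix = 20.0   # GPa, assumed bone-matrix Young's modulus
E_pore = 1e-3     # GPa, near-zero stiffness for a drained pore (assumed)

for phi in (0.05, 0.10, 0.15):  # mesoscale porosity
    E_voigt = (1 - phi) * E_matrix + phi * E_pore          # upper bound
    E_reuss = 1.0 / ((1 - phi) / E_matrix + phi / E_pore)  # lower bound
    print(f"porosity {phi:.2f}: {E_reuss:.3f} GPa <= E_eff <= {E_voigt:.2f} GPa")
```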

  8. The instability characteristics of lean premixed hydrogen and syngas flames stabilized on meso-scale bluff-body

    KAUST Repository

    Kim, Yu Jeong

    2017-01-05

    Bluff-body flame stabilization is one of the main flame stabilization schemes used to improve combustion stability in both large- and small-scale premixed combustion systems. A detailed investigation of the instability characteristics is needed to understand the flame stabilization mechanism. Direct numerical simulations are conducted to investigate the flame dynamics underlying the instability of lean premixed hydrogen/air and syngas/air flames stabilized on a meso-scale bluff-body. A two-dimensional channel of 10 mm height and 10 mm length with a square bluff-body stabilizer of 0.5 mm is considered. The domain height is chosen to represent an unconfined condition and minimize the effect of the blockage ratio. Flame/flow dynamics are observed as the mean inflow velocity is increased, from a steady stable flame to an unsteady asymmetric instability, followed by blowoff. Detailed comparisons between hydrogen and syngas flames, with a time-scale analysis, are presented.

  9. Development and application of a chemistry mechanism for mesoscale simulations of the troposphere and lower stratosphere

    Energy Technology Data Exchange (ETDEWEB)

    Lippert, E.; Hendricks, J.; Petry, H. [Cologne Univ. (Germany). Inst. for Geophysics and Meteorology

    1997-12-31

    A new chemical mechanism is applied for mesoscale simulations of the impact of aircraft exhausts on the atmospheric composition. The temporal and spatial variation of the tropopause height is associated with a change in the trace gas composition at these heights. Box and three-dimensional mesoscale model studies show that the conversion of aircraft exhausts depends strongly on the cruise heights as well as on the location of release relative to the tropopause. The impact of aircraft emissions on ozone is strongly dependent on the individual meteorological situation; a rise of the tropopause height within a few days results in a strong increase of the ozone attributable to aircraft emissions. (author) 12 refs.

  11. A three-dimensional meso-scale modeling for helium bubble growth in metals

    International Nuclear Information System (INIS)

    Suzudo, T.; Kaburaki, H.; Wakai, E.

    2007-01-01

    A three-dimensional meso-scale computer model using a Monte-Carlo simulation method has been proposed to simulate helium bubble growth in metals. The primary merit of this model is that it enables visual comparison between microstructures observed by TEM imaging and those obtained by calculation. The model is simple enough that the calculation can easily be controlled by tuning parameters. The simulation results are confirmed against the ideal gas law and the capillary relation. (authors)
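
    A toy version of such a lattice Monte-Carlo step is sketched below: helium atoms random-walk on a periodic 3D lattice, and atoms that end up sharing a site are counted as a cluster ("bubble"). Lattice size, atom count, and step count are arbitrary illustrative parameters; no sticking rule, gas law, or capillary check is implemented here.

```python
# Toy 3D lattice random walk with co-location counted as clustering.
import numpy as np

rng = np.random.default_rng(3)
L, n_atoms, n_steps = 20, 200, 500
pos = rng.integers(0, L, size=(n_atoms, 3))  # initial atom positions
moves = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                  [0, -1, 0], [0, 0, 1], [0, 0, -1]])

for _ in range(n_steps):
    pos = (pos + moves[rng.integers(0, 6, n_atoms)]) % L  # periodic random walk

# atoms sharing a lattice site are counted as one cluster ("bubble")
_, counts = np.unique(pos, axis=0, return_counts=True)
print(f"{len(counts)} occupied sites; largest cluster holds {counts.max()} atoms")
```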

  12. WRF Mesoscale Pre-Run for the Wind Atlas of Mexico

    OpenAIRE

    Hahmann, Andrea N.; Pena Diaz, Alfredo; Hansen, Jens Carsten

    2016-01-01

    This report documents the work performed by DTU Wind Energy for the project “Atlas Eólico Mexicano”, the Wind Atlas of Mexico. It reports on the methods used in the “Pre-run” phase of the wind-mapping project for Mexico. The interim mesoscale modeling results were calculated from the output of simulations using the Weather Research and Forecasting (WRF) model. We document the method used to run the mesoscale simulations and to generalize the WRF model wind climatologies. A separate section...

  13. Development of extended WRF variational data assimilation system (WRFDA) for WRF non-hydrostatic mesoscale model

    Science.gov (United States)

    Pattanayak, Sujata; Mohanty, U. C.

    2018-06-01

    This paper presents the development of the extended weather research and forecasting data assimilation (WRFDA) system in the framework of the non-hydrostatic mesoscale model core of the weather research and forecasting system (WRF-NMM), as an imperative aspect of numerical modeling studies. Though the WRFDA originally provides improved initial conditions for the Advanced Research WRF, we have successfully developed a unified WRFDA utility that can be used by the WRF-NMM core as well. After critical evaluation, a code was developed to merge the WRFDA framework and WRF-NMM output. In this paper, we provide a few selected implementations and initial results through a single-observation test and background error statistics such as eigenvalues, eigenvectors, and length scales, which showcase the successful development of the extended WRFDA code for the WRF-NMM model. Furthermore, the extended WRFDA system is applied to the forecast of three severe cyclonic storms formed over the Bay of Bengal: Nargis (27 April-3 May 2008), Aila (23-26 May 2009) and Jal (4-8 November 2010). Model results are compared and contrasted within the analysis fields and later with high-resolution model forecasts. The mean initial position error is reduced by 33% with WRFDA as compared to the GFS analysis. The vector displacement errors in the track forecast are reduced by 33, 31, 30 and 20% for the 24-, 48-, 72- and 96-hr forecasts, respectively, in the data assimilation experiments as compared to the control run. The model diagnostics indicate successful implementation of WRFDA within the WRF-NMM system.

  14. Mesoscale atmospheric modelling technology as a tool for the long-term meteorological dataset development

    Science.gov (United States)

    Platonov, Vladimir; Kislov, Alexander; Rivin, Gdaly; Varentsov, Mikhail; Rozinkina, Inna; Nikitin, Mikhail; Chumakov, Mikhail

    2017-04-01

    Detailed hydrodynamic modelling of meteorological parameters over the last 30 years (1985-2014) was performed for the Okhotsk Sea and Sakhalin Island regions. The regional non-hydrostatic atmospheric model COSMO-CLM was used for this long-term simulation with 13.2, 6.6 and 2.2 km horizontal resolutions. The main objective of creating this dataset was to investigate the statistical characteristics and physical mechanisms of extreme weather events (primarily, wind speed extremes) on small spatio-temporal scales. COSMO-CLM is the climate version of the well-known mesoscale COSMO model, including some modifications and extensions adapting it to long-term numerical experiments. A downscaling technique was developed and applied for the long-term simulations with three consecutively nested domains. ERA-Interim reanalysis (0.75 degrees resolution) was used as global forcing data for the outermost domain (13.2 km horizontal resolution); these simulation data were then used as initial and boundary conditions for the next model run over the domain with 6.6 km resolution and, similarly, for the next step down to the 2.2 km domain. Besides, the COSMO-CLM model configuration for the 13.2 km run included the spectral nudging technique, i.e. an additional assimilation of reanalysis data not only at the boundaries but also inside the whole domain. Practically, this computational scheme was run on the SGI Altix 4700 supercomputer system in the Main Computer Center of Roshydromet and used 2,400 hours of CPU time in total. The obtained dataset was verified against observational data. Estimates showed a mean temperature error of -0.5 °C with an RMSE of up to 2-3 °C, and an overestimation of wind speed (RMSE up to 2 m/s). Overall, the analysis showed that the downscaling technique with the COSMO-CLM model reproduced the meteorological conditions, spatial distribution, and seasonal and synoptic variability of temperature and

  15. A three-dimensional ocean mesoscale simulation using data from the SEMAPHORE experiment: Mixed layer heat budget

    Science.gov (United States)

    Caniaux, Guy; Planton, Serge

    1998-10-01

    A primitive equation model is used to simulate the mesoscale circulation associated with a portion of the Azores Front investigated during the intensive observation period (IOP) of the Structure des Echanges Mer-Atmosphere, Proprietes des Heterogeneites Oceaniques: Recherche Experimentale (SEMAPHORE) experiment in fall 1993. The model is a mesoscale version of the ocean general circulation model (OGCM) developed at the Laboratoire d'Océanographie Dynamique et de Climatologie (LODYC) in Paris and includes open lateral boundaries, a 1.5-order turbulence closure scheme, and a fine mesh resolution (0.11° in latitude and 0.09° in longitude). The atmospheric forcing is provided by satellite data for the solar and infrared fluxes and by analyzed (or, for the wind, reanalyzed) atmospheric data from the European Centre for Medium-Range Weather Forecasts (ECMWF) forecast model. The extended data set collected during the IOP of SEMAPHORE enables a detailed initialization of the model, a coupling with the rest of the basin through time-dependent open boundaries, and a model/data comparison for validation. The analysis of model outputs indicates that most features are in good agreement with independent available observations. The surface front evolution is subject to an intense deformation, unlike that of the deep front system, which evolves only weakly. An estimate of the upper-layer heat budget is performed for the 22 days of the model integration. Each term of this budget is analyzed according to the various atmospheric events that occurred during the experiment, such as the passage of a strong storm. This facilitates extended estimates of mixed layer and relevant surface processes beyond those obtainable directly from observations. Surface fluxes represent 54% of the heat loss in the mixed layer and 70% in the top 100-m layer, while vertical transport at the mixed layer bottom accounts for 31% and three-dimensional processes account for 14%.

  16. Tool for Turbine Engine Closed-Loop Transient Analysis (TTECTrA) Users' Guide

    Science.gov (United States)

    Csank, Jeffrey T.; Zinnecker, Alicia M.

    2014-01-01

    The tool for turbine engine closed-loop transient analysis (TTECTrA) is a semi-automated control design tool for subsonic aircraft engine simulations. At a specific flight condition, TTECTrA produces a basic controller designed to meet user-defined goals and containing only the fundamental limiters that affect the transient performance of the engine. The purpose of this tool is to provide the user a preliminary estimate of the transient performance of an engine model without the need to design a full nonlinear controller.

  17. Scaling of mesoscale simulations of polymer melts with the bare friction coefficient

    NARCIS (Netherlands)

    Kindt, P.; Kindt, P.; Briels, Willem J.

    2005-01-01

    Both the Rouse and reptation models predict that the dynamics of a polymer melt scale inversely with the Langevin friction coefficient ξ. Mesoscale Brownian dynamics simulations of polyethylene validate these scaling predictions, providing the reptational friction ξ_R = ξ + ξ_C is

  18. The diffusion of radioactive gases in the meso-scale (20 km-400 km)

    International Nuclear Information System (INIS)

    Wippermann, F.

    1974-01-01

    The term ''mesoscale'' refers to distances between 20 km and 400 km from the source; in defining this range, the structure of atmospheric turbulence is taken into account. To arrive at an evaluation of diffusion in the mesoscale, quantitative methods from the microscale (source distance < 20 km) are extrapolated into the mesoscale. In the first case, a table is given from which to read off the minimum factor by which the concentration is reduced in the mesoscale as the source distance increases. To obtain the diffusion for the worst possible case, the existence of a mixing-layer topped by a temperature inversion was assumed. For this it was essential, first of all, to determine the source distance xsub(D) beyond which the diffusing gases are completely mixed within the mixing-layer of thickness D. To make allowance for all possible thicknesses of this mixing-layer, a measurement carried out at ground level only 10 km from the source can be used to calculate the correct concentrations in the mixing-layer; the dilution factors are then related to this value. Possible ways of better incorporating certain factors in the diffusion estimate, such as the topography of the earth's surface, the roughness of the terrain, the vertical profiles of wind and exchange coefficients, and the effects of instability, are given in the last section.
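
    For intuition, once a plume is fully mixed through a capped layer of depth D, the ground-level concentration follows the textbook well-mixed expression C(x) = Q / (sqrt(2π) σy(x) u D). The sketch below evaluates it across the mesoscale range with a Briggs-type lateral-spread fit; all parameter values are illustrative assumptions, not the report's tabulated dilution factors.

```python
# Ground-level concentration of a plume fully mixed through a capped layer.
import numpy as np

Q = 1.0      # source strength (arbitrary units/s)
u = 5.0      # mean wind speed (m/s)
D = 1000.0   # mixing-layer depth (m)

def sigma_y(x_m):
    # Briggs-type neutral-stability lateral spread (assumed fit), in meters
    return 0.08 * x_m / np.sqrt(1 + 1.0e-4 * x_m)

for x_km in (20, 50, 100, 200, 400):
    x = 1000.0 * x_km
    c = Q / (np.sqrt(2 * np.pi) * sigma_y(x) * u * D)
    print(f"x = {x_km:>3d} km: relative concentration {c:.3e}")
```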

  19. Three-dimensional Mesoscale Simulations of Detonation Initiation in Energetic Materials with Density-based Kinetics

    Science.gov (United States)

    Jackson, Thomas; Jost, A. M.; Zhang, Ju; Sridharan, P.; Amadio, G.

    2017-06-01

    In this work we present three-dimensional mesoscale simulations of detonation initiation in energetic materials. We solve the reactive Euler equations, with the energy equation augmented by a power deposition term. The reaction rate at the mesoscale is modelled using a density-based kinetics scheme, adapted from standard Ignition and Growth models. The deposition term is based on previous results of simulations of pore collapse at the microscale, modelled at the mesoscale as hot-spots. We carry out three-dimensional mesoscale simulations of random packs of HMX crystals in a binder, and show that the transition between no-detonation and detonation depends on the number density of the hot-spots, the initial radius of the hot-spots, the post-shock pressure of an imposed shock, and the amplitude of the power deposition term. The experimentally observed trend of transition at lower imposed-shock pressures for larger pore number densities is reproduced. Initial attempts to improve the agreement between the simulations and experiments through calibration of various parameters will also be presented.
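
    Schematically, the augmented energy equation takes a form like the following (generic notation assumed for illustration, not taken from the abstract):

        \frac{\partial (\rho E)}{\partial t} + \nabla \cdot \left[ (\rho E + p)\,\mathbf{u} \right] = \rho q \dot{\lambda} + S_{dep}(\mathbf{x}, t)

    where \lambda is the reaction progress variable advanced by the density-based rate \dot{\lambda}, q the heat of reaction, and S_dep the power deposition term, nonzero only inside the prescribed hot-spot regions.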

  20. A shallow convection parameterization for the non-hydrostatic MM5 mesoscale model

    Energy Technology Data Exchange (ETDEWEB)

    Seaman, N.L.; Kain, J.S.; Deng, A. [Pennsylvania State Univ., University Park, PA (United States)

    1996-04-01

    A shallow convection parameterization suitable for the Pennsylvania State University (PSU)/National Center for Atmospheric Research (NCAR) nonhydrostatic mesoscale model (MM5) is being developed at PSU. The parameterization is based on parcel perturbation theory developed in conjunction with a 1-D Mellor-Yamada 1.5-order planetary boundary layer scheme and the Kain-Fritsch deep convection model.

  1. DeepEddy: A simple deep architecture for mesoscale oceanic eddy detection in SAR images

    NARCIS (Netherlands)

    Huang, Dongmei; Du, Yanling; He, Qi; Song, Wei; Liotta, Antonio

    2017-01-01

    Automatic detection of mesoscale oceanic eddies is in great demand to monitor their dynamics, which play a significant role in ocean current circulation and marine climate change. Traditional methods of eddy detection using remotely sensed data are usually based on physical parameters, geometrics,
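
    The record is truncated before any architecture details, so the sketch below is only a generic SAR-patch classifier in PyTorch that illustrates the kind of supervised eddy/no-eddy detection task described; it is not the DeepEddy network, and all layer sizes are hypothetical.

        import torch
        import torch.nn as nn

        # Generic two-class patch classifier for SAR image chips (eddy / no eddy).
        # NOT the DeepEddy architecture; a minimal stand-in for the task setup.
        model = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 2),   # assumes 64x64 single-channel patches
        )

        x = torch.randn(8, 1, 64, 64)      # batch of 8 SAR patches
        logits = model(x)                  # shape: (8, 2)
        print(logits.shape)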

  2. Mesoscale Iron Enrichment Experiments 1993–2005: Synthesis and Future Directions

    NARCIS (Netherlands)

    Boyd, P.W.; Jickells, T.; Law, C.S.; Blain, S.; Boyle, E.A.; Buesseler, K.O.; Coale, K.H.; Cullen, J.J.; Baar, H.J.W. de; Follows, M.; Harvey, M.; Lancelot, C.; Levasseur, M.; Owens, N.P.J.; Pollard, R.; Rivkin, R.B.; Sarmiento, J.; Schoemann, V.; Smetacek, V.; Takeda, S.; Tsuda, A.; Turner, S.; Watson, A.J.; Jickells, S.

    2007-01-01

    Since the mid-1980s, our understanding of nutrient limitation of oceanic primary production has radically changed. Mesoscale iron addition experiments (FeAXs) have unequivocally shown that iron supply limits production in one-third of the world ocean, where surface macronutrient concentrations are

  3. Phase Behavior of Semiflexible-Flexible Diblock Copolymer Melt: Insight from Mesoscale Modeling.

    Czech Academy of Sciences Publication Activity Database

    Beránek, P.; Posel, Zbyšek

    2016-01-01

    Vol. 16, No. 8 (2016), pp. 7832-7835. ISSN 1533-4880. R&D Projects: GA MŠk(CZ) LH12020. Institutional support: RVO:67985858. Keywords: conformational asymmetry; dissipative particle dynamics; mesoscale modeling. Subject RIV: CF - Physical; Theoretical Chemistry. Impact factor: 1.483, year: 2016