WorldWideScience

Sample records for semiautomated reproducible batch

  1. Systematic Methodology for Reproducible Optimizing Batch Operation

    DEFF Research Database (Denmark)

    Bonné, Dennis; Jørgensen, Sten Bay

    2006-01-01

    This contribution presents a systematic methodology for rapid acquisition of discrete-time state space model representations of batch processes based on their historical operation data. These state space models are parsimoniously parameterized as a set of local, interdependent models. The present...

  2. Semi-automated scoring of pulmonary emphysema from X-ray CT: Trainee reproducibility and accuracy

    Energy Technology Data Exchange (ETDEWEB)

    Owrangi, Amir M., E-mail: aowrangi@robarts.ca [Imaging Research Laboratories, Robarts Research Institute, London (Canada); Entwistle, Brandon, E-mail: Brandon.Entwistle@londonhospitals.ca; Lu, Andrew, E-mail: Andrew.Lu@londonhospitals.ca; Chiu, Jack, E-mail: Jack.Chiu@londonhospitals.ca; Hussain, Nabil, E-mail: Nabil.Hussain@londonhospitals.ca; Etemad-Rezai, Roya, E-mail: Roya.EtemadRezai@lhsc.on.ca; Parraga, Grace, E-mail: gparraga@robarts.ca [Imaging Research Laboratories, Robarts Research Institute, London (Canada); Graduate Program in Biomedical Engineering, Department of Medical Imaging, Department of Medical Biophysics, The University of Western Ontario, London (Canada)

    2013-11-01

    Objective: We developed a semi-automated tool to quantify emphysema from thoracic X-ray multi-detector (64-slice) computed tomography (CT) for training purposes and multi-reader studies. Materials and Methods: Thoracic X-ray CT was acquired in 93 ex-smokers, who were evaluated by six trainees with little or no expertise (trainees) and a single experienced thoracic radiologist (expert). A graphical user interface (GUI) was developed for emphysema quantification based on the percentage of lung affected, where a score of 0 = no abnormalities, 1 = 1–25%, 2 = 26–50%, 3 = 51–75% and 4 = 76–100% for each lung side/slice. Trainees blinded to subject characteristics scored randomized images twice; accuracy was determined by comparison to expert scores, the density histogram 15th percentile (HU15), relative area at −950 HU (RA950), low attenuation clusters at −950 HU (LAC950) and −856 HU (LAC856), and the diffusing capacity for carbon monoxide (DLCO%pred). Intra- and inter-observer reproducibility was evaluated using coefficients of variation (COV), intra-class correlation coefficients (ICC) and Pearson correlations. Results: Trainee–expert correlations were significant (r = 0.85–0.97, p < 0.0001) and a significant trainee bias (0.15 ± 0.22) was observed. Emphysema score was correlated with RA950 (r = 0.88, p < 0.0001), HU15 (r = −0.77, p < 0.0001), LAC950 (r = 0.76, p < 0.0001), LAC856 (r = 0.74, p = 0.0001) and DLCO%pred (r = −0.71, p < 0.0001). Intra-observer reproducibility (COV = 4–27%; ICC = 0.75–0.94) was moderate to high for trainees; intra- and inter-observer COV were negatively and non-linearly correlated with emphysema score. Conclusion: We developed a GUI for rapid and interactive emphysema scoring that allows for comparison of multiple readers with clinical and radiological standards.
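
    The 0–4 scale described above is a simple banding of the percentage of lung affected per side/slice. A minimal sketch of that mapping (hypothetical function name; the actual tool is a GUI, not this code):

    ```python
    def emphysema_score(percent_affected: float) -> int:
        """Band a per-side/slice percentage of affected lung into the 0-4 scale
        described above: 0 = none, 1 = 1-25%, 2 = 26-50%, 3 = 51-75%, 4 = 76-100%."""
        if not 0 <= percent_affected <= 100:
            raise ValueError("percentage must lie in [0, 100]")
        if percent_affected == 0:
            return 0
        for score, upper in ((1, 25), (2, 50), (3, 75), (4, 100)):
            if percent_affected <= upper:
                return score
    ```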

  3. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    Energy Technology Data Exchange (ETDEWEB)

    Lee, M; Woo, B; Kim, J [Seoul National University, Seoul (Korea, Republic of); Jamshidi, N; Kuo, M [UCLA School of Medicine, Los Angeles, CA (United States)

    2015-06-15

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from The Cancer Imaging Archive. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used by two independent observers to segment contrast enhancement, necrosis and edema regions. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. Inter-observer variability and the coefficient of variation (COV) were calculated to evaluate reproducibility. Results: Inter-observer correlations and COVs of imaging features ranged from 0.953 to 0.999 and 2.1% to 9.2% with the deformable model, and from 0.799 to 0.976 and 3.5% to 26.6% with the grow cut method, respectively. COVs for features previously reported as predictive of patient survival were: 3.4% with the deformable model and 7.4% with the grow cut method for the proportion of contrast-enhanced tumor region; 5.5% and 25.7% for the proportion of necrosis; and 2.1% and 4.4% for edge sharpness of tumor on CE-T1WI. Conclusion: Comparison of the two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric brain MRI.
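
    The record does not state the exact COV formula; one common convention for two observers is the per-case standard deviation over the per-case mean, averaged across cases. A minimal sketch under that assumption:

    ```python
    import numpy as np

    def interobserver_cov(obs1: np.ndarray, obs2: np.ndarray) -> float:
        """Mean coefficient of variation (%) between two observers' values of one
        imaging feature: per-case SD divided by per-case mean, averaged over cases.
        (One common convention; the abstract does not give the exact formula.)"""
        pairs = np.stack([obs1, obs2])                       # shape (2, n_cases)
        cov_per_case = pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)
        return float(100.0 * cov_per_case.mean())
    ```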

  4. SU-E-J-252: Reproducibility of Radiogenomic Image Features: Comparison of Two Semi-Automated Segmentation Methods

    International Nuclear Information System (INIS)

    Lee, M; Woo, B; Kim, J; Jamshidi, N; Kuo, M

    2015-01-01

    Purpose: Objective and reliable quantification of imaging phenotype is an essential part of radiogenomic studies. We compared the reproducibility of two semi-automatic segmentation methods for quantitative image phenotyping in magnetic resonance imaging (MRI) of glioblastoma multiforme (GBM). Methods: MRI examinations with T1 post-gadolinium and FLAIR sequences of 10 GBM patients were downloaded from The Cancer Imaging Archive. Two semi-automatic segmentation tools with different algorithms (deformable model and grow cut method) were used by two independent observers to segment contrast enhancement, necrosis and edema regions. A total of 21 imaging features consisting of area and edge groups were extracted automatically from the segmented tumor. Inter-observer variability and the coefficient of variation (COV) were calculated to evaluate reproducibility. Results: Inter-observer correlations and COVs of imaging features ranged from 0.953 to 0.999 and 2.1% to 9.2% with the deformable model, and from 0.799 to 0.976 and 3.5% to 26.6% with the grow cut method, respectively. COVs for features previously reported as predictive of patient survival were: 3.4% with the deformable model and 7.4% with the grow cut method for the proportion of contrast-enhanced tumor region; 5.5% and 25.7% for the proportion of necrosis; and 2.1% and 4.4% for edge sharpness of tumor on CE-T1WI. Conclusion: Comparison of the two semi-automated tumor segmentation techniques shows reliable image feature extraction for radiogenomic analysis of GBM patients with multiparametric brain MRI.

  5. Impact of contrast injection and stent-graft implantation on reproducibility of volume measurements in semiautomated segmentation of abdominal aortic aneurysm on computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Morin-Roy, Florence; Hadjadj, Sofiane; Thomas, Olivier; Yang, Dan Yang [Centre Hospitalier Universitaire de Montreal (CHUM), Hopital Notre-Dame, Department of Radiology, Montreal, Quebec (Canada); Kauffmann, Claude [University of Montreal, Centre de Recherche, Centre Hospitalier Universitaire de Montreal (CRCHUM), Montreal, Quebec (Canada); Tang, An [University of Montreal, Centre de Recherche, Centre Hospitalier Universitaire de Montreal (CRCHUM), Montreal, Quebec (Canada); Centre Hospitalier Universitaire de Montreal (CHUM), Hopital Saint-Luc, Department of Radiology, Montreal, Quebec (Canada); Piche, Nicolas [Object Research System, Montreal, Quebec (Canada); Elkouri, Stephane [Centre Hospitalier Universitaire de Montreal (CHUM), Hopital Hotel-Dieu, Department of Vascular surgery, Montreal, Quebec (Canada); Therasse, Eric [University of Montreal, Centre de Recherche, Centre Hospitalier Universitaire de Montreal (CRCHUM), Montreal, Quebec (Canada); Centre Hospitalier Universitaire de Montreal (CHUM), Hopital Hotel-Dieu, Department of Radiology, Montreal, Quebec (Canada); Soulez, Gilles [Centre Hospitalier Universitaire de Montreal (CHUM), Hopital Notre-Dame, Department of Radiology, Montreal, Quebec (Canada); University of Montreal, Centre de Recherche, Centre Hospitalier Universitaire de Montreal (CRCHUM), Montreal, Quebec (Canada)

    2014-07-15

    To assess the impact of contrast injection and stent-graft implantation on feasibility, accuracy, and reproducibility of abdominal aortic aneurysm (AAA) volume and maximal diameter (D-max) measurements using segmentation software. CT images of 80 subjects presenting AAA were divided into four equal groups: with or without contrast enhancement, and with or without stent-graft implantation. Semiautomated software was used to segment the aortic wall, once by an expert and twice by three readers. Volume and D-max reproducibility was estimated by intraclass correlation coefficients (ICC), and accuracy was estimated between the expert and the readers by mean relative errors. All segmentations were technically successful. The mean AAA volume was 167.0 ± 82.8 mL and the mean D-max 55.0 ± 10.6 mm. Inter- and intraobserver ICCs for volume and D-max measurements were greater than 0.99. Mean relative errors between readers varied between -1.8 ± 4.6 and 0.0 ± 3.6 mL. Mean relative errors in volume and D-max measurements between readers showed no significant difference between the four groups (P ≥ 0.2). The feasibility, accuracy, and reproducibility of AAA volume and D-max measurements using segmentation software were not affected by the absence of contrast injection or the presence of stent-graft. (orig.)

  6. Accuracy and reproducibility of aortic annular measurements obtained from echocardiographic 3D manual and semi-automated software analyses in patients referred for transcatheter aortic valve implantation: implication for prosthesis size selection.

    Science.gov (United States)

    Stella, Stefano; Italia, Leonardo; Geremia, Giulia; Rosa, Isabella; Ancona, Francesco; Marini, Claudia; Capogrosso, Cristina; Giglio, Manuela; Montorfano, Matteo; Latib, Azeem; Margonato, Alberto; Colombo, Antonio; Agricola, Eustachio

    2018-02-06

    A 3D transoesophageal echocardiography (3D-TOE) reconstruction tool has recently been introduced. The system automatically configures a geometric model of the aortic root and performs quantitative analysis of these structures. We compared measurements of the aortic annulus (AA) obtained by semi-automated 3D-TOE quantitative software and by manual analysis against multislice computed tomography (MSCT) measurements. One hundred and seventy-five patients (mean age 81.3 ± 6.3 years, 77 men) who underwent both MSCT and 3D-TOE for annulus assessment before transcatheter aortic valve implantation were analysed. Hypothetical prosthetic valve sizing was evaluated for the 3D manual and semi-automated measurements, using the manufacturer-recommended CT-based sizing algorithm as the gold standard. Good correlation between both 3D-TOE methods and MSCT measurements was found, but the semi-automated analysis demonstrated slightly better correlations for AA major diameter (r = 0.89), perimeter (r = 0.89), and area (r = 0.85) (all P values significant). Both 3D methods underestimated the MSCT measurements, but semi-automated measurements showed narrower limits of agreement and less bias than manual measurements for most AA parameters. On average, the 3D-TOE semi-automated major diameter, area, and perimeter underestimated the respective MSCT measurements by 7.4%, 3.5%, and 4.4%, respectively, whereas the minor diameter was overestimated by 0.3%. Moderate agreement for valve sizing was found for both 3D-TOE techniques: kappa 0.5 for both semi-automated and manual analysis. Interobserver and intraobserver agreement for the AA measurements was excellent for both techniques (intraclass correlation coefficients for all parameters >0.80). The 3D-TOE semi-automated analysis of AA is feasible and reliable and can be used in clinical practice as an alternative to MSCT for AA assessment. Published on behalf of the European Society of Cardiology. © The Author(s) 2018.

  7. Validating New Software for Semiautomated Liver Volumetry: Better than Manual Measurement?

    Science.gov (United States)

    Noschinski, L E; Maiwald, B; Voigt, P; Wiltberger, G; Kahn, T; Stumpp, P

    2015-09-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of liver volume with manual assistance for definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation to water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience. Both tested types of software allow exact volumetry of resected liver parts. Preoperative prediction can be performed more accurately with the semiautomated software. The semiautomated software is nearly four times faster than the manual method.
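
    The headline accuracy figures (33% vs. 57%) are percentage errors of the predicted resection volume against the water-displacement reference. A one-line sketch with a hypothetical helper name and illustrative numbers:

    ```python
    def volume_percent_error(predicted_ml: float, displaced_ml: float) -> float:
        """Percentage deviation of a predicted resection volume from the
        water-displacement reference volume (illustrative helper)."""
        return abs(predicted_ml - displaced_ml) / displaced_ml * 100.0

    # e.g. predicting 650 mL for a part that displaces 500 mL gives a 30% error
    print(volume_percent_error(650.0, 500.0))
    ```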

  8. Semi-automated retinal vessel analysis in nonmydriatic fundus photography.

    Science.gov (United States)

    Schuster, Alexander Karl-Georg; Fischer, Joachim Ernst; Vossmerbaeumer, Urs

    2014-02-01

    Funduscopic assessment of the retinal vessels may be used to assess the health status of microcirculation and as a component in the evaluation of cardiovascular risk factors. Typically, the evaluation is restricted to morphological appreciation without strict quantification. Our purpose was to develop and validate a software tool for semi-automated quantitative analysis of retinal vasculature in nonmydriatic fundus photography. MATLAB software was used to develop a semi-automated image recognition and analysis tool for the determination of the arterial-venous (A/V) ratio in the central vessel equivalent on 45° digital fundus photographs. Validity and reproducibility of the results were ascertained using nonmydriatic photographs of 50 eyes from 25 subjects recorded with a 3D OCT device (Topcon Corp.). Two hundred and thirty-three eyes of 121 healthy subjects were evaluated to define normative values. A software tool was developed using image thresholds for vessel recognition and vessel width calculation in a semi-automated three-step procedure: vessel recognition on the photograph and artery/vein designation, width measurement, and calculation of central retinal vessel equivalents. The mean vessel recognition rate was 78%, the vessel class designation rate 75%, and reproducibility between 0.78 and 0.91. Mean A/V ratio was 0.84. Application to a healthy normative cohort showed high congruence with previously published manual methods. Processing time per image was one minute. Quantitative geometrical assessment of the retinal vasculature may be performed in a semi-automated manner using dedicated software tools. Yielding reproducible numerical data within a short time, this may add value to mere morphological estimates in the clinical evaluation of fundus photographs. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
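
    The record does not say how the central vessel equivalents are combined into the A/V ratio. A common choice in the literature is Knudtson's revised formulas (iteratively pair widest with narrowest vessel, combine with constants 0.88 for arterioles and 0.95 for venules); the sketch below assumes that procedure and may differ from the tool's actual formula:

    ```python
    import math

    def central_equivalent(widths, k):
        """Iteratively pair the widest with the narrowest vessel and combine the
        pair as w = k * sqrt(w1**2 + w2**2) until one value remains; with an odd
        count the median is carried over unchanged (Knudtson's procedure)."""
        w = sorted(widths)
        while len(w) > 1:
            mid = w.pop(len(w) // 2) if len(w) % 2 else None
            half = len(w) // 2
            w = sorted(k * math.sqrt(lo ** 2 + hi ** 2)
                       for lo, hi in zip(w[:half], reversed(w[half:])))
            if mid is not None:
                w = sorted(w + [mid])
        return w[0]

    def av_ratio(artery_widths, vein_widths):
        """A/V ratio as CRAE / CRVE (k = 0.88 arterioles, 0.95 venules)."""
        return (central_equivalent(artery_widths, 0.88) /
                central_equivalent(vein_widths, 0.95))
    ```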

  9. Validating new software for semiautomated liver volumetry. Better than manual measurement?

    Energy Technology Data Exchange (ETDEWEB)

    Noschinski, L.E.; Maiwald, B.; Voigt, P.; Kahn, T.; Stumpp, P. [University Hospital Leipzig (Germany). Dept. of Diagnostic and Interventional Radiology; Wiltberger, G. [University Hospital Leipzig (Germany). Dept. of Visceral, Transplantation, Thoracic and Vascular Surgery

    2015-09-15

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of liver volume with manual assistance for definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation to water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience.

  10. Validating new software for semiautomated liver volumetry. Better than manual measurement?

    International Nuclear Information System (INIS)

    Noschinski, L.E.; Maiwald, B.; Voigt, P.; Kahn, T.; Stumpp, P.; Wiltberger, G.

    2015-01-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of liver volume with manual assistance for definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation to water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results and is less dependent on the user's experience.

  11. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined
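
    The pattern advocated here, the same analysis either driven interactively or replayed from explicitly saved parameters, is easy to sketch in outline. The following is an illustrative stand-in for that workflow, not PuffinPlot's actual scripting API:

    ```python
    #!/usr/bin/env python3
    """Replay an analysis non-interactively from a saved parameter file, so the
    figures and tables for a paper can be regenerated from raw data alone
    (illustrative pattern only; PuffinPlot exposes its own batch interface)."""
    import json
    import sys
    from pathlib import Path

    def run(config_path: str) -> None:
        cfg = json.loads(Path(config_path).read_text())  # parameters live in a file,
        values = [float(v) for v in                      # not in undocumented GUI state
                  Path(cfg["input"]).read_text().split()]
        mean = sum(values) / len(values)                 # stand-in for the real analysis step
        Path(cfg["output"]).write_text(f"n={len(values)} mean={mean:.4f}\n")

    if __name__ == "__main__":
        run(sys.argv[1])                                 # e.g. python replay.py analysis.json
    ```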

  12. Semi-automated digital image analysis of patellofemoral joint space width from lateral knee radiographs

    Energy Technology Data Exchange (ETDEWEB)

    Grochowski, S.J. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Amrami, K.K. [Mayo Clinic, Department of Radiology, Rochester (United States); Kaufman, K. [Mayo Clinic, Department of Orthopedic Surgery, Rochester (United States); Mayo Clinic/Foundation, Biomechanics Laboratory, Department of Orthopedic Surgery, Charlton North L-110L, Rochester (United States)

    2005-10-01

    To design a semi-automated program to measure minimum patellofemoral joint space width (JSW) using standing lateral view radiographs. Lateral patellofemoral knee radiographs were obtained from 35 asymptomatic subjects. The radiographs were analyzed to report both the repeatability of the image analysis program and the reproducibility of JSW measurements within a 2-week period. The results were also compared with manual measurements done by an experienced musculoskeletal radiologist. The image analysis program was shown to have an excellent coefficient of repeatability of 0.18 and 0.23 mm for intra- and inter-observer measurements, respectively. The manual method measured a greater minimum JSW than the automated method. Reproducibility between days was comparable to other published results, but was less satisfactory for both manual and semi-automated measurements. The image analysis program had an inter-day coefficient of repeatability of 1.24 mm, which was lower than the 1.66 mm of the manual method. A repeatable semi-automated method for measurement of the patellofemoral JSW from radiographs has been developed. The method is more accurate than manual measurements. However, between-day variability is higher than intra-day variability. Further investigation of the protocol for obtaining sequential lateral knee radiographs is needed in order to reduce the between-day variability. (orig.)

  13. COMPARISON OF MANUAL AND SEMIAUTOMATED FUNDUS AUTOFLUORESCENCE ANALYSIS OF MACULAR ATROPHY IN STARGARDT DISEASE PHENOTYPE.

    Science.gov (United States)

    Kuehlewein, Laura; Hariri, Amir H; Ho, Alexander; Dustin, Laurie; Wolfson, Yulia; Strauss, Rupert W; Scholl, Hendrik P N; Sadda, SriniVas R

    2016-06-01

    To evaluate manual and semiautomated grading techniques for assessing decreased fundus autofluorescence (DAF) in patients with Stargardt disease phenotype. Certified reading center graders performed manual and semiautomated (region finder-based) grading of confocal scanning laser ophthalmoscopy (cSLO) fundus autofluorescence (FAF) images for 41 eyes of 22 patients. Lesion types were defined based on the black level and sharpness of the border: definite decreased autofluorescence (DDAF), and well and poorly demarcated questionably decreased autofluorescence (WDQDAF, PDQDAF). Agreement between the two grading methods and inter- and intra-grader agreement were assessed by kappa coefficients (κ) and intraclass correlation coefficients (ICC). The mean ± standard deviation (SD) area was 3.07 ± 3.02 mm² for DDAF (n = 31), 1.53 ± 1.52 mm² for WDQDAF (n = 9), and 6.94 ± 10.06 mm² for PDQDAF (n = 17). The mean ± SD absolute difference in area between manual and semiautomated grading was 0.26 ± 0.28 mm² for DDAF, 0.20 ± 0.26 mm² for WDQDAF, and 4.05 ± 8.32 mm² for PDQDAF. The ICC (95% confidence interval) for method comparison was 0.992 (0.984-0.996) for DDAF, 0.976 (0.922-0.993) for WDQDAF, and 0.648 (0.306-0.842) for PDQDAF. Inter- and intra-grader agreement in manual and semiautomated quantitative grading was better for DDAF (0.981-0.996) and WDQDAF (0.995-0.999) than for PDQDAF (0.715-0.993). Manual and semiautomated grading methods showed similar levels of reproducibility for assessing areas of decreased autofluorescence in patients with Stargardt disease phenotype. Excellent agreement and reproducibility were observed for well demarcated lesions.

  14. Semi-Automated Quantification of Finger Joint Space Narrowing Using Tomosynthesis in Patients with Rheumatoid Arthritis.

    Science.gov (United States)

    Ichikawa, Shota; Kamishima, Tamotsu; Sutherland, Kenneth; Kasahara, Hideki; Shimizu, Yuka; Fujimori, Motoshi; Yasojima, Nobutoshi; Ono, Yohei; Kaneda, Takahiko; Koike, Takao

    2017-06-01

    The purpose of the study is to validate the semi-automated method using tomosynthesis images for the assessment of finger joint space narrowing (JSN) in patients with rheumatoid arthritis (RA), using the semi-quantitative scoring method as the reference standard. Twenty patients (14 females and 6 males) with RA were included in this retrospective study. All patients underwent radiography and tomosynthesis of the bilateral hand and wrist. Two rheumatologists and a radiologist independently scored JSN with the two modalities according to the Sharp/van der Heijde score. Two observers independently measured joint space width on tomosynthesis images using an in-house semi-automated method. More joints with JSN were revealed with the tomosynthesis score (243 joints) and the semi-automated method (215 joints) than with radiography (120 joints), and significant associations between tomosynthesis scores and radiography scores were demonstrated. Joint space width measured with the semi-automated method correlated negatively with tomosynthesis scores (r = -0.606), and agreement between the two observers' semi-automated measurements on tomosynthesis images was almost perfect, with intra-class correlation coefficient (ICC) values of 0.964 and 0.963. The semi-automated method using tomosynthesis images provided sensitive, quantitative, and reproducible measurement of finger joint space in patients with RA.

  15. Semi-automated vectorial analysis of anorectal motion by magnetic resonance defecography in healthy subjects and fecal incontinence.

    Science.gov (United States)

    Noelting, J; Bharucha, A E; Lake, D S; Manduca, A; Fletcher, J G; Riederer, S J; Joseph Melton, L; Zinsmeister, A R

    2012-10-01

    Inter-observer variability limits the reproducibility of pelvic floor motion measured by magnetic resonance imaging (MRI). Our aim was to develop a semi-automated program measuring pelvic floor motion in a reproducible and refined manner. Pelvic floor anatomy and motion during voluntary contraction (squeeze) and rectal evacuation were assessed by MRI in 64 women with fecal incontinence (FI) and 64 age-matched controls. A radiologist measured anorectal angles and anorectal junction motion. A semi-automated program did the same and also dissected anorectal motion into perpendicular vectors representing the puborectalis and other pelvic floor muscles, assessed the pubococcygeal angle, and evaluated pelvic rotation. Manual and semi-automated measurements of anorectal junction motion were strongly correlated (r = 0.70) in patients and controls. This semi-automated program provides a reproducible, efficient, and refined analysis of pelvic floor motion by MRI. Puborectalis injury is independently associated with impaired motion of the puborectalis, but not of other pelvic floor muscles, in controls and women with FI. © 2012 Blackwell Publishing Ltd.
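
    The vector "dissection" mentioned above amounts to projecting the anorectal-junction displacement onto a chosen muscle axis and keeping the perpendicular remainder. A generic numpy sketch (the program's actual axis conventions are not given in the record):

    ```python
    import numpy as np

    def decompose_motion(displacement: np.ndarray, muscle_axis: np.ndarray):
        """Split a 2-D displacement vector into the signed component along a
        chosen muscle axis (e.g. the puborectalis pull direction) and the
        perpendicular remainder attributed to other pelvic floor muscles."""
        u = muscle_axis / np.linalg.norm(muscle_axis)   # unit vector of the axis
        along = float(displacement @ u)                 # projection onto the axis
        perpendicular = float(np.linalg.norm(displacement - along * u))
        return along, perpendicular

    print(decompose_motion(np.array([12.0, 5.0]), np.array([1.0, 0.0])))  # (12.0, 5.0)
    ```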

  16. Pro Spring Batch

    CERN Document Server

    Minella, Michael T

    2011-01-01

    Since its release, Spring Framework has transformed virtually every aspect of Java development including web applications, security, aspect-oriented programming, persistence, and messaging. Spring Batch, one of its newer additions, now brings the same familiar Spring idioms to batch processing. Spring Batch addresses the needs of any batch process, from the complex calculations performed in the biggest financial institutions to simple data migrations that occur with many software development projects. Pro Spring Batch is intended to answer three questions: What? What is batch processing? What…

  17. Reproducibility of myelin content-based human habenula segmentation at 3 Tesla.

    Science.gov (United States)

    Kim, Joo-Won; Naidich, Thomas P; Joseph, Joshmi; Nair, Divya; Glasser, Matthew F; O'Halloran, Rafael; Doucet, Gaelle E; Lee, Won Hee; Krinsky, Hannah; Paulino, Alejandro; Glahn, David C; Anticevic, Alan; Frangou, Sophia; Xu, Junqian

    2018-03-26

    In vivo morphological study of the human habenula, a pair of small epithalamic nuclei adjacent to the dorsomedial thalamus, has recently gained significant interest for its role in reward and aversion processing. However, segmenting the habenula from in vivo magnetic resonance imaging (MRI) is challenging due to the habenula's small size and low anatomical contrast. Although manual and semi-automated habenula segmentation methods have been reported, the test-retest reproducibility of the segmented habenula volume and the consistency of the boundaries of habenula segmentation have not been investigated. In this study, we evaluated the intra- and inter-site reproducibility of in vivo human habenula segmentation from 3T MRI (0.7-0.8 mm isotropic resolution) using our previously proposed semi-automated myelin contrast-based method and its fully-automated version, as well as a previously published manual geometry-based method. The habenula segmentation using our semi-automated method showed consistent boundary definition (high Dice coefficient, low mean distance, and moderate Hausdorff distance) and reproducible volume measurement (low coefficient of variation). Furthermore, the habenula boundary in our semi-automated segmentation from 3T MRI agreed well with that in the manual segmentation from 7T MRI (0.5 mm isotropic resolution) of the same subjects. Overall, our proposed semi-automated habenula segmentation showed reliable and reproducible habenula localization, while its fully-automated version offers an efficient way for large sample analysis. © 2018 Wiley Periodicals, Inc.
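
    The overlap and volume-reproducibility metrics quoted above (Dice coefficient, coefficient of variation) are standard; minimal numpy versions for reference:

    ```python
    import numpy as np

    def dice(a: np.ndarray, b: np.ndarray) -> float:
        """Dice coefficient between two boolean segmentation masks of equal shape."""
        a, b = a.astype(bool), b.astype(bool)
        denom = a.sum() + b.sum()
        return float(2.0 * np.logical_and(a, b).sum() / denom) if denom else 1.0

    def volume_cov(volumes: np.ndarray) -> float:
        """Coefficient of variation (%) of repeated volume measurements of one
        structure, e.g. habenula volumes from test-retest scans."""
        return float(100.0 * volumes.std(ddof=1) / volumes.mean())
    ```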

  18. Fast semi-automated lesion demarcation in stroke

    Directory of Open Access Journals (Sweden)

    Bianca de Haan

    2015-01-01

    Lesion–behaviour mapping analyses require the demarcation of the brain lesion on each (usually transverse) slice of the individual stroke patient's brain image. To date, this is generally thought to be most precise when done manually, which is, however, both time-consuming and potentially observer-dependent. Fully automated lesion demarcation methods have been developed to address these issues, but these are often not practicable in acute stroke research, where for each patient only a single image modality is available and the available image modality differs over patients. In the current study, we evaluated a semi-automated lesion demarcation approach, the so-called Clusterize algorithm, in acute stroke patients scanned in a range of common image modalities. Our results suggest that, compared to the standard of manual lesion demarcation, the semi-automated Clusterize algorithm is capable of significantly speeding up lesion demarcation in the most commonly used image modalities, without loss of either lesion demarcation precision or lesion demarcation reproducibility. For the three investigated acute datasets (CT, DWI, T2FLAIR), containing a total of 44 patient images obtained in a regular clinical setting at patient admission, the reduction in processing time was on average 17.8 min per patient, and this advantage increased with increasing lesion volume (up to 60 min per patient for the largest lesion volumes in our datasets). Additionally, our results suggest that performance of the Clusterize algorithm in a chronic dataset with 11 T1 images was comparable to its performance in the acute datasets. We thus advocate the use of the Clusterize algorithm, integrated into a simple, freely available SPM toolbox, for the precise, reliable and fast preparation of imaging data for lesion–behaviour mapping analyses.

  19. Examination of reproducibility in microbiological degradation experiments

    DEFF Research Database (Denmark)

    Sommer, Helle Mølgaard; Spliid, Henrik; Holst, Helle

    1998-01-01

    Experimental data indicate that certain microbiological degradation experiments have a limited reproducibility. Nine identical batch experiments were carried out on 3 different days to examine reproducibility. A pure culture, isolated from soil, grew with toluene as the only carbon and energy... source. Toluene was degraded under aerobic conditions at a constant temperature of 28 °C. The experiments were modelled by a Monod model - extended to meet the air/liquid system, and the parameter values were estimated using a statistical nonlinear estimation procedure. Model reduction analysis... resulted in a simpler model without the biomass decay term. In order to test for model reduction and reproducibility of parameter estimates, a likelihood ratio test was employed. The limited reproducibility for these experiments implied that all 9 batch experiments could not be described by the same set...
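
    For readers unfamiliar with the model class: a minimal Monod batch system without the biomass decay term (the form the reduction analysis arrived at) can be integrated as below. Parameter values are illustrative, not those estimated in the study, and the study's air/liquid extension is omitted:

    ```python
    from scipy.integrate import solve_ivp

    MU_MAX, KS, Y = 0.5, 2.0, 0.6   # 1/h, mg/L, g biomass per g toluene (illustrative)

    def monod_batch(t, y):
        s, x = y                               # substrate (toluene) and biomass
        mu = MU_MAX * s / (KS + s)             # Monod specific growth rate
        return [-mu * x / Y, mu * x]           # no decay term, per the reduced model

    sol = solve_ivp(monod_batch, (0.0, 24.0), y0=[20.0, 0.1])
    print(sol.y[0, -1], sol.y[1, -1])          # residual toluene and final biomass
    ```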

  20. Multicenter assessment of the reproducibility of volumetric radiofrequency-based intravascular ultrasound measurements in coronary lesions that were consecutively stented

    NARCIS (Netherlands)

    Huisman, Jeroen; Egede, R.; Rdzanek, A.; Böse, D.; Erbel, R.; van der Palen, Jacobus Adrianus Maria; von Birgelen, Clemens

    2012-01-01

    To assess in a multicenter design the between-center reproducibility of volumetric virtual histology intravascular ultrasound (VH-IVUS) measurements with a semi-automated, computer-assisted contour detection system in coronary lesions that were consecutively stented. To evaluate the reproducibility

  1. Volume of Structures in the Fetal Brain Measured with a New Semiautomated Method.

    Science.gov (United States)

    Ber, R; Hoffman, D; Hoffman, C; Polat, A; Derazne, E; Mayer, A; Katorza, E

    2017-11-01

    Measuring the volume of fetal brain structures is challenging due to fetal motion, low resolution, and artifacts caused by maternal tissue. Our aim was to introduce a new, simple, Matlab-based semiautomated method to measure the volume of structures in the fetal brain and to present normal volumetric curves of the structures measured. The volume of the supratentorial brain, left and right hemispheres, cerebellum, and left and right eyeballs was measured retrospectively by the new semiautomated method in MR imaging examinations of 94 healthy fetuses. Four volume ratios were calculated. Interobserver agreement was calculated with the intraclass correlation coefficient, and a Bland-Altman plot was drawn to compare manual and semiautomated measurements of the supratentorial brain. We present normal volumetric curves and normal percentile values of the structures measured according to gestational age, and of the ratios between the cerebellum and the supratentorial brain volume and between the total eyeball and the supratentorial brain volume. Interobserver agreement was good or excellent for all structures measured. The Bland-Altman plot between manual and semiautomated measurements showed a maximal relative difference of 7.84%. We present a technologically simple, reproducible method that can be applied prospectively and retrospectively to any MR imaging protocol, together with the normal volumetric curves measured. The method gives results comparable to manual measurements while being less time-consuming and user-dependent. By applying this method to different cranial and extracranial structures, anatomic and pathologic, we believe that fetal volumetry can turn from a research tool into a practical clinical one. © 2017 by American Journal of Neuroradiology.
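
    Bland-Altman agreement, as used here for manual vs. semiautomated supratentorial volumes, reduces to the bias and 95% limits of agreement of the paired differences; a compact sketch:

    ```python
    import numpy as np

    def bland_altman(manual: np.ndarray, semi: np.ndarray):
        """Bias and 95% limits of agreement between two measurement methods,
        the quantities plotted in a Bland-Altman diagram."""
        diff = semi - manual
        bias = float(diff.mean())
        spread = 1.96 * float(diff.std(ddof=1))
        return bias, bias - spread, bias + spread
    ```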

  2. Evaluation and optimisation of preparative semi-automated electrophoresis systems for Illumina library preparation.

    Science.gov (United States)

    Quail, Michael A; Gu, Yong; Swerdlow, Harold; Mayho, Matthew

    2012-12-01

    Size selection can be a critical step in preparation of next-generation sequencing libraries. Traditional methods employing gel electrophoresis lack reproducibility, are labour intensive, do not scale well and employ hazardous intercalating dyes. In a high-throughput setting, solid-phase reversible immobilisation beads are commonly used for size selection, but result in quite a broad fragment size range. We have evaluated and optimised the use of two semi-automated preparative DNA electrophoresis systems, the Caliper Labchip XT and the Sage Science Pippin Prep, for size selection of Illumina sequencing libraries. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Automated and Semiautomated Segmentation of Rectal Tumor Volumes on Diffusion-Weighted MRI: Can It Replace Manual Volumetry?

    International Nuclear Information System (INIS)

    Heeswijk, Miriam M. van; Lambregts, Doenja M.J.; Griethuysen, Joost J.M. van; Oei, Stanley; Rao, Sheng-Xiang; Graaff, Carla A.M. de; Vliegen, Roy F.A.; Beets, Geerard L.; Papanikolaou, Nikos; Beets-Tan, Regina G.H.

    2016-01-01

    Purpose: Diffusion-weighted imaging (DWI) tumor volumetry is promising for rectal cancer response assessment, but an important drawback is that manual per-slice tumor delineation can be highly time consuming. This study investigated whether manual DWI-volumetry can be reproduced using a (semi)automated segmentation approach. Methods and Materials: Seventy-nine patients underwent magnetic resonance imaging (MRI) that included DWI (highest b value [b1000 or b1100]) before and after chemoradiation therapy (CRT). Tumor volumes were assessed on b1000 (or b1100) DWI before and after CRT by means of (1) automated segmentation (by 2 inexperienced readers), (2) semiautomated segmentation (manual adjustment of the volumes obtained by method 1 by 2 radiologists), and (3) manual segmentation (by 2 radiologists); this last assessment served as the reference standard. Intraclass correlation coefficients (ICC) and Dice similarity indices (DSI) were calculated to evaluate agreement between different methods and observers. Measurement times (from a radiologist's perspective) were recorded for each method. Results: Tumor volumes were not significantly different among the 3 methods, either before or after CRT (P=.08 to .92). ICCs compared to manual segmentation were 0.80 to 0.91 and 0.53 to 0.66 before and after CRT, respectively, for the automated segmentation, and 0.91 to 0.97 and 0.61 to 0.75, respectively, for the semiautomated method. Interobserver agreement (ICC) pre- and post-CRT was 0.82 and 0.59 for automated segmentation, 0.91 and 0.73 for semiautomated segmentation, and 0.91 and 0.75 for manual segmentation, respectively. Mean DSI between the automated and semiautomated methods were 0.83 (pre-CRT) and 0.58 (post-CRT); DSI between the automated and manual segmentations were 0.68 and 0.42, and between the semiautomated and manual segmentations 0.70 and 0.41, respectively. Median measurement time for the radiologists was 0 seconds (pre- and post-CRT) for the …
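
    The ICCs quoted throughout can be computed from an n-subjects-by-k-readers matrix of volumes. Below is ICC(2,1) (two-way random effects, absolute agreement, single measures), a common choice for inter-observer agreement, though the record does not state which ICC form was used:

    ```python
    import numpy as np

    def icc_2_1(x: np.ndarray) -> float:
        """ICC(2,1): two-way random effects, absolute agreement, single measures,
        for an (n_subjects, k_readers) matrix. One common ICC variant; the
        abstract does not specify which form was computed."""
        n, k = x.shape
        grand = x.mean()
        msr = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
        msc = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # readers
        resid = (x - x.mean(axis=1, keepdims=True)
                   - x.mean(axis=0, keepdims=True) + grand)
        mse = (resid ** 2).sum() / ((n - 1) * (k - 1))
        return float((msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n))
    ```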

  4. Automated and Semiautomated Segmentation of Rectal Tumor Volumes on Diffusion-Weighted MRI: Can It Replace Manual Volumetry?

    Energy Technology Data Exchange (ETDEWEB)

    Heeswijk, Miriam M. van [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Surgery, Maastricht University Medical Centre, Maastricht (Netherlands); Lambregts, Doenja M.J., E-mail: d.lambregts@nki.nl [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Radiology, The Netherlands Cancer Institute, Amsterdam (Netherlands); Griethuysen, Joost J.M. van [GROW School for Oncology and Developmental Biology, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Radiology, The Netherlands Cancer Institute, Amsterdam (Netherlands); Oei, Stanley [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Rao, Sheng-Xiang [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Radiology, Zhongshan Hospital, Fudan University, Shanghai (China); Graaff, Carla A.M. de [Department of Radiology, Maastricht University Medical Centre, Maastricht (Netherlands); Vliegen, Roy F.A. [Atrium Medical Centre Parkstad/Zuyderland Medical Centre, Heerlen (Netherlands); Beets, Geerard L. [GROW School for Oncology and Developmental Biology, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Surgery, The Netherlands Cancer Institute, Amsterdam (Netherlands); Papanikolaou, Nikos [Laboratory of Computational Medicine, Institute of Computer Science, FORTH, Heraklion, Crete (Greece); Beets-Tan, Regina G.H. [GROW School for Oncology and Developmental Biology, Maastricht University Medical Centre, Maastricht (Netherlands); Department of Radiology, The Netherlands Cancer Institute, Amsterdam (Netherlands)

    2016-03-15

    Purpose: Diffusion-weighted imaging (DWI) tumor volumetry is promising for rectal cancer response assessment, but an important drawback is that manual per-slice tumor delineation can be highly time consuming. This study investigated whether manual DWI-volumetry can be reproduced using a (semi)automated segmentation approach. Methods and Materials: Seventy-nine patients underwent magnetic resonance imaging (MRI) that included DWI (highest b value [b1000 or b1100]) before and after chemoradiation therapy (CRT). Tumor volumes were assessed on b1000 (or b1100) DWI before and after CRT by means of (1) automated segmentation (by 2 inexperienced readers), (2) semiautomated segmentation (manual adjustment of the volumes obtained by method 1 by 2 radiologists), and (3) manual segmentation (by 2 radiologists); this last assessment served as the reference standard. Intraclass correlation coefficients (ICC) and Dice similarity indices (DSI) were calculated to evaluate agreement between different methods and observers. Measurement times (from a radiologist's perspective) were recorded for each method. Results: Tumor volumes were not significantly different among the 3 methods, either before or after CRT (P=.08 to .92). ICCs compared to manual segmentation were 0.80 to 0.91 and 0.53 to 0.66 before and after CRT, respectively, for the automated segmentation, and 0.91 to 0.97 and 0.61 to 0.75, respectively, for the semiautomated method. Interobserver agreement (ICC) pre- and post-CRT was 0.82 and 0.59 for automated segmentation, 0.91 and 0.73 for semiautomated segmentation, and 0.91 and 0.75 for manual segmentation, respectively. Mean DSI between the automated and semiautomated methods were 0.83 (pre-CRT) and 0.58 (post-CRT); DSI between the automated and manual segmentations were 0.68 and 0.42, and between the semiautomated and manual segmentations 0.70 and 0.41, respectively. Median measurement time for the radiologists was 0 seconds (pre- and post-CRT) for the …

  5. Automated and Semiautomated Segmentation of Rectal Tumor Volumes on Diffusion-Weighted MRI: Can It Replace Manual Volumetry?

    Science.gov (United States)

    van Heeswijk, Miriam M; Lambregts, Doenja M J; van Griethuysen, Joost J M; Oei, Stanley; Rao, Sheng-Xiang; de Graaff, Carla A M; Vliegen, Roy F A; Beets, Geerard L; Papanikolaou, Nikos; Beets-Tan, Regina G H

    2016-03-15

    Diffusion-weighted imaging (DWI) tumor volumetry is promising for rectal cancer response assessment, but an important drawback is that manual per-slice tumor delineation can be highly time consuming. This study investigated whether manual DWI-volumetry can be reproduced using a (semi)automated segmentation approach. Seventy-nine patients underwent magnetic resonance imaging (MRI) that included DWI (highest b value [b1000 or b1100]) before and after chemoradiation therapy (CRT). Tumor volumes were assessed on b1000 (or b1100) DWI before and after CRT by means of (1) automated segmentation (by 2 inexperienced readers), (2) semiautomated segmentation (manual adjustment of the volumes obtained by method 1 by 2 radiologists), and (3) manual segmentation (by 2 radiologists); this last assessment served as the reference standard. Intraclass correlation coefficients (ICC) and Dice similarity indices (DSI) were calculated to evaluate agreement between different methods and observers. Measurement times (from a radiologist's perspective) were recorded for each method. Tumor volumes were not significantly different among the 3 methods, either before or after CRT (P=.08 to .92). ICCs compared to manual segmentation were 0.80 to 0.91 and 0.53 to 0.66 before and after CRT, respectively, for the automated segmentation, and 0.91 to 0.97 and 0.61 to 0.75, respectively, for the semiautomated method. Interobserver agreement (ICC) pre- and post-CRT was 0.82 and 0.59 for automated segmentation, 0.91 and 0.73 for semiautomated segmentation, and 0.91 and 0.75 for manual segmentation, respectively. Mean DSI between the automated and semiautomated methods were 0.83 (pre-CRT) and 0.58 (post-CRT); DSI between the automated and manual segmentations were 0.68 and 0.42, and between the semiautomated and manual segmentations 0.70 and 0.41, respectively. Median measurement time for the radiologists was 0 seconds (pre- and post-CRT) for the automated method, 41 to 69 seconds (pre-CRT) and …

  6. Reproducibility of the results in ultrasonic testing

    International Nuclear Information System (INIS)

    Chalaye, M.; Launay, J.P.; Thomas, A.

    1980-12-01

    This memorandum reports on the conclusions of the tests carried out in order to evaluate the reproducibility of ultrasonic tests made on welded joints. FRAMATOME have started a study to assess the dispersion of results afforded by the test line and to characterize its behaviour. The tests covered sensors and ultrasonic generators said to be identical to each other (same commercial batch). [fr]

  7. Spring batch essentials

    CERN Document Server

    Rao, P Raja Malleswara

    2015-01-01

    If you are a Java developer with basic knowledge of Spring and some experience in the development of enterprise applications, and want to learn about batch application development in detail, then this book is ideal for you. This book will be perfect as your next step towards building simple yet powerful batch applications on a Java-based platform.

  8. Semiautomated digital analysis of knee joint space width using MR images

    International Nuclear Information System (INIS)

    Agnesi, Filippo; Amrami, Kimberly K.; Frigo, Carlo A.; Kaufman, Kenton R.

    2007-01-01

    The goal of this study was to (a) develop a semiautomated computer algorithm to measure knee joint space width (JSW) from magnetic resonance (MR) images using standard imaging techniques and (b) evaluate the reproducibility of the algorithm. Using a standard clinical imaging protocol, bilateral knee MR images were obtained twice within a 2-week period from 17 asymptomatic research participants. Images were analyzed to determine the variability of the measurements performed by the program compared with the variability of manual measurements. Measurement variability of the computer algorithm was considerably smaller than the variability of manual measurements. The average difference between two measurements of the same slice performed with the computer algorithm by the same user was 0.004 ± 0.07 mm for the tibiofemoral joint (TF) and 0.009 ± 0.11 mm for the patellofemoral joint (PF), compared with an average of 0.12 ± 0.22 mm TF and 0.13 ± 0.29 mm PF, respectively, for the manual method. Interuser variability of the computer algorithm was also considerably smaller, with an average difference of 0.004 ± 0.1 mm TF and 0.0006 ± 0.1 mm PF compared with 0.38 ± 0.59 mm TF and 0.31 ± 0.66 mm PF obtained using a manual method. The between-day variability was larger but still within acceptable limits at 0.09 ± 0.39 mm TF and 0.09 ± 0.51 mm PF. This technique has proven consistently reproducible on a same-slice basis, while the variability between different acquisitions of the same subject was larger. Improving longitudinal reproducibility needs to be addressed through acquisition protocol improvements. A semiautomated method for measuring knee JSW from MR images has been successfully developed. (orig.)

  9. Semiautomated digital analysis of knee joint space width using MR images

    Energy Technology Data Exchange (ETDEWEB)

    Agnesi, Filippo [Mayo Clinic, Motion Analysis Laboratory, Division of Orthopedic Research, Rochester, MN (United States); Polytechnic of Milan, Department of Bioengineering, Milan (Italy); Amrami, Kimberly K. [Mayo Clinic, Department of Radiology, Rochester, MN (United States); Frigo, Carlo A. [Polytechnic of Milan, Department of Bioengineering, Milan (Italy); Kaufman, Kenton R. [Mayo Clinic, Motion Analysis Laboratory, Division of Orthopedic Research, Rochester, MN (United States); Mayo Clinic, Mayo Foundation, Motion Analysis Laboratory, Department of Orthopedic Surgery, Rochester, MN (United States)

    2007-05-15

    The goal of this study was to (a) develop a semiautomated computer algorithm to measure knee joint space width (JSW) from magnetic resonance (MR) images using standard imaging techniques and (b) evaluate the reproducibility of the algorithm. Using a standard clinical imaging protocol, bilateral knee MR images were obtained twice within a 2-week period from 17 asymptomatic research participants. Images were analyzed to determine the variability of the measurements performed by the program compared with the variability of manual measurements. Measurement variability of the computer algorithm was considerably smaller than the variability of manual measurements. The average difference between two measurements of the same slice performed with the computer algorithm by the same user was 0.004 ± 0.07 mm for the tibiofemoral joint (TF) and 0.009 ± 0.11 mm for the patellofemoral joint (PF), compared with an average of 0.12 ± 0.22 mm TF and 0.13 ± 0.29 mm PF, respectively, for the manual method. Interuser variability of the computer algorithm was also considerably smaller, with an average difference of 0.004 ± 0.1 mm TF and 0.0006 ± 0.1 mm PF compared with 0.38 ± 0.59 mm TF and 0.31 ± 0.66 mm PF obtained using a manual method. The between-day variability was larger but still within acceptable limits at 0.09 ± 0.39 mm TF and 0.09 ± 0.51 mm PF. This technique has proven consistently reproducible on a same-slice basis, while the variability between different acquisitions of the same subject was larger. Improving longitudinal reproducibility needs to be addressed through acquisition protocol improvements. A semiautomated method for measuring knee JSW from MR images has been successfully developed. (orig.)

  10. SPS batch spacing optimisation

    CERN Document Server

    Velotti, F M; Carlier, E; Goddard, B; Kain, V; Kotzian, G

    2017-01-01

    Until 2015, the LHC filling schemes used the batch spacing as specified in the LHC design report. The maximum number of bunches injectable in the LHC directly depends on the batch spacing at injection in the SPS and hence on the MKP rise time. As part of the LHC Injectors Upgrade project for LHC heavy ions, a reduction of the batch spacing is needed. In this direction, studies to approach the MKP design rise time of 150 ns (2-98%) have been carried out. These measurements gave clear indications that such optimisation, and beyond, could also be done for higher injection momentum beams, where the additional slower MKP (MKP-L) is needed. After the successful results from the 2015 SPS batch spacing optimisation for the Pb-Pb run [1], the same concept was thought to be used also for proton beams. In fact, thanks to the SPS transverse feedback, it was already observed that lower batch spacing than the design one (225 ns) could be achieved. For the 2016 p-Pb run, a batch spacing of 200 ns for the proton beam with 100 ns bunch spacing was reque...

  11. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    Directory of Open Access Journals (Sweden)

    Kottawattage S. A. Kottawatta

    2017-11-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and to determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  12. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka: Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples.

    Science.gov (United States)

    Kottawatta, Kottawattage S A; Van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-11-29

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on Campylobacter contamination of neck skin samples. Samples were collected from semi-automated plants (n = 102) and wet markets (n = 25). From each batch of broilers, pooled caecal samples and neck skin samples were tested for Campylobacter. Broiler meat purchased from retail outlets (n = 37) was also tested. The prevalence of Campylobacter-colonized broiler flocks was 67%. The contamination of meat at retail was 59%. Semi-automated and wet market processing contaminated broiler neck skins at levels of 27.4% and 48%, respectively. When Campylobacter-free broiler flocks were processed in semi-automated facilities, 15% (5/33) of neck skin samples became contaminated by the end of processing, whereas 25% (2/8) became contaminated after wet market processing. Characterization of isolates revealed a higher proportion of C. coli compared to C. jejuni. Higher proportions of isolates were resistant to important antimicrobials. This study shows the importance of Campylobacter in the poultry industry in Sri Lanka and the need for controlling antimicrobial resistance.

  13. Semi-automated volumetric analysis of artificial lymph nodes in a phantom study

    International Nuclear Information System (INIS)

    Fabel, M.; Biederer, J.; Jochens, A.; Bornemann, L.; Soza, G.; Heller, M.; Bolte, H.

    2011-01-01

    Purpose: Quantification of tumour burden in oncology requires accurate and reproducible image evaluation. The current standard is one-dimensional measurement (e.g. RECIST) with inherent disadvantages. Volumetric analysis is discussed as an alternative for therapy monitoring of lung and liver metastases. The aim of this study was to investigate the accuracy of semi-automated volumetric analysis of artificial lymph node metastases in a phantom study. Materials and methods: Fifty artificial lymph nodes were produced in a size range from 10 to 55 mm; some of them enhanced using iodine contrast media. All nodules were placed in an artificial chest phantom (artiCHEST®) within different surrounding tissues. MDCT was performed using different collimations (1–5 mm) at varying reconstruction kernels (B20f, B40f, B60f). Volume and RECIST measurements were performed using Oncology Software (Siemens Healthcare, Forchheim, Germany) and were compared to reference volume and diameter by calculating absolute percentage errors. Results: The software performance allowed a robust volumetric analysis in a phantom setting. Unsatisfactory segmentation results were frequently found for native nodules within surrounding muscle. The absolute percentage error (APE) for volumetric analysis varied between 0.01 and 225%. No significant differences were seen between different reconstruction kernels. The most unsatisfactory segmentation results occurred at higher slice thicknesses (4 and 5 mm). Contrast-enhanced lymph nodes tended to show better segmentation results. Conclusion: The semi-automated 3D-volumetric analysis software tool allows a reliable and convenient segmentation of artificial lymph nodes in a phantom setting. Lymph nodes adjacent to tissue of similar density cause segmentation problems. For volumetric analysis of lymph node metastases in clinical routine a slice thickness of ≤3 mm and a medium soft reconstruction kernel (e.g. B40f for Siemens scan systems) may be a suitable
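The accuracy metric used above is simple to state; a minimal sketch (the example volumes are hypothetical):

```python
def absolute_percentage_error(measured, reference):
    """APE as used above: |V_measured - V_reference| / V_reference * 100."""
    return abs(measured - reference) / reference * 100.0

# Hypothetical example: a 14.1 mL segmentation of a 13.6 mL reference node
print(f"APE = {absolute_percentage_error(14.1, 13.6):.1f}%")  # APE = 3.7%
```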

  14. Batch-batch stable microbial community in the traditional fermentation process of huyumei broad bean pastes.

    Science.gov (United States)

    Zhu, Linjiang; Fan, Zihao; Kuai, Hui; Li, Qi

    2017-09-01

    During natural fermentation processes, a characteristic microbial community structure (MCS) naturally forms, and its batch-to-batch stability is of interest. This issue was explored in a traditional semi-solid-state fermentation process of huyumei, a Chinese broad bean paste product. The results showed that this MCS mainly contained four aerobic Bacillus species (8 log CFU per g), including B. subtilis, B. amyloliquefaciens, B. methylotrophicus, and B. tequilensis, plus the facultative anaerobe B. cereus at a low concentration (4 log CFU per g) and a very small amount of the yeast Zygosaccharomyces rouxii (2 log CFU per g). The dynamic change of the MCS during brine fermentation showed that the abundance of dominant species varied within a small range; at the beginning of the process the growth of lactic acid bacteria was inhibited and Staphylococcus spp. lost their viability. The MCS and its dynamic change also proved highly reproducible among seven batches of fermentation. Therefore, the MCS forms naturally and stably across different batches of the traditional semi-solid-state fermentation of huyumei. Revealing the microbial community structure and its batch-to-batch stability helps in understanding the mechanisms of community formation and flavour production in a traditional fermentation. This issue was explored here for the first time in the traditional semi-solid-state fermentation of huyumei broad bean paste. The fermentation process was revealed to be dominated by a high concentration of four aerobic Bacillus species, a low concentration of B. cereus and a small amount of Zygosaccharomyces rouxii. Lactic acid bacteria and Staphylococcus spp. lost their viability at the beginning of fermentation. This community structure proved highly reproducible among seven batches. © 2017 The Society for Applied Microbiology.

  15. Reliability versus reproducibility

    International Nuclear Information System (INIS)

    Lautzenheiser, C.E.

    1976-01-01

    Defect detection and reproducibility of results are two separate but closely related subjects. It is axiomatic that a defect must be detected consistently from examination to examination, or the reproducibility of results is very poor. On the other hand, a defect can be detected on each subsequent examination, giving high reliability, while the reproducibility of the results remains poor

  16. The Need for Reproducibility

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Laboratory

    2016-06-27

    The purpose of this presentation is to consider issues of reproducibility: specifically, whether bitwise reproducible computation is possible, whether computational research in DOE should improve its publication process, and whether reproducible results can be achieved apart from the peer review process.

  17. Heuristics for batching and sequencing in batch processing machines

    Directory of Open Access Journals (Sweden)

    Chuda Basnet

    2016-12-01

    In this paper, we discuss the "batch processing" problem, where there are multiple jobs to be processed in flow shops. These jobs can be formed into batches, and the number of jobs in a batch is limited by the capacity of the processing machines to accommodate the jobs. The processing time required by a batch on a machine is determined by the greatest processing time of the jobs included in the batch. The batch processing problem is thus a mix of batching and sequencing: the jobs need to be grouped into distinct batches, and the batches then need to be sequenced through the flow shop. We apply newly developed heuristics to the problem and present computational results. The contributions of this paper are the derivation of a lower bound and the heuristics developed and tested in this paper.
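The paper's own heuristics are not reproduced in the abstract; the following is only a naive greedy baseline for the batching half of the problem, under the stated rule that a batch's processing time equals that of its longest job:

```python
def greedy_batches(job_times, capacity):
    """Naive batching baseline: sort jobs longest-first and fill batches up
    to the machine capacity. Since a batch takes as long as its longest job,
    grouping similar-length jobs keeps the summed batch times low."""
    jobs = sorted(job_times, reverse=True)
    batches = [jobs[i:i + capacity] for i in range(0, len(jobs), capacity)]
    total_time = sum(batch[0] for batch in batches)  # batch time = max job time
    return batches, total_time

batches, total = greedy_batches([7, 3, 9, 4, 6, 2, 8], capacity=3)
print(batches, total)  # [[9, 8, 7], [6, 4, 3], [2]] 17
```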

  18. Comparison of semi-automated and manual measurements of carotid intima-media thickening.

    LENUS (Irish Health Repository)

    Mac Ananey, Oscar

    2014-01-01

    Carotid intima-media thickening (CIMT) is a marker of both arteriosclerotic and atherosclerotic risks. Technological advances have semiautomated CIMT image acquisition and quantification. Studies comparing manual and automated methods have yielded conflicting results, possibly due to plaque inclusion in measurements. Low atherosclerotic risk subjects (n = 126) were recruited to minimise the effect of focal atherosclerotic lesions on CIMT variability. CIMT was assessed by high-resolution B-mode ultrasound (Philips HDX7E, Philips, UK) images of the common carotid artery using both manual and semiautomated methods (QLAB, Philips, UK). The intraclass correlation coefficient (ICC) and the mean differences of paired measurements (Bland-Altman method) were used to compare the two methodologies. Mean CIMT was 0.547 ± 0.095 mm by the manual method and 0.524 ± 0.068 mm by the automated method; the ICC between methods was R = 0.74, with an absolute mean bias ± SD of 0.023 ± 0.052 mm. Interobserver and intraobserver ICC were greater for automated (R = 0.94 and 0.99) than for manual (R = 0.72 and 0.88) methods. Although the difference is not considered clinically significant, manual measurements yielded higher values than automated measurements. Automated measurements were more reproducible and showed lower interobserver variation than manual measurements. These results offer important considerations for large epidemiological studies.

  19. Prunus dulcis, Batch

    African Journals Online (AJOL)

    STORAGESEVER

    2010-06-07

    Jun 7, 2010 ... almond (Prunus dulcis, Batch) genotypes as revealed by PCR analysis. Yavar Sharafi*, Jafar Hajilou, Seyed AbolGhasem Mohammadi, Mohammad Reza Dadpour and Sadollah Eskandari. Department of Horticulture, Faculty of Agriculture, University of Tabriz, Tabriz, 5166614766, Iran.

  20. Interobserver agreement of semi-automated and manual measurements of functional MRI metrics of treatment response in hepatocellular carcinoma

    International Nuclear Information System (INIS)

    Bonekamp, David; Bonekamp, Susanne; Halappa, Vivek Gowdra; Geschwind, Jean-Francois H.; Eng, John; Corona-Villalobos, Celia Pamela; Pawlik, Timothy M.; Kamel, Ihab R.

    2014-01-01

    Purpose: To assess the interobserver agreement in 50 patients with hepatocellular carcinoma (HCC) before and 1 month after intra-arterial therapy (IAT) using two semi-automated methods and a manual approach for the following functional, volumetric and morphologic parameters: (1) apparent diffusion coefficient (ADC), (2) arterial phase enhancement (AE), (3) portal venous phase enhancement (VE), (4) tumor volume, and assessment according to (5) the Response Evaluation Criteria in Solid Tumors (RECIST), and (6) the European Association for the Study of the Liver (EASL). Materials and methods: This HIPAA-compliant retrospective study had institutional review board approval. The requirement for patient informed consent was waived. Tumor ADC, AE, VE, volume, RECIST, and EASL in 50 index lesions was measured by three observers. Interobserver reproducibility was evaluated using intraclass correlation coefficients (ICC). P < 0.05 was considered to indicate a significant difference. Results: Semi-automated volumetric measurements of functional parameters (ADC, AE, and VE) before and after IAT as well as change in tumor ADC, AE, or VE had better interobserver agreement (ICC = 0.830–0.974) compared with manual ROI-based axial measurements (ICC = 0.157–0.799). Semi-automated measurements of tumor volume and size in the axial plane before and after IAT had better interobserver agreement (ICC = 0.854–0.996) compared with manual size measurements (ICC = 0.543–0.596), and interobserver agreement for change in tumor RECIST size was also higher using semi-automated measurements (ICC = 0.655) compared with manual measurements (ICC = 0.169). EASL measurements of tumor enhancement in the axial plane before and after IAT (ICC = 0.758–0.809) and changes in EASL after IAT (ICC = 0.653) had good interobserver agreement. Conclusion: Semi-automated measurements of functional changes assessed by ADC and VE based on whole-lesion segmentation demonstrated better reproducibility than
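The agreement statistic reported above is the intraclass correlation coefficient. A self-contained sketch of one common variant, ICC(2,1) (two-way random effects, absolute agreement, single rater, in the Shrout-Fleiss convention), applied to hypothetical volumes from three observers:

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1) from an (n subjects x k raters) matrix."""
    Y = np.asarray(ratings, dtype=float)
    n, k = Y.shape
    grand = Y.mean()
    ms_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
    ms_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
    sse = ((Y - Y.mean(axis=1, keepdims=True)
              - Y.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_err = sse / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (
        ms_rows + (k - 1) * ms_err + k * (ms_cols - ms_err) / n)

# Hypothetical tumor volumes (mL) rated by three observers
volumes = [[10.2, 10.5, 10.1],
           [22.8, 23.1, 22.6],
           [ 5.4,  5.6,  5.3],
           [17.9, 18.4, 17.7]]
print(f"ICC(2,1) = {icc_2_1(volumes):.3f}")
```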

  1. Semiautomated spleen volumetry with diffusion-weighted MR imaging.

    Science.gov (United States)

    Lee, Jeongjin; Kim, Kyoung Won; Lee, Ho; Lee, So Jung; Choi, Sanghyun; Jeong, Woo Kyoung; Kye, Heewon; Song, Gi-Won; Hwang, Shin; Lee, Sung-Gyu

    2012-07-01

    In this article, we determined the relative accuracy of semiautomated spleen volumetry with diffusion-weighted (DW) MR images compared to standard manual volumetry with DW-MR or CT images. Semiautomated spleen volumetry using simple thresholding followed by 3D and 2D connected component analysis was performed with DW-MR images. Manual spleen volumetry was performed on DW-MR and CT images. In this study, 35 potential live liver donor candidates were included. Semiautomated volumetry results were highly correlated with manual volumetry results using DW-MR (r = 0.99). The processing time of semiautomated volumetry was significantly shorter than that of manual volumetry with DW-MR. Semiautomated volumetry with DW-MR images can thus be performed rapidly and accurately when compared with standard manual volumetry. Copyright © 2011 Wiley Periodicals, Inc.
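A minimal illustration of the pipeline named above (simple intensity thresholding followed by connected-component analysis, keeping the largest 3D component); this is not the authors' implementation, and the threshold and voxel size are hypothetical:

```python
import numpy as np
from scipy import ndimage

def threshold_volumetry(image, threshold, voxel_volume_ml):
    """Threshold the image, label 3D connected components, and return the
    volume of the largest component in millilitres."""
    mask = image > threshold
    labels, n = ndimage.label(mask)                    # 3D connected components
    if n == 0:
        return 0.0
    sizes = ndimage.sum(mask, labels, range(1, n + 1)) # voxels per component
    largest = labels == (np.argmax(sizes) + 1)
    return largest.sum() * voxel_volume_ml

# Hypothetical DW-MR volume: a bright spleen-like block on a dark background
img = np.random.rand(40, 64, 64)
img[10:30, 20:40, 20:40] += 2.0
print(f"{threshold_volumetry(img, 1.5, voxel_volume_ml=0.008):.1f} mL")
```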

  2. Semi-automated extraction of longitudinal subglacial bedforms from digital terrain models - Two new methods

    Science.gov (United States)

    Jorge, Marco G.; Brennand, Tracy A.

    2017-07-01

    Relict drumlin and mega-scale glacial lineation (positive relief, longitudinal subglacial bedforms - LSBs) morphometry has been used as a proxy for paleo ice-sheet dynamics. LSB morphometric inventories have relied on manual mapping, which is slow and subjective and thus potentially difficult to reproduce. Automated methods are faster and reproducible, but previous methods for LSB semi-automated mapping have not been highly successful. Here, two new object-based methods for the semi-automated extraction of LSBs (footprints) from digital terrain models are compared in a test area in the Puget Lowland, Washington, USA. As segmentation procedures to create LSB-candidate objects, the normalized closed contour method relies on the contouring of a normalized local relief model addressing LSBs on slopes, and the landform elements mask method relies on the classification of landform elements derived from the digital terrain model. For identifying which LSB-candidate objects correspond to LSBs, both methods use the same LSB operational definition: a ruleset encapsulating expert knowledge, published morphometric data, and the morphometric range of LSBs in the study area. The normalized closed contour method was separately applied to four different local relief models, two computed in moving windows and two hydrology-based. Overall, the normalized closed contour method outperformed the landform elements mask method. The normalized closed contour method applied to a hydrology-based relief model derived from a multiple-direction flow-routing algorithm performed best. For an assessment of its transferability, the normalized closed contour method was evaluated on a second area, the Chautauqua drumlin field, Pennsylvania and New York, USA, where it performed better than in the Puget Lowland. A broad comparison to previous methods suggests that the normalized closed contour method may be the most capable method to date, but more development is required.
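As a rough illustration of a moving-window local relief model (one of the four variants tested above; the hydrology-based variants are more involved), elevations can be rescaled between the local extrema so that positive-relief forms stand out regardless of regional slope:

```python
import numpy as np
from scipy import ndimage

def normalized_local_relief(dtm, window=51):
    """Rescale elevation between the local minimum and maximum in a moving
    window. A sketch of the general idea only, not the published method."""
    zmin = ndimage.minimum_filter(dtm, size=window)
    zmax = ndimage.maximum_filter(dtm, size=window)
    return (dtm - zmin) / np.maximum(zmax - zmin, 1e-9)

# Synthetic DTM: regional slope plus a drumlin-like positive-relief form
x, y = np.meshgrid(np.arange(200), np.arange(200))
dtm = 0.05 * x
dtm[90:110, 80:160] += 5.0
relief = normalized_local_relief(dtm)
candidates = relief > 0.7  # closed-contour-like LSB-candidate mask
```

Candidate objects would then be filtered by a morphometric ruleset (length, width, elongation) encoding the operational definition of an LSB.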

  3. Modelling of Batch Process Operations

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli; Cameron, Ian; Gani, Rafiqul

    2011-01-01

    Here a batch cooling crystalliser is modelled and simulated as is a batch distillation system. In the batch crystalliser four operational modes of the crystalliser are considered, namely: initial cooling, nucleation, crystal growth and product removal. A model generation procedure is shown that s...

  4. Magni Reproducibility Example

    DEFF Research Database (Denmark)

    2016-01-01

    An example of how to use the magni.reproducibility package for storing metadata along with results from a computational experiment. The example is based on simulating the Mandelbrot set.
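The gist of such reproducibility annotations, storing provenance metadata next to results, can be illustrated generically (this is not the magni.reproducibility API; the names and result are hypothetical):

```python
import json
import platform
import subprocess
import time

def save_result_with_metadata(result, path):
    """Store a result together with basic provenance metadata in one file."""
    try:  # assumes a git checkout; falls back to "unknown" otherwise
        rev = subprocess.run(["git", "rev-parse", "HEAD"],
                             capture_output=True, text=True).stdout.strip()
    except OSError:
        rev = "unknown"
    record = {
        "result": result,
        "metadata": {
            "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
            "python": platform.python_version(),
            "platform": platform.platform(),
            "git_revision": rev or "unknown",
        },
    }
    with open(path, "w") as fh:
        json.dump(record, fh, indent=2)

save_result_with_metadata({"escape_iterations": 57}, "mandelbrot_run.json")
```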

  5. Semiautomated system for the production and analysis of sucrose density gradients

    International Nuclear Information System (INIS)

    Lange, C.S.; Liberman, D.F.

    1974-01-01

    A semiautomated system permitting considerable accuracy, speed, and reproducibility in the making and fractionation of sucrose density gradients for DNA damage studies is described. The system consists of a modified Beckman gradient forming device that makes six gradients simultaneously and delivers them into six 12.5 ml polyallomer centrifuge tubes in such a manner that new material is continuously added to the meniscus of the gradient. The gradients are fractionated three at a time and up to 100 fractions per gradient can be collected automatically directly into scintillation vials, with a choice of drop counting or time mode, with rinse and automatic addition of scintillation fluid to each vial. The system can process up to six gradients per hour but centrifugation time is usually the limiting factor. With neutral sucrose gradients, sharp, reproducible, monodisperse peaks containing up to 100 percent of the gradient radioactivity are usually obtained, but a smaller monodisperse peak containing as little as 3.5 percent of the gradient radioactivity can be detected under conditions where some pairs of molecules might tangle or dimerize. The resolution and reproducibility of this system when used with neutral sucrose gradients are at least equal, if not superior, to those commonly claimed for alkaline sucrose gradients. (U.S.)

  6. Enhanced detection levels in a semi-automated sandwich ...

    African Journals Online (AJOL)

    A peptide nucleic acid (PNA) signal probe was tested as a replacement for a typical DNA oligonucleotide-based signal probe in a semi-automated sandwich hybridisation assay designed to detect the harmful phytoplankton species Alexandrium tamarense. The PNA probe yielded consistently higher fluorescent signal ...

  7. White matter hyperintensities segmentation: a new semi-automated method

    Directory of Open Access Journals (Sweden)

    Mariangela eIorio

    2013-12-01

    White matter hyperintensities (WMH) are brain areas of increased signal on T2-weighted or fluid-attenuated inversion recovery magnetic resonance imaging (MRI) scans. In this study we present a new semi-automated method to measure WMH load that is based on the segmentation of the intensity histogram of fluid-attenuated inversion recovery images. Thirty patients with Mild Cognitive Impairment with variable WMH load were enrolled. The semi-automated WMH segmentation included: removal of non-brain tissue, spatial normalization, removal of cerebellum and brain stem, spatial filtering, thresholding to segment probable WMH, manual editing for correction of false positives and negatives, generation of a WMH map, and volumetric estimation of the WMH load. Accuracy was quantitatively evaluated by comparing semi-automated and manual WMH segmentations performed by two independent raters. Differences between the two procedures were assessed using Student's t tests and similarity was evaluated using a linear regression model and the Dice Similarity Coefficient (DSC). The volumes of the manual and semi-automated segmentations did not statistically differ (t-value = -1.79, DF = 29, p = 0.839 for rater 1; t-value = 1.113, DF = 29, p = 0.2749 for rater 2), were highly correlated (R² = 0.921, F(1,29) = 155.54, p
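The DSC used above for rater agreement is straightforward to compute on binary masks (the toy masks below are invented):

```python
import numpy as np

def dice_coefficient(mask_a, mask_b):
    """DSC = 2|A intersect B| / (|A| + |B|) for two binary segmentations."""
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Hypothetical binary WMH maps from two raters
rater1 = np.zeros((10, 10), bool); rater1[2:7, 2:7] = True
rater2 = np.zeros((10, 10), bool); rater2[3:8, 3:8] = True
print(f"DSC = {dice_coefficient(rater1, rater2):.2f}")  # DSC = 0.64
```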

  8. Refuelling: Swiss station will be semi-automated

    International Nuclear Information System (INIS)

    Fontaine, B.; Ribaux, P.

    1981-01-01

    The first semi-automated LWR refuelling machine in Europe has been supplied to the Leibstadt General Electric BWR in Switzerland. The system relieves operators of the boring and repetitive job of moving and accurately positioning the refuelling machine during fuelling operations and will thus contribute to plant safety. The machine and its mode of operation are described. (author)

  9. Comparison of manual and semi-automated delineation of regions of interest for radioligand PET imaging analysis

    International Nuclear Information System (INIS)

    Chow, Tiffany W; Verhoeff, Nicolaas PLG; Takeshita, Shinichiro; Honjo, Kie; Pataky, Christina E; St Jacques, Peggy L; Kusano, Maggie L; Caldwell, Curtis B; Ramirez, Joel; Black, Sandra

    2007-01-01

    As imaging centers produce higher resolution research scans, the number of man-hours required to process regional data has become a major concern. Comparison of automated vs. manual methodology has not been reported for functional imaging. We explored validation of using automation to delineate regions of interest on positron emission tomography (PET) scans. The purpose of this study was to ascertain improvements in image processing time and reproducibility of a semi-automated brain region extraction (SABRE) method over manual delineation of regions of interest (ROIs). We compared 2 sets of partial volume corrected serotonin 1a receptor binding potentials (BPs) resulting from manual vs. semi-automated methods. BPs were obtained from subjects meeting consensus criteria for frontotemporal degeneration and from age- and gender-matched healthy controls. Two trained raters provided each set of data to conduct comparisons of inter-rater mean image processing time, rank order of BPs for 9 PET scans, intra- and inter-rater intraclass correlation coefficients (ICC), repeatability coefficients (RC), percentages of the average parameter value (RM%), and effect sizes of either method. SABRE saved approximately 3 hours of processing time per PET subject over manual delineation (p < .001). Quality of the SABRE BP results was preserved relative to the rank order of subjects by manual methods. Intra- and inter-rater ICC were high (>0.8) for both methods. RC and RM% were lower for the manual method across all ROIs, indicating less intra-rater variance across PET subjects' BPs. SABRE demonstrated significant time savings and no significant difference in reproducibility over manual methods, justifying the use of SABRE in serotonin 1a receptor radioligand PET imaging analysis. This implies that semi-automated ROI delineation is a valid methodology for future PET imaging analysis

  10. Progressing batch hydrolysis process

    Science.gov (United States)

    Wright, J.D.

    1985-01-10

    A progressive batch hydrolysis process is disclosed for producing sugar from a lignocellulosic feedstock. It comprises passing a stream of dilute acid serially through a plurality of percolation hydrolysis reactors charged with feedstock, at a flow rate, temperature and pressure sufficient to substantially convert all the cellulose component of the feedstock to glucose. The cooled dilute acid stream containing glucose, after exiting the last percolation hydrolysis reactor, is serially fed through a plurality of pre-hydrolysis percolation reactors, charged with said feedstock, at a flow rate, temperature and pressure sufficient to substantially convert all the hemicellulose component of said feedstock to glucose. The dilute acid stream containing glucose is cooled after it exits the last prehydrolysis reactor.

  11. Reproducibility of morphometric X-ray absorptiometry

    International Nuclear Information System (INIS)

    Culton, N.; Pocock, N.

    1999-01-01

    Morphometric X-ray absorptiometry (MXA) using DXA is potentially a useful clinical tool which may provide additional vertebral fracture information with low radiation exposure. While morphometric analysis is semi-automated, operator intervention is crucial for the accurate positioning of the six data points quantifying the vertebral heights at the anterior, middle and posterior positions. Our study evaluated intra-operator reproducibility of MXA in an elderly patient population and assessed the effect of training and experience on vertebral height precision. Ten patients, with a mean lumbar T score of -2.07, were studied. Images were processed by a trained operator who initially had only limited morphometric experience. The analysis of the data files was repeated at 2 and 6 weeks, during which time the operator had obtained further experience and training. The intra-operator precision of vertebral height measurements was calculated using the three separate combinations of paired analyses, and expressed as the coefficient of variation. This study confirms the importance of adequate training and attention to detail in MXA analysis. The data indicate that the precision of MXA is adequate for its use in the diagnosis of vertebral fractures, based on a 20% deformity criterion. Use of MXA for monitoring would require approximately an 8% change in vertebral heights to achieve statistical significance

  12. NeuronMetrics: software for semi-automated processing of cultured neuron images.

    Science.gov (United States)

    Narro, Martha L; Yang, Fan; Kraft, Robert; Wenk, Carola; Efrat, Alon; Restifo, Linda L

    2007-03-23

    Using primary cell culture to screen for changes in neuronal morphology requires specialized analysis software. We developed NeuronMetrics for semi-automated, quantitative analysis of two-dimensional (2D) images of fluorescently labeled cultured neurons. It skeletonizes the neuron image using two complementary image-processing techniques, capturing fine terminal neurites with high fidelity. An algorithm was devised to span wide gaps in the skeleton. NeuronMetrics uses a novel strategy based on geometric features called faces to extract a branch number estimate from complex arbors with numerous neurite-to-neurite contacts, without creating a precise, contact-free representation of the neurite arbor. It estimates total neurite length, branch number, primary neurite number, territory (the area of the convex polygon bounding the skeleton and cell body), and Polarity Index (a measure of neuronal polarity). These parameters provide fundamental information about the size and shape of neurite arbors, which are critical factors for neuronal function. NeuronMetrics streamlines optional manual tasks such as removing noise, isolating the largest primary neurite, and correcting length for self-fasciculating neurites. Numeric data are output in a single text file, readily imported into other applications for further analysis. Written as modules for ImageJ, NeuronMetrics provides practical analysis tools that are easy to use and support batch processing. Depending on the need for manual intervention, processing time for a batch of approximately 60 2D images is 1.0-2.5 h, from a folder of images to a table of numeric data. NeuronMetrics' output accelerates the quantitative detection of mutations and chemical compounds that alter neurite morphology in vitro, and will contribute to the use of cultured neurons for drug discovery.
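As a toy version of the skeletonization step (the published tool combines two complementary skeletonization techniques and spans gaps in the skeleton), a binary neuron image can be reduced to a one-pixel-wide skeleton whose pixel count roughly approximates total neurite length (diagonal steps are ignored here):

```python
import numpy as np
from skimage.morphology import skeletonize

def neurite_length_estimate(binary_neuron, um_per_pixel=1.0):
    """Skeletonize a binary neuron image and estimate total neurite length
    from the number of skeleton pixels."""
    skeleton = skeletonize(binary_neuron.astype(bool))
    return skeleton.sum() * um_per_pixel

# Hypothetical binary image of a labeled neuron
img = np.zeros((64, 64), dtype=bool)
img[32, 5:60] = True   # one long neurite
img[10:32, 30] = True  # one branch
print(f"~{neurite_length_estimate(img, um_per_pixel=0.5):.0f} um of neurite")
```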

  13. Semi-automated microwave assisted solid-phase peptide synthesis

    DEFF Research Database (Denmark)

    Pedersen, Søren Ljungberg

    with microwaves for SPPS has gained in popularity as it for many syntheses has provided significant improvement in terms of speed, purity, and yields, maybe especially in the synthesis of long and "difficult" peptides. Thus, precise microwave heating has emerged as one new parameter for SPPS, in addition...... to coupling reagents, resins, solvents etc. We have previously reported on microwave heating to promote a range of solid-phase reactions in SPPS. Here we present a new, flexible semi-automated instrument for the application of precise microwave heating in solid-phase synthesis. It combines a slightly modified...... Biotage Initiator microwave instrument, which is available in many laboratories, with a modified semi-automated peptide synthesizer from MultiSynTech. A custom-made reaction vessel is placed permanently in the microwave oven, thus the reactor does not have to be moved between steps. Mixing is achieved...

  14. Semiautomated radioimmunoassay for mass screening of drugs of abuse

    International Nuclear Information System (INIS)

    Sulkowski, T.S.; Lathrop, G.D.; Merritt, J.H.; Landez, J.H.; Noe, E.R.

    1975-01-01

    A rapid, semiautomated radioimmunoassay system for detection of morphine, barbiturates, and amphetamines is described. The assays are applicable to large drug abuse screening programs. The heart of the system is the automatic pipetting station which can accomplish 600 pipetting operations per hour. The method uses 15 to 30 μl for the amphetamine and combined morphine/barbiturate assays. A number of other drugs were tested for interference with the assays and the results are discussed

  15. Use of carbamylated charge standards for testing batches of ampholytes used in two-dimensional elecrophoresis

    Energy Technology Data Exchange (ETDEWEB)

    Tollaksen, S L; Edwards, J J; Anderson, N G

    1981-01-01

    A method of testing batches of ampholytes is presented. By using carbamylated charge standards to co-electrophorese with the protein sample in the first-dimension isoelectric focusing gel, one can monitor, after running and staining the second-dimension sodium dodecyl sulfate (SDS) slab gel, the continuity of the pH gradient. Charge standards can also be used to check the reproducibility of the pH gradient among batches of ampholytes and to modify the new batch with a small amount of a narrow range ampholyte to assure reproducibility of experiments. Ampholytes for comparison were obtained from three major manufacturers. 5 figures.

  16. Preserve specimens for reproducibility

    Czech Academy of Sciences Publication Activity Database

    Krell, F.-T.; Klimeš, Petr; Rocha, L. A.; Fikáček, M.; Miller, S. E.

    2016-01-01

    Roč. 539, č. 7628 (2016), s. 168 ISSN 0028-0836 Institutional support: RVO:60077344 Keywords : reproducibility * specimen * biodiversity Subject RIV: EH - Ecology, Behaviour Impact factor: 40.137, year: 2016 http://www.nature.com/nature/journal/v539/n7628/full/539168b.html

  17. Influence of radiation dose and iterative reconstruction algorithms for measurement accuracy and reproducibility of pulmonary nodule volumetry: A phantom study

    International Nuclear Information System (INIS)

    Kim, Hyungjin; Park, Chang Min; Song, Yong Sub; Lee, Sang Min; Goo, Jin Mo

    2014-01-01

    Purpose: To evaluate the influence of radiation dose settings and reconstruction algorithms on the measurement accuracy and reproducibility of semi-automated pulmonary nodule volumetry. Materials and methods: CT scans were performed on a chest phantom containing various nodules (10 and 12 mm; +100, −630 and −800 HU) at 120 kVp with tube current–time settings of 10, 20, 50, and 100 mAs. Each CT was reconstructed using filtered back projection (FBP), iDose4 and iterative model reconstruction (IMR). Semi-automated volumetry was performed by two radiologists using commercial volumetry software for nodules at each CT dataset. Noise, contrast-to-noise ratio and signal-to-noise ratio of CT images were also obtained. The absolute percentage measurement errors and differences were then calculated for volume and mass. The influence of radiation dose and reconstruction algorithm on measurement accuracy, reproducibility and objective image quality metrics was analyzed using generalized estimating equations. Results: Measurement accuracy and reproducibility of nodule volume and mass were not significantly associated with CT radiation dose settings or reconstruction algorithms (p > 0.05). Objective image quality metrics of CT images were superior in IMR compared with FBP or iDose4 at all radiation dose settings (p < 0.05). Conclusion: Semi-automated nodule volumetry can be applied to low- or ultralow-dose chest CT with usage of a novel iterative reconstruction algorithm without losing measurement accuracy and reproducibility

  18. Influence of radiation dose and iterative reconstruction algorithms for measurement accuracy and reproducibility of pulmonary nodule volumetry: A phantom study

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyungjin, E-mail: khj.snuh@gmail.com [Department of Radiology, Seoul National University College of Medicine, Institute of Radiation Medicine, Seoul National University Medical Research Center, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of); Park, Chang Min, E-mail: cmpark@radiol.snu.ac.kr [Department of Radiology, Seoul National University College of Medicine, Institute of Radiation Medicine, Seoul National University Medical Research Center, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of); Cancer Research Institute, Seoul National University, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of); Song, Yong Sub, E-mail: terasong@gmail.com [Department of Radiology, Seoul National University College of Medicine, Institute of Radiation Medicine, Seoul National University Medical Research Center, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of); Lee, Sang Min, E-mail: sangmin.lee.md@gmail.com [Department of Radiology, Seoul National University College of Medicine, Institute of Radiation Medicine, Seoul National University Medical Research Center, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of); Goo, Jin Mo, E-mail: jmgoo@plaza.snu.ac.kr [Department of Radiology, Seoul National University College of Medicine, Institute of Radiation Medicine, Seoul National University Medical Research Center, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of); Cancer Research Institute, Seoul National University, 101, Daehangno, Jongno-gu, Seoul 110-744 (Korea, Republic of)

    2014-05-15

    Purpose: To evaluate the influence of radiation dose settings and reconstruction algorithms on the measurement accuracy and reproducibility of semi-automated pulmonary nodule volumetry. Materials and methods: CT scans were performed on a chest phantom containing various nodules (10 and 12 mm; +100, −630 and −800 HU) at 120 kVp with tube current–time settings of 10, 20, 50, and 100 mAs. Each CT was reconstructed using filtered back projection (FBP), iDose4 and iterative model reconstruction (IMR). Semi-automated volumetry was performed by two radiologists using commercial volumetry software for nodules at each CT dataset. Noise, contrast-to-noise ratio and signal-to-noise ratio of CT images were also obtained. The absolute percentage measurement errors and differences were then calculated for volume and mass. The influence of radiation dose and reconstruction algorithm on measurement accuracy, reproducibility and objective image quality metrics was analyzed using generalized estimating equations. Results: Measurement accuracy and reproducibility of nodule volume and mass were not significantly associated with CT radiation dose settings or reconstruction algorithms (p > 0.05). Objective image quality metrics of CT images were superior in IMR compared with FBP or iDose4 at all radiation dose settings (p < 0.05). Conclusion: Semi-automated nodule volumetry can be applied to low- or ultralow-dose chest CT with usage of a novel iterative reconstruction algorithm without losing measurement accuracy and reproducibility.

  19. Reproducibility of ultrasonic testing

    International Nuclear Information System (INIS)

    Lecomte, J.-C.; Thomas, Andre; Launay, J.-P.; Martin, Pierre

    The reproducibility of amplitude readings for both artificial and natural reflectors was studied for several combinations of instrument/search unit, all being of the same type. This study shows that in industrial inspection, if a range of standardized equipment is used, a margin of error of about 6 decibels has to be taken into account (confidence interval of 95%). This margin is about 4 to 5 dB for natural or artificial defects located in the central area and about 6 to 7 dB for artificial defects located on the back surface. This lack of reproducibility seems to be attributable first to the search unit and then to the instrument and operator. These results were confirmed by analysis of calibration data obtained from 250 tests performed by 25 operators under shop conditions. The margin of error was higher than the 6 dB obtained in the study

  20. Semi-automated extraction and characterization of Stromal Vascular Fraction using a new medical device.

    Science.gov (United States)

    Hanke, Alexander; Prantl, Lukas; Wenzel, Carina; Nerlich, Michael; Brockhoff, Gero; Loibl, Markus; Gehmert, Sebastian

    2016-01-01

    .1% ±12.0% vs. h: 14.2% ±8.5%; p = 0.07). The semi-automated closed system provides a considerable amount of sterile SVF with high reproducibility. Furthermore, the SVF extracted by both methods showed a similar cell composition which is in accordance with the data from literature. This semi-automated device offers an opportunity to take research and application of the SVF one step further to the clinic.

  1. Kubernetes as a batch scheduler

    OpenAIRE

    Souza, Clenimar; Brito Da Rocha, Ricardo

    2017-01-01

    This project aims at executing a CERN batch use case using Kubernetes, in order to determine the advantages and disadvantages, as well as the functionality that can be replicated or is missing. The reference for the batch system is the CERN Batch System, which uses HTCondor. Another goal of this project is to evaluate the current status of federated resources in Kubernetes, in comparison to the single-cluster API resources. Finally, the last goal of this project is to implement buil...
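For orientation, a batch-style workload can be submitted as a Kubernetes Job through the official Python client along these lines (the namespace, image, and command are placeholders; this is not the project's code):

```python
from kubernetes import client, config

config.load_kube_config()  # assumes a configured kubeconfig

job = client.V1Job(
    api_version="batch/v1",
    kind="Job",
    metadata=client.V1ObjectMeta(name="demo-batch-job"),
    spec=client.V1JobSpec(
        backoff_limit=2,  # retries before the Job is marked failed
        template=client.V1PodTemplateSpec(
            spec=client.V1PodSpec(
                restart_policy="Never",
                containers=[client.V1Container(
                    name="worker",
                    image="python:3.11",
                    command=["python", "-c", "print('batch work done')"],
                )],
            ),
        ),
    ),
)
client.BatchV1Api().create_namespaced_job(namespace="default", body=job)
```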

  2. Batch variation between branchial cell cultures: An analysis of variance

    DEFF Research Database (Denmark)

    Hansen, Heinz Johs. Max; Grosell, M.; Kristensen, L.

    2003-01-01

    We present in detail how a statistical analysis of variance (ANOVA) is used to sort out the effect of an unexpected batch-to-batch variation between cell cultures. Two separate cultures of rainbow trout branchial cells were grown on permeable filter supports ("inserts"). They were supposed...... and introducing the observed difference between batches as one of the factors in an expanded three-dimensional ANOVA, we were able to overcome an otherwise crucial lack of sufficiently reproducible duplicate values. We could thereby show that the effect of changing the apical medium was much more marked when...... the radioactive lipid precursors were added on the apical, rather than on the basolateral, side. The insert cell cultures were obviously polarized. We argue that it is not reasonable to reject troublesome experimental results, when we do not know a priori that something went wrong. The ANOVA is a very useful
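The study's factor structure can be mimicked with any standard ANOVA routine. A sketch with invented numbers, where including batch as an explicit factor absorbs the batch-to-batch variation instead of letting it inflate the residual error:

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Hypothetical data in the spirit of the study above: label incorporation
# for two culture batches and two sides of precursor application.
df = pd.DataFrame({
    "uptake": [5.1, 5.4, 7.9, 8.3, 4.2, 4.6, 6.8, 7.1],
    "batch":  ["A", "A", "A", "A", "B", "B", "B", "B"],
    "side":   ["ap", "ap", "bl", "bl", "ap", "ap", "bl", "bl"],
})
model = smf.ols("uptake ~ C(batch) + C(side)", data=df).fit()
print(anova_lm(model))  # batch and side effects, residual variance
```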

  3. Reducing the cost of semi-automated in-gel tryptic digestion and GeLC sample preparation for high-throughput proteomics.

    Science.gov (United States)

    Ruelcke, Jayde E; Loo, Dorothy; Hill, Michelle M

    2016-10-21

    Peptide generation by trypsin digestion is typically the first step in mass spectrometry-based proteomics experiments, including 'bottom-up' discovery and targeted proteomics using multiple reaction monitoring. Manual tryptic digest and the subsequent clean-up steps can add variability even before the sample reaches the analytical platform. While specialized filter plates and tips have been designed for automated sample processing, the specialty reagents required may not be accessible or feasible due to their high cost. Here, we report a lower-cost semi-automated protocol for in-gel digestion and GeLC using standard 96-well microplates. Further cost savings were realized by re-using reagent tips with optimized sample ordering. To evaluate the methodology, we compared a simple mixture of 7 proteins and a complex cell-lysate sample. The results across three replicates showed that our semi-automated protocol had performance equal to or better than a manual in-gel digestion with respect to replicate variability and level of contamination. In this paper, we also provide the Agilent Bravo method file, which can be adapted to other liquid handlers. The simplicity, reproducibility, and cost-effectiveness of our semi-automated protocol make it ideal for routine in-gel and GeLC sample preparations, as well as high throughput processing of large clinical sample cohorts. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Magnet stability and reproducibility

    CERN Document Server

    Marks, N

    2010-01-01

    Magnet stability and reproducibility have become increasingly important as greater precision and beams with smaller dimensions are required for research, medical and other purposes. The observed causes of mechanical and electrical instability are introduced and the engineering arrangements needed to minimize these problems discussed; the resulting performance of a state-of-the-art synchrotron source (Diamond) is then presented. The need for orbit feedback to obtain the best possible beam stability is briefly introduced, omitting any details of the necessary technical equipment, which is outside the scope of the presentation.

  5. Data-driven batch scheduling

    Energy Technology Data Exchange (ETDEWEB)

    Bent, John [Los Alamos National Laboratory]; Denehy, Tim [GOOGLE]; Arpaci-Dusseau, Remzi [UNIV OF WISCONSIN]; Livny, Miron [UNIV OF WISCONSIN]; Arpaci-Dusseau, Andrea C [NON LANL]

    2009-01-01

    In this paper, we develop data-driven strategies for batch computing schedulers. Current CPU-centric batch schedulers ignore the data needs within workloads and execute them by linking them transparently and directly to their needed data. When scheduled on remote computational resources, this elegant solution of direct data access can incur an order of magnitude performance penalty for data-intensive workloads. Adding data-awareness to batch schedulers allows a careful coordination of data and CPU allocation thereby reducing the cost of remote execution. We offer here new techniques by which batch schedulers can become data-driven. Such systems can use our analytical predictive models to select one of the four data-driven scheduling policies that we have created. Through simulation, we demonstrate the accuracy of our predictive models and show how they can reduce time to completion for some workloads by as much as 80%.

  6. Process Analytical Technology (PAT): batch-to-batch reproducibility of fermentation processes by robust process operational design and control.

    Science.gov (United States)

    Gnoth, S; Jenzsch, M; Simutis, R; Lübbert, A

    2007-10-31

    The Process Analytical Technology (PAT) initiative of the FDA is a reaction to the increasing discrepancy between the current possibilities in process supervision and control of pharmaceutical production processes and their current application in industrial manufacturing. With rigid approval practices based on standard operational procedures, adaptations of production reactors towards the state of the art were more or less inhibited for many years. Now PAT paves the way for continuous process and product improvements through improved process supervision based on knowledge-based data analysis, "Quality-by-Design" concepts, and, finally, feedback control. Examples of up-to-date implementations of this concept are presented. They are taken from one key group of processes in recombinant pharmaceutical protein manufacturing, the cultivation of genetically modified Escherichia coli bacteria.

  7. Literature classification for semi-automated updating of biological knowledgebases

    DEFF Research Database (Denmark)

    Olsen, Lars Rønn; Kudahl, Ulrich Johan; Winther, Ole

    2013-01-01

    ...types of biological data, such as sequence data, are extensively stored in biological databases, functional annotations, such as immunological epitopes, are found primarily in semi-structured formats or free text embedded in primary scientific literature. Results: We defined and applied a machine... abstracts yielded classification accuracy of 0.95, thus showing significant value in support of data extraction from the literature. Conclusion: We here propose a conceptual framework for semi-automated extraction of epitope data embedded in scientific literature using principles from text mining...
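A minimal text-classification baseline in the spirit of the abstract (this is not the authors' model; the toy abstracts and labels below are invented):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled abstracts: 1 = contains epitope data, 0 = does not
abstracts = [
    "Mapping of linear B-cell epitopes of the spike protein ...",
    "A review of vaccination policy in three countries ...",
    "T-cell epitope prediction validated by ELISPOT assays ...",
    "Hospital staffing levels and patient outcomes ...",
]
labels = [1, 0, 1, 0]

classifier = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                           LogisticRegression())
classifier.fit(abstracts, labels)
print(classifier.predict(["Novel epitopes identified by peptide scanning"]))
```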

  8. A semi-automated method for measuring thickness and white matter ...

    African Journals Online (AJOL)

    A semi-automated method for measuring thickness and white matter integrity of the corpus callosum. ... and interhemispheric differences. Future research will determine normal values for age and compare CC thickness with peripheral white matter volume loss in large groups of patients, using the semiautomated technique.

  9. Development and Evaluation of a Semi-automated Segmentation Tool and a Modified Ellipsoid Formula for Volumetric Analysis of the Kidney in Non-contrast T2-Weighted MR Images.

    Science.gov (United States)

    Seuss, Hannes; Janka, Rolf; Prümmer, Marcus; Cavallaro, Alexander; Hammon, Rebecca; Theis, Ragnar; Sandmair, Martin; Amann, Kerstin; Bäuerle, Tobias; Uder, Michael; Hammon, Matthias

    2017-04-01

    Volumetric analysis of the kidney parenchyma provides additional information for the detection and monitoring of various renal diseases. The purposes of the study were therefore to develop and evaluate a semi-automated segmentation tool and a modified ellipsoid formula for volumetric analysis of the kidney in non-contrast T2-weighted magnetic resonance (MR) images. Three readers performed semi-automated segmentation of the total kidney volume (TKV) in axial, non-contrast-enhanced T2-weighted MR images of 24 healthy volunteers (48 kidneys) twice. A semi-automated threshold-based segmentation tool was developed to segment the kidney parenchyma. Furthermore, the three readers measured renal dimensions (length, width, depth) and applied different formulas to calculate the TKV. Manual segmentation served as a reference volume. Volumes obtained with the different methods were compared and the time required was recorded. There was no significant difference between the semi-automatically and manually segmented TKV (p = 0.31). The difference in mean volumes was 0.3 ml (95% confidence interval (CI), -10.1 to 10.7 ml). Semi-automated segmentation was significantly faster than manual segmentation, with a mean difference of 188 s (220 vs. 408 s). Semi-automated segmentation of T2-weighted MR data delivers accurate and reproducible results and was significantly faster than manual segmentation. Applying a modified ellipsoid formula quickly provides an accurate kidney volume.
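The conventional ellipsoid approximation behind such formulas is V = π/6 · L · W · D. The study's modified coefficient is not given in the abstract, so π/6 is shown here as the default (the example dimensions are hypothetical):

```python
import math

def ellipsoid_kidney_volume(length_cm, width_cm, depth_cm,
                            factor=math.pi / 6):
    """Ellipsoid approximation of kidney volume from three axis lengths.
    The study above fits a *modified* factor to MR data; pi/6 is the
    conventional default."""
    return factor * length_cm * width_cm * depth_cm

print(f"{ellipsoid_kidney_volume(11.0, 5.0, 4.5):.0f} mL")  # ~130 mL
```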

  10. BatchJS: Implementing Batches in JavaScript

    NARCIS (Netherlands)

    D. Kasemier

    2014-01-01

    None of our popular programming languages knows how to handle distribution well. Yet our programs interact more and more with each other, and our data resides in databases and web services. Batches are a new addition to languages that can finally bring native support for distribution to

  11. Simulated Batch Production of Penicillin

    Science.gov (United States)

    Whitaker, A.; Walker, J. D.

    1973-01-01

    Describes a program in applied biology in which the simulation of the production of penicillin in a batch fermentor is used as a teaching technique to give students experience before handling a genuine industrial fermentation process. Details are given for the calculation of minimum production cost. (JR)

  12. NDA BATCH 2002-02

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence Livermore National Laboratory

    2009-12-09

    QC sample results (daily background checks, 20-gram and 100-gram SGS drum checks) were within acceptable criteria established by WIPP's Quality Assurance Objectives for TRU Waste Characterization. Replicate runs were performed on 5 drums with IDs LL85101099TRU, LL85801147TRU, LL85801109TRU, LL85300999TRU and LL85500979TRU. All replicate measurement results are identical at the 95% confidence level as established by WIPP criteria. Note that the batch covered 5 weeks of SGS measurements from 23-Jan-2002 through 22-Feb-2002. Data packet for SGS Batch 2002-02 generated using gamma spectroscopy with the Pu Facility SGS unit is technically reasonable. All QC samples are in compliance with established control limits. The batch data packet has been reviewed for correctness, completeness, consistency and compliance with WIPP's Quality Assurance Objectives and determined to be acceptable. An Expert Review was performed on the data packet between 28-Feb-02 and 09-Jul-02 to check for potential U-235, Np-237 and Am-241 interferences and address drum cases where specific scan segments showed Se gamma ray transmissions for the 136-keV gamma to be below 0.1 %. Two drums in the batch showed Pu-238 at a relative mass ratio more than 2% of all the Pu isotopes.

  13. Batching System for Superior Service

    Science.gov (United States)

    2001-01-01

    Veridian's Portable Batch System (PBS) was the recipient of the 1997 NASA Space Act Award for outstanding software. A batch system is a set of processes for managing queues and jobs. Without a batch system, it is difficult to manage the workload of a computer system. By bundling the enterprise's computing resources, the PBS technology offers users a single coherent interface, resulting in efficient management of the batch services. Users choose which information to package into "containers" for system-wide use. PBS also provides detailed system usage data, a procedure not easily executed without this software. PBS operates on networked, multi-platform UNIX environments. Veridian's new version, PBS Pro™, has additional features and enhancements, including support for additional operating systems. Veridian distributes the original version of PBS as Open Source software via the PBS website. Customers can register and download the software at no cost. PBS Pro is also available via the web and offers additional features such as increased stability, reliability, and fault tolerance. A company using PBS can expect a significant increase in the effective management of its computing resources. Tangible benefits include increased utilization of costly resources and enhanced understanding of computational requirements and user needs.

  14. Results on testing pilot industrial batch of SC magnets for the UNK

    International Nuclear Information System (INIS)

    Ageev, A.I.; Andreev, N.I.; Balbekov, V.I.; Chirkov, P.N.; Dolzhenkov, V.I.; Gertsev, K.F.; Gridasov, V.I.; Myznikov, K.P.; Smirnov, N.L.; Sychev, V.A.

    1992-01-01

    IHEP has developed and studied the superconducting dipoles and quadrupoles of the regular part of the UNK main ring, which satisfy the requirements imposed on them. The pilot-industrial batch of the UNK SC magnets has now been produced. With this batch, the reproducibility of the magnet characteristics is studied and the mass production technology is optimized. The results of the cryogenic tests and the magnetic field measurements for the UNK SC dipoles of the pilot-industrial batch are presented. (author) 5 refs.; 6 figs.; 1 tab

  15. NGBAuth - Next Generation Batch Authentication for long running batch jobs.

    CERN Document Server

    Juto, Zakarias

    2015-01-01

    This document describes the prototyping of a new solution for the CERN batch authentication of long running jobs. While the job submission requires valid user credentials, these have to be renewed due to long queuing and execution times. Described within is a new system which will guarantee a similar level of security as the old LSFAuth while simplifying the implementation and the overall architecture. The new system is being built on solid, streamlined and tested components (notably OpenSSL) and a priority has been to make it more generic in order to facilitate the evolution of the current system such as for the expected migration from LSF to Condor as backend batch system.

  16. Reproducibility in a multiprocessor system

    Science.gov (United States)

    Bellofatto, Ralph A; Chen, Dong; Coteus, Paul W; Eisley, Noel A; Gara, Alan; Gooding, Thomas M; Haring, Rudolf A; Heidelberger, Philip; Kopcsay, Gerard V; Liebsch, Thomas A; Ohmacht, Martin; Reed, Don D; Senger, Robert M; Steinmacher-Burow, Burkhard; Sugawara, Yutaka

    2013-11-26

    Fixing a problem is usually greatly aided if the problem is reproducible. To ensure reproducibility of a multiprocessor system, the following aspects are proposed: a deterministic system start state, a single system clock, phase alignment of clocks in the system, system-wide synchronization events, reproducible execution of system components, deterministic chip interfaces, zero-impact communication with the system, precise stop of the system, and a scan of the system state.

  17. PCR evaluation : considering transition from manual to semi-automated pavement distress collection and analysis.

    Science.gov (United States)

    2013-07-01

    This study is designed to assist the Ohio Department of Transportation (ODOT) in determining whether transitioning from manual to state-of-the-practice semi-automated pavement distress data collection is feasible and recommended. Statistical and ...

  18. Assessment of a semiautomated pelvic floor measurement model for evaluating pelvic organ prolapse on MRI.

    Science.gov (United States)

    Onal, S; Lai-Yuen, S; Bao, P; Weitzenfeld, A; Greene, K; Kedar, R; Hart, S

    2014-06-01

    The objective of this study was to assess the performance of a semiautomated pelvic floor measurement algorithmic model on dynamic magnetic resonance imaging (MRI) images compared with manual pelvic floor measurements for pelvic organ prolapse (POP) evaluation. We examined 15 MRIs along the midsagittal view. Five reference points used for pelvic floor measurements were identified both manually and using our semiautomated measurement model. The two processes were compared in terms of accuracy and precision. The semiautomated pelvic floor measurement model provided highly consistent and accurate locations for all reference points on MRI. Results also showed that the model can identify the reference points faster than the manual-point identification process. The semiautomated pelvic floor measurement model can be used to facilitate and improve the process of pelvic floor measurements on MRI. This will enable high throughput analysis of MRI data to improve the correlation analysis with clinical outcomes and potentially improve POP assessment.

  19. Intelligent, Semi-Automated Procedure Aid (ISAPA) for ISS Flight Control, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop the Intelligent, Semi-Automated Procedure Aid (ISAPA) intended for use by International Space Station (ISS) ground controllers to increase the...

  20. Intra- and interoperator variability of lobar pulmonary volumes and emphysema scores in patients with chronic obstructive pulmonary disease and emphysema: comparison of manual and semi-automated segmentation techniques.

    Science.gov (United States)

    Molinari, Francesco; Pirronti, Tommaso; Sverzellati, Nicola; Diciotti, Stefano; Amato, Michele; Paolantonio, Guglielmo; Gentile, Luigia; Parapatt, George K; D'Argento, Francesco; Kuhnigk, Jan-Martin

    2013-01-01

    We aimed to compare the intra- and interoperator variability of lobar volumetry and emphysema scores obtained by semi-automated and manual segmentation techniques in lung emphysema patients. In two sessions held three months apart, two operators performed lobar volumetry of unenhanced chest computed tomography examinations of 47 consecutive patients with chronic obstructive pulmonary disease and lung emphysema. Both operators used the manual and semi-automated segmentation techniques. The intra- and interoperator variability of the volumes and emphysema scores obtained by semi-automated segmentation was compared with the variability obtained by manual segmentation of the five pulmonary lobes. The intra- and interoperator variability of the lobar volumes decreased when using semi-automated lobe segmentation (coefficients of repeatability for the first operator: right upper lobe, 147 vs. 96.3; right middle lobe, 137.7 vs. 73.4; right lower lobe, 89.2 vs. 42.4; left upper lobe, 262.2 vs. 54.8; and left lower lobe, 260.5 vs. 56.5; coefficients of repeatability for the second operator: right upper lobe, 61.4 vs. 48.1; right middle lobe, 56 vs. 46.4; right lower lobe, 26.9 vs. 16.7; left upper lobe, 61.4 vs. 27; and left lower lobe, 63.6 vs. 27.5; coefficients of reproducibility in the interoperator analysis: right upper lobe, 191.3 vs. 102.9; right middle lobe, 219.8 vs. 126.5; right lower lobe, 122.6 vs. 90.1; left upper lobe, 166.9 vs. 68.7; and left lower lobe, 168.7 vs. 71.6). The coefficients of repeatability and reproducibility of emphysema scores also decreased when using semi-automated segmentation and had ranges that varied depending on the target lobe and selected threshold of emphysema. Semi-automated segmentation reduces the intra- and interoperator variability of lobar volumetry and provides a more objective tool than manual technique for quantifying lung volumes and severity of emphysema.

  1. PROOF on a Batch System

    International Nuclear Information System (INIS)

    Behrenhoff, W; Ehrenfeld, W; Samson, J; Stadie, H

    2011-01-01

    The 'parallel ROOT facility' (PROOF) from the ROOT framework provides a mechanism to distribute the load of interactive and non-interactive ROOT sessions on a set of worker nodes, optimising the overall execution time. While PROOF is designed to work on a dedicated PROOF cluster, the benefits of PROOF can also be used on top of another batch scheduling system with the help of temporary per-user PROOF clusters. We will present a lightweight tool which starts a temporary PROOF cluster on an SGE-based batch cluster or, via a plugin mechanism, e.g. on a set of bare desktops via ssh. Further, we will present the results of benchmarks which compare the data throughput for different data storage back ends available at the German National Analysis Facility (NAF) at DESY.

  2. Comparison of semiautomated bird song recognition with manual detection of recorded bird song samples

    OpenAIRE

    Lisa A. Venier; Marc J. Mazerolle; Anna Rodgers; Ken A. McIlwrick; Stephen Holmes; Dean Thompson

    2017-01-01

    Automated recording units are increasingly being used to sample wildlife populations. These devices can produce large amounts of data that are difficult to process manually. However, the information in the recordings can be summarized with semiautomated sound recognition software. Our objective was to assess the utility of the semiautomated bird song recognizers to produce data useful for conservation and sustainable forest management applications. We compared detection data generated from ex...

  3. Characterization and properties of batch-processed melt-textured YBCO

    Energy Technology Data Exchange (ETDEWEB)

    Kaiser, A.W.; Adam, M.; Bornemann, H.J. [INFP, Forschungszentrum Karlsruhe GmbH, PO Box 3640, 76021 Karlsruhe (Germany)

    1998-01-01

    High-temperature superconductor bulk parts are batch-processed using a semi-automated processing technique based on the melt-texturation process. Levitation properties under static and dynamic load levels were analysed using a test bench with a three-dimensional force sensor unit. Measurements of levitation force alone give no detailed information on texture, secondary domains or cracks. Therefore, other measurements to control the homogeneity of the bulk were performed. Texture on full-size pellets (FWHM < 5°, Δχ < 3°) was verified by elastic neutron scattering. To study the influence of local texture on properties, a pellet was divided into nine segments. Preliminary results indicate that a correlation between neutron data and levitation force requires further investigation. Flux maps of samples magnetized by permanent magnets or by a pulsed magnetization unit were used to verify the homogeneity and domain structure of the material and to evaluate macroscopic critical currents. (author)

  4. Semiautomated analysis of small-animal PET data.

    Science.gov (United States)

    Kesner, Adam L; Dahlbom, Magnus; Huang, Sung-Cheng; Hsueh, Wei-Ann; Pio, Betty S; Czernin, Johannes; Kreissl, Michael; Wu, Hsiao-Ming; Silverman, Daniel H S

    2006-07-01

    The objective of the work reported here was to develop and test automated methods to calculate biodistribution of PET tracers using small-animal PET images. After developing software that uses visually distinguishable organs and other landmarks on a scan to semiautomatically coregister a digital mouse phantom with a small-animal PET scan, we elastically transformed the phantom to conform to those landmarks in 9 simulated scans and in 18 actual PET scans acquired from 9 mice. Tracer concentrations were automatically calculated in 22 regions of interest (ROIs) reflecting the whole body and 21 individual organs. To assess the accuracy of this approach, we compared the software-measured activities in the ROIs of simulated PET scans with the known activities, and we compared the software-measured activities in the ROIs of real PET scans both with manually established ROI activities in original scan data and with actual radioactivity content in immediately harvested tissues of imaged animals. PET/atlas coregistrations were successfully generated with minimal end-user input, allowing rapid quantification of 22 separate tissue ROIs. The simulated scan analysis found the method to be robust with respect to the overall size and shape of individual animal scans, with average activity values for all organs tested falling within the range of 98% ± 3% of the organ activity measured in the unstretched phantom scan. Standardized uptake values (SUVs) measured from actual PET scans using this semiautomated method correlated reasonably well with radioactivity content measured in harvested organs (median r = 0.94) and compared favorably with conventional SUV correlations with harvested organ data (median r = 0.825). A semiautomated analytic approach involving coregistration of scan-derived images with atlas-type images can be used in small-animal whole-body radiotracer studies to estimate radioactivity concentrations in organs. This approach is rapid and less labor intensive than are
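
    The SUVs referred to above follow the usual body-weight normalisation; a minimal sketch of that computation, with invented example numbers (and assuming a tissue density near 1 g/mL so the units cancel):

      def suv(tissue_kbq_per_ml, injected_dose_mbq, body_weight_g):
          # SUV = tissue activity concentration / (injected dose / body weight)
          injected_kbq = injected_dose_mbq * 1000.0
          return tissue_kbq_per_ml / (injected_kbq / body_weight_g)

      # e.g. 45 kBq/mL in an ROI, 7.4 MBq injected, 25 g mouse -> ~0.15
      print(suv(45.0, 7.4, 25.0))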

  5. Influence of radiation dose and iterative reconstruction algorithms for measurement accuracy and reproducibility of pulmonary nodule volumetry: A phantom study.

    Science.gov (United States)

    Kim, Hyungjin; Park, Chang Min; Song, Yong Sub; Lee, Sang Min; Goo, Jin Mo

    2014-05-01

    To evaluate the influence of radiation dose settings and reconstruction algorithms on the measurement accuracy and reproducibility of semi-automated pulmonary nodule volumetry. CT scans were performed on a chest phantom containing various nodules (10 and 12 mm; +100, -630 and -800 HU) at 120 kVp with tube current-time settings of 10, 20, 50, and 100 mAs. Each CT was reconstructed using filtered back projection (FBP), iDose(4) and iterative model reconstruction (IMR). Semi-automated volumetry was performed by two radiologists using commercial volumetry software for nodules at each CT dataset. Noise, contrast-to-noise ratio and signal-to-noise ratio of CT images were also obtained. The absolute percentage measurement errors and differences were then calculated for volume and mass. The influence of radiation dose and reconstruction algorithm on measurement accuracy, reproducibility and objective image quality metrics was analyzed using generalized estimating equations. Measurement accuracy and reproducibility of nodule volume and mass were not significantly associated with CT radiation dose settings or reconstruction algorithms (p > 0.05). Objective image quality metrics of CT images were superior for IMR compared with FBP or iDose(4) at all radiation dose settings (p < 0.05). Semi-automated volumetry can be applied to low- or ultralow-dose chest CT with a novel iterative reconstruction algorithm without losing measurement accuracy and reproducibility. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
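
    As a sketch of two of the metrics named above under their usual definitions (the study's exact ROI protocol is not given here; the sample values are invented):

      import numpy as np

      def absolute_percentage_error(measured, reference):
          # absolute percentage measurement error for volume or mass
          return abs(measured - reference) / reference * 100.0

      def contrast_to_noise_ratio(roi_nodule, roi_background):
          # CNR from pixel samples of nodule and background ROIs,
          # with noise taken as the background standard deviation
          nodule = np.asarray(roi_nodule, float)
          bg = np.asarray(roi_background, float)
          return (nodule.mean() - bg.mean()) / bg.std(ddof=1)

      # a 10-mm sphere has a volume of ~523.6 mm^3
      print(absolute_percentage_error(512.0, 523.6))
      rng = np.random.default_rng(0)
      print(contrast_to_noise_ratio(rng.normal(100, 15, 500),
                                    rng.normal(-800, 15, 500)))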

  6. Contextual sensitivity in scientific reproducibility.

    Science.gov (United States)

    Van Bavel, Jay J; Mende-Siedlecki, Peter; Brady, William J; Reinero, Diego A

    2016-06-07

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher's degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed "hidden moderators") between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility.

  7. Labelling of 90Y- and 177Lu-DOTA-Bioconjugates for Targeted Radionuclide Therapy: A Comparison among Manual, Semiautomated, and Fully Automated Synthesis

    Directory of Open Access Journals (Sweden)

    Michele Iori

    2017-01-01

    Full Text Available In spite of the hazard due to radiation exposure, preparation of 90Y- and 177Lu-labelled radiopharmaceuticals is still mainly performed using manual procedures. In the present study the performance of a commercial automatic synthesizer based on disposable cassettes for the labelling of 177Lu- and 90Y-DOTA-conjugated biomolecules (namely, DOTATOC and PSMA-617) was evaluated and compared to a manual and a semiautomated approach. The dose exposure of the operators was evaluated as well. More than 300 clinical preparations of both 90Y- and 177Lu-labelled radiopharmaceuticals have been performed using the three different methods. The mean radiochemical yields for 90Y-DOTATOC were 96.2 ± 4.9%, 90.3 ± 5.6%, and 82.0 ± 8.4%, while for 177Lu-DOTATOC they were 98.3 ± 0.6%, 90.8 ± 8.3%, and 83.1 ± 5.7% when manual, semiautomated, and automated approaches were used, respectively. The mean whole-hand doses for yttrium-90 preparations were 0.15 ± 0.4 mSv/GBq, 0.04 ± 0.1 mSv/GBq, and 0.11 ± 0.3 mSv/GBq for manual, semiautomated, and automated synthesis, respectively; for lutetium-177 preparations, they were 0.02 ± 0.008 mSv/GBq, 0.01 ± 0.03 mSv/GBq, and 0.01 ± 0.02 mSv/GBq, respectively. In conclusion, the automated approach guaranteed reliable and reproducible preparations of pharmaceutical-grade therapeutic radiopharmaceuticals with an acceptable radiochemical yield. The radiation exposure of the operators remained comparable to that of the manual approach, mainly because dedicated shielding was not yet available for the system.

  8. A semiautomated computer-interactive dynamic impact testing system

    International Nuclear Information System (INIS)

    Alexander, D.J.; Nanstad, R.K.; Corwin, W.R.; Hutton, J.T.

    1989-01-01

    A computer-assisted semiautomated system has been developed for testing a variety of specimen types under dynamic impact conditions. The primary use of this system is for the testing of Charpy specimens. Full-, half-, and third-size specimens have been tested, both in the lab and remotely in a hot cell for irradiated specimens. Specimens are loaded into a transfer device which moves the specimen into a chamber, where a hot air gun is used to heat the specimen, or cold nitrogen gas is used for cooling, as required. The specimen is then quickly transferred from the furnace to the anvils and then broken. This system incorporates an instrumented tup to determine the change in voltage during the fracture process. These data are analyzed by the computer system after the test is complete. The voltage-time trace is recorded with a digital oscilloscope, transferred to the computer, and analyzed. The analysis program incorporates several unique features. It interacts with the operator and identifies the maximum voltage during the test, the amount of rapid fracture during the test (if any), and the end of the fracture process. The program then calculates the area to maximum voltage and the total area under the voltage-time curve. The data acquisition and analysis part of the system can also be used to conduct other dynamic testing. Dynamic tear and precracked specimens can be tested with an instrumented tup and analyzed in a similar manner. 3 refs., 7 figs
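
    The trace reduction described above (peak voltage, area to the peak, total area under the voltage-time curve) amounts to numerical integration of the digitised record; a minimal sketch with a synthetic trace (not the system's actual code):

      import numpy as np

      def trapezoid(y, x):
          # trapezoidal rule, written out for NumPy-version independence
          y = np.asarray(y, float)
          x = np.asarray(x, float)
          return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

      def analyze_trace(t, v):
          # peak voltage, area to the peak, and total area under the curve
          i = int(np.argmax(v))
          return v[i], trapezoid(v[:i + 1], t[:i + 1]), trapezoid(v, t)

      # toy trace: linear rise to a peak at 1 ms, then exponential tail
      t = np.linspace(0.0, 2e-3, 400)
      v = np.where(t < 1e-3, t / 1e-3, np.exp(-(t - 1e-3) / 2e-4))
      print(analyze_trace(t, v))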

  9. Semi-automated software service integration in virtual organisations

    Science.gov (United States)

    Afsarmanesh, Hamideh; Sargolzaei, Mahdi; Shadi, Mahdieh

    2015-08-01

    To enhance their business opportunities, organisations involved in many service industries are increasingly active in pursuit of both online provision of their business services (BSs) and collaboration with others. Collaborative Networks (CNs) in the service industry sector, however, face many challenges related to sharing and integration of their collection of provided BSs and their corresponding software services. Therefore, the topic of service interoperability, for which this article introduces a framework, is gaining momentum in research for supporting CNs. It contributes to the generation of formal, machine-readable specifications for business processes, aimed at providing their unambiguous definitions, as needed for developing their equivalent software services. The framework provides a model and implementation architecture for discovery and composition of shared services, to support the semi-automated development of integrated value-added services. In support of service discovery, a main contribution of this research is the formal representation of services' behaviour and the application of the desired service behaviour specified by users for automated matchmaking with other existing services. Furthermore, to support service integration, mechanisms are developed for automated selection of the most suitable service(s) according to a number of service quality aspects. Two scenario cases are presented, which exemplify several specific features related to service discovery and service integration aspects.

  10. Semi-Automated Discovery of Application Session Structure

    Energy Technology Data Exchange (ETDEWEB)

    Kannan, J.; Jung, J.; Paxson, V.; Koksal, C.

    2006-09-07

    While the problem of analyzing network traffic at the granularity of individual connections has seen considerable previous work and tool development, understanding traffic at a higher level---the structure of user-initiated sessions comprised of groups of related connections---remains much less explored. Some types of session structure, such as the coupling between an FTP control connection and the data connections it spawns, have prespecified forms, though the specifications do not guarantee how the forms appear in practice. Other types of sessions, such as a user reading email with a browser, only manifest empirically. Still other sessions might exist without us even knowing of their presence, such as a botnet zombie receiving instructions from its master and proceeding in turn to carry them out. We present algorithms rooted in the statistics of Poisson processes that can mine a large corpus of network connection logs to extract the apparent structure of application sessions embedded in the connections. Our methods are semi-automated in that we aim to present an analyst with high-quality information (expressed as regular expressions) reflecting different possible abstractions of an application's session structure. We develop and test our methods using traces from a large Internet site, finding diversity in the number of applications that manifest, their different session structures, and the presence of abnormal behavior. Our work has applications to traffic characterization and monitoring, source models for synthesizing network traffic, and anomaly detection.
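
    One simple abstraction consistent with the Poisson-process framing, offered purely as an illustration of the idea (the paper's actual algorithms and regular-expression output are considerably richer): start a new session whenever the gap since a host's previous connection exceeds a threshold.

      from itertools import groupby

      def sessions(events, gap):
          # events: (host, timestamp) pairs; a new session starts when the
          # inter-arrival gap for that host exceeds `gap` seconds
          out = []
          for host, recs in groupby(sorted(events), key=lambda e: e[0]):
              last = None
              for _, t in recs:
                  if last is None or t - last > gap:
                      out.append([(host, t)])   # open a new session
                  else:
                      out[-1].append((host, t))
                  last = t
          return out

      log = [("10.0.0.5", 1.0), ("10.0.0.5", 1.8), ("10.0.0.5", 120.0),
             ("10.0.0.9", 3.0)]
      print(sessions(log, gap=30.0))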

  11. Diffuse optical tomography using semiautomated coregistered ultrasound measurements

    Science.gov (United States)

    Mostafa, Atahar; Vavadi, Hamed; Uddin, K. M. Shihab; Zhu, Quing

    2017-12-01

    Diffuse optical tomography (DOT) has demonstrated huge potential in breast cancer diagnosis and treatment monitoring. DOT image reconstruction guided by ultrasound (US) improves the diffused light localization and lesion reconstruction accuracy. However, DOT reconstruction depends on tumor geometry provided by coregistered US. Experienced operators can manually measure these lesion parameters; however, training and measurement time are needed. The wide clinical use of this technique depends on its robustness and faster imaging reconstruction capability. This article introduces a semiautomated procedure that automatically extracts lesion information from US images and incorporates it into the optical reconstruction. An adaptive threshold-based image segmentation is used to obtain tumor boundaries. For some US images, posterior shadow can extend to the chest wall and make the detection of deeper lesion boundary difficult. This problem can be solved using a Hough transform. The proposed procedure was validated from data of 20 patients. Optical reconstruction results using the proposed procedure were compared with those reconstructed using extracted tumor information from an experienced user. Mean optical absorption obtained from manual measurement was 0.21 ± 0.06 cm⁻¹ for malignant and 0.12 ± 0.06 cm⁻¹ for benign cases, whereas for the proposed method it was 0.24 ± 0.08 cm⁻¹ and 0.12 ± 0.05 cm⁻¹, respectively.
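
    A rough sketch of the two image-processing steps named above, using OpenCV (the file name, threshold parameters and Hough settings are placeholders, and the paper's own adaptive threshold is replaced here by OpenCV's built-in mean variant):

      import cv2
      import numpy as np

      # hypothetical file: any 8-bit grayscale B-mode ultrasound image
      img = cv2.imread("us_bmode.png", cv2.IMREAD_GRAYSCALE)

      # adaptive threshold to pull the hypoechoic lesion out of speckle
      smoothed = cv2.GaussianBlur(img, (9, 9), 0)
      mask = cv2.adaptiveThreshold(smoothed, 255,
                                   cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 51, 5)

      # Hough transform to recover a roughly straight chest-wall line
      # when posterior shadowing hides the deeper lesion boundary
      edges = cv2.Canny(smoothed, 30, 90)
      lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=80,
                              minLineLength=img.shape[1] // 2, maxLineGap=20)
      if lines is not None:
          x1, y1, x2, y2 = lines[0][0]
          print("chest wall candidate:", (x1, y1), (x2, y2))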

  12. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori L Ito

    2017-05-01

    Full Text Available Quantifying lesions in a reliable manner is fundamental for studying the effects of neuroanatomical changes related to recovery in the post-stroke brain. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This often makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. Thus, we developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.557114) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space. Here, we describe the methods implemented in the toolbox.

  13. Semi-automated Robust Quantification of Lesions (SRQL Toolbox

    Directory of Open Access Journals (Sweden)

    Kaori Ito

    2017-02-01

    Full Text Available Quantifying lesions in a robust manner is fundamental for studying the effects of neuroanatomical changes in the post-stroke brain on recovery. However, the wide variability in lesion characteristics across individuals makes manual lesion segmentation a challenging and often subjective process. This makes it difficult to combine stroke lesion data across multiple research sites, due to subjective differences in how lesions may be defined. We developed the Semi-automated Robust Quantification of Lesions (SRQL; https://github.com/npnl/SRQL; DOI: 10.5281/zenodo.267213) Toolbox that performs several analysis steps: (1) a white matter intensity correction that removes healthy white matter voxels from the lesion mask, thereby making lesions slightly more robust to subjective errors; (2) an automated report of descriptive statistics on lesions for simplified comparison between or across groups; and (3) an option to perform analyses in both native and standard space to facilitate analyses in either space, or comparisons between spaces. Here, we describe the methods implemented in the toolbox and demonstrate the outputs of the SRQL toolbox.

  14. Suitability of semi-automated tumor response assessment of liver metastases using a dedicated software package

    International Nuclear Information System (INIS)

    Kalkmann, Janine; Ladd, S.C.; Greiff, A. de; Forsting, M.; Stattaus, J.

    2010-01-01

    Purpose: to evaluate the suitability of semi-automated compared to manual tumor response assessment (TRA) of liver metastases. Materials and methods: in total, 32 patients with colorectal cancer and liver metastases were followed by an average of 2.8 contrast-enhanced CT scans. Two observers (O1, O2) measured the longest diameter (LD) of 269 liver metastases manually and semi-automatically using software installed as a thin client on a PACS workstation (LMS-Liver, MEDIAN Technologies). LD and TRA ("progressive", "stable", "partial remission") were performed according to RECIST (Response Evaluation Criteria in Solid Tumors) and analyzed for between-method, interobserver and intraobserver variability. The time needed for evaluation was compared for both methods. Results: all measurements correlated excellently (r ≥ 0.96). Intraobserver (semi-automated), interobserver (manual) and between-method differences (by O1) in LD of 1.4 ± 2.6 mm, 1.9 ± 1.9 mm and 2.1 ± 2.0 mm, respectively, were not significant. Interobserver (semi-automated) and between-method (by O2) differences in LD of 3.0 ± 3.0 mm and 2.6 ± 2.0 mm, respectively, reflected a significant variability (p < 0.01). The interobserver agreement in manual and semi-automated TRA was 91.4%. The intraobserver agreement in semi-automated TRA was 84.5%. Between both methods a TRA agreement of 86.2% was obtained. Semi-automated evaluation (2.7 min) took slightly more time than manual evaluation (2.3 min). Conclusion: semi-automated and manual evaluation of liver metastases yield comparable results in response assessments and require comparable effort. (orig.)
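
    For orientation, the three response categories follow RECIST thresholds on the change in the sum of longest diameters; a simplified sketch (new-lesion rules and absolute-size minima are deliberately omitted):

      def recist_response(baseline_ld_mm, followup_ld_mm):
          # >= 30% decrease -> partial remission; >= 20% increase ->
          # progressive; otherwise stable (simplified RECIST logic)
          change = (followup_ld_mm - baseline_ld_mm) / baseline_ld_mm
          if change <= -0.30:
              return "partial remission"
          if change >= 0.20:
              return "progressive"
          return "stable"

      print(recist_response(84.0, 55.0))  # -34.5% -> "partial remission"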

  15. The influence of image setting on intracranial translucency measurement by manual and semi-automated system.

    Science.gov (United States)

    Zhen, Li; Yang, Xin; Ting, Yuen Ha; Chen, Min; Leung, Tak Yeung

    2013-09-01

    To investigate the agreement between the manual and semi-automated systems and the effect of different image settings on intracranial translucency (IT) measurement. A prospective study was conducted on 55 women carrying a singleton pregnancy who attended first-trimester Down syndrome screening. IT was measured both manually and by the semi-automated system at the same default image setting. The IT measurements were then repeated with the post-processing changes in the image setting one at a time. The difference in IT measurements between the altered and the original images was assessed. Intracranial translucency was successfully measured on 55 images both manually and by the semi-automated method. There was strong agreement in IT measurements between the two methods, with a mean difference (manual minus semi-automated) of 0.011 mm (95% confidence interval: -0.052 mm to 0.094 mm). There were statistically significant variations in both manual and semi-automated IT measurement after changing the Gain and the Contrast. The greatest changes occurred when the Contrast was reduced to 1 (IT reduced by 0.591 mm in semi-automated; 0.565 mm in manual), followed by when the Gain was increased to 15 (IT reduced by 0.424 mm in semi-automated; 0.524 mm in manual). The image settings may affect IT identification and measurement. Increased Gain and reduced Contrast are the most influential factors and may cause under-measurement of IT. © 2013 John Wiley & Sons, Ltd.

  16. Batch fabrication of disposable screen printed SERS arrays.

    Science.gov (United States)

    Qu, Lu-Lu; Li, Da-Wei; Xue, Jin-Qun; Zhai, Wen-Lei; Fossey, John S; Long, Yi-Tao

    2012-03-07

    A novel facile method of fabricating disposable and highly reproducible surface-enhanced Raman spectroscopy (SERS) arrays using screen printing was explored. A screen printing ink containing silver nanoparticles was prepared and printed on supporting materials by a screen printing process to fabricate SERS arrays (6 × 10 printed spots) in large batches. The fabrication conditions, SERS performance and application of these arrays were systematically investigated, and a detection limit of 1.6 × 10⁻¹³ M for rhodamine 6G could be achieved. Moreover, the screen printed SERS arrays exhibited high reproducibility and stability: the spot-to-spot SERS signals showed an intensity variation of less than 10%, and SERS performance could be maintained over 12 weeks. Portable high-throughput analysis of biological samples was accomplished using these disposable screen printed SERS arrays.

  17. A Model-based B2B (Batch to Batch) Control for An Industrial Batch Polymerization Process

    Science.gov (United States)

    Ogawa, Morimasa

    This paper gives an overview of a model-based B2B (batch-to-batch) control for an industrial batch polymerization process. In order to control the reaction temperature precisely, several methods based on the rigorous process dynamics model are employed at all design stages of the B2B control, such as modeling and parameter estimation of the reaction kinetics, which is one of the important parts of the process dynamics model. The designed B2B control consists of the gain-scheduled I-PD/II²-PD control (I-PD with double integral control), the feed-forward compensation at the batch start time, and the model adaptation utilizing the results of the last batch operation. Throughout the actual batch operations, the B2B control provides superior control performance compared with that of conventional control methods.
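
    The batch-to-batch idea of feeding the previous batch's results into the next batch can be illustrated with a generic run-to-run correction of the input profile; this is a textbook iterative-learning-type sketch with invented numbers, not the author's gain-scheduled I-PD/II²-PD design:

      import numpy as np

      def b2b_update(u_prev, y_prev, y_ref, gain=0.5):
          # shift the next batch's input profile by a fraction of the
          # last batch's tracking error (run-to-run / ILC-style update)
          e = np.asarray(y_ref, float) - np.asarray(y_prev, float)
          return np.asarray(u_prev, float) + gain * e

      u = np.zeros(5)                                    # input profile, batch k
      y_ref = np.array([60.0, 70.0, 80.0, 80.0, 75.0])   # temperature setpoints
      y = np.array([55.0, 66.0, 78.0, 81.0, 74.0])       # measured, batch k
      print(b2b_update(u, y, y_ref))                     # profile for batch k+1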

  18. Contextual sensitivity in scientific reproducibility

    Science.gov (United States)

    Van Bavel, Jay J.; Mende-Siedlecki, Peter; Brady, William J.; Reinero, Diego A.

    2016-01-01

    In recent years, scientists have paid increasing attention to reproducibility. For example, the Reproducibility Project, a large-scale replication attempt of 100 studies published in top psychology journals found that only 39% could be unambiguously reproduced. There is a growing consensus among scientists that the lack of reproducibility in psychology and other fields stems from various methodological factors, including low statistical power, researcher’s degrees of freedom, and an emphasis on publishing surprising positive results. However, there is a contentious debate about the extent to which failures to reproduce certain results might also reflect contextual differences (often termed “hidden moderators”) between the original research and the replication attempt. Although psychologists have found extensive evidence that contextual factors alter behavior, some have argued that context is unlikely to influence the results of direct replications precisely because these studies use the same methods as those used in the original research. To help resolve this debate, we recoded the 100 original studies from the Reproducibility Project on the extent to which the research topic of each study was contextually sensitive. Results suggested that the contextual sensitivity of the research topic was associated with replication success, even after statistically adjusting for several methodological characteristics (e.g., statistical power, effect size). The association between contextual sensitivity and replication success did not differ across psychological subdisciplines. These results suggest that researchers, replicators, and consumers should be mindful of contextual factors that might influence a psychological process. We offer several guidelines for dealing with contextual sensitivity in reproducibility. PMID:27217556

  19. Testing Reproducibility in Earth Sciences

    Science.gov (United States)

    Church, M. A.; Dudill, A. R.; Frey, P.; Venditti, J. G.

    2017-12-01

    Reproducibility represents how closely the results of independent tests agree when undertaken using the same materials but different conditions of measurement, such as operator, equipment or laboratory. The concept of reproducibility is fundamental to the scientific method as it prevents the persistence of incorrect or biased results. Yet currently the production of scientific knowledge emphasizes rapid publication of previously unreported findings, a culture that has emerged from pressures related to hiring, publication criteria and funding requirements. Awareness and critique of the disconnect between how scientific research should be undertaken, and how it actually is conducted, has been prominent in biomedicine for over a decade, with the fields of economics and psychology more recently joining the conversation. The purpose of this presentation is to stimulate the conversation in earth sciences where, despite implicit evidence in widely accepted classifications, formal testing of reproducibility is rare. As a formal test of reproducibility, two sets of experiments were undertaken with the same experimental procedure, at the same scale, but in different laboratories. Using narrow, steep flumes and spherical glass beads, grain size sorting was examined by introducing fine sediment of varying size and quantity into a mobile coarse bed. The general setup was identical, including flume width and slope; however, there were some variations in the materials, construction and lab environment. Comparison of the results includes examination of the infiltration profiles, sediment mobility and transport characteristics. The physical phenomena were qualitatively reproduced but not quantitatively replicated. Reproduction of results encourages more robust research and reporting, and facilitates exploration of possible variations in data in various specific contexts. Following the lead of other fields, testing of reproducibility can be incentivized through changes to journal

  20. Monte Carlo simulation on kinetics of batch and semi-batch free radical polymerization

    KAUST Repository

    Shao, Jing; Tang, Wei; Xia, Ru; Feng, Xiaoshuang; Chen, Peng; Qian, Jiasheng; Song, Changjiang

    2015-01-01

    experimental and simulation studies, we showed the capability of our Monte Carlo scheme on representing polymerization kinetics in batch and semi-batch processes. Various kinetics information, such as instant monomer conversion, molecular weight
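
    Although the record is truncated here, the kind of kinetic Monte Carlo scheme it refers to can be sketched with a minimal Gillespie simulation of the propagation step alone (rate constant and counts are invented; real free-radical schemes also include initiation, termination and transfer):

      import random

      def gillespie_conversion(n_monomer, n_radical, kp, t_end):
          # stochastic simulation of M + R* -> R*; returns the
          # instantaneous monomer conversion reached at t_end
          n0, n, t = n_monomer, n_monomer, 0.0
          while n > 0:
              a = kp * n * n_radical          # total propagation propensity
              t += random.expovariate(a)      # time to the next event
              if t > t_end:
                  break
              n -= 1
          return 1.0 - n / n0

      random.seed(1)
      print(gillespie_conversion(10000, 10, kp=1e-4, t_end=50.0))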

  1. Application of fluorescence-based semi-automated AFLP analysis in barley and wheat

    DEFF Research Database (Denmark)

    Schwarz, G.; Herz, M.; Huang, X.Q.

    2000-01-01

    of semi-automated codominant analysis for hemizygous AFLP markers in an F-2 population was too low, proposing the use of dominant allele-typing defaults. Nevertheless, the efficiency of genetic mapping, especially of complex plant genomes, will be accelerated by combining the presented genotyping......Genetic mapping and the selection of closely linked molecular markers for important agronomic traits require efficient, large-scale genotyping methods. A semi-automated multifluorophore technique was applied for genotyping AFLP marker loci in barley and wheat. In comparison to conventional P-33...

  2. Reproducible Bioinformatics Research for Biologists

    Science.gov (United States)

    This book chapter describes the current Big Data problem in Bioinformatics and the resulting issues with performing reproducible computational research. The core of the chapter provides guidelines and summaries of current tools/techniques that a noncomputational researcher would need to learn to pe...

  3. Reproducibility of brain ADC histograms

    International Nuclear Information System (INIS)

    Steens, S.C.A.; Buchem, M.A. van; Admiraal-Behloul, F.; Schaap, J.A.; Hoogenraad, F.G.C.; Wheeler-Kingshott, C.A.M.; Tofts, P.S.; Cessie, S. le

    2004-01-01

    The aim of this study was to assess the effect of differences in acquisition technique on whole-brain apparent diffusion coefficient (ADC) histogram parameters, as well as to assess scan-rescan reproducibility. Diffusion-weighted imaging (DWI) was performed in 7 healthy subjects with b-values 0-800, 0-1000, and 0-1500 s/mm² and fluid-attenuated inversion recovery (FLAIR) DWI with b-values 0-1000 s/mm². All sequences were repeated with and without repositioning. The peak location, peak height, and mean ADC of the ADC histograms and mean ADC of a region of interest (ROI) in the white matter were compared using paired-sample t tests. Scan-rescan reproducibility was assessed using paired-sample t tests, and repeatability coefficients were reported. With increasing maximum b-values, ADC histograms shifted to lower values, with an increase in peak height (p<0.01). With FLAIR DWI, the ADC histogram shifted to lower values with a significantly higher, narrower peak (p<0.01), although the ROI mean ADC showed no significant differences. For scan-rescan reproducibility, no significant differences were observed. Different DWI pulse sequences give rise to different ADC histograms. With a given pulse sequence, however, ADC histogram analysis is a robust and reproducible technique. Using FLAIR DWI, the partial-voluming effect of cerebrospinal fluid, and thus its confounding effect on histogram analyses, can be reduced
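
    The histogram parameters in question (peak location, peak height, mean ADC) are straightforward to extract once brain voxels have been segmented; a minimal sketch with synthetic ADC values (bin count and range are arbitrary choices, not the study's settings):

      import numpy as np

      def adc_histogram_params(adc_values, bins=128, adc_range=(0.0, 3.0e-3)):
          # peak location, normalised peak height, and mean ADC of a
          # whole-brain ADC histogram (values in mm^2/s)
          vals = np.asarray(adc_values, float).ravel()
          hist, edges = np.histogram(vals, bins=bins, range=adc_range)
          hist = hist / hist.sum()
          centers = (edges[:-1] + edges[1:]) / 2.0
          i = int(np.argmax(hist))
          return centers[i], hist[i], vals.mean()

      # synthetic data around a typical parenchymal ADC of ~0.8e-3 mm^2/s
      rng = np.random.default_rng(0)
      print(adc_histogram_params(rng.normal(0.8e-3, 0.1e-3, 10000)))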

  4. Geoprocessing semiautomated applied to licensing of nuclear facilities

    International Nuclear Information System (INIS)

    Oliveira, Aline Fabiane Gonçalves de

    2017-01-01

    In recent decades, Brazilian environmental legislation has evolved considerably. This evolution has occurred alongside changes in environmental studies, which increasingly aim to guarantee sustainability and environmental balance. It is therefore important to use technological resources to optimize the environmental studies involved in licensing processes. The present work analyzed and directed the application of geotechnologies (geoprocessing) in the environmental studies of the Local Report (RL) of the Center for the Development of Nuclear Technology (CDTN). The application of geoprocessing tools and of Geographic Information Systems (GIS) technology to support the environmental studies required by the RL was intended to contribute to the modernization of the stages involved in the nuclear licensing process, such as the structuring and execution of environmental studies as well as environmental monitoring activities, always observing the applicable laws, resolutions and standards of the National Nuclear Energy Commission (CNEN) for nuclear licensing. To achieve this objective, the ArcGIS application and its analytical tool Model Builder were adopted. This allowed the methodology to be laid out schematically from the GIS tools applied, with the advantage of efficient execution in situations where the same routine of tasks must be applied repeatedly; the model is also editable, which allows adaptations and improvements. The applicability of the methodology proved highly feasible: the model developed with Model Builder/ArcMap provided a semi-automated process and a flowchart depicting the procedure to be performed in order to reach the final process to make inferences and analyses with greater

  5. Semi-Automated Diagnosis, Repair, and Rework of Spacecraft Electronics

    Science.gov (United States)

    Struk, Peter M.; Oeftering, Richard C.; Easton, John W.; Anderson, Eric E.

    2008-01-01

    NASA's Constellation Program for Exploration of the Moon and Mars places human crews in extreme isolation in resource scarce environments. Near Earth, the discontinuation of Space Shuttle flights after 2010 will alter the up- and down-mass capacity for the International Space Station (ISS). NASA is considering new options for logistics support strategies for future missions. Aerospace systems are often composed of replaceable modular blocks that minimize the need for complex service operations in the field. Such a strategy however, implies a robust and responsive logistics infrastructure with relatively low transportation costs. The modular Orbital Replacement Units (ORU) used for ISS requires relatively large blocks of replacement hardware even though the actual failed component may really be three orders of magnitude smaller. The ability to perform in-situ repair of electronics circuits at the component level can dramatically reduce the scale of spares and related logistics cost. This ability also reduces mission risk, increases crew independence and improves the overall supportability of the program. The Component-Level Electronics Assembly Repair (CLEAR) task under the NASA Supportability program was established to demonstrate the practicality of repair by first investigating widely used soldering materials and processes (M&P) performed by modest manual means. The work will result in program guidelines for performing manual repairs along with design guidance for circuit reparability. The next phase of CLEAR recognizes that manual repair has its limitations and some highly integrated devices are extremely difficult to handle and demand semi-automated equipment. Further, electronics repairs require a broad range of diagnostic capability to isolate the faulty components. Finally repairs must pass functional tests to determine that the repairs are successful and the circuit can be returned to service. To prevent equipment demands from exceeding spacecraft volume

  6. Family based dispatching with batch availability

    NARCIS (Netherlands)

    van der Zee, D.J.

    2013-01-01

    Family based dispatching rules seek to lower set-up frequencies by grouping (batching) similar types of jobs for joint processing. Hence shop flow times may be improved, as less time is spent on set-ups. Motivated by an industrial project we study the control of machines with batch availability,

  7. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2012-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m·min⁻¹ cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognize systematic error distorting the performance test....

  8. Reproducibility of a reaming test

    DEFF Research Database (Denmark)

    Pilny, Lukas; Müller, Pavel; De Chiffre, Leonardo

    2014-01-01

    The reproducibility of a reaming test was analysed to document its applicability as a performance test for cutting fluids. Reaming tests were carried out on a drilling machine using HSS reamers. Workpiece material was an austenitic stainless steel, machined using 4.75 m·min⁻¹ cutting speed and 0......). Process reproducibility was assessed as the ability of different operators to ensure a consistent rating of individual lubricants. Absolute average values as well as experimental standard deviations of the evaluation parameters were calculated, and uncertainty budgeting was performed. Results document...... a built-up edge occurrence hindering a robust evaluation of cutting fluid performance, if the data evaluation is based on surface finish only. Measurements of hole geometry provide documentation to recognise systematic error distorting the performance test....

  9. Uneven batch data alignment with application to the control of batch end-product quality.

    Science.gov (United States)

    Wan, Jian; Marjanovic, Ognjen; Lennox, Barry

    2014-03-01

    Batch processes are commonly characterized by uneven trajectories due to the existence of batch-to-batch variations. The batch end-product quality is usually measured at the end of these uneven trajectories. It is necessary to align the time differences for both the measured trajectories and the batch end-product quality in order to implement statistical process monitoring and control schemes. Apart from synchronizing trajectories with variable lengths using an indicator variable or dynamic time warping, this paper proposes a novel approach to align uneven batch data by first identifying short-window PCA and PLS models and then applying these identified models to extend shorter trajectories and predict future batch end-product quality. Furthermore, uneven batch data can also be aligned to a specified batch length using moving window estimation. The proposed approach and its application to the control of batch end-product quality are demonstrated with a simulated example of fed-batch fermentation for penicillin production. Copyright © 2013 ISA. Published by Elsevier Ltd. All rights reserved.
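
    Of the alignment options mentioned (indicator variables, dynamic time warping, and the short-window model approach the paper proposes), DTW is the easiest to sketch; a plain O(nm) implementation with toy trajectories:

      import numpy as np

      def dtw_cost(a, b):
          # cumulative dynamic-time-warping cost between two 1-D
          # batch trajectories of possibly different lengths
          a = np.asarray(a, float)
          b = np.asarray(b, float)
          n, m = len(a), len(b)
          D = np.full((n + 1, m + 1), np.inf)
          D[0, 0] = 0.0
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  cost = abs(a[i - 1] - b[j - 1])
                  D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
          return D[n, m]

      ref = [0.0, 0.2, 0.6, 1.0, 1.0]          # reference batch trajectory
      new = [0.0, 0.1, 0.3, 0.7, 1.0, 1.0]     # slower batch, one extra sample
      print(dtw_cost(ref, new))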

  10. A semi-automated approach to derive elevation time-series and calculate glacier mass balance from historical aerial imagery

    Science.gov (United States)

    Whorton, E.; Headman, A.; Shean, D. E.; McCann, E.

    2017-12-01

    Understanding the implications of glacier recession on water resources in the western U.S. requires quantifying glacier mass change across large regions over several decades. Very few glaciers in North America have long-term continuous field measurements of glacier mass balance. However, systematic aerial photography campaigns began in 1957 on many glaciers in the western U.S. and Alaska. These historical, vertical aerial stereo-photographs documenting glacier evolution have recently become publicly available. Digital elevation models (DEM) of the transient glacier surface preserved in each imagery timestamp can be derived, then differenced to calculate glacier volume and mass change to improve regional geodetic solutions of glacier mass balance. In order to batch process these data, we use Python-based algorithms and Agisoft Photoscan structure from motion (SfM) photogrammetry software to semi-automate DEM creation, and orthorectify and co-register historical aerial imagery in a high-performance computing environment. Scanned photographs are rotated to reduce scaling issues, cropped to the same size to remove fiducials, and batch histogram equalization is applied to improve image quality and aid pixel-matching algorithms using the Python library OpenCV. Processed photographs are then passed to Photoscan through the Photoscan Python library to create DEMs and orthoimagery. To extend the period of record, the elevation products are co-registered to each other, airborne LiDAR data, and DEMs derived from sub-meter commercial satellite imagery. With the exception of the placement of ground control points, the process is entirely automated with Python. Current research is focused on: (1) applying these algorithms to create geodetic mass balance time series for the 90 photographed glaciers in Washington State; and (2) evaluating the minimal amount of positional information required in Photoscan to prevent distortion effects that cannot be addressed during co
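
    The preprocessing chain described above (rotate, crop away fiducials, histogram equalisation with OpenCV) might look roughly like the following; the paths, rotation angle and crop margins are placeholders, and the output directory is assumed to exist:

      import cv2
      import glob

      for path in glob.glob("scans/*.tif"):          # hypothetical layout
          img = cv2.imread(path, cv2.IMREAD_GRAYSCALE)

          # rotate about the frame centre to reduce scaling issues
          h, w = img.shape
          M = cv2.getRotationMatrix2D((w / 2, h / 2), 1.5, 1.0)
          img = cv2.warpAffine(img, M, (w, h))

          # crop a fixed border so all frames share one fiducial-free size
          img = img[200:-200, 200:-200]

          # histogram equalisation to aid the pixel-matching algorithms
          img = cv2.equalizeHist(img)

          cv2.imwrite(path.replace("scans", "prepped"), img)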

  11. Expert-driven semi-automated geomorphological mapping for a mountainous area using a laser DTM

    NARCIS (Netherlands)

    van Asselen, S.; Seijmonsbergen, A.C.

    2006-01-01

    In this paper, a semi-automated method is presented to recognize and spatially delineate geomorphological units in mountainous forested ecosystems, using statistical information extracted from a 1-m resolution laser digital elevation dataset. The method was applied to a mountainous area in Austria.

  12. Rapid and convenient semi-automated microwave-assisted solid-phase synthesis of arylopeptoids

    DEFF Research Database (Denmark)

    Rasmussen, Jakob Ewald; Boccia, Marcello Massimo; Nielsen, John

    2014-01-01

    A facile and expedient route to the synthesis of arylopeptoid oligomers (N-alkylated aminomethyl benzamides) using semi-automated microwave-assisted solid-phase synthesis is presented. The synthesis was optimized for the incorporation of side chains derived from sterically hindered or unreactive...

  13. A semiautomated test apparatus for studying partner preference behavior in the rat

    NARCIS (Netherlands)

    J. Bakker (Julie); J. van Ophemert (J.); F. Eijskoot (F.); A.K. Slob (Koos)

    1994-01-01

    A semiautomated three-compartment box (3CB) for studying partner preference behavior of rats is described. This apparatus automatically records the rat's time spent in each compartment, as well as the locomotor activity (i.e., the number of visits an animal pays to each compartment).

  14. MR renography by semiautomated image analysis : Performance in renal transplant recipients

    NARCIS (Netherlands)

    de Priester, JA; Kessels, AGH; Giele, ELW; den Boer, J.A.; Christiaans, MHL; Hasman, A; van Engelshoven, JMA

    We evaluated a method of semiautomated analysis of dynamic MR image series in renal transplants. Nine patients were studied twice, with an average time interval of 7 days. MR examination consisted of a run of 256 T1-weighted coronal scans (GE; TR/TE/flip = 11/3.4/60°; slice thickness = 6 mm;

  15. Towards Reproducibility in Computational Hydrology

    Science.gov (United States)

    Hutton, Christopher; Wagener, Thorsten; Freer, Jim; Han, Dawei; Duffy, Chris; Arheimer, Berit

    2017-04-01

    Reproducibility is a foundational principle in scientific research. The ability to independently re-run an experiment helps to verify the legitimacy of individual findings, and evolve (or reject) hypotheses and models of how environmental systems function, and move them from specific circumstances to more general theory. Yet in computational hydrology (and in environmental science more widely) the code and data that produce published results are not regularly made available, and even if they are made available, there remains a multitude of generally unreported choices that an individual scientist may have made that impact the study result. This situation strongly inhibits the ability of our community to reproduce and verify previous findings, as all the information and boundary conditions required to set up a computational experiment simply cannot be reported in an article's text alone. In Hutton et al 2016 [1], we argue that a cultural change is required in the computational hydrological community, in order to advance and make more robust the process of knowledge creation and hypothesis testing. We need to adopt common standards and infrastructures to: (1) make code readable and re-useable; (2) create well-documented workflows that combine re-useable code together with data to enable published scientific findings to be reproduced; (3) make code and workflows available, easy to find, and easy to interpret, using code and code metadata repositories. To create change we argue for improved graduate training in these areas. In this talk we reflect on our progress in achieving reproducible, open science in computational hydrology; these lessons are relevant to the broader computational geoscience community. In particular, we draw on our experience in the Switch-On (EU funded) virtual water science laboratory (http://www.switch-on-vwsl.eu/participate/), which is an open platform for collaboration in hydrological experiments (e.g. [2]). While we use computational hydrology as

  16. LSF usage for batch at CERN

    CERN Multimedia

    Schwickerath, Ulrich

    2007-01-01

    Contributed poster to the CHEP07. Original abstract: LSF 7, the latest version of Platform's batch workload management system, addresses many issues which limited the ability of LSF 6.1 to support large-scale batch farms, such as the lxbatch service at CERN. In this paper we will present the status of the evaluation and deployment of LSF 7 at CERN, including issues concerning the integration of LSF 7 with the gLite grid middleware suite and, in particular, the steps taken to ensure efficient reporting of the local batch system status and usage to the Grid Information System

  17. Reproducibility of isotope ratio measurements

    International Nuclear Information System (INIS)

    Elmore, D.

    1981-01-01

    The use of an accelerator as part of a mass spectrometer has improved the sensitivity for measuring low levels of long-lived radionuclides by several orders of magnitude. However, the complexity of a large tandem accelerator and beam transport system has made it difficult to match the precision of low energy mass spectrometry. Although uncertainties for accelerator measured isotope ratios as low as 1% have been obtained under favorable conditions, most errors quoted in the literature for natural samples are in the 5 to 20% range. These errors are dominated by statistics and generally the reproducibility is unknown since the samples are only measured once

  18. Evaluation of multichannel reproduced sound

    DEFF Research Database (Denmark)

    Choisel, Sylvain; Wickelmaier, Florian Maria

    2007-01-01

    A study was conducted with the goal of quantifying auditory attributes which underlie listener preference for multichannel reproduced sound. Short musical excerpts were presented in mono, stereo and several multichannel formats to a panel of forty selected listeners. Scaling of auditory attributes......, as well as overall preference, was based on consistency tests of binary paired-comparison judgments and on modeling the choice frequencies using probabilistic choice models. As a result, the preferences of non-expert listeners could be measured reliably at a ratio scale level. Principal components derived...

  19. Design and Construction of a Batch Oven for Investigation of Industrial Continuous

    DEFF Research Database (Denmark)

    Stenby, Mette; Nielsen, Brian; Risum, Jørgen

    2011-01-01

    A new batch oven has been designed and built to model baking processes as seen in large-scale tunnel ovens. In order to simulate the conditions found in tunnel ovens, a number of critical parameters are controllable: the temperature, the humidity and the air velocity. The band movement is simulated...... by moving the two air ducts above and below the products; in this way it is possible to keep the baking tray steady for continuous measurements of the product weight. During baking the shape and colour of the product can be monitored visually through a window. The simultaneous measuring of mass and visual...... aspects is a unique feature of this batch oven. Initial experiments of reproducing tunnel oven baking in the batch oven have shown good results, based on comparisons of weight loss, dry matter content and surface colour. The measured quality parameters did not differ significantly. Even though a few...

  20. Fuzzy batch controller for granular materials

    OpenAIRE

    Zamyatin Nikolaj; Smirnov Gennadij; Fedorchuk Yuri; Rusina Olga

    2018-01-01

    The paper focuses on batch control of granular materials in the production of building materials from fluorine anhydrite. Batching equipment is intended for smooth operation and timely feeding of supply hoppers at a required level. Level sensors and a controller of an asynchronous screw drive motor are used to control filling of the hopper with industrial anhydrite binders. The controller generates the required frequency and ensures the required productivity of the feed conveyor. Mamdani-type fuzzy infer...
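
    The record is cut off before the controller details; purely for illustration, a toy Mamdani-style rule base mapping hopper level to screw-drive frequency could look like this (membership breakpoints and frequency set points are invented):

      def tri(x, a, b, c):
          # triangular membership function peaking at b
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x < b else (c - x) / (c - b)

      def feeder_frequency(level_pct):
          # rules: LOW level -> HIGH frequency, MEDIUM -> MEDIUM,
          # HIGH -> LOW; defuzzified by a weighted average of outputs
          mu_low = tri(level_pct, -1.0, 0.0, 50.0)
          mu_med = tri(level_pct, 20.0, 50.0, 80.0)
          mu_high = tri(level_pct, 50.0, 100.0, 101.0)
          freq = {"high": 50.0, "med": 30.0, "low": 5.0}   # Hz set points
          num = (mu_low * freq["high"] + mu_med * freq["med"]
                 + mu_high * freq["low"])
          den = mu_low + mu_med + mu_high
          return num / den if den else 0.0

      print(feeder_frequency(35.0))   # -> 37.5 Hz for a 35% full hopper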

  1. Batch Computed Tomography Analysis of Projectiles

    Science.gov (United States)

    2016-05-01

    ARL-TR-7681 ● MAY 2016 ● US Army Research Laboratory. Batch Computed Tomography Analysis of Projectiles, by Michael C Golt and Matthew S Bratcher, Weapons and Materials Research...values to account for projectile variability in the ballistic evaluation of armor. Subject terms: computed tomography, CT, BS41, projectiles

  2. Multicenter assessment of the reproducibility of volumetric radiofrequency-based intravascular ultrasound measurements in coronary lesions that were consecutively stented

    DEFF Research Database (Denmark)

    Huisman, Jennifer; Egede, Rasmus; Rdzanek, Adam

    2012-01-01

    To assess in a multicenter design the between-center reproducibility of volumetric virtual histology intravascular ultrasound (VH-IVUS) measurements with a semi-automated, computer-assisted contour detection system in coronary lesions that were consecutively stented. To evaluate the reproducibility...... of volumetric VH-IVUS measurements, experienced analysts of 4 European IVUS centers performed independent analyses (in total 8,052 cross-sectional analyses) to obtain volumetric data of 40 coronary segments (length 20.0 ± 0.3 mm) from target lesions prior to percutaneous intervention that were performed...... in the setting of stable (65%) or unstable angina pectoris (35%). Geometric and compositional VH-IVUS measurements were highly correlated for the different comparisons. Overall intraclass correlation for vessel, lumen, plaque volume and plaque burden was 0.99, 0.92, 0.96, and 0.83, respectively; for fibrous...

  3. Analysis of the thoracic aorta using a semi-automated post processing tool

    International Nuclear Information System (INIS)

    Entezari, Pegah; Kino, Aya; Honarmand, Amir R.; Galizia, Mauricio S.; Yang, Yan; Collins, Jeremy; Yaghmai, Vahid; Carr, James C.

    2013-01-01

    Objective: To evaluate a semi-automated method for Thoracic Aortic Aneurysm (TAA) measurement using ECG-gated Dual Source CT Angiogram (DSCTA). Methods: This retrospective HIPAA compliant study was approved by our IRB. Transaxial maximum diameters of outer wall to outer wall were studied in fifty patients at seven anatomic locations of the thoracic aorta: annulus, sinus, sinotubular junction (STJ), mid ascending aorta (MAA) at the level of right pulmonary artery, proximal aortic arch (PROX) immediately proximal to innominate artery, distal aortic arch (DIST) immediately distal to left subclavian artery, and descending aorta (DESC) at the level of diaphragm. Measurements were performed using a manual method and semi-automated software. All readers repeated their measurements. Inter-method, intra-observer and inter-observer agreements were evaluated according to intraclass correlation coefficient (ICC) and Bland–Altman plot. The number of cases with manual contouring or center line adjustment for the semi-automated method and also the post-processing time for each method were recorded. Results: The mean difference between semi-automated and manual methods was less than 1.3 mm at all seven points. Strong inter-method, inter-observer and intra-observer agreement was recorded at all levels (ICC ≥ 0.9). The maximum rate of manual adjustment of center line and contour was at the level of annulus. The average time for manual post-processing of the aorta was 19 ± 0.3 min, while it took 8.26 ± 2.1 min to do the measurements with the semi-automated tool (Vitrea version 6.0.0.1 software). The center line was edited manually at all levels, with most corrections at the level of annulus (60%), while the contour was adjusted at all levels with the highest and lowest numbers of corrections at the levels of annulus and DESC (75% and 0.07% of the cases), respectively. Conclusion: Compared to the commonly used manual method, semi-automated measurement of vessel dimensions is

  4. Inter-Scan Reproducibility of Carotid Plaque Volume Measurements by 3-D Ultrasound

    DEFF Research Database (Denmark)

    Sandholt, Benjamin V; Collet-Billon, Antoine; Entrekin, Robert

    2018-01-01

    (PPV) measure centered on MPT. Total plaque volume (TPV), PPV from a 10-mm segment and MPT were measured using dedicated semi-automated software on 38 plaques from 26 patients. Inter-scan reproducibility was assessed using the t-test, Bland-Altman plots and Pearson's correlation coefficient. There was a mean difference of 0.01 mm in MPT (limits of agreement: -0.45 to 0.42 mm, Pearson's correlation coefficient: 0.96). Both volume measurements exhibited high reproducibility, with PPV being superior (limits of agreement: -35.3 mm³ to 33.5 mm³, Pearson's correlation coefficient: 0.96) to TPV (limits of agreement: -88.2 to 61.5 mm³, Pearson's correlation coefficient: 0.91). The good reproducibility revealed by the present results encourages future studies on establishing plaque quantification as part of cardiovascular risk assessment and for follow-up of disease progression over time.

  5. A semi-automated method for non-invasive internal organ weight estimation by post-mortem magnetic resonance imaging in fetuses, newborns and children

    International Nuclear Information System (INIS)

    Thayyil, Sudhin; Schievano, Silvia; Robertson, Nicola J.; Jones, Rodney; Chitty, Lyn S.; Sebire, Neil J.; Taylor, Andrew M.

    2009-01-01

    Magnetic resonance (MR) imaging allows minimally invasive autopsy, especially when consent is declined for traditional autopsy. Estimation of individual visceral organ weights is an important component of traditional autopsy. Objective: To examine whether a semi-automated method can be used for non-invasive internal organ weight measurement using post-mortem MR imaging in fetuses, newborns and children. Methods: Phase 1: In vitro scanning of 36 animal organs (heart, liver, kidneys) was performed to check the accuracy of the volume reconstruction methodology. Real volumes were measured by the water displacement method. Phase 2: Sixty-five whole body post-mortem MR scans were performed in fetuses (n = 30), newborns (n = 5) and children (n = 30) at 1.5 T using a 3D TSE T2-weighted sequence. These data were analysed offline using the image processing software Mimics 11.0. Results: Phase 1: Mean differences (S.D.) between estimated and actual volumes were -0.3 (1.5) ml for kidney, -0.7 (1.3) ml for heart, and -1.7 (3.6) ml for liver in the animal experiments. Phase 2: In fetuses, newborns and children, mean differences between estimated and actual weights (S.D.) were -0.6 (4.9) g for liver, -5.1 (1.2) g for spleen, -0.3 (0.6) g for adrenals, 0.4 (1.6) g for thymus, 0.9 (2.5) g for heart, -0.7 (2.4) g for kidneys and 2.7 (14) g for lungs. Excellent correlation was noted between estimated and actual weights (r² = 0.99, p < 0.001). Accuracy was lower when fetuses were less than 20 weeks or less than 300 g. Conclusion: Rapid, accurate and reproducible estimation of solid internal organ weights is feasible using the semi-automated 3D volume reconstruction method.

  6. Feasibility of semiautomated MR volumetry using gadoxetic acid-enhanced MRI at hepatobiliary phase for living liver donors.

    Science.gov (United States)

    Lee, Jeongjin; Kim, Kyoung Won; Kim, So Yeon; Kim, Bohyoung; Lee, So Jung; Kim, Hyoung Jung; Lee, Jong Seok; Lee, Moon Gyu; Song, Gi-Won; Hwang, Shin; Lee, Sung-Gyu

    2014-09-01

    To assess the feasibility of semiautomated MR volumetry using gadoxetic acid-enhanced MRI at the hepatobiliary phase compared with manual CT volumetry. Forty potential live liver donor candidates who underwent MR and CT on the same day were included in our study. Semiautomated MR volumetry was performed using gadoxetic acid-enhanced MRI at the hepatobiliary phase. We performed quadratic MR image division to correct for bias field inhomogeneity. With manual CT volumetry as the reference standard, we calculated the average volume measurement error of the semiautomated MR volumetry. We also calculated the mean number and duration of manual edits, the edited volume, and the total processing time. The average volume measurement error of the semiautomated MR volumetry was 2.35% ± 1.22%. The average number of edits, manual editing time, edited volume, and total processing time for the semiautomated MR volumetry were 1.9 ± 0.6, 8.1 ± 2.7 s, 12.4 ± 8.8 mL, and 11.7 ± 2.9 s, respectively. Semiautomated liver MR volumetry using hepatobiliary phase gadoxetic acid-enhanced MRI with quadratic MR image division is a reliable, easy, and fast tool to measure liver volume in potential living liver donors. Copyright © 2013 Wiley Periodicals, Inc.
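
    A minimal sketch of what a quadratic image-division bias correction might look like, assuming a single 2D slice and a least-squares second-order polynomial fit (an illustration of the general idea only, not the authors' implementation):

      import numpy as np

      def quadratic_bias_correct(img):
          """Fit a 2nd-order polynomial surface to slice intensities and divide it out."""
          h, w = img.shape
          yy, xx = np.mgrid[0:h, 0:w]
          x, y = xx.ravel() / w, yy.ravel() / h
          A = np.stack([np.ones_like(x), x, y, x*x, x*y, y*y], axis=1)
          coef, *_ = np.linalg.lstsq(A, img.ravel().astype(float), rcond=None)
          field = np.clip((A @ coef).reshape(h, w), 1e-6, None)   # avoid divide-by-zero
          corrected = img / field
          return corrected * img.mean() / corrected.mean()        # restore overall scale

      # toy slice with a smooth left-to-right shading gradient
      demo = np.outer(np.ones(64), np.linspace(0.5, 1.5, 64)) * 100.0
      print(demo.std(), "->", quadratic_bias_correct(demo).std())  # shading flattened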

  7. MetMatch: A Semi-Automated Software Tool for the Comparison and Alignment of LC-HRMS Data from Different Metabolomics Experiments

    Directory of Open Access Journals (Sweden)

    Stefan Koch

    2016-11-01

    Due to its unsurpassed sensitivity and selectivity, LC-HRMS is one of the major analytical techniques in metabolomics research. However, the limited stability of experimental and instrument parameters may cause shifts and drifts of retention time and mass accuracy, or the formation of different ion species, thus complicating conclusive interpretation of the raw data, especially when generated in different analytical batches. Here, a novel software tool for the semi-automated alignment of different measurement sequences is presented. The tool is implemented in the Java programming language; it features an intuitive user interface, and its main goal is to facilitate the comparison of data obtained from different metabolomics experiments. Based on a feature list (i.e., a processed LC-HRMS chromatogram with mass-to-charge ratio (m/z) values and retention times) that serves as a reference, the tool recognizes both m/z and retention time shifts of single or multiple analytical datafiles/batches of interest. MetMatch is also designed to account for differently formed ion species of detected metabolites. Corresponding ions and metabolites are matched, and chromatographic peak areas, m/z values and retention times are combined into a single data matrix. The convenient user interface allows for easy manipulation of processing results and graphical illustration of the raw data as well as the automatically matched ions and metabolites. The software tool is exemplified with LC-HRMS data from untargeted metabolomics experiments investigating phenylalanine-derived metabolites in wheat and T-2 toxin/HT-2 toxin detoxification products in barley.
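
    The core matching step can be pictured with a small sketch, assuming hypothetical (m/z, retention time) feature pairs and tolerance windows chosen purely for illustration (MetMatch's actual algorithm and defaults may differ):

      PPM_TOL = 5.0    # m/z tolerance, parts per million
      RT_TOL = 0.25    # retention-time tolerance, minutes

      reference = [(445.1204, 7.91), (611.1612, 9.34)]                   # reference feature list
      batch     = [(445.1221, 7.99), (611.1590, 9.41), (300.0001, 2.0)]  # new batch

      def match(reference, batch):
          """Pair each reference feature with the first batch feature inside both windows."""
          pairs = []
          for mz_ref, rt_ref in reference:
              for mz, rt in batch:
                  if (abs(mz - mz_ref) / mz_ref * 1e6 <= PPM_TOL
                          and abs(rt - rt_ref) <= RT_TOL):
                      pairs.append(((mz_ref, rt_ref), (mz, rt)))
                      break
          return pairs

      for ref, hit in match(reference, batch):
          print(f"{ref} -> {hit}")   # two matches; the 300.0001 feature stays unmatched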

  8. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo

    2012-07-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".

  9. A multistage, semi-automated procedure for analyzing the morphology of nanoparticles

    KAUST Repository

    Park, Chiwoo; Huang, Jianhua Z.; Huitink, David; Kundu, Subrata; Mallick, Bani K.; Liang, Hong; Ding, Yu

    2012-01-01

    This article presents a multistage, semi-automated procedure that can expedite the morphology analysis of nanoparticles. Material scientists have long conjectured that the morphology of nanoparticles has a profound impact on the properties of the hosting material, but a bottleneck is the lack of a reliable and automated morphology analysis of the particles based on their image measurements. This article attempts to fill in this critical void. One particular challenge in nanomorphology analysis is how to analyze the overlapped nanoparticles, a problem not well addressed by the existing methods but effectively tackled by the method proposed in this article. This method entails multiple stages of operations, executed sequentially, and is considered semi-automated due to the inclusion of a semi-supervised clustering step. The proposed method is applied to several images of nanoparticles, producing the needed statistical characterization of their morphology. © 2012 "IIE".

  10. Fast-FISH Detection and Semi-Automated Image Analysis of Numerical Chromosome Aberrations in Hematological Malignancies

    Directory of Open Access Journals (Sweden)

    Arif Esa

    1998-01-01

    A new fluorescence in situ hybridization (FISH) technique called Fast-FISH, in combination with semi-automated image analysis, was applied to detect numerical aberrations of chromosomes 8 and 12 in interphase nuclei of peripheral blood lymphocytes and bone marrow cells from patients with acute myelogenous leukemia (AML) and chronic lymphocytic leukemia (CLL). Commercially available α-satellite DNA probes specific for the centromere regions of chromosome 8 and chromosome 12, respectively, were used. After application of the Fast-FISH protocol, the microscopic images of the fluorescence-labelled cell nuclei were recorded by the true-color CCD camera Kappa CF 15 MC and evaluated quantitatively by computer analysis on a PC. These results were compared to results obtained from the same type of specimens using the same analysis system but with a standard FISH protocol. In addition, automated spot counting after both FISH techniques was compared to visual spot counting after standard FISH. A total of about 3,000 cell nuclei were evaluated. For quantitative brightness parameters, a good correlation between standard FISH labelling and Fast-FISH was found. Automated spot counting after Fast-FISH coincided within a few percent with automated and visual spot counting after standard FISH. The examples shown indicate the reliability and reproducibility of Fast-FISH and its potential for automated interphase cell diagnostics of numerical chromosome aberrations. Since the Fast-FISH technique requires a hybridization time as low as 1/20 of that of established standard FISH techniques, omitting most of the time-consuming working steps in the protocol, it may contribute considerably to clinical diagnostics. This may be especially interesting in cases where an accurate result is required within a few hours.
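
    Automated spot counting of this kind usually reduces to thresholding followed by connected-component labelling. A minimal sketch, assuming one pre-segmented nucleus image and illustrative thresholds (not the parameters of the original analysis software):

      import numpy as np
      from scipy import ndimage

      def count_spots(nucleus_img, rel_threshold=0.5, min_pixels=4):
          """Count bright hybridization signals inside one segmented nucleus."""
          mask = nucleus_img > rel_threshold * nucleus_img.max()
          labels, n = ndimage.label(mask)                      # connected components
          sizes = ndimage.sum(mask, labels, range(1, n + 1))   # pixels per component
          return int((sizes >= min_pixels).sum())              # drop single-pixel noise

      nucleus = np.zeros((32, 32))
      nucleus[5:8, 5:8] = 1.0      # first centromere signal
      nucleus[20:23, 12:15] = 0.9  # second centromere signal
      print(count_spots(nucleus))  # -> 2, a disomic count for this probe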

  11. Measurement of cell proliferation in microculture using Hoechst 33342 for the rapid semiautomated microfluorimetric determination of chromatin DNA.

    Science.gov (United States)

    Richards, W L; Song, M K; Krutzsch, H; Evarts, R P; Marsden, E; Thorgeirsson, S S

    1985-07-01

    We report the development and characterization of a semiautomated method for measurement of cell proliferation in microculture using Hoechst 33342, a non-toxic specific vital stain for DNA. In this assay, fluorescence resulting from interaction of cell chromatin DNA with Hoechst 33342 dye was measured by an instrument that automatically reads the fluorescence of each well of a 96-well microtiter plate within 1 min. Each cell line examined was shown to require different Hoechst 33342 concentrations and times of incubation with the dye to attain optimum fluorescence in the assay. In all cell lines, cell chromatin-enhanced Hoechst 33342 fluorescence was shown to be a linear function of the number of cells or cell nuclei per well when optimum assay conditions were employed. Because of this linear relation, equivalent cell doubling times were calculated from growth curves based on changes in cell counts or changes in Hoechst/DNA fluorescence, and the fluorimetric assay was shown to be useful for the direct assay of the influence of growth factors on cell proliferation. The fluorimetric assay also provided a means for normalizing the incorporation of tritiated thymidine ([³H]TdR) into DNA; normalized values of DPM per fluorescence unit closely paralleled values of percent ³H-labelled nuclei when DNA synthesis was studied as a function of the concentration of rat serum in the medium. In summary, the chromatin-enhanced Hoechst 33342 fluorimetric assay provides a rapid, simple, and reproducible means for estimating cell proliferation by direct measurement of changes in cell fluorescence or by measurement of changes in the normalized incorporation of thymidine into DNA.
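
    Because fluorescence is linear in cell number, a doubling time can be read directly off an exponential fit to the fluorescence readings. A worked sketch with hypothetical well readings (arbitrary units, not the paper's data):

      import numpy as np

      t = np.array([0.0, 12.0, 24.0, 36.0, 48.0])        # hours after seeding
      f = np.array([100.0, 160.0, 250.0, 410.0, 640.0])  # well fluorescence (a.u.)

      # exponential growth f = f0 * exp(k t), so ln f is linear in t
      k = np.polyfit(t, np.log(f), 1)[0]
      print(f"doubling time = {np.log(2) / k:.1f} h")    # about 18 h for these readings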

  12. Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation

    Science.gov (United States)

    2018-01-01

    ARL-TR-8284 ● JAN 2018 ● US Army Research Laboratory ● Semi-Automated Processing of Trajectory Simulator Output Files for Model Evaluation. ... although some minor changes may be needed. The program processes a GTRAJ output text file that contains results from 2 or more simulations, where each

  13. Energy efficiency of batch and semi-batch (CCRO) reverse osmosis desalination.

    Science.gov (United States)

    Warsinger, David M; Tow, Emily W; Nayar, Kishor G; Maswadeh, Laith A; Lienhard V, John H

    2016-12-01

    As reverse osmosis (RO) desalination capacity increases worldwide, the need to reduce its specific energy consumption becomes more urgent. In addition to the incremental changes attainable with improved components such as membranes and pumps, more significant reduction of energy consumption can be achieved through time-varying RO processes including semi-batch processes such as closed-circuit reverse osmosis (CCRO) and fully-batch processes that have not yet been commercialized or modelled in detail. In this study, numerical models of the energy consumption of batch RO (BRO), CCRO, and the standard continuous RO process are detailed. Two new energy-efficient configurations of batch RO are analyzed. Batch systems use significantly less energy than continuous RO over a wide range of recovery ratios and source water salinities. Relative to continuous RO, models predict that CCRO and batch RO demonstrate up to 37% and 64% energy savings, respectively, for brackish water desalination at high water recovery. For batch RO and CCRO, the primary reductions in energy use stem from atmospheric pressure brine discharge and reduced streamwise variation in driving pressure. Fully-batch systems further reduce energy consumption by not mixing streams of different concentrations, which CCRO does. These results demonstrate that time-varying processes can significantly raise RO energy efficiency. Copyright © 2016 Elsevier Ltd. All rights reserved.
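
    The thermodynamic intuition can be made concrete with idealized, reversible-limit expressions (a textbook-style sketch, not the detailed process models of this study): a single-stage continuous system must apply at least the final brine osmotic pressure throughout, whereas an ideal batch system tracks the rising concentrate pressure.

      import math

      def sec_continuous(pi0, r):
          """Least specific energy (per unit permeate) of single-stage continuous RO."""
          return pi0 / (1.0 - r)          # driving pressure >= final brine osmotic pressure

      def sec_batch(pi0, r):
          """Ideal batch RO: integrate pi0/(1-v) over the recovered fraction v."""
          return pi0 * math.log(1.0 / (1.0 - r)) / r

      pi0, r = 2.0, 0.8                   # feed osmotic pressure (bar), recovery ratio
      print(f"continuous: {sec_continuous(pi0, r):.2f} bar per unit permeate")
      print(f"batch:      {sec_batch(pi0, r):.2f} bar per unit permeate")
      # at 80% recovery the ideal batch figure is about 60% below single-stage
      # continuous, the same direction as the savings reported above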

  14. Reproducible research: a minority opinion

    Science.gov (United States)

    Drummond, Chris

    2018-01-01

    Reproducible research, a growing movement within many scientific fields, including machine learning, would require the code used to generate the experimental results to be published along with any paper. Probably the most compelling argument for this is that it is simply following good scientific practice, established over the years by the greats of science. The implication is that failure to follow such a practice is unscientific, not a label any machine learning researcher would like to carry. It is further claimed that misconduct is causing a growing crisis of confidence in science, and that, without this practice being enforced, science would inevitably fall into disrepute. This viewpoint is becoming ubiquitous, but here I offer a differing opinion. I argue that, far from being central to science, what is being promulgated is a narrow interpretation of how science works. I contend that the consequences are somewhat overstated. I would also contend that the effort necessary to meet the movement's aims, and the general attitude it engenders, would not serve well any of the research disciplines, including our own.

  15. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    Science.gov (United States)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Lee, Andrew J.; Xiao, Ying

    2013-07-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and, to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10-20 min to 2 min by applying the semi-automated plan-quality evaluation program.

  16. A semi-automated method of monitoring dam passage of American Eels Anguilla rostrata

    Science.gov (United States)

    Welsh, Stuart A.; Aldinger, Joni L.

    2014-01-01

    Fish passage facilities at dams have become an important focus of fishery management in riverine systems. Given the personnel and travel costs associated with physical monitoring programs, automated or semi-automated systems are an attractive alternative for monitoring fish passage facilities. We designed and tested a semi-automated system for eel ladder monitoring at Millville Dam on the lower Shenandoah River, West Virginia. A motion-activated eel ladder camera (ELC) photographed each yellow-phase American Eel Anguilla rostrata that passed through the ladder. Digital images (with date and time stamps) of American Eels allowed for total daily counts and measurements of eel total length (TL) using photogrammetric methods with digital imaging software. We compared physical counts of American Eels with camera-based counts; TLs obtained with a measuring board were compared with TLs derived from photogrammetric methods. Data from the ELC were consistent with data obtained by physical methods, thus supporting the semi-automated camera system as a viable option for monitoring American Eel passage. Time stamps on digital images allowed for the documentation of eel passage time—data that were not obtainable from physical monitoring efforts. The ELC has application to eel ladder facilities but can also be used to monitor dam passage of other taxa, such as crayfishes, lampreys, and water snakes.

  17. EVALUATION OF PATCHY ATROPHY SECONDARY TO HIGH MYOPIA BY SEMIAUTOMATED SOFTWARE FOR FUNDUS AUTOFLUORESCENCE ANALYSIS.

    Science.gov (United States)

    Miere, Alexandra; Capuano, Vittorio; Serra, Rita; Jung, Camille; Souied, Eric; Querques, Giuseppe

    2017-05-31

    To evaluate the progression of patchy atrophy in high myopia using semiautomated software for fundus autofluorescence (FAF) analysis. The medical records and multimodal imaging of 21 consecutive highly myopic patients with macular chorioretinal patchy atrophy (PA) were retrospectively analyzed. All patients underwent repeated fundus autofluorescence and spectral domain optical coherence tomography over at least 12 months. Color fundus photography was also performed in a subset of patients. Total atrophy area was measured on FAF images using the Region Finder semiautomated software embedded in Spectralis (Heidelberg Engineering, Heidelberg, Germany) at baseline and during follow-up visits. Region Finder was compared with manually measured PA on FAF images. Twenty-two eyes of 21 patients (14 women, 7 men; mean age 62.8 ± 13.0 years, range 32-84 years) were included. Mean PA area using Region Finder was 2.77 ± 2.91 mm² at baseline, 3.12 ± 2.68 mm² at Month 6, 3.43 ± 2.68 mm² at Month 12, and 3.73 ± 2.74 mm² at Month 18 (overall P < 0.05). Fundus autofluorescence analysis by the Region Finder semiautomated software provides accurate measurements of lesion area and allows us to quantify the progression of PA in high myopia. In our series, PA enlarged significantly over at least 12 months, and its progression seemed to be related to the lesion size at baseline.

  18. A semi-automated tool for treatment plan-quality evaluation and clinical trial quality assurance

    International Nuclear Information System (INIS)

    Wang, Jiazhou; Chen, Wenzhou; Studenski, Matthew; Cui, Yunfeng; Xiao, Ying; Lee, Andrew J

    2013-01-01

    The goal of this work is to develop a plan-quality evaluation program for clinical routine and multi-institutional clinical trials so that the overall evaluation efficiency is improved. In multi-institutional clinical trials, evaluating plan quality is a time-consuming and labor-intensive process. In this note, we present a semi-automated plan-quality evaluation program which combines MIMVista, Java/MATLAB, and extensible markup language (XML). More specifically, MIMVista is used for data visualization; Java and its powerful function library are implemented for calculating dosimetry parameters; and, to improve the clarity of the index definitions, XML is applied. The accuracy and the efficiency of the program were evaluated by comparing the results of the program with the manually recorded results in two RTOG trials. A slight difference of about 0.2% in volume or 0.6 Gy in dose between the semi-automated program and manual recording was observed. According to the criteria of the indices, there are minimal differences between the two methods. The evaluation time is reduced from 10–20 min to 2 min by applying the semi-automated plan-quality evaluation program. (note)

  19. A semi-automated method for bone age assessment using cervical vertebral maturation.

    Science.gov (United States)

    Baptista, Roberto S; Quaglio, Camila L; Mourad, Laila M E H; Hummel, Anderson D; Caetano, Cesar Augusto C; Ortolani, Cristina Lúcia F; Pisa, Ivan T

    2012-07-01

    To propose a semi-automated method for pattern classification to predict individuals' stage of growth based on morphologic characteristics that are described in the modified cervical vertebral maturation (CVM) method of Baccetti et al. A total of 188 lateral cephalograms were collected, digitized, evaluated manually, and grouped into cervical stages by two expert examiners. Landmarks were located on each image and measured. Three pattern classifiers based on the Naïve Bayes algorithm were built and assessed using a software program. The classifier with the greatest accuracy according to the weighted kappa test was considered best. The best classifier showed a weighted kappa coefficient of 0.861 ± 0.020. If an adjacent estimated pre-stage or post-stage value was taken to be acceptable, the classifier would show a weighted kappa coefficient of 0.992 ± 0.019. Results from this study show that the proposed semi-automated pattern classification method can help orthodontists identify the stage of CVM. However, additional studies are needed before this semi-automated classification method for CVM assessment can be implemented in clinical practice.
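
    A minimal sketch of the classification idea, assuming hypothetical landmark-derived features and using a Gaussian Naïve Bayes model with a weighted-kappa check (scikit-learn stands in here for the software used in the study):

      import numpy as np
      from sklearn.naive_bayes import GaussianNB
      from sklearn.metrics import cohen_kappa_score

      # rows: vertebral shape measurements per cephalogram; labels: CVM stage
      X = np.array([[0.10, 0.81], [0.15, 0.85], [0.40, 0.95],
                    [0.45, 0.98], [0.70, 1.10], [0.75, 1.12]])
      y = np.array([1, 1, 3, 3, 5, 5])

      clf = GaussianNB().fit(X, y)
      pred = clf.predict(X)
      # linear-weighted kappa penalizes off-by-one stages less than larger misses
      print(cohen_kappa_score(y, pred, weights="linear"))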

  20. Fuzzy batch controller for granular materials

    Directory of Open Access Journals (Sweden)

    Zamyatin Nikolaj

    2018-01-01

    The paper focuses on batch control of granular materials in the production of building materials from fluorine anhydrite. The batching equipment is intended to ensure smooth operation and timely feeding of the supply hoppers at a required level. Level sensors and a controller of an asynchronous screw-drive motor are used to control the filling of the hopper with industrial anhydrite binders. The controller generates the required frequency and ensures the required productivity of the feed conveyor. Mamdani-type fuzzy inference is proposed for controlling the speed of the screw that feeds the mixture components. This method is applied for the first time to the production of building materials based on fluorine anhydrite. A fuzzy controller is proven to be effective in controlling the filling level of the supply hopper. In addition, the authors determined optimal parameters of the batching process to ensure smooth operation and the production of fluorine anhydrite materials of specified properties that can compete with gypsum-based products.
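
    A minimal sketch of Mamdani-type inference for such a controller, with hypothetical membership functions and a three-rule base (purely illustrative; the paper's universes and rule base are not given here): a low hopper level should drive the screw faster, a high level slower.

      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with corners a, b, c."""
          return np.maximum(np.minimum((x - a) / (b - a + 1e-12),
                                       (c - x) / (c - b + 1e-12)), 0.0)

      freq = np.linspace(10.0, 50.0, 401)      # output universe: screw frequency, Hz

      def screw_frequency(level_pct):
          # degree to which each level term is satisfied
          low  = tri(level_pct, 0, 0, 50)      # hopper nearly empty
          ok   = tri(level_pct, 25, 50, 75)
          high = tri(level_pct, 50, 100, 100)  # hopper nearly full
          # rule outputs, clipped at the firing strengths and aggregated by max
          agg = np.maximum.reduce([np.minimum(low,  tri(freq, 35, 50, 50)),   # low  -> fast
                                   np.minimum(ok,   tri(freq, 20, 30, 40)),   # ok   -> medium
                                   np.minimum(high, tri(freq, 10, 10, 25))])  # high -> slow
          return (freq * agg).sum() / agg.sum()  # centroid defuzzification

      print(round(screw_frequency(20.0), 1))   # low hopper level -> about 44 Hz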

  1. History based batch method preserving tally means

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Choi, Sung Hoon

    2012-01-01

    In Monte Carlo (MC) eigenvalue calculations, the sample variance of a tally mean calculated from its cycle-wise estimates is biased because of the inter-cycle correlations of the fission source distribution (FSD). Recently, we proposed a new real-variance estimation method, named the history-based batch method, in which an MC run is treated as multiple runs with a small number of histories per cycle so as to generate independent tally estimates. In this paper, the history-based batch method based on the weight correction is presented to preserve the tally mean from the original MC run. The effectiveness of the new method is examined for the weakly coupled fissile array problem as a function of the dominance ratio and the batch size, in comparison with other available schemes.
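
    The generic batching idea behind such methods can be sketched quickly (a toy AR(1) stand-in for correlated cycle-wise estimates, not the weight-corrected method of the paper): grouping correlated cycle estimates into batches makes the batch means nearly independent, so the sample variance of the grand mean is no longer biased low, while the grand mean itself is unchanged.

      import numpy as np

      rng = np.random.default_rng(1)
      cycles = np.empty(1000)              # correlated cycle-wise tally estimates
      cycles[0] = 1.0
      for i in range(1, cycles.size):
          cycles[i] = 1.0 + 0.8 * (cycles[i-1] - 1.0) + 0.05 * rng.standard_normal()

      def var_of_mean(x, batch_size):
          """Sample variance of the grand mean, estimated from batch means."""
          m = x.size // batch_size
          means = x[:m * batch_size].reshape(m, batch_size).mean(axis=1)
          return means.var(ddof=1) / m

      print(var_of_mean(cycles, 1))    # naive cycle-wise estimate: biased low
      print(var_of_mean(cycles, 50))   # batched estimate: closer to the real variance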

  2. Following an Optimal Batch Bioreactor Operations Model

    DEFF Research Database (Denmark)

    Ibarra-Junquera, V.; Jørgensen, Sten Bay; Virgen-Ortíz, J.J.

    2012-01-01

    The problem of following an optimal batch operation model for a bioreactor in the presence of uncertainties is studied. The optimal batch bioreactor operation model (OBBOM) refers to the bioreactor trajectory for nominal cultivation to be optimal. A multiple-variable dynamic optimization of fed...... as the master system which includes the optimal cultivation trajectory for the feed flow rate and the substrate concentration. The “real” bioreactor, the one with unknown dynamics and perturbations, is considered as the slave system. Finally, the controller is designed such that the real bioreactor...

  3. Supervision of Fed-Batch Fermentations

    DEFF Research Database (Denmark)

    Gregersen, Lars; Jørgensen, Sten Bay

    1999-01-01

    Process faults may be detected on-line using existing measurements based upon modelling that is entirely data driven. A multivariate statistical model is developed and used for fault diagnosis of an industrial fed-batch fermentation process. Data from several (25) batches are used to develop...... a model for cultivation behaviour. This model is validated against 13 data sets and demonstrated to explain a significant amount of variation in the data. The multivariate model may directly be used for process monitoring. With this method faults are detected in real time and the responsible measurements...
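
    A common data-driven scheme of this kind is PCA monitoring with a residual (SPE/Q) statistic. A minimal sketch on synthetic data, assuming batch trajectories already unfolded into one row per batch (a generic illustration, not the industrial model of the paper):

      import numpy as np

      rng = np.random.default_rng(0)
      # 25 historical batches x 40 unfolded measurements (time points x variables)
      X = rng.standard_normal((25, 40)) @ np.diag(np.linspace(2.0, 0.1, 40))
      mu, sd = X.mean(axis=0), X.std(axis=0, ddof=1)
      Z = (X - mu) / sd

      U, s, Vt = np.linalg.svd(Z, full_matrices=False)
      P = Vt[:3].T                             # retained principal directions

      def spe(batch_row):
          """Squared prediction error: variation the PCA model cannot explain."""
          z = (batch_row - mu) / sd
          resid = z - P @ (P.T @ z)
          return float(resid @ resid)

      spes = np.array([spe(x) for x in X])
      limit = np.quantile(spes, 0.99)          # empirical control limit
      faulty = X[0] + 3.0                      # a new batch with a level shift
      print(f"limit = {limit:.1f}, faulty batch SPE = {spe(faulty):.1f}")  # flagged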

  4. Exploring the Transition From Batch to Online

    DEFF Research Database (Denmark)

    Jørgensen, Anker Helms

    2010-01-01

    of the truly interactive use of computers known today. The transition invoked changes in a number of areas: technological, such as hybrid forms between batch and online; organisational, such as decentralization; and personal, as users and developers alike had to adopt new technology, shape new organisational...... structures, and acquire new skills. This work-in-progress paper extends an earlier study of the transition from batch to online, based on oral history interviews with (ex-)employees in two large Danish Service Bureaus. The paper takes the next step by analyzing a particular genre: the commercial computer...

  5. Semiautomated four-dimensional computed tomography segmentation using deformable models

    International Nuclear Information System (INIS)

    Ragan, Dustin; Starkschall, George; McNutt, Todd; Kaus, Michael; Guerrero, Thomas; Stevens, Craig W.

    2005-01-01

    The purpose of this work is to demonstrate a proof of feasibility of the application of a commercial prototype deformable model algorithm to the problem of delineation of anatomic structures on four-dimensional (4D) computed tomography (CT) image data sets. We acquired a 4D CT image data set of a patient's thorax that consisted of three-dimensional (3D) image data sets from eight phases in the respiratory cycle. The contours of the right and left lungs, cord, heart, and esophagus were manually delineated on the end inspiration data set. An interactive deformable model algorithm, originally intended for deforming an atlas-based model surface to a 3D CT image data set, was applied in an automated fashion. Triangulations based on the contours generated on each phase were deformed to the CT data set on the succeeding phase to generate the contours on that phase. Deformation was propagated through the eight phases, and the contours obtained on the end inspiration data set were compared with the original manually delineated contours. Structures defined by high-density gradients, such as lungs, cord, and heart, were accurately reproduced, except in regions where other gradient boundaries may have confused the algorithm, such as near bronchi. The algorithm failed to accurately contour the esophagus, a soft-tissue structure completely surrounded by tissue of similar density, without manual interaction. This technique has the potential to facilitate contour delineation in 4D CT image data sets; and future evolution of the software is expected to improve the process

  6. Production of nattokinase by batch and fed-batch culture of Bacillus subtilis.

    Science.gov (United States)

    Cho, Young-Han; Song, Jae Yong; Kim, Kyung Mi; Kim, Mi Kyoung; Lee, In Young; Kim, Sang Bum; Kim, Hyeon Shup; Han, Nam Soo; Lee, Bong Hee; Kim, Beom Soo

    2010-09-30

    Nattokinase was produced by batch and fed-batch culture of Bacillus subtilis in flask and fermentor. The effect of supplementing complex media (peptone, yeast extract, or tryptone) on the production of nattokinase was investigated. In flask culture, the highest cell growth and nattokinase activity were obtained with 50 g/L of peptone supplementation. In this condition, nattokinase activity was 630 unit/ml at 12 h. In batch culture of B. subtilis in a fermentor, the highest nattokinase activity of 3400 unit/ml was obtained at 10 h with 50 g/L of peptone supplementation. From the batch kinetics data, it was shown that nattokinase production was growth-associated and that the culture should be harvested before stationary phase for maximum nattokinase production. In fed-batch culture of B. subtilis using a pH-stat feeding strategy, cell growth (optical density monitored at 600 nm) increased to ca. 100 at 22 h, which was 2.5 times higher than that in batch culture. The highest nattokinase activity was 7100 unit/ml at 19 h, which was also 2.1 times higher than that in batch culture. Copyright 2010 Elsevier B.V. All rights reserved.

  7. Batch extractive distillation for high purity methanol

    International Nuclear Information System (INIS)

    Zhang Weijiang; Ma Sisi

    2006-01-01

    In this paper, the applications of high-purity methanol in the chemical and microelectronic industries, its market status, and the present state of its production at home and abroad are first introduced. Purification of industrial methanol to high-purity methanol is feasible in China. Batch extractive distillation is the best separation technique for the purification of industrial methanol. Dimethyl sulfoxide performed better as an extractant. (authors)

  8. Monitoring of batch processes using spectroscopy

    NARCIS (Netherlands)

    Gurden, S. P.; Westerhuis, J. A.; Smilde, A. K.

    2002-01-01

    There is an increasing need for new techniques for the understanding, monitoring and control of batch processes. Spectroscopy is now becoming established as a means of obtaining real-time, high-quality chemical information at frequent time intervals and across a wide range of industrial

  9. Theory of reproducing kernels and applications

    CERN Document Server

    Saitoh, Saburou

    2016-01-01

    This book provides a large extension of the general theory of reproducing kernels published by N. Aronszajn in 1950, with many concrete applications. In Chapter 1, many concrete reproducing kernels are first introduced with detailed information. Chapter 2 presents a general and global theory of reproducing kernels with basic applications in a self-contained way. Many fundamental operations among reproducing kernel Hilbert spaces are dealt with. Chapter 2 is the heart of this book. Chapter 3 is devoted to the Tikhonov regularization using the theory of reproducing kernels with applications to numerical and practical solutions of bounded linear operator equations. In Chapter 4, the numerical real inversion formulas of the Laplace transform are presented by applying the Tikhonov regularization, where the reproducing kernels play a key role in the results. Chapter 5 deals with ordinary differential equations; Chapter 6 includes many concrete results for various fundamental partial differential equations. In Chapt...
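
    The identity at the center of this theory is compact enough to state here (the standard definition, not tied to this book's particular notation):

      % reproducing property of a kernel K on a set E, for the Hilbert space H_K
      \[
        f(x) \;=\; \langle f,\ K(\cdot,x)\rangle_{H_K}
        \qquad \text{for all } f \in H_K,\ x \in E,
      \]
      % and, taking f = K(., y), the kernel reproduces itself:
      \[
        K(y,x) \;=\; \langle K(\cdot,y),\ K(\cdot,x)\rangle_{H_K}.
      \]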

  10. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    Science.gov (United States)

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data (1). This paper describes a new, simpler, reliable, standardized capillary-counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, which promotes user fatigue and subjectivity in the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.

  11. Monte Carlo simulation on kinetics of batch and semi-batch free radical polymerization

    KAUST Repository

    Shao, Jing

    2015-10-27

    Based on Monte Carlo simulation technology, we proposed a hybrid routine which combines the reaction mechanism with coarse-grained molecular simulation to study the kinetics of free radical polymerization. By comparing with previous experimental and simulation studies, we showed the capability of our Monte Carlo scheme to represent polymerization kinetics in batch and semi-batch processes. Various kinetic quantities, such as instantaneous monomer conversion, molecular weight, and polydispersity, are readily calculated from the Monte Carlo simulation. The kinetic constants, such as the polymerization rate k_p, are determined in the simulation without the "steady-state" hypothesis. We explored the mechanisms behind the variations in polymerization kinetics observed in previous studies, as well as polymerization-induced phase separation. Our Monte Carlo simulation scheme is versatile for studying polymerization kinetics in batch and semi-batch processes.

  12. Multi-objective optimization of glycopeptide antibiotic production in batch and fed batch processes

    DEFF Research Database (Denmark)

    Maiti, Soumen K.; Eliasson Lantz, Anna; Bhushan, Mani

    2011-01-01

    batch operations using a process model for Amycolatopsis balhimycina, a glycopeptide antibiotic producer. This resulted in a set of several Pareto-optimal solutions, with the two objectives ranging from (0.75 g l⁻¹, 3.97 g $⁻¹) to (0.44 g l⁻¹, 5.19 g $⁻¹) for batch and from (1.5 g l⁻¹, 5.46 g $⁻¹) to (1.1 g l⁻¹, 6.34 g...

  13. Medication waste reduction in pediatric pharmacy batch processes.

    Science.gov (United States)

    Toerper, Matthew F; Veltri, Michael A; Hamrock, Eric; Mollenkopf, Nicole L; Holt, Kristen; Levin, Scott

    2014-04-01

    To inform pediatric cart-fill batch scheduling for reductions in pharmaceutical waste using a case study and simulation analysis. A pre- and post-intervention and simulation analysis was conducted over 3 months at a 205-bed children's center. An algorithm was developed to detect wasted medication based on time-stamped computerized provider order entry information. The algorithm was used to quantify pharmaceutical waste and associated costs for both the preintervention (1 batch per day) and postintervention (3 batches per day) schedules. Further, simulation was used to systematically test 108 batch schedules, outlining general characteristics that affect the likelihood of waste. Switching from a 1-batch-per-day to a 3-batch-per-day schedule resulted in a 31.3% relative decrease in pharmaceutical waste (28.7% to 19.7%) and annual cost savings of $183,380. Simulation results demonstrate how increasing batch frequency facilitates a more just-in-time process that reduces waste. The most substantial gains are realized by shifting from a schedule of 1 batch per day to at least 2 batches per day. The simulation exhibits how waste reduction is also achievable by avoiding batch preparation during daily time periods where medication administration or medication discontinuations are frequent. Last, the simulation was used to show how reducing batch preparation time per batch provides some, albeit minimal, opportunity to decrease waste. The case study and simulation analysis demonstrate characteristics of batch scheduling that may support pediatric pharmacy managers in redesign toward minimizing pharmaceutical waste.

  14. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    International Nuclear Information System (INIS)

    Lee, Myung Eun; Kim, Jong Hyo; Woo, Bo Yeong; Ko, Micheal D.; Jamshidi, Neema

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability; thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  15. A comparison of semi-automated volumetric vs linear measurement of small vestibular schwannomas.

    Science.gov (United States)

    MacKeith, Samuel; Das, Tilak; Graves, Martin; Patterson, Andrew; Donnelly, Neil; Mannion, Richard; Axon, Patrick; Tysome, James

    2018-04-01

    Accurate and precise measurement of vestibular schwannoma (VS) size is key to clinical management decisions. Linear measurements are used in routine clinical practice but are prone to measurement error. This study aims to compare a semi-automated volume segmentation tool against the standard linear method for measuring small VS. The study also examines whether oblique tumour orientation can contribute to linear measurement error. Design: experimental comparison of observer agreement using two measurement techniques. Setting: tertiary skull base unit. Participants: twenty-four patients with unilateral sporadic small VS. Outcome measures included tumour volume and maximum linear dimension following reformatting to correct for oblique orientation of VS. Intra-observer ICC was higher for semi-automated volumetric than for linear measurements, 0.998 (95% CI 0.994-0.999) vs 0.936 (95% CI 0.856-0.972); inter-observer ICC was likewise higher for volumetric than for linear measurements, 0.989 (95% CI 0.975-0.995) vs 0.946 (95% CI 0.880-0.976), p = 0.0045. The intra-observer %SDD was similar for volumetric and linear measurements, 9.9% vs 11.8%. However, the inter-observer %SDD was greater for volumetric than linear measurements, 20.1% vs 10.6%. Following oblique reformatting to correct tumour angulation, the mean increase in size was 1.14 mm (p = 0.04). Semi-automated volumetric measurements are more repeatable than linear measurements when measuring small VS and should be considered for use in clinical practice. Oblique orientation of VS may contribute to linear measurement error.

  16. Quality of radiomic features in glioblastoma multiforme: Impact of semi-automated tumor segmentation software

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Myung Eun; Kim, Jong Hyo [Center for Medical-IT Convergence Technology Research, Advanced Institutes of Convergence Technology, Seoul National University, Suwon (Korea, Republic of); Woo, Bo Yeong [Dept. of Transdisciplinary Studies, Graduate School of Convergence Science and Technology, Seoul National University, Suwon (Korea, Republic of); Ko, Micheal D.; Jamshidi, Neema [Dept. of Radiological Sciences, University of California, Los Angeles, Los Angeles (United States)

    2017-06-15

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥1), while above 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability; thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  17. Comparison of semiautomated bird song recognition with manual detection of recorded bird song samples

    Directory of Open Access Journals (Sweden)

    Lisa A. Venier

    2017-12-01

    Automated recording units are increasingly being used to sample wildlife populations. These devices can produce large amounts of data that are difficult to process manually. However, the information in the recordings can be summarized with semiautomated sound recognition software. Our objective was to assess the utility of semiautomated bird song recognizers to produce data useful for conservation and sustainable forest management applications. We compared detection data generated from expert-interpreted recordings of bird songs collected with automated recording units and data derived from a semiautomated recognition process. We recorded bird songs at 109 sites in boreal forest in 2013 and 2014 using automated recording units. We developed bird-song recognizers for 10 species using Song Scope software (Wildlife Acoustics), and each recognizer was used to scan a set of recordings that was also interpreted manually by an expert in birdsong identification. We used occupancy models to estimate the detection probability associated with each method, and based on these detection probability estimates we produced cumulative detection probability curves. In a second analysis, we estimated the detection probability of bird song recognizers using multiple 10-minute recordings for a single station and visit (35-63 10-minute recordings in each of four one-week periods). Results show that the detection probability of most species from single 10-min recordings is substantially higher using expert-interpreted bird song recordings than using the song recognizer software. However, our results also indicate that detection probabilities for song recognizers can be significantly improved by using more than a single 10-minute recording, which can be easily done at little additional cost with the automated procedure. Based on these results, we suggest that automated recording units and song recognizer software can be valuable tools to estimate detection probability and
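
    The gain from scanning many recordings follows directly from the usual independence assumption behind cumulative detection curves. A worked sketch with hypothetical per-recording detection probabilities (not the study's estimates):

      # cumulative detection probability over n recordings: 1 - (1 - p)^n
      p_recognizer, p_expert = 0.3, 0.6   # illustrative per-recording values

      for n in (1, 5, 10):
          print(n, round(1 - (1 - p_recognizer) ** n, 2))
      # 1 -> 0.30, 5 -> 0.83, 10 -> 0.97: with ~10 cheap automated scans the
      # recognizer overtakes a single expert-interpreted recording (0.60)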

  18. Vessel suppressed chest Computed Tomography for semi-automated volumetric measurements of solid pulmonary nodules.

    Science.gov (United States)

    Milanese, Gianluca; Eberhard, Matthias; Martini, Katharina; Vittoria De Martini, Ilaria; Frauenfelder, Thomas

    2018-04-01

    To evaluate whether vessel-suppressed computed tomography (VSCT) can be reliably used for semi-automated volumetric measurements of solid pulmonary nodules, as compared to standard CT (SCT). MATERIAL AND METHODS: Ninety-three SCT were processed by dedicated software (ClearRead CT, Riverain Technologies, Miamisburg, OH, USA) that allows vessels to be subtracted from lung parenchyma. Semi-automated volumetric measurements of 65 solid nodules were compared between SCT and VSCT. The measurements were repeated by two readers. For each solid nodule, the volumes measured on SCT by Reader 1 and Reader 2 were averaged, and this average volume between readers acted as the standard-of-reference value. Concordance between measurements was assessed using Lin's Concordance Correlation Coefficient (CCC). Limits of agreement (LoA) between readers and CT datasets were evaluated. Standard-of-reference nodule volume ranged from 13 to 366 mm³. The mean overestimation between readers was 3 mm³ and 2.9 mm³ on SCT and VSCT, respectively. Semi-automated volumetric measurements on VSCT showed substantial agreement with the standard of reference (Lin's CCC = 0.990 for Reader 1; 0.985 for Reader 2). The upper and lower LoA between readers' measurements were (16.3, -22.4 mm³) and (15.5, -21.4 mm³) for SCT and VSCT, respectively. VSCT datasets are feasible for the measurement of solid nodules, showing an almost perfect concordance between readers and with measurements on SCT. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Semi-automated scoring of triple-probe FISH in human sperm using confocal microscopy.

    Science.gov (United States)

    Branch, Francesca; Nguyen, GiaLinh; Porter, Nicholas; Young, Heather A; Martenies, Sheena E; McCray, Nathan; Deloid, Glen; Popratiloff, Anastas; Perry, Melissa J

    2017-09-01

    Structural and numerical sperm chromosomal aberrations result from abnormal meiosis and are directly linked to infertility. Any live births that arise from aneuploid conceptuses can result in syndromes such as Klinefelter, Turner, XYY and Edwards. Multi-probe fluorescence in situ hybridization (FISH) is commonly used to study sperm aneuploidy; however, manual FISH scoring in sperm samples is labor-intensive and introduces errors. Automated scoring methods are continuously evolving. One challenging aspect of optimizing automated sperm FISH scoring has been the overlap in excitation and emission of the fluorescent probes used to enumerate the chromosomes of interest. Our objective was to demonstrate the feasibility of combining confocal microscopy and spectral imaging with high-throughput methods for accurately measuring sperm aneuploidy. Our approach used confocal microscopy to analyze numerical chromosomal abnormalities in human sperm using enhanced slide preparation and rigorous semi-automated scoring methods. FISH for chromosomes X, Y, and 18 was conducted to determine sex chromosome disomy in sperm nuclei. Application of online spectral linear unmixing was used for effective separation of four fluorochromes while decreasing data acquisition time. Semi-automated image processing, segmentation, classification, and scoring were performed on 10 slides using custom image processing and analysis software, and results were compared with manual methods. No significant differences in disomy frequencies were seen between the semi-automated and manual methods. Samples treated with pepsin were observed to have reduced background autofluorescence and more uniform distribution of cells. These results demonstrate that semi-automated methods using spectral imaging on a confocal platform are a feasible approach for analyzing numerical chromosomal aberrations in sperm, and are comparable to manual methods. © 2017 International Society for Advancement of Cytometry.
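
    Linear spectral unmixing reduces, per pixel, to a small least-squares problem against known fluorochrome endmember spectra. A minimal sketch with synthetic spectra (illustrative only; the instrument's online unmixing implementation is not described here):

      import numpy as np

      rng = np.random.default_rng(2)
      S = np.abs(rng.standard_normal((32, 4)))    # 32 spectral bins x 4 fluorochrome spectra
      true_abundance = np.array([0.7, 0.0, 0.3, 0.0])
      pixel = S @ true_abundance + 0.01 * rng.standard_normal(32)  # measured mixed spectrum

      # solve S a ~= pixel for the per-fluorochrome abundances
      abundance, *_ = np.linalg.lstsq(S, pixel, rcond=None)
      print(np.round(np.clip(abundance, 0, None), 2))  # ~[0.7, 0, 0.3, 0]
      # production code would enforce non-negativity properly, e.g. with NNLS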

  20. A Framework for Semi-Automated Implementation of Multidimensional Data Models

    Directory of Open Access Journals (Sweden)

    Ilona Mariana NAGY

    2012-08-01

    Data warehousing solution development represents a challenging task which requires the employment of considerable resources on behalf of enterprises and sustained commitment from the stakeholders. Costs derive mostly from the amount of time invested in the design and physical implementation of these large projects, time that, we consider, may be decreased through the automation of several processes. Thus, we present a framework for semi-automated implementation of multidimensional data models and introduce an automation prototype intended to reduce the time of data structure generation in the warehousing environment. Our research is focused on the design of an automation component and the development of a corresponding prototype from technical metadata.

  1. PyParse: a semiautomated system for scoring spoken recall data.

    Science.gov (United States)

    Solway, Alec; Geller, Aaron S; Sederberg, Per B; Kahana, Michael J

    2010-02-01

    Studies of human memory often generate data on the sequence and timing of recalled items, but scoring such data using conventional methods is difficult or impossible. We describe a Python-based semiautomated system that greatly simplifies this task. This software, called PyParse, can easily be used in conjunction with many common experiment authoring systems. Scored data is output in a simple ASCII format and can be accessed with the programming language of choice, allowing for the identification of features such as correct responses, prior-list intrusions, extra-list intrusions, and repetitions.

  2. Màquina omplidora semiautomàtica per zona ATEX [Semi-automatic filling machine for ATEX zones]

    OpenAIRE

    Tiñena Guiamet, Jaume

    2014-01-01

    This project develops a semi-automatic filling machine for ATEX zones (Appareils destinés à être utilisés en ATmosphères EXplosives). The machine is able to fill all types of containers with capacities from 20 to 1500 litres. To make this process possible, a conveyor system moves the empty pallet to the dosing area, where the operator manually positions the dosing head before starting the automatic filling cycle. The opera...

  3. Application of semi-automated ultrasonography on nutritional support for severe acute pancreatitis.

    Science.gov (United States)

    Li, Ying; Ye, Yu; Yang, Mei; Ruan, Haiying; Yu, Yuan

    2018-04-25

    To evaluate the application value of semi-automated ultrasound guidance of nasogastrojejunal tube placement for patients with severe acute pancreatitis (SAP), as well as the value of nutritional support in standardized clinical treatment. This retrospective study was performed in our hospital, and 34 patients suffering from SAP were enrolled. All identified participants received CT scans in order to make definitive diagnoses. These patients then received semi-automated ultrasound examinations within 1 day after onset, in order to provide enteral nutrition via nasogastrojejunal tube, or freehand nasogastrojejunal tube placement. For the statistical analysis, the application value of semi-automated ultrasound guidance of nasogastrojejunal tube placement was evaluated and compared with unguided tube placement. After catheterization, additional enteral nutrition was provided, and its therapeutic effect on SAP was analyzed further. A total of 34 patients with pancreatitis were identified in this research, 29 cases with necrosis of the pancreas parenchyma. After further examination, 32 cases were SAP and 2 cases were mild acute pancreatitis. Once the firm diagnosis was made, additional enteral nutrition (EN) was given; all patient conditions appeared good, and all were satisfied with this kind of nutritional support. According to our clinical experience, when there was 200-250 ml of liquid in the stomach, the success rate of intubation appeared higher. Additionally, a comparison between ultrasound-guided and freehand nasogastrojejunal tube placement was made. According to the statistical results, the utilization ratio of nutritional support was better in the ultrasound-guided group than in the freehand group within 1 day, after 3 days and after 7 days (7/20 versus 2/14; P < 0.05), while ... between groups was not statistically different (P > 0.05). It can

  4. Optimal operation of batch membrane processes

    CERN Document Server

    Paulen, Radoslav

    2016-01-01

    This study concentrates on a general optimization of a particular class of membrane separation processes: those involving batch diafiltration. Existing practices are explained and operational improvements based on optimal control theory are suggested. The first part of the book introduces the theory of membrane processes, optimal control and dynamic optimization. Separation problems are defined and mathematical models of batch membrane processes derived. The control theory focuses on problems of dynamic optimization from a chemical-engineering point of view. Analytical and numerical methods that can be exploited to treat problems of optimal control for membrane processes are described. The second part of the text builds on this theoretical basis to establish solutions for membrane models of increasing complexity. Each chapter starts with a derivation of optimal operation and continues with case studies exemplifying various aspects of the control problems under consideration. The authors work their way from th...

  5. Reproducibility of surface roughness in reaming

    DEFF Research Database (Denmark)

    Müller, Pavel; De Chiffre, Leonardo

    An investigation on the reproducibility of surface roughness in reaming was performed to document the applicability of this approach for testing cutting fluids. Austenitic stainless steel was used as a workpiece material and HSS reamers as cutting tools. Reproducibility of the results was evaluat...

  6. Batch calculations in CalcHEP

    International Nuclear Information System (INIS)

    Pukhov, A.

    2003-01-01

    CalcHEP is a clone of the CompHEP project which is developed by the author outside of the CompHEP group. CompHEP/CalcHEP are packages for the automatic calculation of elementary particle decay and collision properties in the lowest order of perturbation theory. The main idea behind the packages is to make the passage from the Lagrangian to the final distributions effective, with a high level of automation. Accordingly, the packages were created as menu-driven, user-friendly programs for calculations in the interactive mode. On the other hand, long calculations should be done in a non-interactive regime. Thus, from the beginning, CompHEP has faced the problem of batch calculations. In CompHEP 33.23 the batch session was realized by means of an interactive menu which allows the user to formulate the task for the batch; after that, the non-interactive session is launched. This approach is too restricted and inflexible, and leads to duplication in programming. In this article I discuss another approach: how one can force an interactive program to work in non-interactive mode. This approach was realized in CalcHEP 2.1, available at http://theory.sinp.msu.ru/~pukhov/calchep.html

  7. Pollution prevention applications in batch manufacturing operations

    Science.gov (United States)

    Sykes, Derek W.; O'Shaughnessy, James

    2004-02-01

    Older, "low-tech" batch manufacturing operations are often fertile grounds for gains resulting from pollution prevention techniques. This paper presents a pollution prevention technique utilized for wastewater discharge permit compliance purposes at a batch manufacturer of detergents, deodorants, and floor-care products. This manufacturer generated industrial wastewater as a result of equipment rinses required after each product batch changeover. After investing a significant amount of capital on end of pip-line wastewater treatment technology designed to address existing discharge limits, this manufacturer chose to investigate alternate, low-cost approaches to address anticipated new permit limits. Mass balances using spreadsheets and readily available formulation and production data were conducted on over 300 products to determine how each individual product contributed to the total wastewater pollutant load. These mass balances indicated that 22 products accounted for over 55% of the wastewater pollutant. Laboratory tests were conducted to determine whether these same products could accept their individual changeover rinse water as make-up water in formulations without sacrificing product quality. This changeover reuse technique was then implement at the plant scale for selected products. Significant reductions in wastewater volume (25%) and wastewater pollutant loading (85+%) were realized as a direct result of this approach.

  8. Reproducibility principles, problems, practices, and prospects

    CERN Document Server

    Maasen, Sabine

    2016-01-01

    Featuring peer-reviewed contributions from noted experts in their fields of research, Reproducibility: Principles, Problems, Practices, and Prospects presents state-of-the-art approaches to reproducibility, the gold standard of sound science, from multi- and interdisciplinary perspectives. Including comprehensive coverage for implementing and reflecting the norm of reproducibility in various pertinent fields of research, the book focuses on how the reproducibility of results is applied, how it may be limited, and how such limitations can be understood or even controlled in the natural sciences, computational sciences, life sciences, social sciences, and studies of science and technology. The book presents many chapters devoted to a variety of methods and techniques, as well as their epistemic and ontological underpinnings, which have been developed to safeguard reproducible research and curtail deficits and failures. The book also investigates the political, historical, and social practices that underlie repro...

  9. On-line Scheduling Of Multi-Server Batch Operations

    NARCIS (Netherlands)

    van der Zee, D.J.; van Harten, A.; Schuur, P.C.

    1999-01-01

    Batching jobs in a manufacturing system is a very common policy in most industries. The main reasons for batching are the avoidance of setups and/or the facilitation of material handling. Good examples of batch-wise production systems are the ovens found in the aircraft industry and in semiconductor manufacturing.

  10. On-line scheduling of multi-server batch operations

    NARCIS (Netherlands)

    Zee, Durk Jouke van der; Harten, Aart van; Schuur, Peter

    The batching of jobs in a manufacturing system is a very common policy in many industries. The main reasons for batching are the avoidance of setups and/or facilitation of material handling. Good examples of batch-wise production systems are the ovens that are found in the aircraft industry and in

  11. 7 CFR 58.728 - Cooking the batch.

    Science.gov (United States)

    2010-01-01

    7 CFR 58.728 (Agricultural Marketing Service, Department of Agriculture): Cooking the batch. Each batch of cheese within the cooker, including the optional...

  12. 40 CFR 63.1408 - Aggregate batch vent stream provisions.

    Science.gov (United States)

    2010-07-01

    40 CFR 63.1408: Aggregate batch vent stream provisions. (a) Emission standards. Owners or operators of aggregate batch vent streams at a new or existing affected source shall comply with either paragraph (a)(1...

  13. Percutaneous biopsy of a metastatic common iliac lymph node using hydrodissection and a semi-automated biopsy gun

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Seong Yoon; Park, Byung Kwan [Dept. of Radiology, Samsung Medical Center, Sungkyunkwan University School of Medicine, Seoul (Korea, Republic of)

    2017-06-15

    Percutaneous biopsy is a less invasive technique for sampling the tissue than laparoscopic biopsy or exploratory laparotomy. However, it is difficult to perform biopsy of a deep-seated lesion because of the possibility of damage to the critical organs. Recently, we successfully performed CT-guided biopsy of a metastatic common iliac lymph node using hydrodissection and semi-automated biopsy devices. The purpose of this case report was to show how to perform hydrodissection and how to use a semi-automated gun for safe biopsy of a metastatic common iliac lymph node.

  14. Volumetric analysis of pelvic hematomas after blunt trauma using semi-automated seeded region growing segmentation: a method validation study.

    Science.gov (United States)

    Dreizin, David; Bodanapally, Uttam K; Neerchal, Nagaraj; Tirada, Nikki; Patlas, Michael; Herskovits, Edward

    2016-11-01

    Manually segmented traumatic pelvic hematoma volumes are strongly predictive of active bleeding at conventional angiography, but the method is time intensive, limiting its clinical applicability. We compared volumetric analysis using semi-automated region growing segmentation to manual segmentation and diameter-based size estimates in patients with pelvic hematomas after blunt pelvic trauma. A 14-patient cohort was selected in an anonymous randomized fashion from a dataset of patients with pelvic binders at MDCT, collected retrospectively as part of a HIPAA-compliant IRB-approved study from January 2008 to December 2013. To evaluate intermethod differences, one reader (R1) performed three volume measurements using the manual technique and three volume measurements using the semi-automated technique. To evaluate interobserver differences for semi-automated segmentation, a second reader (R2) performed three semi-automated measurements. One-way analysis of variance was used to compare differences in mean volumes. Time effort was also compared. Correlation between the two methods as well as two shorthand appraisals (greatest diameter, and the ABC/2 method for estimating ellipsoid volumes) was assessed with Spearman's rho (r). Intraobserver variability was lower for semi-automated compared to manual segmentation, with standard deviations ranging between ±5-32 mL and ±17-84 mL, respectively (p = 0.0003). There was no significant difference in mean volumes between the two readers' semi-automated measurements (p = 0.83); however, means were lower for the semi-automated compared with the manual technique (manual: mean and SD 309.6 ± 139 mL; R1 semi-auto: 229.6 ± 88.2 mL, p = 0.004; R2 semi-auto: 243.79 ± 99.7 mL, p = 0.021). Despite differences in means, the correlation between the two methods was very strong and highly significant (r = 0.91, p < 0.0001). Semi-automated hematoma volumes correlate strongly with manually segmented volumes. Since semi-automated segmentation
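
    For reference, the ABC/2 shorthand mentioned above approximates a hematoma as an ellipsoid: volume ≈ (A × B × C)/2 for the three orthogonal diameters. A minimal sketch in Python, with hypothetical measurements, comparing ABC/2 estimates against segmented volumes with Spearman's rho:

        import numpy as np
        from scipy.stats import spearmanr

        # Hypothetical paired measurements for five hematomas.
        segmented = np.array([310.0, 150.0, 420.0, 95.0, 260.0])  # semi-automated volumes (mL)
        diameters = np.array([[12.0, 9.0, 8.5],                   # A, B, C diameters (cm)
                              [9.0, 7.0, 6.0],
                              [14.0, 10.0, 9.0],
                              [7.5, 6.0, 5.0],
                              [11.0, 8.5, 8.0]])

        # ABC/2 approximates an ellipsoid volume from three orthogonal diameters;
        # 1 cm^3 equals 1 mL.
        abc2 = np.prod(diameters, axis=1) / 2.0

        rho, p = spearmanr(segmented, abc2)
        print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")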

  15. Semi-automated solid phase extraction method for the mass spectrometric quantification of 12 specific metabolites of organophosphorus pesticides, synthetic pyrethroids, and select herbicides in human urine.

    Science.gov (United States)

    Davis, Mark D; Wade, Erin L; Restrepo, Paula R; Roman-Esteva, William; Bravo, Roberto; Kuklenyik, Peter; Calafat, Antonia M

    2013-06-15

    Organophosphate and pyrethroid insecticides and phenoxyacetic acid herbicides represent important classes of pesticides applied in commercial and residential settings. Interest in assessing the extent of human exposure to these pesticides exists because of their widespread use and their potential adverse health effects. An analytical method for measuring 12 biomarkers of several of these pesticides in urine has been developed. The target analytes were extracted from one milliliter of urine by a semi-automated solid phase extraction technique, separated from each other and from other urinary biomolecules by reversed-phase high performance liquid chromatography, and detected using tandem mass spectrometry with isotope dilution quantitation. This method can be used to measure all the target analytes in one injection, with repeatability and detection limits similar to those of previous methods which required more than one injection. Each step of the procedure was optimized to produce a robust, reproducible, accurate, precise and efficient method. The required selectivity and sensitivity for trace-level analysis (e.g., limits of detection below 0.5 ng/mL) was achieved using a narrow diameter analytical column, higher than unit mass resolution for certain analytes, and stable isotope labeled internal standards. The method was applied to the analysis of 55 samples collected from adult anonymous donors with no known exposure to the target pesticides. This efficient and cost-effective method is adequate to handle the large number of samples required for national biomonitoring surveys. Published by Elsevier B.V.

  16. Response variation in a batch of TLDS

    International Nuclear Information System (INIS)

    Burrage, J.; Campbell, A.

    2004-01-01

    Full text: At Royal Perth Hospital, LiF thermoluminescent dosimeter rods (TLDs) are handled in batches of 50. Rods in each batch are always annealed together to ensure the same thermal history and an individual batch is used with the same type and energy of radiation. A subset of a batch is used for calibration purposes by exposing them to a range of known doses and their output is used to calculate the dose received by other rods used for a dose measurement. Variation in TLD response is addressed by calculating 95% certainty levels from the calibration rods and applying this to the dose measurement rods. This approach relies on the sensitivity of rods within each batch being similar. This work investigates the validity of this assumption and considers possible benefits of applying individual rod sensitivities. The variation in response of TLD rods was assessed using 25 TLD-100 rods (Harshaw/Bicron) which were uniformly exposed to 1 Gy using 6 MeV photons in a linear accelerator on 5 separate occasions. Rods were read with a Harshaw 5500 reader. During the read process the Harshaw reader periodically checks for noise and PMT gain drift and the data were corrected for these parameters. Replicate exposure data were analysed using 1-way Analysis of Variance (ANOVA) to determine whether the between rod variations were significantly different to the variations within a single rod. A batch of 50 rods was also exposed on three occasions using the above technique. Individual TLD rod sensitivity values were determined using the rod responses from 2 exposures and these values were applied to correct charges on a rod-by-rod basis for the third exposure. ANOVA results on the 5 exposures of 25 rods showed the variance between rods was significantly greater than the within rod variance (p < 0.001). The precision of an individual rod was estimated to have a standard deviation of 2.8%. This suggests that the 95% confidence limits for repeated measurements using the same dose and
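
    The between-rod versus within-rod comparison at the heart of this record is a one-way ANOVA; a minimal sketch with simulated readings (not the authors' data):

        import numpy as np
        from scipy.stats import f_oneway

        rng = np.random.default_rng(1)

        # Simulated readings: 25 rods x 5 replicate 1 Gy exposures, with a small
        # per-rod sensitivity offset so between-rod variance exceeds within-rod variance.
        rod_sensitivity = rng.normal(1.0, 0.02, size=25)
        readings = [s * rng.normal(1.0, 0.01, size=5) for s in rod_sensitivity]

        f_stat, p = f_oneway(*readings)
        print(f"F = {f_stat:.1f}, p = {p:.2e}")  # small p: rods differ systematically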

  17. BATCH-GE: Batch analysis of Next-Generation Sequencing data for genome editing assessment

    Science.gov (United States)

    Boel, Annekatrien; Steyaert, Woutert; De Rocker, Nina; Menten, Björn; Callewaert, Bert; De Paepe, Anne; Coucke, Paul; Willaert, Andy

    2016-01-01

    Targeted mutagenesis by the CRISPR/Cas9 system is currently revolutionizing genetics. The ease of this technique has enabled genome engineering in-vitro and in a range of model organisms and has pushed experimental dimensions to unprecedented proportions. Due to its tremendous progress in terms of speed, read length, throughput and cost, Next-Generation Sequencing (NGS) has been increasingly used for the analysis of CRISPR/Cas9 genome editing experiments. However, the current tools for genome editing assessment lack flexibility and fall short in the analysis of large amounts of NGS data. Therefore, we designed BATCH-GE, an easy-to-use bioinformatics tool for batch analysis of NGS-generated genome editing data, available from https://github.com/WouterSteyaert/BATCH-GE.git. BATCH-GE detects and reports indel mutations and other precise genome editing events and calculates the corresponding mutagenesis efficiencies for a large number of samples in parallel. Furthermore, this new tool provides flexibility by allowing the user to adapt a number of input variables. The performance of BATCH-GE was evaluated in two genome editing experiments, aiming to generate knock-out and knock-in zebrafish mutants. This tool will not only contribute to the evaluation of CRISPR/Cas9-based experiments, but will be of use in any genome editing experiment and has the ability to analyze data from every organism with a sequenced genome. PMID:27461955

  18. Reliability of Semiautomated Computational Methods for Estimating Tibiofemoral Contact Stress in the Multicenter Osteoarthritis Study

    Directory of Open Access Journals (Sweden)

    Donald D. Anderson

    2012-01-01

    Recent findings suggest that contact stress is a potent predictor of subsequent symptomatic osteoarthritis development in the knee. However, much larger numbers of knees (likely on the order of hundreds, if not thousands) need to be reliably analyzed to achieve the statistical power necessary to clarify this relationship. This study assessed the reliability of new semiautomated computational methods for estimating contact stress in knees from large population-based cohorts. Ten knees of subjects from the Multicenter Osteoarthritis Study were included. Bone surfaces were manually segmented from sequential 1.0 Tesla magnetic resonance imaging slices by three individuals on two nonconsecutive days. Four individuals then registered the resulting bone surfaces to corresponding bone edges on weight-bearing radiographs, using a semi-automated algorithm. Discrete element analysis methods were used to estimate contact stress distributions for each knee. Segmentation and registration reliabilities (day-to-day and interrater) for peak and mean medial and lateral tibiofemoral contact stress were assessed with Shrout-Fleiss intraclass correlation coefficients (ICCs). The segmentation and registration steps of the modeling approach were found to have excellent day-to-day (ICC 0.93-0.99) and good inter-rater reliability (ICC 0.84-0.97). This approach for estimating compartment-specific tibiofemoral contact stress appears to be sufficiently reliable for use in large population-based cohorts.

  19. Semi-automated, occupationally safe immunofluorescence microtip sensor for rapid detection of Mycobacterium cells in sputum.

    Directory of Open Access Journals (Sweden)

    Shinnosuke Inoue

    An occupationally safe (biosafe) sputum liquefaction protocol was developed for use with a semi-automated antibody-based microtip immunofluorescence sensor. The protocol effectively liquefied sputum and inactivated microorganisms including Mycobacterium tuberculosis, while preserving the antibody-binding activity of Mycobacterium cell surface antigens. Sputum was treated with a synergistic chemical-thermal protocol that included moderate concentrations of NaOH and detergent at 60°C for 5 to 10 min. Samples spiked with M. tuberculosis complex cells showed approximately 10⁶-fold inactivation of the pathogen after treatment. Antibody binding was retained post-treatment, as determined by analysis with a microtip immunosensor. The sensor correctly distinguished between Mycobacterium species and other cell types naturally present in biosafe-treated sputum, with a detection limit of 100 CFU/mL for M. tuberculosis, in a 30-minute sample-to-result process. The microtip device was also semi-automated and shown to be compatible with low-cost, LED-powered fluorescence microscopy. The device and biosafe sputum liquefaction method open the door to rapid detection of tuberculosis in settings with limited laboratory infrastructure.

  20. Investigating Semi-Automated Cadastral Boundaries Extraction from Airborne Laser Scanned Data

    Directory of Open Access Journals (Sweden)

    Xianghuan Luo

    2017-09-01

    Many developing countries have witnessed the urgent need to accelerate cadastral surveying processes. Previous studies found that large portions of cadastral boundaries coincide with visible physical objects, namely roads, fences, and building walls. This research explores the application of airborne laser scanning (ALS) techniques to cadastral surveys. A semi-automated workflow is developed to extract cadastral boundaries from ALS point clouds. Firstly, a two-phased workflow was developed that focused on extracting digital representations of physical objects. In the automated extraction phase, after classifying points into semantic components, the outlines of planar objects such as building roofs and road surfaces were generated by an α-shape algorithm, whilst a centerline delineation approach was fitted to the linear object, a fence. Afterwards, the extracted vector lines were edited and refined during the post-refinement phase. Secondly, we quantitatively evaluated the workflow performance by comparing results against an existing cadastral map as reference. It was found that the workflow achieved promising results, around 80% completeness and 60% correctness on average, although the spatial accuracy is still modest. It is argued that the semi-automated extraction workflow could effectively speed up cadastral surveying, reducing both human resource and equipment costs.
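
    The completeness and correctness figures quoted above are the standard detection-quality metrics; a minimal sketch, with hypothetical match counts:

        def completeness(true_pos: int, false_neg: int) -> float:
            # Fraction of reference boundaries that the workflow recovered.
            return true_pos / (true_pos + false_neg)

        def correctness(true_pos: int, false_pos: int) -> float:
            # Fraction of extracted boundaries that match the reference.
            return true_pos / (true_pos + false_pos)

        # Hypothetical counts for one test area:
        print(f"completeness = {completeness(160, 40):.0%}")  # 80%
        print(f"correctness  = {correctness(160, 107):.0%}")  # 60%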

  1. Dynamic CT myocardial perfusion imaging: performance of 3D semi-automated evaluation software

    Energy Technology Data Exchange (ETDEWEB)

    Ebersberger, Ullrich [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany); Marcus, Roy P.; Nikolaou, Konstantin; Bamberg, Fabian [University of Munich, Institute of Clinical Radiology, Munich (Germany); Schoepf, U.J.; Gray, J.C.; McQuiston, Andrew D. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Lo, Gladys G. [Hong Kong Sanatorium and Hospital, Department of Diagnostic and Interventional Radiology, Hong Kong (China); Wang, Yining [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Peking Union Medical College Hospital, Chinese Academy of Medical Sciences, Department of Radiology, Beijing (China); Blanke, Philipp [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University Hospital Freiburg, Department of Diagnostic Radiology, Freiburg (Germany); Geyer, Lucas L. [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); University of Munich, Institute of Clinical Radiology, Munich (Germany); Cho, Young Jun [Medical University of South Carolina, Heart and Vascular Center, Charleston, SC (United States); Konyang University College of Medicine, Department of Radiology, Daejeon (Korea, Republic of); Scheuering, Michael; Canstein, Christian [Siemens Healthcare, CT Division, Forchheim (Germany); Hoffmann, Ellen [Heart Center Munich-Bogenhausen, Department of Cardiology and Intensive Care Medicine, Munich (Germany)

    2014-01-15

    To evaluate the performance of three-dimensional semi-automated evaluation software for the assessment of myocardial blood flow (MBF) and blood volume (MBV) at dynamic myocardial perfusion computed tomography (CT). Volume-based software relying on marginal space learning and probabilistic boosting tree-based contour fitting was applied to CT myocardial perfusion imaging data of 37 subjects. In addition, all image data were analysed manually and both approaches were compared with SPECT findings. Study endpoints included time of analysis and conventional measures of diagnostic accuracy. Of 592 analysable segments, 42 showed perfusion defects on SPECT. Average analysis times for the manual and software-based approaches were 49.1 ± 11.2 and 16.5 ± 3.7 min, respectively (P < 0.01). There was strong agreement between the two measures of interest (MBF, ICC = 0.91, and MBV, ICC = 0.88, both P < 0.01), and no significant difference in diagnostic accuracy between the manual and software-based approaches for either MBF or MBV (all comparisons P > 0.05). Three-dimensional semi-automated evaluation of dynamic myocardial perfusion CT data provides similar measures and diagnostic accuracy to manual evaluation, albeit with substantially reduced analysis times. This capability may aid the integration of this test into clinical workflows. (orig.)

  2. Quality of Radiomic Features in Glioblastoma Multiforme: Impact of Semi-Automated Tumor Segmentation Software.

    Science.gov (United States)

    Lee, Myungeun; Woo, Boyeong; Kuo, Michael D; Jamshidi, Neema; Kim, Jong Hyo

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to the contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and the Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most features showed good NDR (NDR ≥ 1), while above 35% of the texture features showed poor NDR (NDR < 1). Both software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  3. A semi-automated algorithm for hypothalamus volumetry in 3 Tesla magnetic resonance images.

    Science.gov (United States)

    Wolff, Julia; Schindler, Stephanie; Lucas, Christian; Binninger, Anne-Sophie; Weinrich, Luise; Schreiber, Jan; Hegerl, Ulrich; Möller, Harald E; Leitzke, Marco; Geyer, Stefan; Schönknecht, Peter

    2018-07-30

    The hypothalamus, a small diencephalic gray matter structure, is part of the limbic system. Volumetric changes of this structure occur in psychiatric diseases; therefore, there is increasing interest in precise volumetry. Based on our detailed volumetry algorithm for 7 Tesla magnetic resonance imaging (MRI), we developed a method for 3 Tesla MRI, adopting anatomical landmarks and working in triplanar view. We overlaid T1-weighted MR images with gray matter tissue probability maps to combine anatomical information with tissue class segmentation. Then, we outlined regions of interest (ROIs) that covered potential hypothalamus voxels. Within these ROIs, a seed-growing technique helped define the hypothalamic volume using gray matter probabilities from the tissue probability maps. This yielded a semi-automated method with short processing times of 20-40 min per hypothalamus. In the MRIs of ten subjects, reliabilities were determined as intraclass correlations (ICC) and volume overlaps in percent. Three raters achieved very good intra-rater reliabilities (ICC 0.82-0.97) and good inter-rater reliabilities (ICC 0.78 and 0.82). Overlaps of intra- and inter-rater runs were very good (≥ 89.7%). We present a fast, semi-automated method for in vivo hypothalamus volumetry in 3 Tesla MRI. Copyright © 2018 Elsevier B.V. All rights reserved.
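
    The seed-growing step can be illustrated compactly. A minimal sketch in Python (a plain 6-connected region grower on a toy probability volume, not the authors' implementation):

        from collections import deque
        import numpy as np

        def region_grow(prob_map, seed, threshold=0.5):
            # Grow a 3D region from `seed`, accepting 6-connected voxels whose
            # gray matter probability exceeds `threshold`.
            mask = np.zeros(prob_map.shape, dtype=bool)
            queue = deque([seed])
            offsets = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]
            while queue:
                x, y, z = queue.popleft()
                if mask[x, y, z] or prob_map[x, y, z] < threshold:
                    continue
                mask[x, y, z] = True
                for dx, dy, dz in offsets:
                    nx, ny, nz = x + dx, y + dy, z + dz
                    if (0 <= nx < prob_map.shape[0] and 0 <= ny < prob_map.shape[1]
                            and 0 <= nz < prob_map.shape[2]):
                        queue.append((nx, ny, nz))
            return mask

        # Toy volume: a bright blob around the seed grows; the background does not.
        vol = np.zeros((20, 20, 20))
        vol[8:12, 8:12, 8:12] = 0.9
        print(region_grow(vol, (10, 10, 10)).sum())  # 64 voxels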

  4. Learning Reproducibility with a Yearly Networking Contest

    KAUST Repository

    Canini, Marco

    2017-08-10

    Better reproducibility of networking research results is currently a major goal that the academic community is striving towards. This position paper makes the case that improving the extent and pervasiveness of reproducible research can be greatly fostered by organizing a yearly international contest. We argue that holding a contest undertaken by a plurality of students will have benefits that are two-fold. First, it will promote hands-on learning of skills that are helpful in producing artifacts at the replicable-research level. Second, it will advance the best practices regarding environments, testbeds, and tools that will aid the tasks of reproducibility evaluation committees by and large.

  5. The Economics of Reproducibility in Preclinical Research.

    Directory of Open Access Journals (Sweden)

    Leonard P Freedman

    2015-06-01

    Low reproducibility rates within life science research undermine cumulative knowledge production and contribute to both delays and costs of therapeutic drug development. An analysis of past studies indicates that the cumulative (total) prevalence of irreproducible preclinical research exceeds 50%, resulting in approximately US$28,000,000,000 (US$28B) per year spent on preclinical research that is not reproducible, in the United States alone. We outline a framework for solutions and a plan for long-term improvements in reproducibility rates that will help to accelerate the discovery of life-saving therapies and cures.

  6. Thou Shalt Be Reproducible! A Technology Perspective

    Directory of Open Access Journals (Sweden)

    Patrick Mair

    2016-07-01

    This article elaborates on reproducibility in psychology from a technological viewpoint. Modern open source computational environments that foster reproducibility throughout the whole research life cycle, and to which emerging psychology researchers should be sensitized, are shown and explained. First, data archiving platforms that make datasets publicly available are presented. Second, R is advocated as the data-analytic lingua franca in psychology for achieving reproducible statistical analysis. Third, dynamic report generation environments for writing reproducible manuscripts that integrate text, data analysis, and statistical outputs such as figures and tables in a single document are described. Supplementary materials are provided in order to get the reader started with these technologies.

  7. Production of ethanol in batch and fed-batch fermentation of soluble sugar

    International Nuclear Information System (INIS)

    Chaudhary, M.Y.; Shah, M.A.; Shah, F.H.

    1991-01-01

    In view of the demand and need for alternative energy sources, especially liquid fuels, and the availability of raw materials in Pakistan, we have carried out biochemical and technological studies on the production of ethanol through fermentation of renewable substrates. Molasses and sugar cane have been used as substrates for yeast fermentation. Selected yeasts were used in both batch and semi-continuous fermentation of molasses. Clarified dilute molasses were fermented with different strains of Saccharomyces cerevisiae. Ethanol concentration after 64 hours of batch fermentation reached 9.4%, with 90% yield based on sugar content. In the fed-batch system, similar results were obtained after a fermentation cycle of 48 hours, resulting in higher productivity. Similarly, carbohydrates in fruit juices and hydrolysates of biomass can be economically fermented to ethanol to be used as feedstock for other chemicals. (author)

  8. Passing in Command Line Arguments and Parallel Cluster/Multicore Batching in R with batch.

    Science.gov (United States)

    Hoffmann, Thomas J

    2011-03-01

    It is often useful to rerun a command line R script with some slight change in the parameters used to run it - a new set of parameters for a simulation, a different dataset to process, etc. The R package batch provides a means to pass in multiple command line options, including vectors of values in the usual R format, easily into R. The same script can be setup to run things in parallel via different command line arguments. The R package batch also provides a means to simplify this parallel batching by allowing one to use R and an R-like syntax for arguments to spread a script across a cluster or local multicore/multiprocessor computer, with automated syntax for several popular cluster types. Finally it provides a means to aggregate the results together of multiple processes run on a cluster.
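
    The package itself targets R; as a rough analogue only, the same pattern of passing command-line parameters into a script and spreading runs across local cores looks like this in Python (the hypothetical run_simulation stands in for the per-run work):

        import argparse
        import random
        from multiprocessing import Pool

        def run_simulation(seed: int) -> float:
            # Placeholder for the per-run work a batch script would do.
            rng = random.Random(seed)
            return sum(rng.random() for _ in range(1000))

        if __name__ == "__main__":
            parser = argparse.ArgumentParser(description="Rerun one script over many parameters.")
            parser.add_argument("--seeds", type=int, nargs="+", default=[1, 2, 3])
            args = parser.parse_args()
            with Pool() as pool:  # spread the runs across local cores
                for seed, result in zip(args.seeds, pool.map(run_simulation, args.seeds)):
                    print(seed, result)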

  9. Highly reproducible and sensitive silver nanorod array for the rapid detection of Allura Red in candy

    Science.gov (United States)

    Yao, Yue; Wang, Wen; Tian, Kangzhen; Ingram, Whitney Marvella; Cheng, Jie; Qu, Lulu; Li, Haitao; Han, Caiqin

    2018-04-01

    Allura Red (AR) is a highly stable synthetic red azo dye, which is widely used in the food industry to dye food and increase its attraction to consumers. However, excessive consumption of AR can result in adverse health effects in humans. Therefore, a highly reproducible silver nanorod (AgNR) array was developed for surface enhanced Raman scattering (SERS) detection of AR in candy. The relative standard deviations (RSD) of AgNR substrates obtained from the same batch and from different batches were 5.7% and 11.0%, respectively, demonstrating the high reproducibility. Using these highly reproducible AgNR arrays as the SERS substrates, AR was detected successfully, and its characteristic peaks were assigned by density function theory (DFT) calculation. The limit of detection (LOD) of AR was determined to be 0.05 mg/L, with a wide linear range of 0.8-100 mg/L. Furthermore, the AgNR SERS arrays can detect AR directly in different candy samples within 3 min without any complicated pretreatment. These results suggest the AgNR array can be used for rapid and qualitative SERS detection of AR, holding great promise for expanding SERS applications in the food safety control field.
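
    The two figures of merit quoted above, batch RSD and LOD, can be computed as follows; a minimal sketch with hypothetical intensities and a 3.3·σ/slope LOD estimate (one common convention, not necessarily the authors'):

        import numpy as np

        # Hypothetical SERS peak intensities from replicate spots on one substrate batch.
        same_batch = np.array([1020.0, 980.0, 1065.0, 1010.0, 955.0, 990.0])
        rsd = 100 * same_batch.std(ddof=1) / same_batch.mean()
        print(f"RSD = {rsd:.1f}%")

        # Hypothetical calibration curve over the linear range (mg/L vs. intensity).
        conc = np.array([0.8, 5.0, 10.0, 25.0, 50.0, 100.0])
        signal = np.array([55.0, 320.0, 640.0, 1590.0, 3180.0, 6350.0])
        slope, intercept = np.polyfit(conc, signal, 1)
        resid = signal - (slope * conc + intercept)
        lod = 3.3 * resid.std(ddof=2) / slope  # ddof=2: two fitted parameters
        print(f"LOD = {lod:.2f} mg/L")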

  10. A new assay for cytotoxic lymphocytes, based on a radioautographic readout of 111In release, suitable for rapid, semi-automated assessment of limit-dilution cultures

    International Nuclear Information System (INIS)

    Shortman, K.; Wilson, A.

    1981-01-01

    A new assay for cytotoxic T lymphocytes is described, of general application, but particularly suitable for rapid, semi-automated assessment of multiple microculture tests. Target cells are labelled with high efficiency and to high specific activity with the oxine chelate of ¹¹¹In. After a 3-4 h incubation of test cells with 5 × 10³ labelled target cells in V wells of microtitre trays, samples of the supernatant are spotted on paper (5 μl) or transferred to soft-plastic U wells (25-50 μl) and the ¹¹¹In release assessed by radioautography. Overnight exposure of X-ray film with intensifying screens at −70 °C gives an image which is an intense dark spot for maximum release, a barely visible darkening with the low spontaneous release, and a definite positive with 10% specific lysis. The degree of film darkening, which can be quantitated by microdensitometry, shows a linear relationship with cytotoxic T lymphocyte dose up to the 40% lysis level. The labelling intensity and sensitivity can be adjusted over a wide range, allowing a single batch of the short half-life isotope to serve for 2 weeks. The 96 assays from a single tray are developed simultaneously on a single small sheet of film. Many trays can be processed together, and handling is rapid if 96-channel automatic pipettors are used. The method allows rapid visual scanning for positive and negative limit dilution cultures in cytotoxic T cell precursor frequency and specificity studies. In addition, in conjunction with an automated densitometer designed to scan microtitre trays, the method provides an efficient alternative to isotope counting in routine cytotoxic assays. (Auth.)
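
    Release assays of this kind are normalized with the standard percent-specific-lysis formula; a minimal sketch with hypothetical values:

        def percent_specific_lysis(test: float, spontaneous: float, maximum: float) -> float:
            # Standard normalization for isotope-release cytotoxicity assays.
            return 100.0 * (test - spontaneous) / (maximum - spontaneous)

        # Hypothetical release values (e.g., densitometry-calibrated counts):
        print(percent_specific_lysis(test=2400.0, spontaneous=800.0, maximum=8800.0))  # 20.0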

  11. CONVERSION OF PINEAPPLE JUICE WASTE INTO LACTIC ACID IN BATCH AND FED – BATCH FERMENTATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Abdullah Mochamad Busairi

    2012-01-01

    Pineapple juice waste contains valuable components, mainly sucrose, glucose, and fructose. Recently, lactic acid has been considered an important raw material for the production of biodegradable lactide polymer. The fermentation experiments were carried out in a 3 litre fermentor (Biostat B Model) under anaerobic conditions with a stirring speed of 50 rpm, temperature of 40°C, and pH of 6.00. The effect of feed concentration on lactic acid production, bacterial growth, substrate utilisation, and productivity was studied. The results obtained from fed-batch culture fermentation showed that the maximum lactic acid productivity was 0.44 g/L·h for a feed concentration of 90 g/L at 48 hours. The lactic acid productivity obtained from fed-batch culture was two and a half times higher than that of the batch culture.

  12. Archiving Reproducible Research with R and Dataverse

    DEFF Research Database (Denmark)

    Leeper, Thomas

    2014-01-01

    Reproducible research and data archiving are increasingly important issues in research involving statistical analyses of quantitative data. This article introduces the dvn package, which allows R users to publicly archive datasets, analysis files, codebooks, and associated metadata in Dataverse...

  13. Sojourn time distributions in a Markovian G-queue with batch arrival and batch removal

    Directory of Open Access Journals (Sweden)

    Yang Woo Shin

    1999-01-01

    We consider a single server Markovian queue with two types of customers, positive and negative, where positive customers arrive in batches and arrivals of negative customers remove positive customers in batches. Only positive customers form a queue, and negative customers just reduce the system congestion by removing positive ones upon their arrivals. We derive the Laplace-Stieltjes transforms (LSTs) of the sojourn time distributions for a single server Markovian queue with positive and negative customers, using first passage time arguments for Markov chains.

  14. Cadmium removal using Cladophora in batch, semi-batch and flow reactors.

    Science.gov (United States)

    Sternberg, Steven P K; Dorn, Ryan W

    2002-02-01

    This study presents the results of using viable algae to remove cadmium from a synthetic wastewater. In batch and semi-batch tests, a local strain of Cladophora algae removed 80-94% of the cadmium introduced. The flow experiments that followed were conducted using non-local Cladophora parriaudii. Results showed that the alga removed only 12.7 (±6.4)% of the cadmium introduced into the reactor. Limited removal was the result of insufficient algal quantities and poor contact between the algae and the cadmium solution.

  15. Reproducing Epidemiologic Research and Ensuring Transparency.

    Science.gov (United States)

    Coughlin, Steven S

    2017-08-15

    Measures for ensuring that epidemiologic studies are reproducible include making data sets and software available to other researchers so they can verify published findings, conduct alternative analyses of the data, and check for statistical errors or programming errors. Recent developments related to the reproducibility and transparency of epidemiologic studies include the creation of a global platform for sharing data from clinical trials and the anticipated future extension of the global platform to non-clinical trial data. Government agencies and departments such as the US Department of Veterans Affairs Cooperative Studies Program have also enhanced their data repositories and data sharing resources. The Institute of Medicine and the International Committee of Medical Journal Editors released guidance on sharing clinical trial data. The US National Institutes of Health has updated their data-sharing policies. In this issue of the Journal, Shepherd et al. (Am J Epidemiol. 2017;186:387-392) outline a pragmatic approach for reproducible research with sensitive data for studies for which data cannot be shared because of legal or ethical restrictions. Their proposed quasi-reproducible approach facilitates the dissemination of statistical methods and codes to independent researchers. Both reproducibility and quasi-reproducibility can increase transparency for critical evaluation, further dissemination of study methods, and expedite the exchange of ideas among researchers. © The Author(s) 2017. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Feasibility of a semi-automated method for cardiac conduction velocity analysis of high-resolution activation maps

    NARCIS (Netherlands)

    Doshi, Ashish N.; Walton, Richard D.; Krul, Sébastien P.; de Groot, Joris R.; Bernus, Olivier; Efimov, Igor R.; Boukens, Bastiaan J.; Coronel, Ruben

    2015-01-01

    Myocardial conduction velocity is important for the genesis of arrhythmias. In the normal heart, conduction is primarily dependent on fiber direction (anisotropy) and may be discontinuous at sites with tissue heterogeneities (trabeculated or fibrotic tissue). We present a semi-automated method for

  17. Semi-automated literature mining to identify putative biomarkers of disease from multiple biofluids

    Science.gov (United States)

    2014-01-01

    Background Computational methods for mining of biomedical literature can be useful in augmenting manual searches of the literature using keywords for disease-specific biomarker discovery from biofluids. In this work, we develop and apply a semi-automated literature mining method to mine abstracts obtained from PubMed to discover putative biomarkers of breast and lung cancers in specific biofluids. Methodology A positive set of abstracts was defined by the terms ‘breast cancer’ and ‘lung cancer’ in conjunction with 14 separate ‘biofluids’ (bile, blood, breastmilk, cerebrospinal fluid, mucus, plasma, saliva, semen, serum, synovial fluid, stool, sweat, tears, and urine), while a negative set of abstracts was defined by the terms ‘(biofluid) NOT breast cancer’ or ‘(biofluid) NOT lung cancer.’ More than 5.3 million total abstracts were obtained from PubMed and examined for biomarker-disease-biofluid associations (34,296 positive and 2,653,396 negative for breast cancer; 28,355 positive and 2,595,034 negative for lung cancer). Biological entities such as genes and proteins were tagged using ABNER, and processed using Python scripts to produce a list of putative biomarkers. Z-scores were calculated, ranked, and used to determine significance of putative biomarkers found. Manual verification of relevant abstracts was performed to assess our method’s performance. Results Biofluid-specific markers were identified from the literature, assigned relevance scores based on frequency of occurrence, and validated using known biomarker lists and/or databases for lung and breast cancer [NCBI’s On-line Mendelian Inheritance in Man (OMIM), Cancer Gene annotation server for cancer genomics (CAGE), NCBI’s Genes & Disease, NCI’s Early Detection Research Network (EDRN), and others]. The specificity of each marker for a given biofluid was calculated, and the performance of our semi-automated literature mining method assessed for breast and lung cancer
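
    The z-score ranking can be illustrated with a two-proportion z-statistic; a sketch under the assumption that a term's abstract counts in the positive and negative sets are compared (the authors' exact statistic may differ):

        import math

        def proportion_z(pos_hits: int, pos_total: int, neg_hits: int, neg_total: int) -> float:
            # Two-proportion z-statistic for a term being enriched in the positive set.
            p1, p2 = pos_hits / pos_total, neg_hits / neg_total
            pooled = (pos_hits + neg_hits) / (pos_total + neg_total)
            se = math.sqrt(pooled * (1 - pooled) * (1 / pos_total + 1 / neg_total))
            return (p1 - p2) / se

        # Hypothetical counts: a gene tagged in 120 of 34,296 positive abstracts
        # versus 300 of 2,653,396 negative abstracts.
        print(f"z = {proportion_z(120, 34_296, 300, 2_653_396):.1f}")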

  18. Comparison of Sedentary Behaviors in Office Workers Using Sit-Stand Tables With and Without Semiautomated Position Changes.

    Science.gov (United States)

    Barbieri, Dechristian França; Srinivasan, Divya; Mathiassen, Svend Erik; Oliveira, Ana Beatriz

    2017-08-01

    We compared usage patterns of two different electronically controlled sit-stand tables during a 2-month intervention period among office workers. Office workers spend most of their working time sitting, which is likely detrimental to health. Although the introduction of sit-stand tables has been suggested as an effective intervention to decrease sitting time, limited evidence is available on usage patterns of sit-stand tables and whether patterns are influenced by table configuration. Twelve workers were provided with standard sit-stand tables (nonautomated table group) and 12 with semiautomated sit-stand tables programmed to change table position according to a preset pattern, if the user agreed to the system-generated prompt (semiautomated table group). Table position was monitored continuously for 2 months after introducing the tables, as a proxy for sit-stand behavior. On average, the table was in a "sit" position for 85% of the workday in both groups; this percentage did not change significantly during the 2-month period. Switches in table position from sit to stand were, however, more frequent in the semiautomated table group than in the nonautomated table group (0.65 vs. 0.29 hr⁻¹; p = .001). Introducing a semiautomated sit-stand table appeared to be an attractive alternative to a standard sit-stand table, because it led to more posture variation. A semiautomated sit-stand table may effectively contribute to making postures more variable among office workers and thus aid in alleviating negative health effects of extensive sitting.

  19. AIR: A batch-oriented web program package for construction of supermatrices ready for phylogenomic analyses

    Directory of Open Access Journals (Sweden)

    Mevik Bjørn-Helge

    2009-10-01

    Background: Large multigene sequence alignments have over recent years been increasingly employed for phylogenomic reconstruction of the eukaryote tree of life. Such supermatrices of sequence data are preferred over single gene alignments as they contain vastly more information about ancient sequence characteristics, and are thus more suitable for resolving deeply diverging relationships. However, as alignments are expanded, increasing numbers of sites with misleading phylogenetic information are also added. Therefore, a major goal in phylogenomic analyses is to maximize the ratio of information to noise; this can be achieved by the reduction of fast evolving sites. Results: Here we present a batch-oriented web-based program package, named AIR, that allows (1) transformation of several single genes into one multigene alignment, (2) identification of evolutionary rates in multigene alignments, and (3) removal of fast evolving sites. These three processes can be done with the programs AIR-Appender, AIR-Identifier, and AIR-Remover (AIR), which can be used independently or in a semi-automated pipeline. AIR produces user-friendly output files with filtered and non-filtered alignments where residues are colored according to their evolutionary rates. Other bioinformatics applications linked to the AIR package are available at the Bioportal (http://www.bioportal.uio.no), University of Oslo; together these greatly improve the flexibility, efficiency and quality of phylogenomic analyses. Conclusion: The AIR program package allows for efficient creation of multigene alignments and better assessment of evolutionary rates in sequence alignments. Removing fast evolving sites with the AIR programs has been employed in several recent phylogenomic analyses, resulting in improved phylogenetic resolution and increased statistical support for branching patterns among the early diverging eukaryotes.
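
    The removal of fast evolving sites can be sketched with a crude per-column rate proxy; the entropy filter below is an illustration only, not AIR's actual rate estimator:

        import math
        from collections import Counter

        def column_entropy(column: str) -> float:
            # Shannon entropy of residue frequencies; a crude proxy for site rate.
            counts = Counter(column.replace("-", ""))
            n = sum(counts.values())
            return -sum((c / n) * math.log2(c / n) for c in counts.values())

        def drop_fast_sites(alignment, keep_fraction=0.8):
            # Rank columns from slowest (lowest entropy) to fastest; keep the slowest.
            columns = ["".join(col) for col in zip(*alignment)]
            ranked = sorted(range(len(columns)), key=lambda i: column_entropy(columns[i]))
            kept = sorted(ranked[: int(keep_fraction * len(columns))])
            return ["".join(seq[i] for i in kept) for seq in alignment]

        aln = ["ACGTAC", "ACGTTC", "ACGAGC", "ACGCCC"]
        print(drop_fast_sites(aln))  # the two most variable columns are removed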

  20. Reproducibility of central lumbar vertebral BMD

    International Nuclear Information System (INIS)

    Chan, F.; Pocock, N.; Griffiths, M.; Majerovic, Y.; Freund, J.

    1997-01-01

    Full text: Lumbar vertebral bone mineral density (BMD) using dual X-ray absorptiometry (DXA) has generally been calculated from a region of interest which includes the entire vertebral body. Although this region excludes part of the transverse processes, it does include the outer cortical shell of the vertebra. Recent software has been devised to calculate BMD in a central vertebral region of interest which excludes the outer cortical envelope. Theoretically this area may be more sensitive in detecting osteoporosis, which affects trabecular bone to a greater extent than cortical bone. Apart from the sensitivity of BMD estimation, the reproducibility of any measurement is important owing to the slow rate of change of bone mass. We have evaluated the reproducibility of this new vertebral region of interest in 23 women who had duplicate lumbar spine DXA scans performed on the same day. The patients were repositioned between each measurement. Central vertebral analysis was performed for L2-L4, and the reproducibility of area, bone mineral content (BMC) and BMD was calculated as the coefficient of variation; these values were compared with those from conventional analysis. We have thus shown that the reproducibility of the central BMD is comparable to that of the conventional analysis, which is essential if this technique is to provide any additional clinical data. The reasons for the decrease in reproducibility of the area, and hence BMC, require further investigation
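
    Short-term precision from duplicate scans is conventionally summarized as a root-mean-square coefficient of variation; a minimal sketch with hypothetical paired scans:

        import numpy as np

        # Hypothetical duplicate central-BMD scans (g/cm^2) for four subjects.
        scan1 = np.array([0.91, 1.02, 0.87, 1.10])
        scan2 = np.array([0.93, 1.00, 0.88, 1.07])

        pairs = np.stack([scan1, scan2], axis=1)
        per_subject_sd = pairs.std(axis=1, ddof=1)
        # Root-mean-square CV across subjects, expressed as a percentage.
        cv = 100 * np.sqrt((per_subject_sd ** 2).mean()) / pairs.mean()
        print(f"precision CV = {cv:.1f}%")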

  1. Semantic enrichment of medical forms - semi-automated coding of ODM-elements via web services.

    Science.gov (United States)

    Breil, Bernhard; Watermann, Andreas; Haas, Peter; Dziuballe, Philipp; Dugas, Martin

    2012-01-01

    Semantic interoperability is an unsolved problem which occurs while working with medical forms from different information systems or institutions. Standards like ODM or CDA assure structural homogenization, but in order to compare elements from different data models it is necessary to use semantic concepts and codes at the item level of those structures. We developed and implemented a web-based tool which enables a domain expert to perform semi-automated coding of ODM files. For each item it is possible to query web services which return unique concept codes, without leaving the context of the document. Although it was not feasible to perform totally automated coding, we have implemented a dialog-based method to perform efficient coding of all data elements in the context of the whole document. The proportion of codable items was comparable to results from previous studies.

  2. Semi-automated high-efficiency reflectivity chamber for vacuum UV measurements

    Science.gov (United States)

    Wiley, James; Fleming, Brian; Renninger, Nicholas; Egan, Arika

    2017-08-01

    This paper presents the design and theory of operation for a semi-automated reflectivity chamber for ultraviolet optimized optics. A graphical user interface designed in LabVIEW controls the stages, interfaces with the detector system, takes semi-autonomous measurements, and monitors the system in case of error. Samples and an optical photodiode sit on an optics plate mounted to a rotation stage in the middle of the vacuum chamber. The optics plate rotates the samples and diode between an incident and reflected position to measure the absolute reflectivity of the samples at wavelengths limited by the monochromator operational bandpass of 70 nm to 550 nm. A collimating parabolic mirror on a fine steering tip-tilt motor enables beam steering for detector peak-ups. This chamber is designed to take measurements rapidly and with minimal oversight, increasing lab efficiency for high cadence and high accuracy vacuum UV reflectivity measurements.

  3. Smartnotebook: A semi-automated approach to protein sequential NMR resonance assignments

    International Nuclear Information System (INIS)

    Slupsky, Carolyn M.; Boyko, Robert F.; Booth, Valerie K.; Sykes, Brian D.

    2003-01-01

    Complete and accurate NMR spectral assignment is a prerequisite for high-throughput automated structure determination of biological macromolecules. However, completely automated assignment procedures generally encounter difficulties for all but the most ideal data sets. Sources of these problems include difficulty in resolving correlations in crowded spectral regions, as well as complications arising from dynamics, such as weak or missing peaks, or atoms exhibiting more than one peak due to exchange phenomena. Smartnotebook is a semi-automated assignment software package designed to combine the best features of the automated and manual approaches. The software finds and displays potential connections between residues, while the spectroscopist makes decisions on which connection is correct, allowing rapid and robust assignment. In addition, smartnotebook helps the user fit chains of connected residues to the primary sequence of the protein by comparing the experimentally determined chemical shifts with expected shifts derived from a chemical shift database, while providing bookkeeping throughout the assignment procedure

  4. A novel model-based control strategy for aerobic filamentous fungal fed-batch fermentation processes.

    Science.gov (United States)

    Mears, Lisa; Stocks, Stuart M; Albaek, Mads O; Cassells, Benny; Sin, Gürkan; Gernaey, Krist V

    2017-07-01

    A novel model-based control strategy has been developed for filamentous fungal fed-batch fermentation processes. The system of interest is a pilot scale (550 L) filamentous fungus process operating at Novozymes A/S. In such processes, it is desirable to maximize the total product achieved in a batch in a defined process time. In order to achieve this goal, it is important to maximize both the product concentration and the total final mass in the fed-batch system. To this end, we describe the development of a control strategy which aims to achieve maximum tank fill while avoiding oxygen-limited conditions. This requires a two stage approach: (i) calculation of the tank start fill; and (ii) on-line control in order to maximize fill subject to oxygen transfer limitations. First, a mechanistic model was applied off-line in order to determine the appropriate start fill for processes with four different sets of process operating conditions for the stirrer speed, headspace pressure, and aeration rate. The start fills were tested with eight pilot scale experiments using a reference process operation. An on-line control strategy was then developed, utilizing the mechanistic model, which is recursively updated using on-line measurements. The model was applied in order to predict the current system states, including the biomass concentration, and to simulate the expected future trajectory of the system until a specified end time. In this way, the desired feed rate is updated along the progress of the batch, taking into account the oxygen mass transfer conditions and the expected future trajectory of the mass. The final results show that the target fill was achieved to within 5% under the maximum fill when tested using eight pilot scale batches, and overfilling was avoided. The results were reproducible, unlike the reference experiments, which show over 10% variation in the final tank fill, including overfilling. The variance of the final tank fill is
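
    The oxygen-transfer ceiling that constrains the feed rate can be sketched as follows; a simplified illustration assuming OTR_max = kLa·(O2_sat − O2_min)·V and a fixed oxygen demand per unit substrate (hypothetical values, not the authors' model):

        def max_feed_rate(kla: float, o2_sat: float, o2_min: float,
                          o2_per_substrate: float, volume: float) -> float:
            # Cap the substrate feed (g/h) so oxygen demand stays below the
            # maximum transfer rate OTR_max = kLa * (O2_sat - O2_min) * V.
            otr_max = kla * (o2_sat - o2_min) * volume  # g O2 / h
            return otr_max / o2_per_substrate           # g substrate / h

        # Hypothetical pilot-scale values: kLa in 1/h, O2 concentrations in g/L, V in L.
        rate = max_feed_rate(kla=180.0, o2_sat=0.007, o2_min=0.002,
                             o2_per_substrate=0.7, volume=400.0)
        print(f"max feed rate = {rate:.0f} g/h")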

  5. Semiautomated tremor detection using a combined cross-correlation and neural network approach

    Science.gov (United States)

    Horstmann, Tobias; Harrington, Rebecca M.; Cochran, Elizabeth S.

    2013-01-01

    Despite observations of tectonic tremor in many locations around the globe, the emergent phase arrivals, low‒amplitude waveforms, and variable event durations make automatic detection a nontrivial task. In this study, we employ a new method to identify tremor in large data sets using a semiautomated technique. The method first reduces the data volume with an envelope cross‒correlation technique, followed by a Self‒Organizing Map (SOM) algorithm to identify and classify event types. The method detects tremor in an automated fashion after calibrating for a specific data set, hence we refer to it as being “semiautomated”. We apply the semiautomated detection algorithm to a newly acquired data set of waveforms from a temporary deployment of 13 seismometers near Cholame, California, from May 2010 to July 2011. We manually identify tremor events in a 3 week long test data set and compare to the SOM output and find a detection accuracy of 79.5%. Detection accuracy improves with increasing signal‒to‒noise ratios and number of available stations. We find detection completeness of 96% for tremor events with signal‒to‒noise ratios above 3 and optimal results when data from at least 10 stations are available. We compare the SOM algorithm to the envelope correlation method of Wech and Creager and find the SOM performs significantly better, at least for the data set examined here. Using the SOM algorithm, we detect 2606 tremor events with a cumulative signal duration of nearly 55 h during the 13 month deployment. Overall, the SOM algorithm is shown to be a flexible new method that utilizes characteristics of the waveforms to identify tremor from noise or other seismic signals.
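
    The first-stage data reduction, envelope cross-correlation, is straightforward to sketch; a minimal illustration with synthetic traces (the SOM classification stage is omitted):

        import numpy as np
        from scipy.signal import hilbert

        rng = np.random.default_rng(0)

        def envelope(trace):
            # Amplitude envelope as the magnitude of the analytic signal.
            return np.abs(hilbert(trace))

        # Two synthetic traces sharing a weak burst (a stand-in for tremor).
        t = np.linspace(0.0, 60.0, 6000)
        burst = np.exp(-((t - 30.0) ** 2) / 20.0) * np.sin(2 * np.pi * 3.0 * t)
        a = burst + 0.3 * rng.standard_normal(t.size)
        b = burst + 0.3 * rng.standard_normal(t.size)

        ea, eb = envelope(a) - envelope(a).mean(), envelope(b) - envelope(b).mean()
        cc = np.correlate(ea, eb, mode="full") / (np.linalg.norm(ea) * np.linalg.norm(eb))
        print(f"peak envelope correlation = {cc.max():.2f}")  # high for coincident bursts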

  6. Evaluation of vitrification factors from DWPF's macro-batch 1

    International Nuclear Information System (INIS)

    Edwards, T.B.

    2000-01-01

    The Defense Waste Processing Facility (DWPF) is evaluating new sampling and analytical methods that may be used to support future Slurry Mix Evaporator (SME) batch acceptability decisions. This report uses data acquired during DWPF's processing of macro-batch 1 to determine a set of vitrification factors covering several SME and Melter Feed Tank (MFT) batches. Such values are needed for converting the cation measurements derived from the new methods to a "glass" basis. The available data from macro-batch 1 were used to examine the stability of these vitrification factors, to estimate their uncertainty over the course of a macro-batch, and to provide a recommendation on the use of a single factor for an entire macro-batch. The report is in response to Technical Task Request HLW/DWPF/TTR-980015

  7. Enacting the International/Reproducing Eurocentrism

    Directory of Open Access Journals (Sweden)

    Zeynep Gülşah Çapan

    This article focuses on the way in which Eurocentric conceptualisations of the ‘international’ are reproduced in different geopolitical contexts. Even though the Eurocentrism of International Relations has received growing attention, it has predominantly been concerned with unearthing the Eurocentrism of the ‘centre’, overlooking its varied manifestations in other geopolitical contexts. The article seeks to contribute to discussions about Eurocentrism by examining how different conceptualisations of the international are at work at a particular moment, and how these conceptualisations continue to reproduce Eurocentrism. It will focus on the way in which Eurocentric designations of spatial and temporal hierarchies were reproduced in the context of Turkey through a reading of how the ‘Gezi Park protests’ of 2013 and ‘Turkey’ itself were written into the story of the international.

  8. Reproducibility, controllability, and optimization of LENR experiments

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, David J. [The George Washington University, Washington DC 20052 (United States)

    2006-07-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  9. Reproducibility, controllability, and optimization of LENR experiments

    International Nuclear Information System (INIS)

    Nagel, David J.

    2006-01-01

    Low-energy nuclear reaction (LENR) measurements are significantly, and increasingly, reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments. The paper concludes by underlining that it is now clear that demands for reproducible experiments in the early years of LENR research were premature. In fact, one can argue that irreproducibility should be expected for early experiments in a complex new field. As emphasized in the paper, and as has often happened in the history of science, experimental and theoretical progress can take decades. It is likely to be many years before investments in LENR experiments will yield significant returns, even for successful research programs. However, it is clear that a fundamental understanding of the anomalous effects observed in numerous experiments would significantly increase reproducibility, improve controllability, enable optimization of processes, and accelerate the economic viability of LENR.

  10. Undefined cellulase formulations hinder scientific reproducibility.

    Science.gov (United States)

    Himmel, Michael E; Abbas, Charles A; Baker, John O; Bayer, Edward A; Bomble, Yannick J; Brunecky, Roman; Chen, Xiaowen; Felby, Claus; Jeoh, Tina; Kumar, Rajeev; McCleary, Barry V; Pletschke, Brett I; Tucker, Melvin P; Wyman, Charles E; Decker, Stephen R

    2017-01-01

    In the shadow of a burgeoning biomass-to-fuels industry, biological conversion of lignocellulose to fermentable sugars in a cost-effective manner is key to the success of second-generation and advanced biofuel production. For the effective comparison of one cellulase preparation to another, cellulase assays are typically carried out with one or more engineered cellulase formulations or natural exoproteomes of known performance serving as positive controls. When these formulations have unknown composition, as is the case with several widely used commercial products, it becomes impossible to compare or reproduce work done today to work done in the future, where, for example, such preparations may not be available. Experimental reproducibility, a critical tenet of science publishing, is therefore endangered by the continued use of these undisclosed products. We propose the introduction of standard procedures and materials to produce specific and reproducible cellulase formulations. These formulations are to serve as yardsticks to measure improvements and performance of new cellulase formulations.

  11. Reproducibility of somatosensory spatial perceptual maps.

    Science.gov (United States)

    Steenbergen, Peter; Buitenweg, Jan R; Trojan, Jörg; Veltink, Peter H

    2013-02-01

    Various studies have shown subjects to mislocalize cutaneous stimuli in an idiosyncratic manner. Spatial properties of individual localization behavior can be represented in the form of perceptual maps. Individual differences in these maps may reflect properties of internal body representations, and perceptual maps may therefore be a useful method for studying these representations. For this to be the case, individual perceptual maps need to be reproducible, which has not yet been demonstrated. We assessed the reproducibility of localizations measured twice on subsequent days. Ten subjects participated in the experiments. Non-painful electrocutaneous stimuli were applied at seven sites on the lower arm. Subjects localized the stimuli on a photograph of their own arm, which was presented on a tablet screen overlaying the real arm. Reproducibility was assessed by calculating intraclass correlation coefficients (ICC) for the mean localizations of each electrode site and the slope and offset of regression models of the localizations, which represent scaling and displacement of perceptual maps relative to the stimulated sites. The ICCs of the mean localizations ranged from 0.68 to 0.93; the ICCs of the regression parameters were 0.88 for the intercept and 0.92 for the slope. These results indicate a high degree of reproducibility. We conclude that localization patterns of non-painful electrocutaneous stimuli on the arm are reproducible on subsequent days. Reproducibility is a necessary property of perceptual maps for these to reflect properties of a subject's internal body representations. Perceptual maps are therefore a promising method for studying body representations.
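    Since the record leans on intraclass correlation coefficients, a minimal sketch of a standard two-way random-effects ICC(2,1) computation may help. The formula is the textbook Shrout-Fleiss one, not the authors' code, and the day-to-day data below are invented.

```python
# ICC(2,1): two-way random effects, absolute agreement, single measures.
import numpy as np

def icc_2_1(x):
    """x: n_targets (stimulus sites) x k_raters (sessions) matrix."""
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(1) - grand) ** 2).sum() / (n - 1)   # between targets
    ms_c = n * ((x.mean(0) - grand) ** 2).sum() / (k - 1)   # between sessions
    sse = ((x - x.mean(1, keepdims=True)
              - x.mean(0, keepdims=True) + grand) ** 2).sum()
    ms_e = sse / ((n - 1) * (k - 1))                        # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

day1 = np.array([12.1, 15.3, 9.8, 14.0, 11.2])   # e.g. mean localizations, mm
day2 = np.array([12.6, 14.9, 10.1, 13.5, 11.0])
print(round(icc_2_1(np.column_stack([day1, day2])), 3))
```

    ICCs above roughly 0.75 are conventionally read as good reproducibility, consistent with the 0.68-0.93 range reported above.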

  12. Inorganic fouling mitigation by salinity cycling in batch reverse osmosis

    OpenAIRE

    Maswadeh, Laith A.; Warsinger, David Elan Martin; Tow, Emily W.; Connors, Grace B.; Swaminathan, Jaichander; Lienhard, John H

    2018-01-01

    Enhanced fouling resistance has been observed in recent variants of reverse osmosis (RO) desalination which use time-varying batch or semi-batch processes, such as closed-circuit RO (CCRO) and pulse flow RO (PFRO). However, the mechanisms of batch processes' fouling resistance are not well-understood, and models have not been developed for prediction of their fouling performance. Here, a framework for predicting reverse osmosis fouling is developed by comparing the fluid residence time in bat...

  13. Optimizing Resource Utilization in Grid Batch Systems

    International Nuclear Information System (INIS)

    Gellrich, Andreas

    2012-01-01

    On Grid sites, the requirements of the computing tasks (jobs) with respect to computing, storage, and network resources differ widely. For instance, Monte Carlo production jobs are almost purely CPU-bound, whereas physics analysis jobs demand high data rates. In order to optimize the utilization of the compute node resources, jobs must be distributed intelligently over the nodes. Although the job resource requirements cannot be deduced directly, jobs are mapped to POSIX UID/GID according to the VO, VOMS group and role information contained in the VOMS proxy. The UID/GID then makes it possible to distinguish jobs, provided users request VOMS proxies as planned by the VO management, e.g. ‘role=production’ for Monte Carlo jobs. It is possible to set up and configure batch systems (queuing system and scheduler) at Grid sites based on these considerations, although scaling limits were observed with the scheduler MAUI. In tests these limitations could be overcome with a home-made scheduler.

  14. [Natural head position's reproducibility on photographs].

    Science.gov (United States)

    Eddo, Marie-Line; El Hayeck, Émilie; Hoyeck, Maha; Khoury, Élie; Ghoubril, Joseph

    2017-12-01

    The purpose of this study is to evaluate the reproducibility of natural head position with time on profile photographs. Our sample is composed of 96 students (20-30 years old) at the department of dentistry of Saint Joseph University in Beirut. Two profile photographs were taken in natural head position about a week apart. No significant differences were found between T0 and T1 (E = 1.065°). Many studies confirmed this reproducibility with time. Natural head position can be adopted as an orientation for profile photographs in orthodontics. © EDP Sciences, SFODF, 2017.

  15. Highly reproducible polyol synthesis for silver nanocubes

    Science.gov (United States)

    Han, Hye Ji; Yu, Taekyung; Kim, Woo-Sik; Im, Sang Hyuk

    2017-07-01

    We could synthesize Ag nanocubes highly reproducibly by conducting the polyol synthesis with an HCl etchant in dark conditions, because the photodecomposition/photoreduction of the AgCl nanoparticles formed at the initial reaction stage was greatly suppressed; consequently, the selective self-nucleation of Ag single crystals and their selective growth could be promoted. In contrast, the reproducibility of Ag nanocube formation was very poor when the synthesis was conducted in light conditions, owing to the photoreduction of AgCl to Ag.

  16. Reproducible statistical analysis with multiple languages

    DEFF Research Database (Denmark)

    Lenth, Russell; Højsgaard, Søren

    2011-01-01

    This paper describes a system for making reproducible statistical analyses. The system differs from other systems for reproducible analysis in several ways. The two main differences are: (1) several statistics programs can be used in the same document; (2) documents can be prepared using OpenOffice or \LaTeX. The main part of this paper is an example showing how to use several programs together in an OpenOffice text document. The paper also contains some practical considerations on the use of literate programming in statistics.

  17. Sewage sludge irradiators: Batch and continuous flow

    International Nuclear Information System (INIS)

    Lavale, D.S.; George, J.R.; Shah, M.R.; Rawat, K.P.

    1998-01-01

    The potential threat to the environment posed by the high pathogenic organism content of municipal wastewater, especially the sludge, and the growing worldwide aspiration for a cleaner, salubrious environment have made it mandatory for sewage and sludge to undergo treatment prior to their ultimate disposal to nature. The inability of conventional wastewater treatments to mitigate the problem of microorganisms has made it necessary to look for other alternatives, radiation treatment being the most reliable, rapid and environmentally sustainable of them. To promote the use of radiation for sludge hygienization, the Department of Atomic Energy has endeavoured to set up an indigenous Sludge Hygienization Research Irradiator (SHRI) in the city of Baroda. Designed for 18.5 PBq of 60Co to disinfect digested sludge, the irradiator has additional provision for treatment of effluent and raw sewage. From an engineering standpoint, all the subsystems have been functioning satisfactorily since its commissioning in 1990. Prolonged studies, spanning a period of six years and primarily focused on the inactivation of microorganisms, revealed that a 3 kGy dose of gamma radiation is adequate to make the sludge pathogen- and odour-free. A dose of 1.6 kGy in raw sewage and 0.5 kGy in effluent reduced coliform counts down to the regulatory discharge limits. These observations reflect a possible cost-effective solution to the burgeoning problem of surface water pollution across the globe. In the past, sub-37 PBq 60Co batch irradiators have been designed and commissioned successfully for the treatment of sludge. Characterized by low dose delivery rates, they are well suited for treating low volumes of sludge in batches. Some concepts of continuous flow 60Co irradiators having larger activities, yet simple and economic in design, are presented in the paper.

  18. A reproducible and scalable procedure for preparing bacterial extracts for cell-free protein synthesis.

    Science.gov (United States)

    Katsura, Kazushige; Matsuda, Takayoshi; Tomabechi, Yuri; Yonemochi, Mayumi; Hanada, Kazuharu; Ohsawa, Noboru; Sakamoto, Kensaku; Takemoto, Chie; Shirouzu, Mikako

    2017-11-01

    Cell-free protein synthesis is a useful method for preparing proteins for functional or structural analyses. However, batch-to-batch variability with regard to protein synthesis activity remains a problem for large-scale production of cell extract in the laboratory. To address this issue, we have developed a novel procedure for large-scale preparation of bacterial cell extract with high protein synthesis activity. The developed procedure comprises cell cultivation using a fermentor, harvesting and washing of cells by tangential flow filtration, cell disruption with high-pressure homogenizer and continuous diafiltration. By optimizing and combining these methods, ∼100 ml of the cell extract was prepared from 150 g of Escherichia coli cells. The protein synthesis activities, defined as the yield of protein per unit of absorbance at 260 nm of the cell extract, were shown to be reproducible, and the average activity of several batches was twice that obtained using a previously reported method. In addition, combinatorial use of the high-pressure homogenizer and diafiltration increased the scalability, indicating that the cell concentration at disruption varies from 0.04 to 1 g/ml. Furthermore, addition of Gam protein and examinations of the N-terminal sequence rendered the extract prepared here useful for rapid screening with linear DNA templates. © The Authors 2017. Published by Oxford University Press on behalf of the Japanese Biochemical Society. All rights reserved.

  19. Reproducing kernel Hilbert spaces of Gaussian priors

    NARCIS (Netherlands)

    Vaart, van der A.W.; Zanten, van J.H.; Clarke, B.; Ghosal, S.

    2008-01-01

    We review definitions and properties of reproducing kernel Hilbert spaces attached to Gaussian variables and processes, with a view to applications in nonparametric Bayesian statistics using Gaussian priors. The rate of contraction of posterior distributions based on Gaussian priors can be described by a concentration function defined in terms of this RKHS, as sketched below.
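    For reference, the standard objects this review concerns can be written as follows. These are textbook definitions, not necessarily the paper's exact notation: the RKHS generated by the covariance kernel K of a zero-mean Gaussian process W, and the concentration function governing posterior contraction around a truth w_0.

```latex
% RKHS of a zero-mean Gaussian process with covariance kernel K:
\[
  \mathbb{H} \;=\; \overline{\operatorname{span}}\,\{\,K(\cdot,t) : t \in T\,\},
  \qquad
  \langle K(\cdot,s),\, K(\cdot,t)\rangle_{\mathbb{H}} \;=\; K(s,t).
\]
% Concentration function at a truth w_0 (approximation term plus
% small-ball probability term):
\[
  \varphi_{w_0}(\varepsilon)
  \;=\; \inf_{h \in \mathbb{H}:\, \|h - w_0\| \le \varepsilon} \|h\|_{\mathbb{H}}^{2}
  \;-\; \log \mathbb{P}\bigl(\|W\| \le \varepsilon\bigr).
\]
% A rate \varepsilon_n solving \varphi_{w_0}(\varepsilon_n) \le n\varepsilon_n^2
% gives (up to constants) the posterior contraction rate around w_0.
```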

  20. Reproducibility in Computational Neuroscience Models and Simulations

    Science.gov (United States)

    McDougal, Robert A.; Bulanova, Anna S.; Lytton, William W.

    2016-01-01

    Objective Like all scientific research, computational neuroscience research must be reproducible. Big data science, including simulation research, cannot depend exclusively on journal articles as the method to provide the sharing and transparency required for reproducibility. Methods Ensuring model reproducibility requires the use of multiple standard software practices and tools, including version control, strong commenting and documentation, and code modularity. Results Building on these standard practices, model sharing sites and tools have been developed that fit into several categories: (1) standardized neural simulators; (2) shared computational resources; (3) declarative model descriptors, ontologies and standardized annotations; (4) model sharing repositories and sharing standards. Conclusion A number of complementary innovations have been proposed to enhance sharing, transparency and reproducibility. The individual user can be encouraged to make use of version control, commenting, documentation and modularity in development of models. The community can help by requiring model sharing as a condition of publication and funding. Significance Model management will become increasingly important as multiscale models become larger, more detailed and correspondingly more difficult to manage by any single investigator or single laboratory. Additional big data management complexity will come as the models become more useful in interpreting experiments, thus increasing the need to ensure clear alignment between modeling data, both parameters and results, and experiment. PMID:27046845

  1. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Aarts, Alexander A.; Anderson, Joanna E.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahnik, Stepan; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Bruening, Jovita; Calhoun-Sauls, Ann; Chagnon, Elizabeth; Callahan, Shannon P.; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Cillessen, Linda; Christopherson, Cody D.; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Cohn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Hartgerink, Chris; Krijnen, Job; Nuijten, Michele B.; van 't Veer, Anna E.; Van Aert, Robbie; van Assen, M.A.L.M.; Wissink, Joeri; Zeelenberg, Marcel

    2015-01-01

    INTRODUCTION Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. Scientific claims should not gain credence because of the status or authority of their originator but by the replicability of their supporting evidence. Even research of exemplary quality may have irreproducible empirical findings because of random or systematic error.

  2. Reproducibility, Controllability, and Optimization of Lenr Experiments

    Science.gov (United States)

    Nagel, David J.

    2006-02-01

    Low-energy nuclear reaction (LENR) measurements are significantly and increasingly reproducible. Practical control of the production of energy or materials by LENR has yet to be demonstrated. Minimization of costly inputs and maximization of desired outputs of LENR remain for future developments.

  3. Estimating the reproducibility of psychological science

    NARCIS (Netherlands)

    Anderson, Joanna E.; Aarts, Alexander A.; Anderson, Christopher J.; Attridge, Peter R.; Attwood, Angela; Axt, Jordan; Babel, Molly; Bahník, Štěpán; Baranski, Erica; Barnett-Cowan, Michael; Bartmess, Elizabeth; Beer, Jennifer; Bell, Raoul; Bentley, Heather; Beyan, Leah; Binion, Grace; Borsboom, Denny; Bosch, Annick; Bosco, Frank A.; Bowman, Sara D.; Brandt, Mark J.; Braswell, Erin; Brohmer, Hilmar; Brown, Benjamin T.; Brown, Kristina; Brüning, Jovita; Calhoun-Sauls, Ann; Callahan, Shannon P.; Chagnon, Elizabeth; Chandler, Jesse; Chartier, Christopher R.; Cheung, Felix; Christopherson, Cody D.; Cillessen, Linda; Clay, Russ; Cleary, Hayley; Cloud, Mark D.; Conn, Michael; Cohoon, Johanna; Columbus, Simon; Cordes, Andreas; Costantini, Giulio; Alvarez, Leslie D Cramblet; Cremata, Ed; Crusius, Jan; DeCoster, Jamie; DeGaetano, Michelle A.; Penna, Nicolás Delia; Den Bezemer, Bobby; Deserno, Marie K.; Devitt, Olivia; Dewitte, Laura; Dobolyi, David G.; Dodson, Geneva T.; Donnellan, M. Brent; Donohue, Ryan; Dore, Rebecca A.; Dorrough, Angela; Dreber, Anna; Dugas, Michelle; Dunn, Elizabeth W.; Easey, Kayleigh; Eboigbe, Sylvia; Eggleston, Casey; Embley, Jo; Epskamp, Sacha; Errington, Timothy M.; Estel, Vivien; Farach, Frank J.; Feather, Jenelle; Fedor, Anna; Fernández-Castilla, Belén; Fiedler, Susann; Field, James G.; Fitneva, Stanka A.; Flagan, Taru; Forest, Amanda L.; Forsell, Eskil; Foster, Joshua D.; Frank, Michael C.; Frazier, Rebecca S.; Fuchs, Heather; Gable, Philip; Galak, Jeff; Galliani, Elisa Maria; Gampa, Anup; Garcia, Sara; Gazarian, Douglas; Gilbert, Elizabeth; Giner-Sorolla, Roger; Glöckner, Andreas; Goellner, Lars; Goh, Jin X.; Goldberg, Rebecca; Goodbourn, Patrick T.; Gordon-McKeon, Shauna; Gorges, Bryan; Gorges, Jessie; Goss, Justin; Graham, Jesse; Grange, James A.; Gray, Jeremy; Hartgerink, Chris; Hartshorne, Joshua; Hasselman, Fred; Hayes, Timothy; Heikensten, Emma; Henninger, Felix; Hodsoll, John; Holubar, Taylor; Hoogendoorn, Gea; Humphries, Denise J.; Hung, Cathy O Y; Immelman, Nathali; Irsik, Vanessa C.; Jahn, Georg; Jäkel, Frank; Jekel, Marc; Johannesson, Magnus; Johnson, Larissa G.; Johnson, David J.; Johnson, Kate M.; Johnston, William J.; Jonas, Kai; Joy-Gaba, Jennifer A.; Kappes, Heather Barry; Kelso, Kim; Kidwell, Mallory C.; Kim, Seung Kyung; Kirkhart, Matthew; Kleinberg, Bennett; Knežević, Goran; Kolorz, Franziska Maria; Kossakowski, Jolanda J.; Krause, Robert Wilhelm; Krijnen, Job; Kuhlmann, Tim; Kunkels, Yoram K.; Kyc, Megan M.; Lai, Calvin K.; Laique, Aamir; Lakens, Daniël|info:eu-repo/dai/nl/298811855; Lane, Kristin A.; Lassetter, Bethany; Lazarević, Ljiljana B.; Le Bel, Etienne P.; Lee, Key Jung; Lee, Minha; Lemm, Kristi; Levitan, Carmel A.; Lewis, Melissa; Lin, Lin; Lin, Stephanie; Lippold, Matthias; Loureiro, Darren; Luteijn, Ilse; MacKinnon, Sean; Mainard, Heather N.; Marigold, Denise C.; Martin, Daniel P.; Martinez, Tylar; Masicampo, E. J.; Matacotta, Josh; Mathur, Maya; May, Michael; Mechin, Nicole; Mehta, Pranjal; Meixner, Johannes; Melinger, Alissa; Miller, Jeremy K.; Miller, Mallorie; Moore, Katherine; Möschl, Marcus; Motyl, Matt; Müller, Stephanie M.; Munafo, Marcus; Neijenhuijs, Koen I.; Nervi, Taylor; Nicolas, Gandalf; Nilsonne, Gustav; Nosek, Brian A.; Nuijten, Michèle B.; Olsson, Catherine; Osborne, Colleen; Ostkamp, Lutz; Pavel, Misha; Penton-Voak, Ian S.; Perna, Olivia; Pernet, Cyril; Perugini, Marco; Pipitone, R. 
Nathan; Pitts, Michael; Plessow, Franziska; Prenoveau, Jason M.; Rahal, Rima Maria; Ratliff, Kate A.; Reinhard, David; Renkewitz, Frank; Ricker, Ashley A.; Rigney, Anastasia; Rivers, Andrew M.; Roebke, Mark; Rutchick, Abraham M.; Ryan, Robert S.; Sahin, Onur; Saide, Anondah; Sandstrom, Gillian M.; Santos, David; Saxe, Rebecca; Schlegelmilch, René; Schmidt, Kathleen; Scholz, Sabine; Seibel, Larissa; Selterman, Dylan Faulkner; Shaki, Samuel; Simpson, William B.; Sinclair, H. Colleen; Skorinko, Jeanine L M; Slowik, Agnieszka; Snyder, Joel S.; Soderberg, Courtney; Sonnleitner, Carina; Spencer, Nick; Spies, Jeffrey R.; Steegen, Sara; Stieger, Stefan; Strohminger, Nina; Sullivan, Gavin B.; Talhelm, Thomas; Tapia, Megan; Te Dorsthorst, Anniek; Thomae, Manuela; Thomas, Sarah L.; Tio, Pia; Traets, Frits; Tsang, Steve; Tuerlinckx, Francis; Turchan, Paul; Valášek, Milan; Van't Veer, Anna E.; Van Aert, Robbie; Van Assen, Marcel|info:eu-repo/dai/nl/407629971; Van Bork, Riet; Van De Ven, Mathijs; Van Den Bergh, Don; Van Der Hulst, Marije; Van Dooren, Roel; Van Doorn, Johnny; Van Renswoude, Daan R.; Van Rijn, Hedderik; Vanpaemel, Wolf; Echeverría, Alejandro Vásquez; Vazquez, Melissa; Velez, Natalia; Vermue, Marieke; Verschoor, Mark; Vianello, Michelangelo; Voracek, Martin; Vuu, Gina; Wagenmakers, Eric Jan; Weerdmeester, Joanneke; Welsh, Ashlee; Westgate, Erin C.; Wissink, Joeri; Wood, Michael; Woods, Andy; Wright, Emily; Wu, Sining; Zeelenberg, Marcel; Zuni, Kellylynn

    2015-01-01

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available.

  4. ITK: Enabling Reproducible Research and Open Science

    Directory of Open Access Journals (Sweden)

    Matthew Michael McCormick

    2014-02-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high (0.46).

  5. A PHYSICAL ACTIVITY QUESTIONNAIRE: REPRODUCIBILITY AND VALIDITY

    Directory of Open Access Journals (Sweden)

    Nicolas Barbosa

    2007-12-01

    This study evaluates the reproducibility and validity of the Quantification de L'Activite Physique en Altitude chez les Enfants (QAPACE) supervised self-administered questionnaire for estimating the mean daily energy expenditure (DEE) of Bogotá's schoolchildren. Comprehension was assessed in 324 students, whereas reproducibility was studied in a different random sample of 162 who were exposed to the questionnaire twice. Reproducibility was assessed using both the Bland-Altman plot and the intra-class correlation coefficient (ICC). Validity was studied in a randomly selected sample of 18 girls and 18 boys, who completed the test-retest study. The DEE derived from the questionnaire was compared with laboratory measurements of peak oxygen uptake (Peak VO2) from ergo-spirometry and the Leger Test. The reproducibility ICC was 0.96 (95% C.I. 0.95-0.97); by age categories: 8-10, 0.94 (0.89-0.97); 11-13, 0.98 (0.96-0.99); 14-16, 0.95 (0.91-0.98). The ICCs between mean DEE as estimated by the questionnaire and the direct and indirect Peak VO2 were 0.76 (0.66) (p<0.01); by age categories 8-10, 11-13, and 14-16 they were 0.89 (0.87), 0.76 (0.78) and 0.88 (0.80) respectively. The QAPACE questionnaire is reproducible and valid for estimating PA and showed a high correlation with Peak VO2 uptake.

  6. Bioprocess iterative batch-to-batch optimization based on hybrid parametric/nonparametric models.

    Science.gov (United States)

    Teixeira, Ana P; Clemente, João J; Cunha, António E; Carrondo, Manuel J T; Oliveira, Rui

    2006-01-01

    This paper presents a novel method for iterative batch-to-batch dynamic optimization of bioprocesses. The relationship between process performance and control inputs is established by means of hybrid grey-box models combining parametric and nonparametric structures. The bioreactor dynamics are defined by material balance equations, whereas the cell population subsystem is represented by an adjustable mixture of nonparametric and parametric models. Thus optimizations are possible without detailed mechanistic knowledge concerning the biological system. A clustering technique is used to supervise the reliability of the nonparametric subsystem during the optimization. Whenever the nonparametric outputs are unreliable, the objective function is penalized. The technique was evaluated with three simulation case studies. The overall results suggest that the convergence to the optimal process performance may be achieved after a small number of batches. The model unreliability risk constraint along with sampling scheduling are crucial to minimize the experimental effort required to attain a given process performance. In general terms, it may be concluded that the proposed method broadens the application of the hybrid parametric/nonparametric modeling technique to "newer" processes with higher potential for optimization.
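    As a rough illustration of the hybrid grey-box idea (a mechanistic balance wrapped around a data-driven kinetic term), the sketch below keeps the batch mass balances parametric while the specific growth rate is a nonparametric Nadaraya-Watson estimate trained on historical (substrate, rate) pairs. The structure, names and data are assumptions, not the authors' implementation.

```python
# Hybrid parametric/nonparametric batch model: mechanistic balances,
# data-driven specific growth rate mu(S).
import numpy as np

class KernelRate:
    """Nadaraya-Watson smoother standing in for the nonparametric subsystem."""
    def __init__(self, s_hist, mu_hist, bw=0.5):
        self.s, self.mu, self.bw = np.asarray(s_hist, float), np.asarray(mu_hist, float), bw
    def __call__(self, s):
        w = np.exp(-0.5 * ((s - self.s) / self.bw) ** 2)
        return float(w @ self.mu / (w.sum() + 1e-12))

def simulate_batch(mu_model, x0=0.1, s0=10.0, y_xs=0.5, dt=0.1, t_end=20.0):
    """Parametric part: dX/dt = mu(S) X,  dS/dt = -mu(S) X / Y_xs."""
    x, s = x0, s0
    for _ in np.arange(0.0, t_end, dt):
        mu = mu_model(s)
        x, s = x + dt * mu * x, max(0.0, s - dt * mu * x / y_xs)
    return x

# historical (substrate, rate) pairs stand in for the trained model
mu_model = KernelRate(s_hist=[0.1, 0.5, 1, 2, 5, 10],
                      mu_hist=[0.05, 0.2, 0.3, 0.35, 0.4, 0.4])
print(f"predicted final biomass: {simulate_batch(mu_model):.2f} g/L")
```

    In the batch-to-batch scheme, the nonparametric part would be re-fitted after each batch and the optimizer penalized (as the abstract describes) wherever its predictions leave the region supported by data.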

  7. Does systematic variation improve the reproducibility of animal experiments?

    NARCIS (Netherlands)

    Jonker, R.M.; Guenther, A.; Engqvist, L.; Schmoll, T.

    2013-01-01

    Reproducibility of results is a fundamental tenet of science. In this journal, Richter et al. [1] tested whether systematic variation in experimental conditions (heterogenization) affects the reproducibility of results. Comparing this approach with the current standard of ensuring reproducibility…

  8. Modelling and Simulation of the Batch Hydrolysis of Acetic ...

    African Journals Online (AJOL)

    The kinetic modelling of the batch synthesis of acetic acid from acetic anhydride was investigated. The kinetic data of the reaction were obtained by conducting the hydrolysis reaction in a batch reactor. A dynamic model was formulated for this process and simulation was carried out using gPROMS®, an advanced process ...
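    Acetic anhydride hydrolysis in excess water is commonly modeled as pseudo-first order; a minimal batch-reactor sketch under that assumption follows. The rate constant and concentrations are illustrative, not values from the paper, and scipy replaces gPROMS for self-containment.

```python
# Batch hydrolysis sketch: (CH3CO)2O + H2O -> 2 CH3COOH, assumed
# pseudo-first order in anhydride (water in large excess).
import numpy as np
from scipy.integrate import solve_ivp

k = 0.15  # 1/min, assumed pseudo-first-order rate constant

def rhs(t, y):
    c_anh, c_acid = y
    r = k * c_anh                    # mol/(L*min)
    return [-r, 2.0 * r]             # one anhydride yields two acetic acids

sol = solve_ivp(rhs, (0.0, 30.0), [0.5, 0.0], dense_output=True)
for ti in np.linspace(0.0, 30.0, 7):
    c_anh, c_acid = sol.sol(ti)
    print(f"t={ti:4.1f} min  anhydride={c_anh:.3f} M  acetic acid={c_acid:.3f} M")
```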

  9. [Batch release of immunoglobulin and monoclonal antibody products].

    Science.gov (United States)

    Gross, S

    2014-10-01

    The Paul-Ehrlich Institute (PEI) is an independent institution of the Federal Republic of Germany responsible for performing official experimental batch testing of sera. The institute decides about the release of each batch and performs experimental research in the field. The experimental quality control ensures the potency of the product and also the absence of harmful impurities. For release of an immunoglobulin batch the marketing authorization holder has to submit the documentation of the manufacture and the results of quality control measures together with samples of the batch to the PEI. Experimental testing is performed according to the approved specifications regarding the efficacy and safety. Since implementation of the 15th German drug law amendment, the source of antibody is not defined anymore. According to § 32 German drug law, all batches of sera need to be released by an official control laboratory. Sera are medicinal products, which contain antibodies, antibody fragments or fusion proteins with a functional antibody portion. Therefore, all batches of monoclonal antibodies and derivatives must also be released by the PEI and the marketing authorization holder has to submit a batch release application. Under certain circumstances a waiver for certain products can be issued with regard to batch release. The conditions for such a waiver apply to the majority of monoclonal antibodies.

  10. 21 CFR 80.37 - Treatment of batch pending certification.

    Science.gov (United States)

    2010-04-01

    Title 21, Food and Drugs; Food and Drug Administration, Department of Health and Human Services; General; Color Additive Certification; Certification Procedures; § 80.37 Treatment of batch pending certification...

  11. Solving a chemical batch scheduling problem by local search

    NARCIS (Netherlands)

    Brucker, P.; Hurink, Johann L.

    1999-01-01

    A chemical batch scheduling problem is modelled in two different ways as a discrete optimization problem. Both models are used to solve the batch scheduling problem in a two-phase tabu search procedure. The method is tested on real-world data.
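    The abstract does not give details of the two models or the tabu moves. As a generic illustration of tabu search on a scheduling problem, the sketch below applies adjacent-swap moves with a fixed-tenure tabu list and an aspiration criterion to a toy single-machine total-tardiness instance; everything here is an assumption for illustration, not the paper's formulation.

```python
# Toy tabu search: adjacent swaps, tabu list of recent swap pairs,
# aspiration when a tabu move beats the best solution so far.
def total_tardiness(seq):
    t = tard = 0
    for p, d in seq:                 # (processing time, due date)
        t += p
        tard += max(0, t - d)
    return tard

def tabu_search(seq, iters=100, tenure=5):
    best = cur = list(seq)
    tabu = []
    for _ in range(iters):
        moves = []
        for i in range(len(cur) - 1):
            cand = cur[:i] + [cur[i + 1], cur[i]] + cur[i + 2:]
            key = (cur[i], cur[i + 1])
            if key not in tabu or total_tardiness(cand) < total_tardiness(best):
                moves.append((total_tardiness(cand), cand, key))
        if not moves:
            break
        cost, cur, key = min(moves, key=lambda m: m[0])  # best admissible move
        tabu = (tabu + [key])[-tenure:]                  # fixed-tenure tabu list
        if cost < total_tardiness(best):
            best = cur
    return best, total_tardiness(best)

jobs = [(3, 6), (2, 4), (4, 10), (1, 3), (5, 12)]
print(tabu_search(jobs))
```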

  12. Dynamic Scheduling Of Batch Operations With Non-Identical Machines

    NARCIS (Netherlands)

    van der Zee, D.J.; van Harten, A.; Schuur, P.C.

    1997-01-01

    Batch-wise production is found in many industries. A good example of production systems which process products batch-wise are the ovens found in aircraft industry and in semiconductor manufacturing. These systems mostly consist of multiple machines of different types, given the range and volumes of

  13. Dynamic scheduling of batch operations with non-identical machines

    NARCIS (Netherlands)

    van der Zee, D.J.; van Harten, Aart; Schuur, Peter

    1997-01-01

    Batch-wise production is found in many industries. A good example of production systems which process products batch-wise are the ovens found in aircraft industry and in semiconductor manufacturing. These systems mostly consist of multiple machines of different types, given the range and volumes of

  14. 3D neuromelanin-sensitive magnetic resonance imaging with semi-automated volume measurement of the substantia nigra pars compacta for diagnosis of Parkinson's disease

    Energy Technology Data Exchange (ETDEWEB)

    Ogisu, Kimihiro; Shirato, Hiroki [Hokkaido University Graduate School of Medicine, Department of Radiology, Hokkaido (Japan); Kudo, Kohsuke; Sasaki, Makoto [Iwate Medical University, Division of Ultrahigh Field MRI, Iwate (Japan); Sakushima, Ken; Yabe, Ichiro; Sasaki, Hidenao [Hokkaido University Hospital, Department of Neurology, Hokkaido (Japan); Terae, Satoshi; Nakanishi, Mitsuhiro [Hokkaido University Hospital, Department of Radiology, Hokkaido (Japan)

    2013-06-15

    Neuromelanin-sensitive MRI has been reported to be useful in the diagnosis of Parkinson's disease (PD), which results from loss of dopamine-producing cells in the substantia nigra pars compacta (SNc). In this study, we aimed to apply a 3D turbo field echo (TFE) sequence for neuromelanin-sensitive MRI and to evaluate the diagnostic performance of a semi-automated method for measurement of SNc volume in patients with PD. We examined 18 PD patients and 27 healthy volunteers (control subjects). A 3D TFE technique with an off-resonance magnetization transfer pulse was used for neuromelanin-sensitive MRI on a 3T scanner. The SNc volume was semi-automatically measured using a region-growing technique at various thresholds (ranging from 1.66 to 2.48), with the signals measured relative to that of the superior cerebellar peduncle. Receiver operating characteristic (ROC) analysis was performed at all thresholds. Intra-rater reproducibility was evaluated by intraclass correlation coefficient (ICC). The average SNc volume in the PD group was significantly smaller than that in the control group at all thresholds (P < 0.01, Student's t test). At higher thresholds (>2.0), the area under the ROC curve (Az) increased (0.88). In addition, we observed balanced sensitivity and specificity (0.83 and 0.85, respectively). At lower thresholds, sensitivity tended to increase but specificity decreased in comparison with that at higher thresholds. ICC was larger than 0.9 when the threshold was over 1.86. Our method can distinguish the PD group from the control group with high sensitivity and specificity, especially for early-stage PD. (orig.)
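    As a rough illustration of threshold-based region growing of the kind described (simplified here to a 2-D slice; the study used 3-D volumes and signal ratios relative to the superior cerebellar peduncle), the sketch below grows a mask from a seed voxel and reports its size. All names and values are assumptions.

```python
# Threshold-based region growing from a seed, 4-connected, 2-D sketch.
import numpy as np
from collections import deque

def region_grow(img, seed, ratio_thresh, ref_signal):
    """Accept neighbors whose intensity relative to a reference structure
    exceeds the threshold; returns a boolean mask."""
    mask = np.zeros(img.shape, bool)
    q = deque([seed])
    while q:
        y, x = q.popleft()
        if mask[y, x] or img[y, x] / ref_signal < ratio_thresh:
            continue
        mask[y, x] = True
        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < img.shape[0] and 0 <= nx < img.shape[1] and not mask[ny, nx]:
                q.append((ny, nx))
    return mask

rng = np.random.default_rng(1)
img = rng.normal(100.0, 5.0, (64, 64))
img[20:30, 20:30] += 120.0                       # bright stand-in for the SNc
mask = region_grow(img, seed=(25, 25), ratio_thresh=2.0, ref_signal=100.0)
print("segmented area (pixels):", int(mask.sum()))  # times voxel size -> volume
```

    Sweeping ratio_thresh over a range (here 1.66 to 2.48) and recomputing the volume at each value is what feeds the ROC analysis described above.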

  15. A canned food scheduling problem with batch due date

    Science.gov (United States)

    Chung, Tsui-Ping; Liao, Ching-Jong; Smith, Milton

    2014-09-01

    This article considers a canned food scheduling problem where jobs are grouped into several batches. Jobs can be sent to the next operation only when all the jobs in the same batch have finished their processing, i.e., jobs in a batch have a common due date. This batch due date problem is quite common in canned food factories, but there is no efficient heuristic to solve it. The problem can be formulated as an identical parallel machine problem with batch due dates to minimize the total tardiness. Since the problem is NP-hard, two heuristics are proposed to find near-optimal solutions. Computational results comparing the effectiveness and efficiency of the two proposed heuristics with an existing heuristic are reported and discussed.
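    As an illustration of the batch due date structure, the sketch below implements a simple earliest-due-date list-scheduling heuristic on identical parallel machines, where a batch's tardiness is driven by its last-finishing job. It is a generic heuristic of this family, not one of the two proposed in the article; the data are invented.

```python
# EDD over batches, longest-job-first within a batch, earliest-free machine.
import heapq

batches = {  # batch -> (due date, job processing times)
    "A": (10, [4, 3, 5]),
    "B": (14, [6, 2]),
    "C": (9,  [3, 3, 3]),
}

def schedule(batches, n_machines=2):
    machines = [0.0] * n_machines            # next-free times, kept as a heap
    heapq.heapify(machines)
    total_tardiness = 0.0
    for name, (due, jobs) in sorted(batches.items(), key=lambda kv: kv[1][0]):
        finish = 0.0
        for p in sorted(jobs, reverse=True):
            start = heapq.heappop(machines)  # earliest available machine
            heapq.heappush(machines, start + p)
            finish = max(finish, start + p)  # batch done when last job is done
        total_tardiness += max(0.0, finish - due)
    return total_tardiness

print("total tardiness:", schedule(batches))
```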

  16. Spatial and interannual variability in Baltic sprat batch fecundity

    DEFF Research Database (Denmark)

    Haslob, H.; Tomkiewicz, Jonna; Hinrichsen, H.H.

    2011-01-01

    in the central Baltic Sea, namely the Bornholm Basin, Gdansk Deep and Southern Gotland Basin. Environmental parameters such as hydrography, fish condition and stock density were tested in order to investigate the observed variability in sprat fecundity. Absolute batch fecundity was found to be positively related to fish length and weight. Significant differences in absolute and relative batch fecundity of Baltic sprat among areas and years were detected, and could partly be explained by hydrographic features of the investigated areas. A non-linear multiple regression model taking into account fish length and ambient temperature explained 70% of the variability in absolute batch fecundity. Oxygen content and fish condition were not related to sprat batch fecundity. Additionally, a negative effect of stock size on sprat batch fecundity in the Bornholm Basin was revealed. The obtained data and results are important...
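    The abstract names the fitted relationship only qualitatively. The following is a minimal sketch of fitting a nonlinear length-temperature fecundity model with scipy; the functional form, starting values and data are invented for illustration and are not the paper's model.

```python
# Nonlinear regression sketch: allometric length term times a thermal term.
import numpy as np
from scipy.optimize import curve_fit

def fecundity(X, a, b, c):
    length, temp = X
    return a * length ** b * np.exp(c * temp)

length = np.array([10.5, 11.2, 12.0, 12.8, 13.5, 14.1])     # cm
temp = np.array([5.0, 5.5, 6.0, 6.5, 7.0, 7.5])             # deg C
batch_fec = np.array([800, 1000, 1300, 1650, 2100, 2600])   # eggs per batch

popt, _ = curve_fit(fecundity, (length, temp), batch_fec, p0=(1.0, 3.0, 0.1))
pred = fecundity((length, temp), *popt)
r2 = 1 - ((batch_fec - pred) ** 2).sum() / ((batch_fec - batch_fec.mean()) ** 2).sum()
print(f"a={popt[0]:.3g}, b={popt[1]:.2f}, c={popt[2]:.2f}, R^2={r2:.2f}")
```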

  17. Semi-automated De-identification of German Content Sensitive Reports for Big Data Analytics.

    Science.gov (United States)

    Seuss, Hannes; Dankerl, Peter; Ihle, Matthias; Grandjean, Andrea; Hammon, Rebecca; Kaestle, Nicola; Fasching, Peter A; Maier, Christian; Christoph, Jan; Sedlmayr, Martin; Uder, Michael; Cavallaro, Alexander; Hammon, Matthias

    2017-07-01

    Purpose  Projects involving collaborations between different institutions require data security via selective de-identification of words or phrases. A semi-automated de-identification tool was developed and evaluated on different types of medical reports natively and after adapting the algorithm to the text structure. Materials and Methods  A semi-automated de-identification tool was developed and evaluated for its sensitivity and specificity in detecting sensitive content in written reports. Data from 4671 pathology reports (4105 + 566 in two different formats), 2804 medical reports, 1008 operation reports, and 6223 radiology reports of 1167 patients suffering from breast cancer were de-identified. The content was itemized into four categories: direct identifiers (name, address), indirect identifiers (date of birth/operation, medical ID, etc.), medical terms, and filler words. The software was tested natively (without training) in order to establish a baseline. The reports were manually edited and the model re-trained for the next test set. After manually editing 25, 50, 100, 250, 500 and if applicable 1000 reports of each type re-training was applied. Results  In the native test, 61.3 % of direct and 80.8 % of the indirect identifiers were detected. The performance (P) increased to 91.4 % (P25), 96.7 % (P50), 99.5 % (P100), 99.6 % (P250), 99.7 % (P500) and 100 % (P1000) for direct identifiers and to 93.2 % (P25), 97.9 % (P50), 97.2 % (P100), 98.9 % (P250), 99.0 % (P500) and 99.3 % (P1000) for indirect identifiers. Without training, 5.3 % of medical terms were falsely flagged as critical data. The performance increased, after training, to 4.0 % (P25), 3.6 % (P50), 4.0 % (P100), 3.7 % (P250), 4.3 % (P500), and 3.1 % (P1000). Roughly 0.1 % of filler words were falsely flagged. Conclusion  Training of the developed de-identification tool continuously improved its performance. Training with roughly 100 edited

  18. Applicability Of A Semi-Automated Clinical Chemistry Analyzer In Determining The Antioxidant Concentrations Of Selected Plants

    OpenAIRE

    Allan L. Hilario; Phylis C. Rio; Geraldine Susan C. Tengco; Danilo M. Menorca

    2017-01-01

    Plants are rich sources of antioxidants that are protective against diseases associated to oxidative stress. There is a need for high throughput screening method that should be useful in determining the antioxidant concentration in plants. Such screening method should significantly simplify and speed up most antioxidant assays. This paper aimed at comparing the applicability of a semi-automated clinical chemistry analyzer Pointe Scientific MI USA with the traditional standard curve method and...

  19. Reproducibility of scoring emphysema by HRCT

    International Nuclear Information System (INIS)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R.; Erkinjuntti-Pekkanen, R.

    2002-01-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests

  20. Reproducibility of scoring emphysema by HRCT

    Energy Technology Data Exchange (ETDEWEB)

    Malinen, A.; Partanen, K.; Rytkoenen, H.; Vanninen, R. [Kuopio Univ. Hospital (Finland). Dept. of Clinical Radiology; Erkinjuntti-Pekkanen, R. [Kuopio Univ. Hospital (Finland). Dept. of Pulmonary Diseases

    2002-04-01

    Purpose: We evaluated the reproducibility of three visual scoring methods of emphysema and compared these methods with pulmonary function tests (VC, DLCO, FEV1 and FEV%) among farmer's lung patients and farmers. Material and Methods: Three radiologists examined high-resolution CT images of farmer's lung patients and their matched controls (n=70) for chronic interstitial lung diseases. Intraobserver reproducibility and interobserver variability were assessed for three methods: severity, Sanders' (extent) and Sakai. Pulmonary function tests as spirometry and diffusing capacity were measured. Results: Intraobserver κ-values for all three methods were good (0.51-0.74). Interobserver κ varied from 0.35 to 0.72. The Sanders' and the severity methods correlated strongly with pulmonary function tests, especially DLCO and FEV1. Conclusion: The Sanders' method proved to be reliable in evaluating emphysema, in terms of good consistency of interpretation and good correlation with pulmonary function tests.

  1. Reproducibility of the chamber scarification test

    DEFF Research Database (Denmark)

    Andersen, Klaus Ejner

    1996-01-01

    The chamber scarification test is a predictive human skin irritation test developed to rank the irritation potential of products and ingredients meant for repeated use on normal and diseased skin. 12 products or ingredients can be tested simultaneously on the forearm skin of each volunteer. The test procedure combines scratching of the skin at each test site with subsequent closed patch tests with the products, repeated daily for 3 days. The test is performed on groups of human volunteers: a skin irritant substance or product is included in each test as a positive control. … The results show high reproducibility of the test. Further, intra-individual variation in skin reaction to the 2 control products in 26 volunteers, who participated 2x, is shown, which supports the conclusion that the chamber scarification test is a useful short-term human skin irritation test with high reproducibility.

  2. Additive Manufacturing: Reproducibility of Metallic Parts

    Directory of Open Access Journals (Sweden)

    Konda Gokuldoss Prashanth

    2017-02-01

    The present study deals with the properties of five different metals/alloys fabricated by selective laser melting: Al-12Si, Cu-10Sn and 316L (face-centered cubic structure), and CoCrMo and commercially pure Ti (CP-Ti) (hexagonal close-packed structure). The room temperature tensile properties of Al-12Si samples show good consistency in results within the experimental errors. Similar reproducible results were observed for sliding wear and corrosion experiments. The other metal/alloy systems also show repeatable tensile properties, with the tensile curves overlapping until the yield point. The curves may then follow the same path or show a marginal deviation (~10 MPa) until they reach the ultimate tensile strength, and a negligible difference in ductility levels (~0.3%) is observed between the samples. The results show that selective laser melting is a reliable fabrication method to produce metallic materials with consistent and reproducible properties.

  3. Biodenitrification in Sequencing Batch Reactors. Final report

    International Nuclear Information System (INIS)

    Silverstein, J.

    1996-01-01

    One plan for stabilization of the Solar Pond waters and sludges at Rocky Flats Plant (RFP) is evaporation and cement solidification of the salts to stabilize heavy metals and radionuclides for land disposal as low-level mixed waste. It has been reported that nitrate (NO3-) salts may interfere with cement stabilization of heavy metals and radionuclides. Therefore, biological nitrate removal (denitrification) may be an important pretreatment for the Solar Pond wastewaters at RFP, improving the stability of the cement final waste form, reducing the requirement for cement (or pozzolan) additives and reducing the volume of cemented low-level mixed waste requiring ultimate disposal. A laboratory investigation of the performance of the Sequencing Batch Reactor (SBR) activated sludge process developed for nitrate removal from a synthetic brine typical of the high-nitrate and high-salinity wastewaters in the Solar Ponds at Rocky Flats Plant was carried out at the Environmental Engineering labs at the University of Colorado, Boulder, between May 1, 1994 and October 1, 1995.

  4. Production process reproducibility and product quality consistency of transient gene expression in HEK293 cells with anti-PD1 antibody as the model protein.

    Science.gov (United States)

    Ding, Kai; Han, Lei; Zong, Huifang; Chen, Junsheng; Zhang, Baohong; Zhu, Jianwei

    2017-03-01

    Demonstration of reproducibility and consistency of process and product quality is one of the most crucial issues in using transient gene expression (TGE) technology for biopharmaceutical development. In this study, we challenged the production consistency of TGE by expressing nine batches of recombinant IgG antibody in human embryonic kidney 293 cells to evaluate reproducibility including viable cell density, viability, apoptotic status, and antibody yield in cell culture supernatant. Product quality including isoelectric point, binding affinity, secondary structure, and thermal stability was assessed as well. In addition, major glycan forms of antibody from different batches of production were compared to demonstrate glycosylation consistency. Glycan compositions of the antibody harvested at different time periods were also measured to illustrate N-glycan distribution over the culture time. From the results, it has been demonstrated that different TGE batches are reproducible from lot to lot in overall cell growth, product yield, and product qualities including isoelectric point, binding affinity, secondary structure, and thermal stability. Furthermore, major N-glycan compositions are consistent among different TGE batches and conserved during cell culture time.

  5. Reproducibility in cyclostratigraphy: initiating an intercomparison project

    Science.gov (United States)

    Sinnesael, Matthias; De Vleeschouwer, David; Zeeden, Christian; Claeys, Philippe

    2017-04-01

    The study of astronomical climate forcing and the application of cyclostratigraphy have experienced spectacular growth over the last decades. In the field of cyclostratigraphy a broad range of methodological approaches exists. However, comparative study between the different approaches is lacking. Different cases demand different approaches, but with the growing importance of the field, questions arise about reproducibility, uncertainties and standardization of results. The radioisotopic dating community, in particular, has made far-reaching efforts to improve the reproducibility and intercomparison of radioisotopic dates and their errors. To satisfy this need in cyclostratigraphy, we initiate a comparable framework for the community. The aims are to investigate and quantify the reproducibility of, and uncertainties related to, cyclostratigraphic studies and to provide a platform to discuss the merits and pitfalls of different methodologies and their applicability. With this poster, we ask for feedback from the community on how to design this comparative framework in a useful, meaningful and productive manner. In parallel, we would like to discuss how reproducibility should be tested and what uncertainties should stand for in cyclostratigraphy. On the other hand, we intend to trigger interest in a cyclostratigraphic intercomparison project. This intercomparison project would imply the analysis of artificial and genuine geological records by individual researchers. All participants would be free to determine their method of choice. However, a handful of criteria will be required for an outcome to be comparable. The different results would be compared (e.g. during a workshop or a special session), and the lessons learned from the comparison could potentially be reported in a review paper. The aim of an intercomparison project is not to rank the different methods according to their merits, but to get insight into which specific methods are most suitable for which…

  6. A how to guide to reproducible research

    OpenAIRE

    Whitaker, Kirstie

    2018-01-01

    This talk will discuss the perceived and actual barriers experienced by researchers attempting to do reproducible research, and give practical guidance on how they can be overcome. It will include suggestions on how to make your code and data available and usable for others (including a strong suggestion to document both clearly so you don't have to reply to lots of email questions from future users). Specifically it will include a brief guide to version control, collaboration and disseminati...

  7. Bad Behavior: Improving Reproducibility in Behavior Testing.

    Science.gov (United States)

    Andrews, Anne M; Cheng, Xinyi; Altieri, Stefanie C; Yang, Hongyan

    2018-01-24

    Systems neuroscience research is increasingly possible through the use of integrated molecular and circuit-level analyses. These studies depend on the use of animal models and, in many cases, molecular and circuit-level analyses associated with genetic, pharmacologic, epigenetic, and other types of environmental manipulations. We illustrate typical pitfalls resulting from poor validation of behavior tests. We describe experimental designs and enumerate the controls needed to improve reproducibility in investigating and reporting behavioral phenotypes.

  8. A quality monitoring program for red blood cell components: in vitro quality indicators before and after implementation of semiautomated processing.

    Science.gov (United States)

    Acker, Jason P; Hansen, Adele L; Kurach, Jayme D R; Turner, Tracey R; Croteau, Ioana; Jenkins, Craig

    2014-10-01

    Canadian Blood Services has been conducting quality monitoring of red blood cell (RBC) components since 2005, a period spanning the implementation of semiautomated component production. The aim was to compare the quality of RBC components produced before and after this production method change. Data from 572 RBC units were analyzed, categorized by production method: Method 1, RBC units produced by manual production methods; Method 2, RBC units produced by semiautomated production and the buffy coat method; and Method 3, RBC units produced by semiautomated production and the whole blood filtration method. RBC units were assessed using an extensive panel of in vitro tests, encompassing regulated quality control criteria such as hematocrit (Hct), hemolysis, and hemoglobin (Hb) levels, as well as adenosine triphosphate, 2,3-diphosphoglycerate, extracellular K(+) and Na(+) levels, methemoglobin, p50, RBC indices, and morphology. Throughout the study, all RBC units met mandated Canadian Standards Association guidelines for Hb and Hct, and most (>99%) met hemolysis requirements. However, there were significant differences among RBC units produced using different methods. Hb content was significantly lower in RBC units produced by Method 2 (51.5 ± 5.6 g/unit; p < 0.001), and extracellular K(+) levels were lowest in units produced by Method 1 (p < 0.001). While overall quality was similar before and after the production method change, the observed differences, although small, indicate a lack of equivalency across RBC products manufactured by different methods. © 2014 AABB.

  9. A Study on the Cost-Effectiveness of a SemiAutomated Cutting Process at a Garment Manufacturing Company

    Directory of Open Access Journals (Sweden)

    Castro, Mark Daniel

    2017-11-01

    The subject of the study, Company X, has been experiencing variations in the quantity reports from the cutting department and the transmittal reports. The management found that these processes are hugely affected by manual labor. To reduce the system's proneness to human error, the management decided to explore the possibility of adopting a semi-automated spreading and cutting process in the system. This research aims to evaluate the pre-sewing processes of Company X and whether introducing automation can be beneficial to the company and the garments industry. The researchers used process mapping tools, descriptive research, and process flowcharts to assess the current and proposed systems, and engineering economics to evaluate the costs and benefits of implementing the semi-automated system. The results showed that with the implementation of the semi-automated system, the company will incur 66.61% more savings per year than with the current system. In terms of cycle time, the semi-automated system eliminated the relaxation of fabric before the cutting process, thereby greatly reducing cycle time. In addition, the researchers found that as long as the company produces more than 4,140 pieces per day, the system will be economically feasible. Unquantifiable benefits of introducing the semi-automated system were also identified. The company can have a cleaner work environment that will lead to more productivity and greater quality of goods. This will lead to a better company image that will encourage more customers to place job orders.

  10. Method for semi-automated microscopy of filtration-enriched circulating tumor cells.

    Science.gov (United States)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-07-14

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, ERG-rearrangement were detected by filter-adapted-FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45(-) cells, cytomorphological staining, then scanning and analysis of CD45(-) cell phenotypical and cytomorphological characteristics. CD45(-) cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm(2). The second assay sequentially combined fluorescent staining, automated selection of CD45(-) cells, FISH scanning on CD45(-) cells, then analysis of CD45(-) cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82 %, 91 %, and 95 % of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here

  11. Method for semi-automated microscopy of filtration-enriched circulating tumor cells

    International Nuclear Information System (INIS)

    Pailler, Emma; Oulhen, Marianne; Billiot, Fanny; Galland, Alexandre; Auger, Nathalie; Faugeroux, Vincent; Laplace-Builhé, Corinne; Besse, Benjamin; Loriot, Yohann; Ngo-Camus, Maud; Hemanda, Merouan; Lindsay, Colin R.; Soria, Jean-Charles; Vielh, Philippe; Farace, Françoise

    2016-01-01

    Circulating tumor cell (CTC)-filtration methods capture high numbers of CTCs in non-small-cell lung cancer (NSCLC) and metastatic prostate cancer (mPCa) patients, and hold promise as a non-invasive technique for treatment selection and disease monitoring. However, filters have drawbacks that make the automation of microscopy challenging. We report the semi-automated microscopy method we developed to analyze filtration-enriched CTCs from NSCLC and mPCa patients. Spiked cell lines in normal blood and CTCs were enriched by ISET (isolation by size of epithelial tumor cells). Fluorescent staining was carried out using epithelial (pan-cytokeratins, EpCAM), mesenchymal (vimentin, N-cadherin), leukocyte (CD45) markers and DAPI. Cytomorphological staining was carried out with Mayer-Hemalun or Diff-Quik. ALK-, ROS1-, ERG-rearrangement were detected by filter-adapted-FISH (FA-FISH). Microscopy was carried out using an Ariol scanner. Two combined assays were developed. The first assay sequentially combined four-color fluorescent staining, scanning, automated selection of CD45− cells, cytomorphological staining, then scanning and analysis of CD45− cell phenotypical and cytomorphological characteristics. CD45− cell selection was based on DAPI and CD45 intensity, and a nuclear area >55 μm². The second assay sequentially combined fluorescent staining, automated selection of CD45− cells, FISH scanning on CD45− cells, then analysis of CD45− cell FISH signals. Specific scanning parameters were developed to deal with the uneven surface of filters and CTC characteristics. Thirty z-stacks spaced 0.6 μm apart were defined as the optimal setting, scanning 82%, 91%, and 95% of CTCs in ALK-, ROS1-, and ERG-rearranged patients respectively. A multi-exposure protocol consisting of three separate exposure times for green and red fluorochromes was optimized to analyze the intensity, size and thickness of FISH signals. The semi-automated microscopy method reported here

  12. A Framework for Reproducible Latent Fingerprint Enhancements.

    Science.gov (United States)

    Carasso, Alfred S

    2014-01-01

    Photoshop processing of latent fingerprints is the preferred methodology among law enforcement forensic experts, but that approach is not fully reproducible and may lead to questionable enhancements. Alternative, independent, fully reproducible enhancements, using IDL Histogram Equalization and IDL Adaptive Histogram Equalization, can produce better-defined ridge structures, along with considerable background information. Applying a systematic slow motion smoothing procedure to such IDL enhancements, based on the rapid FFT solution of a Lévy stable fractional diffusion equation, can attenuate background detail while preserving ridge information. The resulting smoothed latent print enhancements are comparable to, but distinct from, forensic Photoshop images suitable for input into automated fingerprint identification systems (AFIS). In addition, this progressive smoothing procedure can be reexamined by displaying the suite of progressively smoother IDL images. That suite can be stored, providing an audit trail that allows monitoring for possible loss of useful information, in transit to the user-selected optimal image. Such independent and fully reproducible enhancements provide a valuable frame of reference that may be helpful in informing, complementing, and possibly validating the forensic Photoshop methodology.
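
    The slow-motion smoothing step can be pictured as repeated application of an FFT multiplier, since the Lévy stable fractional diffusion equation u_t = -(-Δ)^(α/2) u is solved in Fourier space by multiplying with exp(-t|k|^α). A minimal Python sketch, with α and the time steps chosen purely for illustration:

```python
import numpy as np

def levy_smooth(image, alpha=1.5, times=(0.1, 0.2, 0.4, 0.8)):
    """Solve u_t = -(-Laplacian)^(alpha/2) u with the FFT multiplier
    exp(-t * |k|^alpha) and return the suite of progressively smoother
    images; alpha and the time steps are illustrative."""
    ky = np.fft.fftfreq(image.shape[0])[:, None]
    kx = np.fft.fftfreq(image.shape[1])[None, :]
    k = 2.0 * np.pi * np.hypot(kx, ky)           # radial frequency grid
    spectrum = np.fft.fft2(image)
    return [np.fft.ifft2(spectrum * np.exp(-t * k**alpha)).real for t in times]
```

    Keeping the whole returned suite mirrors the audit-trail idea: each intermediate image can be inspected for loss of ridge information before the optimal one is selected.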

  13. Reproducibility of 201Tl myocardial imaging

    International Nuclear Information System (INIS)

    McLaughlin, P.R.; Martin, R.P.; Doherty, P.; Daspit, S.; Goris, M.; Haskell, W.; Lewis, S.; Kriss, J.P.; Harrison, D.C.

    1977-01-01

    Seventy-six thallium-201 myocardial perfusion studies were performed on twenty-five patients to assess their reproducibility and the effect of varying the level of exercise on the results of imaging. Each patient had a thallium-201 study at rest. Fourteen patients had studies on two occasions at maximum exercise, and twelve patients had studies both at light and at maximum exercise. Of 70 segments in the 14 patients assessed on each of two maximum exercise tests, 64 (91 percent) were reproducible. Only 53 percent (16/30) of the ischemic defects present at maximum exercise were seen in the light exercise study in the 12 patients assessed at two levels of exercise. Correlation of perfusion defects with arteriographically proven significant coronary stenosis was good for the left anterior descending and right coronary arteries, but not as good for circumflex artery disease. Thallium-201 myocardial imaging at maximum exercise is reproducible within acceptable limits, but careful attention to exercise technique is essential for valid comparative studies.

  14. Glucoamylase production in batch, chemostat and fed-batch cultivations by an industrial strain of Aspergillus niger

    DEFF Research Database (Denmark)

    Pedersen, Henrik; Beyer, Michael; Nielsen, Jens

    2000-01-01

    The Aspergillus niger strain BO-1 was grown in batch, continuous (chemostat) and fed-batch cultivations in order to study the production of the extracellular enzyme glucoamylase under different growth conditions. In the pH range 2.5-6.0, the specific glucoamylase productivity and the specific...

  15. Batch-to-batch quality consistency evaluation of botanical drug products using multivariate statistical analysis of the chromatographic fingerprint.

    Science.gov (United States)

    Xiong, Haoshu; Yu, Lawrence X; Qu, Haibin

    2013-06-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has shown its efficacy and applicability in the quality evaluation of many kinds of industrial products. In this paper, the combined use of multivariate statistical analysis and chromatographic fingerprinting is presented to evaluate batch-to-batch quality consistency of botanical drug products. A typical botanical drug product in China, Shenmai injection, was selected as the example to demonstrate the feasibility of this approach. The high-performance liquid chromatographic fingerprint data of historical batches were collected from a traditional Chinese medicine manufacturing factory. Characteristic peaks were weighted by their variability among production batches. A principal component analysis model was established after outliers were modified or removed. Multivariate (Hotelling T² and DModX) control charts were then successfully applied to evaluate the quality consistency. The results suggest useful applications for a combination of multivariate statistical analysis with chromatographic fingerprinting in batch-to-batch quality consistency evaluation for the manufacture of botanical drug products.
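
    A minimal sketch of the Hotelling T² part of such a control chart, assuming the fingerprint data are already arranged as a batches-by-peaks matrix (scikit-learn is used for the PCA step; control limits and the DModX statistic are omitted):

```python
import numpy as np
from sklearn.decomposition import PCA

def hotelling_t2(X, n_components=3):
    """Fit a PCA model to historical batches (rows = batches, columns =
    weighted peak areas) and return each batch's Hotelling T^2 statistic,
    the sum over components of score^2 / score variance."""
    scores = PCA(n_components=n_components).fit_transform(X)
    lam = scores.var(axis=0, ddof=1)         # variance captured per component
    return (scores**2 / lam).sum(axis=1)     # compare against a control limit
```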

  16. Batch-to-Batch Rational Feedforward Control: From Iterative Learning to Identification Approaches, with Application to a Wafer Stage

    NARCIS (Netherlands)

    Blanken, L.; Boeren, F.A.J.; Bruijnen, D.J.H.; Oomen, T.A.E.

    2017-01-01

    Feedforward control enables high performance for industrial motion systems that perform nonrepeating motion tasks. Recently, learning techniques have been proposed that improve both performance and flexibility for nonrepeating tasks in a batch-to-batch fashion by using a rational parameterization in

  17. Kinetic study of batch and fed-batch enzymatic saccharification of pretreated substrate and subsequent fermentation to ethanol

    Directory of Open Access Journals (Sweden)

    Gupta Rishi

    2012-03-01

    Full Text Available Background: Enzymatic hydrolysis, the rate-limiting step in process development for biofuel, is hampered by its low sugar concentration. High-solid enzymatic saccharification could solve this problem but has several other drawbacks, such as a low rate of reaction. In the present study we have attempted to enhance the concentration of sugars in the enzymatic hydrolysate of delignified Prosopis juliflora, using a fed-batch enzymatic hydrolysis approach. Results: The enzymatic hydrolysis was carried out at elevated solid loading up to 20% (w/v), and the kinetics of batch and fed-batch enzymatic hydrolysis were compared using kinetic regimes. Under batch mode, the actual sugar concentration values at 20% initial substrate consistency were found to deviate from the predicted values, and the maximum sugar concentration obtained was 80.78 g/L. A fed-batch strategy was implemented to enhance the final sugar concentration to 127 g/L. The batch and fed-batch enzymatic hydrolysates were fermented with Saccharomyces cerevisiae, and ethanol production of 34.78 g/L and 52.83 g/L, respectively, was achieved. Furthermore, model simulations showed that higher insoluble solids in the feed resulted in both smaller reactor volume and shorter residence time. Conclusion: Fed-batch enzymatic hydrolysis is an efficient procedure for enhancing the sugar concentration in the hydrolysate. Restricting the process to suitable kinetic regimes could result in higher conversion rates.

  18. Kinetic study of batch and fed-batch enzymatic saccharification of pretreated substrate and subsequent fermentation to ethanol

    Science.gov (United States)

    2012-01-01

    Background: Enzymatic hydrolysis, the rate-limiting step in process development for biofuel, is hampered by its low sugar concentration. High-solid enzymatic saccharification could solve this problem but has several other drawbacks, such as a low rate of reaction. In the present study we have attempted to enhance the concentration of sugars in the enzymatic hydrolysate of delignified Prosopis juliflora, using a fed-batch enzymatic hydrolysis approach. Results: The enzymatic hydrolysis was carried out at elevated solid loading up to 20% (w/v), and the kinetics of batch and fed-batch enzymatic hydrolysis were compared using kinetic regimes. Under batch mode, the actual sugar concentration values at 20% initial substrate consistency were found to deviate from the predicted values, and the maximum sugar concentration obtained was 80.78 g/L. A fed-batch strategy was implemented to enhance the final sugar concentration to 127 g/L. The batch and fed-batch enzymatic hydrolysates were fermented with Saccharomyces cerevisiae, and ethanol production of 34.78 g/L and 52.83 g/L, respectively, was achieved. Furthermore, model simulations showed that higher insoluble solids in the feed resulted in both smaller reactor volume and shorter residence time. Conclusion: Fed-batch enzymatic hydrolysis is an efficient procedure for enhancing the sugar concentration in the hydrolysate. Restricting the process to suitable kinetic regimes could result in higher conversion rates. PMID:22433563
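
    A toy simulation can make the batch versus fed-batch contrast concrete. The model below is not the paper's kinetic model: it assumes a single first-order hydrolysis rate penalized at high solids, k_eff = k/(1 + S/ks), which is why staged feeding converts substrate faster at any given time.

```python
def hydrolysis(feeds, k=0.08, ks=100.0, hours=72.0, dt=0.1):
    """Toy model: substrate S (g/L) hydrolyses to sugar G with an effective
    rate k / (1 + S/ks) that is penalized at high solids; `feeds` maps a
    time (h) to the substrate added then (g/L)."""
    S, G, t = 0.0, 0.0, 0.0
    while t < hours:
        S += feeds.get(round(t, 1), 0.0)
        dS = k / (1.0 + S / ks) * S * dt     # substrate converted this step
        S, G = S - dS, G + 1.1 * dS          # ~1.1 g sugar per g cellulose
        t += dt
    return G

batch_sugar = hydrolysis({0.0: 200.0})                         # single loading
fed_sugar = hydrolysis({0.0: 80.0, 12.0: 60.0, 24.0: 60.0})    # staged feeding
```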

  19. Standing Together for Reproducibility in Large-Scale Computing: Report on reproducibility@XSEDE

    OpenAIRE

    James, Doug; Wilkins-Diehr, Nancy; Stodden, Victoria; Colbry, Dirk; Rosales, Carlos; Fahey, Mark; Shi, Justin; Silva, Rafael F.; Lee, Kyo; Roskies, Ralph; Loewe, Laurence; Lindsey, Susan; Kooper, Rob; Barba, Lorena; Bailey, David

    2014-01-01

    This is the final report on reproducibility@xsede, a one-day workshop held in conjunction with XSEDE14, the annual conference of the Extreme Science and Engineering Discovery Environment (XSEDE). The workshop's discussion-oriented agenda focused on reproducibility in large-scale computational research. Two important themes capture the spirit of the workshop submissions and discussions: (1) organizational stakeholders, especially supercomputer centers, are in a unique position to promote, enab...

  20. Semi-automated uranium analysis by a modified Davies--Gray procedure

    International Nuclear Information System (INIS)

    Swanson, G.C.

    1977-01-01

    To rapidly and reliably determine uranium in fuel materials, a semi-automated implementation of the Davies-Gray uranium titration was developed. The Davies-Gray method is essentially a three-step procedure. First, uranium is reduced quantitatively from the +6 to the +4 valence state by an excess of iron(II) in strong phosphoric acid in the absence of nitrite. Prior to the uranium reduction, nitrite is destroyed by addition of sulfamic acid. In the second step, iron(II) is selectively oxidized to iron(III) by nitric acid in the presence of a Mo(VI) catalyst. Finally, after dilution to reduce the phosphate concentration, the uranium is titrated to U(VI) with standard dichromate. The original sluggish colorimetric endpoint determination used by Davies and Gray is seldom used, since New Brunswick Laboratory discovered that addition of vanadium(IV) just prior to titration improves the reaction rate sufficiently to allow a potentiometric endpoint determination. One of the advantages of the Davies-Gray uranium titration is that it is quite specific for uranium: most common impurity elements do not interfere with the analysis, and in particular high levels of Pu, Th, and Fe are tolerated.
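
    The final titration step implies simple arithmetic: the U(IV) to U(VI) oxidation transfers two electrons per uranium atom, so milliequivalents of dichromate correspond to half as many millimoles of uranium. A small sketch of that conversion:

```python
U_MOLAR_MASS = 238.03   # g/mol, natural uranium (approximate)

def uranium_mg(titrant_ml, normality):
    """Uranium mass implied by a dichromate titration of U(IV) to U(VI):
    2 electrons per U atom, so mmol U = milliequivalents / 2."""
    meq = titrant_ml * normality       # milliequivalents of dichromate
    return (meq / 2.0) * U_MOLAR_MASS  # milligrams of uranium

# Example: 10.00 mL of 0.1 N dichromate corresponds to about 119 mg U.
```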

  1. Semiautomated volumetric response evaluation as an imaging biomarker in superior sulcus tumors

    International Nuclear Information System (INIS)

    Vos, C.G.; Paul, M.A.; Dahele, M.; Soernsen de Koste, J.R. van; Senan, S.; Bahce, I.; Smit, E.F.; Thunnissen, E.; Hartemink, K.J.

    2014-01-01

    Volumetric response to therapy has been suggested as a biomarker for patient-centered outcomes. The primary aim of this pilot study was to investigate whether the volumetric response to induction chemoradiotherapy was associated with pathological complete response (pCR) or survival in patients with superior sulcus tumors managed with trimodality therapy. The secondary aim was to evaluate a semiautomated method for serial volume assessment. In this retrospective study, treatment outcomes were obtained from a departmental database. The tumor was delineated on the computed tomography (CT) scan used for radiotherapy planning, which was typically performed during the first cycle of chemotherapy. These contours were transferred to the post-chemoradiotherapy diagnostic CT scan using deformable image registration (DIR) with/without manual editing. CT scans from 30 eligible patients were analyzed. Median follow-up was 51 months. Neither absolute nor relative reduction in tumor volume following chemoradiotherapy correlated with pCR or 2-year survival. The tumor volumes determined by DIR alone and DIR + manual editing correlated to a high degree (R² = 0.99, P < 0.01). Volumetric response to induction chemoradiotherapy was not correlated with pCR or survival in patients with superior sulcus tumors managed with trimodality therapy. DIR-based contour propagation merits further evaluation as a tool for serial volumetric assessment. (orig.)

  2. Assessment of tobacco smoke effects on neonatal cardiorespiratory control using a semi-automated processing approach.

    Science.gov (United States)

    Al-Omar, Sally; Le Rolle, Virginie; Beuchée, Alain; Samson, Nathalie; Praud, Jean-Paul; Carrault, Guy

    2018-05-10

    A semi-automated processing approach was developed to assess the effects of early postnatal environmental tobacco smoke (ETS) on the cardiorespiratory control of newborn lambs. The system consists of several steps, beginning with artifact rejection, followed by the selection of stationary segments, and ending with feature extraction. This approach was used in six lambs exposed to 20 cigarettes/day for the first 15 days of life, while another six control lambs were exposed to room air. On postnatal day 16, electrocardiographic and respiratory signals were obtained from a 6-h polysomnographic recording. The effects of postnatal ETS exposure on heart rate variability, respiratory rate variability, and cardiorespiratory interrelations were explored. The results suggest that early postnatal ETS exposure increases respiratory rate variability and decreases the coupling between the cardiac and respiratory systems. Potentially harmful consequences in early life include unstable breathing and decreased adaptability of cardiorespiratory function, particularly during early-life challenges such as prematurity or viral infection.
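
    As an illustration of the feature-extraction stage, the sketch below computes two standard time-domain variability features from a stationary segment of intervals; the specific feature set used in the study is not reproduced here.

```python
import numpy as np

def variability_features(intervals_ms):
    """Time-domain variability features from one stationary segment of
    beat-to-beat (or breath-to-breath) intervals, in milliseconds."""
    x = np.asarray(intervals_ms, dtype=float)
    diffs = np.diff(x)
    return {
        "mean": x.mean(),
        "sdnn": x.std(ddof=1),                # overall variability
        "rmssd": np.sqrt(np.mean(diffs**2)),  # short-term variability
    }
```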

  3. Protannotator: a semiautomated pipeline for chromosome-wise functional annotation of the "missing" human proteome.

    Science.gov (United States)

    Islam, Mohammad T; Garg, Gagan; Hancock, William S; Risk, Brian A; Baker, Mark S; Ranganathan, Shoba

    2014-01-03

    The chromosome-centric human proteome project (C-HPP) aims to define the complete set of proteins encoded in each human chromosome. The neXtProt database (September 2013) lists 20,128 proteins for the human proteome, of which 3831 human proteins (∼19%) are considered "missing" according to the standard metrics table (released September 27, 2013). In support of the C-HPP initiative, we have extended the annotation strategy developed for human chromosome 7 "missing" proteins into a semiautomated pipeline to functionally annotate the "missing" human proteome. This pipeline integrates a suite of bioinformatics analysis and annotation software tools to identify homologues and map putative functional signatures, gene ontology, and biochemical pathways. From sequential BLAST searches, we have primarily identified homologues from reviewed nonhuman mammalian proteins with protein evidence for 1271 (33.2%) "missing" proteins, followed by 703 (18.4%) homologues from reviewed nonhuman mammalian proteins and subsequently 564 (14.7%) homologues from reviewed human proteins. Functional annotations for 1945 (50.8%) "missing" proteins were also determined. To accelerate the identification of "missing" proteins from proteomics studies, we generated proteotypic peptides in silico. Matching these proteotypic peptides to ENCODE proteogenomic data resulted in proteomic evidence for 107 (2.8%) of the 3831 "missing" proteins, while evidence from a recent membrane proteomic study supported the existence of another 15 "missing" proteins. The chromosome-wise functional annotation of all "missing" proteins is freely available to the scientific community through our web server (http://biolinfo.org/protannotator).

  4. Semi-automated segmentation of a glioblastoma multiforme on brain MR images for radiotherapy planning.

    Science.gov (United States)

    Hori, Daisuke; Katsuragawa, Shigehiko; Murakami, Ryuuji; Hirai, Toshinori

    2010-04-20

    We propose a computerized method for semi-automated segmentation of the gross tumor volume (GTV) of a glioblastoma multiforme (GBM) on brain MR images for radiotherapy planning (RTP). Three-dimensional (3D) MR images of 28 cases with a GBM were used in this study. First, a sphere volume of interest (VOI) including the GBM was selected by clicking a part of the GBM region in the 3D image. Then, the sphere VOI was transformed to a two-dimensional (2D) image by use of a spiral-scanning technique. We employed active contour models (ACM) to delineate an optimal outline of the GBM in the transformed 2D image. After inverse transform of the optimal outline to the 3D space, a morphological filter was applied to smooth the shape of the 3D segmented region. For evaluation of our computerized method, we compared the computer output with manually segmented regions obtained by a therapeutic radiologist using a manual tracking method. For this comparison, we employed the Jaccard similarity coefficient (JSC) and the true segmentation coefficient (TSC) computed on the volumes of the computer output and the manually segmented region. The mean and standard deviation of JSC and TSC were 74.2 ± 9.8% and 84.1 ± 7.1%, respectively. Our segmentation method provided a relatively accurate outline for the GBM and would be useful for radiotherapy planning.
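
    The JSC used above is straightforward to compute on binary masks. In the sketch below the JSC follows its usual definition, while `overlap_fraction` is only a stand-in for the paper's TSC, whose exact formula the abstract does not give:

```python
import numpy as np

def jaccard(a, b):
    """Jaccard similarity coefficient between two binary 3D masks."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / np.logical_or(a, b).sum()

def overlap_fraction(a, b):
    """Fraction of the reference mask b covered by a; a stand-in for the
    paper's TSC, whose exact definition the abstract does not give."""
    a, b = a.astype(bool), b.astype(bool)
    return np.logical_and(a, b).sum() / b.sum()
```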

  5. Semi-automated analysis of EEG spikes in the preterm fetal sheep using wavelet analysis

    International Nuclear Information System (INIS)

    Walbran, A.C.; Unsworth, C.P.; Gunn, A.J.; Benett, L.

    2010-01-01

    Perinatal hypoxia plays a key role in the genesis of brain injury in premature infants. Cerebral hypothermia commenced in the latent phase of evolving injury (the first 6-8 h after a hypoxic-ischemic insult) is the lead candidate for treatment; however, there is currently no means to identify which infants can benefit from treatment. Recent studies suggest that epileptiform transients in the latent phase are predictive of neural outcome. To quantify this, an automated means of EEG analysis is required, as EEG monitoring produces vast amounts of data that are time-consuming to analyse manually. We have developed a semi-automated EEG spike detection method which employs a discretized version of the continuous wavelet transform (CWT). EEG data were obtained from a fetal sheep at approximately 0.7 of gestation. Fetal asphyxia was maintained for 25 min and the EEG recorded for 8 h before and after asphyxia. The CWT was calculated, followed by the power of the wavelet transform coefficients. Areas of high power corresponded to spike waves, so thresholding was employed to identify the spikes. The method was found to have good sensitivity and selectivity, demonstrating that it is a simple, robust and potentially effective spike detection algorithm.
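
    A minimal sketch of this detection chain, using PyWavelets' Mexican-hat CWT in place of whatever discretization the authors implemented; the scale range and the SD-based threshold are illustrative choices:

```python
import numpy as np
import pywt

def detect_spikes(eeg, fs=256.0, thresh_sd=4.0):
    """Compute a discretized CWT (Mexican-hat wavelet), take the power of
    the coefficients, and keep time points whose summed power exceeds a
    threshold; scale range and threshold are illustrative."""
    scales = np.arange(1, int(0.07 * fs))      # spike-like scales (< ~70 ms)
    coeffs, _ = pywt.cwt(eeg, scales, "mexh")
    power = (coeffs**2).sum(axis=0)
    cut = np.median(power) + thresh_sd * power.std()
    return np.where(power > cut)[0] / fs       # candidate spike times (s)
```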

  6. Semi-Automated Digital Image Analysis of Pick's Disease and TDP-43 Proteinopathy.

    Science.gov (United States)

    Irwin, David J; Byrne, Matthew D; McMillan, Corey T; Cooper, Felicia; Arnold, Steven E; Lee, Edward B; Van Deerlin, Vivianna M; Xie, Sharon X; Lee, Virginia M-Y; Grossman, Murray; Trojanowski, John Q

    2016-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data are scant in frontotemporal lobar degeneration (FTLD), which poses an added challenge to study due to its morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including: Pick's disease (PiD, n=11) with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG) by immunohistochemistry. We used a color deconvolution process to isolate signal from the chromogen and applied both object detection and intensity thresholding algorithms to quantify pathological burden. We found object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. © The Author(s) 2015.

  7. Semi-Automated Digital Image Analysis of Pick’s Disease and TDP-43 Proteinopathy

    Science.gov (United States)

    Irwin, David J.; Byrne, Matthew D.; McMillan, Corey T.; Cooper, Felicia; Arnold, Steven E.; Lee, Edward B.; Van Deerlin, Vivianna M.; Xie, Sharon X.; Lee, Virginia M.-Y.; Grossman, Murray; Trojanowski, John Q.

    2015-01-01

    Digital image analysis of histology sections provides reliable, high-throughput methods for neuropathological studies, but data are scant in frontotemporal lobar degeneration (FTLD), which poses an added challenge to study due to its morphologically diverse pathologies. Here, we describe a novel method of semi-automated digital image analysis in FTLD subtypes including: Pick’s disease (PiD, n=11) with tau-positive intracellular inclusions and neuropil threads, and TDP-43 pathology type C (FTLD-TDPC, n=10), defined by TDP-43-positive aggregates predominantly in large dystrophic neurites. To do this, we examined three FTLD-associated cortical regions: mid-frontal gyrus (MFG), superior temporal gyrus (STG) and anterior cingulate gyrus (ACG) by immunohistochemistry. We used a color deconvolution process to isolate signal from the chromogen and applied both object detection and intensity thresholding algorithms to quantify pathological burden. We found object-detection algorithms had good agreement with gold-standard manual quantification of tau- and TDP-43-positive inclusions. Our sampling method was reliable across three separate investigators and we obtained similar results in a pilot analysis using open-source software. Regional comparisons using these algorithms find differences in regional anatomic disease burden between PiD and FTLD-TDP not detected using traditional ordinal scale data, suggesting digital image analysis is a powerful tool for clinicopathological studies in morphologically diverse FTLD syndromes. PMID:26538548
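
    Both quantification routes (object detection and intensity thresholding after colour deconvolution) can be sketched with scikit-image; the thresholds below are illustrative, not the study's calibrated values:

```python
from skimage.color import rgb2hed
from skimage.measure import label, regionprops

def dab_burden(rgb_image, dab_thresh=0.05, min_area=20):
    """Colour-deconvolve an immunohistochemistry image, threshold the DAB
    channel, and quantify burden by %area and by object count; both
    thresholds are illustrative."""
    dab = rgb2hed(rgb_image)[:, :, 2]          # haematoxylin, eosin, DAB
    mask = dab > dab_thresh                    # intensity-thresholding route
    objects = [r for r in regionprops(label(mask)) if r.area >= min_area]
    return mask.mean(), len(objects)           # (%area, object count)
```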

  8. Semiautomated TaqMan PCR screening of GMO labelled samples for (unauthorised) GMOs.

    Science.gov (United States)

    Scholtens, Ingrid M J; Molenaar, Bonnie; van Hoof, Richard A; Zaaijer, Stephanie; Prins, Theo W; Kok, Esther J

    2017-06-01

    In most countries, systems are in place to analyse food products for the potential presence of genetically modified organisms (GMOs), to enforce labelling requirements and to screen for the potential presence of unauthorised GMOs. With the growing number of GMOs on the world market, a larger diversity of methods is required for informative analyses. In this paper, the specificity of an extended screening set consisting of 32 screening methods to identify different crop species (endogenous genes) and GMO elements was verified against 59 different GMO reference materials. In addition, a cost- and time-efficient strategy for DNA isolation, screening and identification is presented. A module for semiautomated analysis of the screening results and planning of subsequent event-specific tests for identification has been developed. The Excel-based module contains information on the experimentally verified specificity of the element methods and of the EU authorisation status of the GMO events. If a detected GMO element cannot be explained by any of the events as identified in the same sample, this may indicate the presence of an unknown unauthorised GMO that may not yet have been assessed for its safety for humans, animals or the environment.
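
    The core decision logic of such a module is small: collect the elements explained by the events identified in a sample and flag any detected element left over. A sketch with invented element/event tables (a real module would use the verified specificity data mentioned above):

```python
# Invented element tables; a real module would use the verified
# specificities of the 32 screening methods.
EVENT_ELEMENTS = {
    "MON810": {"p35S", "hsp70-intron", "cry1Ab"},
    "GTS40-3-2": {"p35S", "tNOS", "CP4-EPSPS"},
}

def unexplained_elements(detected, identified_events):
    """Return screening elements not explained by any identified event;
    a non-empty result may point to an unauthorised GMO."""
    explained = set().union(*(EVENT_ELEMENTS[e] for e in identified_events))
    return set(detected) - explained

# unexplained_elements({"p35S", "tNOS", "bar"}, ["GTS40-3-2"]) -> {"bar"}
```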

  9. Semi-automated operation of Mars Climate Simulation chamber - MCSC modelled for biological experiments

    Science.gov (United States)

    Tarasashvili, M. V.; Sabashvili, Sh. A.; Tsereteli, S. L.; Aleksidze, N. D.; Dalakishvili, O.

    2017-10-01

    The Mars Climate Simulation Chamber (MCSC) (GEO PAT 12 522/01) is designed for the investigation of the possible past and present habitability of Mars, as well as for the solution of practical tasks necessary for the colonization and Terraformation of the Planet. There are specific tasks such as the experimental investigation of the biological parameters that allow many terrestrial organisms to adapt to the simulated Martian conditions: chemistry of the ground, atmosphere, temperature, radiation, etc. The MCSC is set up for conducting various biological experiments, as well as for the selection of extremophile microorganisms for possible Settlement, Ecopoesis and/or Terraformation purposes and the investigation of their physiological functions. For long-term purposes, it is possible to cultivate genetically modified organisms (e.g., plants) adapted to the Martian conditions for future Martian agriculture to sustain human Mars missions and permanent settlements. The size of the chamber allows preliminary testing of the functionality of space-station mini-models and personal protection devices such as space-suits, covering and building materials and other structures. The reliability of the experimental biotechnological materials can also be tested over a period of years. Complex and thorough research has been performed to acquire the most appropriate technical tools for the accurate engineering of the MCSC and the precise programmed simulation of Martian environmental conditions. This paper describes the construction and technical details of the equipment of the MCSC, which allows its semi-automated, long-term operation.

  10. Semi-automated camera trap image processing for the detection of ungulate fence crossing events.

    Science.gov (United States)

    Janzen, Michael; Visser, Kaitlyn; Visscher, Darcy; MacLeod, Ian; Vujnovic, Dragomir; Vujnovic, Ksenija

    2017-09-27

    Remote cameras are an increasingly important tool for ecological research. While remote camera traps collect field data with minimal human attention, the images they collect require post-processing and characterization before they can be ecologically and statistically analyzed, requiring the input of substantial time and money from researchers. The need for post-processing is due, in part, to a high incidence of non-target images. We developed a stand-alone semi-automated computer program to aid in image processing, categorization, and data reduction by employing background subtraction and histogram rules. Unlike previous work that uses video as input, our program uses still camera trap images. The program was developed for an ungulate fence crossing project and tested against an image dataset which had been previously processed by a human operator. Our program placed images into categories representing the confidence that a particular sequence of images contains a fence crossing event. This reduced by 54.8% the number of images requiring further human operator characterization, while retaining 72.6% of the known fence crossing events. This program can provide researchers using remote camera data the ability to reduce the time and cost required for image post-processing and characterization. Further, we discuss how this procedure might be generalized to situations not specifically related to animal use of linear features.
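
    A stripped-down version of the background-subtraction idea, with the histogram rules reduced to a single changed-pixel-fraction rule and all thresholds illustrative (OpenCV assumed available):

```python
import cv2
import numpy as np

def crossing_candidate(background, frame, diff_thresh=25, min_frac=0.002):
    """Difference a trap image against an empty-scene reference and score
    the changed-pixel fraction; thresholds are illustrative."""
    g0 = cv2.cvtColor(background, cv2.COLOR_BGR2GRAY)
    g1 = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(g0, g1)
    changed = np.count_nonzero(diff > diff_thresh) / diff.size
    return changed >= min_frac, changed   # (candidate event?, score)
```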

  11. Chemical composition dispersion in bi-metallic nanoparticles: semi-automated analysis using HAADF-STEM

    International Nuclear Information System (INIS)

    Epicier, T.; Sato, K.; Tournus, F.; Konno, T.

    2012-01-01

    We present a method using high-angle annular dark field scanning transmission electron microscopy (HAADF-STEM) to determine the chemical composition of bi-metallic nanoparticles. This method, which can be applied in a semi-automated way, allows large-scale analysis with a statistical number of particles (several hundred) in a short time. Once a calibration curve has been obtained, e.g., using energy-dispersive X-ray spectroscopy (EDX) measurements on a few particles, the HAADF integrated intensity of each particle can indeed be directly related to its chemical composition. After a theoretical description, this approach is applied to the case of iron–palladium nanoparticles (expected to be nearly stoichiometric) with a mean size of 8.3 nm. It will be shown that an accurate chemical composition histogram is obtained, i.e., the Fe content has been determined to be 49.0 at.% with a dispersion of 10.4%. HAADF-STEM analysis represents a powerful alternative to tedious single-particle EDX measurements for determining the compositional dispersion in alloy nanoparticles.
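
    The calibration step can be sketched in a few lines: fit intensity against EDX-measured composition for a few particles, then apply the fit to all particles. A linear calibration is assumed here for simplicity; the appropriate functional form depends on the imaging conditions.

```python
import numpy as np

def fe_content_from_haadf(intensities, edx_intensities, edx_fe_at):
    """Calibrate integrated HAADF intensity against EDX-measured Fe at.%
    on a few particles, then convert every particle's intensity; a linear
    calibration is an assumption made for this sketch."""
    slope, intercept = np.polyfit(edx_intensities, edx_fe_at, 1)
    fe = slope * np.asarray(intensities, dtype=float) + intercept
    return fe, fe.mean(), fe.std(ddof=1)   # per particle, mean, dispersion
```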

  12. Semi-automated 86Y purification using a three-column system

    International Nuclear Information System (INIS)

    Park, Luke S.; Szajek, Lawrence P.; Wong, Karen J.; Plascjak, Paul S.; Garmestani, Kayhan; Googins, Shawn; Eckelman, William C.; Carrasquillo, Jorge A.; Paik, Chang H.

    2004-01-01

    The separation of 86Y from 86Sr was optimized by a semi-automated purification system involving the passage of the target sample through three sequential columns. The target material was dissolved in 4 N HNO3 and loaded onto a Sr-selective (Sr-Spec) column to retain the 86Sr. The yttrium was eluted with 4 N HNO3 onto the second, Y-selective (RE-Spec) column with quantitative retention. The RE-Spec column was eluted with a stepwise decreasing concentration of HNO3 to wash out potential metallic impurities to a waste container. The eluate was then pumped onto an Aminex A5 column with 0.1 N HCl and finally with 3 N HCl to collect the radioyttrium in 0.6-0.8 mL with a >80% recovery. This method enabled us to decontaminate Sr by a factor of 250,000 and to label 30 μg of DOTA-Biotin with a >95% yield.

  13. A semi-automated methodology for finding lipid-related GO terms.

    Science.gov (United States)

    Fan, Mengyuan; Low, Hong Sang; Wenk, Markus R; Wong, Limsoon

    2014-01-01

    Although semantic similarity in Gene Ontology (GO) and other approaches may be used to find similar GO terms, there is not yet a method to systematically find a class of GO terms sharing a common property with high accuracy (e.g., involving human curation). We have developed a methodology to address this issue and applied it to identify lipid-related GO terms, owing to the important and varied roles of lipids in many biological processes. Our methodology finds lipid-related GO terms in a semi-automated manner, requiring only moderate manual curation. We first obtain a list of lipid-related gold-standard GO terms by keyword search and manual curation. Then, based on the hypothesis that co-annotated GO terms share similar properties, we develop a machine learning method that expands the list of lipid-related terms from the gold standard. Those terms predicted most likely to be lipid related are examined by a human curator following specific curation rules to confirm the class labels. The structure of GO is also exploited to help reduce the curation effort. The prediction and curation cycle is repeated until no further lipid-related term is found. Our approach has covered a high proportion, if not all, of lipid-related terms with relatively high efficiency. http://compbio.ddns.comp.nus.edu.sg/~lipidgo. © The Author(s) 2014. Published by Oxford University Press.

  14. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    Directory of Open Access Journals (Sweden)

    Jingshan Huang

    Full Text Available As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.

  15. OMIT: dynamic, semi-automated ontology development for the microRNA domain.

    Science.gov (United States)

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A; Natale, Darren A; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology.

  16. The accuracy of liquid-liquid phase transition temperatures determined from semiautomated light scattering measurements

    Science.gov (United States)

    Dean, Kevin M.; Babayco, Christopher B.; Sluss, Daniel R. B.; Williamson, J. Charles

    2010-08-01

    The synthetic-method determination of liquid-liquid coexistence curves using semiautomated light scattering instrumentation and stirred samples is based on identifying the coexistence curve transition temperatures (Tcx) from sudden changes in turbidity associated with droplet formation. Here we use a thorough set of such measurements to evaluate the accuracy of several different analysis methods reported in the literature for assigning Tcx. More than 20 samples each of weakly opalescent isobutyric acid+water and strongly opalescent aniline+hexane were tested with our instrumentation. Transmitted light and scattering intensities at 2°, 24°, and 90° were collected simultaneously as a function of temperature for each stirred sample, and the data were compared with visual observations and light scattering theory. We find that assigning Tcx to the onset of decreased transmitted light or increased 2° scattering has a potential accuracy of 0.01 K or better for many samples. However, the turbidity due to critical opalescence obscures the identification of Tcx from the light scattering data of near-critical stirred samples, and no simple rule of interpretation can be applied regardless of collection geometry. At best, when 90° scattering is collected along with transmitted or 2° data, the accuracy of Tcx is limited to 0.05 K for near-critical samples. Visual determination of Tcx remains the more accurate approach in this case.
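
    One of the simplest assignment rules evaluated in such measurements, assigning Tcx to the first clear drop of transmitted intensity below the one-phase baseline, can be sketched as follows (the SD multiplier and baseline window are illustrative):

```python
import numpy as np

def onset_temperature(T, transmitted, drop_sd=5.0, baseline_pts=20):
    """Assign Tcx to the first temperature where transmitted intensity
    falls a set number of baseline SDs below the one-phase baseline;
    both parameters are illustrative."""
    base = transmitted[:baseline_pts]            # one-phase region
    cut = base.mean() - drop_sd * base.std(ddof=1)
    below = np.where(transmitted < cut)[0]
    return T[below[0]] if below.size else None
```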

  17. A Semiautomated Framework for Integrating Expert Knowledge into Disease Marker Identification

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jing; Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Varnum, Susan M.; Brown, Joseph N.; Riensche, Roderick M.; Adkins, Joshua N.; Jacobs, Jon M.; Hoidal, John R.; Scholand, Mary Beth; Pounds, Joel G.; Blackburn, Michael R.; Rodland, Karin D.; McDermott, Jason E.

    2013-10-01

    Background. The availability of large complex data sets generated by high throughput technologies has enabled the recent proliferation of disease biomarker studies. However, a recurring problem in deriving biological information from large data sets is how to best incorporate expert knowledge into the biomarker selection process. Objective. To develop a generalizable framework that can incorporate expert knowledge into data-driven processes in a semiautomated way while providing a metric for optimization in a biomarker selection scheme. Methods. The framework was implemented as a pipeline consisting of five components for the identification of signatures from integrated clustering (ISIC). Expert knowledge was integrated into the biomarker identification process using the combination of two distinct approaches; a distance-based clustering approach and an expert knowledge-driven functional selection. Results. The utility of the developed framework ISIC was demonstrated on proteomics data from a study of chronic obstructive pulmonary disease (COPD). Biomarker candidates were identified in a mouse model using ISIC and validated in a study of a human cohort. Conclusions. Expert knowledge can be introduced into a biomarker discovery process in different ways to enhance the robustness of selected marker candidates. Developing strategies for extracting orthogonal and robust features from large data sets increases the chances of success in biomarker identification.

  18. OMIT: Dynamic, Semi-Automated Ontology Development for the microRNA Domain

    Science.gov (United States)

    Huang, Jingshan; Dang, Jiangbo; Borchert, Glen M.; Eilbeck, Karen; Zhang, He; Xiong, Min; Jiang, Weijian; Wu, Hao; Blake, Judith A.; Natale, Darren A.; Tan, Ming

    2014-01-01

    As a special class of short non-coding RNAs, microRNAs (a.k.a. miRNAs or miRs) have been reported to perform important roles in various biological processes by regulating respective target genes. However, significant barriers exist during biologists' conventional miR knowledge discovery. Emerging semantic technologies, which are based upon domain ontologies, can render critical assistance to this problem. Our previous research has investigated the construction of a miR ontology, named Ontology for MIcroRNA Target Prediction (OMIT), the very first of its kind that formally encodes miR domain knowledge. Although it is unavoidable to have a manual component contributed by domain experts when building ontologies, many challenges have been identified for a completely manual development process. The most significant issue is that a manual development process is very labor-intensive and thus extremely expensive. Therefore, we propose in this paper an innovative ontology development methodology. Our contributions can be summarized as: (i) We have continued the development and critical improvement of OMIT, solidly based on our previous research outcomes. (ii) We have explored effective and efficient algorithms with which the ontology development can be seamlessly combined with machine intelligence and be accomplished in a semi-automated manner, thus significantly reducing large amounts of human efforts. A set of experiments have been conducted to thoroughly evaluate our proposed methodology. PMID:25025130

  19. Feasibility of a semi-automated contrast-oriented algorithm for tumor segmentation in retrospectively gated PET images: phantom and clinical validation

    Science.gov (United States)

    Carles, Montserrat; Fechter, Tobias; Nemer, Ursula; Nanko, Norbert; Mix, Michael; Nestle, Ursula; Schaefer, Andrea

    2015-12-01

    PET/CT plays an important role in radiotherapy planning for lung tumors. Several segmentation algorithms have been proposed for PET tumor segmentation. However, most of them do not take into account respiratory motion and are not well validated. The aim of this work was to evaluate a semi-automated contrast-oriented algorithm (COA) for PET tumor segmentation adapted to retrospectively gated (4D) images. The evaluation involved a wide set of 4D-PET/CT acquisitions of dynamic experimental phantoms and lung cancer patients. In addition, segmentation accuracy of 4D-COA was compared with four other state-of-the-art algorithms. In phantom evaluation, the physical properties of the objects defined the gold standard. In clinical evaluation, the ground truth was estimated by the STAPLE (Simultaneous Truth and Performance Level Estimation) consensus of three manual PET contours by experts. Algorithm evaluation with phantoms resulted in: (i) no statistically significant diameter differences for different targets and movements (Δφ = 0.3 ± 1.6 mm); (ii) reproducibility for heterogeneous and irregular targets independent of user initial interaction and (iii) good segmentation agreement for irregular targets compared to manual CT delineation in terms of Dice Similarity Coefficient (DSC = 0.66 ± 0.04), Positive Predictive Value (PPV = 0.81 ± 0.06) and Sensitivity (Sen. = 0.49 ± 0.05). In clinical evaluation, the segmented volume was in reasonable agreement with the consensus volume (difference in volume (%Vol) = 40 ± 30, DSC = 0.71 ± 0.07 and PPV = 0.90 ± 0.13). High accuracy in target tracking position (ΔME) was obtained for experimental and clinical data (ΔME_exp = 0 ± 3 mm; ΔME_clin = 0.3 ± 1.4 mm). In the comparison with other lung segmentation methods, 4D-COA showed the highest volume accuracy in both experimental and clinical data. In conclusion, the accuracy in volume

  20. Convergence of macrostates under reproducible processes

    International Nuclear Information System (INIS)

    Rau, Jochen

    2010-01-01

    I show that whenever a system undergoes a reproducible macroscopic process the mutual distinguishability of macrostates, as measured by their relative entropy, diminishes. This extends the second law which regards only ordinary entropies, and hence only the distinguishability between macrostates and one specific reference state (equidistribution). The new result holds regardless of whether the process is linear or nonlinear. Its proof hinges on the monotonicity of quantum relative entropy under arbitrary coarse grainings, even those that cannot be represented by trace-preserving completely positive maps.
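
    For the familiar case of a coarse graining given by a trace-preserving completely positive map Φ (the paper argues the statement holds more generally), the result reduces to the monotonicity of quantum relative entropy:

```latex
% Quantum relative entropy and its monotonicity under a coarse graining
% \Phi, stated here for the familiar CPTP case:
S(\rho \,\|\, \sigma) = \operatorname{Tr}\,\rho\,(\ln\rho - \ln\sigma),
\qquad
S\big(\Phi(\rho) \,\|\, \Phi(\sigma)\big) \le S(\rho \,\|\, \sigma).
```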

  1. Open and reproducible global land use classification

    Science.gov (United States)

    Nüst, Daniel; Václavík, Tomáš; Pross, Benjamin

    2015-04-01

    Researchers led by the Helmholtz Centre for Environmental Research (UFZ) developed a new world map of land use systems based on over 30 diverse indicators (http://geoportal.glues.geo.tu-dresden.de/stories/landsystemarchetypes.html) of land use intensity, climate and environmental and socioeconomic factors. They identified twelve land system archetypes (LSA) using a data-driven classification algorithm (self-organizing maps) to assess global impacts of land use on the environment, and found unexpected similarities across global regions. We present how the algorithm behind this analysis can be published as an executable web process using 52°North WPS4R (https://wiki.52north.org/bin/view/Geostatistics/WPS4R) within the GLUES project (http://modul-a.nachhaltiges-landmanagement.de/en/scientific-coordination-glues/). WPS4R is an open source collaboration platform for researchers, analysts and software developers to publish R scripts (http://www.r-project.org/) as a geo-enabled OGC Web Processing Service (WPS) process. The interoperable interface to call the geoprocess allows both reproducibility of the analysis and integration of user data without knowledge about web services or classification algorithms. The open platform allows everybody to replicate the analysis in their own environments. The LSA WPS process has several input parameters, which can be changed via a simple web interface. The input parameters are used to configure both the WPS environment and the LSA algorithm itself. The encapsulation as a web process allows integration of non-public datasets, while at the same time the publication requires a well-defined documentation of the analysis. We demonstrate this platform specifically to domain scientists and show how reproducibility and open source publication of analyses can be enhanced. We also discuss future extensions of the reproducible land use classification, such as the possibility for users to enter their own areas of interest to the system and

  2. Comparing a Dynamic Fed-Batch and a Continuous Steady-State Simulation of Ethanol Fermentation in a Distillery to a Stoichiometric Conversion Simulation

    Directory of Open Access Journals (Sweden)

    G.C. Fonseca

    Full Text Available An autonomous sugarcane bioethanol plant was simulated in EMSO software, an equation-oriented process simulator. Three types of fermentation units were simulated: a system of six parallel fed-batch reactors, a set of four CSTRs in steady state, and a single stoichiometric reactor. Stoichiometric models are less accurate than the kinetics-based fermentation models used for the fed-batch and continuous fermenter simulations, since they do not account for inhibition effects and instead depend on a specified conversion rate of reactant. On the other hand, stoichiometric models are faster and simpler to converge. In this study it was found that the conversion rates of sugar for the fermentation systems analyzed were predictable from information on the composition of the juice stream. Those rates were used in the stoichiometric model, which accurately reproduced the results from both the fed-batch and the continuous fermenter systems.
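
    The stoichiometric-reactor model reduces to one line of chemistry, C6H12O6 → 2 C2H5OH + 2 CO2, once a conversion rate is supplied. A sketch of that calculation (molar masses rounded):

```python
GLUCOSE_MW, ETHANOL_MW = 180.16, 46.07   # g/mol

def ethanol_from_glucose(glucose_g, conversion):
    """Stoichiometric-reactor sketch: C6H12O6 -> 2 C2H5OH + 2 CO2, with
    `conversion` the fraction of sugar converted (supplied externally,
    e.g. predicted from juice composition)."""
    reacted_mol = conversion * glucose_g / GLUCOSE_MW
    return 2.0 * reacted_mol * ETHANOL_MW     # grams of ethanol

# Full conversion of 1 kg glucose yields about 511 g ethanol (0.511 g/g).
```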

  3. Batch-to-batch uniformity of bacterial community succession and flavor formation in the fermentation of Zhenjiang aromatic vinegar.

    Science.gov (United States)

    Wang, Zong-Min; Lu, Zhen-Ming; Yu, Yong-Jian; Li, Guo-Quan; Shi, Jin-Song; Xu, Zheng-Hong

    2015-09-01

    Solid-state fermentation of traditional Chinese vinegar is a mixed-culture refreshment process that has proceeded for many centuries without spoilage. Here, we investigated bacterial community succession and flavor formation in three batches of Zhenjiang aromatic vinegar using pyrosequencing and metabolomics approaches. Temporal patterns of bacterial succession in the Pei (solid-state vinegar culture) showed no significant difference (P > 0.05) among the three batches of fermentation. In all the batches investigated, the average number of community operational taxonomic units (OTUs) decreased dramatically from 119 ± 11 on day 1 to 48 ± 16 on day 3, and then remained in the range of 61 ± 9 from day 5 to the end of fermentation. We confirmed that, within a batch of the fermentation process, the patterns of bacterial diversity between the starter (taken from the previous batch of vinegar culture on day 7) and the Pei on day 7 were similar (90%). The relative abundance dynamics of two dominant members, Lactobacillus and Acetobacter, showed high correlation (coefficients of 0.90 and 0.98, respectively) among different batches. Furthermore, statistical analysis revealed that the dynamics of 16 main flavor metabolites were stable among different batches. The findings validate that batch-to-batch uniformity of bacterial community succession and flavor formation accounts for the consistent quality of Zhenjiang aromatic vinegar. To our knowledge, this is the first study that helps to explain the rationality of this age-old artistry from a scientific perspective. Copyright © 2015 Elsevier Ltd. All rights reserved.

  4. Reproducibility in Research: Systems, Infrastructure, Culture

    Directory of Open Access Journals (Sweden)

    Tom Crick

    2017-11-01

    Full Text Available The reproduction and replication of research results has become a major issue for a number of scientific disciplines. In computer science and related computational disciplines such as systems biology, the challenges closely revolve around the ability to implement (and exploit) novel algorithms and models. Taking a new approach from the literature and applying it to a new codebase frequently requires local knowledge missing from the published manuscripts and transient project websites. Alongside this issue, benchmarking, and the lack of open, transparent and fair benchmark sets, presents another barrier to the verification and validation of claimed results. In this paper, we outline several recommendations to address these issues, driven by specific examples from a range of scientific domains. Based on these recommendations, we propose a high-level prototype open automated platform for scientific software development which effectively abstracts specific dependencies from the individual researcher and their workstation, allowing easy sharing and reproduction of results. This new e-infrastructure for reproducible computational science offers the potential to incentivise a culture change and drive the adoption of new techniques to improve the quality and efficiency – and thus reproducibility – of scientific exploration.

  5. Batched Triangular DLA for Very Small Matrices on GPUs

    KAUST Repository

    Charara, Ali; Keyes, David E.; Ltaief, Hatem

    2017-01-01

    linear algebra operations on very small matrix sizes (usually less than 100). Batched dense linear algebra kernels are becoming ubiquitous for such scientific computations. Within a single API call, these kernels are capable of simultaneously launching a

  6. Groundwater arsenic remediation using zerovalent iron: Batch and column tests

    Science.gov (United States)

    Recently, increasing efforts have been made to explore the applicability and limitations of zerovalent iron (Fe⁰) for the treatment of arsenic-bearing groundwater and wastewater. The experimental batch and column tests have demonstrated that arsenate and arsenite are removed effec...

  7. Batch Adsorption Study of Methylene Blue in Aqueous Solution ...

    African Journals Online (AJOL)

    of methylene blue (azo dye) from the synthetic industrial wastewater was investigated in a batch system. Rice husk and coconut shell were ... the textiles, rubber, paper, plastics, cosmetic, and .... wastewater by. Fenton's oxidation: Kinetic study.

  8. Automated batch emulsion copolymerization of styrene and butyl acrylate

    NARCIS (Netherlands)

    Mballa Mballa, M.A.; Schubert, U.S.; Heuts, J.P.A.; Herk, van A.M.

    2011-01-01

    This article describes a method for carrying out emulsion copolymerization using an automated synthesizer. For this purpose, batch emulsion copolymerizations of styrene and butyl acrylate were investigated. The optimization of the polymerization system required tuning the liquid transfer method,

  9. development of an automated batch-process solar water disinfection

    African Journals Online (AJOL)

    This work presents the development of an automated batch-process water disinfection system ... Locally sourced materials in addition to an Arduino microprocessor were used to control ..... As already mentioned in section 3.1.1, a statistical.

  10. Numerical modeling of batch formation in waste incineration plants

    Directory of Open Access Journals (Sweden)

    Obroučka Karel

    2015-03-01

    Full Text Available The aim of this paper is a mathematical description of an algorithm for the controlled assembly of an incinerated batch of waste. The basis for batch formation is selected parameters of the incinerated waste, such as its calorific value, its content of pollutants, or a combination of both. The numerical model allows, based on the selected criteria, the compilation of a batch of waste that continuously follows the previous batch, which is a prerequisite for optimized operation of the incinerator. The model was prepared both for waste storage in containers and for waste storage in continuously refilled boxes. The mathematical model was developed into a computer program and its functionality was verified both by practical measurements and by numerical simulations. The proposed model can be used in incinerators for hazardous and municipal waste.
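
    A greedy selection rule is one simple way to realize such an algorithm; the sketch below balances only calorific value, though the same structure extends to pollutant content or a weighted combination of both criteria (all data are invented):

```python
def assemble_batch(containers, target_cv, batch_size):
    """Greedy sketch: repeatedly pick the stored container (id, calorific
    value in MJ/kg) that keeps the running batch average closest to the
    target value."""
    pool, batch, total = list(containers), [], 0.0
    for _ in range(batch_size):
        best = min(pool, key=lambda c: abs((total + c[1]) / (len(batch) + 1)
                                           - target_cv))
        pool.remove(best)
        batch.append(best)
        total += best[1]
    return batch, total / len(batch)

# assemble_batch([("A", 8.5), ("B", 14.0), ("C", 11.2), ("D", 9.8)], 10.5, 2)
```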

  11. 40 CFR 63.462 - Batch cold cleaning machine standards.

    Science.gov (United States)

    2010-07-01

    .... (a) Each owner or operator of an immersion batch cold solvent cleaning machine shall comply with the... cleaning machine complying with paragraph (a)(2) or (b) of this section shall comply with the work and...

  12. Characterization and reproducibility of HepG2 hanging drop spheroids toxicology in vitro.

    Science.gov (United States)

    Hurrell, Tracey; Ellero, Andrea Antonio; Masso, Zelie Flavienne; Cromarty, Allan Duncan

    2018-02-21

    Hepatotoxicity remains a major challenge in drug development despite preclinical toxicity screening using hepatocytes of human origin. To overcome some limitations in reproducing the hepatic phenotype, more structurally and functionally authentic cultures can be introduced in vitro by growing cells in 3D spheroid cultures. Characterisation and reproducibility of HepG2 spheroid cultures using a high-throughput hanging drop technique were assessed, and features contributing to potential phenotypic variation highlighted. Cultured HepG2 cells were seeded into Perfecta 3D® 96-well hanging drop plates and assessed over time for morphology, viability, cell cycle distribution, protein content and protein-mass profiles. The aspects examined included cell stocks, seeding density, volume of culture medium and use of extracellular matrix additives. Hanging drops are advantageous because no complex culture matrix is present, enabling background-free extractions for downstream experimentation. Varying characteristics were observed across cell stocks and batches, seeding density, culture medium volume and extracellular matrix when using immortalized HepG2 cells. These factors contribute to wide-ranging cellular responses and highlight concerns with respect to generating a reproducible phenotype in HepG2 hanging drop spheroids. Copyright © 2018 Elsevier Ltd. All rights reserved.

  13. Macroscopic Dynamic Modeling of Sequential Batch Cultures of Hybridoma Cells: An Experimental Validation

    Directory of Open Access Journals (Sweden)

    Laurent Dewasme

    2017-02-01

    Full Text Available Hybridoma cells are commonly grown for the production of monoclonal antibodies (MAbs). For monitoring and control purposes of the bioreactors, dynamic models of the cultures are required. However, these models are difficult to infer from the usually limited amount of available experimental data and do not focus on target protein production optimization. This paper explores an experimental case study where hybridoma cells are grown in a sequential batch reactor. The simplest macroscopic reaction scheme translating the data is first derived using a maximum likelihood principal component analysis. Subsequently, nonlinear least-squares estimation is used to determine the kinetic laws. The resulting dynamic model reproduces the experimental data quite satisfactorily, as evidenced in direct and cross-validation tests. Furthermore, the model can also be used to predict the optimal medium renewal time and composition.

  14. Dynamic Extensions of Batch Systems with Cloud Resources

    International Nuclear Information System (INIS)

    Hauth, T; Quast, G; Büge, V; Scheurer, A; Kunze, M; Baun, C

    2011-01-01

    Compute clusters use Portable Batch Systems (PBS) to distribute workload among individual cluster machines. To extend standard batch systems to Cloud infrastructures, a new service monitors the number of queued jobs and keeps track of the price of available resources. This meta-scheduler dynamically adapts the number of Cloud worker nodes according to the requirement profile. Two different worker node topologies are presented and tested on the Amazon EC2 Cloud service.
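
    The core of such a meta-scheduler is a loop that compares queue length and resource price against a target load per node. A minimal sketch of that scaling rule follows; the probe functions, the jobs-per-node target and the price ceiling are placeholders, not the service described in the paper.

        import time

        JOBS_PER_NODE = 10   # assumed target load per cloud worker
        MAX_PRICE = 0.10     # assumed price ceiling (USD per node-hour)

        def desired_nodes(queued_jobs, price):
            if price > MAX_PRICE:
                return 0                             # too expensive: let jobs queue
            return -(-queued_jobs // JOBS_PER_NODE)  # ceiling division

        def scaling_loop(get_queue_length, get_price, set_node_count, interval_s=60):
            # Periodically adapt the number of cloud worker nodes to the queue.
            while True:
                set_node_count(desired_nodes(get_queue_length(), get_price()))
                time.sleep(interval_s)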

  15. Effect of glass-batch makeup on the melting process

    International Nuclear Information System (INIS)

    Hrma, Pavel R.; Schweiger, Michael J.; Humrickhouse, Carissa J.; Moody, J. Adam; Tate, Rachel M.; Rainsdon, Timothy T.; Tegrotenhuis, Nathan E.; Arrigoni, Benjamin M.; Marcial, Jose; Rodriguez, Carmen P.; Tincher, Benjamin

    2010-01-01

    The response of a glass batch to heating is determined by the batch makeup and in turn determines the rate of melting. Batches formulated for a high-alumina nuclear waste to be vitrified in an all-electric melter were heated at a constant temperature-increase rate to determine changes in melting behavior in response to the selection of batch chemicals and silica grain-size as well as the addition of heat-generating reactants. The type of batch materials and the size of silica grains determine how much, if any, primary foam occurs during melting. Small quartz grains, 5 µm in size, caused extensive foaming because their major portion dissolved at temperatures above 800 °C when batch gases no longer evolved. The exothermal reaction of nitrates with sucrose was ignited at a temperature as low as 160 °C and caused a temporary jump in temperature of several hundred degrees. Secondary foam, the source of which is oxygen from redox reactions, occurred in all batches of a limited composition variation involving five oxides, B2O3, CaO, Li2O, MgO, and Na2O. The foam volume at the maximum volume-increase rate was a weak function of temperature and melt basicity. Neither the batch makeup nor the change in glass composition had a significant impact on the dissolution of silica grains. The impacts of primary foam generation on glass homogeneity and the rate of melting in large-scale continuous furnaces have yet to be established via mathematical modeling and melter experiments.

  16. Effect Of Glass-Batch Makeup On The Melting Process

    International Nuclear Information System (INIS)

    Kruger, A.A.; Hrma, P.

    2010-01-01

    The response of a glass batch to heating is determined by the batch makeup and in turn determines the rate of melting. Batches formulated for a high-alumina nuclear waste to be vitrified in an all-electric melter were heated at a constant temperature-increase rate to determine changes in melting behavior in response to the selection of batch chemicals and silica grain-size as well as the addition of heat-generating reactants. The type of batch materials and the size of silica grains determine how much, if any, primary foam occurs during melting. Small quartz grains, 5 µm in size, caused extensive foaming because their major portion dissolved at temperatures above 800 °C when batch gases no longer evolved. The exothermal reaction of nitrates with sucrose was ignited at a temperature as low as 160 °C and caused a temporary jump in temperature of several hundred degrees. Secondary foam, the source of which is oxygen from redox reactions, occurred in all batches of a limited composition variation involving five oxides, B2O3, CaO, Li2O, MgO, and Na2O. The foam volume at the maximum volume-increase rate was a weak function of temperature and melt basicity. Neither the batch makeup nor the change in glass composition had a significant impact on the dissolution of silica grains. The impacts of primary foam generation on glass homogeneity and the rate of melting in large-scale continuous furnaces have yet to be established via mathematical modeling and melter experiments.

  17. Application of the fuzzy theory to simulation of batch fermentation

    Energy Technology Data Exchange (ETDEWEB)

    Filev, D P; Kishimoto, M; Sengupta, S; Yoshida, T; Taguchi, H

    1985-12-01

    A new approach to system identification with a linguistic model of batch fermentation processes is proposed. Fuzzy theory was applied in order to reduce the uncertainty of the quantitative description of the processes by use of qualitative characteristics. After an interpretation of the new method, fuzzy modeling is illustrated with the simulation of batch ethanol production from molasses, and an extension of the fuzzy model is also discussed for several cases with different measurable variables.

  18. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  19. Semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma stage III/IV-A feasibility study

    International Nuclear Information System (INIS)

    Fabel, M.; Tengg-Kobligk, H. von; Giesel, F.L.; Delorme, S.; Kauczor, H.-U.; Bornemann, L.; Dicken, V.; Kopp-Schneider, A.; Moser, C.

    2008-01-01

    Therapy monitoring in oncological patient care requires accurate and reliable imaging and post-processing methods. RECIST criteria are the current standard, with inherent disadvantages. The aim of this study was to investigate the feasibility of semi-automated volumetric analysis of lymph node metastases in patients with malignant melanoma compared to manual volumetric analysis and RECIST. Multislice CT was performed in 47 patients, covering the chest, abdomen and pelvis. In total, 227 suspicious, enlarged lymph nodes were evaluated retrospectively by two radiologists regarding diameters (RECIST), manually measured volume by placement of ROIs and semi-automated volumetric analysis. Volume (ml), quality of segmentation (++/-) and time effort (s) were evaluated in the study. The semi-automated volumetric analysis software tool was rated acceptable to excellent in 81% of all cases (reader 1) and 79% (reader 2). Median time for the entire segmentation process and necessary corrections was shorter with the semi-automated software than by manual segmentation. Bland-Altman plots showed a significantly lower interobserver variability for semi-automated volumetric than for RECIST measurements. The study demonstrated feasibility of volumetric analysis of lymph node metastases. The software allows a fast and robust segmentation in up to 80% of all cases. Ease of use and time needed are acceptable for application in the clinical routine. Variability and interuser bias were reduced to about one third of the values found for RECIST measurements. (orig.)

  20. Polynomial Batch Codes for Efficient IT-PIR

    Directory of Open Access Journals (Sweden)

    Henry Ryan

    2016-10-01

    Full Text Available Private information retrieval (PIR) is a way for clients to query a remote database without the database holder learning the clients’ query terms or the responses they generate. Compelling applications for PIR abound in the cryptographic and privacy research literature, yet existing PIR techniques are notoriously inefficient. Consequently, no such PIR-based application to date has seen real-world at-scale deployment. This paper proposes new “batch coding” techniques to help address PIR’s efficiency problem. The new techniques exploit the connection between ramp secret sharing schemes and efficient information-theoretically secure PIR (IT-PIR) protocols. This connection was previously observed by Henry, Huang, and Goldberg (NDSS 2013), who used ramp schemes to construct efficient “batch queries” with which clients can fetch several database records for the same cost as fetching a single record using a standard, non-batch query. The new techniques in this paper generalize and extend those of Henry et al. to construct “batch codes” with which clients can fetch several records for only a fraction of the cost of fetching a single record using a standard non-batch query over an unencoded database. The batch codes are highly tuneable, providing a means to trade off (i) lower server-side computation cost, (ii) lower server-side storage cost, and/or (iii) lower uni- or bi-directional communication cost, in exchange for a comparatively modest decrease in resilience to Byzantine database servers.

  1. Semiautomated analysis of embryoscope images: Using localized variance of image intensity to detect embryo developmental stages.

    Science.gov (United States)

    Mölder, Anna; Drury, Sarah; Costen, Nicholas; Hartshorne, Geraldine M; Czanner, Silvester

    2015-02-01

    Embryo selection in in vitro fertilization (IVF) treatment has traditionally been done manually using microscopy at intermittent time points during embryo development. Novel techniques have made it possible to monitor embryos using time lapse for long periods of time, and together with the reduced cost of data storage, this has opened the door to long-term time-lapse monitoring; large amounts of image material are now routinely gathered. However, the analysis is still to a large extent performed manually, and images are mostly used as qualitative reference. To make full use of the increased amount of microscopic image material, (semi)automated computer-aided tools are needed. An additional benefit of automation is the establishment of standardization tools for embryo selection and transfer, making decisions more transparent and less subjective. Another is the possibility to gather and analyze data in a high-throughput manner, gathering data from multiple clinics and increasing our knowledge of early human embryo development. In this study, the extraction of data to automatically select and track spatio-temporal events and features from sets of embryo images has been achieved using localized variance based on the distribution of image grey scale levels. A retrospective cohort study was performed using time-lapse imaging data derived from 39 human embryos from seven couples, covering the time from fertilization up to 6.3 days. The profile of localized variance has been used to characterize syngamy, mitotic division and stages of cleavage, compaction, and blastocoel formation. Prior to analysis, focal plane and embryo location were automatically detected, limiting precomputational user interaction to a calibration step and usable for automatic detection of region of interest (ROI) regardless of the method of analysis. The results were validated against the opinion of clinical experts. © 2015 International Society for Advancement of Cytometry.
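
    A plain sliding-window variance is one way to obtain such a localized-variance profile. The sketch below assumes this simple formulation (per-frame local variance averaged over a region of interest); the authors' exact feature extraction may differ.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def local_variance(frame, window=15):
            # Var(X) = E[X^2] - E[X]^2 over a window x window neighbourhood.
            f = frame.astype(float)
            mean = uniform_filter(f, window)
            mean_sq = uniform_filter(f ** 2, window)
            return np.clip(mean_sq - mean ** 2, 0, None)

        def variance_profile(frames, roi):
            # One scalar per time-lapse frame: mean local variance inside the ROI.
            r0, r1, c0, c1 = roi
            return [local_variance(f)[r0:r1, c0:c1].mean() for f in frames]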

  2. Methods for semi-automated indexing for high precision information retrieval

    Science.gov (United States)

    Berrios, Daniel C.; Cucina, Russell J.; Fagan, Lawrence M.

    2002-01-01

    OBJECTIVE: To evaluate a new system, ISAID (Internet-based Semi-automated Indexing of Documents), and to generate textbook indexes that are more detailed and more useful to readers. DESIGN: Pilot evaluation: simple, nonrandomized trial comparing ISAID with manual indexing methods. Methods evaluation: randomized, cross-over trial comparing three versions of ISAID and usability survey. PARTICIPANTS: Pilot evaluation: two physicians. Methods evaluation: twelve physicians, each of whom used three different versions of the system for a total of 36 indexing sessions. MEASUREMENTS: Total index term tuples generated per document per minute (TPM), with and without adjustment for concordance with other subjects; inter-indexer consistency; ratings of the usability of the ISAID indexing system. RESULTS: Compared with manual methods, ISAID decreased indexing times greatly. Using three versions of ISAID, inter-indexer consistency ranged from 15% to 65% with a mean of 41%, 31%, and 40% for each of three documents. Subjects using the full version of ISAID were faster (average TPM: 5.6) and had higher rates of concordant index generation. There were substantial learning effects, despite our use of a training/run-in phase. Subjects using the full version of ISAID were much faster by the third indexing session (average TPM: 9.1). There was a statistically significant increase in three-subject concordant indexing rate using the full version of ISAID during the second indexing session (p < 0.05). SUMMARY: Users of the ISAID indexing system create complex, precise, and accurate indexing for full-text documents much faster than users of manual methods. Furthermore, the natural language processing methods that ISAID uses to suggest indexes contributes substantially to increased indexing speed and accuracy.

  3. Semi-Automated Classification of Seafloor Data Collected on the Delmarva Inner Shelf

    Science.gov (United States)

    Sweeney, E. M.; Pendleton, E. A.; Brothers, L. L.; Mahmud, A.; Thieler, E. R.

    2017-12-01

    We tested automated classification methods on acoustic bathymetry and backscatter data collected by the U.S. Geological Survey (USGS) and National Oceanic and Atmospheric Administration (NOAA) on the Delmarva inner continental shelf to efficiently and objectively identify sediment texture and geomorphology. Automated classification techniques are generally less subjective and take significantly less time than manual classification methods. We used a semi-automated process combining unsupervised and supervised classification techniques to characterize seafloor based on bathymetric slope and relative backscatter intensity. Statistical comparison of our automated classification results with those of a manual classification conducted on a subset of the acoustic imagery indicates that our automated method was highly accurate (95% total accuracy and 93% Kappa). Our methods resolve sediment ridges, zones of flat seafloor and areas of high and low backscatter. We compared our classification scheme with mean grain size statistics of samples collected in the study area and found that strong correlations between backscatter intensity and sediment texture exist. High backscatter zones are associated with the presence of gravel and shells mixed with sand, and low backscatter areas are primarily clean sand or sand mixed with mud. Slope classes further elucidate textural and geomorphologic differences in the seafloor, such that steep slopes (>0.35°) with high backscatter are most often associated with the updrift side of sand ridges and bedforms, whereas low slope with high backscatter correspond to coarse lag or shell deposits. Low backscatter and high slopes are most often found on the downdrift side of ridges and bedforms, and low backscatter and low slopes identify swale areas and sand sheets. We found that poor acoustic data quality was the most significant cause of inaccurate classification results, which required additional user input to mitigate. Our method worked well

  4. Semi-automated scar detection in delayed enhanced cardiac magnetic resonance images

    Science.gov (United States)

    Morisi, Rita; Donini, Bruno; Lanconelli, Nico; Rosengarden, James; Morgan, John; Harden, Stephen; Curzen, Nick

    2015-06-01

    Late enhancement cardiac magnetic resonance imaging (MRI) has the ability to precisely delineate myocardial scars. We present a semi-automated method for detecting scars in cardiac MRI. This model has the potential to improve routine clinical practice, since quantification is not currently offered due to time constraints. A first segmentation step was developed for extracting the target regions for potential scar and determining pre-candidate objects. Pattern recognition methods are then applied to the segmented images in order to detect the position of the myocardial scar. The database of late gadolinium enhancement (LE) cardiac MR images consists of 111 blocks of images acquired from 63 patients at the University Hospital Southampton NHS Foundation Trust (UK). At least one scar was present for each patient, and all the scars were manually annotated by an expert. A group of images (around one third of the entire set) was used for training the system, which was subsequently tested on all the remaining images. Four different classifiers were trained (Support Vector Machine (SVM), k-nearest neighbor (KNN), Bayesian and feed-forward neural network) and their performance was evaluated by using Free response Receiver Operating Characteristic (FROC) analysis. Feature selection was implemented for analyzing the importance of the various features. The segmentation method proposed allowed the region affected by the scar to be extracted correctly in 96% of the blocks of images. The SVM was shown to be the best classifier for our task, and our system reached an overall sensitivity of 80% with less than 7 false positives per patient. The method we present provides an effective tool for detection of scars on cardiac MRI. This may be of value in clinical practice by permitting routine reporting of scar quantification.

  5. Semi-automated procedures for shoreline extraction using single RADARSAT-1 SAR image

    Science.gov (United States)

    Al Fugura, A.'kif; Billa, Lawal; Pradhan, Biswajeet

    2011-12-01

    Coastline identification is important for surveying and mapping reasons. The coastline serves as the basic point of reference and is used on nautical charts for navigation purposes. Its delineation has become crucial and more important in the wake of the many recent earthquakes and tsunamis resulting in complete change and redrawing of some shorelines. In a tropical country like Malaysia, the presence of cloud cover hinders the application of optical remote sensing data. In this study a semi-automated technique and procedures are presented for shoreline delineation from a RADARSAT-1 image. A scene of RADARSAT-1 satellite imagery was processed using an enhanced filtering technique to identify and extract the shoreline coast of Kuala Terengganu, Malaysia. RADARSAT imagery has many advantages over optical data because of its ability to penetrate cloud cover and its night sensing capabilities. First, speckle was removed from the image using a Lee sigma filter, which reduces random noise, enhances the image and helps discriminate the boundary between land and water. The results showed an accurate and improved extraction and delineation of the entire coastline of Kuala Terengganu. The study demonstrated the reliability of the image averaging filter in reducing random noise over the sea surface, especially near the shoreline. It enhanced land-water boundary differentiation, enabling better delineation of the shoreline. Overall, the developed techniques showed the potential of radar imagery for accurate shoreline mapping and will be useful for monitoring shoreline changes during high and low tides as well as shoreline erosion in a tropical country like Malaysia.
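
    For illustration, a basic local-statistics speckle filter of the kind applied before land-water discrimination is sketched below. This is the plain Lee filter, a simplification of the Lee sigma variant named in the abstract, and the noise variance is an assumed parameter.

        import numpy as np
        from scipy.ndimage import uniform_filter

        def lee_filter(img, window=7, noise_var=0.25):
            # Weight each pixel between its value and the local mean:
            # homogeneous areas (low variance) are smoothed, edges are kept.
            img = img.astype(float)
            mean = uniform_filter(img, window)
            sq_mean = uniform_filter(img ** 2, window)
            var = np.clip(sq_mean - mean ** 2, 0, None)
            gain = var / (var + noise_var)
            return mean + gain * (img - mean)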

  6. PSYCHOLOGY. Estimating the reproducibility of psychological science.

    Science.gov (United States)

    2015-08-28

    Reproducibility is a defining feature of science, but the extent to which it characterizes current research is unknown. We conducted replications of 100 experimental and correlational studies published in three psychology journals using high-powered designs and original materials when available. Replication effects were half the magnitude of original effects, representing a substantial decline. Ninety-seven percent of original studies had statistically significant results. Thirty-six percent of replications had statistically significant results; 47% of original effect sizes were in the 95% confidence interval of the replication effect size; 39% of effects were subjectively rated to have replicated the original result; and if no bias in original results is assumed, combining original and replication results left 68% with statistically significant effects. Correlational tests suggest that replication success was better predicted by the strength of original evidence than by characteristics of the original and replication teams. Copyright © 2015, American Association for the Advancement of Science.

  7. A parametric study of protease production in batch and fed-batch cultures of Bacillus firmus.

    Science.gov (United States)

    Moon, S H; Parulekar, S J

    1991-03-05

    Proteolytic enzymes produced by Bacillus species find a wide variety of applications in brewing, detergent, food, and leather industries. Owing to significant differences normally observed in culture conditions promoting cell growth and those promoting production of metabolites such as enzymes, for increased efficacy of bioreactor operations it is essential to identify these sets of conditions (including medium formulation). This study is focused on formulation of a semidefined medium that substantially enhances synthesis and secretion of an alkaline protease in batch cultures of Bacillus firmus NRS 783, a known superior producer of this enzyme. The series of experiments conducted to identify culture conditions that lead to improved protease production also enables investigation of the regulatory effects of important culture parameters including pH, dissolved oxygen, and concentrations of nitrogen and phosphorous sources and yeast extract in the medium on cell growth, synthesis and secretion of protease, and production of two major nonbiomass products, viz., acetic acid and ethanol. Cell growth and formation of the three nonbiomass products are hampered significantly under nitrogen, phosphorous, or oxygen limitation, with the cells being unable to grow in an oxygen-free environment. Improvement in protease production is achieved with respect to each culture parameter, leading in the process to 80% enhancement in protease activity over that attained using media reported in the literature. Results of a few fed-batch experiments with constant feed rate, conducted to examine possible enhancement in protease production and to further investigate repression of protease synthesis by excess of the principal carbon and nitrogen sources, are also discussed. The detailed investigation of stimulatory and repressory effects of simple and complex nutrients on protease production and metabolism of Bacillus firmus conducted in this study will provide useful guidelines for design

  8. Sludge Batch Variability Study With Frit 418

    International Nuclear Information System (INIS)

    Johnson, F.; Edwards, T.

    2010-01-01

    The Defense Waste Processing Facility (DWPF) initiated processing Sludge Batch 6 (SB6) in the summer of 2010. In support of processing, the Savannah River National Laboratory (SRNL) provided a recommendation to utilize Frit 418 to process SB6. This recommendation was based on assessments of the compositional projections for SB6 available at the time from the Liquid Waste Organization (LWO) and SRNL (using a model-based approach). To support qualification of SB6, SRNL executed a variability study to assess the applicability of the current durability models for SB6. The durability models were assessed over the expected Frit 418-SB6 composition range. Seventeen glasses were selected for the variability study based on the sludge projections used in the frit recommendation. Five of the glasses are based on the centroid of the compositional region, spanning a waste loading (WL) range of 32 to 40%. The remaining twelve glasses are extreme vertices (EVs) of the sludge region of interest for SB6 combined with Frit 418 and are all at 36% WL. These glasses were fabricated and characterized using chemical composition analysis, X-ray diffraction (XRD) and the Product Consistency Test (PCT). After initiating the SB6 variability study, the measured composition of the SB6 Tank 51 qualification glass produced at the SRNL Shielded Cells Facility indicated that thorium was present in the glass at an appreciable concentration (1.03 wt%), which made it a reportable element for SB6. This concentration of ThO2 resulted in a second phase of experimental studies. Five glasses were formulated that were based on the centroid of the new sludge compositional region combined with Frit 418, spanning a WL range of 32 to 40%. These glasses were fabricated and characterized using chemical composition analysis and the PCT. Based on the measured PCT response, all of the glasses (with and without thorium) were acceptable with respect to the Environmental Assessment (EA) reference glass regardless of

  9. Batch-to-Batch Quality Consistency Evaluation of Botanical Drug Products Using Multivariate Statistical Analysis of the Chromatographic Fingerprint

    OpenAIRE

    Xiong, Haoshu; Yu, Lawrence X.; Qu, Haibin

    2013-01-01

    Botanical drug products have batch-to-batch quality variability due to botanical raw materials and the current manufacturing process. The rational evaluation and control of product quality consistency are essential to ensure the efficacy and safety. Chromatographic fingerprinting is an important and widely used tool to characterize the chemical composition of botanical drug products. Multivariate statistical analysis has showed its efficacy and applicability in the quality evaluation of many ...

  10. Preliminary clinical evaluation of semi-automated nailfold capillaroscopy in the assessment of patients with Raynaud's phenomenon.

    Science.gov (United States)

    Murray, Andrea K; Feng, Kaiyan; Moore, Tonia L; Allen, Phillip D; Taylor, Christopher J; Herrick, Ariane L

    2011-08-01

      Nailfold capillaroscopy is well established in screening patients with Raynaud's phenomenon for underlying SSc-spectrum disorders, by identifying abnormal capillaries. Our aim was to compare semi-automatic feature measurement from newly developed software with manual measurements, and determine the degree to which semi-automated data allows disease group classification.   Images from 46 healthy controls, 21 patients with PRP and 49 with SSc were preprocessed, and semi-automated measurements of intercapillary distance and capillary width, tortuosity, and derangement were performed. These were compared with manual measurements. Features were used to classify images into the three subject groups.   Comparison of automatic and manual measures for distance, width, tortuosity, and derangement had correlations of r=0.583, 0.624, 0.495 (p<0.001), and 0.195 (p=0.040). For automatic measures, correlations were found between width and intercapillary distance, r=0.374, and width and tortuosity, r=0.573 (p<0.001). Significant differences between subject groups were found for all features (p<0.002). Overall, 75% of images correctly matched clinical classification using semi-automated features, compared with 71% for manual measurements.   Semi-automatic and manual measurements of distance, width, and tortuosity showed moderate (but statistically significant) correlations. Correlation for derangement was weaker. Semi-automatic measurements are faster than manual measurements. Semi-automatic parameters identify differences between groups, and are as good as manual measurements for between-group classification. © 2011 John Wiley & Sons Ltd.

  11. Semiautomated segmentation of head and neck cancers in 18F-FDG PET scans: A just-enough-interaction approach.

    Science.gov (United States)

    Beichel, Reinhard R; Van Tol, Markus; Ulrich, Ethan J; Bauer, Christian; Chang, Tangel; Plichta, Kristin A; Smith, Brian J; Sunderland, John J; Graham, Michael M; Sonka, Milan; Buatti, John M

    2016-06-01

    The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the "just-enough-interaction" principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in clinical oncology decision-making. The properties

  12. Semiautomated segmentation of head and neck cancers in 18F-FDG PET scans: A just-enough-interaction approach

    Energy Technology Data Exchange (ETDEWEB)

    Beichel, Reinhard R., E-mail: reinhard-beichel@uiowa.edu [Department of Electrical and Computer Engineering, University of Iowa, Iowa City, Iowa 52242 (United States); Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242 (United States); Department of Internal Medicine, University of Iowa, Iowa City, Iowa 52242 (United States); Van Tol, Markus; Ulrich, Ethan J.; Bauer, Christian [Department of Electrical and Computer Engineering, University of Iowa, Iowa City, Iowa 52242 (United States); Iowa Institute for Biomedical Imaging, The University of Iowa, Iowa City, Iowa 52242 (United States); Chang, Tangel; Plichta, Kristin A. [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa 52242 (United States); Smith, Brian J. [Department of Biostatistics, University of Iowa, Iowa City, Iowa 52242 (United States); Sunderland, John J.; Graham, Michael M. [Department of Radiology, University of Iowa, Iowa City, Iowa 52242 (United States); Sonka, Milan [Department of Electrical and Computer Engineering, University of Iowa, Iowa City, Iowa 52242 (United States); Department of Radiation Oncology, The University of Iowa, Iowa City, Iowa 52242 (United States); Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242 (United States); Buatti, John M. [Department of Radiation Oncology, University of Iowa, Iowa City, Iowa 52242 (United States); Iowa Institute for Biomedical Imaging, University of Iowa, Iowa City, Iowa 52242 (United States)

    2016-06-15

    Purpose: The purpose of this work was to develop, validate, and compare a highly computer-aided method for the segmentation of hot lesions in head and neck 18F-FDG PET scans. Methods: A semiautomated segmentation method was developed, which transforms the segmentation problem into a graph-based optimization problem. For this purpose, a graph structure around a user-provided approximate lesion centerpoint is constructed and a suitable cost function is derived based on local image statistics. To handle frequently occurring situations that are ambiguous (e.g., lesions adjacent to each other versus lesion with inhomogeneous uptake), several segmentation modes are introduced that adapt the behavior of the base algorithm accordingly. In addition, the authors present approaches for the efficient interactive local and global refinement of initial segmentations that are based on the “just-enough-interaction” principle. For method validation, 60 PET/CT scans from 59 different subjects with 230 head and neck lesions were utilized. All patients had squamous cell carcinoma of the head and neck. A detailed comparison with the current clinically relevant standard manual segmentation approach was performed based on 2760 segmentations produced by three experts. Results: Segmentation accuracy measured by the Dice coefficient of the proposed semiautomated and standard manual segmentation approach was 0.766 and 0.764, respectively. This difference was not statistically significant (p = 0.2145). However, the intra- and interoperator standard deviations were significantly lower for the semiautomated method. In addition, the proposed method was found to be significantly faster and resulted in significantly higher intra- and interoperator segmentation agreement when compared to the manual segmentation approach. Conclusions: Lack of consistency in tumor definition is a critical barrier for radiation treatment targeting as well as for response assessment in clinical trials and in
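
    Both versions of this record report segmentation accuracy as a Dice coefficient. A standard implementation on binary masks (not code from the paper) is:

        import numpy as np

        def dice(mask_a, mask_b):
            # 2|A intersect B| / (|A| + |B|) for boolean arrays of equal shape.
            a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
            denom = a.sum() + b.sum()
            return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0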

  13. Applicability Of A Semi-Automated Clinical Chemistry Analyzer In Determining The Antioxidant Concentrations Of Selected Plants

    Directory of Open Access Journals (Sweden)

    Allan L. Hilario

    2017-07-01

    Full Text Available Plants are rich sources of antioxidants that are protective against diseases associated with oxidative stress. There is a need for a high-throughput screening method for determining the antioxidant concentration in plants; such a method would significantly simplify and speed up most antioxidant assays. This paper aimed at comparing the applicability of a semi-automated clinical chemistry analyzer (Pointe Scientific, MI, USA) with the traditional standard-curve method using a Vis spectrophotometer in performing the DPPH assay for antioxidant screening. Samples of crude aqueous leaf extract of kulitis (Amaranthus viridis Linn) and chayote (Sechium edule Linn) were screened for the Total Antioxidant Concentration (TAC) using the two methods. Results, presented as mean ± SD (μg/dl), were compared using unpaired Student's t-test (P<0.05). All runs were done in triplicate. The mean TAC of A. viridis was 646.0 ± 45.5 μg/dl using the clinical chemistry analyzer and 581.9 ± 19.4 μg/dl using the standard curve-spectrophotometer method. On the other hand, the mean TAC of S. edule was 660.2 ± 35.9 μg/dl using the semi-automated clinical chemistry analyzer and 672.3 ± 20.9 μg/dl using the spectrophotometer. No significant differences were observed between the readings of the two methods for A. viridis (P>0.05) and S. edule (P>0.05). This implies that the clinical chemistry analyzer can be an alternative method for conducting the DPPH assay to determine the TAC in plants. This study presented the applicability of a semi-automated clinical chemistry analyzer in performing the DPPH assay. Further validation can be conducted by performing other antioxidant assays using this equipment.

  14. Semi-automated preparation of the dopamine transporter ligand [18F]FECNT for human PET imaging studies

    International Nuclear Information System (INIS)

    Voll, Ronald J.; McConathy, Jonathan; Waldrep, Michael S.; Crowe, Ronald J.; Goodman, Mark M.

    2005-01-01

    The fluorine-18 labeled dopamine transporter (DAT) ligand 2β-carbomethoxy-3β-(4-chlorophenyl)-8-(2-fluoroethyl)nortropane (FECNT) has shown promising properties as an in vivo DAT imaging agent in human and monkey PET studies. A semi-automated synthesis has been developed to reliably produce [18F]FECNT in a 16% decay-corrected yield. This method utilizes a new [18F]fluoroalkylating agent and provides high-purity [18F]FECNT in a formulation suitable for human use

  15. Investigation of LiF, Mg and Ti (TLD-100) Reproducibility

    Directory of Open Access Journals (Sweden)

    Sadeghi M.

    2015-12-01

    Full Text Available LiF, Mg and Ti cubical TLD chips (known as TLD-100) are widely used for dosimetry purposes. The repeatability of TL dosimetry is investigated by exposing them to doses of (81, 162 and 40.5 mGy) with 662 keV photons of Cs-137. A group of 40 cubical TLD chips was randomly selected from a batch and the values of Element Correction Coefficient (ECC) were obtained 4 times by irradiating them to doses of 81 mGy (two times), 162 mGy and 40.5 mGy. Results of this study indicate that the average reproducibility of ECC calculation for 40 TLDs is 1.5%, while these values for all chips do not exceed 5%.

  16. Investigation of LiF, Mg and Ti (TLD-100) Reproducibility.

    Science.gov (United States)

    Sadeghi, M; Sina, S; Faghihi, R

    2015-12-01

    LiF, Mg and Ti cubical TLD chips (known as TLD-100) are widely used for dosimetry purposes. The repeatability of TL dosimetry is investigated by exposing them to doses of (81, 162 and 40.5 mGy) with 662 keV photons of Cs-137. A group of 40 cubical TLD chips was randomly selected from a batch and the values of Element Correction Coefficient (ECC) were obtained 4 times by irradiating them to doses of 81 mGy (two times), 162 mGy and 40.5 mGy. Results of this study indicate that the average reproducibility of ECC calculation for 40 TLDs is 1.5%, while these values for all chips do not exceed 5%.
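
    The ECC bookkeeping in these two records can be sketched as follows, assuming the common convention ECC_i = <M>/M_i (batch mean reading over each chip's reading) and reproducibility reported as the coefficient of variation of a chip's ECC across repeated irradiations; the synthetic readings are placeholders.

        import numpy as np

        # Synthetic stand-in for measured TL readings: 4 irradiations x 40 chips.
        readings = np.random.default_rng(0).normal(100, 5, size=(4, 40))

        ecc = readings.mean(axis=1, keepdims=True) / readings  # ECC per chip per run
        cv_percent = 100 * ecc.std(axis=0, ddof=1) / ecc.mean(axis=0)

        print(f"average ECC reproducibility: {cv_percent.mean():.1f}%")
        print(f"worst chip: {cv_percent.max():.1f}%")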

  17. Semi-automated technique for the separation and determination of barium and strontium in surface waters by ion exchange chromatography and atomic emission spectrometry

    International Nuclear Information System (INIS)

    Pierce, F.D.; Brown, H.R.

    1977-01-01

    A semi-automated method for the separation and the analysis of barium and strontium in surface waters by atomic emission spectrometry is described. The method employs a semi-automated separation technique using ion exchange and an automated aspiration-analysis procedure. Forty specimens can be prepared in approximately 90 min and can be analyzed for barium and strontium content in 20 min. The detection limits and sensitivities provided by the described technique are 0.003 mg/l and 0.01 mg/l respectively for barium and 0.00045 mg/l and 0.003 mg/l respectively for strontium

  18. Reproducibility of neuroimaging analyses across operating systems.

    Science.gov (United States)

    Glatard, Tristan; Lewis, Lindsay B; Ferreira da Silva, Rafael; Adalat, Reza; Beck, Natacha; Lepage, Claude; Rioux, Pierre; Rousseau, Marc-Etienne; Sherif, Tarek; Deelman, Ewa; Khalili-Mahani, Najmeh; Evans, Alan C

    2015-01-01

    Neuroimaging pipelines are known to generate different results depending on the computing platform where they are compiled and executed. We quantify these differences for brain tissue classification, fMRI analysis, and cortical thickness (CT) extraction, using three of the main neuroimaging packages (FSL, Freesurfer and CIVET) and different versions of GNU/Linux. We also identify some causes of these differences using library and system call interception. We find that these packages use mathematical functions based on single-precision floating-point arithmetic whose implementations in operating systems continue to evolve. While these differences have little or no impact on simple analysis pipelines such as brain extraction and cortical tissue classification, their accumulation creates important differences in longer pipelines such as subcortical tissue classification, fMRI analysis, and cortical thickness extraction. With FSL, most Dice coefficients between subcortical classifications obtained on different operating systems remain above 0.9, but values as low as 0.59 are observed. Independent component analyses (ICA) of fMRI data differ between operating systems in one third of the tested subjects, due to differences in motion correction. With Freesurfer and CIVET, in some brain regions we find an effect of build or operating system on cortical thickness. A first step to correct these reproducibility issues would be to use more precise representations of floating-point numbers in the critical sections of the pipelines. The numerical stability of pipelines should also be reviewed.
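
    The root cause named here, single-precision arithmetic whose result depends on evaluation order, is easy to demonstrate in a few lines (a toy illustration, not code from the study):

        import numpy as np

        x = np.random.default_rng(0).random(100_000).astype(np.float32)

        s_naive = np.float32(0)
        for v in x:                 # naive left-to-right accumulation
            s_naive += v
        s_pairwise = x.sum()        # NumPy's pairwise summation

        # Same data, same mathematics, different order: different float32 result.
        print(abs(float(s_naive) - float(s_pairwise)))
        # Accumulating in double precision shrinks the discrepancy.
        print(abs(x.astype(np.float64).sum() - float(s_pairwise)))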

  19. Modeling reproducibility of porescale multiphase flow experiments

    Science.gov (United States)

    Ling, B.; Tartakovsky, A. M.; Bao, J.; Oostrom, M.; Battiato, I.

    2017-12-01

    Multi-phase flow in porous media is widely encountered in geological systems. Understanding immiscible fluid displacement is crucial for processes including, but not limited to, CO2 sequestration, non-aqueous phase liquid contamination and oil recovery. Microfluidic devices and porescale numerical models are commonly used to study multiphase flow in biological, geological, and engineered porous materials. In this work, we perform a set of drainage and imbibition experiments in six identical microfluidic cells to study the reproducibility of multiphase flow experiments. We observe significant variations in the experimental results, which are smaller during the drainage stage and larger during the imbibition stage. We demonstrate that these variations are due to sub-porescale geometry differences in microcells (because of manufacturing defects) and variations in the boundary condition (i.e., fluctuations in the injection rate inherent to syringe pumps). Computational simulations are conducted using commercial software STAR-CCM+, both with constant and randomly varying injection rate. Stochastic simulations are able to capture variability in the experiments associated with the varying pump injection rate.

  20. Environment and industrial economy: Challenge of reproducibility

    International Nuclear Information System (INIS)

    Rullani, E.

    1992-01-01

    Historically and methodologically counterposed until now, the environmentalist and the economic approaches to environmental problems need to be integrated into a new approach that considers, on one side, the relevance of the ecological equilibria for the economic systems and, on the other side, the economic dimension (in terms of investments and transformations in the production system) of any attempt to achieve a better environment. In order to achieve this integration, both approaches are compelled to give up some cultural habits that have characterized them and have contributed to over-emphasizing the opposition between them. The article shows that both approaches can converge into a new one, in which the environment is no longer only a holistic, non-bargainable natural external limit to human activity (as in the environmentalist approach), nor simply a scarce and exhaustible resource (as economics tends to consider it); the environment should instead become part of the reproducibility sphere, or, in other words, it must be regarded as part of the output that the economic system provides. This new approach, due to scientific and technological advances, is made possible for an increasing class of environmental problems. In order to do this, an evolution is required that can convert environmental goals into investment and technological innovation goals and communicate to firms the value society assigns to environmental resources. This value, the author suggests, should correspond to the reproduction cost. Various examples of this new approach are analyzed and discussed

  1. Cloning, multicopy expression and fed-batch production of Rhodotorula araucariae epoxide hydrolase in yarrowia lipolytica

    CSIR Research Space (South Africa)

    Ramduth, D

    2008-05-01

    Full Text Available demonstrated a 4-fold enhanced EH activity over the transformant. The transformant was then evaluated in batch and fed-batch fermentations, where the batch fermentations resulted in ~50% improved EH activity over the flask evaluations. In fed-batch fermentations...

  2. A novel process-based model of microbial growth: self-inhibition in Saccharomyces cerevisiae aerobic fed-batch cultures.

    Science.gov (United States)

    Mazzoleni, Stefano; Landi, Carmine; Cartenì, Fabrizio; de Alteriis, Elisabetta; Giannino, Francesco; Paciello, Lucia; Parascandola, Palma

    2015-07-30

    Microbial population dynamics in bioreactors depend on both nutrient availability and changes in the growth environment. Research on the optimization of bioreactor yields is still ongoing, focusing on increasing the maximum achievable cell density. A new process-based model is proposed to describe the aerobic growth of Saccharomyces cerevisiae cultured on glucose as carbon and energy source. The model considers the main metabolic routes of glucose assimilation (fermentation to ethanol and respiration) and the occurrence of inhibition due to the accumulation of both ethanol and other self-produced toxic compounds in the medium. Model simulations reproduced data from classic and new experiments of yeast growth in batch and fed-batch cultures. Model and experimental results showed that the growth decline observed in prolonged fed-batch cultures had to be ascribed to self-produced inhibitory compounds other than ethanol. The presented results clarify the dynamics of microbial growth under different feeding conditions and highlight the relevance of the negative feedback by self-produced inhibitory compounds on the maximum cell densities achieved in a bioreactor.
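
    A minimal sketch of a growth model with a self-produced inhibitor, in the spirit of the model described above, is given below; the equations, parameter values and the constant-feed simplification are illustrative assumptions, not the published model.

        import numpy as np
        from scipy.integrate import solve_ivp

        mu_max, Ks, Yxs, k_i, k_p = 0.4, 0.5, 0.5, 0.05, 0.1  # assumed parameters

        def rhs(t, y, feed):
            X, S, I = y                                   # biomass, glucose, inhibitor
            mu = mu_max * S / (Ks + S) / (1.0 + k_i * I)  # growth slowed by inhibitor
            return [mu * X,               # dX/dt
                    feed - mu * X / Yxs,  # dS/dt: fed-batch supply minus consumption
                    k_p * X]              # dI/dt: inhibitor accumulates with biomass

        sol = solve_ivp(rhs, (0, 48), [0.1, 10.0, 0.0], args=(0.2,))
        print(f"final biomass: {sol.y[0, -1]:.2f} g/L")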

  3. Semi-automated Digital Imaging and Processing System for Measuring Lake Ice Thickness

    Science.gov (United States)

    Singh, Preetpal

    to detect equipment failure and identify defective products at the assembly line. The research work in this thesis combines machine vision and image processing technology to build a digital imaging and processing system for monitoring and measuring lake ice thickness in real time. An ultra-compact USB camera is programmed to acquire and transmit high resolution imagery for processing with MATLAB Image Processing toolbox. The image acquisition and transmission process is fully automated; image analysis is semi-automated and requires limited user input. Potential design changes to the prototype and ideas on fully automating the imaging and processing procedure are presented to conclude this research work.

  4. Semi-automated identification of artefact and noise signals in MEG sensors

    International Nuclear Information System (INIS)

    Rettich, E.

    2006-09-01

    The semi-automated solution presented here was tested on real MEG data

  5. RadShield: semiautomated shielding design using a floor plan driven graphical user interface.

    Science.gov (United States)

    DeLorenzo, Matthew C; Wu, Dee H; Yang, Kai; Rutel, Isaac B

    2016-09-08

    The purpose of this study was to introduce and describe the development of RadShield, a Java-based graphical user interface (GUI), which provides a base design that uniquely performs thorough, spatially distributed calculations at many points and reports the maximum air-kerma rate and barrier thickness for each barrier pursuant to NCRP Report 147 methodology. Semiautomated shielding design calculations are validated by two approaches: a geometry-based approach and a manual approach. A series of geometry-based equations were derived giving the maximum air-kerma rate magnitude and location through a first derivative root finding approach. The second approach consisted of comparing RadShield results with those found by manual shielding design by an American Board of Radiology (ABR)-certified medical physicist for two clinical room situations: two adjacent catheterization labs, and a radiographic and fluoroscopic (R&F) exam room. RadShield's efficacy in finding the maximum air-kerma rate was compared against the geometry-based approach and the overall shielding recommendations by RadShield were compared against the medical physicist's shielding results. Percentage errors between the geometry-based approach and RadShield's approach in finding the magnitude and location of the maximum air-kerma rate were within 0.00124% and 14 mm. RadShield's barrier thickness calculations were found to be within 0.156 mm lead (Pb) and 0.150 mm lead (Pb) for the adjacent catheterization labs and R&F room examples, respectively. However, within the R&F room example, differences in locating the most sensitive calculation point on the floor plan for one of the barriers was not considered in the medical physicist's calculation and was revealed by the RadShield calculations. RadShield is shown to accurately find the maximum values of air-kerma rate and barrier thickness using NCRP Report 147 methodology. Visual inspection alone of the 2D X-ray exam distribution by a medical physicist may not
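
    The spatially distributed evaluation RadShield is described as performing can be sketched as a scan over calculation points behind each barrier, keeping the maximum air-kerma rate. The geometry and constants below are invented for illustration, and NCRP 147 workload/use factors and barrier transmission are omitted.

        import numpy as np

        def max_airkerma_behind_barrier(source_xy, k_at_1m, barrier_x, y_range, n=500):
            # Sample points 0.3 m behind the barrier (NCRP 147 convention) and
            # evaluate unshielded air kerma by inverse-square falloff.
            ys = np.linspace(*y_range, n)
            xs = np.full(n, barrier_x + 0.3)
            d2 = (xs - source_xy[0]) ** 2 + (ys - source_xy[1]) ** 2
            kerma = k_at_1m / d2
            i = int(np.argmax(kerma))
            return kerma[i], (xs[i], ys[i])

        k, pt = max_airkerma_behind_barrier((2.0, 3.0), 5.0, barrier_x=4.0, y_range=(0.0, 6.0))
        print(f"max air-kerma rate {k:.3f} mGy/wk at x={pt[0]:.2f}, y={pt[1]:.2f}")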

  6. Automated concept and relationship extraction for the semi-automated ontology management (SEAM) system.

    Science.gov (United States)

    Doing-Harris, Kristina; Livnat, Yarden; Meystre, Stephane

    2015-01-01

    We develop medical-specialty specific ontologies that contain the settled science and common term usage. We leverage current practices in information and relationship extraction to streamline the ontology development process. Our approach combines different text types with information and relationship extraction techniques in a low-overhead, modifiable system. Our SEmi-Automated ontology Maintenance (SEAM) system features a natural language processing pipeline for information extraction. Synonym and hierarchical groups are identified using corpus-based semantics and lexico-syntactic patterns. The semantic vectors we use are term frequency by inverse document frequency and context vectors. Clinical documents contain the terms we want in an ontology. They also contain idiosyncratic usage and are unlikely to contain the linguistic constructs associated with synonym and hierarchy identification. By including both clinical and biomedical texts, SEAM can recommend terms from those appearing in both document types. The set of recommended terms is then used to filter the synonyms and hierarchical relationships extracted from the biomedical corpus. We demonstrate the generality of the system across three use cases: ontologies for acute changes in mental status, Medically Unexplained Syndromes, and echocardiogram summary statements. Across the three use cases, we held the number of recommended terms relatively constant by changing SEAM's parameters. Experts seem to find more than 300 recommended terms to be overwhelming. The approval rate of recommended terms increased as the number and specificity of clinical documents in the corpus increased. It was 60% when there were 199 clinical documents that were not specific to the ontology domain and 90% when there were 2879 documents very specific to the target domain. We found that fewer than 100 recommended synonym groups were also preferred. Approval rates for synonym recommendations remained low, varying from 43% to 25% as the

  7. The interchangeability of global positioning system and semiautomated video-based performance data during elite soccer match play.

    Science.gov (United States)

    Harley, Jamie A; Lovell, Ric J; Barnes, Christopher A; Portas, Matthew D; Weston, Matthew

    2011-08-01

    In elite-level soccer, player motion characteristics are commonly generated from match play and training situations using semiautomated video analysis systems and global positioning system (GPS) technology, respectively. Before such data are used collectively to quantify global player load, it is necessary to understand both the level of agreement and direction of bias between the systems so that specific interventions can be made based on the reported results. The aim of this report was to compare data derived from both systems for physical match performances. Six elite-level soccer players were analyzed during a competitive match using semiautomated video analysis (ProZone® [PZ]) and GPS (MinimaxX) simultaneously. Total distances (TDs), high speed running (HSR), very high speed running (VHSR), sprinting distance (SPR), and high-intensity running distance (HIR; >4.0 m·s(-1)) were reported in 15-minute match periods. The GPS reported higher values than PZ did for TD (GPS: 1,755.4 ± 245.4 m; PZ: 1,631.3 ± 239.5 m; p < 0.05); PZ reported higher values for SPR and HIR than GPS did (SPR: PZ, 34.1 ± 24.0 m; GPS: 20.3 ± 15.8 m; HIR: PZ, 368.1 ± 129.8 m; GPS: 317.0 ± 92.5 m; p < 0.05). Caution should be exercised when using match-load (PZ) and training-load (GPS) data interchangeably.

  8. Magnetic resonance image segmentation using semi-automated software for quantification of knee articular cartilage - initial evaluation of a technique for paired scans

    International Nuclear Information System (INIS)

    Brem, M.H.; Lang, P.K.; Neumann, G.; Schlechtweg, P.M.; Yoshioka, H.; Pappas, G.; Duryea, J.; Schneider, E.; Jackson, R.; Yu, J.; Eaton, C.B.; Hennig, F.F.

    2009-01-01

    Software-based image analysis is important for studies of cartilage changes in knee osteoarthritis (OA). This study describes an evaluation of a semi-automated cartilage segmentation software tool capable of quantifying paired images for potential use in longitudinal studies of knee OA. We describe the methodology behind the analysis and demonstrate its use by determination of test-retest analysis precision of duplicate knee magnetic resonance imaging (MRI) data sets. Test-retest knee MR images of 12 subjects with a range of knee health were evaluated from the Osteoarthritis Initiative (OAI) pilot MR study. Each subject was removed from the magnet between the two scans. The 3D DESS (sagittal, 0.456 mm x 0.365 mm, 0.7 mm slice thickness, TR 16.5 ms, TE 4.7 ms) images were obtained on a 3-T Siemens Trio MR system with a USA Instruments quadrature transmit-receive extremity coil. Segmentation of one 3D-image series was first performed and then the corresponding retest series was segmented by viewing both image series concurrently in two adjacent windows. After manual registration of the series, the first segmentation cartilage outline served as an initial estimate for the second segmentation. We evaluated morphometric measures of the bone and cartilage surface area (tAB and AC), cartilage volume (VC), and mean thickness (ThC.me) for medial/lateral tibia (MT/LT), total femur (F) and patella (P). Test-retest reproducibility was assessed using the root-mean square coefficient of variation (RMS CV%). For the paired analyses, RMS CV % ranged from 0.9% to 1.2% for VC, from 0.3% to 0.7% for AC, from 0.6% to 2.7% for tAB and 0.8% to 1.5% for ThC.me. Paired image analysis improved the measurement precision of cartilage segmentation. Our results are in agreement with other publications supporting the use of paired analysis for longitudinal studies of knee OA. (orig.)
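
    The root-mean-square coefficient of variation reported here is typically computed per subject from the test-retest pair and pooled across subjects. A sketch with hypothetical paired cartilage volumes, assuming this pooling convention:

```python
# Pooled RMS CV% for duplicate (test-retest) measurements.
# Per subject: CV = SD of the pair / mean of the pair; pooled as the RMS.
import numpy as np

test = np.array([1520.0, 1384.0, 1625.0, 1498.0])    # e.g., VC in mm^3 (invented)
retest = np.array([1541.0, 1370.0, 1611.0, 1509.0])  # invented retest values

pairs = np.stack([test, retest])                     # shape (2, n_subjects)
cv = pairs.std(axis=0, ddof=1) / pairs.mean(axis=0)  # per-subject CV
rms_cv_percent = 100.0 * np.sqrt(np.mean(cv ** 2))
print(f"RMS CV = {rms_cv_percent:.2f}%")
```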

  9. Magnetic resonance image segmentation using semi-automated software for quantification of knee articular cartilage - initial evaluation of a technique for paired scans

    Energy Technology Data Exchange (ETDEWEB)

    Brem, M.H. [Brigham and Women' s Hospital, Department of Radiology, Boston, MA (United States); Friedrich-Alexander-University Erlangen Nurenberg, Division of Orthopaedic and Trauma Surgery, Department of Surgery, Erlangen (Germany); Lang, P.K.; Neumann, G.; Schlechtweg, P.M.; Yoshioka, H.; Pappas, G.; Duryea, J. [Brigham and Women' s Hospital, Department of Radiology, Boston, MA (United States); Schneider, E. [LLC, SciTrials, Rocky River, OH (United States); Cleveland Clinic, Imaging Institute, Cleveland, OH (United States); Jackson, R.; Yu, J. [Ohio State University, Diabetes and Metabolism and Radiology, Department of Endocrinology, Columbus, OH (United States); Eaton, C.B. [Center for Primary Care and Prevention and the Warren Alpert Medical School of Brown University, Memorial Hospital of Rhode Island, Providence, RI (United States); Hennig, F.F. [Friedrich-Alexander-University Erlangen Nurenberg, Division of Orthopaedic and Trauma Surgery, Department of Surgery, Erlangen (Germany)

    2009-05-15

    Software-based image analysis is important for studies of cartilage changes in knee osteoarthritis (OA). This study describes an evaluation of a semi-automated cartilage segmentation software tool capable of quantifying paired images for potential use in longitudinal studies of knee OA. We describe the methodology behind the analysis and demonstrate its use by determination of test-retest analysis precision of duplicate knee magnetic resonance imaging (MRI) data sets. Test-retest knee MR images of 12 subjects with a range of knee health were evaluated from the Osteoarthritis Initiative (OAI) pilot MR study. Each subject was removed from the magnet between the two scans. The 3D DESS (sagittal, 0.456 mm x 0.365 mm, 0.7 mm slice thickness, TR 16.5 ms, TE 4.7 ms) images were obtained on a 3-T Siemens Trio MR system with a USA Instruments quadrature transmit-receive extremity coil. Segmentation of one 3D-image series was first performed and then the corresponding retest series was segmented by viewing both image series concurrently in two adjacent windows. After manual registration of the series, the first segmentation cartilage outline served as an initial estimate for the second segmentation. We evaluated morphometric measures of the bone and cartilage surface area (tAB and AC), cartilage volume (VC), and mean thickness (ThC.me) for medial/lateral tibia (MT/LT), total femur (F) and patella (P). Test-retest reproducibility was assessed using the root-mean square coefficient of variation (RMS CV%). For the paired analyses, RMS CV % ranged from 0.9% to 1.2% for VC, from 0.3% to 0.7% for AC, from 0.6% to 2.7% for tAB and 0.8% to 1.5% for ThC.me. Paired image analysis improved the measurement precision of cartilage segmentation. Our results are in agreement with other publications supporting the use of paired analysis for longitudinal studies of knee OA. (orig.)

  10. Application of gain scheduling to the control of batch bioreactors

    Science.gov (United States)

    Cardello, Ralph; San, Ka-Yiu

    1987-01-01

    The implementation of control algorithms in batch bioreactors is often complicated by the inherent variations in process dynamics during the course of fermentation. Such a wide operating range may render the performance of fixed gain PID controllers unsatisfactory. In this work, a detailed study on the control of batch fermentation is performed. Furthermore, a simple batch controller design is proposed which incorporates the concept of gain-scheduling, a subclass of adaptive control, with oxygen uptake rate as an auxiliary variable. The control of oxygen tension in the bioreactor is used as a vehicle to convey the proposed idea, analysis and results. Simulation experiments indicate that significant improvement in controller performance can be achieved by the proposed approach even in the presence of measurement noise.
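
    A minimal sketch of the gain-scheduling idea, with the oxygen uptake rate (OUR) as the auxiliary scheduling variable; the regime breakpoints and gains below are invented for illustration, not the paper's design.

```python
# Gain-scheduled PI control of dissolved oxygen, scheduled on the
# oxygen uptake rate (OUR). Breakpoints and gains are hypothetical.
def scheduled_gains(our):
    """Return (Kp, Ki) for the current OUR regime (invented table)."""
    if our < 0.5:      # lag phase: sluggish process dynamics
        return 2.0, 0.10
    elif our < 2.0:    # exponential growth
        return 1.0, 0.05
    else:              # peak oxygen demand: fast dynamics
        return 0.5, 0.02

def pi_step(setpoint, measurement, our, integral, dt):
    """One PI update with gains looked up from the schedule."""
    error = setpoint - measurement
    kp, ki = scheduled_gains(our)
    integral += error * dt
    return kp * error + ki * integral, integral

# Usage inside a simulation loop: one control step.
integral = 0.0
u, integral = pi_step(setpoint=0.30, measurement=0.22, our=1.4,
                      integral=integral, dt=1.0)
print(f"controller output: {u:.3f}")
```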

  11. From Fed-batch to Continuous Enzymatic Biodiesel Production

    DEFF Research Database (Denmark)

    Price, Jason Anthony; Nordblad, Mathias; Woodley, John M.

    2015-01-01

    In this paper, we use mechanistic modelling to guide the development of a continuous enzymatic process that is performed as a fed-batch operation. In this work we use the enzymatic biodiesel process as a case study. A mechanistic model developed in our previous work was used to determine...... measured components (triglycerides, diglycerides, monoglycerides, free fatty acid and fatty acid methyl esters (biodiesel)) much better than using fed-batch data alone, given the smaller residuals. We also observe a reduction in the correlation between the parameters. The model was then used to predict that 5...... reactors are required (with a combined residence time of 30 hours) to reach a final biodiesel concentration within 2 % of the 95.6 mass % achieved in a fed-batch operation, for 24 hours....

  12. Continuous flow technology vs. the batch-by-batch approach to produce pharmaceutical compounds.

    Science.gov (United States)

    Cole, Kevin P; Johnson, Martin D

    2018-01-01

    For the manufacture of small molecule drugs, many pharmaceutical innovator companies have recently invested in continuous processing, which can offer significant technical and economic advantages over traditional batch methodology. This Expert Review will describe the reasons for this interest as well as many considerations and challenges that exist today concerning continuous manufacturing. Areas covered: Continuous processing is defined and many reasons for its adoption are described. The current state of continuous drug substance manufacturing within the pharmaceutical industry is summarized. Current key challenges to implementation of continuous manufacturing are highlighted, and an outlook provided regarding the prospects for continuous manufacturing within the industry. Expert commentary: Continuous processing at Lilly has been a journey that started with the need for increased safety and capability. Over twelve years the original small, dedicated group has grown to more than 100 Lilly employees in discovery, development, quality, manufacturing, and regulatory, designing in continuous drug substance processing. Recently we have focused on linked continuous unit operations for the purpose of all-at-once pharmaceutical manufacturing, but the technical and business drivers that existed in the very beginning for stand-alone continuous unit operations in hybrid processes have persisted, which merits investment in both approaches.

  13. A High-Fidelity Batch Simulation Environment for Integrated Batch and Piloted Air Combat Simulation Analysis

    Science.gov (United States)

    Goodrich, Kenneth H.; McManus, John W.; Chappell, Alan R.

    1992-01-01

    A batch air combat simulation environment known as the Tactical Maneuvering Simulator (TMS) is presented. The TMS serves as a tool for developing and evaluating tactical maneuvering logics. The environment can also be used to evaluate the tactical implications of perturbations to aircraft performance or supporting systems. The TMS is capable of simulating air combat between any number of engagement participants, with practical limits imposed by computer memory and processing power. Aircraft are modeled using equations of motion, control laws, aerodynamics and propulsive characteristics equivalent to those used in high-fidelity piloted simulation. Databases representative of a modern high-performance aircraft with and without thrust-vectoring capability are included. To simplify the task of developing and implementing maneuvering logics in the TMS, an outer-loop control system known as the Tactical Autopilot (TA) is implemented in the aircraft simulation model. The TA converts guidance commands issued by computerized maneuvering logics in the form of desired angle-of-attack and wind axis-bank angle into inputs to the inner-loop control augmentation system of the aircraft. This report describes the capabilities and operation of the TMS.

  14. Liquid scintigraphic gastric emptying - is it reproducible?

    International Nuclear Information System (INIS)

    Cooper, R.G.; Shuter, B.; Leach, M.; Roach, P.J.

    1999-01-01

    Full text: Radioisotope gastric emptying (GE) studies have been used as a non-invasive technique for motility assessment for many years. In a recent study investigating the correlation of mesenteric vascular changes with GE, six subjects had a repeat study 2-4 months later. Repeat studies were required due to minor technical problems (5 subjects) and a very slow GE (1 subject) on the original study. Subjects drank 275 ml of 'Ensure Plus' mixed with 8 MBq 67Ga-DTPA and were imaged for 2 h while lying supine. GE time-activity curves for each subject were generated and the time to half emptying (T1/2) calculated. Five of the six subjects had more rapid GE on the second study. Three of the subjects had T1/2 values on their second study which were within ±15 min of their original T1/2. The other three subjects had T1/2 values on their second study which were 36 min, 55 min and 280 min (subject K.H.) less than their original T1/2. Statistical analysis (t-test) was performed on paired T1/2 values. The average T1/2 value was greater in the first study than in the second (149 ± 121 and 86 ± 18 min respectively), although the difference was not statistically significant (P ∼ 0.1). Subjects' anxiety levels were not quantitated during the GE study; however, several major equipment faults occurred during the original study of subject K.H., who became visibly stressed. These results suggest that the reproducibility of GE studies may be influenced by psychological factors

  15. Is my network module preserved and reproducible?

    Directory of Open Access Journals (Sweden)

    Peter Langfelder

    2011-01-01

    Full Text Available In many applications, one is interested in determining which of the properties of a network module change across conditions. For example, to validate the existence of a module, it is desirable to show that it is reproducible (or preserved) in an independent test network. Here we study several types of network preservation statistics that do not require a module assignment in the test network. We distinguish network preservation statistics by the type of the underlying network. Some preservation statistics are defined for a general network (defined by an adjacency matrix) while others are only defined for a correlation network (constructed on the basis of pairwise correlations between numeric variables). Our applications show that the correlation structure facilitates the definition of particularly powerful module preservation statistics. We illustrate that evaluating module preservation is in general different from evaluating cluster preservation. We find that it is advantageous to aggregate multiple preservation statistics into summary preservation statistics. We illustrate the use of these methods in six gene co-expression network applications including (1) preservation of the cholesterol biosynthesis pathway in mouse tissues, (2) comparison of human and chimpanzee brain networks, (3) preservation of selected KEGG pathways between human and chimpanzee brain networks, (4) sex differences in human cortical networks, and (5) sex differences in mouse liver networks. While we find no evidence for sex specific modules in human cortical networks, we find that several human cortical modules are less preserved in chimpanzees. In particular, apoptosis genes are differentially co-expressed between humans and chimpanzees. Our simulation studies and applications show that module preservation statistics are useful for studying differences between the modular structure of networks. Data, R software and accompanying tutorials can be downloaded from the following webpage: http

  16. Production of tea vinegar by batch and semicontinuous fermentation

    OpenAIRE

    Kaur, Pardeep; Kocher, G. S.; Phutela, R. P.

    2010-01-01

    The fermented tea vinegar combines the beneficial properties of tea and vinegar. The complete fermentation takes 4 to 5 weeks in a batch culture and thus can be shortened by semi-continuous/continuous fermentation using immobilized bacterial cells. In the present study, alcoholic fermentation of 1.0 and 1.5% tea infusions using Saccharomyces cerevisiae G was carried out, resulting in 84.3 and 84.8% fermentation efficiency (FE), respectively. The batch vinegar fermentation of these wines wit...

  17. Stochastic growth logistic model with aftereffect for batch fermentation process

    Science.gov (United States)

    Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah; Rahman, Haliza Abdul; Salleh, Madihah Md

    2014-06-01

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system is introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results with the experimental data of the microbial growth and solvent production in the batch system. Low values of Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits.
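
    A sketch of the Milstein scheme for a stochastic logistic model of the form dX = rX(1 − X/K)dt + σX dW; the parameter values are illustrative rather than the fitted ones, and the aftereffect (delay) term of the paper's model is omitted for brevity.

```python
# Milstein scheme for dX = r*X*(1 - X/K) dt + sigma*X dW.
# For diffusion b(X) = sigma*X the Milstein correction is
# 0.5*b*b'*(dW^2 - dt) = 0.5*sigma^2*X*(dW^2 - dt).
# Parameters are invented; the aftereffect term is omitted.
import numpy as np

r, K, sigma = 0.4, 10.0, 0.1      # hypothetical growth rate, capacity, noise
x0, T, n = 0.5, 24.0, 2400
dt = T / n

rng = np.random.default_rng(0)
x = np.empty(n + 1)
x[0] = x0
for i in range(n):
    dW = rng.normal(0.0, np.sqrt(dt))
    drift = r * x[i] * (1.0 - x[i] / K)
    x[i + 1] = (x[i] + drift * dt + sigma * x[i] * dW
                + 0.5 * sigma**2 * x[i] * (dW**2 - dt))

print(f"simulated biomass at t = {T} h: {x[-1]:.3f}")
```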

  18. Stochastic growth logistic model with aftereffect for batch fermentation process

    International Nuclear Information System (INIS)

    Rosli, Norhayati; Ayoubi, Tawfiqullah; Bahar, Arifah; Rahman, Haliza Abdul; Salleh, Madihah Md

    2014-01-01

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system is introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results with the experimental data of the microbial growth and solvent production in the batch system. Low values of Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits

  19. Stochastic growth logistic model with aftereffect for batch fermentation process

    Energy Technology Data Exchange (ETDEWEB)

    Rosli, Norhayati; Ayoubi, Tawfiqullah [Faculty of Industrial Sciences and Technology, Universiti Malaysia Pahang, Lebuhraya Tun Razak, 26300 Gambang, Pahang (Malaysia); Bahar, Arifah; Rahman, Haliza Abdul [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia); Salleh, Madihah Md [Department of Biotechnology Industry, Faculty of Biosciences and Bioengineering, Universiti Teknologi Malaysia, 81310 Johor Bahru, Johor (Malaysia)

    2014-06-19

    In this paper, the stochastic growth logistic model with aftereffect for the cell growth of C. acetobutylicum P262 and the Luedeking-Piret equations for solvent production in a batch fermentation system is introduced. The parameter values of the mathematical models are estimated via the Levenberg-Marquardt optimization method of non-linear least squares. We apply the Milstein scheme for solving the stochastic models numerically. The efficiency of the mathematical models is measured by comparing the simulated results with the experimental data of the microbial growth and solvent production in the batch system. Low values of Root Mean-Square Error (RMSE) of the stochastic models with aftereffect indicate good fits.

  20. Optimum heat storage design for heat integrated multipurpose batch plants

    CSIR Research Space (South Africa)

    Stamp, J

    2011-01-01

    Full Text Available A procedure is presented for optimum heat storage design for heat-integrated multipurpose batch plants. Energy usage in multipurpose batch plants has been addressed in published literature; most present methods... The internal area for heat loss by convection from the heat transfer medium is given by Constraint (37) and the area for convective heat transfer losses to the environment is given in Constraint (38). ...

  1. Reproducibility of a semi-automatic method for 6-point vertebral morphometry in a multi-centre trial

    International Nuclear Information System (INIS)

    Guglielmi, Giuseppe; Stoppino, Luca Pio; Placentino, Maria Grazia; D'Errico, Francesco; Palmieri, Francesco

    2009-01-01

    Purpose: To evaluate the reproducibility of a semi-automated system for vertebral morphometry (MorphoXpress) in a large multi-centre trial. Materials and methods: The study involved 132 clinicians (no radiologist) with different levels of experience across 20 osteo-centres in Italy. All have received training in using MorphoXpress. An expert radiologist was also involved providing data used as standard of reference. The test image originate from normal clinical activity and represent a variety of normal, under and over exposed films, indicating both normal anatomy and vertebral deformities. The image was represented twice to the clinicians in a random order. Using the software, the clinicians initially marked the midpoints of the upper and lower vertebrae to include as many of the vertebrae (T5-L4) as practical within each given image. MorphoXpress performs the localisation of all morphometric points based on statistical model-based vision system. Intra-operator as well inter-operator measurement of agreement was calculated using the coefficient of variation and the mean and standard deviation of the difference of two measurements to check their agreement. Results: The overall intra-operator mean differences in vertebral heights is 1.61 ± 4.27% (1 S.D.). The overall intra-operator coefficient of variation is 3.95%. The overall inter-operator mean differences in vertebral heights is 2.93 ± 5.38% (1 S.D.). The overall inter-operator coefficient of variation is 6.89%. Conclusions: The technology tested here can facilitate reproducible quantitative morphometry suitable for large studies of vertebral deformities

  2. A semi-automated measuring system of brain diffusion and perfusion magnetic resonance imaging abnormalities in patients with multiple sclerosis based on the integration of coregistration and tissue segmentation procedures

    International Nuclear Information System (INIS)

    Revenaz, Alfredo; Ruggeri, Massimiliano; Laganà, Marcella; Bergsland, Niels; Groppo, Elisabetta; Rovaris, Marco; Fainardi, Enrico

    2016-01-01

    Diffusion-weighted imaging (DWI) and perfusion-weighted imaging (PWI) abnormalities in patients with multiple sclerosis (MS) are currently measured by a complex combination of separate procedures. Therefore, the purpose of this study was to provide a reliable method for reducing analysis complexity and obtaining reproducible results. We implemented a semi-automated measuring system in which different well-known software components for magnetic resonance imaging (MRI) analysis are integrated to obtain reliable measurements of DWI and PWI disturbances in MS. We generated the Diffusion/Perfusion Project (DPP) Suite, in which a series of external software programs are managed and harmonically and hierarchically incorporated by in-house developed Matlab software to perform the following processes: 1) image pre-processing, including imaging data anonymization and conversion from DICOM to Nifti format; 2) co-registration of 2D and 3D non-enhanced and Gd-enhanced T1-weighted images in fluid-attenuated inversion recovery (FLAIR) space; 3) lesion segmentation and classification, in which FLAIR lesions are at first segmented and then categorized according to their presumed evolution; 4) co-registration of segmented FLAIR lesion in T1 space to obtain the FLAIR lesion mask in the T1 space; 5) normal appearing tissue segmentation, in which T1 lesion mask is used to segment basal ganglia/thalami, normal appearing grey matter (NAGM) and normal appearing white matter (NAWM); 6) DWI and PWI map generation; 7) co-registration of basal ganglia/thalami, NAGM, NAWM, DWI and PWI maps in previously segmented FLAIR space; 8) data analysis. All these steps are automatic, except for lesion segmentation and classification. We developed a promising method to limit misclassifications and user errors, providing clinical researchers with a practical and reproducible tool to measure DWI and PWI changes in MS

  3. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M A; Fink, D; Hua, Q; Jacobsen, G E; Lawson, E M; Smith, A M; Tuniz, C [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1997-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in
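
    Because the achievable precision is limited primarily by 14C counting statistics, the relative uncertainty of a count scales as 1/√N; a quick check of the counts implied by the quoted 0.5% precision:

```python
# Poisson counting statistics: relative precision ~ 1/sqrt(N),
# so a target precision p needs N = 1/p**2 counted atoms.
from math import sqrt

p = 0.005                          # 0.5% target precision
N = 1.0 / p**2                     # -> 40,000 counted 14C atoms
print(f"counts needed: {N:.0f}")
print(f"precision at that N: {1.0 / sqrt(N):.3%}")
```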

  4. Precision and reproducibility in AMS radiocarbon measurements.

    Energy Technology Data Exchange (ETDEWEB)

    Hotchkis, M.A.; Fink, D.; Hua, Q.; Jacobsen, G.E.; Lawson, E. M.; Smith, A.M.; Tuniz, C. [Australian Nuclear Science and Technology Organisation, Lucas Heights, NSW (Australia)

    1996-12-31

    Accelerator Mass Spectrometry (AMS) is a technique by which rare radioisotopes such as {sup 14}C can be measured at environmental levels with high efficiency. Instead of detecting radioactivity, which is very weak for long-lived environmental radioisotopes, atoms are counted directly. The sample is placed in an ion source, from which a negative ion beam of the atoms of interest is extracted, mass analysed, and injected into a tandem accelerator. After stripping to positive charge states in the accelerator HV terminal, the ions are further accelerated, analysed with magnetic and electrostatic devices and counted in a detector. An isotopic ratio is derived from the number of radioisotope atoms counted in a given time and the beam current of a stable isotope of the same element, measured after the accelerator. For radiocarbon, {sup 14}C/{sup 13}C ratios are usually measured, and the ratio of an unknown sample is compared to that of a standard. The achievable precision for such ratio measurements is limited primarily by {sup 14}C counting statistics and also by a variety of factors related to accelerator and ion source stability. At the ANTARES AMS facility at Lucas Heights Research Laboratories we are currently able to measure {sup 14}C with 0.5% precision. In the two years since becoming operational, more than 1000 {sup 14}C samples have been measured. Recent improvements in precision for {sup 14}C have been achieved with the commissioning of a 59 sample ion source. The measurement system, from sample changing to data acquisition, is under common computer control. These developments have allowed a new regime of automated multi-sample processing which has impacted both on the system throughput and the measurement precision. We have developed data evaluation methods at ANTARES which cross-check the self-consistency of the statistical analysis of our data. Rigorous data evaluation is invaluable in assessing the true reproducibility of the measurement system and aids in

  5. Design of two-column batch-to-batch recirculation to enhance performance in ion-exchange chromatography.

    Science.gov (United States)

    Persson, Oliver; Andersson, Niklas; Nilsson, Bernt

    2018-01-05

    Preparative liquid chromatography is a separation technique widely used in the manufacturing of fine chemicals and pharmaceuticals. A major drawback of a traditional single-column batch chromatography step is the trade-off between product purity and process performance. Recirculation of impure product can be utilized to make the trade-off more favorable. The aim of the present study was to investigate the usage of a two-column batch-to-batch recirculation process step to increase the performance compared to single-column batch chromatography at a high purity requirement. The separation of a ternary protein mixture on ion-exchange chromatography columns was used to evaluate the proposed process. The investigation used modelling and simulation of the process step, experimental validation and optimization of the simulated process. In the presented case the yield increases from 45.4% to 93.6% and the productivity increases 3.4 times compared to the performance of a batch run for a nominal case. A rapid concentration build-up of product can be seen during the first cycles, before the process reaches a cyclic steady-state with reoccurring concentration profiles. The optimization of the simulation model predicts that the recirculated salt can be used as a flying start of the elution, which would enhance the process performance. The proposed process is more complex than a batch process, but may improve the separation performance, especially while operating at cyclic steady-state. The recirculation of impure fractions reduces the product losses and ensures separation of product to a high degree of purity. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Reproducing ten years of road ageing - Accelerated carbonation and leaching of EAF steel slag

    International Nuclear Information System (INIS)

    Suer, Pascal; Lindqvist, Jan-Erik; Arm, Maria; Frogner-Kockum, Paul

    2009-01-01

    Reuse of industrial aggregates is still hindered by concern for their long-term properties. This paper proposes a laboratory method for accelerated ageing of steel slag, to predict environmental and technical properties, starting from fresh slag. Ageing processes in a 10-year old asphalt road with steel slag of electric arc furnace (EAF) type in the subbase were identified by scanning electron microscopy (SEM) and leaching tests. Samples from the road centre and the pavement edge were compared with each other and with samples of fresh slag. It was found that slag from the pavement edge showed traces of carbonation and leaching processes, whereas the road centre material was nearly identical to fresh slag, in spite of an accessible particle structure. Batches of moisturized road centre material exposed to oxygen, nitrogen or carbon dioxide (CO2) were used for accelerated ageing. Time (7-14 days), temperature (20-40 °C) and initial slag moisture content (8-20%) were varied to achieve the carbonation (decrease in pH) and leaching that was observed in the pavement edge material. After ageing, water was added to assess leaching of metals and macroelements. 12% moisture, CO2 and seven days at 40 °C gave the lowest pH value. This also reproduced the observed ageing effect for Ca, Cu, Ba, Fe, Mn, Pb, Ca (decreased leaching) and for V, Si, and Al (increased leaching). However, ageing effects on SO4, DOC and Cr were not reproduced.

  7. Potency Evaluation of Recombinant Human Erythropoietin in Brazil: Assessment of Reproducibility Using a Practical Approach

    Directory of Open Access Journals (Sweden)

    Michele Cardoso do Nascimento

    2015-08-01

    Full Text Available In this study, we compared the results of potency determination of recombinant human erythropoietin (rhEPO) obtained between 2010 and 2012 by the National Institute of Quality Control in Health (INCQS/Fiocruz), i.e., the National Control Laboratory (NCL), and by a manufacturer of rhEPO. In total, 47 different batches of commercially prepared rhEPO (alpha isoform) were analyzed. All results, including those of the control and warning limits, remained within the limits recommended by the European Pharmacopoeia (Ph. Eur.). All relative error (RE) values were less than ±30%, whereas most were approximately ±20%. Applying the Bland-Altman plot, only two of 47 values remained outside the limits of agreement (LA). In addition, the agreement of potency determination between INCQS and the manufacturer, expressed as the coefficient of variation of reproducibility (%CVR), was considered satisfactory. Taken together, our results demonstrate (i) the potency assay of rhEPO performed at INCQS is standardized and controlled, (ii) the comparison of our results with those of the manufacturer revealed an adequate inter-laboratory variation, and (iii) the critical appraisal proposed here appears to be a feasible tool to assess the reproducibility of biological activity, providing additional information regarding monitoring and production consistency to manufacturers and NCLs.

  8. Lipid production in batch and fed-batch cultures of Rhodosporidium toruloides from 5 and 6 carbon carbohydrates

    Directory of Open Access Journals (Sweden)

    Wiebe Marilyn G

    2012-05-01

    Full Text Available Abstract Background Microbial lipids are a potential source of bio- or renewable diesel and the red yeast Rhodosporidium toruloides is interesting not only because it can accumulate over 50% of its dry biomass as lipid, but also because it utilises both five and six carbon carbohydrates, which are present in plant biomass hydrolysates. Methods R. toruloides was grown in batch and fed-batch cultures in 0.5 L bioreactors at pH 4 in chemically defined, nitrogen restricted (C/N 40 to 100) media containing glucose, xylose, arabinose, or all three carbohydrates as carbon source. Lipid was extracted from the biomass using chloroform-methanol, measured gravimetrically and analysed by GC. Results Lipid production was most efficient with glucose (up to 25 g lipid L−1, 48 to 75% lipid in the biomass, at up to 0.21 g lipid L−1 h−1) as the sole carbon source, but high lipid concentrations were also produced from xylose (36 to 45% lipid in biomass). Lipid production was low (15–19% lipid in biomass) with arabinose as sole carbon source and was lower than expected (30% lipid in biomass) when glucose, xylose and arabinose were provided simultaneously. The presence of arabinose and/or xylose in the medium increased the proportion of palmitic and linoleic acid and reduced the proportion of oleic acid in the fatty acids, compared to glucose-grown cells. High cell densities were obtained in both batch (37 g L−1, with 49% lipid in the biomass) and fed-batch (35 to 47 g L−1, with 50 to 75% lipid in the biomass) cultures. The highest proportion of lipid in the biomass was observed in cultures given nitrogen during the batch phase but none with the feed. However, carbohydrate consumption was incomplete when the feed did not contain nitrogen and the highest total lipid and best substrate consumption were observed in cultures which received a constant low nitrogen supply. Conclusions Lipid production in R. toruloides was lower from arabinose and mixed

  9. SEMIAUTOMATED SOLID-PHASE EXTRACTION PROCEDURE FOR DRUG SCREENING IN BIOLOGICAL-FLUIDS USING THE ASPEC SYSTEM IN COMBINATION WITH CLEAN SCREEN DAU COLUMNS

    NARCIS (Netherlands)

    CHEN, XH; FRANKE, JP; ENSING, K; WIJSBEEK, J; DEZEEUW, RA

    1993-01-01

    The use of a semi-automated solid-phase extraction system (ASPEC) for the screening of drugs in plasma and urine on a single mixed-mode column (Clean Screen DAU) is described. The processes of column preconditioning, sample application, column wash, pH adjustment and elution of the drugs were

  10. Campylobacter in Broiler Chicken and Broiler Meat in Sri Lanka : Influence of Semi-Automated vs. Wet Market Processing on Campylobacter Contamination of Broiler Neck Skin Samples

    NARCIS (Netherlands)

    Kottawatta, Kottawattage S A; van Bergen, Marcel A P; Abeynayake, Preeni; Wagenaar, Jaap A; Veldman, Kees T; Kalupahana, Ruwani S

    2017-01-01

    Broiler meat can become contaminated with Campylobacter of intestinal origin during processing. The present study aimed to identify the prevalence of Campylobacter in broiler flocks and meat contamination at retail shops, and determine the influence of semi-automated and wet market processing on

  11. Adaptation to high throughput batch chromatography enhances multivariate screening.

    Science.gov (United States)

    Barker, Gregory A; Calzada, Joseph; Herzer, Sibylle; Rieble, Siegfried

    2015-09-01

    High throughput process development offers unique approaches to explore complex process design spaces with relatively low material consumption. Batch chromatography is one technique that can be used to screen chromatographic conditions in a 96-well plate. Typical batch chromatography workflows examine variations in buffer conditions or comparison of multiple resins in a given process, as opposed to the assessment of protein loading conditions in combination with other factors. A modification to the batch chromatography paradigm is described here where experimental planning, programming, and a staggered loading approach increase the multivariate space that can be explored with a liquid handling system. The iterative batch chromatography (IBC) approach is described, which treats every well in a 96-well plate as an individual experiment, wherein protein loading conditions can be varied alongside other factors such as wash and elution buffer conditions. As all of these factors are explored in the same experiment, the interactions between them are characterized and the number of follow-up confirmatory experiments is reduced. This in turn improves statistical power and throughput. Two examples of the IBC method are shown and the impact of the load conditions are assessed in combination with the other factors explored. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Control of polymer network topology in semi-batch systems

    Science.gov (United States)

    Wang, Rui; Olsen, Bradley; Johnson, Jeremiah

    Polymer networks invariably possess topological defects: loops of different orders. Since small loops (primary loops and secondary loops) both lower the modulus of the network and lead to stress concentration that causes material failure at low deformation, it is desirable to greatly reduce the loop fraction. We have shown that achieving a loop fraction close to zero is extremely difficult in the batch process due to the slow decay of loop fraction with the polymer concentration and chain length. Here, we develop a modified kinetic graph theory that can model network formation reactions in semi-batch systems. We demonstrate that the loop fraction is not sensitive to the feeding policy if the reaction volume remains constant during the network formation. However, if we initially put a concentrated solution of small junction molecules in the reactor and continuously add polymer solution, the fractions of both primary loops and higher-order loops will be significantly reduced. There is a limiting (nonzero) value of loop fraction that can be achieved in the semi-batch system under conditions of extremely slow feeding rate. This minimum loop fraction depends only on a single dimensionless variable, the product of the concentration and the single-chain pervaded volume, and defines an operating zone in which the loop fraction of polymer networks can be controlled by adjusting the feeding rate of the semi-batch process.

  13. State and parameter estimation in biotechnical batch reactors

    NARCIS (Netherlands)

    Keesman, K.J.

    2000-01-01

    In this paper the problem of state and parameter estimation in biotechnical batch reactors is considered. Models describing the biotechnical process behaviour are usually nonlinear with time-varying parameters. Hence, the resulting large dimensions of the augmented state vector, roughly > 7, in

  14. Batch distillation column modeling for quality control program

    NARCIS (Netherlands)

    Betlem, Bernardus H.L.

    2000-01-01

    For batch distillation, the dynamic composition behaviour can be described by the dominant time constant and the bottom exhaustion. Its magnitude is determined by the change of the composition distribution and is maximal when the inflection point of the molar fraction profile is located in the

  15. Sequencing for Batch Production in a Group Flowline Machine Shop ...

    African Journals Online (AJOL)

    The purpose of the paper is to develop a useful technique for sequencing batches of components through machine shops arranged under the group flowline production system. The approach is to apply a modified version of Petrov's group flowline technique for machining components which follow a unidirectional route.

  16. Quality control for 12 batches of DTPA-Sn

    International Nuclear Information System (INIS)

    Isaac, M.; Gamboa, R.; Leyva, R.; Hernandez, I.; Turino, D.

    1994-01-01

    Quality control was carried out on 12 batches of DTPA-Sn for labeling with 99mTc. The instrumental methods of analysis and control charts are discussed in order to establish a warranty time for the product. (author). 2 refs, 3 figs, 1 tab

  17. Flash chemistry: flow chemistry that cannot be done in batch.

    Science.gov (United States)

    Yoshida, Jun-ichi; Takahashi, Yusuke; Nagaki, Aiichiro

    2013-11-04

    Flash chemistry based on high-resolution reaction time control using flow microreactors enables chemical reactions that cannot be done in batch and serves as a powerful tool for laboratory synthesis of organic compounds and for production in chemical and pharmaceutical industries.

  18. Adsorption of Arsenite onto Kemiron in a batch system

    African Journals Online (AJOL)


    This study investigated the effect of pH and coexisting ions on As(III) adsorption using batch experiments and found that pH strongly influenced As(III) adsorption. However, differences ... contamination by such heavy metals as arsenic (As). Arsenite ... and then transition through the point of zero charge (PZC) and then into ...

  19. Development of Production Control in Small Batch Production

    Directory of Open Access Journals (Sweden)

    Németh Péter

    2016-01-01

    Full Text Available Our aim with this paper is to develop a new performance measurement and control system for small batch production in the automotive industry. For this reason, we present our previous research results for warehouse performance measurement and adapt its methodology to production control. The proposed method is based on artificial intelligence (neural networks).

  20. Shell of Planet Earth – Global Batch Bioreactor.

    Czech Academy of Sciences Publication Activity Database

    Hanika, Jiří; Šolcová, Olga; Kaštánek, P.

    2017-01-01

    Roč. 40, č. 11 (2017), s. 1959-1965 ISSN 0930-7516 R&D Projects: GA TA ČR TE01020080 Institutional support: RVO:67985858 Keywords : critical raw materials * global batch bioreactor * planet earth Subject RIV: CI - Industrial Chemistry, Chemical Engineering OBOR OECD: Chemical process engineering Impact factor: 2.051, year: 2016

  1. Design of common heat exchanger network for batch processes

    International Nuclear Information System (INIS)

    Anastasovski, Aleksandar

    2014-01-01

    Heat integration of energy streams is very important for efficient energy recovery in production systems. Pinch technology is a very useful tool for heat integration and maximizing energy efficiency. Creating a heat exchanger network as a common solution for systems in batch mode that will be applicable in all existing time slices is very difficult. This paper suggests a new methodology for the design of a common heat exchanger network for batch processes. Heat exchanger network designs were created for all determined repeatable and non-repeatable time periods – time slices. They are the basis for creating the common heat exchanger network. The common heat exchanger network as a solution satisfies all heat-transfer needs for each time period and for every existing combination of selected streams in the production process. This methodology uses splitting of some heat exchangers into two or more heat exchange units or heat exchange zones. The reason for that is the multipurpose use of heat exchangers between different pairs of streams in different time periods. Splitting of large heat exchangers would maximize the total heat transfer usage of heat exchange units. The final solution contains heat exchangers with the minimum heat load as well as the minimum need of heat transfer area. The solution is applicable for all determined time periods and all existing stream combinations. - Highlights: •Methodology for design of energy efficient systems in batch processes. •Common Heat Exchanger Network solution based on designs with Pinch technology. •Multipurpose use of heat exchangers in batch processes

  2. A fixed-size batch service queue with vacations

    Directory of Open Access Journals (Sweden)

    Ho Woo Lee

    1996-01-01

    Full Text Available The paper deals with batch service queues with vacations in which customers arrive according to a Poisson process. Decomposition method is used to derive the queue length distributions both for single and multiple vacation cases. The authors look at other decomposition techniques and discuss some related open problems.

  3. Optimization of heat-liberating batches for ash residue stabilization

    International Nuclear Information System (INIS)

    Karlina, O.K.; Varlackova, G.A.; Ojovan, M.I.; Tivansky, V.M.; Dmitriev, S.A.

    1999-01-01

    The ash residue obtained after incineration of solid radioactive waste is a dusty, poly-dispersed, powder-like material that contains radioactive nuclides (137Cs, 90Sr, 239Pu, ...). The specific radioactivity of the ash can be about 10^5-10^7 Bq/kg. In order to dispose of the ash, the residue shall be stabilized by producing a monolith material. The ash residue can be either vitrified or stabilized into a ceramic matrix. For this purpose the ash residue is mixed with fluxing agents, followed by melting of the obtained composition in different types of melters. As a rule this requires both significant energy consumption and complex melting equipment. A stabilization technology for ash residue was proposed recently using heat-liberating batches: compositions with redox properties. The ash residue is melted due to exothermic chemical reactions in the mixture with the heat-liberating batch, which occur with considerable release of heat. The stabilization method has three stages: (1) preparation of a mixture of heating batch and ash residue with or without glass forming batch (frit); (2) ignition and combustion of the mixed composition; (3) cooling (quenching) of the obtained vitreous material. Combustion of the mixed composition occurs in the form of propagation of a reacting wave. The heat released during the exothermic chemical reactions provides melting of the ash residue components and production of a glass-like phase. The final product consists of a glass-like matrix with embedded crystalline inclusions of infusible ash residue components

  4. Comparative Batch and Column Evaluation of Thermal and Wet ...

    African Journals Online (AJOL)

    The efficiency of regenerated spent commercial activated carbon for synthetic dye removal was studied using thermal and wet oxidative regeneration methods. Two types of experiments were carried out: batch adsorption experiments and continuous-flow (fixed bed) column experiments to study the mechanism of dye removal ...

  5. DEVELOPMENT OF AN AUTOMATED BATCH-PROCESS SOLAR ...

    African Journals Online (AJOL)

    One of the shortcomings of solar disinfection of water (SODIS) is the absence of a feedback mechanism indicating treatment completion. This work presents the development of an automated batch-process water disinfection system aimed at solving this challenge. Locally sourced materials in addition to an Arduino micro ...

  6. Medium optimization for protopectinase production by batch culture of

    African Journals Online (AJOL)

    Medium optimization for protopectinase production by batch culture of ... C Fan, Z Liu, L Yao. Abstract: Optimization of medium compositions for protopectinase production by Aspergillus terreus in submerged culture was carried out. The medium components having a significant effect on protopectinase production were reported ...

  7. Batch immunoextraction method for efficient purification of aromatic cytokinins

    Czech Academy of Sciences Publication Activity Database

    Hauserová, Eva; Swaczynová, Jana; Doležal, Karel; Lenobel, René; Popa, Igor; Hajdúch, M.; Vydra, D.; Fuksová, Květoslava; Strnad, Miroslav

    2005-01-01

    Roč. 1100, č. 1 (2005), s. 116-125 ISSN 0021-9673 R&D Projects: GA AV ČR IBS4055304 Institutional research plan: CEZ:AV0Z50380511; MSM6198959216 Keywords : antibody * 6-benzylaminopurine * batch immunoextraction Subject RIV: ED - Physiology Impact factor: 3.096, year: 2005

  8. Tier 3 batch system data locality via managed caches

    Science.gov (United States)

    Fischer, Max; Giffels, Manuel; Jung, Christopher; Kühn, Eileen; Quast, Günter

    2015-05-01

    Modern data processing increasingly relies on data locality for performance and scalability, whereas the common HEP approaches aim for uniform resource pools with minimal locality, recently even across site boundaries. To combine advantages of both, the High-Performance Data Analysis (HPDA) Tier 3 concept opportunistically establishes data locality via coordinated caches. In accordance with HEP Tier 3 activities, the design incorporates two major assumptions: First, only a fraction of data is accessed regularly and thus the deciding factor for overall throughput. Second, data access may fall back to non-local, making permanent local data availability an inefficient resource usage strategy. Based on this, the HPDA design generically extends available storage hierarchies into the batch system. Using the batch system itself for scheduling file locality, an array of independent caches on the worker nodes is dynamically populated with high-profile data. Cache state information is exposed to the batch system both for managing caches and scheduling jobs. As a result, users directly work with a regular, adequately sized storage system. However, their automated batch processes are presented with local replications of data whenever possible.
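
    The coordination described above — exposing cache state to the batch system so that jobs preferentially land where their input data already sit, with non-local access as the fallback — can be sketched as a score-based placement rule. Node names, cache contents, and the scoring rule below are invented, not the HPDA implementation.

```python
# Cache-aware job placement sketch: prefer the worker whose local cache
# holds the most bytes of the job's inputs; an empty cache is still a
# valid placement because access can fall back to remote storage.
caches = {
    "worker01": {"/store/data_a.root": 2_000_000_000},
    "worker02": {"/store/data_a.root": 2_000_000_000,
                 "/store/data_b.root": 3_500_000_000},
    "worker03": {},
}

def place_job(input_files, caches):
    def cached_bytes(node):
        # Total bytes of the job's inputs already present in this cache.
        return sum(caches[node].get(f, 0) for f in input_files)
    return max(caches, key=cached_bytes)

job_inputs = ["/store/data_a.root", "/store/data_b.root"]
print(place_job(job_inputs, caches))   # -> worker02
```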

  9. Modelling and simulation of the batch hydrolysis of acetic ...

    African Journals Online (AJOL)


    The kinetic modelling of the batch synthesis of acetic acid from acetic ... integral method of analysis to determine the kinetic parameters ... Equation (5) is applied to all the components ... In common chemical engineering terminology, the degree of ...

  10. Comparison of neptunium sorption results using batch and column techniques

    International Nuclear Information System (INIS)

    Triay, I.R.; Furlano, A.C.; Weaver, S.C.; Chipera, S.J.; Bish, D.L.

    1996-08-01

    We used crushed-rock columns to study the sorption retardation of neptunium by zeolitic, devitrified, and vitric tuffs typical of those at the site of the potential high-level nuclear waste repository at Yucca Mountain, Nevada. We used two sodium bicarbonate waters (groundwater from Well J-13 at the site and water prepared to simulate groundwater from Well UE-25p No. 1) under oxidizing conditions. It was found that values of the sorption distribution coefficient, Kd, obtained from these column experiments under flowing conditions, regardless of the water or the water velocity used, agreed well with those obtained earlier from batch sorption experiments under static conditions. The batch sorption distribution coefficient can be used to predict the arrival time for neptunium eluted through the columns. On the other hand, the elution curves showed dispersivity, which implies that neptunium sorption in these tuffs may be nonlinear, irreversible, or noninstantaneous. As a result, use of a batch sorption distribution coefficient to calculate neptunium transport through Yucca Mountain tuffs would yield conservative values for neptunium release from the site. We also noted that neptunium (present as the anionic neptunyl carbonate complex) never eluted prior to tritiated water, which implies that charge exclusion does not appear to exclude neptunium from the tuff pores. The column experiments corroborated the trends observed in batch sorption experiments: neptunium sorption onto devitrified and vitric tuffs is minimal and sorption onto zeolitic tuffs decreases as the amount of sodium and bicarbonate/carbonate in the water increases
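
    The link between the batch distribution coefficient and the column arrival time is the standard linear-sorption retardation relation; under the usual assumptions (linear, reversible, instantaneous sorption), with ρ_b the bulk density, θ the porosity, L the column length and v the pore-water velocity (symbols supplied here for illustration, not values from the study):

```latex
% Retardation factor from the batch K_d, and the implied arrival time:
% a sorbing tracer elutes R times later than tritiated water.
R = 1 + \frac{\rho_b}{\theta} K_d ,
\qquad
t_{\mathrm{arrival}} \approx R\,\frac{L}{v}
```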

  11. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    International Nuclear Information System (INIS)

    Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.

    2014-01-01

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves
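
    The individuals control charts referred to here are conventionally set up with limits at the mean ± 2.66 times the average moving range; a sketch with invented leaf-position errors follows.

```python
# Individuals (I) control chart limits for one MLC leaf:
# center = mean, limits = mean +/- 2.66 * mean moving range
# (2.66 = 3/d2 with d2 = 1.128 for a moving range of 2). Data invented.
import numpy as np

leaf_error = np.array([0.05, -0.02, 0.08, 0.01, -0.04, 0.06, 0.00, 0.03])  # mm

mr_bar = np.abs(np.diff(leaf_error)).mean()
center = leaf_error.mean()
ucl, lcl = center + 2.66 * mr_bar, center - 2.66 * mr_bar

print(f"center {center:+.3f} mm, limits [{lcl:+.3f}, {ucl:+.3f}] mm")
print("out-of-control points:", leaf_error[(leaf_error > ucl) | (leaf_error < lcl)])
```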

  12. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control.

    Science.gov (United States)

    Létourneau, Daniel; Wang, An; Amin, Md Nurul; Pearce, Jim; McNiven, Andrea; Keller, Harald; Norrlinger, Bernhard; Jaffray, David A

    2014-12-01

    High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3-4 times/week over a period of 10-11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ± 0.5 and ± 1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. The precision of the MLC performance monitoring QC test and the MLC itself was within ± 0.22 mm for most MLC leaves and the majority of the

  13. Guidelines for Reproducibly Building and Simulating Systems Biology Models.

    Science.gov (United States)

    Medley, J Kyle; Goldberg, Arthur P; Karr, Jonathan R

    2016-10-01

    Reproducibility is the cornerstone of the scientific method. However, currently, many systems biology models cannot easily be reproduced. This paper presents methods that address this problem. We analyzed the recent Mycoplasma genitalium whole-cell (WC) model to determine the requirements for reproducible modeling. We determined that reproducible modeling requires both repeatable model building and repeatable simulation. New standards and simulation software tools are needed to enhance and verify the reproducibility of modeling. New standards are needed to explicitly document every data source and assumption, and new deterministic parallel simulation tools are needed to quickly simulate large, complex models. We anticipate that these new standards and software will enable researchers to reproducibly build and simulate more complex models, including WC models.

  14. Terminal digit bias is not an issue for properly trained healthcare personnel using manual or semi-automated devices - biomed 2010.

    Science.gov (United States)

    Butler, Kenneth R; Minor, Deborah S; Benghuzzi, Hamed A; Tucci, Michelle

    2010-01-01

    The objective of this study was to evaluate terminal digit preference in blood pressure (BP) measurements taken from a sample of clinics at a large academic health sciences center. We hypothesized that terminal digit preference would occur more frequently in BP measurements taken with manual mercury sphygmomanometry compared to those obtained with semi-automated instruments. A total of 1,393 BP measures were obtained in 16 ambulatory and inpatient sites by personnel using both mercury (n=1,286) and semi-automated (n=107) devices. For the semi-automated devices, a trained observer repeated the patient's BP following American Heart Association recommendations using a similar device with a known calibration history. At least two recorded systolic and diastolic blood pressures (average of two or more readings for each) were obtained for all manual mercury readings. Data were evaluated using descriptive statistics and chi-square tests as appropriate (SPSS software, 17.0). Overall, zero and other terminal digit preference was observed more frequently in systolic (χ2 = 883.21, df = 9, p < 0.001) measurements taken with manual instruments, while all end digits obtained by clinic staff using semi-automated devices were more evenly distributed (χ2 = 8.23, df = 9, p = 0.511 for systolic and χ2 = 10.48, df = 9, p = 0.313 for diastolic). In addition to zero digit bias in mercury readings, even numbers were reported with significantly higher frequency than odd numbers. There was no detectable digit preference observed when examining semi-automated measurements by clinic staff or device type for either systolic or diastolic BP measures. These findings demonstrate that terminal digit preference was more likely to occur with manual mercury sphygmomanometry. This phenomenon was most likely the result of mercury column graduation in 2 mm Hg increments producing a higher than expected frequency of even digits.
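
    The uniformity check behind these results is a chi-square goodness-of-fit test on the ten terminal digits (df = 9). A minimal sketch, with invented readings, of how such a test can be run:

        # Terminal-digit preference test: under no preference the digits 0-9
        # are uniformly distributed; a chi-square goodness-of-fit test with
        # df = 9 checks this. The readings below are invented.
        from scipy.stats import chisquare

        readings = [120, 138, 140, 142, 130, 128, 135, 140, 120, 136]
        counts = [sum(1 for r in readings if r % 10 == d) for d in range(10)]
        stat, p = chisquare(counts)  # expected frequencies default to uniform
        print(f"chi2 = {stat:.2f}, df = {len(counts) - 1}, p = {p:.3f}")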

  15. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Science.gov (United States)

    Nada, Rania M; Maal, Thomas J J; Breuning, K Hero; Bergé, Stefaan J; Mostafa, Yehya A; Kuijpers-Jagtman, Anne Marie

    2011-02-09

    Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three-dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the 2 models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged from 0.12 to 0.19 mm at the 4 regions. Voxel based image registration on both zones can be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field of view CBCT scans where the anterior cranial base is not visible.

  16. Participant Nonnaiveté and the reproducibility of cognitive psychology.

    Science.gov (United States)

    Zwaan, Rolf A; Pecher, Diane; Paolacci, Gabriele; Bouwmeester, Samantha; Verkoeijen, Peter; Dijkstra, Katinka; Zeelenberg, René

    2017-07-25

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature (three each from the domains of perception/action, memory, and language) and found that they are highly reproducible. Not only can they be reproduced in online environments, but they also can be reproduced with nonnaïve participants with no reduction of effect size. Apparently, some cognitive tasks are so constraining that they encapsulate behavior from external influences, such as the testing situation and recent prior experience with the experiment, yielding highly robust effects.

  17. No-cost manual method for preparation of tissue microarrays having high quality comparable to semiautomated methods.

    Science.gov (United States)

    Foda, Abd Al-Rahman Mohammad

    2013-05-01

    Manual tissue microarray (TMA) construction was introduced to avoid the high cost of automated and semiautomated techniques. The cheapest and simplest technique for constructing manual TMAs is that of using mechanical pencil tips. This study was carried out to modify this method, aiming to raise its quality to that of the expensive techniques. Some modifications were introduced to Shebl's technique. Two conventional mechanical pencil tips of different diameters were used to construct the recipient blocks. A source of mild heat was used, and blocks were incubated at 38°C overnight. With our modifications, 3 high-density TMA blocks were constructed. We successfully performed immunostaining without substantial tissue loss. Our modifications increased the number of cores per block and improved the stability of the cores within the paraffin block. This new, modified technique is a good alternative to expensive machines in many laboratories.

  18. Semiautomated object-based classification of rain-induced landslides with VHR multispectral images on Madeira Island

    Science.gov (United States)

    Heleno, Sandra; Matias, Magda; Pina, Pedro; Sousa, António Jorge

    2016-04-01

    A method for semiautomated landslide detection and mapping, with the ability to separate source and run-out areas, is presented in this paper. It combines object-based image analysis and a support vector machine classifier and is tested using a GeoEye-1 multispectral image, sensed 3 days after a major damaging landslide event that occurred on Madeira Island (20 February 2010), and a pre-event lidar digital terrain model. The testing is developed in a 15 km2 wide study area, where 95 % of the landslide scars are detected by this supervised approach. The classifier presents a good performance in the delineation of the overall landslide area, with commission errors below 26 % and omission errors below 24 %. In addition, fair results are achieved in the separation of the source from the run-out landslide areas, although on less illuminated slopes this discrimination is less effective than on sunnier, east-facing slopes.
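
    As a minimal sketch of the supervised step (not the paper's actual pipeline), an SVM can be trained on per-object features and used to label image objects; the features, labels and parameters below are invented:

        # Object-based classification sketch: each image object is described
        # by a feature vector (e.g., spectral means, slope) and labelled as
        # background, landslide source, or run-out. Data are synthetic.
        import numpy as np
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        X_train = rng.normal(size=(120, 5))   # 120 image objects x 5 features
        y_train = rng.integers(0, 3, 120)     # 0=background, 1=source, 2=run-out
        clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_train, y_train)
        print(clf.predict(rng.normal(size=(3, 5))))  # labels for new objects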

  19. A High Throughput, 384-Well, Semi-Automated, Hepatocyte Intrinsic Clearance Assay for Screening New Molecular Entities in Drug Discovery.

    Science.gov (United States)

    Heinle, Lance; Peterkin, Vincent; de Morais, Sonia M; Jenkins, Gary J; Badagnani, Ilaria

    2015-01-01

    A high throughput, semi-automated clearance screening assay in hepatocytes was developed, allowing a scientist to generate data for 96 compounds in one week. The 384-well format assay utilizes a Thermo Multidrop Combi and an optimized LC-MS/MS method. The previously reported LC-MS/MS method reduced the analytical run time by 3-fold, down to 1.2 min injection-to-injection. The Multidrop was able to deliver hepatocytes to 384-well plates with minimal viability loss. Comparison of results from the new 384-well and historical 24-well assays yielded a correlation of 0.95. In addition, results obtained for 25 marketed drugs with various metabolism pathways had a correlation of 0.75 when compared with literature values. Precision was maintained in the new format, as 8 compounds tested in ≥39 independent experiments had coefficients of variation ≤21%. The ability to predict in vivo clearances using the new stability assay format was also investigated using 22 marketed drugs and 26 AbbVie compounds. Correction of intrinsic clearance values with binding to hepatocytes (in vitro data) and plasma (in vivo data) resulted in a higher in vitro to in vivo correlation when comparing 22 marketed compounds in human (0.80 vs 0.35) and 26 AbbVie Discovery compounds in rat (0.56 vs 0.17), demonstrating the importance of correcting for binding in clearance studies. This newly developed high throughput, semi-automated clearance assay allows for rapid screening of Discovery compounds, enabling Structure Activity Relationship (SAR) analysis based on hepatocyte stability data of sufficient quantity and quality to drive the next round of compound synthesis.

  20. Serum bactericidal assay for the evaluation of typhoid vaccine using a semi-automated colony-counting method.

    Science.gov (United States)

    Jang, Mi Seon; Sahastrabuddhe, Sushant; Yun, Cheol-Heui; Han, Seung Hyun; Yang, Jae Seung

    2016-08-01

    Typhoid fever, mainly caused by Salmonella enterica serovar Typhi (S. Typhi), is a life-threatening disease, mostly in developing countries. Enzyme-linked immunosorbent assay (ELISA) is widely used to quantify antibodies against S. Typhi in serum but does not provide information about functional antibody titers. Although the serum bactericidal assay (SBA) using an agar plate is often used to measure functional antibody titers against various bacterial pathogens in clinical specimens, it has rarely been used for typhoid vaccines because it is time-consuming and labor-intensive. In the present study, we established an improved SBA against S. Typhi using a semi-automated colony-counting system with a square agar plate harboring 24 samples. The semi-automated SBA efficiently measured bactericidal titers of sera from individuals immunized with S. Typhi Vi polysaccharide vaccines. The assay specifically responded to S. Typhi Ty2 but not to other irrelevant enteric bacteria including Vibrio cholerae and Shigella flexneri. Baby rabbit complement was a more appropriate source for the SBA against S. Typhi than complements from adult rabbit, guinea pig, and human. We also examined the correlation between SBA and ELISA for measuring antibody responses against S. Typhi using pre- and post-vaccination sera from 18 human volunteers. The SBA titer showed a good correlation with anti-Vi IgG quantity in the serum, as determined by a Spearman correlation coefficient of 0.737 (P < 0.001), suggesting that the semi-automated SBA can be used to measure functional antibody titers against S. Typhi in sera from human subjects immunized with typhoid vaccines. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  1. Improvement of the banana "Musa acuminata" reference sequence using NGS data and semi-automated bioinformatics methods.

    Science.gov (United States)

    Martin, Guillaume; Baurens, Franc-Christophe; Droc, Gaëtan; Rouard, Mathieu; Cenci, Alberto; Kilian, Andrzej; Hastie, Alex; Doležel, Jaroslav; Aury, Jean-Marc; Alberti, Adriana; Carreel, Françoise; D'Hont, Angélique

    2016-03-16

    Recent advances in genomics indicate functional significance of a majority of genome sequences and their long range interactions. As a detailed examination of genome organization and function requires very high quality genome sequence, the objective of this study was to improve the reference genome assembly of banana (Musa acuminata). We have developed a modular bioinformatics pipeline to improve genome sequence assemblies, which can handle various types of data. The pipeline comprises several semi-automated tools. However, unlike classical automated tools that are based on global parameters, the semi-automated tools offer an expert mode in which the user decides on suggested improvements through local compromises. The pipeline was used to improve the draft genome sequence of Musa acuminata. Genotyping by sequencing (GBS) of a segregating population and paired-end sequencing were used to detect and correct scaffold misassemblies. Long insert size paired-end reads identified scaffold junctions and fusions missed by automated assembly methods. GBS markers were used to anchor scaffolds to pseudo-molecules with a new bioinformatics approach that avoids the tedious step of marker ordering during genetic map construction. Furthermore, a genome map was constructed and used to assemble scaffolds into super scaffolds. Finally, a consensus gene annotation was projected on the new assembly from two pre-existing annotations. This approach reduced the total Musa scaffold number from 7513 to 1532 (i.e. by 80%), with an N50 that increased from 1.3 Mb (65 scaffolds) to 3.0 Mb (26 scaffolds). 89.5% of the assembly was anchored to the 11 Musa chromosomes, compared to the previous 70%. Unknown sites (N) were reduced from 17.3 to 10.0%. The release of the Musa acuminata reference genome version 2 provides a platform for detailed analysis of banana genome variation, function and evolution. Bioinformatics tools developed in this work can be used to improve genome sequence assemblies in

  2. A modular, prospective, semi-automated drug safety monitoring system for use in a distributed data environment.

    Science.gov (United States)

    Gagne, Joshua J; Wang, Shirley V; Rassen, Jeremy A; Schneeweiss, Sebastian

    2014-06-01

    The aim of this study was to develop and test a semi-automated process for conducting routine active safety monitoring for new drugs in a network of electronic healthcare databases. We built a modular program that semi-automatically performs cohort identification, confounding adjustment, diagnostic checks, aggregation and effect estimation across multiple databases, and application of a sequential alerting algorithm. During beta-testing, we applied the system to five databases to evaluate nine examples emulating prospective monitoring with retrospective data (five pairs for which we expected signals, two negative controls, and two examples for which it was uncertain whether a signal would be expected): cerivastatin versus atorvastatin and rhabdomyolysis; paroxetine versus tricyclic antidepressants and gastrointestinal bleed; lisinopril versus angiotensin receptor blockers and angioedema; ciprofloxacin versus macrolide antibiotics and Achilles tendon rupture; rofecoxib versus non-selective non-steroidal anti-inflammatory drugs (ns-NSAIDs) and myocardial infarction; telithromycin versus azithromycin and hepatotoxicity; rosuvastatin versus atorvastatin and diabetes and rhabdomyolysis; and celecoxib versus ns-NSAIDs and myocardial infarction. We describe the program, the necessary inputs, and the assumed data environment. In beta-testing, the system generated four alerts, all among positive control examples (i.e., lisinopril and angioedema; rofecoxib and myocardial infarction; ciprofloxacin and tendon rupture; and cerivastatin and rhabdomyolysis). Sequential effect estimates for each example were consistent in direction and magnitude with existing literature. Beta-testing across nine drug-outcome examples demonstrated the feasibility of the proposed semi-automated prospective monitoring approach. In retrospective assessments, the system identified an increased risk of myocardial infarction with rofecoxib and an increased risk of rhabdomyolysis with cerivastatin years

  3. Accuracy and feasibility of estimated tumour volumetry in primary gastric gastrointestinal stromal tumours: validation using semiautomated technique in 127 patients.

    Science.gov (United States)

    Tirumani, Sree Harsha; Shinagare, Atul B; O'Neill, Ailbhe C; Nishino, Mizuki; Rosenthal, Michael H; Ramaiya, Nikhil H

    2016-01-01

    To validate estimated tumour volumetry in primary gastric gastrointestinal stromal tumours (GISTs) using semiautomated volumetry. In this IRB-approved retrospective study, we measured the three longest diameters along the x, y and z axes on CTs of primary gastric GISTs in 127 consecutive patients (52 women, 75 men, mean age 61 years) at our institute between 2000 and 2013. Segmented volumes (Vsegmented) were obtained using commercial software by two radiologists. Estimated volumes (V1-V6) were obtained using formulae for spheres and ellipsoids. Intra- and interobserver agreement of Vsegmented and agreement of V1-6 with Vsegmented were analysed with concordance correlation coefficients (CCC) and Bland-Altman plots. Median Vsegmented and V1-V6 were 75.9, 124.9, 111.6, 94.0, 94.4, 61.7 and 80.3 cm(3), respectively. There was strong intra- and interobserver agreement for Vsegmented. Agreement with Vsegmented was highest for V6 (scalene ellipsoid, x ≠ y ≠ z), with a CCC of 0.96 [95 % CI 0.95-0.97]. The mean relative difference was smallest for V6 (0.6 %), while it was -19.1 % for V5, +14.5 % for V4, +17.9 % for V3, +32.6 % for V2 and +47 % for V1. Ellipsoidal approximations of volume using three measured axes may be used to closely estimate Vsegmented when semiautomated techniques are unavailable. Estimation of tumour volume in primary GIST using mathematical formulae is feasible. Gastric GISTs are rarely spherical. Segmented volumes are highly concordant with three axis-based scalene ellipsoid volumes. Ellipsoid volume can be used as an alternative to automated tumour volumetry.
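
    The scalene ellipsoid estimate referred to as V6 follows the standard formula V = (π/6)·x·y·z; a short worked sketch with invented diameters:

        # Volume estimates from three orthogonal tumour diameters (in cm).
        # The sphere uses a single diameter; the scalene ellipsoid uses all
        # three axes, the estimate reported above as closest to Vsegmented.
        import math

        def sphere_volume(d):
            return math.pi / 6 * d ** 3

        def scalene_ellipsoid_volume(x, y, z):
            return math.pi / 6 * x * y * z

        x, y, z = 6.2, 5.1, 4.8  # invented diameters, cm
        print(f"sphere:    {sphere_volume(max(x, y, z)):.1f} cm^3")
        print(f"ellipsoid: {scalene_ellipsoid_volume(x, y, z):.1f} cm^3")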

  4. Repeatability and reproducibility of an automated gas production technique

    NARCIS (Netherlands)

    Laar, van H.; Straalen, van W.M.; Gelder, van A.H.; Boever, de J.L.; heer, D' B.; Vedder, H.; Kroes, R.; Bot, de P.; Hees, van J.; Cone, J.W.

    2006-01-01

    Two ring tests with five and three laboratories, respectively, were conducted to quantify variation within and among laboratories in an automated gas production technique. Single batches of the feeds soya bean meal (SBM), wheat grain (WG), grass silage (GS) and maize gluten meal (MG) were divided

  5. Reproducible diagnosis of Chronic Lymphocytic Leukemia by flow cytometry

    DEFF Research Database (Denmark)

    Rawstron, Andy C; Kreuzer, Karl-Anton; Soosapilla, Asha

    2018-01-01

    The diagnostic criteria for CLL rely on morphology and immunophenotype. Current approaches have limitations affecting reproducibility and there is no consensus on the role of new markers. The aim of this project was to identify reproducible criteria and consensus on markers recommended for the di...

  6. Genotypic variability enhances the reproducibility of an ecological study.

    Science.gov (United States)

    Milcu, Alexandru; Puga-Freitas, Ruben; Ellison, Aaron M; Blouin, Manuel; Scheu, Stefan; Freschet, Grégoire T; Rose, Laura; Barot, Sebastien; Cesarz, Simone; Eisenhauer, Nico; Girin, Thomas; Assandri, Davide; Bonkowski, Michael; Buchmann, Nina; Butenschoen, Olaf; Devidal, Sebastien; Gleixner, Gerd; Gessler, Arthur; Gigon, Agnès; Greiner, Anna; Grignani, Carlo; Hansart, Amandine; Kayler, Zachary; Lange, Markus; Lata, Jean-Christophe; Le Galliard, Jean-François; Lukac, Martin; Mannerheim, Neringa; Müller, Marina E H; Pando, Anne; Rotter, Paula; Scherer-Lorenzen, Michael; Seyhun, Rahme; Urban-Mead, Katherine; Weigelt, Alexandra; Zavattaro, Laura; Roy, Jacques

    2018-02-01

    Many scientific disciplines are currently experiencing a 'reproducibility crisis' because numerous scientific findings cannot be repeated consistently. A novel but controversial hypothesis postulates that stringent levels of environmental and biotic standardization in experimental studies reduce reproducibility by amplifying the impacts of laboratory-specific environmental factors not accounted for in study designs. A corollary to this hypothesis is that a deliberate introduction of controlled systematic variability (CSV) in experimental designs may lead to increased reproducibility. To test this hypothesis, we had 14 European laboratories run a simple microcosm experiment using grass (Brachypodium distachyon L.) monocultures and grass and legume (Medicago truncatula Gaertn.) mixtures. Each laboratory introduced environmental and genotypic CSV within and among replicated microcosms established in either growth chambers (with stringent control of environmental conditions) or glasshouses (with more variable environmental conditions). The introduction of genotypic CSV led to 18% lower among-laboratory variability in growth chambers, indicating increased reproducibility, but had no significant effect in glasshouses where reproducibility was generally lower. Environmental CSV had little effect on reproducibility. Although there are multiple causes for the 'reproducibility crisis', deliberately including genetic variability may be a simple solution for increasing the reproducibility of ecological studies performed under stringently controlled environmental conditions.

  7. Participant Nonnaiveté and the reproducibility of cognitive psychology

    NARCIS (Netherlands)

    R.A. Zwaan (Rolf); D. Pecher (Diane); G. Paolacci (Gabriele); S. Bouwmeester (Samantha); P.P.J.L. Verkoeijen (Peter); K. Dijkstra (Katinka); R. Zeelenberg (René)

    2017-01-01

    Many argue that there is a reproducibility crisis in psychology. We investigated nine well-known effects from the cognitive psychology literature (three each from the domains of perception/action, memory, and language) and found that they are highly reproducible. Not only can

  8. Reproducing Kernels and Coherent States on Julia Sets

    Energy Technology Data Exchange (ETDEWEB)

    Thirulogasanthar, K., E-mail: santhar@cs.concordia.ca; Krzyzak, A. [Concordia University, Department of Computer Science and Software Engineering (Canada)], E-mail: krzyzak@cs.concordia.ca; Honnouvo, G. [Concordia University, Department of Mathematics and Statistics (Canada)], E-mail: g_honnouvo@yahoo.fr

    2007-11-15

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems.

  9. Reproducing Kernels and Coherent States on Julia Sets

    International Nuclear Information System (INIS)

    Thirulogasanthar, K.; Krzyzak, A.; Honnouvo, G.

    2007-01-01

    We construct classes of coherent states on domains arising from dynamical systems. An orthonormal family of vectors associated to the generating transformation of a Julia set is found as a family of square integrable vectors, and, thereby, reproducing kernels and reproducing kernel Hilbert spaces are associated to Julia sets. We also present analogous results on domains arising from iterated function systems
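
    For reference, the reproducing property underlying both records can be written in standard form; this is the textbook definition (with an orthonormal family {φn} as in the abstract), not notation taken from the paper itself:

        % Reproducing kernel built from an orthonormal family, and the
        % reproducing property that defines the associated Hilbert space H.
        \[
          K(z, w) = \sum_{n} \phi_n(z)\, \overline{\phi_n(w)},
          \qquad
          f(w) = \langle f,\, K(\cdot, w) \rangle \quad \text{for all } f \in \mathcal{H}.
        \]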

  10. Completely reproducible description of digital sound data with cellular automata

    International Nuclear Information System (INIS)

    Wada, Masato; Kuroiwa, Jousuke; Nara, Shigetoshi

    2002-01-01

    A novel method of compressive and completely reproducible description of digital sound data by means of rule dynamics of CA (cellular automata) is proposed. The digital data of spoken words and music recorded with the standard format of a compact disk are reproduced completely by this method with use of only two rules in a one-dimensional CA without loss of information
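
    The abstract does not give the two rules used, but the mechanics of a one-dimensional, two-state CA update are simple to sketch; rule 90 below is purely illustrative:

        # One synchronous update of an elementary (Wolfram-numbered) cellular
        # automaton with periodic boundaries. The paper's two rules are not
        # stated in the abstract; rule 90 is used only as an example.
        def ca_step(cells, rule_number):
            rule = [(rule_number >> i) & 1 for i in range(8)]
            n = len(cells)
            return [
                rule[(cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]]
                for i in range(n)
            ]

        state = [0, 0, 1, 0, 0, 1, 1, 0]
        print(ca_step(state, 90))  # next configuration under rule 90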

  11. Continuous Heterogeneous Photocatalysis in Serial Micro-Batch Reactors.

    Science.gov (United States)

    Pieber, Bartholomäus; Shalom, Menny; Antonietti, Markus; Seeberger, Peter H; Gilmore, Kerry

    2018-01-29

    Solid reagents, leaching catalysts, and heterogeneous photocatalysts are commonly employed in batch processes but are ill-suited for continuous-flow chemistry. Heterogeneous catalysts for thermal reactions are typically used in packed-bed reactors, which cannot be penetrated by light and thus are not suitable for photocatalytic reactions involving solids. We demonstrate that serial micro-batch reactors (SMBRs) allow for the continuous utilization of solid materials together with liquids and gases in flow. This technology was utilized to develop selective and efficient fluorination reactions using a modified graphitic carbon nitride heterogeneous catalyst instead of costly homogeneous metal polypyridyl complexes. The merger of this inexpensive, recyclable catalyst and the SMBR approach enables sustainable and scalable photocatalysis. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Optimal Operation of Industrial Batch Crystallizers : A Nonlinear Model-based Control Approach

    NARCIS (Netherlands)

    Mesbah, A.

    2010-01-01

    Batch crystallization is extensively employed in the chemical, pharmaceutical, and food industries to separate and purify high value-added chemical substances. Despite their widespread application, optimal operation of batch crystallizers is particularly challenging. The difficulties primarily

  13. A parallel ILP algorithm that incorporates incremental batch learning

    OpenAIRE

    Nuno Fonseca; Rui Camacho; Fernando Silva

    2003-01-01

    In this paper we tackle the problems of efficiency and scalability faced by Inductive Logic Programming (ILP) systems. We propose the use of parallelism to improve efficiency and the use of incremental batch learning to address the scalability problem. We describe a novel parallel algorithm that incorporates into ILP the method of incremental batch learning. The theoretical complexity of the algorithm indicates that a linear speedup can be achieved.

  14. Automated handling for SAF batch furnace and chemistry analysis operations

    International Nuclear Information System (INIS)

    Bowen, W.W.; Sherrell, D.L.; Wiemers, M.J.

    1981-01-01

    The Secure Automated Fabrication Program is developing a remotely operated breeder reactor fuel pin fabrication line. The equipment will be installed in the Fuels and Materials Examination Facility being constructed at Hanford, Washington. Production is scheduled to start in mid-1986. The application of small pneumatically operated industrial robots for loading and unloading product into and out of batch furnaces and for distribution and handling of chemistry samples is described

  15. Integration of virtualized worker nodes in standard batch systems

    International Nuclear Information System (INIS)

    Buege, Volker; Kunze, Marcel; Oberst, Oliver; Quast, Guenter; Scheurer, Armin; Hessling, Hermann; Kemp, Yves; Synge, Owen

    2010-01-01

    Current experiments in HEP only use a limited number of operating system flavours. Their software might only be validated on one single OS platform. Resource providers might have other operating systems of choice for the installation of the batch infrastructure. This is especially the case if a cluster is shared with other communities, or communities that have stricter security requirements. One solution would be to statically divide the cluster into separated sub-clusters. In such a scenario, no opportunistic distribution of the load can be achieved, resulting in a poor overall utilization efficiency. Another approach is to make the batch system aware of virtualization, and to provide each community with its favoured operating system in a virtual machine. Here, the scheduler has full flexibility, resulting in a better overall efficiency of the resources. In our contribution, we present a lightweight concept for the integration of virtual worker nodes into standard batch systems. The virtual machines are started on the worker nodes just before jobs are executed there. No meta-scheduling is introduced. We demonstrate two prototype implementations, one based on the Sun Grid Engine (SGE), the other using Maui/Torque as a batch system. Both solutions support local as well as Grid job submission. The hypervisors currently used are Xen and KVM; a port to other systems is readily conceivable. To better handle different virtual machines on the physical host, the management solution VmImageManager has been developed. We present first experiences from running the two prototype implementations. Finally, we show the potential future use of this lightweight concept when integrated into high-level (i.e., Grid) workflows.

  16. Hydrothermal liquefaction of biomass: Developments from batch to continuous process

    OpenAIRE

    Elliott, DC; Biller, P; Ross, AB; Schmidt, AJ; Jones, SB

    2015-01-01

    This review describes the recent results in hydrothermal liquefaction (HTL) of biomass in continuous-flow processing systems. Although much has been published about batch reactor tests of biomass HTL, there is only limited information yet available on continuous-flow tests, which can provide a more reasonable basis for process design and scale-up for commercialization. High-moisture biomass feedstocks are the most likely to be used in HTL. These materials are described and results of their pr...

  17. Batch production of microchannel plate photo-multipliers

    Energy Technology Data Exchange (ETDEWEB)

    Frisch, Henry J.; Wetstein, Matthew; Elagin, Andrey

    2018-03-06

    In-situ methods for the batch fabrication of flat-panel micro-channel plate (MCP) photomultiplier tube (PMT) detectors (MCP-PMTs), without transporting either the window or the detector assembly inside a vacuum vessel are provided. The method allows for the synthesis of a reflection-mode photocathode on the entrance to the pores of a first MCP or the synthesis of a transmission-mode photocathode on the vacuum side of a photodetector entrance window.

  18. Batch process integration applying pinch technology to carminic acid production

    OpenAIRE

    Erazo E., Raymundo; Cárdenas R., Jorge L.; Woolcott H., Juan C.

    2014-01-01

    This work was carried out to apply pinch technology to the integration of a batch process for carminic acid. The method consisted of applying the concept of the overall process bottleneck (OPB) together with the time average model (TAM) and the time slice model (TSM). The drying operation was identified as the rate-limiting step of the process, making it the OPB for plant capacity. The extraction yield was 95 % (w/w) carminic acid, with an energy saving of approximately 60% of the...

  19. Copper solubility in DWPF, Batch 1 waste glass: Update report

    International Nuclear Information System (INIS)

    Schumacker, R.F.

    1992-01-01

    The "Late Washing" step in the processing of precipitate will require the use of additional copper formate in the Precipitate Reactor to catalyze the hydrolysis reaction. The increased copper concentration in the melter feed increases the potential for metal precipitation during the vitrification of the melter feed. This report describes recent results with a conservative glass selected from the DWPF acceptable region in the Batch 1 Variability Study.

  20. Mixing volume determination in batch transfers through sonic detectors

    Energy Technology Data Exchange (ETDEWEB)

    Baptista, Renan Martins [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil). Centro de Pesquisas]. E-mail: renan@cenpes.petrobras.com.br; Rachid, Felipe Bastos de Freitas [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Engenharia Mecanica]. E-mail: rachid@mec.uff.br; Araujo, Jose Henrique Carneiro de [Universidade Federal Fluminense, Niteroi, RJ (Brazil). Dept. de Ciencia da Computacao]. E-mail: jhca@dcc.ic.uff.br

    2000-07-01

    An experimental methodology to evaluate mixing volumes in batch transfers by means of sonic detectors is reported in this paper. Mixing volumes were then computed in a transfer of diesel/gasoline carried out through a pipeline operated by Petrobras for different interface points. It has been shown that an adequate choice of the interface points is crucial for keeping the mixing volume uncertainty within acceptable limits. (author)

  1. On the track of fish batches in three distribution networks

    DEFF Research Database (Denmark)

    Randrup, Maria; Wu, Haiping; Jørgensen, Bo M.

    2012-01-01

    Three fish products sampled in retail shops were traced back to their origin and fish from the same batch were tracked forward towards the retailer, thereby simulating a recall situation. The resulting distribution networks were very complex, but to the extent that companies were willing to provi...... of discovering a fault as early as possible in order to minimise the costs of a recall. The localisation of distributed products during a recall operation can be facilitated by a well-constructed traceability system....

  2. Operational stability of naringinase PVA lens-shaped microparticles in batch stirred reactors and mini packed bed reactors-one step closer to industry.

    Science.gov (United States)

    Nunes, Mário A P; Rosa, M Emilia; Fernandes, Pedro C B; Ribeiro, Maria H L

    2014-07-01

    The immobilization of naringinase in PVA lens-shaped particles, a cheap and biocompatible hydrogel, was shown to provide an effective biocatalyst for naringin hydrolysis, an appealing reaction in the food and pharmaceutical industries. The present work addresses the operational stability and scale-up of the bioconversion system in various types of reactors, namely shaken microtiter plates (volume ⩽ 2 mL), batch stirred tank reactors, and a mini packed bed reactor (PBR, 6.8 mL). Consecutive batch runs were performed with the shaken/stirred vessels, with reproducible and encouraging results regarding operational stability. The PBR was used to establish the feasibility of continuous operation, running continuously for 54 days at 45°C. The biocatalyst activity remained constant for 40 days of continuous operation. The averaged specific productivity was 9.07 mmol h(-1) g enzyme(-1) and the half-life was 48 days. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. The Kinetics and Reproducibility of 18F-Sodium Fluoride for Oncology Using Current PET Camera Technology

    Science.gov (United States)

    Kurdziel, Karen A.; Shih, Joanna H.; Apolo, Andrea B.; Lindenberg, Liza; Mena, Esther; McKinney, Yolanda; Adler, Stephen S.; Turkbey, Baris; Dahut, William; Gulley, James L.; Madan, Ravi A.; Landgren, Ola; Choyke, Peter L.

    2012-01-01

    We evaluated the kinetics of 18F-sodium fluoride (NaF) and reassessed the recommended dose, optimal uptake period, and reproducibility using a current-generation PET/CT scanner. Methods: In this prospective study, 73 patients (31 patients with multiple myeloma or myeloma precursor disease and 42 with prostate cancer) were injected with a mean administered dose of 141 MBq of 18F-NaF. Sixty patients underwent 3 sequential sessions of 3-dimensional PET/CT of the torso beginning ~15 min after 18F-NaF injection, followed by a whole-body 3-dimensional PET/CT at 2 h. The remaining 13 prostate cancer patients were imaged only at 2 and 3 h after injection. Twenty-one prostate cancer patients underwent repeat baseline studies (mean interval, 5.9 d) to evaluate reproducibility. Results: The measured effective dose was 0.017 mSv/MBq, with the urinary bladder, osteogenic cells, and red marrow receiving the highest doses at 0.080, 0.077, and 0.028 mGy/MBq, respectively. Visual analysis showed that uptake in both normal and abnormal bone increased with time; however, the rate of increase decreased with time. A semiautomated workflow provided objective uptake parameters, including the mean standardized uptake value of all pixels within bone with SUVs greater than 10 and the average of the mean SUV of all malignant lesions identified by the algorithm. The values of these parameters for the images beginning at ~15 min and ~35 min were significantly different (0.3% change/minute). Differences between the later imaging time points were not significant (P > 0.05). The repeat baseline studies showed high reproducibility (ICC > 0.9) and a relatively low critical percent change (the value above which a change can be considered real) for these parameters. The tumor-to-normal bone ratio, based on the SUVmax of identified malignant lesions, decreased with time; however, this difference was small, estimated at ~0.16%/min in the first hour. Conclusion: 18F-NaF PET/CT images obtained with modest radiation exposures can result in highly reproducible imaging parameters
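
    The SUV-based parameters above rest on the usual normalization, SUV = tissue activity concentration / (injected dose / body weight); a short sketch with invented voxel values:

        # SUV for each voxel, then the mean SUV of voxels above a threshold,
        # mirroring the "mean SUV of pixels with SUV > 10" parameter. All
        # numbers are invented for illustration.
        import numpy as np

        def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
            return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

        voxels = np.array([1.2e4, 3.3e4, 2.5e4, 4.1e4])  # Bq/mL
        suvs = suv(voxels, injected_dose_bq=141e6, body_weight_g=75e3)
        high = suvs[suvs > 10]
        print(f"mean SUV above threshold: {high.mean():.1f}")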

  4. Yields from pyrolysis of refinery residue using a batch process

    Directory of Open Access Journals (Sweden)

    S. Prithiraj

    2017-12-01

    Batch pyrolysis is a valuable process for assessing the potential of recovering and characterising products from hazardous waste materials. This research explored the pyrolysis of hydrocarbon-rich refinery residue, from crude oil processes, in a 1200 L electrically-heated batch retort. Furthermore, the off-gases produced were easily processed in compliance with existing regulatory emission standards. The methodology offers a novel, cost-effective and environmentally compliant method of assessing the recovery potential of valuable products. The pyrolysis experiments yielded significant oil (70 %, with a high calorific value of 40 MJ/kg), char (14 %, with a carbon content over 80 %) and non-condensable gas (6 %, with a significant calorific value of 240 kJ/mol). The final gas stream was subjected to an oxidative clean-up process with continuous on-line monitoring, demonstrating compliance with South African emission standards. The gas treatment was economically optimal overall, as only a small portion of the original residue was subjected to emission-controlling steps. Keywords: Batch pyrolysis, Volatiles, Oil yields, Char, Emissions, Oil recovery

  5. Treatment of slaughterhouse wastewater in anaerobic sequencing batch reactors

    Energy Technology Data Exchange (ETDEWEB)

    Masse, D. I.; Masse, L. [Agriculture and Agri-Food Canada, Lennoxville, PQ (Canada)

    2000-09-01

    Slaughterhouse waste water was treated in anaerobic sequencing batch reactors operated at 30 degrees C. Two of the batch reactors were seeded with anaerobic granular sludge from a milk processing plant reactor; two others received anaerobic non-granulated sludge from a municipal waste water treatment plant. Influent total chemical oxygen demand was reduced by 90 to 96 per cent at organic loading rates ranging from 2.07 kg to 4.93 kg per cubic meter. Reactors seeded with municipal sludge performed slightly better than those containing sludge from the milk processing plant. The difference was particularly noticeable during start-up, but the differences between the two sludges were reduced with time. The reactors produced a biogas containing 75 per cent methane. About 90.5 per cent of the chemical oxygen demand removed was methanized; volatile suspended solids accumulation was determined at 0.068 kg per kg of chemical oxygen demand removed. The high degree of methanization suggests that most of the soluble and suspended organic material in slaughterhouse waste water was degraded during the treatment in the anaerobic sequencing batch reactors. 30 refs., 1 tab., 6 figs.

  6. Modeling of oxide reduction in repeated-batch pyroprocessing

    International Nuclear Information System (INIS)

    Lee, Hyo Jik; Im, Hun Suk; Park, Geun Il

    2016-01-01

    Highlights: • Pyroprocessing is a complicated batch-type operation. • Discrete event system modeling was used to create an integrated operation model. • Simulation showed that dynamic material flow could be accomplished. • The dynamic material flow helps us understand the process operation. • We showed that complex material flow could be simulated in terms of mass balance. - Abstract: Pyroprocessing is a complicated batch-type operation, involving a highly complex material flow logic with a huge number of unit processes. Discrete event system modeling was used to create an integrated operation model for which simulation showed that dynamic material flow could be accomplished to provide considerable insight into the process operation. In the model simulation, the amount of material transported upstream and downstream in the process satisfies a mass balance equation while considering the hold-up incurred by every batch operation. This study also simulated, in detail, an oxide reduction group process embracing electrolytic reduction, cathode processing, and salt purification. Based on the default operation scenario, it showed that complex material flows could be precisely simulated in terms of the mass balance. Specifically, the amount of high-heat elements remaining in the molten salt bath is analyzed to evaluate the operation scenario.
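
    As a toy sketch of the per-batch bookkeeping described here (with invented feed rates and hold-up fractions, not the paper's model), the mass balance can be checked batch by batch:

        # Batch-wise material flow with per-batch equipment hold-up; the
        # closing assert verifies that the mass balance holds. Names and
        # numbers are invented for illustration.
        def run_batches(feed_kg_per_batch, n_batches, holdup_fraction):
            processed, held_up = 0.0, 0.0
            for _ in range(n_batches):
                batch_in = feed_kg_per_batch + held_up
                held_up = batch_in * holdup_fraction   # stays in the equipment
                processed += batch_in - held_up        # moves downstream
            return processed, held_up

        out, holdup = run_batches(feed_kg_per_batch=10.0, n_batches=5,
                                  holdup_fraction=0.02)
        total_in = 10.0 * 5
        assert abs(total_in - (out + holdup)) < 1e-9   # mass balance closes
        print(f"downstream: {out:.2f} kg, residual hold-up: {holdup:.2f} kg")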

  7. Automation of gamwave batch irradiator in Natal, South Africa

    International Nuclear Information System (INIS)

    Basson, J.K.; Basson, R.A.; Botha, J.

    1995-01-01

    High Energy Processing (HEPRO) has operated a Nordion JS 8200 Batch Irradiator for several years at Gamwave in Durban, South Africa. Product is loaded into aluminium totes and manually transported on trolleys into the irradiation chamber. Unirradiated totes are then exchanged with all the irradiated totes in the product pass mechanism, after which the source is raised and the batch irradiation process is started. Due to the inefficient cobalt utilization experienced in this type of plant, it was decided to upgrade and automate the facility. This was done in what we believe is a simple and unique solution to the problem facing the future of such batch facilities. The design concept used for the Gamwave irradiator was based on irradiating product in cartons or bags of variable dimensions as per customer requirements. The intention was to convey product automatically in and out of the irradiation chamber, eliminating the product change-over downtime and thereby increasing source utilization. Minor extensions were carried out to the bioshield with the existing irradiator in full operation, awaiting installation of the new source pass mechanism and conveyor system. Total plant shutdown for conversion to automation, including source reload and safety checks, was estimated to take ten days to fit the equipment. (author)

  8. Fault Diagnosis of Batch Reactor Using Machine Learning Methods

    Directory of Open Access Journals (Sweden)

    Sujatha Subramanian

    2014-01-01

    Fault diagnosis of a batch reactor enables early detection of faults and minimizes the risk of thermal runaway. It provides superior performance and helps to improve safety and consistency, and has become increasingly important. In this paper, a support vector machine (SVM) is used to estimate the heat release (Qr) of the batch reactor under both normal and faulty conditions. The signature of the residual, which is obtained from the difference between nominal and estimated faulty Qr values, characterizes the different natures of faults occurring in the batch reactor. Appropriate statistical and geometric features are extracted from the residual signature, and the total number of features is reduced using an SVM attribute selection filter and principal component analysis (PCA). Artificial neural network (ANN) classifiers such as the multilayer perceptron (MLP), radial basis function (RBF), and Bayes net are used to classify the different types of faults from the reduced features. The comparative study shows that the proposed method, using a limited number of features extracted from a single estimated parameter (Qr), is more efficient and faster at diagnosing the typical faults.

  9. Impact of Sterile Compounding Batch Frequency on Pharmaceutical Waste.

    Science.gov (United States)

    Abbasi, Ghalib; Gay, Evan

    2017-01-01

    Purpose: To measure the impact of increasing sterile compounding batch frequency on pharmaceutical waste as it relates to cost and quantity. Methods: Pharmaceutical IV waste at a tertiary care hospital was observed and recorded for 7 days. The batching frequency of compounded sterile products (CSPs) was then increased from twice daily to 4 times daily. After a washout period, pharmaceutical IV waste was recorded for another 7 days. The quantity of units wasted and the cost were compared between both phases to determine the impact that batching frequency has on IV waste, specifically among high- and low-cost drugs. Results: Patient days increased from 2,459 during phase 1 to 2,617 during phase 2. The total number of CSPs wasted decreased from 3.6 to 2.7 doses per 100 patient days. Overall cost was reduced from $4,585.36 in phase 1 to $4,453.88 in phase 2. The value of wasted high-cost drugs per 100 patient days increased from $146 in phase 1 to $149 in phase 2 (p > .05). The value of wasted low-cost drugs per 100 patient days decreased from $41 in phase 1 to $21 in phase 2 (p < .05). Conclusion: Increasing batching frequency reduced IV waste quantity and cost. The highest impact of the intervention was observed among low-cost CSPs.
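
    The "doses per 100 patient days" metric is a simple normalization; a one-line sketch (the dose count below is invented to reproduce the phase 1 rate):

        # Waste normalized to patient activity; 88 wasted doses over 2,459
        # patient days gives ~3.6 per 100 patient days (dose count invented).
        def per_100_patient_days(doses_wasted, patient_days):
            return doses_wasted / patient_days * 100

        print(f"{per_100_patient_days(88, 2459):.1f}")  # -> 3.6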

  10. Analyzing data flows of WLCG jobs at batch job level

    Science.gov (United States)

    Kuehn, Eileen; Fischer, Max; Giffels, Manuel; Jung, Christopher; Petzold, Andreas

    2015-05-01

    With the introduction of federated data access to the workflows of WLCG, it is becoming increasingly important for data centers to understand specific data flows regarding storage element accesses, firewall configurations, as well as the scheduling of batch jobs themselves. As existing batch system monitoring and related system monitoring tools do not support measurements at batch job level, a new tool has been developed and put into operation at the GridKa Tier 1 center for monitoring continuous data streams and characteristics of WLCG jobs and pilots. Long term measurements and data collection are in progress. These measurements already have been proven to be useful analyzing misbehaviors and various issues. Therefore we aim for an automated, realtime approach for anomaly detection. As a requirement, prototypes for standard workflows have to be examined. Based on measurements of several months, different features of HEP jobs are evaluated regarding their effectiveness for data mining approaches to identify these common workflows. The paper will introduce the actual measurement approach and statistics as well as the general concept and first results classifying different HEP job workflows derived from the measurements at GridKa.

  11. Three-batch reloading scheme for IRIS reactor extended cycles

    International Nuclear Information System (INIS)

    Jecmenica, R.; Pevec, D.; Grgic, D.

    2004-01-01

    To fully exploit the IRIS reactor's optimized maintenance, and at the same time improve fuel utilization, a core design enabling a 4-year operating cycle together with a three-batch reloading scheme is desirable. However, this requires not only an increased allowed burnup but also the use of uranium oxide fuel enriched beyond 5%. This paper considers a three-batch reloading scheme for a 4-year operating cycle under the assumptions of increased discharge burnup and fuel enrichment beyond 5%. A calculational model of the IRIS reactor core has been developed, based on the FER FA2D code for group constants generation and NRC's PARCS nodal code for global core analysis. Studies have been performed resulting in a preliminary design of a three-batch core configuration for the first cycle. It must be emphasized that this study is outside the current IRIS licensing efforts, which rely on the present fuel technology (enrichment below 5%), but it is of long-term interest for potential future IRIS design upgrades. (author)

  12. Some performance measures for vacation models with a batch Markovian arrival process

    Directory of Open Access Journals (Sweden)

    Sadrac K. Matendo

    1994-01-01

    We consider a single-server, infinite-capacity queueing system where the arrival process is a batch Markovian arrival process (BMAP). Particular BMAPs are the batch Poisson arrival process, the Markovian arrival process (MAP), many batch arrival processes with correlated interarrival times and batch sizes, and superpositions of these processes. We note that the MAP includes phase-type (PH) renewal processes and non-renewal processes such as the Markov modulated Poisson process (MMPP).
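
    For orientation, a BMAP is conventionally specified by a sequence of rate matrices; this is the textbook parameterization, not notation drawn from the paper:

        % Standard BMAP parameterization: D_0 holds rates of transitions
        % without arrivals, D_k (k >= 1) rates of transitions that trigger a
        % batch arrival of size k; their sum D generates the phase process.
        \[
          D = \sum_{k=0}^{\infty} D_k .
        \]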

  13. From batch to continuous extractive distillation using thermodynamic insight: class 1.0-2 case B

    OpenAIRE

    Shen, Weifeng; Benyounes, Hassiba; Gerbaud, Vincent

    2011-01-01

    A systematic feasibility analysis is presented for the separation of azeotropic mixtures by batch and continuous extractive distillation. Based on batch feasibility knowledge, batch and continuous separation feasibility is studied as a function of reflux ratio and entrainer flow rate for the ternary system chloroform-vinyl acetate-butyl acetate, which belongs to class 1.0-2, i.e. separation of a maximum-boiling azeotrope using a heavy entrainer. How information on feasibility of batch mode could be e...

  14. Potential of semiautomated, synoptic geologic studies for characterization of hazardous waste sites

    International Nuclear Information System (INIS)

    Foley, M.G.; Beaver, D.E.; Glennon, M.A.; Eliason, J.R.

    1988-01-01

    Siting studies for licensing hazardous facilities require three-dimensional characterization of site geology including lithology, structure, and tectonics. The scope of these studies depends on the type of hazardous facility and its associated regulations. This scope can vary from a pro forma literature review to an extensive, multiyear research effort. Further, the regulatory environment often requires that the credibility of such studies be established in administrative and litigative proceedings, rather than solely by technical peer review. Pacific Northwest Laboratory (PNL) has developed a technology called remote geologic analysis (RGA). This technology provides reproducible photogeologic maps, determinations of three-dimensional faults and fracture sets expressed as erosional lineaments or planar topographic features, planar feature identification in seismic hypocenter data, and crustal-stress/tectonic analyses. Results from the RGA establish a foundation for interpretations that are defensible in licensing proceedings.

  15. Comparative study of trapping parameters of LiF(TLD-100) from different production batches

    Energy Technology Data Exchange (ETDEWEB)

    Bos, A.J.J.; Piters, T.M.; Vries, W. de; Hoogenboom, J.E. (Delft Univ. of Technology (Netherlands). Interfaculty Reactor Institute)

    1990-01-01

    Computerised glow curve analysis has been used to determine the trapping parameters of the main peaks of the thermoluminescent (TL) material LiF(TLD-100). The TL material (solid state chips) originated from six different production batches with at least 19 chips per batch. The maxima of glow peaks 2 to 5 are found at the same temperature within very small limits. The activation energy and frequency factor of the main glow peak (peak 5) of TLD-100 originating from two batches differ significantly from those of the other four investigated batches. Nevertheless, the sensitivity of glow peak 5 is more or less the same for all batches. The trapping parameters of glow peaks 2 to 4 of TLD-100 vary little from batch to batch. The measured half-life of peak 2 differed strongly from batch to batch. For all investigated peaks no correlation has been found between glow peak sensitivity and trapping parameters. The results of this study suggest that both defect concentration and nature of the trapping centres vary from batch to batch. It would appear that as a consequence of selection by the manufacturer, the differences between the batches in terms of total light output are small. (author).

  16. An order batching algorithm for wave picking in a parallel-aisle warehouse

    NARCIS (Netherlands)

    Gademann, A.J.R.M.; Berg, van den J.P.; Hoff, van der H.H.

    2001-01-01

    In this paper we address the problem of batching orders in a parallel-aisle warehouse, with the objective of minimizing the maximum lead time of any of the batches. This is a typical objective for a wave picking operation. Many heuristics have been suggested to solve order batching problems. We
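
    A generic greedy sketch of this kind of objective (not the paper's algorithm): sort orders by pick time and assign each to the least-loaded batch that still has capacity, keeping the maximum batch workload low. Capacity and times are invented:

        # Greedy order batching: longest orders first, each placed in the
        # least-loaded batch with spare capacity; batch load is a proxy for
        # lead time. This is an illustrative heuristic only.
        def batch_orders(pick_times, capacity):
            batches = []  # each batch: [total_time, n_orders]
            for t in sorted(pick_times, reverse=True):
                feasible = [b for b in batches if b[1] < capacity]
                if feasible:
                    b = min(feasible, key=lambda b: b[0])
                    b[0] += t
                    b[1] += 1
                else:
                    batches.append([t, 1])
            return batches

        print(batch_orders([4, 7, 2, 9, 5, 3], capacity=2))  # per-batch loads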

  17. Look-ahead strategies for controlling batch operations in industry - overview, comparison and exploration

    NARCIS (Netherlands)

    Zee, D.J. van der; Harten, A. van; Schuur, P.C.; Joines, JA; Barton, RR; Kang, K; Fishwick, PA

    2000-01-01

    Batching jobs in a manufacturing system is a very common policy in most industries. The main reasons for batching are avoidance of set ups and/or facilitation of material handling. Good examples of batch-wise production systems are ovens found in aircraft industry and in semiconductor manufacturing.

  18. Look-ahead strategies for controlling batch operations in industry - An overview

    NARCIS (Netherlands)

    Zee, Durk-Jouke van der; Chick, SE; Sanchez, PJ; Ferrin, D; Morrice, DJ

    2003-01-01

    Batching jobs in a manufacturing system is a very common policy in most industries. Main reasons for batching are avoidance of set ups and/or facilitation of material handling. Examples of batch-wise production systems are ovens found in aircraft industry and in semiconductor manufacturing. Starting

  19. 40 CFR 63.487 - Batch front-end process vents-reference control technology.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, vol. 9, revised as of 2010-07-01, ENVIRONMENTAL PROTECTION AGENCY: § 63.487 Batch front-end process vents - reference control technology. (a) Batch front-end process vents...

  20. Monitoring a PVC batch process with multivariate statistical process control charts

    NARCIS (Netherlands)

    Tates, A. A.; Louwerse, D. J.; Smilde, A. K.; Koot, G. L. M.; Berndt, H.

    1999-01-01

    Multivariate statistical process control charts (MSPC charts) are developed for the industrial batch production process of poly(vinyl chloride) (PVC). With these MSPC charts different types of abnormal batch behavior were detected on-line. With batch contribution plots, the probable causes of these
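
    A generic sketch of how such MSPC charts are typically built (PCA on unfolded good-batch data, then Hotelling's T² and the squared prediction error for new batches); data and limits below are invented, and this is not the paper's exact procedure:

        # Multivariate SPC sketch: fit PCA on unfolded good-batch
        # trajectories, then compute Hotelling's T^2 and SPE/Q statistics
        # for a batch to be monitored. Data are synthetic.
        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(0)
        X_good = rng.normal(size=(50, 20))   # 50 batches x 20 unfolded variables
        pca = PCA(n_components=3).fit(X_good)

        def t2_and_spe(x):
            scores = pca.transform(x[None, :])[0]
            t2 = float(np.sum(scores**2 / pca.explained_variance_))
            residual = x - pca.inverse_transform(scores[None, :])[0]
            return t2, float(residual @ residual)

        t2, spe = t2_and_spe(X_good[0])
        print(f"T2 = {t2:.2f}, SPE = {spe:.2f}")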

  1. Look-ahead strategies for controlling batch operations in industry : basic insights in rule construction

    NARCIS (Netherlands)

    van der Zee, D.J.; Sullivan, W.A.; Ahmad, M.M.; Fichtner, D.; Sauer, W.; Weigert, G.; Zerna, T.

    2002-01-01

    Batching jobs in a manufacturing system is a very common policy in most industries. Main reasons for batching are avoidance of set ups and/or facilitation of material handling. Examples of batch-wise production systems are ovens found in aircraft industry and in semiconductor manufacturing. Starting

  2. 40 CFR 63.1322 - Batch process vents-reference control technology.

    Science.gov (United States)

    2010-07-01

    Title 40, Protection of Environment, vol. 11, revised as of 2010-07-01, ENVIRONMENTAL PROTECTION AGENCY (CONTINUED), AIR PROGRAMS: § 63.1322 Batch process vents - reference control technology. (a) Batch process vents. The owner or operator of a...

  3. Ethanol production from Sorghum bicolor using both separate and simultaneous saccharification and fermentation in batch and fed batch systems

    DEFF Research Database (Denmark)

    Mehmood, Sajid; Gulfraz, M.; Rana, N. F.

    2009-01-01

    The objective of this work was to find the best combination of experimental conditions during pre-treatment, enzymatic saccharification, detoxification of inhibitors and fermentation of Sorghum bicolor straw for ethanol production. The optimization of pre-treatment using different...... were used in order to increase the monomeric sugars during enzymatic hydrolysis, and it was observed that the addition of these surfactants contributed significantly to cellulosic conversion but had no effect on hemicellulosic hydrolysis. Fermentability of the hydrolyzate was tested using...... Saccharomyces cerevisiae Ethanol Red (TM), and it was observed that simultaneous saccharification and fermentation (SSF), in both batch and fed-batch mode, resulted in better ethanol yields than separate hydrolysis and fermentation (SHF). Detoxification of furan during SHF facilitated reduction...

  4. Optimization of the Production of Polygalacturonase from Aspergillus kawachii Cloned in Saccharomyces cerevisiae in Batch and Fed-Batch Cultures

    Directory of Open Access Journals (Sweden)

    Diego Jorge Baruque

    2011-01-01

    Polygalacturonases (PG; EC 3.2.1.15) catalyze the hydrolysis of pectin and/or pectic acid and are useful for industrial applications such as juice clarification and pectin extraction. Growth and heterologous expression of a recombinant Saccharomyces cerevisiae strain that expresses an acidic PG from Aspergillus kawachii have been studied in batch and fed-batch cultures. Kinetic and stoichiometric parameters of the recombinant yeast were determined in batch cultures in a synthetic medium. In these cultures, the total biomass concentration, protein concentration, and enzyme activity achieved were 2.2 g/L, 10 mg/L, and 3 U/mL, respectively, giving a productivity of 0.06 U/(mL·h). In fed-batch cultures, various strategies for galactose feeding were used: (i) after a glucose growth phase, the addition of a single pulse of galactose, which gave a productivity of 0.19 U/(mL·h); (ii) after a glucose growth phase, a double pulse of galactose to the same final concentration, resulting in a productivity of 0.21 U/(mL·h); (iii) simultaneous feeding of glucose and galactose, yielding a productivity of 1.32 U/(mL·h). Based on these results, the simultaneous feeding of glucose and galactose was by far the most suitable strategy for the production of this enzyme. Moreover, some biochemical characteristics of the recombinant enzyme were determined, such as a molecular mass of ~60 kDa, an isoelectric point of 3.7, and the ability to hydrolyze polygalacturonic acid at pH 2.5.

  5. Effect of Initial Conditions on Reproducibility of Scientific Research

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok

    2014-01-01

    Background: It is estimated that about half of currently published research cannot be reproduced. Many explanations have been offered for failures to reproduce scientific research findings, ranging from fraud to issues related to the design, conduct, analysis, or publication of scientific research. We also postulate a sensitive dependency on initial conditions, by which small changes can result in large differences in research findings when reproduction is attempted at a later time. Methods: We employed a simple logistic regression equation to model the effect of covariates on the initial study findings. We then fed the output of the logistic equation into a logistic map function to model the stability of the results in repeated experiments over time. We illustrate the approach by modeling the effects of different factors on the choice of correct treatment. Results: We found that the reproducibility of the study findings depended both on the initial values of all independent variables and on the rate of change in the baseline conditions, the latter being more important. When the rate of change in the baseline conditions between experiments lies between about 3.5 and 4, no research findings can be reproduced. However, when the rate of change between experiments is ≤2.5, the results become highly predictable from one experiment to the next. Conclusions: Many results cannot be reproduced because of changes in the initial conditions between experiments. Better control of the baseline conditions between experiments may help improve the reproducibility of scientific findings. PMID:25132705
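
    The sensitivity argument can be illustrated with a minimal logistic-map sketch (illustrative only; the paper's covariate model and parameter values are not reproduced here). The rate parameter r plays the role of the rate of change in baseline conditions: trajectories started 0.001 apart stay together for r ≤ 2.5 but end up unrelated for r between 3.5 and 4.

        import numpy as np

        def logistic_map(r, x0, n=50):
            """Iterate x_{t+1} = r * x_t * (1 - x_t) for n steps."""
            x = x0
            for _ in range(n):
                x = r * x * (1 - x)
            return x

        # Stable regime (r <= 2.5): nearby initial conditions converge (~0.6).
        print(logistic_map(2.5, 0.300), logistic_map(2.5, 0.301))
        # Chaotic regime (3.5 < r <= 4): the same small shift diverges.
        print(logistic_map(3.9, 0.300), logistic_map(3.9, 0.301))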

  6. Kinetic studies on batch cultivation of Trichoderma reesei and application to enhance cellulase production by fed-batch fermentation.

    Science.gov (United States)

    Ma, Lijuan; Li, Chen; Yang, Zhenhua; Jia, Wendi; Zhang, Dongyuan; Chen, Shulin

    2013-07-20

    Reducing the production cost of cellulase, the key enzyme for hydrolyzing cellulose to fermentable sugars, remains a major challenge for biofuel production. Because of the complexity of cellulase production, kinetic modeling and mass balance calculations can be used as effective tools for process design and optimization. In this study, kinetic models for cell growth, substrate consumption and cellulase production in batch fermentation were developed and then applied to fed-batch fermentation to enhance cellulase production. The inhibitory effect of the substrate was considered, and a modified Luedeking-Piret model was developed for cellulase production and substrate consumption according to the growth characteristics of Trichoderma reesei. The model predictions fit the experimental data well. Simulation results showed that a higher initial substrate concentration led to a decrease in the cellulase production rate. Mass balance and kinetic simulation results were applied to determine the feeding strategy. Cellulase production and its corresponding productivity increased by 82.13% after employing the proper feeding strategy in fed-batch fermentation. This method, combining kinetic modeling and mass balance calculations, can not only improve the cellulase fermentation process but also help to better understand it. The model development can also provide insight into other similar fermentation processes. Copyright © 2013 Elsevier B.V. All rights reserved.
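
    As a sketch of the modeling approach (with illustrative parameter values, not the fitted constants from the study), a logistic growth law can be coupled to a Luedeking-Piret product term and a yield-plus-maintenance substrate balance and integrated with SciPy:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Illustrative parameters, NOT the fitted values from the study.
        mu_max, X_max = 0.08, 15.0    # 1/h, g/L (logistic growth)
        alpha, beta = 2.0, 0.01       # growth- / non-growth-associated production
        Y_xs, m_s = 0.4, 0.005        # biomass yield, maintenance coefficient

        def rhs(t, y):
            X, P, S = y
            if S <= 0:                        # substrate exhausted
                return [0.0, 0.0, 0.0]
            dX = mu_max * X * (1 - X / X_max)
            dP = alpha * dX + beta * X        # Luedeking-Piret product term
            dS = -dX / Y_xs - m_s * X         # growth + maintenance demand
            return [dX, dP, dS]

        sol = solve_ivp(rhs, (0, 120), [0.5, 0.0, 40.0])
        print(sol.y[:, -1])   # final biomass, cellulase (proxy), substrate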

  7. Fed batch fermentation scale up in the production of recombinant streptokinase

    Directory of Open Access Journals (Sweden)

    Salvador Losada-Nerey

    2017-01-01

    Due to the high international demand for the recombinant streptokinase (Skr) produced at the National Center for Bioproducts (BioCen), it was necessary to increase the production capacity of the drug, since the current production volume does not cover the demand. The recombinant streptokinase fermentation process was scaled up in fed-batch culture from bench scale to a 300 L fermenter. The scale-up criteria were the intensive variables of the process, the ratio of fermentation medium to inoculum volume, the volumetric oxygen transfer coefficient and the ratio of air flow to liquid volume, all of which were kept constant. With this scale-up procedure it was possible to reproduce the results obtained at bench scale and to double the biomass production volume with the same equipment, fulfilling all the quality requirements of the product and covering the current market demand. Techno-economic indicators demonstrated the feasibility of this option.

  8. Reproducible and controllable induction voltage adder for scaled beam experiments

    Energy Technology Data Exchange (ETDEWEB)

    Sakai, Yasuo; Nakajima, Mitsuo; Horioka, Kazuhiko [Department of Energy Sciences, Tokyo Institute of Technology, 4259 Nagatsuta, Midori-ku, Yokohama 226-8502 (Japan)

    2016-08-15

    A reproducible and controllable induction adder was developed using solid-state switching devices and Finemet cores for scaled beam compression experiments. A gate controlled MOSFET circuit was developed for the controllable voltage driver. The MOSFET circuit drove the induction adder at low magnetization levels of the cores which enabled us to form reproducible modulation voltages with jitter less than 0.3 ns. Preliminary beam compression experiments indicated that the induction adder can improve the reproducibility of modulation voltages and advance the beam physics experiments.

  9. A study on the reproducibility and spatial uniformity of N-isopropylacrylamide polymer gel dosimetry using a commercial 10X fast optical-computed tomography scanner

    International Nuclear Information System (INIS)

    Chang, Y J; Lin, J Q; Hsieh, B T; Chen, C H

    2013-01-01

    This study investigated the reproducibility and spatial uniformity of N-isopropylacrylamide (NIPAM) polymer gel as well as the reproducibility of a NIPAM polymer gel dosimeter. A commercial 10X fast optical computed tomography scanner (OCTOPUS-10X, MGS Research, Inc., Madison, CT, USA) was used as the readout tool of the NIPAM polymer gel dosimeter. A cylindrical NIPAM gel phantom measuring 10 cm (diameter) by 10 cm (height) by 3 mm (thickness) was irradiated by the four-field box treatment with a field size of 3 cm × 3 cm. The dose profiles were found to be consistent at the depths of 2.0 cm to 5.0 cm for two independent gel phantom batches, and the average uncertainty was less than 2%. The gamma pass rates were calculated to be between 94% and 95% at depths of 40 mm for two independent gel phantom batches using 4% dose difference and 4 mm distance-to-agreement criterion. The NIPAM polymer gel dosimeter was highly reproducible and spatially uniform. The results highlighted the potential of the NIPAM polymer gel dosimeter in radiotherapy.
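
    The 4% dose-difference / 4 mm distance-to-agreement comparison reported above is a standard gamma analysis. A brute-force sketch for 2D dose planes (assuming uniform pixel spacing and a global dose-difference normalization; not the authors' code):

        import numpy as np

        def gamma_pass_rate(ref, evl, spacing_mm, dd=0.04, dta_mm=4.0, search_mm=8.0):
            """Brute-force global 2D gamma: fraction of points with gamma <= 1.
            Compares squared gamma to 1, which is equivalent to gamma <= 1."""
            r = int(round(search_mm / spacing_mm))
            dmax = ref.max()
            ny, nx = ref.shape
            passed = 0
            for i in range(ny):
                for j in range(nx):
                    best = np.inf
                    for di in range(-r, r + 1):
                        for dj in range(-r, r + 1):
                            ii, jj = i + di, j + dj
                            if 0 <= ii < ny and 0 <= jj < nx:
                                dist2 = (di * di + dj * dj) * spacing_mm ** 2
                                dose2 = ((evl[ii, jj] - ref[i, j]) / (dd * dmax)) ** 2
                                best = min(best, dose2 + dist2 / dta_mm ** 2)
                    passed += int(best <= 1.0)
            return passed / (ny * nx)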

  10. Reliability of a semi-automated 3D-CT measuring method for tunnel diameters after anterior cruciate ligament reconstruction: A comparison between soft-tissue single-bundle allograft vs. autograft.

    Science.gov (United States)

    Robbrecht, Cedric; Claes, Steven; Cromheecke, Michiel; Mahieu, Peter; Kakavelakis, Kyriakos; Victor, Jan; Bellemans, Johan; Verdonk, Peter

    2014-10-01

    Post-operative widening of tibial and/or femoral bone tunnels is a common observation after ACL reconstruction, especially with soft-tissue grafts. There are no studies comparing tunnel widening in hamstring autografts versus tibialis anterior allografts. The goal of this study was to observe the difference in tunnel widening after the use of allograft vs. autograft for ACL reconstruction, by measuring it with a novel 3D computed tomography based method. Thirty-five ACL-deficient subjects were included, underwent anatomic single-bundle ACL reconstruction and were evaluated at one year after surgery with the use of 3D CT imaging. Three independent observers semi-automatically delineated femoral and tibial tunnel outlines, after which a best-fit cylinder was derived and the tunnel diameter was determined. Finally, intra- and inter-observer reliability of this novel measurement protocol was defined. In femoral tunnels, the intra-observer ICC was 0.973 (95% CI: 0.922-0.991) and the inter-observer ICC was 0.992 (95% CI: 0.982-0.996). In tibial tunnels, the intra-observer ICC was 0.955 (95% CI: 0.875-0.985). The combined inter-observer ICC was 0.970 (95% CI: 0.917-0.987). Tunnel widening was significantly higher in allografts compared to autografts, in the tibial tunnels (p=0.013) as well as in the femoral tunnels (p=0.007). To our knowledge, this novel, semi-automated 3D computed tomography image processing method has been shown to yield highly reproducible results for the measurement of bone tunnel diameter and area. This series showed a significantly higher amount of tunnel widening in the allograft group at one-year follow-up. Level II, Prospective comparative study. Copyright © 2014 Elsevier B.V. All rights reserved.
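
    The intra- and inter-observer ICCs quoted here are typically two-way random-effects, absolute-agreement coefficients. A minimal sketch of ICC(2,1) from the Shrout-Fleiss mean squares (a generic formula, not the study's statistical code; the diameter values below are hypothetical):

        import numpy as np

        def icc_2_1(Y):
            """ICC(2,1): two-way random effects, absolute agreement, single rater.
            Y is an (n subjects x k raters) matrix of measurements."""
            n, k = Y.shape
            grand = Y.mean()
            ss_rows = k * ((Y.mean(axis=1) - grand) ** 2).sum()
            ss_cols = n * ((Y.mean(axis=0) - grand) ** 2).sum()
            ss_err = ((Y - grand) ** 2).sum() - ss_rows - ss_cols
            msr = ss_rows / (n - 1)                  # between-subjects MS
            msc = ss_cols / (k - 1)                  # between-raters MS
            mse = ss_err / ((n - 1) * (k - 1))       # residual MS
            return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

        diam = np.array([[9.1, 9.3, 9.2],            # hypothetical diameters (mm)
                         [10.4, 10.6, 10.5],
                         [8.7, 8.8, 8.6],
                         [11.2, 11.0, 11.3]])        # 4 tunnels x 3 observers
        print(icc_2_1(diam))                         # close to 1 -> high agreement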

  11. Moving from proprietary to open-source solutions for academic research in remote sensing: Example with semi-automated land cover mapping

    OpenAIRE

    Grippa, Taïs

    2017-01-01

    GRASS GIS has recently seen significant improvements for object-based image analysis. At ULB, the choice was made to combine GRASS GIS and Python in a semi-automated processing chain for land-cover mapping. The latter proved its ability to be quickly customized to match the requirements of different projects. In order to promote OSGEO software, we decided to make it freely available, allowing anyone interested to review, reuse and/or enhance it for further studies.

  12. Reproducing ten years of road ageing - Accelerated carbonation and leaching of EAF steel slag

    Energy Technology Data Exchange (ETDEWEB)

    Suer, Pascal, E-mail: pascal.suer@swedgeo.se [Swedish Geotechnical Institute, Linkoeping (Sweden); Lindqvist, Jan-Erik [Swedish Cement and Concrete Research Institute, Boras (Sweden); Arm, Maria; Frogner-Kockum, Paul [Swedish Geotechnical Institute, Linkoeping (Sweden)

    2009-09-01

    Reuse of industrial aggregates is still hindered by concern about their long-term properties. This paper proposes a laboratory method for accelerated ageing of steel slag, to predict environmental and technical properties starting from fresh slag. Ageing processes in a 10-year-old asphalt road with steel slag of electric arc furnace (EAF) type in the subbase were identified by scanning electron microscopy (SEM) and leaching tests. Samples from the road centre and the pavement edge were compared with each other and with samples of fresh slag. It was found that slag from the pavement edge showed traces of carbonation and leaching processes, whereas the road centre material was nearly identical to fresh slag, in spite of an accessible particle structure. Batches of moistened road centre material exposed to oxygen, nitrogen or carbon dioxide (CO₂) were used for accelerated ageing. Time (7-14 days), temperature (20-40 °C) and initial slag moisture content (8-20%) were varied to achieve the carbonation (decrease in pH) and leaching observed in the pavement edge material. After ageing, water was added to assess the leaching of metals and macroelements. 12% moisture, CO₂ and seven days at 40 °C gave the lowest pH value. This also reproduced the observed ageing effect for Ca, Cu, Ba, Fe, Mn, Pb (decreased leaching) and for V, Si and Al (increased leaching). However, ageing effects on SO₄, DOC and Cr were not reproduced.

  13. Reproducibility of corneal, macular and retinal nerve fiber layer ...

    African Journals Online (AJOL)

    side the limits of a consulting room. Reproducibility of ... examination, intraocular pressure and corneal thickness ... All OCT measurements were taken between 2 and 5 pm ... CAS-OCT, Slit-lamp OCT, RTVue-100) have shown ICC.

  14. Beyond Bundles - Reproducible Software Environments with GNU Guix

    CERN Multimedia

    CERN. Geneva; Wurmus, Ricardo

    2018-01-01

    Building reproducible data analysis pipelines and numerical experiments is a key challenge for reproducible science, in which tools to reproduce software environments play a critical role. The advent of “container-based” deployment tools such as Docker and Singularity has made it easier to replicate software environments. These tools are very much about bundling the bits of software binaries in a convenient way, not so much about describing how software is composed. Science is not just about replicating, though—it demands the ability to inspect and to experiment. In this talk we will present GNU Guix, a software management toolkit. Guix departs from container-based solutions in that it enables declarative composition of software environments. It is comparable to “package managers” like apt or yum, but with a significant difference: Guix provides accurate provenance tracking of build artifacts, and bit-reproducible software. We will illustrate the many ways in which Guix can improve how software en...

  15. The reproducibility of random amplified polymorphic DNA (RAPD ...

    African Journals Online (AJOL)

    RAPD) profiles of Streptococcus thermophilus strains were generated using the polymerase chain reaction (PCR). Several factors can cause the amplification of false and non-reproducible bands in RAPD profiles. We tested three primers, OPI-02 MOD, ...

  16. Influence of coal batch preparation on the quality of metallurgical coke

    Directory of Open Access Journals (Sweden)

    Катерина Олегівна Шмельцер

    2015-10-01

    To study the influence of coal batch properties on coke strength, we considered the quality of the coke produced at the plant in Krivoy Rog from 2008 to 2012. Several factors contribute to the decline in coke quality: the large number of coal suppliers; imprecise selection of the optimal degree of batch crushing, so that the batch density and the content of the lean class (<0.5 mm) are not optimal; poor blending of the batch after crushing; increased moisture and ash content of the coking batch; and extreme fluctuations in the coal and batch characteristics. It was found that high moisture content of the coal batch, and its large fluctuations, has the most profound effect on the mechanical properties of coke. With a deteriorating resource base, the quality of coking batch preparation is important. To obtain a batch of proper quality, the following key aspects must be taken into account: the batch must be crushed to an optimal degree, which reduces the leaning components and increases the vitrinite content, improving sinterability and coking and hence coke quality; and the degree of mixing of the coking batch must reach 98-99% in all indices, since uneven distribution in the coking chamber worsens coke quality.

  17. Semi-automated measurement of anatomical structures using statistical and morphological priors

    Science.gov (United States)

    Ashton, Edward A.; Du, Tong

    2004-05-01

    Rapid, accurate and reproducible delineation and measurement of arbitrary anatomical structures in medical images is a widely held goal, with important applications in both clinical diagnostics and, perhaps more significantly, pharmaceutical trial evaluation. This process requires the ability first to localize a structure within the body, and then to find a best approximation of the structure's boundaries within a given scan. Structures that are tortuous and small in cross section, such as the hippocampus in the brain or the abdominal aorta, present a particular challenge. Their apparent shape and position can change significantly from slice to slice, and accurate prior shape models for such structures are often difficult to form. In this work, we have developed a system that makes use of both a user-defined shape model and a statistical maximum likelihood classifier to identify and measure structures of this sort in MRI and CT images. Experiments show that this system can reduce analysis time by 75% or more with respect to manual tracing with no loss of precision or accuracy.
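
    A minimum-cost path between user-seeded points gives the flavor of this approach. The sketch below uses scikit-image's route_through_array as a stand-in for the paper's localization scheme; the cost image, built from the deviation of intensity from an expected structure intensity mu (a stand-in for the statistical maximum-likelihood term), is an assumption for illustration:

        import numpy as np
        from skimage.graph import route_through_array

        def trace_structure(img, start, end, mu, sigma=10.0):
            """Trace a thin structure as the minimum-cost path between two seeds.
            Cost grows with deviation from the expected intensity mu."""
            cost = 1.0 + ((img.astype(float) - mu) / sigma) ** 2
            path, total = route_through_array(cost, start, end,
                                              fully_connected=True, geometric=True)
            return np.array(path), total

        # Toy image: a bright diagonal band (the "structure") on a dark background.
        img = np.zeros((64, 64)); idx = np.arange(64); img[idx, idx] = 100.0
        path, total = trace_structure(img, (0, 0), (63, 63), mu=100.0)
        print(path[:3], total)   # the path follows the bright diagonal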

  18. Systematic heterogenization for better reproducibility in animal experimentation.

    Science.gov (United States)

    Richter, S Helene

    2017-08-31

    The scientific literature is full of articles discussing poor reproducibility of findings from animal experiments as well as failures to translate results from preclinical animal studies to clinical trials in humans. Critics even go so far as to talk about a "reproducibility crisis" in the life sciences, a novel headword that increasingly finds its way into numerous high-impact journals. Viewed from a cynical perspective, Fett's law of the lab "Never replicate a successful experiment" has thus taken on a completely new meaning. So far, poor reproducibility and translational failures in animal experimentation have mostly been attributed to biased animal data, methodological pitfalls, current publication ethics and animal welfare constraints. More recently, the concept of standardization has also been identified as a potential source of these problems. By reducing within-experiment variation, rigorous standardization regimes limit the inference to the specific experimental conditions. In this way, however, individual phenotypic plasticity is largely neglected, resulting in statistically significant but possibly irrelevant findings that are not reproducible under slightly different conditions. By contrast, systematic heterogenization has been proposed as a concept to improve representativeness of study populations, contributing to improved external validity and hence improved reproducibility. While some first heterogenization studies are indeed very promising, it is still not clear how this approach can be transferred into practice in a logistically feasible and effective way. Thus, further research is needed to explore different heterogenization strategies as well as alternative routes toward better reproducibility in animal experimentation.

  19. Recommendation of ruthenium source for sludge batch flowsheet studies

    Energy Technology Data Exchange (ETDEWEB)

    Woodham, W. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-09-13

    Included herein is a preliminary analysis of previously generated data from sludge batch 7a, 7b, 8, and 9 simulant and real-waste testing, performed to recommend a form of ruthenium for future sludge batch simulant testing under the nitric-formic flowsheet. Focus is given to reactions in the Sludge Receipt and Adjustment Tank cycle, since this cycle historically produces the most changes in chemical composition during Chemical Process Cell processing. Data are presented and analyzed for several runs performed under the nitric-formic flowsheet, with consideration given to effects on the production of hydrogen gas and nitrous oxide gas, consumption of formate, conversion of nitrite to nitrate, and the removal and recovery of mercury during processing. Additionally, a brief discussion is given to the effect of ruthenium source selection under the nitric-glycolic flowsheet. An analysis of data generated from scaled demonstration testing, sludge batch 9 qualification testing, and antifoam degradation testing under the nitric-glycolic flowsheet is presented. Experimental parameters of interest under the nitric-glycolic flowsheet include N2O production, glycolate destruction, conversion of glycolate to formate and oxalate, and the conversion of nitrite to nitrate. To date, the number of real-waste experiments performed under the nitric-glycolic flowsheet is insufficient to provide a complete understanding of the effects of ruthenium source selection in simulant experiments with regard to fidelity to real-waste testing. Therefore, a determination of comparability between the two ruthenium sources under the nitric-glycolic flowsheet is made based on available data, in order to inform ruthenium source selection for future testing under that flowsheet.

  20. Batch biomethanation of banana trash and coir pith

    Energy Technology Data Exchange (ETDEWEB)

    Deivanai, K.; Bai, R.K. [Madurai Kamaraj Univ. (India)

    1995-08-01

    Anaerobic digestion of banana trash and coir pith was carried out for a period of one month by batch digestion. During biomethanation, the reduction of total and volatile solids was 25.3 and 39.6%, respectively, in banana trash, and 13.6 and 21.6% in coir pith. Biogas production of 9.22 L and 1.69 L per kg TS added, with average methane contents of 72 and 80%, was achieved from banana trash and coir pith, respectively. (author)

  1. Batched Triangular DLA for Very Small Matrices on GPUs

    KAUST Repository

    Charara, Ali

    2017-03-13

    In several scientific applications, like tensor contractions in deep learning computation or data compression in hierarchical low rank matrix approximation, the bulk of computation typically resides in performing thousands of independent dense linear algebra operations on very small matrix sizes (usually less than 100). Batched dense linear algebra kernels are becoming ubiquitous for such scientific computations. Within a single API call, these kernels are capable of simultaneously launching a large number of similar matrix computations, removing the expensive overhead of multiple API calls while increasing the utilization of the underlying hardware.
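
    The single-call batched pattern can be mimicked on the CPU with NumPy's stacked linear algebra, which broadcasts over leading dimensions much as batched GPU kernels amortize launch overhead over many small problems (a conceptual sketch, not the KAUST kernels themselves):

        import numpy as np

        rng = np.random.default_rng(0)
        batch, n = 10000, 16                    # thousands of tiny problems

        # One stacked allocation instead of 10000 separate API calls.
        A = rng.standard_normal((batch, n, n))
        A = np.tril(A) + n * np.eye(n)          # well-conditioned lower-triangular
        b = rng.standard_normal((batch, n, 1))

        # NumPy broadcasts over the leading (batch) dimension, mirroring the
        # single-call batched TRSM/GEMM kernels described above.
        x = np.linalg.solve(A, b)               # batched solve in one call
        y = A @ x                               # batched GEMM in one call
        print(np.abs(y - b).max())              # ~1e-13: all solves succeeded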

  2. Online and Batch Supervised Background Estimation via L1 Regression

    KAUST Repository

    Dutta, Aritra

    2017-11-23

    We propose a surprisingly simple model for supervised video background estimation. Our model is based on $\ell_1$ regression. As existing methods for $\ell_1$ regression do not scale to high-resolution videos, we propose several simple and scalable methods for solving the problem, including iteratively reweighted least squares, a homotopy method, and stochastic gradient descent. We show through extensive experiments that our model and methods match or outperform the state-of-the-art online and batch methods in virtually all quantitative and qualitative measures.
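
    Of the three solvers mentioned, iteratively reweighted least squares is the simplest to sketch. A generic IRLS loop for min_w ||y - Xw||_1 (not the authors' implementation; in the background-estimation setting, each pixel's temporal profile would supply one such regression):

        import numpy as np

        def l1_regression_irls(X, y, n_iter=50, eps=1e-6):
            """IRLS for min_w ||y - X w||_1 via weighted least squares."""
            w = np.linalg.lstsq(X, y, rcond=None)[0]       # L2 warm start
            for _ in range(n_iter):
                r = np.abs(y - X @ w)
                d = 1.0 / np.maximum(r, eps)               # weights ~ 1/|residual|
                Xd = X * d[:, None]
                w = np.linalg.solve(X.T @ Xd, Xd.T @ y)    # weighted normal equations
            return w

        # Toy check: the L1 fit is robust to a gross outlier in y.
        rng = np.random.default_rng(0)
        X = np.c_[np.ones(100), rng.standard_normal(100)]
        y = X @ np.array([1.0, 2.0]) + 0.01 * rng.standard_normal(100)
        y[0] += 50.0                                       # corrupt one observation
        print(l1_regression_irls(X, y))                    # close to [1, 2]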

  3. Online and Batch Supervised Background Estimation via L1 Regression

    KAUST Repository

    Dutta, Aritra; Richtarik, Peter

    2017-01-01

    We propose a surprisingly simple model for supervised video background estimation. Our model is based on $\ell_1$ regression. As existing methods for $\ell_1$ regression do not scale to high-resolution videos, we propose several simple and scalable methods for solving the problem, including iteratively reweighted least squares, a homotopy method, and stochastic gradient descent. We show through extensive experiments that our model and methods match or outperform the state-of-the-art online and batch methods in virtually all quantitative and qualitative measures.

  4. Simple multicomponent batch distillation procedure with a variable reflux policy

    Directory of Open Access Journals (Sweden)

    A. N. García

    2014-06-01

    This paper describes a shortcut procedure for batch distillation simulation with a variable reflux policy. The procedure starts from a shortcut method developed by Sundaram and Evans in 1993 and uses an iterative cycle to calculate the reflux ratio at each moment. The functional relationship between the concentrations at the bottoms and in the distillate is evaluated using the Fenske equation and is complemented with the equations proposed by Underwood and Gilliland. The results of this procedure are consistent with those obtained using a fast method widely validated in the relevant literature.
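
    The Fenske relation at the heart of the procedure links the key-component compositions at the two ends of the column. A binary-key sketch (simplified from the paper's multicomponent treatment; the numbers are illustrative):

        import numpy as np

        def fenske_min_stages(xD_lk, xB_lk, alpha):
            """Fenske equation: minimum stages at total reflux for the key split.
            xD_lk, xB_lk: light-key mole fractions in distillate and bottoms;
            alpha: light/heavy-key relative volatility (assumed constant)."""
            ratio = (xD_lk / (1 - xD_lk)) * ((1 - xB_lk) / xB_lk)
            return np.log(ratio) / np.log(alpha)

        print(fenske_min_stages(0.95, 0.05, alpha=2.5))   # about 6.4 stages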

  5. DISPATCHING CONTROL SYSTEM OF THE CONCRETE BATCHING PLANTS

    Directory of Open Access Journals (Sweden)

    Andrey Vladimirovich Ostroukh

    2015-09-01

    This paper proposes an approach to the design of a dispatching control system for concrete batching plants, comprising a set of hardware, information, mathematical and software components for the control of technological objects. The proposed system is scalable and can include control subsystems for a mobile concrete plant, a laboratory, access control, and personnel management workstations. The system automates the collection and processing of information for generating control signals and transmitting them, without loss or distortion, to the actuators, in order to achieve the most efficient operation of the controlled process as a whole.

  6. OneD: increasing reproducibility of Hi-C samples with abnormal karyotypes.

    Science.gov (United States)

    Vidal, Enrique; le Dily, François; Quilez, Javier; Stadhouders, Ralph; Cuartero, Yasmina; Graf, Thomas; Marti-Renom, Marc A; Beato, Miguel; Filion, Guillaume J

    2018-05-04

    The three-dimensional conformation of genomes is an essential component of their biological activity. The advent of the Hi-C technology enabled an unprecedented progress in our understanding of genome structures. However, Hi-C is subject to systematic biases that can compromise downstream analyses. Several strategies have been proposed to remove those biases, but the issue of abnormal karyotypes received little attention. Many experiments are performed in cancer cell lines, which typically harbor large-scale copy number variations that create visible defects on the raw Hi-C maps. The consequences of these widespread artifacts on the normalized maps are mostly unexplored. We observed that current normalization methods are not robust to the presence of large-scale copy number variations, potentially obscuring biological differences and enhancing batch effects. To address this issue, we developed an alternative approach designed to take into account chromosomal abnormalities. The method, called OneD, increases reproducibility among replicates of Hi-C samples with abnormal karyotype, outperforming previous methods significantly. On normal karyotypes, OneD fared equally well as state-of-the-art methods, making it a safe choice for Hi-C normalization. OneD is fast and scales well in terms of computing resources for resolutions up to 5 kb.

  7. AGA: Interactive pipeline for reproducible gene expression and DNA methylation data analyses [version 2; referees: 2 approved]

    Directory of Open Access Journals (Sweden)

    Michael Considine

    2015-10-01

    Automated Genomics Analysis (AGA) is an interactive program for analyzing high-throughput genomic data sets on a variety of platforms. An easy-to-use, point-and-click guided pipeline is implemented to combine, define, and compare datasets and to customize their outputs. In contrast to other automated programs, AGA enables flexible selection of sample groups for comparison from complex sample annotations. Batch correction techniques are also included to further enable the combination of datasets from diverse studies in these comparisons. AGA also allows users to save plots, tables, data, and log files containing key portions of the R script run, for reproducible analyses. The link between the interface and R supports collaborative research, enabling advanced R users to extend preliminary analyses generated by bioinformatics novices.

  8. An evaluation of semi-automated methods for collecting ecosystem-level data in temperate marine systems.

    Science.gov (United States)

    Griffin, Kingsley J; Hedge, Luke H; González-Rivero, Manuel; Hoegh-Guldberg, Ove I; Johnston, Emma L

    2017-07-01

    Historically, marine ecologists have lacked efficient tools capable of capturing detailed species distribution data over large areas. Emerging technologies such as high-resolution imaging and associated machine-learning image-scoring software are providing new tools to map species over large areas in the ocean. Here, we combine a novel diver propulsion vehicle (DPV) imaging system with free-to-use machine-learning software to semi-automatically generate dense and widespread abundance records of a habitat-forming alga over ~5,000 m² of temperate reef. We employ replicable spatial techniques to test the effectiveness of traditional diver-based sampling and to better understand the distribution and spatial arrangement of one key algal species. We found that the effectiveness of a traditional survey depended on the level of spatial structuring, and generally 10-20 transects (50 × 1 m) were required to obtain reliable results. This represents 2-20 times greater replication than has been collected in previous studies. Furthermore, we demonstrate the usefulness of fine-resolution distribution modeling for understanding patterns in canopy algae cover at multiple spatial scales, and discuss applications to other marine habitats. Our analyses demonstrate that semi-automated methods of data gathering and processing provide more accurate results than traditional methods for describing habitat structure at seascape scales, and therefore represent vastly improved techniques for understanding and managing marine seascapes.

  9. Semi-automated delineation of breast cancer tumors and subsequent materialization using three-dimensional printing (rapid prototyping).

    Science.gov (United States)

    Schulz-Wendtland, Rüdiger; Harz, Markus; Meier-Meitinger, Martina; Brehm, Barbara; Wacker, Till; Hahn, Horst K; Wagner, Florian; Wittenberg, Thomas; Beckmann, Matthias W; Uder, Michael; Fasching, Peter A; Emons, Julius

    2017-03-01

    Three-dimensional (3D) printing has become widely available, and a few cases of its use in clinical practice have been described. The aim of this study was to explore facilities for the semi-automated delineation of breast cancer tumors and to assess the feasibility of 3D printing of breast cancer tumors. In a case series of five patients, different 3D imaging methods (magnetic resonance imaging (MRI), digital breast tomosynthesis (DBT), and 3D ultrasound) were used to capture 3D data for breast cancer tumors. The volumes of the breast tumors were calculated to assess the comparability of the breast tumor models, and the MRI information was used to render models on a commercially available 3D printer to materialize the tumors. The tumor volumes calculated from the different 3D methods appeared to be comparable. Tumor models with volumes between 325 mm³ and 7,770 mm³ were printed and compared with the models rendered from MRI. The materialization of the tumors reflected the computer models of them. 3D printing (rapid prototyping) appears to be feasible. Scenarios for the clinical use of the technology might include presenting the model to the surgeon to provide a better understanding of the tumor's spatial characteristics in the breast, in order to improve decision-making in relation to neoadjuvant chemotherapy or surgical approaches. J. Surg. Oncol. 2017;115:238-242. © 2016 Wiley Periodicals, Inc.

  10. Semi-automated relative quantification of cell culture contamination with mycoplasma by Photoshop-based image analysis on immunofluorescence preparations.

    Science.gov (United States)

    Kumar, Ashok; Yerneni, Lakshmana K

    2009-01-01

    Mycoplasma contamination in cell culture is a serious setback for the cell culturist. Experiments undertaken using contaminated cell cultures are known to yield unreliable or false results owing to various morphological, biochemical and genetic effects. Earlier surveys revealed that the incidence of mycoplasma contamination in cell cultures ranges from 15 to 80%. Of the vast array of methods for detecting mycoplasma in cell culture, the cytological methods directly demonstrate the contaminating organism present in association with the cultured cells. In this investigation, we report the adoption of a cytological immunofluorescence assay (IFA) in an attempt to obtain a semi-automated relative quantification of contamination, employing user-friendly Photoshop-based image analysis. The study, performed on 77 cell cultures randomly collected from various laboratories, revealed mycoplasma contamination in 18 cell cultures simultaneously by the IFA and Hoechst DNA fluorochrome staining methods. It was observed that Photoshop-based image analysis of the IFA-stained slides was a valuable and sensitive tool for providing a quantitative assessment of the extent of contamination, both per se and relative to the cellularity of the cell cultures. The technique could be useful in estimating the efficacy of anti-mycoplasma agents during decontamination measures.

  11. Feasibility of geometric-intensity-based semi-automated delineation of the tentorium cerebelli from MRI scans.

    Science.gov (United States)

    Penumetcha, Neeraja; Kabadi, Suraj; Jedynak, Bruno; Walcutt, Charles; Gado, Mokhtar H; Wang, Lei; Ratnanather, J Tilak

    2011-04-01

    This paper describes a feasibility study of a method for delineating the tentorium cerebelli in magnetic resonance imaging (MRI) brain scans. The tentorium cerebelli is a thin sheet of dura mater covering the cerebellum and separating it from the posterior part of the temporal lobe and the occipital lobe of the cerebral hemispheres. Cortical structures such as the parahippocampal gyrus can be indistinguishable from the tentorium in magnetization-prepared rapid gradient-echo and T1-weighted MRI scans. Similar intensities in these neighboring regions make it difficult to perform accurate cortical analysis in neuroimaging studies of schizophrenia and Alzheimer's disease. A semi-automated, geometric, intensity-based procedure for delineating the tentorium from a whole-brain scan is described. Initial and final curves are traced within the tentorium. A cost function, based on intensity and Euclidean distance, is computed between the two curves using the Fast Marching method. The initial curve is then evolved to the final curve based on the gradient of the computed costs, generating a series of intermediate curves. These curves are then used to generate a triangulated surface of the tentorium. For 3 scans, surfaces were found to be within 2 voxels of hand segmentations. Copyright © 2009 by the American Society of Neuroimaging.

  12. A geometrical approach for semi-automated crystal centering and in situ X-ray diffraction data collection

    International Nuclear Information System (INIS)

    Mohammad Yaser Heidari Khajepour; Ferrer, Jean-Luc; Lebrette, Hugo; Vernede, Xavier; Rogues, Pierrick

    2013-01-01

    High-throughput protein crystallography projects have pushed forward the development of automated crystallization platforms that are now commonly used. This has created an urgent need for adapted and automated equipment for crystal analysis. However, the crystals first have to be harvested, cryo-protected and flash-cooled, operations that can fail or negatively impact the crystal. In situ X-ray diffraction analysis has become a valid alternative to these operations, and a growing number of users apply it for crystal screening and to solve structures. Nevertheless, even this shortcut may require a significant amount of beam time. In this in situ high-throughput approach, the centering of crystals relative to the beam represents the bottleneck in the analysis process. This article presents a new method to accelerate this process by accurately recording the local geometric coordinates of each crystal in the crystallization plate. Subsequently, the crystallization plate can be presented to the X-ray beam by an automated plate-handling device, such as a six-axis robot arm, for automated crystal centering in the beam, in situ screening or data collection. Preliminary results of such a semi-automated pipeline are reported here for two distinct test proteins. (authors)

  13. Semi-automated curation of metabolic models via flux balance analysis: a case study with Mycoplasma gallisepticum.

    Directory of Open Access Journals (Sweden)

    Eddy J Bautista

    Primarily used for metabolic engineering and synthetic biology, genome-scale metabolic modeling shows tremendous potential as a tool for fundamental research and for the curation of metabolism. Through a novel integration of flux balance analysis and genetic algorithms, a strategy was developed to curate metabolic networks and facilitate the identification of metabolic pathways that may not be directly inferable from genome annotation alone. Specifically, metabolites involved in unknown reactions can be determined, and potentially erroneous pathways can be identified. The procedure developed allows new fundamental insight into metabolism, as well as acting as a semi-automated curation methodology for genome-scale metabolic modeling. To validate the methodology, a genome-scale metabolic model for the bacterium Mycoplasma gallisepticum was created. Several reactions not predicted by the genome annotation were postulated and validated via the literature. The model predicted an average growth rate of 0.358±0.12, closely matching the experimentally determined growth rate of M. gallisepticum of 0.244±0.03. This work presents a powerful algorithm for facilitating the identification and curation of previously known and new metabolic pathways, as well as the first genome-scale reconstruction of M. gallisepticum.
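
    Flux balance analysis itself is a linear program: maximize a biomass flux subject to the steady-state mass balance S·v = 0 and flux bounds. A toy three-reaction sketch with SciPy (the network is hypothetical, not the M. gallisepticum reconstruction):

        import numpy as np
        from scipy.optimize import linprog

        # Toy network: R1: -> A (uptake, capped at 10), R2: A -> B, R3: B -> biomass
        S = np.array([[1.0, -1.0, 0.0],    # metabolite A balance
                      [0.0, 1.0, -1.0]])   # metabolite B balance
        bounds = [(0, 10), (0, 1000), (0, 1000)]
        c = np.array([0.0, 0.0, -1.0])     # maximize v3 = minimize -v3

        res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
        print(res.x, -res.fun)             # fluxes [10, 10, 10], growth 10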

  14. A Semiautomated Multilayer Picking Algorithm for Ice-sheet Radar Echograms Applied to Ground-Based Near-Surface Data

    Science.gov (United States)

    Onana, Vincent De Paul; Koenig, Lora Suzanne; Ruth, Julia; Studinger, Michael; Harbeck, Jeremy P.

    2014-01-01

    Snow accumulation over an ice sheet is the sole mass input, making it a primary measurement for understanding past, present, and future mass balance. Near-surface frequency-modulated continuous-wave (FMCW) radars image isochronous firn layers that record accumulation histories. The Semiautomated Multilayer Picking Algorithm (SAMPA) was designed and developed to trace annual accumulation layers in polar firn from both airborne and ground-based radars. SAMPA is based on the Radon transform (RT), computed in blocks and over angular orientations across a radar echogram. For each echogram block, the RT maps segmented firn-layer features into peaks, which are picked using amplitude and width thresholds. A backward RT is then computed for each corresponding block, mapping the peaks back into picked layer segments. The segments are then connected and smoothed to achieve a final layer pick across the echogram. Once its input parameters are trained, SAMPA operates autonomously and can process hundreds of kilometers of radar data, picking more than 40 layers. SAMPA's final picks and layer numbering still require a cursory manual adjustment to correct non-continuous picks, which are likely not annual, and to correct inconsistencies in layer numbering. Despite the manual effort to train and check SAMPA results, it is an efficient tool for picking multiple accumulation layers in polar firn, reducing time relative to manual digitizing. More than 90% of well-detected layers are trackable.
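
    The block-wise Radon step can be sketched with scikit-image: within a block, a near-linear layer segment concentrates into a sinogram peak whose coordinates give the segment's orientation and offset (a simplified single-peak version of SAMPA's picking, not the authors' code):

        import numpy as np
        from skimage.transform import radon

        def strongest_layer(block, n_angles=90):
            """Return (angle, offset, strength) of the dominant linear feature."""
            theta = np.linspace(0.0, 180.0, n_angles, endpoint=False)
            sino = radon(block, theta=theta, circle=False)
            pos, ang = np.unravel_index(np.argmax(sino), sino.shape)
            return theta[ang], pos, sino[pos, ang]

        # Toy echogram block with one bright horizontal "layer".
        block = np.zeros((32, 32)); block[16, :] = 1.0
        print(strongest_layer(block))   # dominant angle near 90 degrees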

  15. Semi-automated preparation of a ¹¹C-labelled antibiotic - [N-methyl-¹¹C]erythromycin A lactobionate

    International Nuclear Information System (INIS)

    Pike, V.W.; Palmer, A.J.; Horlock, P.L.; Liss, R.H.

    1984-01-01

    A fast semi-automated method is described for labelling the antibiotic erythromycin A (1) with the short-lived positron-emitting radionuclide ¹¹C (t₁/₂ = 20.4 min), in order to permit the non-invasive study of its tissue uptake in vivo. Labelling was achieved by the fast reductive methylation of N-demethylerythromycin A (2) with [¹¹C]formaldehyde, itself prepared from cyclotron-produced [¹¹C]carbon dioxide. Rapid chemical and radiochemical purification of the [N-methyl-¹¹C]erythromycin A (3) was achieved by HPLC and verified by TLC with autoradiography. The purified material was formulated for human i.v. injection as a sterile, apyrogenic solution of the lactobionate salt. The preparation takes 42 min from the end of radionuclide production and produces [N-methyl-¹¹C]erythromycin A lactobionate from [¹¹C]carbon dioxide in 4-12% radiochemical yield, corrected for radioactive decay. (author)

  16. Shear wave elastography for breast masses is highly reproducible.

    Science.gov (United States)

    Cosgrove, David O; Berg, Wendie A; Doré, Caroline J; Skyba, Danny M; Henry, Jean-Pierre; Gay, Joel; Cohen-Bacrie, Claude

    2012-05-01

    To evaluate intra- and interobserver reproducibility of shear wave elastography (SWE) for breast masses. For intraobserver reproducibility, each observer obtained three consecutive SWE images of 758 masses that were visible on ultrasound. 144 (19%) were malignant. Weighted kappa was used to assess the agreement of qualitative elastographic features; the reliability of quantitative measurements was assessed by intraclass correlation coefficients (ICC). For interobserver reproducibility, a blinded observer reviewed images and agreement on features was determined. Mean age was 50 years; mean mass size was 13 mm. Qualitatively, SWE images were at least reasonably similar for 666/758 (87.9%). Intraclass correlation for SWE diameter, area and perimeter was almost perfect (ICC ≥ 0.94). Intraobserver reliability for maximum and mean elasticity was almost perfect (ICC = 0.84 and 0.87) and was substantial for the ratio of mass-to-fat elasticity (ICC = 0.77). Interobserver agreement was moderate for SWE homogeneity (κ = 0.57), substantial for qualitative colour assessment of maximum elasticity (κ = 0.66), fair for SWE shape (κ = 0.40), fair for B-mode mass margins (κ = 0.38), and moderate for B-mode mass shape (κ = 0.58), orientation (κ = 0.53) and BI-RADS assessment (κ = 0.59). SWE is highly reproducible for assessing elastographic features of breast masses within and across observers. SWE interpretation is at least as consistent as that of BI-RADS ultrasound B-mode features. • Shear wave ultrasound elastography can measure the stiffness of breast tissue • It provides a qualitatively and quantitatively interpretable colour-coded map of tissue stiffness • Intraobserver reproducibility of SWE is almost perfect while interobserver reproducibility proved to be moderate to substantial • The most reproducible SWE features between observers were SWE image homogeneity and maximum elasticity.

  17. Batch Attribute-Based Encryption for Secure Clouds

    Directory of Open Access Journals (Sweden)

    Chen Yang

    2015-10-01

    Cloud storage is widely used by organizations due to its advantage of allowing universal access at low cost. Attribute-based encryption (ABE) is a kind of public-key encryption suitable for cloud storage. The secret key of each user and the ciphertext are associated with an access policy and an attribute set, respectively; one can decrypt a ciphertext only if one holds a secret key and the ciphertext's attributes satisfy the key's predetermined access policy, which allows fine-grained access control to be enforced on outsourced files. One issue in existing ABE schemes is that they are designed for the users of a single organization. When one wants to share data with the users of different organizations, the owner needs to encrypt the messages to the receivers of one organization and then repeat this process for another organization. This situation worsens as more and more mobile devices use cloud services, since the ABE encryption process is time-consuming and may quickly exhaust the power supplies of mobile devices. In this paper, we propose a batch attribute-based encryption (BABE) approach to address this problem in a provably secure way. With our approach, the data owner can outsource data in batches to the users of different organizations simultaneously. The data owner is allowed to decide the receiving organizations and the attributes required for decryption. Theoretical and experimental analyses show that our approach is more efficient than traditional encryption implementations in both computation and communication.

  18. "Batch" kinetics in flow: online IR analysis and continuous control.

    Science.gov (United States)

    Moore, Jason S; Jensen, Klavs F

    2014-01-07

    Currently, kinetic data are collected either under steady-state conditions in flow or by generating time-series data in batch. Batch experiments are generally considered to be more suitable for the generation of kinetic data because of the ability to collect data from many time points in a single experiment. Now, a method that rapidly generates time-series reaction data from flow reactors by continuously manipulating the flow rate and reaction temperature has been developed. This approach makes use of inline IR analysis and an automated microreactor system, which allowed for rapid and tight control of the operating conditions. The conversion/residence time profiles at several temperatures were used to fit parameters to a kinetic model. This method requires significantly less time and a smaller amount of starting material compared to one-at-a-time flow experiments, and thus allows for the rapid generation of kinetic data. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
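
    The parameter-fitting step can be sketched by regressing conversion against residence time and temperature with a first-order Arrhenius model (synthetic data below; the actual chemistry and fitted constants in the study differ):

        import numpy as np
        from scipy.optimize import curve_fit

        R = 8.314  # J/(mol K)

        def conversion(data, lnA, Ea):
            tau, T = data                     # residence time (s), temperature (K)
            k = np.exp(lnA - Ea / (R * T))    # Arrhenius rate constant
            return 1.0 - np.exp(-k * tau)     # first-order conversion profile

        # Synthetic "ramped" data: residence time swept at three temperatures.
        tau = np.tile(np.linspace(30, 600, 20), 3)
        T = np.repeat([323.0, 343.0, 363.0], 20)
        X_obs = conversion((tau, T), 20.0, 70e3) \
                + np.random.default_rng(1).normal(0, 0.01, tau.size)

        (lnA, Ea), _ = curve_fit(conversion, (tau, T), X_obs, p0=(15.0, 50e3))
        print(lnA, Ea)   # recovers roughly 20 and 7e4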

  19. Stormwater Pollution Prevention Plan - TA-60 Asphalt Batch Plant

    Energy Technology Data Exchange (ETDEWEB)

    Sandoval, Leonard Frank [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-31

    This Storm Water Pollution Prevention Plan (SWPPP) was developed in accordance with the provisions of the Clean Water Act (33 U.S.C. §§1251 et seq., as amended), and the Multi-Sector General Permit for Storm Water Discharges Associated with Industrial Activity (U.S. EPA, June 2015) issued by the U.S. Environmental Protection Agency (EPA) for the National Pollutant Discharge Elimination System (NPDES) and using the industry specific permit requirements for Sector P-Land Transportation and Warehousing as a guide. This SWPPP applies to discharges of stormwater from the operational areas of the TA-60-01 Asphalt Batch Plant at Los Alamos National Laboratory. Los Alamos National Laboratory (also referred to as LANL or the “Laboratory”) is owned by the Department of Energy (DOE), and is operated by Los Alamos National Security, LLC (LANS). Throughout this document, the term “facility” refers to the TA-60 Asphalt Batch Plant and associated areas. The current permit expires at midnight on June 4, 2020.

  20. SEQUENCING BATCH REACTOR: A PROMISING TECHNOLOGY IN WASTEWATER TREATMENT

    Directory of Open Access Journals (Sweden)

    A. H. Mahvi

    2008-04-01

    Discharge of domestic and industrial wastewater to surface water or groundwater is very dangerous to the environment, so treating any kind of wastewater to produce a good-quality effluent is necessary, and choosing an effective treatment system is important. The sequencing batch reactor is a modification of the activated sludge process which has been used successfully to treat municipal and industrial wastewater. The process can be applied for nutrient removal; for industrial wastewater with a high biochemical oxygen demand; for wastewater containing toxic materials such as cyanide, copper, chromium, lead and nickel; and for food-industry effluents, landfill leachates and tannery wastewater. Among the advantages of the process are its single-tank configuration, small footprint, easy expandability, simple operation and low capital costs. Much research has been conducted on this treatment technology. The authors have conducted several investigations on a modification of the sequencing batch reactor; their studies achieved very high percentage removals of biochemical oxygen demand, chemical oxygen demand, total Kjeldahl nitrogen, total nitrogen, total phosphorus and total suspended solids. This paper reviews some of the published work in addition to the authors' own experience.

  1. Near infrared spectroscopy for qualitative comparison of pharmaceutical batches.

    Science.gov (United States)

    Roggo, Y; Roeseler, C; Ulmschneider, M

    2004-11-19

    Pharmaceuticals are produced according to current pharmacopoeias, which require specific quality parameters. Tablets of identical formulation produced by different factories should have the same properties before and after storage. In this article, we analyzed samples of two different origins before and after storage (30 °C, 75% relative humidity). The aim of the study is to propose two approaches to understanding the differences between origins and the effect of storage by near infrared spectroscopy. In the first part, the main wavelengths in transmittance and reflectance near infrared spectra are identified in order to pinpoint the major differences between the samples; in this paper, this approach is called fingerprinting. In the second part, principal component analysis (PCA) is computed to confirm the fingerprinting interpretation. The two interpretations show the differences between batches: physical aspect and moisture content. The manufacturing process is responsible for the physical differences between batches; during storage, changes are due to the increase in moisture content and the decrease in active content.
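
    The PCA step can be sketched with scikit-learn: each tablet's spectrum is one row, and the first two scores typically separate origins and storage states into clusters (a generic sketch; the preprocessing and component count are assumptions, not the paper's exact protocol):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        def batch_scores(spectra, n_components=2):
            """PCA scores for an (n_tablets x n_wavelengths) absorbance matrix."""
            X = StandardScaler().fit_transform(spectra)
            pca = PCA(n_components=n_components)
            scores = pca.fit_transform(X)
            print("explained variance:", pca.explained_variance_ratio_)
            return scores   # plot PC1 vs PC2 and color by origin / storage state

        rng = np.random.default_rng(0)
        spectra = rng.random((40, 700))   # stand-in for measured NIR spectra
        print(batch_scores(spectra)[:2])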

  2. Atomic-batched tensor decomposed two-electron repulsion integrals

    Science.gov (United States)

    Schmitz, Gunnar; Madsen, Niels Kristian; Christiansen, Ove

    2017-04-01

    We present a new integral format for 4-index electron repulsion integrals, in which several strategies like the Resolution-of-the-Identity (RI) approximation and other more general tensor-decomposition techniques are combined with an atomic batching scheme. The 3-index RI integral tensor is divided into sub-tensors defined by atom pairs on which we perform an accelerated decomposition to the canonical product (CP) format. In a first step, the RI integrals are decomposed to a high-rank CP-like format by repeated singular value decompositions followed by a rank reduction, which uses a Tucker decomposition as an intermediate step to lower the prefactor of the algorithm. After decomposing the RI sub-tensors (within the Coulomb metric), they can be reassembled to the full decomposed tensor (RC approach) or the atomic batched format can be maintained (ABC approach). In the first case, the integrals are very similar to the well-known tensor hypercontraction integral format, which has gained some traction in recent years since it allows for quartic scaling implementations of MP2 and some coupled cluster methods. On the MP2 level, the RC and ABC approaches are compared concerning efficiency and storage requirements. Furthermore, the overall accuracy of this approach is assessed. Initial test calculations show a good accuracy and that it is not limited to small systems.
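
    The per-block CP step amounts to decomposing each atom-pair sub-tensor into a sum of rank-one terms. A sketch with the TensorLy library (an assumption for illustration; the paper uses its own accelerated SVD-based decomposition rather than TensorLy's ALS solver):

        import numpy as np
        import tensorly as tl
        from tensorly.decomposition import parafac

        # Hypothetical stand-in for one atom-pair block of the 3-index RI tensor.
        rng = np.random.default_rng(0)
        B = tl.tensor(rng.random((40, 25, 25)))

        cp = parafac(B, rank=40, n_iter_max=200, tol=1e-8)   # ALS-based CP fit
        B_cp = tl.cp_to_tensor(cp)                           # reassemble block
        print(float(tl.norm(B - B_cp) / tl.norm(B)))         # relative error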

  3. Xylitol production by Candida parapsilosis under fed-batch culture

    Directory of Open Access Journals (Sweden)

    Sandra A. Furlan

    2001-06-01

    Full Text Available Xylitol production by Candida parapsilosis was investigated under fed-batch cultivation, using single (xylose or mixed (xylose and glucose sugars as substrates. The presence of glucose in the medium induced the production of ethanol as secondary metabolite and improved specific rates of growth, xylitol formation and substrate consumption. Fractionated supply of the feed medium at constant sugar concentration did not promote any increase on the productivity compared to the single batch cultivation.A produção de xylitol por Candida parapsilosis foi investigada em regime de batelada alimentada, usando substratos açucarados de composição simples (xilose ou composta (xilose e glicose. A presença de glicose no meio induziu a formação de etanol como metabólito secundário. A suplementação fracionada do meio de alimentação numa concentração fixa de açúcar não resultou em aumento da produtividade em relação àquela alcançada em batelada simples.

  4. Comparison of the release of constituents from granular materials under batch and column testing.

    Science.gov (United States)

    Lopez Meza, Sarynna; Garrabrants, Andrew C; van der Sloot, Hans; Kosson, David S

    2008-01-01

    Column leaching tests can be considered a better basis for assessing field impact than any other available batch test method, and thus provide a fundamental basis from which to estimate constituent release under a variety of field conditions. However, column testing is time-intensive compared with the simpler batch tests, and may not always be a viable option when making decisions about material reuse. Batch tests are most frequently used as a simple tool for compliance or quality control. It is therefore important to compare the release that occurs under batch and column testing, and to establish conservative interpretation protocols for extrapolating from batch data when column data are not available. Five different materials (concrete, construction debris, aluminum recycling residue, coal fly ash and bottom ash) were evaluated via batch and column testing, including different column flow regimes (continuously saturated and intermittent unsaturated flow). Constituent release data from batch and column tests were compared, and no significant difference was found between the column flow regimes. In most cases batch and column testing agreed when results were expressed as cumulative release. For arsenic in carbonated materials, however, batch testing underestimates column release at most LS ratios and also on a cumulative basis. When arsenic is a constituent of concern, column testing may be required.

  5. Reproducibility of computer-aided detection system in digital mammograms

    International Nuclear Information System (INIS)

    Kim, Seung Ja; Cho, Nariya; Cha, Joo Hee; Chung, Hye Kyung; Lee, Sin Ho; Cho, Kyung Soo; Kim, Sun Mi; Moon, Woo Kyung

    2005-01-01

    To evaluate the reproducibility of the computer-aided detection (CAD) system for digital mammograms. We applied the CAD system (ImageChecker M1000-DM, version 3.1; R2 Technology) to full-field digital mammograms. These mammograms were taken twice at an interval of 10-45 days (mean: 25 days) for 34 preoperative patients (breast cancer n=27, benign disease n=7, age range: 20-66 years, mean age: 47.9 years). On the mammograms, lesions were visible in 19 patients and were depicted as 15 masses and 12 calcification clusters. We analyzed the sensitivity, the false positive rate (FPR) and the reproducibility of the CAD marks. The broader sensitivities of the CAD system were 80% (12 of 15) and 67% (10 of 15) for masses, and 100% (12 of 12) for calcification clusters. The strict sensitivities were 50% (15 of 30) and 50% (15 of 30) for masses, and 92% (22 of 24) and 79% (19 of 24) for the clusters. The FPR was 0.21-0.22/image for masses, 0.03-0.04/image for clusters, and 0.24-0.26/image in total. Among the 132 mammography images, 59% (78 of 132) were identical regardless of the existence of CAD marks, and 22% (15 of 69) of the images with CAD marks were identical. The reproducibility of the CAD marks was 67% (12 of 18) for true positive masses and 71% (17 of 24) for true positive clusters; it was 8% (4 of 53) for false positive masses and 14% (1 of 7) for false positive clusters. The reproducibility of the total mass marks was 23% (16 of 71), and that of the total cluster marks was 58% (18 of 31). The CAD system showed higher sensitivity and reproducibility of CAD marks for the calcification clusters, which are related to breast cancer. Yet the overall reproducibility of CAD marks was low; therefore, the CAD system must be applied with this limitation in mind.

  6. Using prediction markets to estimate the reproducibility of scientific research

    Science.gov (United States)

    Dreber, Anna; Pfeiffer, Thomas; Almenberg, Johan; Isaksson, Siri; Wilson, Brad; Chen, Yiling; Nosek, Brian A.; Johannesson, Magnus

    2015-01-01

    Concerns about a lack of reproducibility of statistically significant results have recently been raised in many fields, and it has been argued that this lack comes at substantial economic costs. We here report the results from prediction markets set up to quantify the reproducibility of 44 studies published in prominent psychology journals and replicated in the Reproducibility Project: Psychology. The prediction markets predict the outcomes of the replications well and outperform a survey of market participants’ individual forecasts. This shows that prediction markets are a promising tool for assessing the reproducibility of published scientific results. The prediction markets also allow us to estimate probabilities for the hypotheses being true at different testing stages, which provides valuable information regarding the temporal dynamics of scientific discovery. We find that the hypotheses being tested in psychology typically have low prior probabilities of being true (median, 9%) and that a “statistically significant” finding needs to be confirmed in a well-powered replication to have a high probability of being true. We argue that prediction markets could be used to obtain speedy information about reproducibility at low cost and could potentially even be used to determine which studies to replicate to optimally allocate limited resources into replications. PMID:26553988
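
    The abstract's claim that a significant finding with a low prior needs a well-powered replication to become credible can be made concrete with a short Bayes'-rule computation. The prior below is the median reported in the abstract; the alpha and power values are conventional assumptions, not figures from the study.

        # Posterior probability that a hypothesis is true after a
        # "statistically significant" finding, by Bayes' rule.
        prior = 0.09   # median prior probability reported in the abstract
        alpha = 0.05   # assumed false-positive rate
        power = 0.80   # assumed probability of detecting a true effect

        p_sig = prior * power + (1 - prior) * alpha
        posterior = prior * power / p_sig
        print(f"P(true | one significant result)  = {posterior:.2f}")   # ~0.61

        # A successful well-powered replication updates the posterior again.
        p_sig2 = posterior * power + (1 - posterior) * alpha
        posterior2 = posterior * power / p_sig2
        print(f"P(true | two significant results) = {posterior2:.2f}")  # ~0.96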

  7. Validation and reproducibility of an Australian caffeine food frequency questionnaire.

    Science.gov (United States)

    Watson, E J; Kohler, M; Banks, S; Coates, A M

    2017-08-01

    The aim of this study was to measure the validity and reproducibility of a caffeine food frequency questionnaire (C-FFQ) developed for the Australian population. The C-FFQ was designed to assess average daily caffeine consumption using four categories of foods and beverages: energy drinks; soft drinks/soda; coffee and tea; and chocolate (food and drink). Participants completed a seven-day food diary immediately followed by the C-FFQ on two consecutive days. The questionnaire was first piloted in 20 adults, and then a validity/reproducibility study was conducted (n = 90 adults). The C-FFQ showed moderate correlations (r = .60), fair agreement (mean difference 63 mg) and reasonable quintile rankings, indicating fair to moderate agreement with the seven-day food diary. To test reproducibility, the C-FFQ was compared to itself and showed strong correlations (r = .90), good quintile rankings and strong kappa values (κ = 0.65), indicating strong reproducibility. The C-FFQ shows adequate validity and reproducibility and will aid researchers in Australia to quantify caffeine consumption.

  9. Bagasse hydrolyzates from Agave tequilana as substrates for succinic acid production by Actinobacillus succinogenes in batch and repeated batch reactor.

    Science.gov (United States)

    Corona-González, Rosa Isela; Varela-Almanza, Karla María; Arriola-Guevara, Enrique; Martínez-Gómez, Álvaro de Jesús; Pelayo-Ortiz, Carlos; Toriz, Guillermo

    2016-04-01

    The aim of this work was to obtain fermentable sugars by enzymatic or acid hydrolysis of Agave tequilana Weber bagasse in order to produce succinic acid with Actinobacillus succinogenes. Hydrolyses were carried out with mineral acids (sulfuric and hydrochloric acid) or a commercial cellulolytic enzyme, and were optimized statistically by response surface methodology, with the concentration of acid/enzyme and the time of hydrolysis as factors. The sugar concentrations obtained at optimal conditions were 21.7, 22.4 and 19.8 g/L for H2SO4, HCl and the enzymatic preparation, respectively. Concerning succinic acid production, the enzymatic hydrolyzates gave the highest yield (0.446 g/g) and productivity (0.57 g/L·h) using A. succinogenes in a batch reactor system. Repeated batch fermentation with A. succinogenes immobilized in agar, fed with the enzymatic hydrolyzates, resulted in a maximum succinic acid concentration of 33.6 g/L from 87.2 g/L monosaccharides after 5 cycles in 40 h, for a productivity of 1.32 g/L·h. Copyright © 2016. Published by Elsevier Ltd.

  10. Production of carotenoids and lipids by Rhodococcus opacus PD630 in batch and fed-batch culture.

    Science.gov (United States)

    Thanapimmetha, Anusith; Suwaleerat, Tharatron; Saisriyoot, Maythee; Chisti, Yusuf; Srinophakun, Penjit

    2017-01-01

    Production of carotenoids by Rhodococcus opacus PD630 is reported. A modified mineral salt medium formulated with glycerol as an inexpensive carbon source was used for the fermentation, with ammonium acetate as the nitrogen source. A dry cell mass concentration of nearly 5.4 g/L could be produced in shake flasks, with a carotenoid concentration of 0.54 mg/L. In batch culture in a 5 L bioreactor without pH control, the maximum dry biomass concentration was ~30% lower than in shake flasks and the carotenoid concentration was 0.09 mg/L. Both the biomass and the carotenoid concentrations could be raised using fed-batch operation with a feed mixture of ammonium acetate and acetic acid. With this strategy, the final biomass concentration was 8.2 g/L and the carotenoid concentration was 0.20 mg/L in a 10-day fermentation. Control of pH proved unnecessary for maximizing carotenoid production in this fermentation.

  11. The quest for improved reproducibility in MALDI mass spectrometry.

    Science.gov (United States)

    O'Rourke, Matthew B; Djordjevic, Steven P; Padula, Matthew P

    2018-03-01

    Reproducibility has been one of the biggest hurdles faced when attempting to develop quantitative protocols for MALDI mass spectrometry. The heterogeneous nature of sample recrystallization has made automated sample acquisition somewhat "hit and miss" with manual intervention needed to ensure that all sample spots have been analyzed. In this review, we explore the last 30 years of literature and anecdotal evidence that has attempted to address and improve reproducibility in MALDI MS. Though many methods have been attempted, we have discovered a significant publication history surrounding the use of nitrocellulose as a substrate to improve homogeneity of crystal formation and therefore reproducibility. We therefore propose that this is the most promising avenue of research for developing a comprehensive and universal preparation protocol for quantitative MALDI MS analysis. © 2016 Wiley Periodicals, Inc. Mass Spec Rev 37:217-228, 2018.

  12. Dysplastic naevus: histological criteria and their inter-observer reproducibility.

    Science.gov (United States)

    Hastrup, N; Clemmensen, O J; Spaun, E; Søndergaard, K

    1994-06-01

    Forty melanocytic lesions were examined in a pilot study, which was followed by a final series of 100 consecutive melanocytic lesions, in order to evaluate the inter-observer reproducibility of the histological criteria proposed for the dysplastic naevus. The specimens were examined in a blind fashion by four observers. Analysis by kappa statistics showed poor reproducibility of nuclear features, while reproducibility of architectural features was acceptable, improving in the final series. Consequently, we cannot apply the combined criteria of cytological and architectural features with any confidence in the diagnosis of dysplastic naevus, and, until further studies have documented that architectural criteria alone will suffice in the diagnosis of dysplastic naevus, we, as pathologists, shall avoid this term.
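
    Inter-observer reproducibility of the kind analyzed here is typically quantified with Cohen's kappa, which corrects raw agreement for chance. A minimal sketch follows; the two observers' scores are invented for illustration.

        from sklearn.metrics import cohen_kappa_score

        # Hypothetical presence/absence calls for one histological criterion,
        # assigned by two observers to the same ten melanocytic lesions.
        observer_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1]
        observer_2 = [1, 0, 0, 1, 0, 1, 1, 1, 0, 0]

        kappa = cohen_kappa_score(observer_1, observer_2)
        print(f"Cohen's kappa = {kappa:.2f}")  # 1.0 = perfect, 0 = chance-level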

  13. Relevant principal factors affecting the reproducibility of insect primary culture.

    Science.gov (United States)

    Ogata, Norichika; Iwabuchi, Kikuo

    2017-06-01

    The primary culture of insect cells often suffers from problems with poor reproducibility in the quality of the final cell preparations. The cellular composition of the explants (cell number and cell types), surgical methods (surgical duration and surgical isolation), and physiological and genetic differences between donors may be critical factors affecting the reproducibility of culture. However, little is known about where biological variation (interindividual differences between donors) ends and technical variation (variance in replication of culture conditions) begins. In this study, we cultured larval fat bodies from the Japanese rhinoceros beetle, Allomyrina dichotoma, and evaluated, using linear mixed models, the effect of interindividual variation between donors on the reproducibility of the culture. We also performed transcriptome analysis of the hemocyte-like cells mainly seen in the cultures using RNA sequencing and ultrastructural analyses of hemocytes using a transmission electron microscope, revealing that the cultured cells have many characteristics of insect hemocytes.
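
    The linear mixed models mentioned above partition variance into a between-donor (biological) component and a residual (technical) component. A minimal sketch with statsmodels is shown below; the data frame and its column names are hypothetical.

        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical data: one culture-quality measurement per replicate,
        # with three replicate cultures from each donor larva.
        df = pd.DataFrame({
            "donor":   ["d1", "d1", "d1", "d2", "d2", "d2", "d3", "d3", "d3"],
            "quality": [0.82, 0.79, 0.85, 0.60, 0.66, 0.63, 0.91, 0.88, 0.90],
        })

        # Random intercept per donor: the group variance reflects biological
        # variation between donors, the residual reflects technical variation.
        result = smf.mixedlm("quality ~ 1", df, groups=df["donor"]).fit()
        print(result.summary())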

  14. Citric acid production from hydrolysate of pretreated straw cellulose by Yarrowia lipolytica SWJ-1b using batch and fed-batch cultivation.

    Science.gov (United States)

    Liu, Xiaoyan; Lv, Jinshun; Zhang, Tong; Deng, Yuanfang

    2015-01-01

    In this study, crude cellulase produced by Trichoderma reesei Rut-30 was used to hydrolyze pretreated straw. After the composition of the hydrolysate of pretreated straw was optimized, the study showed that the natural components of pretreated straw, without addition of any other components such as (NH4)2SO4, KH2PO4 or Mg(2+), were suitable for citric acid production by Yarrowia lipolytica SWJ-1b, and that the optimal ventilatory capacity was 10.0 L/min/L medium. Batch and fed-batch production of citric acid from the hydrolysate of pretreated straw by Yarrowia lipolytica SWJ-1b was investigated. In batch cultivation, 25.4 g/L and 26.7 g/L citric acid were obtained from glucose and from the hydrolysate of straw cellulose, respectively, with a cultivation time of 120 hr. In the three-cycle fed-batch cultivation, citric acid (CA) production increased to 42.4 g/L and the cultivation time was extended to 240 hr. However, the iso-citric acid (ICA) yield in fed-batch cultivation (4.0 g/L) was similar to that in batch cultivation (3.9 g/L), and only 1.6 g/L of reducing sugar was left in the medium at the end of fed-batch cultivation, suggesting that most of the added carbon was used in the cultivation.

  15. Batch and multi-step fed-batch enzymatic saccharification of Formiline-pretreated sugarcane bagasse at high solid loadings for high sugar and ethanol titers.

    Science.gov (United States)

    Zhao, Xuebing; Dong, Lei; Chen, Liang; Liu, Dehua

    2013-05-01

    Formiline pretreatment is a biomass fractionation process. In the present work, Formiline-pretreated sugarcane bagasse was hydrolyzed with cellulases by batch and multi-step fed-batch processes at 20% solid loading. For wet pulp, after 144 h incubation with a cellulase loading of 10 FPU/g dry solid, the fed-batch process obtained ~150 g/L glucose and ~80% glucan conversion, while the batch process obtained ~130 g/L glucose with a corresponding ~70% glucan conversion. Solid loading could be further increased to 30% for the acetone-dried pulp. By fed-batch hydrolysis of the dried pulp in pH 4.8 buffer solution, the glucose concentration reached 247.3±1.6 g/L with a corresponding 86.1±0.6% glucan conversion. The enzymatic hydrolyzates could be well converted to ethanol by a subsequent fermentation using Saccharomyces cerevisiae, with ethanol titers of 60-70 g/L. Batch and fed-batch SSF indicated that the Formiline-pretreated substrate showed excellent fermentability. The final ethanol concentration was 80 g/L, corresponding to 82.7% of theoretical yield. Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. Reproducibility of clinical research in critical care: a scoping review.

    Science.gov (United States)

    Niven, Daniel J; McCormick, T Jared; Straus, Sharon E; Hemmelgarn, Brenda R; Jeffs, Lianne; Barnes, Tavish R M; Stelfox, Henry T

    2018-02-21

    The ability to reproduce experiments is a defining principle of science, yet reproducibility of clinical research has received relatively little scientific attention. It is important, however, as it may inform clinical practice, research agendas, and the design of future studies. We used scoping review methods to examine reproducibility within a cohort of randomized trials examining clinical critical care research and published in the top general medical and critical care journals. To identify relevant clinical practices, we searched the New England Journal of Medicine, The Lancet, and JAMA for randomized trials published up to April 2016. To identify a comprehensive set of studies for these practices, included articles informed secondary searches within other high-impact medical and specialty journals. We included late-phase randomized controlled trials examining therapeutic clinical practices in adults admitted to general medical-surgical or specialty intensive care units (ICUs). Included articles were classified using a reproducibility framework: an original study was the first to evaluate a clinical practice, and a reproduction attempt re-evaluated that practice in a new set of participants. Overall, 158 practices were examined in 275 included articles. A reproduction attempt was identified for 66 practices (42%, 95% CI 33-50%). Original studies reported larger effects than reproduction attempts (primary endpoint, risk difference 16.0%, 95% CI 11.6-20.5% vs. 8.4%, 95% CI 6.0-10.8%, P = 0.003). More than half of clinical practices with a reproduction attempt demonstrated effects that were inconsistent with the original study (56%, 95% CI 42-68%), among which a large number were reported to be efficacious in the original study and to lack efficacy in the reproduction attempt (34%, 95% CI 19-52%). Two practices reported to be efficacious in the original study were found to be harmful in the reproduction attempt. Only a minority of critical care practices with published research have thus been re-evaluated, and when they are, the results are often inconsistent with the original study.

  17. Reproducibility of graph metrics in fMRI networks

    Directory of Open Access Journals (Sweden)

    Qawi K Telesford

    2010-12-01

    Full Text Available The reliability of graph metrics calculated in network analysis is essential to the interpretation of complex network organization. These graph metrics are used to deduce the small-world properties of networks. In this study, we investigated the test-retest reliability of graph metrics from functional magnetic resonance imaging (fMRI) data collected for two runs in 45 healthy older adults. Graph metrics were calculated on data from both runs and compared using intraclass correlation coefficient (ICC) statistics and Bland-Altman (BA) plots. ICC scores describe the level of absolute agreement between two measurements and provide a measure of reproducibility. For mean graph metrics, ICC scores were high for clustering coefficient (ICC=0.86), global efficiency (ICC=0.83), path length (ICC=0.79), and local efficiency (ICC=0.75); the ICC score for degree was found to be low (ICC=0.29). ICC scores were also used to generate reproducibility maps in brain space to test voxel-wise reproducibility for unsmoothed and smoothed data. Reproducibility was uniform across the brain for global efficiency and path length, but was only high in network hubs for clustering coefficient, local efficiency and degree. BA plots were used to test the measurement repeatability of all graph metrics, and all fell within the limits for repeatability. Together, these results suggest that, with the exception of degree, mean graph metrics are reproducible and suitable for clinical studies. Further exploration is warranted to better understand reproducibility across the brain on a voxel-wise basis.
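
    Both statistics used in this record are easy to reproduce. Below is a minimal NumPy sketch of a one-way, single-measure ICC and of Bland-Altman limits of agreement for two runs per subject; the run values are invented, and published studies often use two-way ICC variants instead.

        import numpy as np

        def icc_oneway(run1, run2):
            # One-way random, single-measure ICC(1,1) from a one-way ANOVA.
            x = np.column_stack([run1, run2])
            n, k = x.shape
            grand = x.mean()
            ms_between = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)
            ms_within = ((x - x.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
            return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

        # Hypothetical clustering coefficients from two fMRI runs, 6 subjects.
        run1 = np.array([0.31, 0.28, 0.35, 0.40, 0.25, 0.33])
        run2 = np.array([0.30, 0.29, 0.36, 0.38, 0.27, 0.32])
        print(f"ICC = {icc_oneway(run1, run2):.2f}")

        # Bland-Altman: bias (mean difference) and 95% limits of agreement.
        diff = run1 - run2
        print(f"bias = {diff.mean():+.3f}, LoA = +/- {1.96 * diff.std(ddof=1):.3f}")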

  18. Effective Form of Reproducing the Total Financial Potential of Ukraine

    Directory of Open Access Journals (Sweden)

    Portna Oksana V.

    2015-03-01

    Full Text Available The development of scientific principles for reproducing the total financial potential of a country, and of an effective form of such reproduction, is an urgent problem in both the theoretical and the practical aspects of the study. Its solution is intended to ensure the active mobilization and effective use of the total financial potential of Ukraine and, as a result, its expanded reproduction, which would help realize the internal capacities for stabilization of the national economy. The purpose of the article is to disclose the essence of the effective form of reproducing the total financial potential of the country and to analyze the results of reproducing the total financial potential of Ukraine. It is argued that the basis for the effective form of reproduction is the volume and flow of resources that are associated with the «real» economy and that affect and define the dynamics of GDP, i.e. the resource and process forms of reproducing the total financial potential of Ukraine (which precede the effective one). The analysis shows that in the period studied the financial possibilities of the country increased, but a steady dynamic of reduction of the total financial potential was observed. Reproduction of the resources involved in production, which create net value added and GDP, occurs on a restricted basis. Growth of the total financial potential of Ukraine is connected only with extensive quantitative factors rather than intensive qualitative changes.

  19. The MIMIC Code Repository: enabling reproducibility in critical care research.

    Science.gov (United States)

    Johnson, Alistair Ew; Stone, David J; Celi, Leo A; Pollard, Tom J

    2018-01-01

    Lack of reproducibility in medical studies is a barrier to the generation of a robust knowledge base to support clinical decision-making. In this paper we outline the Medical Information Mart for Intensive Care (MIMIC) Code Repository, a centralized code base for generating reproducible studies on an openly available critical care dataset. Code is provided to load the data into a relational structure, create extractions of the data, and reproduce entire analysis plans including research studies. Concepts extracted include severity of illness scores, comorbid status, administrative definitions of sepsis, physiologic criteria for sepsis, organ failure scores, treatment administration, and more. Executable documents are used for tutorials and reproduce published studies end-to-end, providing a template for future researchers to replicate. The repository's issue tracker enables community discussion about the data and concepts, allowing users to collaboratively improve the resource. The centralized repository provides a platform for users of the data to interact directly with the data generators, facilitating greater understanding of the data. It also provides a location for the community to collaborate on necessary concepts for research progress and share them with a larger audience. Consistent application of the same code for underlying concepts is a key step in ensuring that research studies on the MIMIC database are comparable and reproducible. By providing open source code alongside the freely accessible MIMIC-III database, we enable end-to-end reproducible analysis of electronic health records. © The Author 2017. Published by Oxford University Press on behalf of the American Medical Informatics Association.
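
    In practice, reproducible MIMIC analyses start from shared SQL executed against a local copy of the database. A minimal sketch follows, assuming MIMIC-III has been loaded into a local PostgreSQL instance under the mimiciii schema (as the MIMIC Code Repository build scripts do); the connection string is illustrative.

        import pandas as pd
        from sqlalchemy import create_engine

        # Illustrative connection string; credentials and database name will
        # differ per installation.
        engine = create_engine("postgresql://user:password@localhost:5432/mimic")

        query = """
        SELECT p.subject_id, p.gender, a.admittime, a.dischtime
        FROM   mimiciii.patients   AS p
        JOIN   mimiciii.admissions AS a ON a.subject_id = p.subject_id
        LIMIT  10;
        """
        df = pd.read_sql(query, engine)
        print(df.head())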

  20. Language-Agnostic Reproducible Data Analysis Using Literate Programming.

    Science.gov (United States)

    Vassilev, Boris; Louhimo, Riku; Ikonen, Elina; Hautaniemi, Sampsa

    2016-01-01

    A modern biomedical research project can easily contain hundreds of analysis steps, and lack of reproducibility of the analyses has been recognized as a severe issue. While thorough documentation enables reproducibility, the number of analysis programs used can be so large that in reality reproducibility cannot be easily achieved. Literate programming is an approach to presenting computer programs to human readers: the code is rearranged to follow the logic of the program, that logic is explained in a natural language, and the code executed by the computer is extracted from the literate source. As such, literate programming is an ideal formalism for systematizing analysis steps in biomedical research. We have developed the reproducible computing tool Lir (literate, reproducible computing) that allows a tool-agnostic approach to biomedical data analysis. We demonstrate the utility of Lir by applying it to a case study whose aim was to investigate the role of endosomal trafficking regulators in the progression of breast cancer. In this analysis, a variety of tools were combined to interpret the available data: a relational database, standard command-line tools, and a statistical computing environment. The analysis revealed that the lipid-transport-related genes LAPTM4B and NDRG1 are coamplified in breast cancer patients, and identified genes potentially cooperating with LAPTM4B in breast cancer progression. Our case study demonstrates that with Lir, an array of tools can be combined in the same data analysis to improve efficiency, reproducibility, and ease of understanding. Lir is open-source software available at github.com/borisvassilev/lir.
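
    The extraction step that literate programming relies on, often called tangling, is simple to sketch. The toy function below pulls code out of a noweb-style document in which chunks are delimited by '<<code>>=' and '@' lines; it is a generic illustration, not Lir's actual mechanism.

        def tangle(literate_source: str) -> str:
            # Collect only the lines between '<<code>>=' and '@' markers;
            # everything else is narrative text for the human reader.
            chunks, in_code = [], False
            for line in literate_source.splitlines():
                if line.strip() == "<<code>>=":
                    in_code = True
                elif line.strip() == "@":
                    in_code = False
                elif in_code:
                    chunks.append(line)
            return "\n".join(chunks)

        doc = "\n".join([
            "The mean is computed over all samples.",
            "<<code>>=",
            "values = [1, 2, 3]",
            "mean = sum(values) / len(values)",
            "@",
            "The result is reported in the text.",
        ])
        print(tangle(doc))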

  1. Biological Treatment of Leachate using Sequencing Batch Reactor

    Directory of Open Access Journals (Sweden)

    WDMC Perera

    2014-12-01

    Full Text Available In Sri Lanka municipal solid waste is generally disposed of in poorly managed open dumps which lack liner systems and leachate collection systems. Rain water percolates through the waste layers to produce leachate, which drains into ground water and finally into nearby water bodies, degrading the quality of the water. Leachate has thus become a major environmental concern in municipal waste management, and treatment of leachate is a major challenge for existing and proposed landfill sites. The study was conducted to assess the feasibility of using the Sequencing Batch Reactor (SBR) to treat landfill leachate to the levels proposed in the draft "Proposed Sri Lankan standard for landfill leachate to be disposed to the inland waters". Leachate collected from the open dumpsite at Meethotamulla, Western Province, Sri Lanka was used for leachate characterization. The SBR was constructed with a 10-liter working volume operated in an 18 hour cycle mode, with each cycle consisting of 15 hours of aerobic, 2 h of settle and 0.5 h of fill/decant stages. The dissolved oxygen level within the SBR was maintained at 2 mg/l throughout the aerobic stage. The infeed was diluted with water during the acclimatization period and a leachate-to-water ratio of 55:45 was maintained. The removal efficiencies for the different parameters were: COD (90.5%), BOD (92.6%), TS (92.1%), conductivity (83.9%), alkalinity (97.4%), hardness (82.2%), Mg (80.5%), Fe (94.2%), Zn (63.4%), Cr (31.69%), Pb (99.6%), sulphate (98.9%) and phosphorus (71.4%). In addition, Ni and Cd were removed completely during a single SBR cycle. Dilution of leachate in the dumpsites using municipal wastewater, groundwater or rainwater was identified as the most cost-effective dilution method. The effluent from the sequencing batch reactor is proposed to be further treated using a constructed wetland before release to surface water.

  2. FLOWSHEET FOR ALUMINUM REMOVAL FROM SLUDGE BATCH 6

    International Nuclear Information System (INIS)

    Pike, J.; Gillam, J.

    2008-01-01

    Samples of Tank 12 sludge slurry show a substantially larger fraction of aluminum than originally identified in sludge batch planning. The Liquid Waste Organization (LWO) plans to formulate Sludge Batch 6 (SB6) with about one half of the sludge slurry in Tank 12 and one half of the sludge slurry in Tank 4. LWO identified aluminum dissolution as a method to mitigate the effect of having about 50% more solids in High Level Waste (HLW) sludge than previously planned. A previous aluminum dissolution campaign in an HLW tank in 1982 was performed at approximately 85 C for 5 days and dissolved nearly 80% of the aluminum in the sludge slurry. In 2008, LWO successfully dissolved 64% of the aluminum at approximately 60 C in 46 days with minimal tank modifications, using only slurry pumps as a heat source. This report establishes the technical basis and flowsheet for performing an aluminum removal process in Tank 51 for SB6 that incorporates the lessons learned from previous aluminum dissolution evolutions. For SB6, the aluminum dissolution process temperature will be held at a minimum of 65 C for at least 24 days, but as long as practical or until as much as 80% of the aluminum is dissolved. As planned, an aluminum removal process can reduce the aluminum in SB6 from about 84,500 kg to as little as 17,900 kg, with a corresponding reduction of total insoluble solids in the batch from 246,000 kg to 131,000 kg. The extent of the reduction may be limited by the time available to maintain Tank 51 at dissolution temperature. Based on the known variability in dissolution kinetics, the extent of dissolution in four weeks can range from 44% to more than 80%. At 44% of the aluminum dissolved, the mass reduction is approximately one half of that noted above, i.e., 33,300 kg of aluminum instead of 66,600 kg. Planning to reach 80% of the aluminum dissolved should allow a maximum of 81 days for dissolution, with the allowance reduced if test data show faster kinetics. 47,800 kg of the dissolved

  3. Reproducibility problems of in-service ultrasonic testing results

    International Nuclear Information System (INIS)

    Honcu, E.

    1974-01-01

    The reproducibility of the results of ultrasonic testing is the basic precondition for its successful application to in-service inspection of changes in the quality of components of nuclear power installations. The results of periodic ultrasonic inspections are, however, not satisfactory from the point of view of reproducibility. Nevertheless, the ultrasonic pulse method is suitable for evaluating the quality of most components of nuclear installations and is often the sole method which may be recommended for inspection with regard to its technical and economic aspects. (J.B.)

  4. Reproducibility of esophageal scintigraphy using semi-solid yoghurt

    Energy Technology Data Exchange (ETDEWEB)

    Imai, Yukinori; Kinoshita, Manabu; Asakura, Yasushi; Kakinuma, Tohru; Shimoji, Katsunori; Fujiwara, Kenji; Suzuki, Kenji; Miyamae, Tatsuya [Saitama Medical School, Moroyama (Japan)

    1999-10-01

    Esophageal scintigraphy is a non-invasive method which evaluates esophageal function quantitatively. We applied a new technique using semi-solid yoghurt, which can evaluate esophageal function in a sitting position. To evaluate the reproducibility of this method, scintigraphy was performed in 16 healthy volunteers. From the results of four swallows, excluding the first, the mean coefficients of variation in esophageal transit time and esophageal emptying time were 12.8% and 13.4%, respectively (intraday variation). As regards the interday variation, the method also showed good reproducibility based on the results from 2 separate days. (author)

  5. Reproducing Kernel Method for Solving Nonlinear Differential-Difference Equations

    Directory of Open Access Journals (Sweden)

    Reza Mokhtari

    2012-01-01

    Full Text Available On the basis of reproducing kernel Hilbert space theory, an iterative algorithm for solving some nonlinear differential-difference equations (NDDEs) is presented. The analytical solution is represented in series form in a reproducing kernel space, and the approximate solution u_n is constructed by truncating the series to n terms. The convergence of u_n to the analytical solution is also proved. Results obtained by the proposed method imply that it can be considered a simple and accurate method for solving such differential-difference problems.

  6. A CATASTROPHIC-CUM-RESTORATIVE QUEUING SYSTEM WITH CORRELATED BATCH ARRIVALS AND VARIABLE CAPACITY

    Directory of Open Access Journals (Sweden)

    Rakesh Kumar

    2008-07-01

    Full Text Available In this paper, we study a catastrophic-cum-restorative queuing system with correlated batch arrivals and service in batches of variable sizes. We perform the transient analysis of the queuing model. We obtain the Laplace Transform of the probability generating function of system size. Finally, some particular cases of the model have been derived and discussed. Keywords: Queue length, Catastrophes, Correlated batch arrivals, Broadband services, Variable service capacity, and Restoration.

  7. Queue Length and Server Content Distribution in an Infinite-Buffer Batch-Service Queue with Batch-Size-Dependent Service

    Directory of Open Access Journals (Sweden)

    U. C. Gupta

    2015-01-01

    Full Text Available We analyze an infinite-buffer batch-size-dependent batch-service queue with Poisson arrival and arbitrarily distributed service time. Using supplementary variable technique, we derive a bivariate probability generating function from which the joint distribution of queue and server content at departure epoch of a batch is extracted and presented in terms of roots of the characteristic equation. We also obtain the joint distribution of queue and server content at arbitrary epoch. Finally, the utility of analytical results is demonstrated by the inclusion of some numerical examples which also includes the investigation of multiple zeros.
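
    For intuition, the joint behaviour described analytically in this record can be approximated by simulation. The sketch below runs a Monte-Carlo model of a batch-service queue with Poisson arrivals and a batch-size-dependent (here exponential) service time, recording the queue content at departure epochs; all parameters are illustrative, and the paper itself derives the distribution exactly via probability generating functions.

        import random
        from collections import Counter

        random.seed(1)
        lam, capacity = 2.0, 4              # Poisson arrival rate; max batch size
        n_departures = 200_000

        def service_time(batch_size):
            # Batch-size-dependent service: larger batches take longer on average.
            return random.expovariate(1.0 / (0.2 + 0.1 * batch_size))

        queue, t = 0, 0.0
        next_arrival = random.expovariate(lam)
        at_departure = Counter()
        for _ in range(n_departures):
            while queue == 0:               # server idles until a customer arrives
                t = next_arrival
                queue += 1
                next_arrival = t + random.expovariate(lam)
            batch = min(queue, capacity)    # serve up to 'capacity' customers
            t += service_time(batch)
            while next_arrival <= t:        # arrivals during the service period
                queue += 1
                next_arrival += random.expovariate(lam)
            queue -= batch
            at_departure[queue] += 1        # queue content at this departure epoch

        for q in sorted(at_departure)[:6]:
            print(q, at_departure[q] / n_departures)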

  8. Aerobic degradation of petroleum refinery wastewater in sequential batch reactor.

    Science.gov (United States)

    Thakur, Chandrakant; Srivastava, Vimal C; Mall, Indra D

    2014-01-01

    The aim of the present work was to study the effect of various parameters on the treatment of raw petroleum refinery wastewater (PRW), with a chemical oxygen demand (COD) of 350 mg/L and total organic carbon (TOC) of 70 mg/L, in a sequential batch reactor (SBR). The effect of hydraulic retention time (HRT) was studied under instantaneous fill conditions. Maximum COD and TOC removal efficiencies were found to be 80% and 84%, respectively, for a fill phase of 2 h and a react phase of 2 h, with the fraction of the SBR filled with raw PRW in each cycle being 0.4. The effect of parameters was also studied in terms of the settling characteristics of the treated slurry, and the kinetics of the treatment process were examined. FTIR and UV-visible analyses of PRW before and after treatment were performed in order to understand the degradation mechanism.

  9. Simulated annealing and joint manufacturing batch-sizing

    Directory of Open Access Journals (Sweden)

    Sarker Ruhul

    2003-01-01

    Full Text Available We address an important problem of a manufacturing system. The system procures raw materials from outside suppliers in a lot and processes them to produce finished goods. It proposes an ordering policy for raw materials to meet the requirements of a production facility which, in turn, must deliver finished products demanded by external buyers at fixed time intervals. First, a general cost model is developed considering both raw materials and finished products. This model is then used to develop a simulated annealing approach to determining an optimal ordering policy for the procurement of raw materials and an optimal manufacturing batch size, minimizing the total cost of meeting customer demands in time. The solutions obtained were compared with those of traditional approaches, and numerical examples are presented.
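
    For readers unfamiliar with the metaheuristic, the sketch below applies simulated annealing to a single-item batch-sizing cost (the classic setup-versus-holding trade-off), not the authors' joint raw-material/finished-goods model; all parameter values are illustrative. For this simple convex cost, the analytic optimum sqrt(2*D*setup/holding) ≈ 1225 provides a check on the result.

        import math
        import random

        random.seed(42)
        D, setup, holding = 5000, 120.0, 0.8   # demand/yr, setup cost, holding cost

        def total_cost(q):
            # Setup cost falls with batch size; holding cost rises with it.
            return D / q * setup + holding * q / 2

        q = 100.0                               # initial batch size
        best_q, best_c = q, total_cost(q)
        T = 1000.0                              # initial temperature
        while T > 1e-3:
            q_new = max(1.0, q + random.uniform(-50, 50))
            delta = total_cost(q_new) - total_cost(q)
            # Accept improvements always, uphill moves with Boltzmann probability.
            if delta < 0 or random.random() < math.exp(-delta / T):
                q = q_new
                if total_cost(q) < best_c:
                    best_q, best_c = q, total_cost(q)
            T *= 0.995                          # geometric cooling schedule
        print(f"batch size ~ {best_q:.0f}, cost ~ {best_c:.1f}")  # expect ~1225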

  10. Computer evaluation of the results of batch fermentations

    Energy Technology Data Exchange (ETDEWEB)

    Nyeste, L; Sevella, B

    1980-01-01

    A useful aid to the mathematical modeling of fermentation systems, for the kinetic evaluation of batch fermentations, is described. The generalized logistic equation may be used to describe the growth curves, substrate consumption, and product formation. A computer procedure was developed to fit the equation to experimental points, automatically determining the equation constants by an iterative non-linear least-squares algorithm. By applying the fitting procedure, under different master programs, to various fermentations, complete kinetic evaluation of the fermentations becomes possible. Because the generalized logistic equation is easily treatable analytically, different kinetic characteristics, e.g. rates, specific rates and yields, can be calculated by computer, and the possibility of committing subjective errors is reduced to a minimum. Employment of the method is demonstrated on some fermentation processes, and problems arising in the course of application are discussed.
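
    The fitting step lends itself to a short sketch. Below, SciPy's curve_fit estimates the constants of a simple logistic growth curve from invented batch biomass data; the actual paper uses a generalized logistic form and its own iteration scheme, so this is only a modern stand-in.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, x_max, mu, t_infl):
            # Simple logistic growth curve: biomass X(t) saturating at x_max.
            return x_max / (1.0 + np.exp(-mu * (t - t_infl)))

        # Hypothetical biomass measurements from a batch fermentation (g/L vs h).
        t = np.array([0, 4, 8, 12, 16, 20, 24, 28, 32], dtype=float)
        x = np.array([0.2, 0.5, 1.4, 3.5, 6.0, 7.6, 8.2, 8.4, 8.5])

        popt, _ = curve_fit(logistic, t, x, p0=[8.0, 0.4, 12.0])
        x_max, mu, t_infl = popt
        print(f"x_max = {x_max:.2f} g/L, mu = {mu:.2f} 1/h, inflection at {t_infl:.1f} h")

        # Kinetic characteristics follow from the fitted curve, e.g. the
        # growth rate dX/dt = mu * X * (1 - X / x_max) at each time point.
        rates = mu * x * (1 - x / x_max)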

  11. Sequential batch anaerobic composting (SEBAC sup TM ) of solid wastes

    Energy Technology Data Exchange (ETDEWEB)

    Chynoweth, D.P.; O' Keefe, D.M.; Barkdoll, A.W.; Owens, J.M. (Department of Agricultural Engineering, University of Florida, Gainesville, Florida (US)); Legrand, R. (Radian Corporation, Austin, Texas (US))

    1992-01-01

    Anaerobic high-solids digestion (anaerobic composting) is an attractive option for the treatment of organic wastes. The main advantages of anaerobic composting are the lack of aeration requirements and the production of methane. An anaerobic composting design, sequential batch anaerobic composting (SEBAC{sup TM}), has been developed and demonstrated at the pilot scale, and has proven stable and effective for treatment of both the non-yard-waste and yard-waste organic fractions of municipal solid waste (MSW). The design employs leachate recycle for wetting, inoculation, and removal of volatile organic acids during startup. Performance is similar to that of other designs, which require heavy solids inoculation and mixing and do not have a mechanism for volatile organic acid removal during imbalance. (au) (12 refs.).

  12. Batch study of uranium biosorption by Elodea canadensis biomass

    International Nuclear Information System (INIS)

    Zheng-ji Yi; University of Science and Technology Beijing, Haidian District, Beijing; Jun Yao; Chinese University of Geosciences, Beijing; Mi-jia Zhu; Hui-lun Chen; Fei Wang; Zhi-min Yuan; Xing Liu

    2016-01-01

    The adsorption of U(VI) onto Elodea canadensis was studied via a batch equilibrium method. Kinetic investigation indicated that U(VI) adsorption by E. canadensis reached equilibrium in 120 min and followed pseudo-second-order kinetics. The solution pH was the most important parameter controlling adsorption of U(VI), and the optimum pH for U(VI) removal was 6.0. The U(VI) biosorption was well described by the Langmuir model. IR spectrum analysis revealed that -NH2, -OH, C=O and C-O groups could bind strongly with U(VI). XPS spectrum analysis implied that ion exchange and coordination mechanisms could be involved in the U(VI) biosorption process. (author)
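
    Both models named in this record reduce to straight-line fits. The sketch below estimates the pseudo-second-order parameters from invented uptake data via the standard linearization t/q_t = 1/(k·q_e²) + t/q_e; a Langmuir isotherm can be fitted the same way from its linearized form.

        import numpy as np

        # Hypothetical uptake q_t (mg/g) of U(VI) at contact time t (min).
        t  = np.array([5, 10, 20, 40, 60, 90, 120], dtype=float)
        qt = np.array([18.0, 28.0, 38.5, 46.0, 49.0, 51.0, 51.8])

        # Pseudo-second-order linearization: t/q_t = 1/(k*q_e^2) + t/q_e,
        # so a straight line of t/q_t against t gives q_e and k.
        slope, intercept = np.polyfit(t, t / qt, 1)
        q_e = 1.0 / slope                   # equilibrium uptake (mg/g)
        k = slope ** 2 / intercept          # rate constant (g/(mg*min))
        print(f"q_e = {q_e:.1f} mg/g, k = {k:.4f} g/(mg*min)")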

  13. Treatment of Laboratory Wastewater by Sequence Batch reactor technology

    International Nuclear Information System (INIS)

    Imtiaz, N.; Butt, M.; Khan, R.A.; Saeed, M.T.; Irfan, M.

    2012-01-01

    These studies were conducted on the characterization and treatment of sewage mixed with wastewater from a research and testing laboratory (PCSIR Laboratories Lahore). All parameters (COD, BOD, TSS, etc.) of the influent (untreated wastewater) and effluent (treated wastewater) were characterized using the standard methods for the examination of water and wastewater. All analyzed wastewater parameters were above the National Environmental Quality Standards (NEQS) set at the national level. Treatment of the wastewater was carried out by the conventional sequencing batch reactor (SBR) technique, using aeration and settling in the same treatment reactor at laboratory scale. After treatment, COD was reduced by 90-95%, BOD by 95-97% and TSS by 96-99%, and the reclaimed effluent quality was suitable for gardening purposes. (author)

  14. Convolutional neural networks with balanced batches for facial expressions recognition

    Science.gov (United States)

    Battini Sönmez, Elena; Cangelosi, Angelo

    2017-03-01

    This paper considers the issue of fully automatic emotion classification on 2D faces. In spite of the great effort made in recent years, traditional machine learning approaches based on hand-crafted feature extraction followed by a classification stage have failed to deliver a real-time automatic facial expression recognition system. The proposed architecture uses Convolutional Neural Networks (CNNs), collections of interconnected processing elements loosely inspired by biological neural networks. The basic idea of CNNs is to learn a hierarchical representation of the input data, which results in better classification performance. In this work we present a block-based CNN algorithm which uses noise as a data augmentation technique and builds batches with a balanced number of samples per class. The proposed architecture is a simple yet powerful CNN, which can yield state-of-the-art accuracy on the very competitive benchmark of the Extended Cohn-Kanade database.
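
    The balanced-batch idea is commonly realized with a weighted sampler. The PyTorch sketch below draws each sample with probability inversely proportional to its class frequency, so batches are balanced in expectation; the dataset, class count and shapes are invented, and the paper's own implementation may differ.

        import torch
        from torch.utils.data import DataLoader, TensorDataset, WeightedRandomSampler

        # Hypothetical imbalanced dataset: 48x48 grayscale faces, 7 emotions.
        images = torch.randn(1000, 1, 48, 48)
        labels = torch.randint(0, 7, (1000,))
        dataset = TensorDataset(images, labels)

        # Weight each sample by the inverse frequency of its class.
        class_counts = torch.bincount(labels, minlength=7).float().clamp(min=1)
        weights = 1.0 / class_counts[labels]
        sampler = WeightedRandomSampler(weights, num_samples=len(dataset),
                                        replacement=True)

        loader = DataLoader(dataset, batch_size=64, sampler=sampler)
        for batch_images, batch_labels in loader:
            # Optionally add noise here as data augmentation, then train the CNN.
            batch_images = batch_images + 0.01 * torch.randn_like(batch_images)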

  15. Analytical study plan: Shielded Cells batch 1 campaign; Revision 1

    International Nuclear Information System (INIS)

    Bibler, N.E.; Ha, B.C.; Hay, M.S.; Ferrara, D.M.; Andrews, M.K.

    1993-01-01

    Radioactive operations in the Defense Waste Processing Facility (DWPF) will require that the Savannah River Technology Center (SRTC) perform analyses and special studies with actual Savannah River Site (SRS) high-level waste sludge. SRS Tank 42 and Tank 51 will comprise the first batch of sludge to be processed in the DWPF. Approximately 25 liters of sludge from each of these tanks will be characterized and processed in the Shielded Cells of SRTC. During the campaign, processes will include sludge characterization, sludge washing, rheology determination, mixing, hydrogen evolution, feed preparation, and vitrification of the waste. To complete the campaign, the glass will be characterized to determine its durability and crystallinity. This document describes the types of samples that will be produced, the sampling schedule and analyses required, and the methods for sample and analytical control

  16. A sensitive, reproducible and objective immunofluorescence analysis method of dystrophin in individual fibers in samples from patients with duchenne muscular dystrophy.

    Directory of Open Access Journals (Sweden)

    Chantal Beekman

    Full Text Available Duchenne muscular dystrophy (DMD) is characterized by absent or reduced levels of dystrophin expression on the inner surface of the sarcolemmal membrane of muscle fibers. Clinical development of therapeutic approaches aiming to increase dystrophin levels requires sensitive and reproducible measurement of differences in dystrophin expression in muscle biopsies of treated patients with DMD. This, however, poses a technical challenge due to intra- and inter-donor variance in the occurrence of revertant fibers and low trace dystrophin expression throughout the biopsies. We have developed an immunofluorescence and semi-automated image analysis method that measures the sarcolemmal dystrophin intensity per individual fiber for the entire fiber population in a muscle biopsy. Cross-sections of muscle co-stained for dystrophin and spectrin were imaged by confocal microscopy, and image analysis was performed using Definiens software. Dystrophin intensity was measured in the sarcolemmal mask of spectrin for each individual muscle fiber, and multiple membrane intensity parameters (mean, maximum, quantiles) were calculated per fiber. A histogram can depict the distribution of dystrophin intensities for the fiber population in the biopsy. The method was tested by measuring dystrophin in DMD, Becker muscular dystrophy, and healthy muscle samples. Analysis of duplicate or quadruplicate sections of DMD biopsies on the same or multiple days, by different operators, or using different antibodies, was shown to be objective and reproducible (inter-assay precision, CV 2-17%; intra-assay precision, CV 2-10%). Moreover, the method was sufficiently sensitive to consistently detect small differences in dystrophin between two biopsies from a patient with DMD before and after treatment with an investigational compound.
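
    The per-fiber measurement can be sketched with open-source tools even though the study used Definiens. Below, scikit-image labels a spectrin-derived membrane mask and measures mean dystrophin intensity per labeled region; the random arrays stand in for real two-channel confocal images, and a real pipeline would need proper fiber segmentation rather than a bare Otsu threshold.

        import numpy as np
        from skimage import filters, measure

        # Stand-ins for registered two-channel confocal images.
        spectrin = np.random.rand(512, 512)     # membrane (sarcolemma) channel
        dystrophin = np.random.rand(512, 512)   # channel to be quantified

        # Build the sarcolemmal mask from spectrin and label connected regions.
        membrane_mask = spectrin > filters.threshold_otsu(spectrin)
        fiber_labels = measure.label(membrane_mask)

        # Per-fiber dystrophin intensity inside the spectrin mask.
        props = measure.regionprops(fiber_labels, intensity_image=dystrophin)
        intensities = [p.mean_intensity for p in props]

        # Distribution of per-fiber intensities across the biopsy section.
        hist, edges = np.histogram(intensities, bins=20)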

  17. Accuracy and reproducibility of voxel based superimposition of cone beam computed tomography models on the anterior cranial base and the zygomatic arches.

    Directory of Open Access Journals (Sweden)

    Rania M Nada

    Full Text Available Superimposition of serial Cone Beam Computed Tomography (CBCT) scans has become a valuable tool for three-dimensional (3D) assessment of treatment effects and stability. Voxel based image registration is a newly developed semi-automated technique for superimposition and comparison of two CBCT scans. The accuracy and reproducibility of CBCT superimposition on the anterior cranial base or the zygomatic arches using voxel based image registration was tested in this study. 16 pairs of 3D CBCT models were constructed from pre- and post-treatment CBCT scans of 16 adult dysgnathic patients. Each pair was registered on the anterior cranial base three times and on the left zygomatic arch twice. Following each superimposition, the mean absolute distances between the two models were calculated at 4 regions: anterior cranial base, forehead, left and right zygomatic arches. The mean distances between the models ranged from 0.2 to 0.37 mm (SD 0.08-0.16) for the anterior cranial base registration and from 0.2 to 0.45 mm (SD 0.09-0.27) for the zygomatic arch registration. The mean differences between the two registration zones ranged between 0.12 and 0.19 mm at the 4 regions. Voxel based image registration on both zones can be considered an accurate and reproducible method for CBCT superimposition. The left zygomatic arch could be used as a stable structure for the superimposition of smaller field-of-view CBCT scans where the anterior cranial base is not visible.
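
    The evaluation metric used here, the mean absolute distance between two registered surface models, is straightforward to compute. The SciPy sketch below uses nearest-neighbour queries between two vertex clouds; the arrays are synthetic stand-ins for exported CBCT surface models.

        import numpy as np
        from scipy.spatial import cKDTree

        # Synthetic stand-ins: N x 3 vertex arrays of two registered 3D models
        # (e.g. pre- and post-treatment) in the same coordinate frame, in mm.
        model_a = np.random.rand(5000, 3) * 100.0
        model_b = model_a + np.random.normal(scale=0.3, size=model_a.shape)

        # For each vertex of one model, distance to the nearest vertex of the
        # other; averaging both directions makes the measure symmetric.
        d_ab, _ = cKDTree(model_b).query(model_a)
        d_ba, _ = cKDTree(model_a).query(model_b)
        mad = (d_ab.mean() + d_ba.mean()) / 2.0
        print(f"mean absolute distance = {mad:.2f} mm")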

  18. Integrating PROOF Analysis in Cloud and Batch Clusters

    International Nuclear Information System (INIS)

    Rodríguez-Marrero, Ana Y; Fernández-del-Castillo, Enol; López García, Álvaro; Marco de Lucas, Jesús; Matorras Weinig, Francisco; González Caballero, Isidro; Cuesta Noriega, Alberto

    2012-01-01

    High Energy Physics (HEP) analyses are becoming more complex and demanding due to the large amount of data collected by the current experiments. The Parallel ROOT Facility (PROOF) provides researchers with an interactive tool to speed up the analysis of huge volumes of data by exploiting parallel processing on both multicore machines and computing clusters. The typical PROOF deployment scenario is a permanent set of cores configured to run the PROOF daemons. However, this approach is incapable of adapting to the dynamic nature of interactive usage. Several initiatives seek to improve the use of computing resources by integrating PROOF with a batch system, such as PROOF on Demand (PoD) or PROOF Cluster. These solutions are currently in production at Universidad de Oviedo and IFCA and are positively evaluated by users. Although they are able to adapt to the computing needs of users, they must comply with the specific configuration, OS and software installed at the batch nodes. Furthermore, they share the machines with other workloads, which may cause disruptions in the interactive service for users. These limitations make PROOF a typical use case for cloud computing. In this work we take advantage of the Cloud Infrastructure at IFCA in order to provide a dynamic PROOF environment where users can control the software configuration of the machines. The PROOF Analysis Framework (PAF) facilitates the development of new analyses and offers transparent access to PROOF resources. Several performance measurements are presented for the different scenarios (PoD, SGE and Cloud), showing a speed improvement closely correlated with the number of cores used.

  19. Repeated batch and continuous degradation of chlorpyrifos by Pseudomonas putida.

    Science.gov (United States)

    Pradeep, Vijayalakshmi; Subbaiah, Usha Malavalli

    2015-01-01

    The present study was undertaken with the objective of studying repeated batch and continuous degradation of chlorpyrifos (O,O-diethyl O-3,5,6-trichloropyridin-2-yl phosphorothioate) using Ca-alginate-immobilized cells of Pseudomonas putida isolated from an agricultural soil, and of studying the genes and enzymes involved in degradation, with the aim of reducing the toxicity of chlorpyrifos by degrading it to less toxic metabolites. Long-term stability of pesticide degradation was studied during repeated batch degradation of chlorpyrifos carried out over a period of 50 days. Immobilized cells showed 65% degradation of chlorpyrifos at the end of the 50th cycle, with a cell leakage of 112 × 10(3) cfu/mL. During continuous treatment, 100% degradation was observed at a 100 mL/h flow rate with 2% chlorpyrifos; with a 10% concentration of chlorpyrifos, 98% and 80% degradation were recorded at 20 mL/h and 100 mL/h flow rates, respectively. The degradation products detected by liquid chromatography-mass spectrometry analysis were 3,5,6-trichloro-2-pyridinol and chlorpyrifos oxon. Plasmid curing experiments with ethidium bromide indicated that the genes responsible for the degradation of chlorpyrifos are located on the chromosome and not on the plasmid. Polymerase chain reaction results indicated that the ~890-bp product expected for the mpd gene was present in Ps. putida. Enzymatic degradation studies indicated that the enzymes involved in the degradation of chlorpyrifos are membrane-bound. The study indicates that immobilized cells of Ps. putida have the potential to be used in bioremediation of water contaminated with chlorpyrifos.

  20. Optimal control of batch emulsion polymerization of vinyl chloride

    Energy Technology Data Exchange (ETDEWEB)

    Damslora, Andre Johan

    1998-12-31

    The highly exothermic polymerization of vinyl chloride (VC) is carried out in large vessels where heat removal represents a major limitation on the production rate. Many emulsion polymerization reactors are operated in such a way that a substantial part of the heat transfer capacity is left unused for a significant part of the total batch time. To increase the reaction rate so that it matches the heat removal capacity throughout the course of the reaction, this thesis proposes the use of a sufficiently flexible initiator system to obtain a reaction rate which is high throughout the reaction, together with real-time optimization to compute the initiator addition policy. This optimization-based approach provides a basis for an interplay between design and control and between production and research. A simple model is developed for predicting the polymerization rate. The model is highly nonlinear and open-loop unstable and may serve as an interesting case for comparison of nonlinear control strategies. The model is fitted to data obtained in a laboratory-scale reactor. Finally, the thesis discusses optimal control of the emulsion polymerization reactor. Reduction of the batch cycle time is of major economic importance, as long as the quality parameters are within their specifications. The control parameterization had a major influence on the performance; a differentiable spline parameterization was applied, and the optimization is illustrated in a number of cases. The best performance is obtained when the optimization is combined with some form of closed-loop control of the reactor temperature. 112 refs., 48 figs., 4 tabs.