WorldWideScience

Sample records for fully automated demand

  1. Development and evaluation of fully automated demand response in large facilities

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost of service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal: facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to 'opt out' or 'override' an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing
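
    The three automation levels defined above can be sketched as a minimal control loop. This is an illustrative Python sketch, not code from the report; the function names, signal value, and shed actions are all hypothetical:

```python
def shed_load():
    """Hypothetical pre-programmed load-shedding strategy."""
    return "setpoints raised, lighting dimmed"

def auto_dr_client(signal, override=False):
    """Fully automated DR: act on an external event signal unless the
    facility manager has opted out of ('overridden') this event."""
    if signal == "DR_EVENT" and not override:
        return shed_load()
    return "normal operation"

print(auto_dr_client("DR_EVENT"))                 # sheds load automatically
print(auto_dr_client("DR_EVENT", override=True))  # opt-out honored
```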

  2. Fully automated parallel oligonucleotide synthesizer

    Czech Academy of Sciences Publication Activity Database

    Lebl, M.; Burger, Ch.; Ellman, B.; Heiner, D.; Ibrahim, G.; Jones, A.; Nibbe, M.; Thompson, J.; Mudra, Petr; Pokorný, Vít; Poncar, Pavel; Ženíšek, Karel

    2001-01-01

    Roč. 66, č. 8 (2001), s. 1299-1314 ISSN 0010-0765 Institutional research plan: CEZ:AV0Z4055905 Keywords : automated oligonucleotide synthesizer Subject RIV: CC - Organic Chemistry Impact factor: 0.778, year: 2001

  3. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via the Internet. The system was used for 14C, 10Be, 26Al and 129I measurements
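
    The noise-robust tuning idea can be illustrated with a toy one-dimensional scan that averages repeated readings at each setting before comparing them. This is a simplified stand-in for the multi-dimensional algorithm described above; the simulated detector response and all parameters are hypothetical:

```python
import random

def measure(setting, optimum=3.2, noise=0.05):
    """Simulated noisy detector reading for one ion-optics setting
    (a hypothetical stand-in for a beam-transmission measurement)."""
    return -(setting - optimum) ** 2 + random.gauss(0, noise)

def robust_scan(lo, hi, steps=41, repeats=5):
    """Average repeated readings at each setting so that measurement
    noise does not pick the wrong maximum."""
    best_x, best_y = lo, float("-inf")
    for i in range(steps):
        x = lo + (hi - lo) * i / (steps - 1)
        y = sum(measure(x) for _ in range(repeats)) / repeats
        if y > best_y:
            best_x, best_y = x, y
    return best_x

random.seed(1)
print(round(robust_scan(0.0, 6.0), 2))  # close to the simulated optimum 3.2
```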

  4. Automation of energy demand forecasting

    Science.gov (United States)

    Siddique, Sanzad

    Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
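
    The search-based model selection described above can be sketched as a holdout comparison over a small candidate model space. The two toy models and the demand series below are illustrative only, not the thesis's econometric models:

```python
def mean_model(train):
    """Candidate 1: predict the historical mean."""
    m = sum(train) / len(train)
    return lambda t: m

def linear_model(train):
    """Candidate 2: least-squares linear trend over the training window."""
    n = len(train)
    xs = range(n)
    xbar = (n - 1) / 2
    ybar = sum(train) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, train))
             / sum((x - xbar) ** 2 for x in xs))
    return lambda t: ybar + slope * (t - xbar)

def search(train, holdout):
    """Pick the candidate with the lowest holdout mean absolute error."""
    candidates = {"mean": mean_model, "linear": linear_model}
    def mae(model):
        return sum(abs(model(len(train) + i) - v)
                   for i, v in enumerate(holdout)) / len(holdout)
    return min(candidates, key=lambda name: mae(candidates[name](train)))

demand = [10, 12, 14, 16, 18, 20, 22, 24]   # hypothetical demand signal
print(search(demand[:6], demand[6:]))       # linear
```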

  5. A Fully Automated Penumbra Segmentation Tool

    DEFF Research Database (Denmark)

    Nagenthiraja, Kartheeban; Ribe, Lars Riisgaard; Hougaard, Kristina Dupont

    2012-01-01

    Introduction: Perfusion- and diffusion-weighted MRI (PWI/DWI) is widely used to select patients who are likely to benefit from recanalization therapy. The visual identification of PWI-DWI-mismatch tissue depends strongly on the observer, prompting a need for software that estimates potentially...... salvageable tissue quickly and accurately. We present a fully Automated Penumbra Segmentation (APS) algorithm using PWI and DWI images. We compare automatically generated PWI-DWI mismatch masks to masks outlined manually by experts in 168 patients. Method: The algorithm initially identifies PWI lesions......) at 600·10^-6 mm^2/sec. Due to the nature of thresholding, the ADC mask overestimates the DWI lesion volume; consequently, we initialized a level-set algorithm on the DWI image with the ADC mask as prior knowledge. Combining the PWI and inverted DWI masks then yields the PWI-DWI mismatch mask. Four expert raters...
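
    The final combination step (PWI lesion AND NOT DWI lesion) can be illustrated on a toy grid; the 3x3 masks below are hypothetical, not patient data:

```python
# Toy 3x3 lesion masks (1 = lesion); real masks are 3-D patient volumes.
pwi = [[1, 1, 1],
       [1, 1, 0],
       [0, 0, 0]]   # hypoperfused tissue (PWI lesion)
dwi = [[1, 0, 0],
       [0, 0, 0],
       [0, 0, 0]]   # diffusion lesion (infarct core, from the DWI/ADC mask)

# Mismatch = PWI lesion AND NOT DWI lesion (the inverted-DWI combination)
mismatch = [[bool(p) and not d for p, d in zip(pr, dr)]
            for pr, dr in zip(pwi, dwi)]
volume = sum(sum(row) for row in mismatch)
print(volume)  # 4 mismatch pixels
```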

  6. A new fully automated TLD badge reader

    International Nuclear Information System (INIS)

    Kannan, S.; Ratna, P.; Kulkarni, M.S.

    2003-01-01

    At present, personnel monitoring in India is carried out using a number of manual and semiautomatic TLD badge readers and the BARC TL dosimeter badge designed during 1970. Of late, the manual TLD badge readers have been almost completely replaced by semiautomatic readers with a number of performance improvements, such as the use of hot-gas heating to reduce the readout time considerably, a PC-based design with storage of the glow curve for every dosimeter, on-line dose computation, and printout of dose reports. However, the semiautomatic system suffers from the lack of a machine-readable ID code on the badge, and the physical design of the dosimeter card is not readily compatible with automation. This paper describes a fully automated TLD badge reader developed in the RSS Division, using a new TLD badge with a machine-readable ID code. The new PC-based reader has a built-in reader for reading the ID code, in the form of an array of holes, on the dosimeter card. The reader has a number of self-diagnostic features to ensure a high degree of reliability. (author)

  7. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. The authors refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. A sample Auto-CPP load shape case study is presented, along with a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum savings simultaneously, a total of approximately 2 MW of DR would be available from these twelve sites, which represent about two million ft{sup 2}. The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft{sup 2} of demand reduction. Field demonstrations and economic evaluations are continuing, to pursue increasing penetration of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.
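
    A quick check of the savings arithmetic reported above (about 2 MW maximum and 1 MW average across roughly two million square feet):

```python
floor_area_ft2 = 2_000_000   # "about two million ft2" across twelve sites
max_dr_w = 2_000_000         # ~2 MW if all sites shed simultaneously
avg_dr_w = 1_000_000         # average response, ~1 MW
print(max_dr_w / floor_area_ft2)  # 1.0 W/ft2 (upper end of reported range)
print(avg_dr_w / floor_area_ft2)  # 0.5 W/ft2 (lower end)
```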

  8. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
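
    The intended client behavior, mapping a received DR signal to a pre-programmed strategy with no manual intervention, can be sketched as follows. Note that the JSON field names and level values here are illustrative only and are not the actual OpenADR 1.0 data model:

```python
import json

def handle_event(payload: str) -> str:
    """Map a received DR signal level to a pre-programmed strategy
    (illustrative levels; not the OpenADR schema)."""
    event = json.loads(payload)
    level = event.get("signal_level", 0)
    strategies = {0: "normal", 1: "moderate shed", 2: "high shed"}
    return strategies.get(level, "normal")

msg = json.dumps({"event_id": "evt-1", "signal_level": 2})
print(handle_event(msg))  # high shed
```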

  9. Demands on digital automation; Anforderungen an die Digitale Automation

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, P.

    1995-12-31

    Chapter 12 of the anthology on building control presents the demands on digital automation. The following aspects are discussed: the variety of company philosophies, the demands of customers/investors, the demands of building/room use, the operators, and the point of view of the manufacturers of technical plant. (BWI)

  10. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.
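
    A minimal illustration of one early step in spike extraction, threshold crossing with a refractory window (this is a toy detector, not the clustering algorithm of the paper):

```python
def detect_spikes(trace, threshold, refractory=3):
    """Return sample indices where the trace exceeds the threshold,
    skipping a refractory window after each detection so one spike
    is not counted twice."""
    spikes, i = [], 0
    while i < len(trace):
        if trace[i] > threshold:
            spikes.append(i)
            i += refractory
        else:
            i += 1
    return spikes

trace = [0, 0, 5, 6, 0, 0, 0, 7, 0]   # hypothetical filtered samples
print(detect_spikes(trace, 4))        # [2, 7]
```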

  11. Home Network Technologies and Automating Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2009-12-01

    Over the past several years, interest in large-scale control of peak energy demand and total consumption has increased. While motivated by a number of factors, this interest has been spurred primarily, on the demand side, by the increasing cost of energy and, on the supply side, by the limited ability of utilities to build sufficient electricity generation capacity to meet unrestrained future demand. To address peak electricity use, Demand Response (DR) systems are being proposed to motivate reductions in electricity use through price incentives. DR systems are also designed to shift or curtail energy demand at critical times when the generation, transmission, and distribution systems (i.e. the 'grid') are threatened with instabilities. To be effectively deployed on a large scale, these proposed DR systems need to be automated. Automation will require robust and efficient data communications infrastructures across geographically dispersed markets. The present availability of widespread Internet connectivity and inexpensive, reliable computing hardware, combined with growing confidence in the capabilities of distributed, application-level communications protocols, suggests that now is the time for designing and deploying practical systems. Centralized computer systems that are capable of providing continuous signals to automate customers' reduction of power demand are known as Demand Response Automation Servers (DRAS). The deployment of prototype DRAS systems has already begun, with most initial deployments targeting large commercial and industrial (C & I) customers. An examination of current overall energy consumption by economic sector shows that the C & I market is responsible for roughly half of all energy consumption in the US. On a per-customer basis, large C & I customers clearly have the most to offer - and to gain - by participating in DR programs to reduce peak demand. And, by concentrating on a small number of relatively

  12. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  13. Fully automated MRI-guided robotics for prostate brachytherapy

    International Nuclear Information System (INIS)

    Stoianovici, D.; Vigaru, B.; Petrisor, D.; Muntener, M.; Patriciu, A.; Song, D.

    2008-01-01

    The uncertainties encountered in the deployment of brachytherapy seeds are related to the commonly used ultrasound imager and the basic instrumentation used for the implant. An alternative solution is under development in which a fully automated robot is used to place the seeds according to the dosimetry plan under direct MRI guidance. Incorporation of MRI guidance creates potential for physiological and molecular image-guided therapies. Moreover, MRI-guided brachytherapy also enables re-estimating dosimetry during the procedure, because with MRI the seeds already implanted can be localised. An MRI-compatible robot (MrBot) was developed. The robot is designed for transperineal percutaneous prostate interventions, and customised for fully automated MRI-guided brachytherapy. With different end-effectors, the robot applies to other image-guided interventions of the prostate. The robot is constructed of non-magnetic and dielectric materials and is electricity-free, using pneumatic actuation and optical sensing. A new motor (PneuStep) was purposely developed to set this robot in motion. The robot fits alongside the patient in closed-bore MRI scanners. It is able to stay fully operational during MR imaging without deteriorating the quality of the scan. In vitro, cadaver, and animal tests showed millimetre needle-targeting accuracy and very precise seed placement. The robot was tested without any interference up to 7 T. The robot is the first fully automated robot to function in MRI scanners. Its first application is MRI-guided seed brachytherapy. It is capable of automated, highly accurate needle placement. Extensive testing is in progress prior to clinical trials. Preliminary results show that the robot may become a useful image-guided intervention instrument. (author)

  14. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.
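
    Quantifying rheotaxis from detected fish orientations reduces to comparing each heading against the upstream direction. A hedged sketch of that final scoring step; the angle convention and tolerance below are assumptions, not the paper's parameters:

```python
def rheotaxis_fraction(headings_deg, flow_deg=180.0, tol=45.0):
    """Fraction of fish oriented head-to-current, i.e. within `tol`
    degrees of directly upstream (opposite the flow direction)."""
    def ang_diff(a, b):
        # smallest absolute difference between two angles, in degrees
        return abs((a - b + 180) % 360 - 180)
    upstream = (flow_deg + 180) % 360
    hits = sum(1 for h in headings_deg if ang_diff(h, upstream) <= tol)
    return hits / len(headings_deg)

# Hypothetical headings from automated image analysis of five fish
print(rheotaxis_fraction([0, 10, 350, 90, 180]))  # 0.6
```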

  15. FULLY AUTOMATED IMAGE ORIENTATION IN THE ABSENCE OF TARGETS

    Directory of Open Access Journals (Sweden)

    C. Stamatopoulos

    2012-07-01

    Automated close-range photogrammetric network orientation has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. Over the past decade, automated orientation via feature-based matching (FBM) techniques has attracted renewed research attention in both the photogrammetry and computer vision (CV) communities. This is largely due to advances made towards the goal of automated relative orientation of multi-image networks covering untargeted (markerless) objects. There are now a number of CV-based algorithms, with accompanying open-source software, that can achieve multi-image orientation within narrow-baseline networks. From a photogrammetric standpoint, the results are typically disappointing as the metric integrity of the resulting models is generally poor, or even unknown, while the number of outliers within the image matching and triangulation is large, and generally too large to allow relative orientation (RO) via the commonly used coplanarity equations. On the other hand, there are few examples within the photogrammetric research field of automated markerless camera calibration to metric tolerances, and these too are restricted to narrow-baseline, low-convergence imaging geometry. The objective addressed in this paper is markerless automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. By wide-baseline we imply convergent multi-image configurations with convergence angles of up to around 90°. An associated aim is provision of a fast, fully automated process, which can be performed without user intervention. For this purpose, various algorithms require optimisation to allow parallel processing utilising multiple PC cores and graphics processing units (GPUs).

  16. Fully Automated Deep Learning System for Bone Age Assessment.

    Science.gov (United States)

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-08-01

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision supporting system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.

  17. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Elżbieta Pociask

    2016-01-01

    Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is not identifiable by everyday angiography, triggering the need for a new tool: NIRS-IVUS can visualize plaque characterization in terms of its chemical and morphologic characteristics, and the newly obtained data call for new methods of interpretation. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods. Conclusions. The proposed algorithm performs fully automated lipid pool detection on near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could easily be augmented with newer functions and projects.
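
    LCBI is commonly defined as the number of lipid-positive pixels per 1000 valid chemogram pixels; a sketch under that assumption, on a toy one-dimensional chemogram (real chemograms are 2-D maps):

```python
def lcbi(lipid_mask, valid_mask):
    """Lipid Core Burden Index: lipid-positive pixels per 1000 valid
    chemogram pixels (assumed definition)."""
    valid = sum(valid_mask)
    lipid = sum(1 for l, v in zip(lipid_mask, valid_mask) if l and v)
    return 1000 * lipid / valid

# Toy chemogram row: 10 valid pixels, 2 flagged as lipid
print(lcbi([1, 1, 0, 0, 0, 0, 0, 0, 0, 0], [1] * 10))  # 200.0
```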

  18. Fully automated gynecomastia quantification from low-dose chest CT

    Science.gov (United States)

    Liu, Shuang; Sonnenblick, Emily B.; Azour, Lea; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    Gynecomastia is characterized by the enlargement of male breasts, a common and sometimes distressing condition found in over half of adult men over the age of 44. Although the majority of gynecomastia is physiologic or idiopathic, its occurrence may also be associated with an extensive variety of underlying systemic diseases or drug toxicity. With the recent large-scale implementation of annual lung cancer screening using low-dose chest CT (LDCT), gynecomastia is believed to be a frequent incidental finding on LDCT. A fully automated system for gynecomastia quantification from LDCT is presented in this paper. The whole breast region is first segmented using an anatomy-orientated approach based on the propagation of pectoral muscle fronts in the vertical direction. The subareolar region is then localized, and the fibroglandular tissue within it is measured for the assessment of gynecomastia. The presented system was validated using 454 breast regions from non-contrast LDCT scans of 227 adult men. The ground truth was established by an experienced radiologist by classifying each breast into one of five categorical scores. The automated measurements achieved promising performance for gynecomastia diagnosis, with an AUC of 0.86 for the ROC curve and a statistically significant Spearman correlation of r=0.70 with the expert scores. The system may support early detection as well as the treatment of both gynecomastia and the underlying medical problems, if any, that cause gynecomastia.

  19. A fully automated system for ultrasonic power measurement and simulation according to IEC 61161:2006

    International Nuclear Information System (INIS)

    Costa-Felix, Rodrigo P B; Alvarenga, Andre V; Hekkenberg, Rob

    2011-01-01

    The worldwide-accepted standard for ultrasonic power measurement is IEC 61161, presently in its 2nd edition (2006), but under review. To fulfil its requirements, considering that a radiation force balance is used as the ultrasonic power detector, a large amount of raw data (mass measurements) must be collected as a function of time to perform all necessary calculations and corrections. Uncertainty determination demands further calculation effort on raw and processed data. Although this can be undertaken in the old-fashioned way, using spreadsheets and manual data collection, automation software is often used in metrology to provide a virtually error-free environment for data acquisition and for repetitive calculations and corrections. With that in mind, a fully automated ultrasonic power measurement system was developed and comprehensively tested. A balance with 0.1 mg precision, model CP224S (Sartorius, Germany), was used as the measuring device, and a calibrated continuous-wave ultrasound check source (Precision Acoustics, UK) was the device under test. A 150 ml container filled with degassed water and containing an absorbing target at the bottom was placed on the balance pan. Besides the automation features, a power measurement simulation routine was implemented, conceived as a teaching tool for how ultrasonic power emission behaves in a radiation force balance equipped with an absorbing target. The automation software proved an effective tool for speeding up ultrasonic power measurement, while allowing accurate calculation and attractive graphical partial and final results.
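
    For a perfectly absorbing target, the radiation force balance relates emitted power to the apparent mass change via P = m·g·c, where c is the speed of sound in water (about 1482 m/s at 20 °C). A sketch of just this conversion; real IEC 61161 measurements add further corrections (temperature, buoyancy, target imperfection):

```python
def acoustic_power(delta_mass_kg, c=1482.0, g=9.81):
    """P = m*g*c for a perfectly absorbing target.
    c: speed of sound in water (~1482 m/s at 20 degC).
    Corrections required by IEC 61161 are omitted in this sketch."""
    return delta_mass_kg * g * c

# An apparent mass change of about 68.8 mg corresponds to roughly 1 W
print(round(acoustic_power(68.8e-6), 3))
```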

  20. Fully automated data acquisition, processing, and display in equilibrium radioventriculography

    International Nuclear Information System (INIS)

    Bourguignon, M.H.; Douglass, K.H.; Links, J.M.; Wagner, H.N. Jr.; Johns Hopkins Medical Institutions, Baltimore, MD

    1981-01-01

    A fully automated data acquisition, processing, and display procedure was developed for equilibrium radioventriculography. After a standardized acquisition, the study is automatically analyzed to yield both right and left ventricular time-activity curves. The program first creates a series of edge-enhanced images (difference between squared images and scaled original images). A marker point within each ventricle is then identified as that pixel with maximum counts to the patient's right and left of the count center of gravity of a stroke volume image. Regions of interest are selected on each frame as the first contour of local maxima of the two-dimensional second derivative (pseudo-Laplacian) which encloses the appropriate marker point, using a method developed by Goris. After shifting the left ventricular end-systolic region of interest four pixels to the patient's left, a background region of interest is generated as the crescent-shaped area of the shifted region of interest not intersected by the end systolic region. The average counts/pixel in this background region in the end systolic frame of the original study are subtracted from each pixel in all frames of the gated study. Right and left ventricular time-activity curves are then obtained by applying each region of interest to its corresponding background-subtracted frame, and the ejection fraction, end diastolic, end systolic, and stroke counts determined for both ventricles. In fourteen consecutive patients, in addition to the automatic ejection fractions, manually drawn regions of interest were used to obtain ejection fractions for both ventricles. The manual regions of interest were drawn twice, and the average obtained. (orig./TR)
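
    The ejection fraction from background-subtracted end-diastolic and end-systolic counts can be sketched as follows; the count values and region sizes below are hypothetical:

```python
def ejection_fraction(ed_counts, es_counts, bkg_per_pixel,
                      ed_pixels, es_pixels):
    """EF from background-subtracted end-diastolic (ED) and
    end-systolic (ES) region-of-interest counts."""
    ed = ed_counts - bkg_per_pixel * ed_pixels
    es = es_counts - bkg_per_pixel * es_pixels
    return (ed - es) / ed

# Hypothetical left-ventricular ROI counts and background level
print(round(ejection_fraction(12000, 6000, 10, 300, 200), 2))  # 0.56
```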

  1. An international crowdsourcing study into people's statements on fully automated driving

    NARCIS (Netherlands)

    Bazilinskyy, P.; Kyriakidis, M.; de Winter, J.C.F.; Ahram, Tareq; Karwowski, Waldemar; Schmorrow, Dylan

    2015-01-01

    Fully automated driving can potentially provide enormous benefits to society. However, it has been unclear whether people will appreciate such far-reaching technology. This study investigated anonymous textual comments regarding fully automated driving, based on data extracted from three online

  2. Northwest Open Automated Demand Response Technology Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann

    2009-08-01

    Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology demonstration and evaluation for Bonneville Power Administration (BPA) in Seattle City Light's (SCL) service territory. This report summarizes the process and results of deploying open automated demand response (OpenADR) in the Seattle area with winter-morning-peaking commercial buildings. The field tests were designed to evaluate the feasibility of deploying fully automated demand response (DR) in four to six sites in the winter and the savings from various building systems. The project started in November of 2008 and lasted 6 months. The methodology for the study included site recruitment, control strategy development, automation system deployment and enhancements, and evaluation of the sites' participation in DR test events. LBNL subcontracted McKinstry and Akuacom for this project. McKinstry assisted with recruitment, site survey collection, strategy development, and overall participant and control vendor management. Akuacom established a new server and enhanced its operations to allow for scheduling winter morning day-of and day-ahead events. Each site signed a Memorandum of Agreement with SCL. SCL offered each site $3,000 for agreeing to participate in the study and an additional $1,000 for each event in which they participated. Each facility and their control vendor worked with LBNL and McKinstry to select and implement control strategies for DR and developed their automation based on the existing Internet connectivity and building control system. Once the DR strategies were programmed, McKinstry commissioned them before actual test events. McKinstry worked with LBNL to identify control points that could be archived at each facility. For each site LBNL collected meter data and trend logs from the energy management and control system. The communication system allowed the sites to receive day-ahead as well as day-of DR test event signals. Measurement of DR was

  3. Intention to use a fully automated car: attitudes and a priori acceptability

    OpenAIRE

    PAYRE, William; CESTAC, Julien; DELHOMME, Patricia

    2014-01-01

    While previous research has studied the acceptability of partially or highly automated driving, few studies have focused on fully automated driving (FAD), including the ability to master longitudinal control, lateral control and maneuvers. The present study analyzes a priori acceptability, attitudes, personality traits and intention to use a fully automated vehicle. 421 French drivers (153 males, M = 40.2 years, age range 19-73) answered an online questionnaire. 68.1% of the sample a priori accepted FAD. P...

  4. Opportunities for Automated Demand Response in California Agricultural Irrigation

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-01

    Pumping water for agricultural irrigation represents a significant share of California’s annual electricity use and peak demand. It also represents a large source of potential flexibility, as farms possess a form of storage in their wetted soil. By carefully modifying their irrigation schedules, growers can participate in demand response without adverse effects on their crops. This report describes the potential for participation in demand response and automated demand response by agricultural irrigators in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use in California. Typical on-farm controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Case studies of demand response programs in California and across the country are reviewed, and their results along with overall California demand estimates are used to estimate statewide demand response potential. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  5. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-01-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan created using standard clinical procedures. Whereas iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine-tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing of the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  6. Automated mammographic breast density estimation using a fully convolutional network.

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M

    2018-03-01

    The purpose of this study was to develop a fully automated algorithm for mammographic breast density estimation using deep learning. Our algorithm used a fully convolutional network, which is a deep learning framework for image segmentation, to segment both the breast and the dense fibroglandular areas on mammographic images. Using the segmented breast and dense areas, our algorithm computed the breast percent density (PD), which is the fraction of dense area in a breast. Our dataset included full-field digital screening mammograms of 604 women, comprising 1208 mediolateral oblique (MLO) and 1208 craniocaudal (CC) views. We allocated 455, 58, and 91 of the 604 women and their exams into training, testing, and validation datasets, respectively. We established ground truth for the breast via manual segmentation, and for the dense fibroglandular areas via a simple thresholding segmentation based on BI-RADS density assessments by radiologists. Using the mammograms and ground truth, we fine-tuned a pretrained deep learning network to segment both the breast and the fibroglandular areas. Using the validation dataset, we evaluated the performance of the proposed algorithm against radiologists' BI-RADS density assessments. Specifically, we conducted a correlation analysis between the BI-RADS density assessment of a given breast and its corresponding PD estimate by the proposed algorithm. In addition, we evaluated our algorithm in terms of its ability to classify BI-RADS density using PD estimates, and its ability to provide consistent PD estimates for the left and the right breast and for the MLO and CC views of the same women. To show the effectiveness of our algorithm, we compared its performance against a state-of-the-art algorithm, the Laboratory for Individualized Breast Radiodensity Assessment (LIBRA). The PD estimated by our algorithm correlated well with BI-RADS density ratings by radiologists.
Pearson's rho values of
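Once the two segmentations exist, the percent-density definition used above (fraction of the breast area that is dense) is a simple mask ratio. A minimal sketch, assuming binary segmentation masks (names are illustrative):

```python
import numpy as np

def percent_density(breast_mask, dense_mask):
    """Breast percent density (PD): the percentage of the segmented breast
    area occupied by the segmented dense fibroglandular area."""
    breast = np.asarray(breast_mask, dtype=bool)
    # Dense tissue is only counted where it lies inside the breast mask.
    dense = np.asarray(dense_mask, dtype=bool) & breast
    breast_pixels = breast.sum()
    if breast_pixels == 0:
        raise ValueError("empty breast mask")
    return 100.0 * dense.sum() / breast_pixels
```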

  7. Detection of virus-specific intrathecally synthesised immunoglobulin G with a fully automated enzyme immunoassay system

    Directory of Open Access Journals (Sweden)

    Weissbrich Benedikt

    2007-05-01

    Background: The determination of virus-specific immunoglobulin G (IgG) antibodies in cerebrospinal fluid (CSF) is useful for the diagnosis of virus-associated diseases of the central nervous system (CNS) and for the detection of a polyspecific intrathecal immune response in patients with multiple sclerosis. Quantification of virus-specific IgG in the CSF is frequently performed by calculation of a virus-specific antibody index (AI). Determination of the AI is a demanding and labour-intensive technique, and therefore automation is desirable. We evaluated the precision and the diagnostic value of a fully automated enzyme immunoassay for the detection of virus-specific IgG in serum and CSF using the BEP2000 analyser (Dade Behring). Methods: The AI for measles, rubella, varicella-zoster, and herpes simplex virus IgG was determined from pairs of serum and CSF samples of patients with viral CNS infections, multiple sclerosis and of control patients. CSF and serum samples were tested simultaneously with reference to a standard curve. Starting dilutions were 1:6 and 1:36 for CSF and 1:1386 and 1:8316 for serum samples. Results: The interassay coefficient of variation was below 10% for all parameters tested. There was good agreement between AIs obtained with the BEP2000 and AIs derived from the semi-automated reference method. Conclusion: Determination of virus-specific IgG in serum-CSF pairs for calculation of the AI has been successfully automated on the BEP2000. Current limitations of the assay layout imposed by the analyser software should be solved in future versions to offer more convenience in comparison to manual or semi-automated methods.
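The abstract does not spell out the AI formula; a sketch using the commonly cited definition (the specific-IgG CSF/serum quotient divided by the total-IgG quotient, with an optional Reiber-style barrier limit) might look like this. All parameter names are illustrative:

```python
def antibody_index(spec_csf, spec_serum, total_csf, total_serum, q_lim=None):
    """Virus-specific antibody index, commonly defined as AI = Q_spec / Q_IgG,
    where Q_spec = spec_csf / spec_serum and Q_IgG = total_csf / total_serum.
    If a blood-CSF barrier limit q_lim is supplied and Q_IgG exceeds it,
    q_lim replaces Q_IgG (Reiber correction). Values above roughly 1.5 are
    conventionally taken to suggest intrathecal synthesis."""
    q_spec = spec_csf / spec_serum
    q_igg = total_csf / total_serum
    if q_lim is not None and q_igg > q_lim:
        q_igg = q_lim  # barrier-corrected denominator
    return q_spec / q_igg
```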

  8. Opportunities for Automated Demand Response in California Wastewater Treatment Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wray, Craig [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    Previous research over a period of six years has identified wastewater treatment facilities as good candidates for demand response (DR), automated demand response (Auto-DR), and Energy Efficiency (EE) measures. This report summarizes that work, including the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and automated demand response opportunities. Furthermore, this report summarizes the DR potential of three wastewater treatment facilities. In particular, Lawrence Berkeley National Laboratory (LBNL) has collected data at these facilities from control systems, submetered process equipment, utility electricity demand records, and governmental weather stations. The collected data were then used to generate a summary of wastewater power demand, factors affecting that demand, and demand response capabilities. These case studies show that facilities that have implemented energy efficiency measures and that have centralized control systems are well suited to shed or shift electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. In summary, municipal wastewater treatment energy demand in California is large, and energy-intensive equipment offers significant potential for automated demand response. In particular, large load reductions were achieved by targeting effluent pumps and centrifuges. One of the limiting factors to implementing demand response is the reaction of effluent turbidity to reduced aeration at an earlier stage of the process. Another limiting factor is that cogeneration capabilities of municipal facilities, including existing power purchase agreements and utility receptiveness to purchasing electricity from cogeneration facilities, limit a facility’s potential to participate in other DR activities.

  9. Didactical And Ethics Demands For Automated Pedagogical Diagnostics.

    Directory of Open Access Journals (Sweden)

    O. Kolgatin

    2009-06-01

    Didactic demands for pedagogical diagnostics, and the specific character of their realisation under conditions of active ICT use in university instruction, are analysed. The ethical questions of pedagogical diagnostics are considered, and the ethical aspects connected with the use of automated pedagogical diagnostic systems are underlined.

  10. Findings from Seven Years of Field Performance Data for Automated Demand Response in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Mathieu, Johanna; Parrish, Kristen

    2010-05-14

    California is a leader in automating demand response (DR) to promote low-cost, consistent, and predictable electric grid management tools. Over 250 commercial and industrial facilities in California participate in fully-automated programs providing over 60 MW of peak DR savings. This paper presents a summary of Open Automated DR (OpenADR) implementation by each of the investor-owned utilities in California. It provides a summary of participation, DR strategies, and incentives. Commercial buildings can reduce peak demand from 5% to 15%, with an average of 13%. Industrial facilities shed much higher loads. For buildings with multi-year savings, we evaluate their load variability and shed variability. We provide a summary of control strategies deployed, along with costs to install automation. We report on how the electric DR control strategies perform over many years of events. We benchmark the peak demand of this sample of buildings against their past baselines to understand the differences in building performance over the years. This is done with peak demand intensities and load factors. The paper also describes the importance of these data in helping to understand possible techniques to reach net zero energy using peak day dynamic control capabilities in commercial buildings. We present an example in which the electric load shape changed as a result of a lighting retrofit.
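The benchmarking metrics named above (peak demand intensity and load factor) and the percent shed achieved during an event are simple ratios; a sketch with illustrative names, not the authors' actual tooling:

```python
import numpy as np

def benchmark(load_kw, floor_area_m2):
    """Peak-day benchmarking metrics used to compare buildings across years:
    peak demand intensity (W/m^2) and load factor (average load / peak load)."""
    load = np.asarray(load_kw, dtype=float)
    peak = load.max()
    return {
        "peak_kw": peak,
        "peak_intensity_w_per_m2": 1000.0 * peak / floor_area_m2,
        "load_factor": load.mean() / peak,
    }

def shed_percent(baseline_kw, event_kw):
    """Demand reduction during a DR event, as a percent of the baseline demand."""
    return 100.0 * (baseline_kw - event_kw) / baseline_kw
```

A flat load shape gives a load factor near 1; a peaky one gives a low load factor, which is why the two metrics together describe how a building's performance changes over years of events.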

  11. LV challenge LKEB contribution : fully automated myocardial contour detection

    NARCIS (Netherlands)

    Wijnhout, J.S.; Hendriksen, D.; Assen, van H.C.; Geest, van der R.J.

    2009-01-01

    In this paper a contour detection method is described and evaluated on the evaluation data sets of the Cardiac MR Left Ventricle Segmentation Challenge, part of MICCAI 2009's 3D Segmentation Challenge for Clinical Applications. The proposed method, using 2D AAM and 3D ASM, performs a fully

  12. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
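The stutter-removal idea can be illustrated as a linear deconvolution: if each true allele is assumed to produce scaled stutter bands at shorter repeat lengths, the observed band profile is a linear mix of the true allele amounts and can be unmixed by solving a banded system. The kernel values below are invented for illustration; the paper's actual deconvolution method is more elaborate:

```python
import numpy as np

def remove_stutter(observed, stutter=(1.0, 0.3, 0.1)):
    """Deconvolve PCR stutter from a microsatellite band profile.

    A true allele in bin j is modeled as contributing stutter[k] * amount
    to bin j - k (bands k repeats *shorter*).  The observed profile is then
    y = S @ x, and the true allele amounts x are recovered by solving it.
    """
    y = np.asarray(observed, dtype=float)
    n = len(y)
    S = np.zeros((n, n))
    for j in range(n):                 # column j: response of an allele in bin j
        for k, w in enumerate(stutter):
            if j - k >= 0:
                S[j - k, j] = w
    x = np.linalg.solve(S, y)          # S has unit diagonal, so it is invertible
    return np.where(x > 1e-9, x, 0.0)  # clip numerical noise to zero
```

With two closely spaced alleles the observed bands overlap, but the solve still separates them because the mixing is linear.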

  13. Enabling Automated Dynamic Demand Response: From Theory to Practice

    Energy Technology Data Exchange (ETDEWEB)

    Frincu, Marc; Chelmis, Charalampos; Aman, Saima; Saeed, Rizwan; Zois, Vasileios; Prasanna, Viktor

    2015-07-14

    Demand response (DR) is a technique used in smart grids to shape customer load during peak hours. Automated DR offers utilities fine-grained control and a high degree of confidence in the outcome. However, the impact on the customer's comfort means this technique is more suited for industrial and commercial settings than for residential homes. In this paper we propose a system for achieving automated controlled DR in a heterogeneous environment. We present some of the main issues arising in building such a system, including privacy, customer satisfiability, reliability, and fast decision turnaround, with emphasis on the solutions we proposed. Based on the lessons we learned from empirical results, we describe an integrated automated system for controlled DR on the USC microgrid. Results show that while on a per-building, per-event basis the accuracy of our prediction and customer selection techniques varies, it performs well on average when considering several events and buildings.

  14. Open Automated Demand Response for Small Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Dudley, June Han; Piette, Mary Ann; Koch, Ed; Hennage, Dan

    2009-05-01

    This report characterizes small commercial buildings by market segments, systems and end-uses; develops a framework for identifying demand response (DR) enabling technologies and communication means; and reports on the design and development of a low-cost OpenADR enabling technology that delivers demand reductions as a percentage of the total predicted building peak electric demand. The results show that small offices, restaurants and retail buildings are the major contributors, making up over one third of the small commercial peak demand. The majority of the small commercial buildings in California are located in southern inland areas and the Central Valley. Single-zone packaged units with manual and programmable thermostat controls make up the majority of heating, ventilation and air conditioning (HVAC) systems for small commercial buildings with less than 200 kW peak electric demand. Fluorescent tubes with magnetic ballasts and manual controls dominate this customer group's lighting systems. There are various ways to communicate with these systems, each with pros and cons for a particular application, and three methods to enable automated DR in small commercial buildings using the Open Automated Demand Response (OpenADR) communications infrastructure. Development of DR strategies must consider building characteristics, such as weather sensitivity and load variability, as well as system design (i.e., under-sizing, under-lighting, over-sizing, etc.). Finally, field tests show that requesting demand reductions as a percentage of the total predicted building peak electric demand is feasible using the OpenADR infrastructure.
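Requesting a reduction as a percentage of predicted peak reduces to a simple conversion once a peak prediction exists. The sketch below uses a deliberately naive baseline (mean of recent daily peaks); the report does not specify the actual prediction model, so treat both functions as illustrative:

```python
import numpy as np

def predicted_peak_kw(recent_daily_peaks_kw):
    """Placeholder peak prediction: the average of recent daily peak demands.
    A real deployment would substitute its own baseline model here."""
    return float(np.mean(recent_daily_peaks_kw))

def shed_target_kw(recent_daily_peaks_kw, requested_pct):
    """Translate a DR request expressed as a percentage of the total predicted
    building peak into an absolute load-shed target in kW."""
    return predicted_peak_kw(recent_daily_peaks_kw) * requested_pct / 100.0
```

The appeal of percentage-based requests is that the same signal can be sent to a heterogeneous population of small buildings, with each site's automation resolving it into its own kW target.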

  15. FULLY AUTOMATED GENERATION OF ACCURATE DIGITAL SURFACE MODELS WITH SUB-METER RESOLUTION FROM SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    J. Wohlfeil

    2012-07-01

    Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, like high geometric accuracy, which are essential for ensuring the high quality of the resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem due to the continually increasing demand for such surface models. The tedious work includes partly or fully manual selection of tie- and/or ground control points for ensuring the required accuracy of the relative orientation of images for stereo matching. It also includes masking of large water areas that seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can seriously reduce the processing time for stereo matching. In this paper an approach is presented that allows performing all these steps fully automatically. It includes very robust and precise tie point selection, enabling the accurate calculation of the images’ relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView are presented as proof of the robustness and reliability of the proposed method.

  16. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    Science.gov (United States)

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  17. A Framework for Fully Automated Performance Testing for Smart Buildings

    DEFF Research Database (Denmark)

    Markoska, Elena; Johansen, Aslak; Lazarova-Molnar, Sanja

    2018-01-01

    A significant proportion of energy consumption by buildings worldwide, estimated at ca. 40%, has yielded a high importance to studying buildings’ performance. Performance testing is a means by which buildings can be continuously commissioned to ensure that they operate as designed. Historically, setup of performance tests has been manual and labor-intensive and has required intimate knowledge of buildings’ complexity and systems. The emergence of the concept of smart buildings has provided an opportunity to overcome this restriction. In this paper, we propose a framework for automated performance testing of smart buildings that utilizes metadata models. The approach features automatic detection of applicable performance tests using metadata queries and their corresponding instantiation, as well as continuous commissioning based on metadata. The presented approach has been implemented...
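The metadata-driven detection step can be sketched as set containment between a test's required points and the points a building's metadata model exposes. The test and point names below are invented for illustration; a real system would query a building metadata schema rather than a dictionary:

```python
# Each performance test declares the point types it needs; a test is
# "applicable" to a building whose metadata model exposes all of them.
TESTS = {
    "heating_coil_leakage": {"supply_air_temp", "heating_coil_valve"},
    "simultaneous_heating_cooling": {"heating_coil_valve", "cooling_coil_valve"},
    "co2_ventilation": {"zone_co2", "outdoor_air_damper"},
}

def applicable_tests(building_points):
    """Return the names of performance tests whose required points
    all exist in the building's metadata model."""
    points = set(building_points)
    return sorted(name for name, required in TESTS.items() if required <= points)
```

Instantiation would then bind each detected test to the concrete sensor and actuator points returned by the metadata query, so the same test definitions run unchanged across buildings.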

  18. Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cheung, Iris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Li, Becky Z [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion of residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include the costs to the building or facility manager as well as to the utility or third-party program manager.

  19. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. A baseline in the spectrum signal can induce uneven amplitude shifts across different wavenumbers and lead to bad results. Therefore, these amplitude shifts should be compensated before further analysis. Many algorithms are used to remove the baseline; however, a fully automated baseline correction is convenient in practical application. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly introduced fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
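A simplified stand-in for the segment-interpolation step might look like the sketch below. It uses segment minima as baseline anchors instead of the paper's wavelet feature points, so it illustrates only the interpolate-and-subtract idea, not AWFPSI itself:

```python
import numpy as np

def baseline_correct(signal, n_segments=20):
    """Estimate and subtract a baseline by piecewise-linear interpolation.

    The spectrum is split into segments, the minimum of each segment is
    taken as a baseline anchor (a crude surrogate for wavelet feature
    points), a baseline is linearly interpolated through the anchors,
    and the baseline is subtracted from the signal.
    """
    y = np.asarray(signal, dtype=float)
    n_segments = max(1, min(n_segments, len(y)))
    segments = np.array_split(np.arange(len(y)), n_segments)
    anchors_x = np.array([seg[np.argmin(y[seg])] for seg in segments])
    anchors_y = y[anchors_x]
    baseline = np.interp(np.arange(len(y)), anchors_x, anchors_y)
    return y - baseline, baseline
```

On a slowly varying baseline with sparse peaks, the segment minima mostly fall on baseline rather than on peaks, which is why this family of methods works without any per-spectrum tuning.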

  20. Improving reticle defect disposition via fully automated lithography simulation

    Science.gov (United States)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns, as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image. One such method is the ADAS defect simulation system [1]. Up until now, using ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we have created a scanner parameter database that is automatically identified from mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold, waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in

  1. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  2. A Fully Automated Classification for Mapping the Annual Cropland Extent

    Science.gov (United States)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Indeed, accurate and reliable information on cropland and the location of major crop types is required to make future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid regions, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Space-borne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agriculture monitoring and climate modelling, cropland maps serve as a mask to isolate agricultural land (i) for time-series analysis for crop condition monitoring and (ii) to investigate how cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite these efforts, cropland is generally one of the classes with the poorest accuracy, which makes its use for agriculture difficult. This research aims at improving cropland delineation from the local scale to the regional and global scales, as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data are extracted from available baseline land cover maps. The method delivers cropland maps with high accuracy over contrasted agro-systems in Ukraine, Argentina, China and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. Besides, it was found that the cropland class is associated with a low uncertainty.
The temporal features
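The temporal-feature idea in the abstract above can be sketched with a toy example: summarizing a per-pixel vegetation-index time series into a handful of crop-sensitive statistics. The five features the paper actually designed are not specified here, so the features below (peak, baseline, amplitude, peak timing, mean) are illustrative assumptions only.

```python
# Hypothetical sketch: deriving simple per-pixel temporal features from an
# NDVI time series, in the spirit of the crop-targeted features mentioned
# in the abstract (the paper's actual feature definitions are not given here).

def temporal_features(ndvi_series):
    """Summarize one pixel's NDVI time series into candidate crop features."""
    n = len(ndvi_series)
    maximum = max(ndvi_series)
    minimum = min(ndvi_series)
    return {
        "max": maximum,                  # peak greenness
        "min": minimum,                  # off-season baseline
        "amplitude": maximum - minimum,  # seasonal contrast, high for crops
        "time_of_max": ndvi_series.index(maximum) / (n - 1),  # normalized peak timing
        "mean": sum(ndvi_series) / n,    # overall vegetation level
    }

# A crop-like pixel: strong green-up and senescence within the season.
crop = temporal_features([0.2, 0.3, 0.6, 0.8, 0.7, 0.4, 0.2])
# A bare-soil pixel: flat, low NDVI all year.
soil = temporal_features([0.15, 0.16, 0.15, 0.14, 0.15, 0.16, 0.15])
print(crop["amplitude"] > soil["amplitude"])  # crops show larger amplitude
```

A classifier trained on baseline land-cover maps would then consume such features instead of raw reflectance time series.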

  3. Accurate, fully-automated NMR spectral profiling for metabolomics.

    Directory of Open Access Journals (Sweden)

    Siamak Ravanbakhsh

    Full Text Available Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra involving >50 compounds, show that BAYESIL can autonomously find the concentrations of NMR-detectable metabolites accurately (~90% correct identification and ~10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully automatic, publicly accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. 
We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of

  4. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operation. Processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiments via a Web browser. (author)

  5. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline and automated data reduction...... and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  6. A fully automated microfluidic femtosecond laser axotomy platform for nerve regeneration studies in C. elegans.

    Science.gov (United States)

    Gokce, Sertan Kutal; Guo, Samuel X; Ghorashian, Navid; Everett, W Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy process, which is crucial for high throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner.

  7. Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA

    NARCIS (Netherlands)

    Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald

    2017-01-01

    To study fully automated digital joint space width (JSW) and bone mineral density (BMD) in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the

  8. Open Automated Demand Response Communications in Demand Response for Wholesale Ancillary Services

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Ghatikar, Girish; Koch, Ed; Hennage, Dan; Hernandez, John; Chiu, Albert; Sezgen, Osman; Goodin, John

    2009-11-06

    The Pacific Gas and Electric Company (PG&E) is conducting a pilot program to investigate the technical feasibility of bidding certain demand response (DR) resources into the California Independent System Operator's (CAISO) day-ahead market for ancillary services non-spinning reserve. Three facilities, a retail store, a local government office building, and a bakery, were recruited into the pilot program. For each facility, hourly demand and load curtailment potential are forecast two days ahead and submitted to the CAISO the day before operation as an available resource. These DR resources are optimized against all other generation resources in the CAISO ancillary services market. Each facility is equipped with four-second real-time telemetry equipment to ensure resource accountability and visibility to CAISO operators. When the CAISO requests DR resources, PG&E's OpenADR (Open Automated DR) communications infrastructure is utilized to deliver DR signals to the facilities' energy management and control systems (EMCS). The pre-programmed DR strategies are triggered without a human in the loop. This paper describes the automated system architecture and the flow of information to trigger and monitor the performance of the DR events. We outline the DR strategies at each of the participating facilities. At one site, a real-time electric measurement feedback loop is implemented to assure the delivery of CAISO-dispatched demand reductions. Finally, we present results from each of the facilities and discuss findings.
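The "no human in the loop" dispatch pattern described above can be sketched in a few lines: a DR signal arrives over the communications infrastructure and a pre-programmed load-shed strategy is translated into EMCS commands. The signal levels, setpoint names, and shed values below are illustrative assumptions, not the OpenADR schema or the actual facility strategies.

```python
# Minimal sketch of automated DR dispatch: a received signal level triggers a
# pre-programmed shed strategy with no human intervention. Signal names and
# strategies are invented for illustration (not the OpenADR message format).

SHED_STRATEGIES = {
    "MODERATE": [("global_temp_setpoint", 2.0)],          # degrees F setback
    "HIGH":     [("global_temp_setpoint", 4.0),
                 ("dim_lighting_fraction", 0.30)],
}

def on_dr_signal(level, emcs_commands):
    """Translate a received DR level into EMCS commands."""
    if level == "NORMAL":
        return []                       # no event: nothing to shed
    actions = SHED_STRATEGIES.get(level, [])
    for point, value in actions:
        emcs_commands.append((point, value))
    return actions

issued = []
on_dr_signal("HIGH", issued)
print(issued)  # [('global_temp_setpoint', 4.0), ('dim_lighting_fraction', 0.3)]
```

A facility manager's "opt out" (described in the report header) would simply suppress the dispatch for that event before any command is issued.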

  9. Fuzzy inventory model for deteriorating items, with time depended demand, shortages, and fully backlogging

    OpenAIRE

    Wasim Akram Mandal; Sahidul Islam

    2016-01-01

    This paper analyzes a fuzzy inventory system for a deteriorating item with time-dependent demand. Shortages are allowed and fully backlogged. Fixed cost, deterioration cost, shortage cost and holding cost are the costs considered in this model. Fuzziness is applied by allowing the cost components (holding cost, deterioration cost, shortage cost, etc.) to be fuzzy. In the fuzzy environment, all required parameters are considered to be triangular fuzzy numbers. One numerical solution of the model is obtaine...
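The triangular fuzzy numbers the abstract relies on can be shown concretely: a fuzzy cost (a, b, c) has full membership at b and linearly decaying membership toward a and c, and is often defuzzified by its centroid (a + b + c) / 3. The holding-cost values below are invented for illustration; the paper's actual parameters and defuzzification method are not given here.

```python
# Sketch of a triangular fuzzy number as used in fuzzy inventory models:
# membership peaks at b and falls linearly to zero at a and c; the centroid
# (a + b + c) / 3 gives one common crisp (defuzzified) value.

class TriFuzzy:
    def __init__(self, a, b, c):
        assert a <= b <= c
        self.a, self.b, self.c = a, b, c

    def membership(self, x):
        """Degree (0..1) to which x belongs to the fuzzy number."""
        if self.a < x <= self.b:
            return (x - self.a) / (self.b - self.a)
        if self.b < x < self.c:
            return (self.c - x) / (self.c - self.b)
        return 1.0 if x == self.b else 0.0

    def defuzzify(self):
        """Centroid of a triangular membership function."""
        return (self.a + self.b + self.c) / 3.0

holding_cost = TriFuzzy(1.5, 2.0, 2.5)   # cost per unit per period (invented)
print(holding_cost.defuzzify())          # 2.0
print(holding_cost.membership(1.75))     # 0.5
```

In the model, each cost component (holding, deterioration, shortage) would be such a triple, and the total-cost expression is evaluated and then defuzzified.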

  10. A Distributed Intelligent Automated Demand Response Building Management System

    Energy Technology Data Exchange (ETDEWEB)

    Auslander, David [Univ. of California, Berkeley, CA (United States); Culler, David [Univ. of California, Berkeley, CA (United States); Wright, Paul [Univ. of California, Berkeley, CA (United States); Lu, Yan [Siemens Corporate Research Inc., Princeton, NJ (United States); Piette, Mary [Univ. of California, Berkeley, CA (United States)

    2013-03-31

    The goal of the 2.5-year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as in typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (OpenADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control, and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and the faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires, with Building Management System-based scheduling for open areas and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated at 1924 MWh. The maximum peak load

  11. The future of fully automated vehicles : opportunities for vehicle- and ride-sharing, with cost and emissions savings.

    Science.gov (United States)

    2014-08-01

    Fully automated or autonomous vehicles (AVs) hold great promise for the future of transportation. By 2020, Google, auto manufacturers and other technology providers intend to introduce self-driving cars to the public with either limited or fully a...

  12. A fully automated fast analysis system for capillary gas chromatography. Part 1. Automation of system control

    NARCIS (Netherlands)

    Snijders, H.M.J.; Rijks, J.P.E.M.; Bombeeck, A.J.; Rijks, J.A.; Sandra, P.; Lee, M.L.

    1992-01-01

    This paper deals with the design, automation and evaluation of a high-speed capillary gas chromatographic system. A combination of software and hardware was developed for a new cold trap/reinjection device that allows selective solvent elimination and on-column sample enrichment and an

  13. Automated Dynamic Demand Response Implementation on a Micro-grid

    Energy Technology Data Exchange (ETDEWEB)

    Kuppannagari, Sanmukh R.; Kannan, Rajgopal; Chelmis, Charalampos; Prasanna, Viktor K.

    2016-11-16

    In this paper, we describe a system for real-time automated Dynamic and Sustainable Demand Response with sparse data consumption prediction implemented on the University of Southern California campus microgrid. Supply side approaches to resolving energy supply-load imbalance do not work at high levels of renewable energy penetration. Dynamic Demand Response (D2R) is a widely used demand-side technique to dynamically adjust electricity consumption during peak load periods. Our D2R system consists of accurate machine learning based energy consumption forecasting models that work with sparse data coupled with fast and sustainable load curtailment optimization algorithms that provide the ability to dynamically adapt to changing supply-load imbalances in near real-time. Our Sustainable DR (SDR) algorithms attempt to distribute customer curtailment evenly across sub-intervals during a DR event and avoid expensive demand peaks during a few sub-intervals. It also ensures that each customer is penalized fairly in order to achieve the targeted curtailment. We develop near linear-time constant-factor approximation algorithms along with Polynomial Time Approximation Schemes (PTAS) for SDR curtailment that minimizes the curtailment error defined as the difference between the target and achieved curtailment values. Our SDR curtailment problem is formulated as an Integer Linear Program that optimally matches customers to curtailment strategies during a DR event while also explicitly accounting for customer strategy switching overhead as a constraint. We demonstrate the results of our D2R system using real data from experiments performed on the USC smartgrid and show that 1) our prediction algorithms can very accurately predict energy consumption even with noisy or missing data and 2) our curtailment algorithms deliver DR with extremely low curtailment errors in the 0.01-0.05 kWh range.
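The curtailment-matching objective described above, assigning customers to curtailment strategies so the achieved reduction tracks a target, can be conveyed with a toy greedy pass. The paper formulates this as an Integer Linear Program with approximation schemes and switching-overhead constraints; the sketch below only illustrates the curtailment-error objective, and all customer names and kWh values are invented.

```python
# Toy greedy sketch of sustainable DR curtailment matching: pick one strategy
# per customer so total achieved curtailment approaches the target, minimizing
# the curtailment error |target - achieved|. The actual system uses an ILP
# with PTAS/constant-factor approximations, not this greedy heuristic.

def greedy_curtailment(customers, target_kwh):
    """customers: {name: [kWh reduction of each available strategy]}."""
    achieved, plan = 0.0, {}
    for name, strategies in customers.items():
        remaining = target_kwh - achieved
        # Pick the strategy whose reduction best matches what is still needed
        # (0.0 models "do nothing" and is always available).
        best = min(strategies + [0.0], key=lambda s: abs(remaining - s))
        plan[name] = best
        achieved += best
    return plan, abs(target_kwh - achieved)

customers = {
    "office_A": [0.0, 0.5, 1.2],   # kWh per sub-interval, per strategy
    "lab_B":    [0.0, 0.8],
    "shop_C":   [0.0, 0.3, 0.6],
}
plan, error = greedy_curtailment(customers, target_kwh=2.0)
print(plan, round(error, 2))
```

Distributing the target evenly across sub-intervals (the "sustainable" part) would repeat this matching per sub-interval with fairness constraints across customers.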

  14. Fully automated segmentation of callus by micro-CT compared to biomechanics.

    Science.gov (United States)

    Bissinger, Oliver; Götz, Carolin; Wolff, Klaus-Dietrich; Hapfelmeier, Alexander; Prodinger, Peter Michael; Tischer, Thomas

    2017-07-11

    A high percentage of closed femur fractures have slight comminution. Using micro-CT (μCT), segmentation of multiple fragments is much more difficult than segmentation of unfractured or osteotomied bone. Manual or semi-automated segmentation has been performed to date. However, such segmentation is extremely laborious, time-consuming and error-prone. Our aim was therefore to apply a fully automated segmentation algorithm to determine μCT parameters and examine their association with biomechanics. The femora of 64 rats, taken after randomised administration of medication that was either inhibitory or neutral in its effect on fracture healing, and of controls, were subjected to closed fracture after a Kirschner wire was inserted. After 21 days, μCT and biomechanical parameters were determined by a fully automated method and correlated (Pearson's correlation). The fully automated segmentation algorithm automatically detected bone and simultaneously separated cortical bone from callus without requiring ROI selection for each single bony structure. We found an association of the structural callus parameters obtained by μCT with the biomechanical properties. However, the results were only explicable by additionally considering the callus location. A large number of slightly comminuted fractures, in combination with therapies that influence the callus qualitatively and/or quantitatively, considerably affects the association between μCT and biomechanics. In the future, contrast-enhanced μCT imaging of the callus cartilage might provide more information to improve the non-destructive and non-invasive prediction of callus mechanical properties. As studies evaluating such important drugs increase, fully automated segmentation appears to be clinically important.
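The association test named above, Pearson's correlation between a μCT callus parameter and a biomechanical outcome, is a one-function computation. The paired specimen values below are invented for illustration; they are not the rat data from the study.

```python
# Sketch of Pearson's correlation as used to relate micro-CT callus
# parameters to biomechanical properties across specimens. Data are invented.

import math

def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

callus_volume_mm3 = [30.0, 42.0, 55.0, 61.0, 75.0]   # illustrative μCT values
failure_load_n    = [18.0, 25.0, 31.0, 33.0, 44.0]   # illustrative biomechanics
print(round(pearson_r(callus_volume_mm3, failure_load_n), 2))
```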

  15. A novel method to determine simultaneously methane production during in vitro gas production using fully automated equipment

    NARCIS (Netherlands)

    Pellikaan, W.F.; Hendriks, W.H.; Uwimanaa, G.; Bongers, L.J.G.M.; Becker, P.M.; Cone, J.W.

    2011-01-01

    An adaptation of fully automated gas production equipment was tested for its ability to simultaneously measure methane and total gas. The simultaneous measurement of gas production and gas composition was not possible using fully automated equipment, as the bottles should be kept closed during the

  16. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Science.gov (United States)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  17. Development of Fully Automated Low-Cost Immunoassay System for Research Applications.

    Science.gov (United States)

    Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien

    2017-10-01

    Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable fully automated low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min with the ability of easy adaptation of new testing targets. The running cost is extremely low due to the nature of automation, as well as reduced material requirements. Details about system configuration, components selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.
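The "real-time calibration" idea above, running calibrators of known concentration alongside the samples, fitting a curve to their signals, and reading unknowns off the inverted curve, can be sketched simply. For brevity this toy fits a straight line in log-log space rather than the 4-parameter logistic typically used for immunoassays, and all concentrations and signals are invented.

```python
# Sketch of real-time immunoassay calibration: fit calibrators, invert the
# curve for unknowns. Uses a log-log linear fit as a simplification of the
# usual 4-parameter logistic; all numbers are illustrative.

import math

def fit_loglog(standards):
    """standards: [(concentration, signal)] -> (slope, intercept) in log space."""
    xs = [math.log10(c) for c, _ in standards]
    ys = [math.log10(s) for _, s in standards]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def read_concentration(signal, slope, intercept):
    """Invert the calibration curve for an unknown sample."""
    return 10 ** ((math.log10(signal) - intercept) / slope)

# Calibrators run alongside the samples, as in real-time calibration:
slope, intercept = fit_loglog([(1.0, 120.0), (10.0, 1100.0), (100.0, 10500.0)])
print(round(read_concentration(1100.0, slope, intercept), 1))  # ~10 (conc. units)
```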

  18. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  19. Fully automated system for Pu measurement by gamma spectrometry of alpha contaminated solid wastes

    International Nuclear Information System (INIS)

    Cresti, P.

    1986-01-01

    A description is given of a fully automated system developed at the Comb/Mepis Laboratories which is based on the detection of specific gamma signatures of Pu isotopes for monitoring the Pu content in 15-25 L containers of low-density (0.1 g/cm³) wastes. The methodological approach is discussed; based on experimental data, an evaluation of the achievable performance (detection limit, precision, accuracy, etc.) is also given

  20. Performance of a fully automated scatterometer for BRDF and BTDF measurements at visible and infrared wavelengths

    International Nuclear Information System (INIS)

    Anderson, S.; Shepard, D.F.; Pompea, S.M.; Castonguay, R.

    1989-01-01

    The general performance of a fully automated scatterometer shows that the instrument can make rapid, accurate BRDF (bidirectional reflectance distribution function) and BTDF (bidirectional transmittance distribution function) measurements of optical surfaces over a range of approximately ten orders of magnitude in BRDF. These measurements can be made for most surfaces even with the detector at the specular angle, because of beam-attenuation techniques. He-Ne and CO2 lasers are used as sources in conjunction with a reference detector and chopper
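The quantity the scatterometer above measures has a standard definition: BRDF is scattered power per unit solid angle, normalized by incident power and the cosine of the scatter angle, BRDF = P_s / (P_i · Ω · cos θ_s). The sketch below computes it directly; the power levels are invented purely to illustrate the roughly ten-orders-of-magnitude dynamic range the abstract mentions.

```python
# Sketch of the BRDF definition: scattered power per steradian, normalized by
# incident power and cos(scatter angle). Power values are invented.

import math

def brdf(p_scattered_w, p_incident_w, solid_angle_sr, scatter_angle_deg):
    """Bidirectional reflectance distribution function, in 1/sr."""
    return p_scattered_w / (
        p_incident_w * solid_angle_sr * math.cos(math.radians(scatter_angle_deg))
    )

# A mirror-like sample near specular vs. a black surface far from specular:
near_specular = brdf(1e-3, 1.0, 1e-4, 5.0)    # ~10 sr^-1
far_diffuse   = brdf(1e-12, 1.0, 1e-4, 60.0)  # ~2e-8 sr^-1
print(near_specular / far_diffuse > 1e8)      # spans many orders of magnitude
```

BTDF is defined the same way for transmitted rather than reflected power.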

  1. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-01-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMAT-auto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMAT-man). For all treatment plans, planning target volume (PTV) coverage and sparing of organs at risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V95% > 99%). For VMAT-auto and VMAT-man plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic.
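The coverage metric quoted in the abstract, V95% > 99%, is the fraction of planning-target-volume voxels receiving at least 95% of the prescribed dose. It can be computed directly from per-voxel dose values; the dose numbers below are invented for illustration.

```python
# Sketch of the PTV coverage metric V95%: percentage of PTV voxels receiving
# at least 95% of the prescription dose. Dose values are invented.

def v95(ptv_doses_gy, prescribed_gy):
    """Percentage of PTV voxels receiving >= 95% of the prescription."""
    threshold = 0.95 * prescribed_gy
    covered = sum(1 for d in ptv_doses_gy if d >= threshold)
    return 100.0 * covered / len(ptv_doses_gy)

# 1000 PTV voxels, 78 Gy prescription: 996 adequately dosed, 4 cold voxels.
doses = [78.0] * 996 + [70.0] * 4
print(v95(doses, prescribed_gy=78.0))  # 99.6 -> meets the > 99% criterion
```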

  2. Designing a fully automated multi-bioreactor plant for fast DoE optimization of pharmaceutical protein production.

    Science.gov (United States)

    Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner

    2013-06-01

    The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win and the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, a commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. This approach allowed a doubling of intact protein secretion productivity through the DoE optimization procedure, compared to initial cultivation results. In a next step, robustness in terms of sensitivity to process parameter variability was proven around the determined optimum. Thereby, a significantly improved pharmaceutical production process was established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands set out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
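The Design-of-Experiments idea above, systematically varying cultivation parameters across parallel bioreactor runs, can be illustrated with a full-factorial design generator. The factor names and levels below are invented; the study used the commercial MODDE/BioPAT tooling, not this toy generator.

```python
# Illustrative full-factorial DoE generator: enumerate every combination of
# cultivation-parameter levels, one run per combination. Factors are invented.

from itertools import product

def full_factorial(factors):
    """factors: {name: [levels]} -> list of run dicts, one per combination."""
    names = list(factors)
    return [dict(zip(names, combo))
            for combo in product(*(factors[n] for n in names))]

runs = full_factorial({
    "temperature_C": [28, 30],
    "pH":            [5.0, 5.5, 6.0],
    "methanol_gL":   [1.0, 2.0],
})
print(len(runs))   # 2 * 3 * 2 = 12 runs
print(runs[0])
```

Fractional or response-surface designs (as produced by MODDE) reduce this run count while still estimating the main effects.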

  3. Comparison of semi-automated center-dot and fully automated endothelial cell analyses from specular microscopy images.

    Science.gov (United States)

    Maruoka, Sachiko; Nakakura, Shunsuke; Matsuo, Naoko; Yoshitomi, Kayo; Katakami, Chikako; Tabuchi, Hitoshi; Chikama, Taiichiro; Kiuchi, Yoshiaki

    2017-10-30

    To evaluate two specular microscopy analysis methods across different endothelial cell densities (ECDs). Endothelial images of one eye from each of 45 patients were taken by using three different specular microscopes (three replicates each). To determine the consistency of the center-dot method, we compared SP-6000 and SP-2000P images. CME-530 and SP-6000 images were compared to assess the consistency of the fully automated method. The SP-6000 images from the two methods were compared. Intraclass correlation coefficients (ICCs) for the three measurements were calculated, and parametric multiple comparison tests and Bland-Altman analysis were performed. The mean ECD value was 2425 ± 883 (range 516-3707) cells/mm². ICC values were >0.9 for all three microscopes for ECD, but the coefficients of variation (CVs) were 0.3-0.6. For ECD measurements, Bland-Altman analysis revealed that the mean difference was 42 cells/mm² between the SP-2000P and SP-6000 for the center-dot method; 57 cells/mm² between the SP-6000 measurements from both methods; and -5 cells/mm² between the SP-6000 and CME-530 for the fully automated method (95% limits of agreement: -201 to 284 cells/mm², -410 to 522 cells/mm², and -327 to 318 cells/mm², respectively). For CV measurements, the mean differences were -3%, -12%, and 13% (95% limits of agreement: -18 to 11%, -26 to 2%, and -5 to 32%, respectively). Despite using three replicate measurements, the precision of the center-dot method with the SP-2000P and SP-6000 software was only ±10% for ECD data and was even worse for the fully automated method. Japan Clinical Trials Register (http://www.umin.ac.jp/ctr/index/htm9) number UMIN 000015236.
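The Bland-Altman statistics reported above follow a simple recipe: for paired readings from two instruments, compute the mean difference (bias) and the 95% limits of agreement, bias ± 1.96 × SD of the differences. The paired ECD values below are invented for illustration, not data from the study.

```python
# Sketch of Bland-Altman agreement analysis for paired endothelial cell
# density readings from two specular microscopes. Data are invented.

import statistics

def bland_altman(method_1, method_2):
    """Return (bias, lower limit, upper limit) for paired measurements."""
    diffs = [a - b for a, b in zip(method_1, method_2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)          # sample SD of the differences
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

sp2000p = [2500, 2410, 2620, 2480, 2550]  # cells/mm^2 (illustrative)
sp6000  = [2460, 2400, 2570, 2450, 2500]
bias, lo, hi = bland_altman(sp2000p, sp6000)
print(round(bias), round(lo), round(hi))
```

Wide limits of agreement relative to the clinical decision range, as in the abstract's ECD results, indicate poor interchangeability even when ICCs are high.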

  4. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  5. Fully automated MR liver volumetry using watershed segmentation coupled with active contouring.

    Science.gov (United States)

    Huynh, Hieu Trung; Le-Trong, Ngoc; Bao, Pham The; Oto, Aytek; Suzuki, Kenji

    2017-02-01

    Our purpose was to develop a fully automated scheme for liver volume measurement in abdominal MR images, without requiring any user input or interaction. The proposed scheme is fully automatic for liver volumetry from 3D abdominal MR images and consists of three main stages: preprocessing, rough liver shape generation, and liver extraction. The preprocessing stage reduces noise and enhances the liver boundaries in the 3D abdominal MR images. The rough liver shape is revealed fully automatically by using watershed segmentation, a thresholding transform, morphological operations, and statistical properties of the liver. An active contour model is then applied to refine the rough liver shape and precisely obtain the liver boundaries. The liver volumes calculated by the proposed scheme were compared to "gold standard" references estimated by an expert abdominal radiologist. In an evaluation with 27 cases from multiple medical centers, the liver volumes computed by our scheme agreed excellently (intraclass correlation coefficient: 0.94) with the "gold standard" manual volumes determined by the radiologist. The running time was 8.4 min per case on average. We developed a fully automated liver volumetry scheme for MR that does not require any user interaction. It was evaluated with cases from multiple medical centers, and its volumetry performance was comparable to that of gold standard manual volumetry, while saving radiologists 24.7 min of manual volumetry time per case.
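The final volumetry step implied above is straightforward once the segmentation is done: the liver volume is the number of labelled voxels times the voxel volume taken from the MR acquisition spacing. The voxel count and spacing below are invented for illustration.

```python
# Sketch of volume computation from a segmentation result: voxel count times
# per-voxel volume from the (x, y, z) spacing. Numbers are illustrative.

def liver_volume_ml(segmented_voxels, spacing_mm):
    """Volume in mL from a count of liver voxels and (x, y, z) spacing in mm."""
    voxel_mm3 = spacing_mm[0] * spacing_mm[1] * spacing_mm[2]
    return segmented_voxels * voxel_mm3 / 1000.0   # 1 mL = 1000 mm^3

# e.g. 1.5 x 1.5 x 3.0 mm voxels, ~222,222 voxels labelled liver:
print(round(liver_volume_ml(222_222, (1.5, 1.5, 3.0))))  # ~1500 mL
```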

  6. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    Science.gov (United States)

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24h) are comparable to traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches with no experimenter involvement required during either training or testing phases. It provides quality control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of ∼$500US, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.
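Learning and memory indices of this kind are conventionally computed from the distribution of flies between the two T-maze arms after training. The sketch below shows one common formulation of such a performance index; the function name and exact definition are illustrative assumptions, not taken from this system's control software:

```python
def performance_index(n_avoiding_cs_plus, n_choosing_cs_plus):
    """Performance index in [-1, 1]: +1 means every fly avoided the
    shock-paired odour (CS+); 0 means no learned preference."""
    total = n_avoiding_cs_plus + n_choosing_cs_plus
    if total == 0:
        raise ValueError("no flies counted")
    return (n_avoiding_cs_plus - n_choosing_cs_plus) / total
```

An automated apparatus would derive the two counts from its fly-transfer and detection hardware and log the index per training trial.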

  7. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so recoveries decline over time; derivatisation is therefore carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, derivatised samples did not wait on the autosamplers, reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  8. Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery.

    Science.gov (United States)

    Payre, William; Cestac, Julien; Delhomme, Patricia

    2016-03-01

    An experiment was performed in a driving simulator to investigate the impacts of practice, trust, and interaction on manual control recovery (MCR) when employing fully automated driving (FAD). To promote efficient use of partially or highly automated driving and to improve safety, some studies have addressed trust in driving automation and training, but few have focused on FAD. FAD is an autonomous system that has full control of a vehicle without any need for intervention by the driver. A total of 69 drivers with a valid license practiced with FAD, distributed evenly across two conditions: simple practice and elaborate practice. When examining emergency MCR, a correlation was found between trust and reaction time in the simple practice group (i.e., higher trust meant a longer reaction time), but not in the elaborate practice group. This result indicates that more elaborate practice may be needed to mitigate the negative impact of overtrust on reaction time. Drivers should be trained in how the automated device works so as to improve MCR performance in case of an emergency. The practice format used in this study could be used for the first interaction with an FAD car when acquiring such a vehicle. © 2015, Human Factors and Ergonomics Society.

  9. Performance of a fully automated program for measurement of left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Douglass, K.H.; Tibbits, P.; Kasecamp, W.; Han, S.T.; Koller, D.; Links, J.M.; Wagner, H.H. Jr.

    1982-01-01

    A fully automated program developed by us for measurement of left ventricular ejection fraction from equilibrium gated blood pool studies was evaluated in 130 additional patients. Both 6-min (130 studies) and 2-min (142 studies in 31 patients) gated blood pool studies were acquired and processed. The program successfully generated ejection fractions in 86% of the studies. These automatically generated ejection fractions were compared with ejection fractions derived from manually drawn regions of interest. When studies were acquired for 6 min with the patient at rest, the correlation between automated and manual ejection fractions was 0.92. When studies were acquired for 2 min, both at rest and during bicycle exercise, the correlation was 0.81. In 25 studies from patients who also underwent contrast ventriculography, the program successfully generated regions of interest in 22 (88%). The correlation between the ejection fraction determined by contrast ventriculography and the automatically generated radionuclide ejection fraction was 0.79. (orig.)
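Count-based ejection fractions from equilibrium gated blood-pool studies are conventionally derived from background-corrected end-diastolic (ED) and end-systolic (ES) counts within the left-ventricular region of interest. The sketch below shows that standard calculation; the function name and example numbers are illustrative, not taken from this program:

```python
def ejection_fraction(ed_counts, es_counts, bg_counts):
    """Count-based LVEF from a gated blood-pool study.

    Background counts are subtracted from both frames, so
    EF = ((ED - BG) - (ES - BG)) / (ED - BG) = (ED - ES) / (ED - BG).
    """
    return (ed_counts - es_counts) / (ed_counts - bg_counts)
```

An automated program supplies the three count values from its generated regions of interest; the manual comparison differs only in how those regions are drawn.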

  10. Fully automated gamma spectrometry gauge observing possible radioactive contamination of melting-shop samples

    International Nuclear Information System (INIS)

    Kroos, J.; Westkaemper, G.; Stein, J.

    1999-01-01

    At Salzgitter AG, several monitoring systems have been installed to check scrap transported by rail and by road. At the moment, scrap arriving by ship is reloaded onto wagons and monitored afterwards. In the future, a detection system will be mounted onto a crane for a direct check on scrap upon the ship's departure. Furthermore, at the Salzgitter AG Central Chemical Laboratory, a fully automated gamma spectrometry gauge has been installed in order to detect possible radioactive contamination of the products. The gamma spectrometer is integrated into the automated OE spectrometry line and tests melting-shop samples after the OE spectrometry has been performed. With this technique the specific activity of selected nuclides and the dose rate are determined. The activity check is part of the release procedure, and the corresponding measurement data are stored in a database for quality management reasons. (author)

  11. Manual or automated measuring of antipsychotics' chemical oxygen demand.

    Science.gov (United States)

    Pereira, Sarah A P; Costa, Susana P F; Cunha, Edite; Passos, Marieta L C; Araújo, André R S T; Saraiva, M Lúcia M F S

    2018-05-15

    Antipsychotic (AP) drugs are accumulating in terrestrial and aqueous resources owing to their current consumption. Thus, the search for methods to assess the contamination load of these drugs is mandatory. Chemical oxygen demand (COD) is a key parameter used for monitoring water quality by assessing the effect of polluting agents on the oxygen level. The present work therefore assesses the COD levels of several typical and atypical antipsychotic drugs in order to obtain structure-activity relationships. The titrimetric method was implemented with potassium dichromate as oxidant and a 2 h digestion step, followed by titration of the remaining unreduced dichromate. An automated sequential injection analysis (SIA) method was then used to overcome some drawbacks of the titrimetric method. The results showed a relationship between the chemical structures of antipsychotic drugs and their COD values, where the presence of aromatic rings and oxidizable groups gives higher COD values. Good agreement was obtained between the results of the reference batch procedure and the SIA system, and the APs clustered into two groups, with ratios between the methodologies of 2 or 4 for lower or higher COD values, respectively. The SIA methodology can operate as a screening method at any stage of a synthetic process, and it is also more environmentally friendly and cost-effective. Moreover, the studies presented open promising perspectives for improving the effectiveness of pharmaceutical removal from waste effluents by assessing COD values. Copyright © 2018 Elsevier Inc. All rights reserved.
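The titrimetric determination described above back-titrates the dichromate remaining after digestion, typically with ferrous ammonium sulfate (FAS); COD then follows from the difference between the blank and sample titrant volumes. A minimal sketch of the standard closed-reflux calculation (the function name and example values are assumptions, not taken from this paper):

```python
def cod_mg_per_l(blank_titrant_ml, sample_titrant_ml,
                 fas_molarity, sample_volume_ml):
    """Closed-reflux titrimetric COD in mg O2/L.

    The dichromate consumed by the sample equals the difference between
    the blank and sample FAS titres; the factor 8000 is the equivalent
    weight of oxygen (8 g/eq) times 1000 mL/L.
    """
    return ((blank_titrant_ml - sample_titrant_ml)
            * fas_molarity * 8000.0 / sample_volume_ml)
```

A larger titre difference (more dichromate consumed) means a higher COD, which is why oxidisable functional groups in the AP structures drive the values up.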

  12. TreeRipper web application: towards a fully automated optical tree recognition software

    Directory of Open Access Journals (Sweden)

    Hughes Joseph

    2011-05-01

    Background Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners, and detect line edges. The edge contour is then traced to determine the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Conclusions Despite the diversity of ways phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitisation of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.

  13. Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA.

    Science.gov (United States)

    Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald

    2017-01-01

    To study fully automated digital joint space width (JSW) and bone mineral density (BMD) measurements in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the SWEdish FarmacOTherapy study. Fully automated JSW measurements of bilateral metacarpals 2, 3 and 4 were compared with the joint space narrowing (JSN) score in SHS. Multilevel mixed model statistics were applied to calculate the significance of the association between ΔJSW and ΔBMD over 1 year, and the JSW differences between damaged and undamaged joints as evaluated by the JSN. Based on 576 joints of 96 patients with eRA, a significant reduction from baseline to 1 year was observed in the JSW, from 1.69 (±0.19) mm to 1.66 (±0.19) mm. JSW differed between undamaged (JSN = 0) and damaged (JSN > 0) joints: 1.68 mm (95% CI 1.70 to 1.67) vs 1.54 mm (95% CI 1.63 to 1.46). Similarly, the unadjusted multilevel model showed significant differences in JSW between undamaged (1.68 mm (95% CI 1.72 to 1.64)) and damaged joints (1.63 mm (95% CI 1.68 to 1.58)) (p=0.0048). This difference remained significant in the adjusted model: 1.66 mm (95% CI 1.70 to 1.61) vs 1.62 mm (95% CI 1.68 to 1.56) (p=0.042). Measuring the JSW with this fully automated digital tool may be useful as a quick and observer-independent application for evaluating cartilage damage in eRA. NCT00764725.

  14. Development of a fully automated software system for rapid analysis/processing of the falling weight deflectometer data.

    Science.gov (United States)

    2009-02-01

    The Office of Special Investigations at Iowa Department of Transportation (DOT) collects FWD data on regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully-automated software system for ra...

  15. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    Science.gov (United States)

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so recoveries decline over time; derivatisation is therefore carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, derivatised samples did not wait on the autosamplers, reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  16. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy

    OpenAIRE

    Sharfo, Abdul Wahab M.; Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.

    2016-01-01

    Purpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-be...

  17. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. The cardiac function could be evaluated by global and regional parameters of left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of LV in short axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at end-diastolic phase, and LV segmentation propagation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of LV of each slice at ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, with the advantages of the continuity of the boundaries of LV across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of LV based on subjective evaluation.
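Dynamic programming delineates a boundary as the minimum-cost path through a cost matrix (e.g., gradient-derived external costs), with a smoothness constraint limiting how far the boundary may move between adjacent rows. The single-boundary sketch below is my own illustration; the authors' dual formulation couples two such paths (endocardium and epicardium), which this simplified version does not attempt:

```python
def dp_boundary(cost, max_step=1):
    """Minimum-cost boundary through a cost matrix: one column index per
    row, with adjacent rows' indices differing by at most max_step."""
    n, m = len(cost), len(cost[0])
    inf = float("inf")
    acc = [list(cost[0])]          # accumulated minimal cost per position
    back = []                      # backpointers for path recovery
    for i in range(1, n):
        row, brow = [], []
        for j in range(m):
            best, arg = inf, -1
            # only predecessors within max_step are reachable (smoothness)
            for k in range(max(0, j - max_step), min(m, j + max_step + 1)):
                if acc[i - 1][k] < best:
                    best, arg = acc[i - 1][k], k
            row.append(best + cost[i][j])
            brow.append(arg)
        acc.append(row)
        back.append(brow)
    # backtrack from the cheapest final position
    j = min(range(m), key=lambda c: acc[-1][c])
    path = [j]
    for i in range(n - 2, -1, -1):
        j = back[i][j]
        path.append(j)
    return path[::-1]
```

In a cardiac application the rows would correspond to angular positions around the LV centre and the columns to radial candidates, with costs derived from image gradients.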

  18. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust for the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with a slow intrinsic HDX rate is employed as another internal standard to reflect possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model that simulates the deuterium labeling and back exchange process. The HDX model is implemented in the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set, from ion detection and peptide identification to the final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most probable protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody induced by increasing concentrations of guanidine.
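Back-exchange correction of the kind described is commonly expressed against undeuterated (0%) and fully deuterated (100%) control measurements of the same peptide. A minimal sketch of that standard centroid-based correction, not the model-based approach of this platform (the function name is an assumption):

```python
def deuterium_uptake(m_obs, m_undeut, m_fully_deut, n_amides):
    """Back-exchange-corrected deuterium uptake from peptide centroid
    masses: the observed mass shift is scaled between the 0% and 100%
    deuteration controls and multiplied by the number of exchangeable
    backbone amides."""
    return n_amides * (m_obs - m_undeut) / (m_fully_deut - m_undeut)
```

The platform in this article goes further by modelling the labeling and back-exchange kinetics explicitly, but the controls play the same normalising role.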

  19. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes compatible with both PAM and SPACE are being developed as part of the TPRP.

  20. A new fully automated FTIR system for total column measurements of greenhouse gases

    Directory of Open Access Journals (Sweden)

    M. C. Geibel

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON. It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics.

    Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control.

    First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months.

    After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  1. A new fully automated FTIR system for total column measurements of greenhouse gases

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  2. UBO Detector - A cluster-based, fully automated pipeline for extracting white matter hyperintensities.

    Science.gov (United States)

    Jiang, Jiyang; Liu, Tao; Zhu, Wanlin; Koncz, Rebecca; Liu, Hao; Lee, Teresa; Sachdev, Perminder S; Wen, Wei

    2018-07-01

    We present 'UBO Detector', a cluster-based, fully automated pipeline for extracting and calculating variables for regions of white matter hyperintensities (WMH) (available for download at https://cheba.unsw.edu.au/group/neuroimaging-pipeline). It takes T1-weighted and fluid attenuated inversion recovery (FLAIR) scans as input, and SPM12 and FSL functions are utilised for pre-processing. The candidate clusters are then generated by FMRIB's Automated Segmentation Tool (FAST). A supervised machine learning algorithm, k-nearest neighbor (k-NN), is applied to determine whether the candidate clusters are WMH or non-WMH. UBO Detector generates both image and text (volumes and the number of WMH clusters) outputs for whole brain, periventricular, deep, and lobar WMH, as well as WMH in arterial territories. The computation time for each brain is approximately 15 min. We validated the performance of UBO Detector by showing a) high segmentation (similarity index (SI) = 0.848) and volumetric (intraclass correlation coefficient (ICC) = 0.985) agreement between the UBO Detector-derived and manually traced WMH; b) highly correlated (r² > 0.9) WMH volumes with a steady increase over time; and c) significant associations of periventricular (t = 22.591, p < 0.001) and deep (t = 14.523, p < 0.001) WMH volumes generated by UBO Detector with Fazekas rating scores. With parallel computing enabled in UBO Detector, the processing can take advantage of the multi-core CPUs that are commonly available on workstations. In conclusion, UBO Detector is a reliable, efficient and fully automated WMH segmentation pipeline. Copyright © 2018 Elsevier Inc. All rights reserved.
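The k-NN step classifies each candidate cluster from its feature vector by majority vote among the nearest labelled training clusters. A minimal, dependency-free sketch of that decision rule (the feature values and labels below are invented for illustration; the pipeline's actual features and training data differ):

```python
from collections import Counter
import math

def knn_classify(train_features, train_labels, query, k=3):
    """Label the query cluster by majority vote of its k nearest
    training clusters (Euclidean distance in feature space)."""
    nearest = sorted(range(len(train_features)),
                     key=lambda i: math.dist(train_features[i], query))[:k]
    votes = Counter(train_labels[i] for i in nearest)
    return votes.most_common(1)[0][0]
```

In practice one would use an optimised implementation such as scikit-learn's KNeighborsClassifier, but the decision rule is the same.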

  3. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)

    2013-02-15

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparing with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and with the measured ex-vivo liver volume (converted from weight), using analysis of variance and Pearson's or Spearman correlation tests. Processing time for both automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between automated volume and manual volume of the total liver (p = 0.011). There were good correlations between automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver). Both correlation coefficients were higher than those with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  4. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    International Nuclear Information System (INIS)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn

    2013-01-01

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparing with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and with the measured ex-vivo liver volume (converted from weight), using analysis of variance and Pearson's or Spearman correlation tests. Processing time for both automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between automated volume and manual volume of the total liver (p = 0.011). There were good correlations between automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver). Both correlation coefficients were higher than those with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  5. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  6. Fully automated chest wall line segmentation in breast MRI by using context information

    Science.gov (United States)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina

    2012-03-01

    Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, making them impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes the context information of breast MR imaging and the morphological characteristics of breast tissue to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method is validated on a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlay percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).
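A minimal sketch of the DTW filtering step described above: a candidate CWL profile is kept or discarded according to its warping distance from the representative profile. The profiles and the keep/discard rule here are illustrative, not the authors' actual implementation:

```python
import numpy as np

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D profiles."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of the three admissible warping moves
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# representative CWL profile vs. two candidates (values are invented)
representative = np.array([10.0, 11.0, 12.5, 13.0, 12.0])
candidates = [np.array([10.2, 11.1, 12.4, 13.1, 12.1]),   # plausible CWL
              np.array([15.0, 16.0, 18.0, 19.0, 18.5])]   # inferior candidate
best = min(candidates, key=lambda c: dtw_distance(representative, c))
```

The candidate with the smallest warping distance to the representative survives the vote; in the paper this selection operates on CWL candidates pooled across sequential slices.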

  7. Microscope image based fully automated stomata detection and pore measurement method for grapevines

    Directory of Open Access Journals (Sweden)

    Hiranya Jayakody

    2017-11-01

    Background: Stomatal behavior in grapevines has been identified as a good indicator of the water stress level and overall health of the plant. Microscope images are often used to analyze stomatal behavior in plants. However, most of the current approaches involve manual measurement of stomatal features. The main aim of this research is to develop a fully automated stomata detection and pore measurement method for grapevines, taking microscope images as the input. The proposed approach, which employs machine learning and image processing techniques, can outperform available manual and semi-automatic methods used to identify and estimate stomatal morphological features. Results: First, a cascade object detection learning algorithm is developed to correctly identify multiple stomata in a large microscopic image. Once the regions of interest which contain stomata are identified and extracted, a combination of image processing techniques is applied to estimate the pore dimensions of the stomata. The stomata detection approach was compared with an existing fully automated template matching technique and a semi-automatic maximum stable extremal regions approach, with the proposed method clearly surpassing the performance of the existing techniques with a precision of 91.68% and an F1-score of 0.85. Next, the morphological features of the detected stomata were measured. Contrary to existing approaches, the proposed image segmentation and skeletonization method allows us to estimate the pore dimensions even in cases where the stomatal pore boundary is only partially visible in the microscope image. A test conducted using 1267 images of stomata showed that the segmentation and skeletonization approach was able to correctly identify the stoma opening 86.27% of the time. Further comparisons made with manually traced stoma openings indicated that the proposed method is able to estimate stomata morphological features with accuracies of 89.03% for area
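The reported precision (91.68%) and F1-score (0.85) jointly determine the detector's implied recall, since F1 is the harmonic mean of precision and recall. A small sketch of that relationship:

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2.0 * precision * recall / (precision + recall)

def recall_from_f1(precision, f1):
    """Invert F1 = 2PR/(P+R) for recall, given precision."""
    return f1 * precision / (2.0 * precision - f1)

precision, f1 = 0.9168, 0.85
recall = recall_from_f1(precision, f1)   # implied recall, roughly 0.79
```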

  8. Development of a Fully-Automated Monte Carlo Burnup Code Monteburns

    International Nuclear Information System (INIS)

    Poston, D.I.; Trellue, H.R.

    1999-01-01

    Several computer codes have been developed to perform nuclear burnup calculations over the past few decades. In addition, because of advances in computer technology, it recently has become more desirable to use Monte Carlo techniques for such problems. Monte Carlo techniques generally offer two distinct advantages over discrete ordinates methods: (1) the use of continuous-energy cross sections and (2) the ability to model detailed, complex, three-dimensional (3-D) geometries. These advantages allow more accurate burnup results to be obtained, provided that the user possesses the required computing power (which is required for discrete ordinates methods as well). Several linkage codes have been written that combine a Monte Carlo N-particle transport code (such as MCNP™) with a radioactive decay and burnup code. This paper describes one such code that was written at Los Alamos National Laboratory: monteburns. Monteburns links MCNP with the isotope generation and depletion code ORIGEN2. The basis for the development of monteburns was the need for a fully automated code that could perform accurate burnup (and other) calculations for any 3-D system (accelerator-driven or a full reactor core). Before the initial development of monteburns, a list of desired attributes was made and is given below.
    - The code should be fully automated (that is, after the input is set up, no further user interaction is required).
    - The code should allow for the irradiation of several materials concurrently (each material is evaluated collectively in MCNP and burned separately in ORIGEN2).
    - The code should allow the transfer of materials (shuffling) between regions in MCNP.
    - The code should allow any materials to be added or removed before, during, or after each step in an automated fashion.
    - The code should not require the user to provide input for ORIGEN2 and should have minimal MCNP input file requirements (other than a working MCNP deck).
    - The code should be relatively easy to use
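The transport/depletion alternation that monteburns automates can be illustrated with a toy single-nuclide burn step; the nuclide data and flux below are invented for illustration, and the MCNP/ORIGEN2 calls are reduced to an analytic update:

```python
import math

def deplete(n0, lam, sigma, phi, dt):
    """One analytic burn step for a single nuclide: decay plus absorption."""
    return n0 * math.exp(-(lam + sigma * phi) * dt)

n = 1.0e24          # atoms of an illustrative stable absorber
lam = 0.0           # decay constant (1/s); stable here
sigma = 100.0e-24   # one-group absorption cross section, 100 barns in cm^2
phi = 1.0e14        # neutron flux (n/cm^2/s), held constant for the sketch
for step in range(3):
    # in monteburns, an MCNP run would update phi and the spectrum here,
    # and ORIGEN2 would perform the actual multi-nuclide depletion
    n = deplete(n, lam, sigma, phi, dt=86400.0)   # one-day steps
```

The real code iterates this transport-then-burn loop automatically for many materials and steps, with material shuffling between regions handled without user intervention.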

  9. A predictive control scheme for automated demand response mechanisms

    NARCIS (Netherlands)

    Lampropoulos, I.; Bosch, van den P.P.J.; Kling, W.L.

    2012-01-01

    The development of demand response mechanisms can provide a considerable option for the integration of renewable energy sources and the establishment of efficient generation and delivery of electrical power. The full potential of demand response can be significant, but its exploration still remains

  10. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  11. Integrated Platform for Automated Sustainable Demand Response in Smart Grids

    Energy Technology Data Exchange (ETDEWEB)

    Zois, Vassilis [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Computer Science; Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering

    2014-10-08

    Demand Response (DR) is a common practice used by utility providers to regulate energy demand. It is used at periods of high demand to minimize the peak-to-average consumption ratio. Several methods have been proposed for evaluating DR events using information about the baseline consumption and the consumption during DR. Our goal is to provide a sustainable reduction to ensure the elimination of peaks in demand. The proposed system includes an adaptation mechanism for cases in which the provided solution does not meet the DR requirements. We conducted a series of experiments using consumption data from a real-life microgrid to evaluate the efficiency as well as the robustness of our solution.
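The evaluation idea above (reduction measured against a baseline, with an adaptation trigger when the target is missed) can be sketched minimally; all numbers and the target are illustrative:

```python
# forecast baseline vs. metered consumption during the DR event (kW);
# the numbers and the 20 kW reduction target are invented
baseline = [100.0, 105.0, 110.0, 108.0]
during_dr = [80.0, 82.0, 95.0, 84.0]
target_kw = 20.0

reductions = [b - d for b, d in zip(baseline, during_dr)]
shortfall = [i for i, r in enumerate(reductions) if r < target_kw]
# intervals listed in `shortfall` (here interval 2: 110 - 95 = 15 kW)
# are the ones that would trigger the adaptation mechanism
```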

  12. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

    In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques, which use a 2D camera for the manual selection of sample points, we use a 3D Time-of-Flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.

  13. Self-consistent hybrid functionals for solids: a fully-automated implementation

    Science.gov (United States)

    Erba, A.

    2017-08-01

    A fully automated algorithm for the determination of the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated in proportion to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89, 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
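The self-consistent update of the exchange fraction can be sketched as a fixed-point iteration alpha ← 1/eps_inf(alpha); the dielectric model below is a made-up placeholder for the CPHF/KS result computed at each iteration in Crystal:

```python
def eps_inf(alpha):
    """Hypothetical monotonic model of the high-frequency dielectric
    constant as a function of the exchange fraction alpha; in Crystal
    this value comes from a CPHF/KS calculation at each iteration."""
    return 2.0 + 4.0 / (1.0 + 2.0 * alpha)

alpha = 0.25                           # initial guess for the exchange fraction
for _ in range(50):
    new_alpha = 1.0 / eps_inf(alpha)   # self-consistency condition
    if abs(new_alpha - alpha) < 1e-10:
        break
    alpha = new_alpha
# at convergence, alpha * eps_inf(alpha) == 1 by construction
```

Reusing the previous iteration's density matrices as SCF/CPHF guesses, as the paper describes, accelerates exactly this kind of fixed-point loop.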

  14. Fully automated laser ray tracing system to measure changes in the crystalline lens GRIN profile.

    Science.gov (United States)

    Qiu, Chen; Maceo Heilman, Bianca; Kaipio, Jari; Donaldson, Paul; Vaghefi, Ehsan

    2017-11-01

    Measuring the lens gradient refractive index (GRIN) accurately and reliably has proven an extremely challenging technical problem. A fully automated laser ray tracing (LRT) system was built to address this issue. The LRT system captures images of multiple laser projections before and after traversing through an ex vivo lens. These LRT images, combined with accurate measurements of the lens geometry, are used to calculate the lens GRIN profile. Mathematically, this is an ill-conditioned problem; hence, it is essential to apply biologically relevant constraints to produce a feasible solution. The lens GRIN measurements were compared with previously published data. Our GRIN retrieval algorithm produces fast and accurate measurements of the lens GRIN profile. Experiments to study the optics of physiologically perturbed lenses are the future direction of this research.
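Because the GRIN retrieval is ill-conditioned, some form of regularization or constraint is needed to produce a feasible solution. A generic illustration with Tikhonov (ridge) regularization on a synthetic near-singular system, not the LRT system's actual forward model:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.random((20, 10))
A[:, 9] = A[:, 8] + 1e-8 * rng.random(20)   # nearly dependent columns -> ill-conditioned
x_true = np.ones(10)
b = A @ x_true                              # synthetic "measurements"

lam = 1e-6                                  # regularization strength (illustrative)
# minimize ||A x - b||^2 + lam ||x||^2 via the regularized normal equations
x_reg = np.linalg.solve(A.T @ A + lam * np.eye(10), A.T @ b)
residual = np.linalg.norm(A @ x_reg - b)    # small despite the ill-conditioning
```

Biologically motivated constraints (as in the paper) play the same role as the penalty term here: they select one stable solution among the many that fit the ray data almost equally well.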

  15. Fully automated left ventricular contour detection for gated radionuclide angiography, (1)

    International Nuclear Information System (INIS)

    Hosoba, Minoru; Wani, Hidenobu; Hiroe, Michiaki; Kusakabe, Kiyoko.

    1984-01-01

    A fully automated practical method has been developed to detect the left ventricular (LV) contour from gated pool images. Ejection fraction and volume curve can be computed accurately without operator variance. The characteristics of the method are summarized as follows: 1. An optimal design of the filter, which works in the Fourier domain, can be achieved to improve the signal-to-noise ratio. 2. A new algorithm that uses the cosine and sine transform images has been developed for separating the ventricle from the atrium and defining the center of the LV. 3. Contrast enhancement by an optimized square filter. 4. Radial profiles are generated from the center of the LV and smoothed by a fourth-order Fourier series approximation. The crossing point with a local threshold value, searched from the center of the LV, is defined as the edge. 5. The LV contour is obtained by connecting all the edge points defined on the radial profiles by fitting them to a Fourier function. (author)
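Step 4 above (fourth-order Fourier series smoothing of a radial profile) can be sketched by truncating the profile's Fourier spectrum; the exact fitting procedure in the paper may differ in detail:

```python
import numpy as np

def fourier_smooth(profile, order=4):
    """Keep only harmonics 0..order of a closed (periodic) radial profile."""
    coeffs = np.fft.rfft(profile)
    coeffs[order + 1:] = 0.0          # truncate the series at `order`
    return np.fft.irfft(coeffs, n=len(profile))

theta = np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False)
rng = np.random.default_rng(0)
noisy = 10.0 + np.cos(theta) + 0.3 * rng.normal(size=theta.size)
smooth = fourier_smooth(noisy, order=4)   # low-order approximation of the edge radius
```

A low-order series is a natural smoother for a closed cardiac contour: it preserves the slowly varying radius while discarding pixel-level noise.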

  16. Clinical validation of fully automated computation of ejection fraction from gated equilibrium blood-pool scintigrams

    International Nuclear Information System (INIS)

    Reiber, J.H.C.; Lie, S.P.; Simoons, M.L.; Hoek, C.; Gerbrands, J.J.; Wijns, W.; Bakker, W.H.; Kooij, P.P.M.

    1983-01-01

    A fully automated procedure for the computation of left-ventricular ejection fraction (EF) from cardiac-gated Tc-99m blood-pool (GBP) scintigrams with fixed, dual, and variable ROI methods is described. By comparison with EF data from contrast ventriculography in 68 patients, the dual-ROI method (separate end-diastolic and end-systolic contours) was found to be the method of choice; processing time was 2 min. The success score of the dual-ROI procedure was 92%, as assessed from 100 GBP studies. Overall reproducibility of data acquisition and analysis was determined in 12 patients. The mean value and standard deviation of differences between repeat studies (average time interval 27 min) were 0.8% and 4.3% EF units, respectively (r = 0.98). The authors conclude that left-ventricular EF can be computed automatically from GBP scintigrams with minimal operator interaction and good reproducibility; EFs are similar to those from contrast ventriculography.
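Count-based EF from gated blood-pool ROIs follows the standard background-corrected formula EF = (ED − ES) / (ED − BG); the counts below are illustrative, not patient data:

```python
def ejection_fraction(ed_counts, es_counts, bg_counts):
    """Count-based LVEF: (end-diastolic - end-systolic) counts over
    background-corrected end-diastolic counts."""
    return (ed_counts - es_counts) / (ed_counts - bg_counts)

# illustrative ROI counts, not patient data
ef = ejection_fraction(ed_counts=12000, es_counts=7500, bg_counts=3000)
# (12000 - 7500) / (12000 - 3000) = 0.50, i.e. an EF of 50%
```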

  17. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    Treiber, Alexander; Ferrini, Daniele; Huebel, Hannes; Zeilinger, Anton; Poppe, Andreas; Loruenser, Thomas; Querasser, Edwin; Matyus, Thomas; Hentschel, Michael

    2009-01-01

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits s⁻¹. The system is compact and portable with a fully automated start-up, and stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved over 2 weeks an automatic key generation with an average key rate of 2000 bits s⁻¹ without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.

  18. Fully automated bone mineral density assessment from low-dose chest CT

    Science.gov (United States)

    Liu, Shuang; Gonzalez, Jessica; Zulueta, Javier; de-Torres, Juan P.; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    A fully automated system is presented for bone mineral density (BMD) assessment from low-dose chest CT (LDCT). BMD assessment is central to the diagnosis and follow-up therapy monitoring of osteoporosis, which is characterized by low bone density and is estimated to affect 12.3 million people in the US aged 50 years or older, creating tremendous social and economic burdens. BMD assessment from DXA scans (BMDDXA) is currently the most widely used and gold-standard technique for the diagnosis of osteoporosis and bone fracture risk estimation. With the recent large-scale implementation of annual lung cancer screening using LDCT, great potential emerges for concurrent opportunistic osteoporosis screening. In the presented BMDCT assessment system, each vertebral body is first segmented and labeled with its anatomical name. Various 3D regions of interest (ROIs) inside the vertebral body are then explored for BMDCT measurements at different vertebral levels. The system was validated using 76 pairs of DXA and LDCT scans of the same subject. Average BMDDXA of L1-L4 was used as the reference standard. A statistically significant (p < 0.05) correlation is obtained between BMDDXA and BMDCT at all vertebral levels (T1-L2). A Pearson correlation of 0.857 was achieved between BMDDXA and average BMDCT of T9-T11 by using a 3D ROI that takes into account both trabecular and cortical bone tissue. These encouraging results demonstrate the feasibility of fully automated quantitative BMD assessment and the potential of opportunistic osteoporosis screening concurrent with lung cancer screening using LDCT.
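The Pearson correlation used to compare BMDDXA with BMDCT is the standard sample statistic; a self-contained sketch with made-up values (not study data):

```python
import math

def pearson_r(x, y):
    """Sample Pearson correlation coefficient between two series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

bmd_dxa = [0.82, 0.95, 1.10, 0.78, 1.02]   # g/cm^2, made-up DXA values
bmd_ct = [145.0, 170.0, 200.0, 138.0, 185.0]  # made-up CT-based measures
r = pearson_r(bmd_dxa, bmd_ct)
```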

  19. The worldwide NORM production and a fully automated gamma-ray spectrometer for their characterization

    International Nuclear Information System (INIS)

    Xhixha, G.; Callegari, I.; Guastaldi, E.; De Bianchi, S.; Fiorentini, G.; Universita di Ferrara, Ferrara; Istituto Nazionale di Fisica Nucleare; Kaceli Xhixha, M.

    2013-01-01

    Materials containing radionuclides of natural origin that are subject to regulation because of their radioactivity are known as Naturally Occurring Radioactive Material (NORM). Following the International Atomic Energy Agency, we include in NORM those materials whose activity concentration is modified by human-made processes. We present a brief review of the main categories of non-nuclear industries together with the levels of activity concentration in feed raw materials, products and waste, including mechanisms of radioisotope enrichment. The global management of NORM shows a high level of complexity, mainly due to different degrees of radioactivity enhancement and the huge amount of worldwide waste production. The future tendency of guidelines concerning environmental protection will require both systematic monitoring based on ever-increasing sampling and high-performance gamma-ray spectroscopy. On the grounds of these requirements, a new low-background, fully automated, high-resolution gamma-ray spectrometer, MCA_Rad, has been developed. The design of the lead and copper shielding allowed a background reduction of two orders of magnitude with respect to laboratory background radioactivity. A substantial lowering of manpower cost is obtained through a full automation system, which enables up to 24 samples to be measured without any human attendance. Two coupled HPGe detectors increase the detection efficiency, performing accurate measurements on small sample volumes (180 cm³) with a reduction of sample transport costs. Details of the instrument calibration method are presented. The MCA_Rad system can measure in less than one hour a typical NORM sample enriched in U and Th to some hundreds of Bq kg⁻¹, with an overall uncertainty of less than 5%. Quality control of this method has been tested.
Measurements of three certified reference materials RGK-1, RGU-2 and RGTh-1 containing concentrations of potassium, uranium and thorium comparable to NORM have

  20. Development of a phantom to test fully automated breast density software – A work in progress

    International Nuclear Information System (INIS)

    Waade, G.G.; Hofvind, S.; Thompson, J.D.; Highnam, R.; Hogg, P.

    2017-01-01

    Objectives: Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role in stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Methods: Several different configurations of polyvinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw image data of mammographic images were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm³) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values of real breasts. Results: Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Conclusion: Our results are promising, as all phantoms resulted in valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density and that can be compressed to the same level as female breasts. Advances in knowledge: We are the first group to have produced deformable phantoms that are recognized as breasts by Volpara software. - Highlights: • Several phantoms of different configurations were created. • Three methods to assess phantom density were implemented. • All phantoms were identified as breasts by the Volpara software. • Reducing phantom thickness caused a change in phantom density.

  1. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    Science.gov (United States)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
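The three BPE measures defined above can be sketched directly: given per-voxel relative enhancement and the FGT/breast masks, count the enhancing voxels above a threshold. The arrays and threshold here are illustrative:

```python
import numpy as np

def bpe_measures(rel_enh, fgt_mask, breast_mask, threshold=0.1):
    """BPEabs, BPErf and BPErb from a relative-enhancement volume.

    rel_enh: per-voxel (post - pre) / pre enhancement; the masks mark
    fibroglandular tissue and the whole breast. Threshold is one value
    from the 1%-100% range the paper sweeps.
    """
    enhancing = (rel_enh > threshold) & fgt_mask
    bpe_abs = enhancing.sum()              # enhancing FGT voxels (a volume proxy)
    bpe_rf = bpe_abs / fgt_mask.sum()      # relative to FGT volume
    bpe_rb = bpe_abs / breast_mask.sum()   # relative to breast volume
    return bpe_abs, bpe_rf, bpe_rb

rng = np.random.default_rng(1)
breast = np.ones((8, 8, 8), dtype=bool)           # toy whole-breast mask
fgt = rng.random((8, 8, 8)) < 0.4                 # toy FGT mask
enh = rng.random((8, 8, 8)) * 0.3                 # toy relative enhancement
bpe_abs, bpe_rf, bpe_rb = bpe_measures(enh, fgt, breast)
```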

  2. A fully automated cell segmentation and morphometric parameter system for quantifying corneal endothelial cell morphology.

    Science.gov (United States)

    Al-Fahdawi, Shumoos; Qahwaji, Rami; Al-Waisy, Alaa S; Ipson, Stanley; Ferdousi, Maryam; Malik, Rayaz A; Brahma, Arun

    2018-07-01

    Corneal endothelial cell abnormalities may be associated with a number of corneal and systemic diseases. Damage to the endothelial cells can significantly affect corneal transparency by altering hydration of the corneal stroma, which can lead to irreversible endothelial cell pathology requiring corneal transplantation. To date, quantitative analysis of endothelial cell abnormalities has been performed manually by ophthalmologists using time-consuming and highly subjective semi-automatic tools that require operator interaction. We developed and applied a fully automated, real-time system, termed the Corneal Endothelium Analysis System (CEAS), for the segmentation and computation of endothelial cells in images of the human cornea obtained by in vivo corneal confocal microscopy. First, a Fast Fourier Transform (FFT) band-pass filter is applied to reduce noise and enhance the image quality to make the cells more visible. Secondly, endothelial cell boundaries are detected using watershed transformations and Voronoi tessellations to accurately quantify the morphological parameters of the human corneal endothelial cells. The performance of the automated segmentation system was tested against manually traced ground-truth images based on a database consisting of 40 corneal confocal endothelial cell images, in terms of segmentation accuracy and obtained clinical features. In addition, the robustness and efficiency of the proposed CEAS system were compared with manually obtained cell densities using a separate database of 40 images from controls (n = 11), obese subjects (n = 16) and patients with diabetes (n = 13). The Pearson correlation coefficient between automated and manual endothelial cell densities is 0.9 (p < 0.0001), demonstrating the reliability of the CEAS system and the possibility of utilizing it in a real-world clinical setting to enable rapid diagnosis and patient follow-up, with an execution time of only 6 seconds per image. Copyright © 2018 Elsevier B.V. All rights reserved.
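The first stage, an FFT band-pass filter, can be sketched as masking an annulus of spatial frequencies; the cutoff radii here are illustrative, not the CEAS values:

```python
import numpy as np

def fft_bandpass(image, low, high):
    """Keep spatial frequencies with radius in [low, high] cycles/image.

    Suppresses the DC/low-frequency illumination gradient and the
    highest-frequency noise, leaving the cell-scale structure.
    """
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)      # radial frequency coordinate
    mask = (r >= low) & (r <= high)
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

img = np.random.default_rng(2).random((64, 64))   # stand-in for a confocal image
filtered = fft_bandpass(img, low=2, high=20)
```

After filtering, a watershed transform on the enhanced image yields the cell boundaries; that stage needs an image-processing library and is omitted from this sketch.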

  3. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    Science.gov (United States)

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor, followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri, and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P > 0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.
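EC50 values like those compared above can be estimated from a dose-response series; a minimal log-linear interpolation sketch on invented data (real practice typically fits a full sigmoidal model):

```python
import math

def ec50(concentrations, inhibition):
    """Log-linear interpolation between the two points bracketing 50%."""
    points = list(zip(concentrations, inhibition))
    for (c1, i1), (c2, i2) in zip(points, points[1:]):
        if i1 <= 50.0 <= i2:
            t = (50.0 - i1) / (i2 - i1)
            return 10.0 ** (math.log10(c1) + t * (math.log10(c2) - math.log10(c1)))
    raise ValueError("50% inhibition not bracketed by the data")

conc = [1.0, 10.0, 100.0, 1000.0]    # concentration series (e.g. mg/L), ascending
inhib = [5.0, 30.0, 70.0, 95.0]      # % luminescence inhibition at each point
half_max = ec50(conc, inhib)         # falls between 10 and 100 on a log scale
```

A lower EC50 means less compound is needed to halve the luminescence, i.e. higher toxicity, which is how the IL comparisons above should be read.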

  4. Photochemical-chemiluminometric determination of aldicarb in a fully automated multicommutation based flow-assembly

    International Nuclear Information System (INIS)

    Palomeque, M.; Garcia Bautista, J.A.; Catala Icardo, M.; Garcia Mateo, J.V.; Martinez Calatayud, J.

    2004-01-01

    A sensitive and fully automated method for the determination of aldicarb in technical formulations (Temik) and mineral waters is proposed. The automation of the flow assembly is based on the multicommutation approach, which uses a set of solenoid valves acting as independent switches. The operating cycle for obtaining a typical analytical transient signal can be easily programmed by means of home-made software running in the Windows environment. The manifold is provided with a photoreactor consisting of a 150 cm long x 0.8 mm i.d. piece of PTFE tubing coiled around a 20 W low-pressure mercury lamp. The determination of aldicarb is performed on the basis of the iron(III)-catalyzed mineralization of the pesticide by UV irradiation (150 s) and the chemiluminescent (CL) behavior of the photodegraded pesticide in the presence of potassium permanganate and quinine sulphate as sensitizer. UV irradiation of aldicarb turns the very weakly chemiluminescent pesticide into a strongly chemiluminescent photoproduct. The method is linear over the range 2.2-100.0 μg l⁻¹ of aldicarb; the limit of detection is 0.069 μg l⁻¹; the reproducibility (as the R.S.D. of 20 peaks of a 24 μg l⁻¹ solution) is 3.7%; and the sample throughput is 17 h⁻¹.

  5. Fully automated synthesis system of 3'-deoxy-3'-[18F]fluorothymidine

    International Nuclear Information System (INIS)

    Oh, Seung Jun; Mosdzianowski, Christoph; Chi, Dae Yoon; Kim, Jung Young; Kang, Se Hun; Ryu, Jin Sook; Yeo, Jeong Seok; Moon, Dae Hyuk

    2004-01-01

    We developed a new fully automated method for the synthesis of 3'-deoxy-3'-[18F]fluorothymidine ([18F]FLT) by modifying a commercial FDG synthesizer and its disposable fluid pathway. The optimal labeling condition was heating 40 mg of precursor in acetonitrile (2 mL) at 150 °C for 100 s, followed by heating at 85 °C for 450 s and hydrolysis with 1 N HCl at 105 °C for 300 s. Using 3.7 GBq of [18F]F⁻ as starting activity, [18F]FLT was obtained with a yield of 50.5±5.2% (n=28, decay corrected) within 60.0±5.4 min including HPLC purification. With 37.0 GBq, we obtained 48.7±5.6% (n=10). The [18F]FLT showed good stability for 6 h. This new automated synthesis procedure combines high and reproducible yields with the benefits of a disposable cassette system.
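    The yields above are quoted "decay corrected", i.e. scaled back to the start-of-synthesis activity. A minimal sketch of that correction for fluorine-18 (half-life 109.77 min); the function name is mine, not the authors':

```python
F18_HALF_LIFE_MIN = 109.77  # physical half-life of fluorine-18, in minutes

def decay_corrected_yield_pct(product_gbq, start_gbq, elapsed_min):
    """Radiochemical yield (%) corrected back to the start of synthesis:
    the measured product activity is scaled up by the decay accumulated
    over the elapsed synthesis time."""
    corrected_gbq = product_gbq * 2.0 ** (elapsed_min / F18_HALF_LIFE_MIN)
    return 100.0 * corrected_gbq / start_gbq
```

    For example, 1.0 GBq of product isolated one half-life after starting from 4.0 GBq corresponds to a decay-corrected yield of 50%.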

  6. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    Science.gov (United States)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
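    The core of such a single-frequency Gabor analysis can be sketched as follows. This is a simplified stand-in for the authors' method: the kernel parameters are illustrative, and the crude contrast normalization below is my assumption, not the paper's 0-to-1 normalization. A quadrature pair of Gabor kernels tuned to the striation frequency yields a response magnitude that is high for regular, sarcomere-like stripes and low for disordered texture:

```python
import numpy as np

def gabor_pair(freq, theta=0.0, sigma=4.0, size=25):
    """Quadrature pair of Gabor kernels tuned to one spatial frequency."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    env = np.exp(-(x ** 2 + y ** 2) / (2.0 * sigma ** 2))
    even = env * np.cos(2.0 * np.pi * freq * xr)
    odd = env * np.sin(2.0 * np.pi * freq * xr)
    even -= even.mean()  # zero response on flat regions
    return even, odd

def _conv_same(img, ker):
    """FFT convolution cropped back to the input size."""
    s = (img.shape[0] + ker.shape[0] - 1, img.shape[1] + ker.shape[1] - 1)
    full = np.fft.irfft2(np.fft.rfft2(img, s) * np.fft.rfft2(ker, s), s)
    h, w = ker.shape
    return full[h // 2:h // 2 + img.shape[0], w // 2:w // 2 + img.shape[1]]

def gabor_score(image, freq, theta=0.0):
    """Mean Gabor response magnitude, crudely normalized by image contrast.
    Higher values indicate more energy at the tuned striation frequency."""
    even, odd = gabor_pair(freq, theta)
    mag = np.hypot(_conv_same(image, even), _conv_same(image, odd))
    contrast = np.abs(image - image.mean()).mean() * np.abs(even).sum()
    return float(mag.mean() / (contrast + 1e-12))
```

    On a synthetic image of regular stripes at the tuned frequency, the score is substantially higher than on white noise of the same standard deviation, which is the discrimination the disorder mapping relies on.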

  7. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. A fully automated FTIR system for remote sensing of greenhouse gases in the tropics

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-07-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network. It will provide continuous ground-based measurements of column-averaged volume mixing ratios for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. First results of total column measurements at Jena, Germany show that the instrument works well and can resolve the diurnal as well as the seasonal cycle of CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  9. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after onset of symptoms from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow, cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms including the piecewise continuous regression method to determine bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area on motion-corrected MTT maps and compared that with time-to-peak (TTP) maps using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
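    The log-linear curve fitting mentioned above exploits the fact that the gamma-variate bolus model becomes linear in its unknowns after taking logarithms. A minimal sketch on synthetic data (not the authors' implementation; variable names are mine):

```python
import numpy as np

def fit_gamma_variate(t, c, t0=0.0):
    """Log-linear fit of the gamma-variate bolus model
        c(t) = K * (t - t0)**alpha * exp(-(t - t0) / beta),  t > t0.
    Taking logarithms gives log c = log K + alpha*log(t-t0) - (t-t0)/beta,
    which is linear in the unknowns and solvable by ordinary least squares."""
    t = np.asarray(t, dtype=float)
    c = np.asarray(c, dtype=float)
    m = (t > t0) & (c > 0)          # logs are only defined on this subset
    tt = t[m] - t0
    A = np.column_stack([np.ones(tt.size), np.log(tt), tt])
    coef, *_ = np.linalg.lstsq(A, np.log(c[m]), rcond=None)
    return np.exp(coef[0]), coef[1], -1.0 / coef[2]   # K, alpha, beta
```

    On a noise-free synthetic curve the three parameters are recovered essentially exactly; in practice the fit is applied to the measured concentration-time curve after BAT detection.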

  10. An Automation System for Optimizing a Supply Chain Network Design under the Influence of Demand Uncertainty

    OpenAIRE

    Polany, Rany

    2012-01-01

    This research develops and applies an integrated hierarchical framework for modeling a multi-echelon supply chain network design, under the influence of demand uncertainty. The framework is a layered integration of two levels: macro, high-level scenario planning combined with micro, low-level Monte Carlo simulation of uncertainties in demand. To facilitate rapid simulation of the effects of demand uncertainty, the integrated framework was implemented as a dashboard automation system using Mic...
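    The micro-level Monte Carlo layer can be illustrated with a toy single-echelon example. All names here, and the truncated-normal demand assumption, are mine for illustration, not the thesis's model:

```python
import random

def simulate_service_level(order_qty, demand_mean, demand_sd, n=20000, seed=7):
    """Monte Carlo estimate of the fill rate (fraction of demand served)
    for one echelon under normally distributed, truncated-at-zero demand."""
    rng = random.Random(seed)
    filled = 0.0
    total = 0.0
    for _ in range(n):
        demand = max(0.0, rng.gauss(demand_mean, demand_sd))
        filled += min(order_qty, demand)
        total += demand
    return filled / total
```

    Sweeping `order_qty` over the macro-level scenarios then exposes the trade-off between stocking cost and service level under demand uncertainty, which is the kind of question the dashboard is built to answer.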

  11. [18F]FMeNER-D2: Reliable fully-automated synthesis for visualization of the norepinephrine transporter

    International Nuclear Information System (INIS)

    Rami-Mark, Christina; Zhang, Ming-Rong; Mitterhauser, Markus; Lanzenberger, Rupert; Hacker, Marcus; Wadsak, Wolfgang

    2013-01-01

    Purpose: In neurodegenerative diseases and neuropsychiatric disorders, dysregulation of the norepinephrine transporter (NET) has been reported. PET imaging can be used for visualization of NET availability and occupancy in the human brain. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [18F]FMeNER-D2 shows the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [18F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Procedures: Synthesis of [18F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20–30 GBq [18F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[18F]fluoromethane-D2 ([18F]BFM) and reaction of the pure [18F]BFM with the unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedures were completed, formulation and sterile filtration were achieved on-line, and full quality control was performed. Results: Purified product was obtained in a fully automated synthesis at clinical scale, allowing maximum radiation safety and routine production in a GMP-like manner. So far, more than 25 fully automated syntheses have been successfully performed, yielding 1.0–2.5 GBq of formulated [18F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. Conclusions: A first fully automated [18F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization.

  12. [{sup 18}F]FMeNER-D2: Reliable fully-automated synthesis for visualization of the norepinephrine transporter

    Energy Technology Data Exchange (ETDEWEB)

    Rami-Mark, Christina [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Department of Inorganic Chemistry, University of Vienna (Austria); Zhang, Ming-Rong [Molecular Imaging Center, National Institute of Radiological Sciences, Chiba (Japan); Mitterhauser, Markus [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Hospital Pharmacy of the General Hospital of Vienna (Austria); Lanzenberger, Rupert [Department of Psychiatry and Psychotherapy, Medical University of Vienna (Austria); Hacker, Marcus [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Wadsak, Wolfgang [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Department of Inorganic Chemistry, University of Vienna (Austria)

    2013-11-15

    Purpose: In neurodegenerative diseases and neuropsychiatric disorders, dysregulation of the norepinephrine transporter (NET) has been reported. PET imaging can be used for visualization of NET availability and occupancy in the human brain. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [{sup 18}F]FMeNER-D2 shows the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [{sup 18}F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Procedures: Synthesis of [{sup 18}F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20–30 GBq [{sup 18}F]fluoride, azeotropic drying, reaction with Br{sub 2}CD{sub 2}, distillation of 1-bromo-2-[{sup 18}F]fluoromethane-D2 ([{sup 18}F]BFM) and reaction of the pure [{sup 18}F]BFM with the unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedures were completed, formulation and sterile filtration were achieved on-line, and full quality control was performed. Results: Purified product was obtained in a fully automated synthesis at clinical scale, allowing maximum radiation safety and routine production in a GMP-like manner. So far, more than 25 fully automated syntheses have been successfully performed, yielding 1.0–2.5 GBq of formulated [{sup 18}F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. Conclusions: A first fully automated [{sup 18}F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization.

  13. [18F]FMeNER-D2: reliable fully-automated synthesis for visualization of the norepinephrine transporter.

    Science.gov (United States)

    Rami-Mark, Christina; Zhang, Ming-Rong; Mitterhauser, Markus; Lanzenberger, Rupert; Hacker, Marcus; Wadsak, Wolfgang

    2013-11-01

    In neurodegenerative diseases and neuropsychiatric disorders, dysregulation of the norepinephrine transporter (NET) has been reported. PET imaging can be used for visualization of NET availability and occupancy in the human brain. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [(18)F]FMeNER-D2 shows the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [(18)F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Synthesis of [(18)F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20-30 GBq [(18)F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[(18)F]fluoromethane-D2 ([(18)F]BFM) and reaction of the pure [(18)F]BFM with the unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedures were completed, formulation and sterile filtration were achieved on-line, and full quality control was performed. Purified product was obtained in a fully automated synthesis at clinical scale, allowing maximum radiation safety and routine production in a GMP-like manner. So far, more than 25 fully automated syntheses have been successfully performed, yielding 1.0-2.5 GBq of formulated [(18)F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. A first fully automated [(18)F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization. © 2013.

  14. Opportunities for Energy Efficiency and Automated Demand Response in Industrial Refrigerated Warehouses in California

    Energy Technology Data Exchange (ETDEWEB)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Rockoff, Alexandra; Piette, Mary Ann

    2009-05-11

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and open automated demand response opportunities for industrial refrigerated warehouses in California. The report describes refrigerated warehouses characteristics, energy use and demand, and control systems. It also discusses energy efficiency and open automated demand response opportunities and provides analysis results from three demand response studies. In addition, several energy efficiency, load management, and demand response case studies are provided for refrigerated warehouses. This study shows that refrigerated warehouses can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for open automated demand response (OpenADR) at little additional cost. These improved controls may prepare facilities to be more receptive to OpenADR due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  15. Automated leak localization performance without detailed demand distribution data

    NARCIS (Netherlands)

    Moors, Janneke; Scholten, L.; van der Hoek, J.P.; den Besten, J.

    2018-01-01

    Automatic leak localization has been suggested to reduce the time and personnel efforts needed to localize (small) leaks. Yet, the available methods require a detailed demand distribution model for successful calibration and good leak localization performance. The main aim of this work was

  16. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol.

    Science.gov (United States)

    Block, Gladys; Azar, Kristen Mj; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-21

    In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost-effective and widely scalable are needed to prevent diabetes. Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. The randomized trial will provide rigorous evidence regarding the efficacy of this Web- and Internet-based program in reducing or

  17. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of the DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and a description of the demand response automation server (DRAS), the client/server architecture-based middleware used to automate the interactions between utilities (or any DR serving entity) and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between the utility/ISO and the clients at the facilities.
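    On the facility side, the client's job reduces to mapping a received DR signal onto a pre-programmed shed strategy. The sketch below is purely illustrative: the signal-level names, the actions, and the opt-out flag are hypothetical placeholders echoing the pre-programmed strategies and manager override described in this body of work, not the DRAS protocol itself:

```python
# Hypothetical pre-programmed shed strategies keyed by DR signal level.
# The level names and actions are illustrative, not from the paper.
SHED_ACTIONS = {
    "normal": [],
    "moderate": ["raise_setpoint_2F", "dim_lighting_30pct"],
    "high": ["raise_setpoint_4F", "dim_lighting_50pct", "cycle_ahu"],
}

def plan_shed(signal_level, opt_out=False):
    """Map a received DR signal to the facility's pre-programmed shed plan.
    An opted-out facility (manager override) sheds nothing; unknown
    signal levels are treated conservatively as 'no action'."""
    if opt_out:
        return []
    return list(SHED_ACTIONS.get(signal_level, []))
```

    Keeping the strategy as data rather than code is what lets facility staff pre-program and audit the response without touching the client that talks to the DRAS.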

  18. Analysis of xanthines in beverages using a fully automated SPE-SPC-DAD hyphenated system

    Energy Technology Data Exchange (ETDEWEB)

    Medvedovici, A. [Bucarest Univ., Bucarest (Romania). Faculty of Chemistry, Dept. of Analytical Chemistry; David, F.; David, V.; Sandra, P. [Research Institute of Chromatography, Kortrijk (Belgium)

    2000-08-01

    Analysis of some xanthines (caffeine, theophylline and theobromine) in beverages has been achieved by a fully automated on-line Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD) system. Three adsorbents were tested for the SPE procedure: octadecyl-modified silica gel (ODS) and two types of styrene-divinylbenzene copolymer-based materials, of which Porapak proved to be the most suitable adsorbent. Optimization and correlation of the SPE and SFC operational parameters are also discussed. By this technique, caffeine was determined in ice tea and Coca-Cola at a concentration of 0.15 ppm, theobromine at 1.5 ppb, and theophylline at 0.15 ppb. [Italian, translated] The analysis of some xanthines (caffeine, theophylline and theobromine) was carried out with a fully automated on-line system based on Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD). Three substrates were evaluated for the SPE procedure: octadecyl silica (ODS) and two types of styrene-divinylbenzene polymeric materials, of which the one named PRP-1 proved to be the most efficient. Both the optimization and the correlation of the operational parameters for SPE and SFC are discussed. With this technique, caffeine, theobromine and theophylline were determined in iced tea and Coca-Cola at concentrations of 0.15, 1.5 and 0.15 ppm.

  19. Development of a fully automated adaptive unsharp masking technique in digital chest radiograph

    International Nuclear Information System (INIS)

    Abe, Katsumi; Katsuragawa, Shigehiko; Sasaki, Yasuo

    1991-01-01

    We are developing a fully automated adaptive unsharp masking technique whose parameters vary with the regional image features of a digital chest radiograph. A chest radiograph includes various regions, such as the lung fields, retrocardiac area and spine, whose texture patterns and optical densities are extremely different. Therefore, the image contrast of each region needs to be enhanced with its own optimum parameters. First, we investigated the optimum weighting factors and mask sizes for the unsharp masking technique in a digital chest radiograph. Then, a chest radiograph is automatically divided into three segments, one for the lung field, one for the retrocardiac area, and one for the spine, by using histogram analysis of pixel values. Finally, high-frequency components of the lung field and retrocardiac area are selectively enhanced with a small mask size and mild weighting factors, previously determined as optimum parameters, while low-frequency components of the spine are enhanced with a large mask size and adequate weighting factors. The processed image shows excellent depiction of the lung field, retrocardiac area and spine simultaneously, each with optimum contrast. Our image processing technique may be useful for the diagnosis of chest radiographs. (author)
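    The region-dependent unsharp masking just described can be sketched as follows. The thresholds, weighting factors, and mask sizes below are placeholders, not the paper's optimized values, and the threshold-based segmentation is a simple stand-in for the histogram analysis:

```python
import numpy as np

def box_blur(img, size):
    """Separable box blur of odd width `size` (the smoothed 'mask')."""
    k = np.ones(size) / size
    pad = size // 2
    blur_1d = lambda v: np.convolve(np.pad(v, pad, mode="edge"), k, "valid")
    tmp = np.apply_along_axis(blur_1d, 1, img)   # blur rows
    return np.apply_along_axis(blur_1d, 0, tmp)  # then columns

def adaptive_unsharp(img, thresholds=(85.0, 170.0),
                     weights=(0.5, 1.0, 2.0), sizes=(31, 9, 9)):
    """Unsharp masking with a per-region weighting factor and mask size.
    Pixel-value thresholds stand in for the histogram-based segmentation;
    within each region, out = img + w * (img - blurred)."""
    img = np.asarray(img, dtype=float)
    lo, hi = thresholds
    regions = [img < lo, (img >= lo) & (img < hi), img >= hi]
    out = img.copy()
    for region, w, size in zip(regions, weights, sizes):
        blurred = box_blur(img, size)
        out[region] = img[region] + w * (img[region] - blurred[region])
    return out
```

    Applied to a step edge between a dark and a bright region, the filter increases the local contrast across the edge while leaving flat areas untouched, which is the behavior the per-region parameters are tuned to control.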

  20. Fully automated drug screening of dried blood spots using online LC-MS/MS analysis

    Directory of Open Access Journals (Sweden)

    Stefan Gaugler

    2018-01-01

    Full Text Available A new and fully automated workflow for the cost-effective drug screening of large populations based on dried blood spot (DBS) technology was introduced in this study. DBS were prepared by spotting 15 μL of whole blood, previously spiked with alprazolam, amphetamine, cocaine, codeine, diazepam, fentanyl, lysergic acid diethylamide (LSD), 3,4-methylenedioxymethamphetamine (MDMA), methadone, methamphetamine, morphine and oxycodone, onto filter paper cards. The dried spots were scanned, spiked with deuterated standards and directly extracted. The extract was transferred online to an analytical LC column and then to the electrospray ionization tandem mass spectrometry system. All drugs were quantified at their cut-off level, and good precision and correlation within the calibration range were obtained. The method was finally applied to DBS samples from two patients with back pain; codeine and oxycodone could be identified and accurately quantified below the level of misuse, at 89.6 ng/mL and 39.6 ng/mL respectively.

  1. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    International Nuclear Information System (INIS)

    Della Gala, Giuseppe; Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M.; Lanconelli, Nico; Petit, Steven F.

    2017-01-01

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated, clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to manually generated VMAT plans created in the absence of time pressure. For 16 patients with a reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organ-at-risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significantly improved PTV coverage (V95% increased by 1.1% ± 1.1%), higher dose conformity (R50 reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, and 3.6 Gy ± 2.8 Gy, respectively).
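    The plan-quality metrics quoted above (V95% coverage, R50 conformity) are simple voxel-counting quantities. A schematic, array-based sketch (illustrative definitions on flattened dose arrays, not the clinical planning software):

```python
import numpy as np

def v95(ptv_dose, prescription):
    """V95%: percentage of PTV voxels receiving at least 95% of the
    prescription dose (higher means better target coverage)."""
    ptv_dose = np.asarray(ptv_dose, dtype=float)
    return 100.0 * float(np.mean(ptv_dose >= 0.95 * prescription))

def r50(body_dose, ptv_voxels, prescription):
    """R50 conformity index: volume receiving at least 50% of the
    prescription divided by the PTV volume (lower is more conformal)."""
    body_dose = np.asarray(body_dose, dtype=float)
    return float((body_dose >= 0.5 * prescription).sum()) / ptv_voxels
```

    In a real system both metrics are evaluated on the 3-D dose grid with voxel volumes; the flattened arrays here only illustrate the thresholding logic.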

  2. Fully automated deformable registration of breast DCE-MRI and PET/CT

    Science.gov (United States)

    Dmitriev, I. D.; Loo, C. E.; Vogel, W. V.; Pengel, K. E.; Gilhuijs, K. G. A.

    2013-02-01

    Accurate characterization of breast tumors is important for the appropriate selection of therapy and monitoring of the response. For this purpose, breast imaging and tissue biopsy are important aspects. In this study, a fully automated method for deformable registration of DCE-MRI and PET/CT of the breast is presented. The registration is performed using the CT component of the PET/CT and the pre-contrast T1-weighted non-fat-suppressed MRI. Comparable patient setup protocols were used during the MRI and PET examinations in order to avoid having to make assumptions about the biomechanical properties of the breast during and after the application of chemotherapy. The registration uses a multi-resolution approach to speed up the process and to minimize the probability of converging to local minima. The validation was performed on 140 breasts (70 patients). Of all registration cases, 94.2% of the breasts were aligned within 4.0 mm accuracy (1 PET voxel). Fused information may be beneficial for obtaining representative biopsy samples, which in turn will benefit the treatment of the patient.

  3. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    Science.gov (United States)

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volumetric modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with a commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  4. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  5. Breast Density Estimation with Fully Automated Volumetric Method: Comparison to Radiologists' Assessment by BI-RADS Categories.

    Science.gov (United States)

    Singh, Tulika; Sharma, Madhurima; Singla, Veenu; Khandelwal, Niranjan

    2016-01-01

    The objective of our study was to calculate mammographic breast density with a fully automated volumetric breast density measurement method and to compare it to Breast Imaging Reporting and Data System (BI-RADS) breast density categories assigned by two radiologists. A total of 476 full-field digital mammography examinations with standard mediolateral oblique and craniocaudal views were evaluated by two blinded radiologists and BI-RADS density categories were assigned. Using a fully automated software, mean fibroglandular tissue volume, mean breast volume, and mean volumetric breast density were calculated. Based on percentage volumetric breast density, a volumetric density grade was assigned from 1 to 4. The weighted overall kappa was 0.895 (almost perfect agreement) for the two radiologists' BI-RADS density estimates. A statistically significant difference was seen in mean volumetric breast density among the BI-RADS density categories. With increased BI-RADS density category, increase in mean volumetric breast density was also seen (P BI-RADS categories and volumetric density grading by fully automated software (ρ = 0.728, P BI-RADS density category by two observers showed fair agreement (κ = 0.398 and 0.388, respectively). In our study, a good correlation was seen between density grading using the fully automated volumetric method and density grading using BI-RADS density categories assigned by the two radiologists. Thus, the fully automated volumetric method may be used to quantify breast density on routine mammography. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
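
    The volumetric density described above is simply the fibroglandular-to-total breast volume ratio, banded into a 1-4 grade. A minimal Python sketch; the cutoff values (4.5%, 7.5%, 15.5%) follow commonly used Volpara-style grade boundaries and are illustrative assumptions, since the abstract does not state the study's thresholds:

```python
def volumetric_density_grade(fibroglandular_cm3, breast_cm3,
                             cutoffs=(4.5, 7.5, 15.5)):
    """Percent volumetric breast density and an illustrative 1-4 grade.

    The cutoffs mirror Volpara-style density-grade boundaries and are
    illustrative only; the study's exact thresholds are not given in
    the abstract."""
    density = 100.0 * fibroglandular_cm3 / breast_cm3
    # Each cutoff the density meets or exceeds bumps the grade by one.
    grade = 1 + sum(density >= c for c in cutoffs)
    return density, grade
```

For example, 50 cm³ of fibroglandular tissue in a 1000 cm³ breast gives 5% density, falling in the second band under these assumed cutoffs.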

  6. A novel fully automated molecular diagnostic system (AMDS) for colorectal cancer mutation detection.

    Directory of Open Access Journals (Sweden)

    Shiro Kitano

    Full Text Available BACKGROUND: KRAS, BRAF and PIK3CA mutations are frequently observed in colorectal cancer (CRC). In particular, KRAS mutations are strong predictors for clinical outcomes of EGFR-targeted treatments such as cetuximab and panitumumab in metastatic colorectal cancer (mCRC). For mutation analysis, the current methods are time-consuming and not readily available to all oncologists and pathologists. We have developed a novel, simple, sensitive and fully automated molecular diagnostic system (AMDS) for point-of-care testing (POCT). Here we report the results of a comparison study between AMDS and direct sequencing (DS) in the detection of KRAS, BRAF and PIK3CA somatic mutations. METHODOLOGY/PRINCIPAL FINDINGS: DNA was extracted from a slice of either frozen (n = 89) or formalin-fixed and paraffin-embedded (FFPE) CRC tissue (n = 70), and then used for mutation analysis by AMDS and DS. All mutations (n = 41 among frozen and 27 among FFPE samples) detected by DS were also successfully (100%) detected by the AMDS. However, 8 frozen and 6 FFPE samples detected as wild-type in the DS analysis were shown as mutants in the AMDS analysis. By cloning-sequencing assays, these discordant samples were confirmed as true mutants. One sample had simultaneous "hot spot" mutations of KRAS and PIK3CA, and the cloning assay confirmed that E542K and E545K were not on the same allele. Genotyping call rates for DS were 100.0% (89/89) and 74.3% (52/70) in frozen and FFPE samples, respectively, for the first attempt, whereas that of AMDS was 100.0% for both sample sets. With automated DNA extraction and mutation detection by AMDS, all mutations in frozen tissues (n = 41) were successfully detected within 70 minutes. CONCLUSIONS/SIGNIFICANCE: AMDS has superior sensitivity and accuracy over DS, and is much easier to execute than conventional, labor-intensive manual mutation analysis. AMDS has great potential as POCT equipment for mutation analysis.

  7. Evaluation of automated residential demand response with flat and dynamic pricing

    International Nuclear Information System (INIS)

    Swisher, Joel; Wang, Kitty; Stewart, Stewart

    2005-01-01

    This paper reviews the performance of two recent automated load management programs for residential customers of electric utilities in two American states. Both pilot programs have been run with about 200 participant houses each, and both programs have control populations of similar customers without the technology or program treatment. In both cases, the technology used in the pilot is GoodWatts, an advanced, two-way, real-time, comprehensive home energy management system. The purpose of each pilot is to determine the household kW reduction in coincident peak electric load from the energy management technology. Nevada Power has conducted a pilot program for Air-Conditioning Load Management (ACLM), in which customers are sent an electronic curtailment signal for three-hour intervals during times of maximum peak demand. The participating customers receive an annual incentive payment, but otherwise they are on a conventional utility tariff. In California, three major utilities are jointly conducting a pilot demonstration of an Automated Demand Response System (ADRS). Customers are on a time-of-use (ToU) tariff, which includes a critical peak pricing (CPP) element. During times of maximum peak demand, customers are sent an electronic price signal that is three times higher than the normal on-peak price. Houses with the automated GoodWatts technology reduced their demand in both the ACLM and the ADRS programs by about 50% consistently across the summer curtailment or super peak events, relative to homes without the technology or any load management program or tariff in place. The absolute savings were greater in the ACLM program, due to the higher baseline air conditioning loads in the hotter Las Vegas climate. The results suggest that either automated technology or dynamic pricing can deliver significant demand response in low-consumption houses. However, for high-consumption houses, automated technology can reduce load by a greater absolute kWh difference. 

  8. A fully automated temperature-dependent resistance measurement setup using van der Pauw method

    Science.gov (United States)

    Pandey, Shivendra Kumar; Manivannan, Anbarasu

    2018-03-01

    The van der Pauw (VDP) method is widely used to identify the resistance of planar homogeneous samples with four contacts placed on their periphery. We have developed a fully automated thin-film resistance measurement setup using the VDP method, capable of precisely measuring a wide range of thin-film resistances from a few mΩ up to 10 GΩ at controlled temperatures from room temperature up to 600 °C. The setup utilizes a robust, custom-designed switching network board (SNB) for measuring current-voltage characteristics automatically in four different source-measure configurations based on the VDP method. Moreover, the SNB is connected with low-noise shielded coaxial cables that reduce the effects of leakage current and capacitance in the circuit, thereby enhancing measurement accuracy. To enable precise and accurate resistance measurement of the sample, a wide range of sourcing currents/voltages is pre-determined, with auto-tuning capability over ~12 orders of magnitude of variation in resistance. Furthermore, the setup has been calibrated with standard samples and employed to investigate temperature-dependent resistance (few Ω to 10 GΩ) for various chalcogenide-based phase-change thin films (Ge2Sb2Te5, Ag5In5Sb60Te30, and In3SbTe2). This setup would be highly helpful for measuring the temperature-dependent resistance of a wide range of materials, i.e., metals, semiconductors, and insulators, illuminating structural changes with temperature as reflected by changes in resistance, which is useful for numerous applications.
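
    Given the two measured VDP resistances (vertical and horizontal contact configurations), the sheet resistance follows from the standard van der Pauw relation exp(-πR_v/R_s) + exp(-πR_h/R_s) = 1, which has no closed-form solution in general and must be solved numerically. A minimal Python sketch of that step (illustrative, not the authors' software):

```python
import math

def sheet_resistance(r_vertical, r_horizontal):
    """Solve the van der Pauw equation
        exp(-pi * R_v / R_s) + exp(-pi * R_h / R_s) = 1
    for the sheet resistance R_s by bisection (the left-hand side
    increases monotonically with R_s)."""
    def f(rs):
        return (math.exp(-math.pi * r_vertical / rs)
                + math.exp(-math.pi * r_horizontal / rs) - 1.0)

    lo = 1e-6 * max(r_vertical, r_horizontal)   # f(lo) < 0
    hi = 1e6 * max(r_vertical, r_horizontal)    # f(hi) > 0
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For the symmetric case R_v = R_h = R this reduces to the well-known R_s = πR/ln 2 ≈ 4.532 R.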

  9. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling when employed in conjunction with liquid capture followed by nanoelectrospray ionization provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm × 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  10. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Directory of Open Access Journals (Sweden)

    Yu Huang

    Full Text Available Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.

  11. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Science.gov (United States)

    Huang, Yu; Parra, Lucas C

    2015-01-01

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.
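
    The core idea of combining a per-class intensity likelihood with an MRF smoothness prior can be illustrated with a minimal iterated-conditional-modes (ICM) labelling sketch. This is an assumption-laden stand-in, not the paper's algorithm: it omits the atlas prior, uses a simple Potts penalty, updates labels synchronously, and lets np.roll wrap around image boundaries.

```python
import numpy as np

def icm_segment(image, means, sigmas, beta=1.0, n_iter=5):
    """MAP tissue labelling by iterated conditional modes: Gaussian
    intensity likelihood per class plus a Potts-model MRF smoothness
    prior. Illustrative sketch only (no atlas prior; synchronous
    updates; wraparound boundaries via np.roll)."""
    means = np.asarray(means, float)
    sigmas = np.asarray(sigmas, float)
    # Negative log-likelihood of each class at each pixel.
    nll = (0.5 * ((image[..., None] - means) / sigmas) ** 2
           + np.log(sigmas))
    labels = nll.argmin(axis=-1)          # initial maximum-likelihood labels
    for _ in range(n_iter):
        energy = nll.copy()
        for k in range(len(means)):
            # Count 4-neighbourhood disagreements with class k.
            disagree = np.zeros(image.shape)
            for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
                disagree += (np.roll(labels, shift, axis=axis) != k)
            energy[..., k] += beta * disagree
        labels = energy.argmin(axis=-1)
    return labels
```

With a sufficiently large beta, an isolated noisy pixel is relabelled to agree with its neighbours, which is exactly the smoothness behaviour the abstract emphasises.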

  12. Fully automated intrinsic respiratory and cardiac gating for small animal CT

    Energy Technology Data Exchange (ETDEWEB)

    Kuntz, J; Baeuerle, T; Semmler, W; Bartling, S H [Department of Medical Physics in Radiology, German Cancer Research Center, Heidelberg (Germany); Dinkel, J [Department of Radiology, German Cancer Research Center, Heidelberg (Germany); Zwick, S [Department of Diagnostic Radiology, Medical Physics, Freiburg University (Germany); Grasruck, M [Siemens Healthcare, Forchheim (Germany); Kiessling, F [Chair of Experimental Molecular Imaging, RWTH-Aachen University, Medical Faculty, Aachen (Germany); Gupta, R [Department of Radiology, Massachusetts General Hospital, Boston, MA (United States)], E-mail: j.kuntz@dkfz.de

    2010-04-07

    A fully automated, intrinsic gating algorithm for small animal cone-beam CT is described and evaluated. A parameter representing the organ motion, derived from the raw projection images, is used for both cardiac and respiratory gating. The proposed algorithm makes it possible to reconstruct motion-corrected still images as well as to generate four-dimensional (4D) datasets representing the cardiac and pulmonary anatomy of free-breathing animals without the use of electrocardiogram (ECG) or respiratory sensors. Variation analysis of projections from several rotations is used to place a region of interest (ROI) on the diaphragm. The ROI is cranially extended to include the heart. The centre of mass (COM) variation within this ROI, the filtered frequency response and the local maxima are used to derive a binary motion-gating parameter for phase-sensitive gated reconstruction. This algorithm was implemented on a flat-panel-based cone-beam CT scanner and evaluated using a moving phantom and animal scans (seven rats and eight mice). Volumes were determined using a semiautomatic segmentation. In all cases robust gating signals could be obtained. The maximum volume error in phantom studies was less than 6%. By utilizing extrinsic gating via externally placed cardiac and respiratory sensors, the functional parameters (e.g. cardiac ejection fraction) and image quality were equivalent to this current gold standard. This algorithm obviates the necessity of both gating hardware and user interaction. The simplicity of the proposed algorithm enables adoption in a wide range of small animal cone-beam CT scanners.
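
    The key signal in the algorithm above is the per-projection centre of mass (COM) within a diaphragm/heart ROI, thresholded into a binary gate. A minimal Python sketch; here the ROI is a given input (in the paper it is placed automatically via variation analysis), and the quiet-phase thresholding rule is an illustrative assumption, not the authors' exact algorithm:

```python
import numpy as np

def com_gating_signal(projections, roi, frac=0.5):
    """Derive a binary motion-gating parameter from raw projections.

    projections: (n_frames, rows, cols) array of projection images.
    roi: (r0, r1, c0, c1) window over the diaphragm/heart region.
    Frames whose vertical centre of mass lies within `frac` of the
    motion extremum are flagged True (illustrative phase selection)."""
    r0, r1, c0, c1 = roi
    sub = projections[:, r0:r1, c0:c1].astype(float)
    rows = np.arange(r0, r1)
    # Vertical centre of mass of ROI intensity, one value per frame.
    com = (sub.sum(axis=2) * rows).sum(axis=1) / sub.sum(axis=(1, 2))
    com = com - com.mean()
    gate = np.abs(com) <= frac * np.abs(com).max()
    return com, gate
```

In the actual system the COM trace is additionally band-pass filtered to separate the cardiac and respiratory frequency components before gating.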

  13. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use

    DEFF Research Database (Denmark)

    Arnaud, Nicolas; Baldus, Christiane; Elgán, Tobias H.

    2016-01-01

    ).Conclusions: Although the study is limited by a large drop-out, significant between-group effects for alcohol use indicate that targeted brief motivational intervention in a fully automated Web-based format can be effective to reduce drinking and lessen existing substance use service barriers for at...... of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective: This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use......, and polydrug use. All outcome analyses were conducted with and without Expectation Maximization (EM) imputation of missing follow-up data. Results: In total, 2673 adolescents were screened and 1449 (54.2%) participants were randomized to the intervention or control group. After 3 months, 211 adolescents (14...

  14. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use

    DEFF Research Database (Denmark)

    Arnaud, Nicolas; Baldus, Christiane; Elgán, Tobias H.

    2016-01-01

    of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective: This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use...... methods and screened online for at-risk substance use using the CRAFFT (Car, Relax, Alone, Forget, Friends, Trouble) screening instrument. Participants were randomized to a single session brief motivational intervention group or an assessment-only control group but not blinded. Primary outcome......).Conclusions: Although the study is limited by a large drop-out, significant between-group effects for alcohol use indicate that targeted brief motivational intervention in a fully automated Web-based format can be effective to reduce drinking and lessen existing substance use service barriers for at...

  15. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method

    Science.gov (United States)

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.

    2017-08-01

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most-similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organ-at-risk criteria levels evaluated, compared with clinical plans. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical plans across the 12 patients tested, and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment
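
    The voxel-based dose-mimicking step can be pictured as minimising a voxel-wise loss between the deliverable dose and the predicted dose. The form below is an illustrative sketch (the paper's exact objective is not given in the abstract): target voxels are penalised for any deviation from the prediction, while organ-at-risk voxels are penalised one-sidedly, only where the plan exceeds the prediction:

```python
import numpy as np

def mimicking_objective(dose, predicted, target_mask, oar_mask,
                        w_target=1.0, w_oar=1.0):
    """Illustrative voxel-wise dose-mimicking loss.

    dose, predicted: flat arrays of dose per voxel.
    target_mask, oar_mask: boolean arrays selecting PTV and OAR voxels.
    Two-sided quadratic penalty inside the target, one-sided (overdose
    only) penalty inside organs at risk."""
    diff = dose - predicted
    target_term = w_target * np.sum(diff[target_mask] ** 2)
    oar_term = w_oar * np.sum(np.clip(diff[oar_mask], 0.0, None) ** 2)
    return target_term + oar_term
```

An optimizer over the machine parameters would then drive this loss down, which is what converts the predicted spatial dose into a deliverable plan.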

  16. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    Science.gov (United States)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology, biological function, and human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions

  17. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    Science.gov (United States)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. 
Each step in the measurement process (lysis, nucleic acid extraction, purification, and hybridization to an array) is assessed through comparison of the results obtained using the instrument with

  18. Implementation of a fully automated process purge-and-trap gas chromatograph at an environmental remediation site

    International Nuclear Information System (INIS)

    Blair, D.S.; Morrison, D.J.

    1997-01-01

    The AQUASCAN, a commercially available, fully automated purge-and-trap gas chromatograph from Sentex Systems Inc., was implemented and evaluated as an in-field, automated monitoring system for contaminated groundwater at an active DOE remediation site in Pinellas, FL. Though the AQUASCAN is designed as a stand-alone process analytical unit, implementation at this site required additional hardware, including a sample dilution system and a method for delivering standard solution to the gas chromatograph for automated calibration. As a result of the evaluation, the system was determined to be a reliable and accurate instrument. Concentration values reported by the AQUASCAN for methylene chloride, trichloroethylene, and toluene in the Pinellas groundwater were within 20% of reference laboratory values

  19. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    Energy Technology Data Exchange (ETDEWEB)

    Della Gala, Giuseppe [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Lanconelli, Nico [Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Petit, Steven F. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Massachusetts General Hospital - Harvard Medical School, Department of Radiation Oncology, Boston, MA (United States)

    2017-05-15

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated, clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to manually generated VMAT plans created in the absence of time pressure. For 16 patients with reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organ-at-risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significantly improved PTV coverage (V95% increased by 1.1% ± 1.1%), higher dose conformity (R50 reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, and 3.6 Gy ± 2.8 Gy, respectively; all p < 0.001). To render the six remaining autoVMAT plans clinically acceptable, a dosimetrist needed less than 10 min of hands-on time for fine-tuning. AutoVMAT plans were also considered equivalent or better than manually optimized VMAT plans. For 6/16 patients, autoVMAT allowed tumor dose escalation of 5-10 Gy. Clinically deliverable, high-quality autoVMAT plans can be generated fully automatically for the vast majority of advanced-stage NSCLC patients. For a subset of patients, autoVMAT allowed for tumor dose escalation. (orig.) [German] To develop a fully automated, multicriteria-based volumetric modulated arc therapy (VMAT) treatment planning procedure (autoVMAT) for curatively treated patients with stage III/IV non-small cell lung cancer (NSCLC). After configuring our auto
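
    The plan-quality metrics reported above (PTV V95% coverage and the R50 conformity index) can be computed directly from a voxel dose grid. A short sketch using standard definitions, which are assumptions here since the abstract does not define them: V95% as the percentage of PTV voxels receiving at least 95% of the prescription, and R50 as the ratio of the volume receiving at least 50% of the prescription to the PTV volume (uniform voxel size assumed):

```python
import numpy as np

def plan_quality_metrics(dose, ptv_mask, prescription):
    """PTV V95% and the R50 conformity index from a voxel dose grid.

    dose: flat array of dose per voxel (Gy); ptv_mask: boolean array
    selecting PTV voxels; prescription: prescribed dose (Gy).
    Assumes uniform voxel volume, so voxel counts stand in for volumes."""
    v95 = 100.0 * np.mean(dose[ptv_mask] >= 0.95 * prescription)
    r50 = np.sum(dose >= 0.5 * prescription) / np.sum(ptv_mask)
    return v95, r50
```

Lower R50 (closer to 1) means the intermediate-dose spillage hugs the target more tightly, which is the conformity improvement the abstract reports.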

  20. Fully automated atlas-based hippocampal volumetry for detection of Alzheimer's disease in a memory clinic setting.

    Science.gov (United States)

    Suppa, Per; Anker, Ulrich; Spies, Lothar; Bopp, Irene; Rüegger-Frey, Brigitte; Klaghofer, Richard; Gocke, Carola; Hampel, Harald; Beck, Sacha; Buchert, Ralph

    2015-01-01

    Hippocampal volume is a promising biomarker to enhance the accuracy of the diagnosis of dementia due to Alzheimer's disease (AD). However, whereas hippocampal volume is well studied in patient samples from clinical trials, its value in clinical routine patient care is still rather unclear. The aim of the present study, therefore, was to evaluate fully automated atlas-based hippocampal volumetry for detection of AD in the setting of a secondary care expert memory clinic for outpatients. One-hundred consecutive patients with memory complaints were clinically evaluated and categorized into three diagnostic groups: AD, intermediate AD, and non-AD. A software tool based on open source software (Statistical Parametric Mapping SPM8) was employed for fully automated tissue segmentation and stereotactical normalization of high-resolution three-dimensional T1-weighted magnetic resonance images. Predefined standard masks were used for computation of grey matter volume of the left and right hippocampus which then was scaled to the patient's total grey matter volume. The right hippocampal volume provided an area under the receiver operating characteristic curve of 84% for detection of AD patients in the whole sample. This indicates that fully automated MR-based hippocampal volumetry fulfills the requirements for a relevant core feasible biomarker for detection of AD in everyday patient care in a secondary care memory clinic for outpatients. The software used in the present study has been made freely available as an SPM8 toolbox. It is robust and fast so that it is easily integrated into routine workflow.
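
    The reported 84% area under the ROC curve for a scalar marker (scaled hippocampal volume, expected to be lower in AD) is equivalent to the probability that a randomly chosen AD case ranks below a randomly chosen non-AD case. A minimal Python sketch via the Mann-Whitney formulation (illustrative; variable names are assumptions):

```python
def roc_auc(ad_values, non_ad_values):
    """Area under the ROC curve for separating AD from non-AD cases
    by a scalar marker where AD cases are expected to be LOWER
    (e.g. hippocampal volume scaled to total grey matter). Computed
    as the Mann-Whitney U statistic normalised by the number of pairs."""
    wins = 0.0
    for a in ad_values:
        for n in non_ad_values:
            if a < n:          # lower volume correctly ranks the AD case
                wins += 1.0
            elif a == n:       # ties count half
                wins += 0.5
    return wins / (len(ad_values) * len(non_ad_values))
```

An AUC of 0.5 corresponds to chance-level separation, 1.0 to perfect separation.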

  1. A fully automated conversational agent for promoting mental well-being: A pilot RCT using mixed methods

    Directory of Open Access Journals (Sweden)

    Kien Hoa Ly

    2017-12-01

    Full Text Available Fully automated self-help interventions can serve as highly cost-effective mental health promotion tools for large numbers of people. However, these interventions are often characterised by poor adherence. One way to address this problem is to mimic therapy support by a conversational agent. The objectives of this study were to assess the effectiveness and adherence of a smartphone app delivering strategies used in positive psychology and CBT interventions via an automated chatbot (Shim) for a non-clinical population, as well as to explore participants' views and experiences of interacting with this chatbot. A total of 28 participants were randomized to either receive the chatbot intervention (n = 14) or to a wait-list control group (n = 14). Findings revealed that participants who adhered to the intervention (n = 13) showed significant interaction effects of group and time on psychological well-being (FS) and perceived stress (PSS-10) compared to the wait-list control group, with small to large between-group effect sizes (Cohen's d range 0.14–1.06). Also, the participants showed high engagement during the 2-week-long intervention, with an average open-app ratio of 17.71 times for the whole period. This is higher compared to other studies on fully automated interventions claiming to be highly engaging, such as Woebot and the Panoply app. The qualitative data revealed sub-themes which, to our knowledge, have not been found previously, such as the moderating format of the chatbot. The results of this study, in particular the good adherence rate, validated the usefulness of replicating this study in the future with a larger sample size and an active control group. This is important, as the search for fully automated, yet highly engaging and effective digital self-help interventions for promoting mental health is crucial for public health.
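
    The between-group effect sizes quoted above (Cohen's d range 0.14-1.06) are standardised mean differences. A minimal sketch of the standard pooled-SD formulation (illustrative; the trial's exact computation may differ, e.g. for the interaction effects):

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: difference of group means divided by the pooled
    sample standard deviation (Bessel-corrected variances)."""
    na, nb = len(group_a), len(group_b)
    ma = sum(group_a) / na
    mb = sum(group_b) / nb
    va = sum((x - ma) ** 2 for x in group_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in group_b) / (nb - 1)
    pooled = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / pooled
```

By the usual rule of thumb, d ≈ 0.2 is a small effect, 0.5 medium, and 0.8 large, which is why the trial describes its range as "small to large".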

  2. Performance of an Artificial Multi-observer Deep Neural Network for Fully Automated Segmentation of Polycystic Kidneys.

    Science.gov (United States)

    Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J

    2017-08-01

    Deep learning techniques are being rapidly applied to medical imaging tasks, from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solving inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation is the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had a mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.
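
The percent volume difference metric reported above can be computed directly from binary segmentation masks. The sketch below is illustrative only: the toy mask, the voxel volume, and the function name are ours, not the paper's.

```python
import numpy as np

def percent_volume_difference(pred: np.ndarray, ref: np.ndarray,
                              voxel_volume_ml: float = 1.0) -> float:
    """Percent difference of predicted total kidney volume (TKV) relative
    to the reference standard; positive values indicate over-segmentation."""
    tkv_pred = pred.sum() * voxel_volume_ml
    tkv_ref = ref.sum() * voxel_volume_ml
    return 100.0 * (tkv_pred - tkv_ref) / tkv_ref

# Toy "kidney" mask: 4 slices of a 32x32 square (4096 voxels in total).
ref = np.zeros((4, 64, 64), dtype=bool)
ref[:, 16:48, 16:48] = True
pred = ref.copy()
pred[0, 16, 16:20] = False      # prediction misses 4 voxels
pvd = percent_volume_difference(pred, ref)
```

With 4 of 4096 voxels missed, `pvd` is a small negative percentage, matching the sign convention above.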

  3. A Novel Approach for Fully Automated, Personalized Health Coaching for Adults with Prediabetes: Pilot Clinical Trial.

    Science.gov (United States)

    Everett, Estelle; Kane, Brian; Yoo, Ashley; Dobs, Adrian; Mathioudakis, Nestoras

    2018-02-27

    Prediabetes is a high-risk state for the future development of type 2 diabetes, which may be prevented through physical activity (PA), adherence to a healthy diet, and weight loss. Mobile health (mHealth) technology is a practical and cost-effective method of delivering diabetes prevention programs in a real-world setting. Sweetch (Sweetch Health, Ltd) is a fully automated, personalized mHealth platform designed to promote adherence to PA and weight reduction in people with prediabetes. The objective of this pilot study was to calibrate the Sweetch app and determine the feasibility, acceptability, safety, and effectiveness of the Sweetch app in combination with a digital body weight scale (DBWS) in adults with prediabetes. This was a 3-month prospective, single-arm, observational study of adults with a diagnosis of prediabetes and body mass index (BMI) between 24 kg/m² and 40 kg/m². Feasibility was assessed by study retention. Acceptability of the mobile platform and DBWS were evaluated using validated questionnaires. Effectiveness measures included change in PA, weight, BMI, glycated hemoglobin (HbA1c), and fasting blood glucose from baseline to 3-month visit. The significance of changes in outcome measures was evaluated using paired t test or Wilcoxon matched pairs test. The study retention rate was 47 out of 55 (86%) participants. There was a high degree of acceptability of the Sweetch app, with a median (interquartile range [IQR]) score of 78% (73%-80%) out of 100% on the validated System Usability Scale. Satisfaction regarding the DBWS was also high, with median (IQR) score of 93% (83%-100%). PA increased by 2.8 metabolic equivalent of task (MET)-hours per week (SD 6.8; P=.02), with mean weight loss of 1.6 kg (SD 2.5; P<.001) from baseline. The median change in HbA1c was -0.1% (IQR -0.2% to 0.1%; P=.04), with no significant change in fasting blood glucose (-1 mg/dL; P=.59). There were no adverse events reported. The Sweetch mobile
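
The paired pre/post comparison described above (a paired t test on baseline versus 3-month values) can be sketched with SciPy. The weights below are invented for illustration; only the statistical procedure mirrors the study.

```python
import numpy as np
from scipy import stats

# Hypothetical baseline and 3-month weights (kg) for 8 participants;
# the study itself reported a mean weight loss of 1.6 kg.
baseline = np.array([92.1, 85.4, 78.9, 101.2, 88.7, 95.0, 83.3, 90.5])
month3 = baseline - np.array([2.0, 1.1, 0.4, 3.2, 1.8, 2.5, 0.9, 1.5])

# Paired t test: each subject serves as their own control.
t, p = stats.ttest_rel(baseline, month3)
```

For non-normally distributed outcomes, `stats.wilcoxon(baseline, month3)` would play the role of the Wilcoxon matched pairs test mentioned in the abstract.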

  4. Field demonstration of automated demand response for both winter and summer events in large buildings in the Pacific Northwest

    Energy Technology Data Exchange (ETDEWEB)

    Piette, M.A.; Kiliccote, S.; Dudley, J.H. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2013-11-15

    There are growing strains on the electric grid as cooling peaks grow and equipment ages. Increased penetration of renewables on the grid is also straining electricity supply systems, and the need for flexible demand is growing. This paper summarizes results of a series of field tests of automated demand response systems in large buildings in the Pacific Northwest. The objective of the research was twofold. One objective was to evaluate the use of demand response automation technologies. A second objective was to evaluate control strategies that could change the electric load shape in both winter and summer conditions. Winter tests focused on cold winter mornings, a time when the electric grid is often stressed. The summer tests evaluated DR strategies in the afternoon. We found that we could automate both winter and summer control strategies with the open automated demand response communication standard. The buildings were able to provide significant demand response in both winter and summer events.
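
The fully automated DR pattern described above (an external signal triggers a pre-programmed shed strategy, with a facility-manager override) can be sketched as follows. The event fields, strategy names, and setpoints are illustrative inventions, not the actual schema of the open automated demand response standard.

```python
from dataclasses import dataclass

@dataclass
class DREvent:
    season: str        # "winter" or "summer"
    start_hour: int    # hour of day the shed period begins
    duration_h: int    # length of the shed period in hours

# Pre-programmed shed strategies keyed by season (illustrative actions only).
STRATEGIES = {
    "winter": ["delay electric preheat", "reset supply-air temperature up 1 C"],
    "summer": ["raise zone cooling setpoint 1-2 C", "dim lighting 20%"],
}

def handle_event(event: DREvent, opt_out: bool = False) -> list:
    """Return the shed actions to execute; an empty list means the
    facility manager has opted out of this particular event."""
    if opt_out:
        return []
    return STRATEGIES[event.season]

winter_actions = handle_event(DREvent("winter", start_hour=6, duration_h=3))
```

The key Auto-DR property is that no human action is needed on the normal path; the override is the exception, not the rule.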

  5. Fully automated laboratory for the assay of plutonium in wastes and recoverable scraps

    International Nuclear Information System (INIS)

    Guiberteau, P.; Michaut, F.; Bergey, C.; Debruyne, T.

    1990-01-01

    To determine the plutonium content of wastes and recoverable scraps in intermediate-size containers (ten liters), an automated laboratory has been developed. Two passive measurement methods are used. Gamma-ray spectrometry allows plutonium isotopic analysis, americium determination, and plutonium assay in wastes and poor scraps. Calorimetry is used for accurate (±3%) plutonium determination in rich scraps. Full automation was achieved with barcode management and a supply robot to feed the eight assay set-ups. The laboratory operates 24 hours per day, 365 days per year, and has a capacity of 8,000 assays per year

  6. A two-dimensionally coincident second difference cosmic ray spike removal method for the fully automated processing of Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Turner, Robin F B

    2014-01-01

    Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
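
A minimal sketch of the two-dimensionally coincident second-difference idea, assuming rows are repeated scans and columns are spectral channels. The threshold rule and the median replacement are simplified illustrations, not the authors' exact algorithm; a spike is flagged only when its second difference is extreme along both axes, which is what makes the targeting selective.

```python
import numpy as np

def remove_spikes(spectra: np.ndarray, z: float = 8.0) -> np.ndarray:
    """Flag points whose second difference is extreme along BOTH the
    spectral axis and the repeat-scan axis, then replace them with the
    median of their spectral neighbours."""
    d2_spec = np.abs(np.diff(spectra, n=2, axis=1))   # along wavenumber
    d2_time = np.abs(np.diff(spectra, n=2, axis=0))   # across scans
    # pad second differences back to the original shape (centered)
    d2_spec = np.pad(d2_spec, ((0, 0), (1, 1)), mode="edge")
    d2_time = np.pad(d2_time, ((1, 1), (0, 0)), mode="edge")
    thr_s = d2_spec.mean() + z * d2_spec.std()
    thr_t = d2_time.mean() + z * d2_time.std()
    spikes = (d2_spec > thr_s) & (d2_time > thr_t)    # 2-D coincidence
    cleaned = spectra.copy()
    for r, c in zip(*np.where(spikes)):
        lo, hi = max(c - 3, 0), min(c + 4, spectra.shape[1])
        window = np.delete(spectra[r, lo:hi], c - lo)  # exclude the spike
        cleaned[r, c] = np.median(window)
    return cleaned

# Demo: 10 repeated scans x 200 channels of noise, plus one injected spike.
rng = np.random.default_rng(0)
spectra = rng.normal(0.0, 1.0, (10, 200))
spectra[4, 100] += 1000.0
cleaned = remove_spikes(spectra)
```

A genuine Raman band that appears in every scan produces a large second difference along the spectral axis but not across scans, so it survives; the cosmic-ray spike, present in only one scan, trips both tests.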

  7. The development of a fully automated radioimmunoassay instrument - micromedic systems concept 4

    International Nuclear Information System (INIS)

    Painter, K.

    1977-01-01

    The fully automatic RIA system Concept 4 by Micromedic is described in detail. The system uses antibody-coated test tubes to take up the samples. It has a maximum capacity of 200 tubes, including standards and control tubes. Its advantages are, in particular, high sample throughput, reproducibility, and fully automatic testing, i.e., low personnel requirements. Its disadvantage is difficulty with protein assays. (ORU) [de

  8. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    Science.gov (United States)

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  9. Fully automated synthesis of 11C-acetate as tumor PET tracer by simple modified solid-phase extraction purification

    International Nuclear Information System (INIS)

    Tang, Xiaolan; Tang, Ganghua; Nie, Dahong

    2013-01-01

    Introduction: Automated synthesis of 11C-acetate (11C-AC), the most commonly used radioactive fatty acid tracer, is performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Methods: Automated synthesis of 11C-AC was implemented by carboxylation reaction of MeMgBr with 11C-CO2 on a polyethylene Teflon loop ring, followed by acidic hydrolysis with an SCX cartridge, and purification on SCX, AG11A8, and C18 SPE cartridges using a commercially available 11C-tracer synthesizer. Quality control testing and animal positron emission tomography (PET) imaging were also carried out. Results: A high and reproducible decay-uncorrected radiochemical yield of (41.0±4.6)% (n=10) was obtained from 11C-CO2 within a total synthesis time of about 8 min. The radiochemical purity of 11C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control testing and PET imaging showed that the 11C-AC injection produced by the simple SPE procedure was safe and efficient, and was in agreement with the current Chinese radiopharmaceutical quality control guidelines. Conclusion: The novel, simple, and rapid method is readily adapted to the fully automated synthesis of 11C-AC on several existing commercial synthesis modules. The method can be used routinely to produce 11C-AC for preclinical and clinical studies with PET imaging. - Highlights: • A fully automated synthesis of 11C-acetate by a simple modified solid-phase extraction purification has been developed. • Typical non-decay-corrected yields were (41.0±4.6)% (n=10). • Radiochemical purity was determined by radio-HPLC analysis on a C18 column using a gradient program, instead of an expensive organic acid column or anion column. • QC testing (RCP>99%)

  10. A new TLD badge with machine readable ID for fully automated readout

    International Nuclear Information System (INIS)

    Kannan, S. Ratna P.; Kulkarni, M.S.

    2003-01-01

    The TLD badge currently being used for personnel monitoring of more than 40,000 radiation workers has a few drawbacks: it lacks an on-badge machine-readable ID code; its dosimeters are held by a delicate two-point clamping on an aluminium card, with the risk of dosimeters falling off during handling or readout; and projections on one side make automation of readout difficult. A new badge has been designed with an 8-digit identification code in the form of an array of holes, and smooth exteriors, to enable full automation of readout. The new badge also permits changing of dosimeters when necessary. The new design does not affect the readout time or the dosimetric characteristics. The salient features and the dosimetric characteristics are discussed. (author)

  11. A fully automated mass spectrometer for the analysis of organic solids

    International Nuclear Information System (INIS)

    Hillig, H.; Kueper, H.; Riepe, W.

    1979-01-01

    Automation of a mass spectrometer-computer system makes it possible to process up to 30 samples without attention after sample loading. An automatic sample changer introduces the samples successively into the ion source by means of a direct inlet probe. A process control unit determines the operation sequence. Computer programs are available for hardware support, system supervision, and evaluation of the spectrometer signals. The most essential precondition for automation, automatic evaporation of the sample material by electronic control of the total ion current, is confirmed to be satisfactory. The system operates routinely overnight in an industrial laboratory, so that day work can be devoted to difficult analytical problems. The cost of routine analyses is halved. (Auth.)

  12. A wearable device for a fully automated in-hospital staff and patient identification.

    Science.gov (United States)

    Cavalleri, M; Morstabilini, R; Reni, G

    2004-01-01

    In the health care context, devices for automated staff/patient identification provide multiple benefits, including error reduction in drug administration, easier and faster use of the Electronic Health Record, and enhanced security and control features when accessing confidential data. Current identification systems (e.g., smartcards, bar codes) are not completely seamless to users and require mechanical operations that are sometimes difficult for impaired subjects to perform. Emerging wireless RFID technologies are encouraging, but cannot yet be introduced in health care environments due to their electromagnetic emissions and the need for a large antenna to operate at reasonable distances. The present work describes a prototype wearable device for automated staff and patient identification which is small in size and complies with in-hospital electromagnetic requirements. The prototype also implements an anti-counterfeit option. Its experimental application allowed the introduction of some security functions for confidential data management.

  13. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  14. Evaluation of a Fully Automated Analyzer for Rapid Measurement of Water Vapor Sorption Isotherms for Applications in Soil Science

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per

    2014-01-01

    The characterization and description of important soil processes such as water vapor transport, volatilization of pesticides, and hysteresis require accurate means for measuring the soil water characteristic (SWC) at low water potentials. Until recently, measurement of the SWC at low water potentials was constrained by hydraulic decoupling and long equilibration times when pressure plates or single-point, chilled-mirror instruments were used. A new, fully automated Vapor Sorption Analyzer (VSA) helps to overcome these challenges and allows faster measurement of highly detailed water vapor...

  15. Comparison of subjective and fully automated methods for measuring mammographic density.

    Science.gov (United States)

    Moshina, Nataliia; Roman, Marta; Sebuødegård, Sofie; Waade, Gunvor G; Ursin, Giske; Hofvind, Solveig

    2018-02-01

    Background Breast radiologists of the Norwegian Breast Cancer Screening Program subjectively classified mammographic density using a three-point scale between 1996 and 2012 and changed to the fourth edition of the BI-RADS classification in 2013. In 2015, an automated volumetric breast density assessment software was installed at two screening units. Purpose To compare volumetric breast density measurements from the automated method with two subjective methods: the three-point scale and the BI-RADS density classification. Material and Methods Information on subjective and automated density assessment was obtained from screening examinations of 3635 women recalled for further assessment due to positive screening mammography between 2007 and 2015. The score on the three-point scale (I = fatty; II = medium dense; III = dense) was available for 2310 women. The BI-RADS density score was provided for 1325 women. Mean volumetric breast density was estimated for each category of the subjective classifications. The automated software assigned volumetric breast density to four categories. The agreement between BI-RADS and volumetric breast density categories was assessed using weighted kappa (k w). Results Mean volumetric breast density was 4.5%, 7.5%, and 13.4% for categories I, II, and III of the three-point scale, respectively, and 4.4%, 7.5%, 9.9%, and 13.9% for the BI-RADS density categories, respectively. The agreement between BI-RADS and volumetric breast density categories was k w = 0.5 (95% CI = 0.47-0.53). Mean volumetric breast density increased with increasing density category of the subjective classifications. The agreement between BI-RADS and volumetric breast density categories was moderate.
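
The weighted kappa (k w) agreement statistic used in this study can be computed with scikit-learn; the paired category ratings below are invented for illustration.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical density categories (1-4) for ten women: one rating from
# the radiologist (BI-RADS) and one from the automated volumetric software.
birads = [1, 2, 2, 3, 4, 3, 2, 1, 4, 3]
automated = [1, 2, 3, 3, 4, 2, 2, 1, 3, 3]

# Linear weights penalize disagreements by their distance in categories,
# the usual choice for ordered scales such as density grades.
kw = cohen_kappa_score(birads, automated, weights="linear")
```

With mostly exact matches and a few off-by-one disagreements, this toy example yields a kappa in the "substantial agreement" range; the study's k w = 0.5 sits at the boundary conventionally labelled moderate.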

  16. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    Science.gov (United States)

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets. © 2014 Society for Laboratory Automation and Screening.

  17. Fully automated synthesis of [(18) F]fluoro-dihydrotestosterone ([(18) F]FDHT) using the FlexLab module.

    Science.gov (United States)

    Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M

    2016-08-01

    Imaging of androgen receptor expression in prostate cancer using F-18 FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [(18) F]FDHT is important. We have fully automated the synthesis of F-18 FDHT on the iPhase FlexLab module using only commercially available components. Total synthesis time was 90 min, and radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99% and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has been validated at Austin Health and is currently used for [(18) F]FDHT studies in patients. We believe that this method can easily be adapted to other modules to further widen the availability of [(18) F]FDHT. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning.

    Science.gov (United States)

    Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang

    2017-11-13

    Prostate cancer (PCa) is a major cause of death, documented as far back as imaging of Egyptian Ptolemaic-era mummies. PCa detection is critical to personalized medicine, and its appearance varies considerably on MRI. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were obtained. A deep learning method using a deep convolutional neural network (DCNN) and a non-deep-learning method using SIFT image features with a bag-of-words (BoW) model, a representative approach for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from patients with benign prostate conditions (BCs) such as prostatitis or benign prostatic hyperplasia (BPH). In fully automated detection of PCa patients, deep learning achieved a statistically higher area under the receiver operating characteristic curve (AUC) than non-deep learning (P = 0.0007); the AUC was 0.70 (95% CI 0.63-0.77) for the non-deep-learning method. Our results suggest that deep learning with a DCNN is superior to non-deep learning with SIFT image features and a BoW model for fully automated differentiation of PCa patients from prostate BCs patients. Our deep learning method is extensible to imaging modalities such as MR imaging, CT, and PET of other organs.

  19. Fully automated laboratory and field-portable goniometer used for performing accurate and precise multiangular reflectance measurements

    Science.gov (United States)

    Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily

    2017-10-01

    Field-portable goniometers are created for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on the requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments and biconical reflectance factors in laboratory settings. Unlike some goniometers, this system features a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces, without compromising angular accuracy. The system also features a second upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading to measure the opposition effect, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems while not sacrificing consistency or repeatability in laboratory environments.

  20. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    Science.gov (United States)

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (median age 38.2 years, range 20-49; 82% female) had three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis by AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
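
The AA-versus-VA agreement check via Pearson correlation can be sketched as follows, with invented band-power values standing in for the study's spectral results (the real analysis correlates frequency-analysis outputs across subjects).

```python
import numpy as np

# Hypothetical relative alpha-band power for 10 subjects, once from the
# automated pipeline (AA) and once from the visually controlled one (VA);
# VA is modelled as AA plus small rater-dependent noise.
aa = np.array([0.31, 0.42, 0.28, 0.55, 0.47, 0.36, 0.50, 0.44, 0.39, 0.33])
va = aa + np.random.default_rng(1).normal(0.0, 0.02, size=aa.size)

# Pearson correlation between the two pipelines' outputs.
r = np.corrcoef(aa, va)[0, 1]
```

A correlation near 1 indicates the two pipelines rank and scale subjects almost identically, which is the sense in which the study calls them interchangeable.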

  1. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    International Nuclear Information System (INIS)

    El-Alaily, T.M.; El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M.; Assar, S.T.

    2015-01-01

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured by two scientifically calibrated magnetometers: model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our new lab-built VSM design proved successful and reliable

  2. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of ferrite samples measured by two scientifically calibrated magnetometers: model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using measured ferrite samples. • Our new lab-built VSM design proved successful and reliable.

  3. A fully automated contour detection algorithm the preliminary step for scatter and attenuation compensation in SPECT

    International Nuclear Information System (INIS)

    Younes, R.B.; Mas, J.; Bidet, R.

    1988-01-01

    Contour detection is an important step in information extraction from nuclear medicine images. In order to perform accurate quantitative studies in single photon emission computed tomography (SPECT), a new procedure is described which can rapidly derive the best-fit contour of an attenuated medium. Several authors have evaluated the influence of the detected contour on images reconstructed with various attenuation correction techniques; most methods are strongly affected by inaccurately detected contours. This approach uses the Compton window to redetermine the convex contour, and it appears simpler and more practical in clinical SPECT studies. The main advantages of this procedure are its high computation speed, the accuracy of the contour found, and its full automation. Results obtained using computer-simulated and real phantoms, as well as clinical studies, demonstrate the reliability of the present algorithm. (orig.)

  4. A FULLY AUTOMATED PIPELINE FOR CLASSIFICATION TASKS WITH AN APPLICATION TO REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    K. Suzuki

    2016-06-01

    Full Text Available Deep learning has recently been in the spotlight owing to its victories at major competitions, which has pushed 'shallow' machine learning methods, the relatively simple and handy algorithms commonly used by industrial engineers, into the background despite their advantages, such as the small amounts of training time and data they require. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that lets operators apply machine learning without specialized knowledge, an expensive computing environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process: feature selection, feature weighting, and selection of the most suitable classifier with optimized hyperparameters. The configuration uses particle swarm optimization, a well-known metaheuristic algorithm that is generally fast and precise, which enables us not only to optimize hyperparameters but also to determine appropriate features and classifiers for the problem, choices that have conventionally been made a priori from domain knowledge or handled with naive methods such as grid search. In experiments with the MNIST and CIFAR-10 datasets, common computer vision benchmarks for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple, non-specialized setting, small amount of training data, and practical learning time. Moreover, compared to deep learning, the performance remains robust almost without modification even on a remote sensing object recognition problem, which in turn indicates that our approach is likely to contribute to general classification problems.
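
A minimal particle swarm optimizer of the kind such a pipeline can use for hyperparameter search, shown here minimizing a toy objective. The inertia/cognitive/social parameters and the global-best topology are a generic textbook choice, not the authors' exact configuration.

```python
import numpy as np

def pso_minimize(f, dim, n_particles=30, iters=200, seed=0):
    """Minimal particle swarm optimizer (global-best topology)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))   # positions
    v = np.zeros_like(x)                             # velocities
    pbest = x.copy()                                 # personal bests
    pbest_val = np.apply_along_axis(f, 1, x)
    g = pbest[np.argmin(pbest_val)].copy()           # global best
    w, c1, c2 = 0.7, 1.5, 1.5                        # inertia / cognitive / social
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        vals = np.apply_along_axis(f, 1, x)
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        g = pbest[np.argmin(pbest_val)].copy()
    return g, pbest_val.min()

# Toy objective with known minimum at (1, 1, 1).
best_x, best_val = pso_minimize(lambda p: np.sum((p - 1.0) ** 2), dim=3)
```

In a pipeline like the one described, `f` would instead evaluate cross-validated classifier error as a function of encoded hyperparameters and feature weights.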

  5. Comparisons of fully automated syphilis tests with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Choi, Seung Jun; Park, Yongjung; Lee, Eun Young; Kim, Sinyoung; Kim, Hyon-Suk

    2013-06-01

    Serologic tests are widely used for the diagnosis of syphilis. However, conventional methods require well-trained technicians to produce reliable results. We compared automated nontreponemal and treponemal tests with conventional methods. The HiSens Auto Rapid Plasma Reagin (AutoRPR) and Treponema Pallidum particle agglutination (AutoTPPA) tests, which utilize latex turbidimetric immunoassay, were assessed. A total of 504 sera were assayed by AutoRPR, AutoTPPA, conventional VDRL and FTA-ABS. Among them, 250 samples were also tested by conventional TPPA. The concordance rate between the results of VDRL and AutoRPR was 67.5%, and 164 discrepant cases were all VDRL reactive but AutoRPR negative. In the 164 cases, 133 showed FTA-ABS reactivity. Medical records of 106 among the 133 cases were reviewed, and 82 among 106 specimens were found to be collected from patients already treated for syphilis. The concordance rate between the results of AutoTPPA and FTA-ABS was 97.8%. The results of conventional TPPA and AutoTPPA for 250 samples were concordant in 241 cases (96.4%). AutoRPR showed higher specificity than that of VDRL, while VDRL demonstrated higher sensitivity than that of AutoRPR regardless of whether the patients had been already treated for syphilis or not. Both FTA-ABS and AutoTPPA showed high sensitivities and specificities greater than 98.0%. Automated RPR and TPPA tests could be alternatives to conventional syphilis tests, and AutoRPR would be particularly suitable in treatment monitoring, since results by AutoRPR in cases after treatment became negative more rapidly than by VDRL. Copyright © 2013. Published by Elsevier Inc.

  6. Validation of a Fully Automated HER2 Staining Kit in Breast Cancer

    Directory of Open Access Journals (Sweden)

    Cathy B. Moelans

    2010-01-01

Full Text Available Background: Testing for HER2 amplification and/or overexpression is currently routine practice to guide Herceptin therapy in invasive breast cancer. At present, HER2 status is most commonly assessed by immunohistochemistry (IHC). Standardization of HER2 IHC assays is of utmost clinical and economic importance. HER2 IHC is most commonly performed with the HercepTest, which contains a polyclonal antibody and applies a manual staining procedure. Analytical variability in HER2 IHC testing could be diminished by a fully automated staining system with a monoclonal antibody.

  7. Opportunities for Energy Efficiency and Open Automated Demand Response in Wastewater Treatment Facilities in California -- Phase I Report

    Energy Technology Data Exchange (ETDEWEB)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Song, Katherine; Piette, Mary Ann

    2009-04-01

This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and automated demand response opportunities for wastewater treatment facilities in California. The report describes the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and energy efficiency and automated demand response opportunities. In addition, several energy efficiency and load management case studies are provided for wastewater treatment facilities. This study shows that wastewater treatment facilities can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for automated demand response at little additional cost. These improved controls may prepare facilities to be more receptive to open automated demand response due to both increased confidence in the opportunities for controlling energy cost/use and access to real-time data.

  8. Automated Demand Response Technology Demonstration Project for Small and Medium Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Page, Janie; Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann; Chiu, Albert K.; Kellow, Bashar; Koch, Ed; Lipkin, Paul

    2011-07-01

Small and medium commercial customers make up about 20-25% of electric peak load in California. With the rollout of smart meters to this customer group, which enable granular measurement of electricity consumption, the investor-owned utilities will offer dynamic prices as default tariffs by the end of 2011. Pacific Gas and Electric Company, which successfully deployed Automated Demand Response (AutoDR) programs to its large commercial and industrial customers, started investigating the same infrastructure's application to small and medium commercial customers. This project aims to identify available technologies suitable for automating demand response in small and medium commercial buildings, to validate the extent to which those technologies do what they claim, and to determine the extent to which customers find the technology useful for DR purposes. Ten sites, enabled by eight vendors, participated in at least four test AutoDR events per site in the summer of 2010. The results showed that while existing technology can reliably receive OpenADR signals and translate them into pre-programmed response strategies, better load sheds than those reported here could likely be obtained if the building systems were better understood and the DR response strategies had been carefully designed and optimized for each site.

  9. Fully automated processing of buffy-coat-derived pooled platelet concentrates.

    Science.gov (United States)

    Janetzko, Karin; Klüter, Harald; van Waeg, Geert; Eichler, Hermann

    2004-07-01

The OrbiSac device, which was developed to automate the manufacture of buffy-coat PLT concentrates (BC-PCs), was evaluated. In-vitro characteristics of BC-PC preparations using the OrbiSac device were compared with manually prepared BC-PCs. For standard processing (Std-PC, n = 20), four BC-PCs were pooled using 300 mL of PLT additive solution (PAS) followed by soft-spin centrifugation and WBC filtration. The OrbiSac preparation (OS-PC, n = 20) was performed by automated pooling of four BC-PCs with 300 mL PAS followed by centrifugation and inline WBC filtration. All PCs were stored at 22°C. Samples were withdrawn on Days 1, 5, and 7 to evaluate PLT count, blood gas analysis, glucose, lactate, LDH, beta-thromboglobulin, hypotonic shock response, and CD62p expression. A PLT content of 3.1 ± 0.4 × 10^11 (OS-PCs) versus 2.7 ± 0.5 × 10^11 (Std-PCs, p < 0.05) was found. A CV of 19 percent (Std-PC) versus 14 percent (OS-PC) suggests more standardization in the OS group. At Day 7, the Std-PCs versus OS-PCs showed a glucose consumption of 1.03 ± 0.32 µmol per 10^9 PLTs versus 0.75 ± 0.25 µmol per 10^9 PLTs (p < 0.001), and a lactate production of 1.50 ± 0.86 µmol per 10^9 versus 1.11 ± 0.61 µmol per 10^9 (p < 0.001). The pH (7.00 ± 0.19 vs. 7.23 ± 0.06; p < 0.001), pO2 (45.3 ± 18 vs. 31.3 ± 10.4 mmHg; p < 0.01), and HCO3 levels (4.91 ± 1.49 vs. 7.14 ± 0.95 mmol/L; p < 0.001) suggest a slightly better aerobic metabolism in the OS group. Only small differences in CD62p expression were observed (37.3 ± 12.9% Std-PC vs. 44.8 ± 6.6% OS-PC; p < 0.05). The OrbiSac device allows an improved PLT yield without affecting PLT in-vitro characteristics and may enable improved consistency in product volume and yield.

  10. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    International Nuclear Information System (INIS)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-01-01

Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified.
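The MTF-from-edge-profile computation mentioned in this record is a standard chain: differentiate the edge-spread function (ESF) to obtain the line-spread function (LSF), then take the normalized Fourier magnitude. A minimal sketch on a synthetic Gaussian-blurred edge (not the authors' phantom data):

```python
import numpy as np

def mtf_from_edge(esf, dx=1.0):
    """Modulation transfer function from an oversampled edge-spread function."""
    lsf = np.gradient(esf, dx)          # line-spread function = d(ESF)/dx
    lsf = lsf / lsf.sum()               # normalize the area so that MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=dx)
    return freqs, mtf

# synthetic edge blurred by a Gaussian PSF (sigma = 2 samples):
# the ESF is the cumulative integral of the Gaussian LSF
x = np.arange(-64, 64)
lsf_true = np.exp(-x**2 / (2 * 2.0**2))
esf = np.cumsum(lsf_true)
freqs, mtf = mtf_from_edge(esf)
# for a Gaussian blur the MTF should fall off as exp(-2*(pi*sigma*f)**2)
```

For real sphere or edge data one would first bin the oversampled profile and window the LSF to suppress noise before the FFT.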

  11. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography.

    Science.gov (United States)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A

    2014-03-01

    Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. The maximum

  12. Fully Automated Detection of Corticospinal Tract Damage in Chronic Stroke Patients

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2014-01-01

Full Text Available Structural integrity of the corticospinal tract (CST) after stroke is closely linked to the degree of motor impairment. However, current methods for measuring fractional anisotropy (FA) of the CST based on regions of interest (ROIs) are time-consuming and open to bias. Here, we used tract-based spatial statistics (TBSS) together with a CST template from healthy volunteers to quantify the structural integrity of the CST automatically. Two groups of patients after ischemic stroke were enrolled: group 1 (10 patients, 7 men, Fugl-Meyer assessment (FMA) scores ⩽ 50) and group 2 (12 patients, 12 men, FMA scores = 100). FAipsi, FAcontra, and FAratio of the CST were compared between the two groups. Relative to group 2, FA was decreased in group 1 in the ipsilesional CST (P<0.01), as was the FAratio (P<0.01). There was no significant difference between the two groups in the contralesional CST (P=0.23). Compared with the contralesional CST, FA of the ipsilesional CST was decreased in group 1 (P<0.01). These results suggest that the automated method used in our study can provide a surrogate biomarker for quantifying the CST after stroke, which would facilitate implementation in clinical practice.

  13. Atmospheric ozone measurement with an inexpensive and fully automated porous tube collector-colorimeter.

    Science.gov (United States)

    Li, Jianzhong; Li, Qingyang; Dyke, Jason V; Dasgupta, Purnendu K

    2008-01-15

The bleaching action of ozone on indigo and related compounds is well known. We describe sensitive automated instrumentation for measuring ambient ozone. Air is sampled around a porous polypropylene tube filled with a solution of indigotrisulfonate, and light transmission through the tube is measured. Transmission increases as O3 diffuses through the membrane and bleaches the indigo. Evaporation of the solution, a function of the relative humidity and the air temperature, can, however, cause major errors. We solve this problem by adding an O3-inert dye that absorbs at a different wavelength. Here we provide a new algorithm for this correction and show that this very inexpensive instrument package (controlled by a BASIC Stamp microcontroller with an on-board data logger; total parts cost US$300) provides data highly comparable to commercial ozone monitors over an extended period. The instrument displays an LOD of 1.2 ppbv and a linear span up to 300 ppbv for a sampling time of 1 min. For a sampling time of 5 min, the respective values are 0.24 ppbv and 100 ppbv O3.
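One plausible form of the evaporation correction with an ozone-inert reference dye, sketched here under the assumption that evaporation concentrates both dyes by the same factor (this is an illustrative reconstruction, not necessarily the authors' exact algorithm):

```python
def evaporation_corrected(a_indigo, a_inert, a_inert_initial):
    """Remove the evaporation artifact from the measured indigo absorbance.

    Evaporation concentrates both dyes equally, so the inert dye's
    absorbance ratio gives the concentration factor to divide out;
    what remains of the indigo change is attributable to O3 bleaching.
    """
    factor = a_inert / a_inert_initial   # > 1 once evaporation has concentrated the solution
    return a_indigo / factor

# example: evaporation has concentrated the solution by 10% while ozone
# bleached the indigo; dividing out the factor recovers the true absorbance
a_corr = evaporation_corrected(a_indigo=0.440, a_inert=0.275, a_inert_initial=0.250)
```

All absorbance values above are made up for illustration; in the instrument they would come from the two measurement wavelengths.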

  14. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    Science.gov (United States)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analysis are commonly deployed in structural CAE analysis for mechanical design of components especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled at a nominal state with uniform thickness and no residual stresses and strains. However, in reality the stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider the stamping information in CAE analysis to accurately model the behavior of the sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high strength steels it is imperative to avoid over design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full car frontal crash analysis.

  15. Grid-Competitive Residential and Commercial Fully Automated PV Systems Technology: Final technical Report, August 2011

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Katie E.; Cousins, Peter; Culligan, Matt; Jonathan Botkin; DeGraaff, David; Bunea, Gabriella; Rose, Douglas; Bourne, Ben; Koehler, Oliver

    2011-08-26

    Under DOE's Technology Pathway Partnership program, SunPower Corporation developed turn-key, high-efficiency residential and commercial systems that are cost effective. Key program objectives include a reduction in LCOE values to 9-12 cents/kWh and 13-18 cents/kWh respectively for the commercial and residential markets. Target LCOE values for the commercial ground, commercial roof, and residential markets are 10, 11, and 13 cents/kWh. For this effort, SunPower collaborated with a variety of suppliers and partners to complete the tasks below. Subcontractors included: Solaicx, SiGen, Ribbon Technology, Dow Corning, Xantrex, Tigo Energy, and Solar Bridge. SunPower's TPP addressed nearly the complete PV value chain: from ingot growth through system deployment. Throughout the award period of performance, SunPower has made progress toward achieving these reduced costs through the development of 20%+ efficient modules, increased cell efficiency through the understanding of loss mechanisms and improved manufacturing technologies, novel module development, automated design tools and techniques, and reduced system development and installation time. Based on an LCOE assessment using NREL's Solar Advisor Model, SunPower achieved the 2010 target range, as well as progress toward 2015 targets.

  16. Effects of Granular Control on Customers’ Perspective and Behavior with Automated Demand Response Systems

    Energy Technology Data Exchange (ETDEWEB)

    Schetrit, Oren; Kim, Joyce; Yin, Rongxin; Kiliccote, Sila

    2014-08-01

    Automated demand response (Auto-DR) is expected to close the loop between buildings and the grid by providing machine-to-machine communications to curtail loads without the need for human intervention. Hence, it can offer more reliable and repeatable demand response results to the grid than the manual approach and make demand response participation a hassle-free experience for customers. However, many building operators misunderstand Auto-DR and are afraid of losing control over their building operation. To ease the transition from manual to Auto-DR, we designed and implemented granular control of Auto-DR systems so that building operators could modify or opt out of individual load-shed strategies whenever they wanted. This paper reports the research findings from this effort demonstrated through a field study in large commercial buildings located in New York City. We focused on (1) understanding how providing granular control affects building operators’ perspective on Auto-DR, and (2) evaluating the usefulness of granular control by examining their interaction with the Auto-DR user interface during test events. Through trend log analysis, interviews, and surveys, we found that: (1) the opt-out capability during Auto-DR events can remove the feeling of being forced into load curtailments and increase their willingness to adopt Auto-DR; (2) being able to modify individual load-shed strategies allows flexible Auto-DR participation that meets the building’s changing operational requirements; (3) a clear display of automation strategies helps building operators easily identify how Auto-DR is functioning and can build trust in Auto-DR systems.

  17. Fully automated segmentation of oncological PET volumes using a combined multiscale and statistical model

    International Nuclear Information System (INIS)

    Montgomery, David W. G.; Amira, Abbes; Zaidi, Habib

    2007-01-01

    The widespread application of positron emission tomography (PET) in clinical oncology has driven this imaging technology into a number of new research and clinical arenas. Increasing numbers of patient scans have led to an urgent need for efficient data handling and the development of new image analysis techniques to aid clinicians in the diagnosis of disease and planning of treatment. Automatic quantitative assessment of metabolic PET data is attractive and will certainly revolutionize the practice of functional imaging since it can lower variability across institutions and may enhance the consistency of image interpretation independent of reader experience. In this paper, a novel automated system for the segmentation of oncological PET data aiming at providing an accurate quantitative analysis tool is proposed. The initial step involves expectation maximization (EM)-based mixture modeling using a k-means clustering procedure, which varies voxel order for initialization. A multiscale Markov model is then used to refine this segmentation by modeling spatial correlations between neighboring image voxels. An experimental study using an anthropomorphic thorax phantom was conducted for quantitative evaluation of the performance of the proposed segmentation algorithm. The comparison of actual tumor volumes to the volumes calculated using different segmentation methodologies including standard k-means, spatial domain Markov Random Field Model (MRFM), and the new multiscale MRFM proposed in this paper showed that the latter dramatically reduces the relative error to less than 8% for small lesions (7 mm radii) and less than 3.5% for larger lesions (9 mm radii). The analysis of the resulting segmentations of clinical oncologic PET data seems to confirm that this methodology shows promise and can successfully segment patient lesions. For problematic images, this technique enables the identification of tumors situated very close to nearby high normal physiologic uptake. 
The
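The first stage described above, EM-based mixture modeling initialized by clustering, can be sketched in one dimension with pure NumPy. This is a toy illustration of the technique, not the authors' code: the data are synthetic "background" and "lesion" intensity populations, and a deterministic percentile initialization stands in for the k-means step.

```python
import numpy as np

def em_gmm_1d(x, k=2, iters=50):
    """EM for a 1-D Gaussian mixture over voxel intensities."""
    mu = np.percentile(x, np.linspace(10, 90, k))   # simple deterministic init
    var = np.full(k, x.var())
    pi = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each intensity
        dens = pi * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and variances
        nk = r.sum(axis=0)
        pi = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk + 1e-9
    return mu, var, pi

# synthetic intensities: a broad low-uptake population and a high-uptake one
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(1.0, 0.3, 500), rng.normal(5.0, 0.5, 200)])
mu, var, pi = em_gmm_1d(x)
```

In the paper's pipeline this segmentation is then refined by a multiscale Markov model that adds spatial correlations between neighboring voxels, which the 1-D sketch above omits.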

  18. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells, ... system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house made bioreactor. This is the first demonstration of using the Dean drag force, generated due to the implementation of a curved microchannel geometry in conjunction with high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system, so ... and thereby ensure optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output.

  19. A fully-automated multiscale kernel graph cuts based particle localization scheme for temporal focusing two-photon microscopy

    Science.gov (United States)

    Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei

    2017-03-01

The temporal focusing two-photon microscope (TFM) was developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, strong, non-negligible noise and diffraction rings surrounding particles make further analysis extremely difficult without a precise particle-localization technique. In this paper, we developed a fully automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph cuts global and local segmentation algorithm, and a kinematic-estimation-based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.

  20. A fully-automated neural network analysis of AFM force-distance curves for cancer tissue diagnosis

    Science.gov (United States)

    Minelli, Eleonora; Ciasca, Gabriele; Sassun, Tanya Enny; Antonelli, Manila; Palmieri, Valentina; Papi, Massimiliano; Maulucci, Giuseppe; Santoro, Antonio; Giangaspero, Felice; Delfini, Roberto; Campi, Gaetano; De Spirito, Marco

    2017-10-01

    Atomic Force Microscopy (AFM) has the unique capability of probing the nanoscale mechanical properties of biological systems that affect and are affected by the occurrence of many pathologies, including cancer. This capability has triggered growing interest in the translational process of AFM from physics laboratories to clinical practice. A factor still hindering the current use of AFM in diagnostics is related to the complexity of AFM data analysis, which is time-consuming and needs highly specialized personnel with a strong physical and mathematical background. In this work, we demonstrate an operator-independent neural-network approach for the analysis of surgically removed brain cancer tissues. This approach allowed us to distinguish—in a fully automated fashion—cancer from healthy tissues with high accuracy, also highlighting the presence and the location of infiltrating tumor cells.

  1. Automated red blood cells extraction from holographic images using fully convolutional neural networks

    Science.gov (United States)

    Yi, Faliu; Moon, Inkyu; Javidi, Bahram

    2017-01-01

    In this paper, we present two models for automatically extracting red blood cells (RBCs) from RBCs holographic images based on a deep learning fully convolutional neural network (FCN) algorithm. The first model, called FCN-1, only uses the FCN algorithm to carry out RBCs prediction, whereas the second model, called FCN-2, combines the FCN approach with the marker-controlled watershed transform segmentation scheme to achieve RBCs extraction. Both models achieve good segmentation accuracy. In addition, the second model has much better performance in terms of cell separation than traditional segmentation methods. In the proposed methods, the RBCs phase images are first numerically reconstructed from RBCs holograms recorded with off-axis digital holographic microscopy. Then, some RBCs phase images are manually segmented and used as training data to fine-tune the FCN. Finally, each pixel in new input RBCs phase images is predicted into either foreground or background using the trained FCN models. The RBCs prediction result from the first model is the final segmentation result, whereas the result from the second model is used as the internal markers of the marker-controlled transform algorithm for further segmentation. Experimental results show that the given schemes can automatically extract RBCs from RBCs phase images and much better RBCs separation results are obtained when the FCN technique is combined with the marker-controlled watershed segmentation algorithm. PMID:29082078
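The marker-controlled watershed stage of the second model can be sketched with SciPy. Here a synthetic binary mask of two touching "cells" stands in for the FCN foreground prediction, and strong erosions of the distance map stand in for the FCN-derived internal markers (illustrative only, not the authors' pipeline):

```python
import numpy as np
from scipy import ndimage as ndi

# synthetic stand-in for an FCN foreground prediction: two touching cells
yy, xx = np.mgrid[0:100, 0:100]
mask = ((yy - 50) ** 2 + (xx - 35) ** 2 < 18 ** 2) | \
       ((yy - 50) ** 2 + (xx - 65) ** 2 < 18 ** 2)

# internal markers: the deep interior of the distance map splits the touching pair
dist = ndi.distance_transform_edt(mask)
markers, n_cells = ndi.label(dist > 0.7 * dist.max())
markers = markers.astype(np.int16)
markers[~mask] = n_cells + 1                     # explicit background marker

# flood the inverted distance map from the markers; the watershed line
# between the two internal markers separates the touching cells
elevation = (255 * (1.0 - dist / dist.max())).astype(np.uint8)
labels = ndi.watershed_ift(elevation, markers)
```

With real data, `mask` would be the thresholded FCN prediction and the internal markers would come from the first model's output, as described in the abstract.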

  2. Fully automated motion correction in first-pass myocardial perfusion MR image sequences.

    Science.gov (United States)

    Milles, Julien; van der Geest, Rob J; Jerosch-Herold, Michael; Reiber, Johan H C; Lelieveldt, Boudewijn P F

    2008-11-01

This paper presents a novel method for registration of cardiac perfusion magnetic resonance imaging (MRI). The presented method is capable of automatically registering perfusion data, using independent component analysis (ICA) to extract physiologically relevant features together with their time-intensity behavior. A time-varying reference image mimicking intensity changes in the data of interest is computed based on the results of that ICA. This reference image is used in a two-pass registration framework. Qualitative and quantitative validation of the method was carried out using 46 clinical-quality, short-axis perfusion MR datasets comprising 100 images each. Despite varying image quality and motion patterns in the evaluation set, validation showed a reduction of the average left ventricle (LV) motion from 1.26+/-0.87 to 0.64+/-0.46 pixels. Time-intensity curves are also improved after registration, with the average error between registered data and the manual gold standard reduced from 2.65+/-7.89% to 0.87+/-3.88%. Clinically relevant parameters computed using the registered data show good agreement with the manual gold standard. Additional tests with a simulated free-breathing protocol showed robustness against considerable deviations from a standard breathing protocol. We conclude that this fully automatic ICA-based method shows accuracy, robustness, and computation speed adequate for use in a clinical environment.
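The core idea of building a motion-free, time-varying reference from a temporal decomposition of the sequence can be illustrated with a low-rank SVD on a toy perfusion-like dataset. SVD stands in here for the ICA the authors use, and all data are synthetic; only the principle (a few temporal components explain the intensity changes) carries over.

```python
import numpy as np

# toy perfusion-like sequence: 30 frames of 16x16 pixels driven by two
# "physiological" enhancement curves plus a little noise
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 30)
comp_early = np.exp(-((t - 0.3) / 0.08) ** 2)    # early enhancement curve
comp_late = np.exp(-((t - 0.6) / 0.12) ** 2)     # later enhancement curve
map_early = rng.random((16, 16)) < 0.2           # pixels following each curve
map_late = rng.random((16, 16)) < 0.2
frames = (comp_early[:, None, None] * map_early
          + comp_late[:, None, None] * map_late
          + 0.01 * rng.standard_normal((30, 16, 16)))

# decompose the (time x pixel) matrix; the two leading components capture
# the time-intensity behavior used to synthesize a reference sequence
X = frames.reshape(30, -1)
U, S, Vt = np.linalg.svd(X - X.mean(0), full_matrices=False)
reference = (U[:, :2] * S[:2]) @ Vt[:2] + X.mean(0)
rel_err = np.linalg.norm(reference - X) / np.linalg.norm(X)
```

In the paper, each frame of the real data is then registered to the corresponding frame of such a reference, so the registration target always matches the current contrast-enhancement state.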

  3. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    Science.gov (United States)

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse were seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  4. Comparison and clinical utility evaluation of four multiple allergen simultaneous tests including two newly introduced fully automated analyzers

    Directory of Open Access Journals (Sweden)

    John Hoon Rim

    2016-04-01

Full Text Available Background: We compared the diagnostic performance of two newly introduced fully automated multiple allergen simultaneous test (MAST) analyzers with two conventional MAST assays. Methods: Serum samples from a total of 53 and 104 patients were tested with food panels and inhalant panels, respectively, on four analyzers: AdvanSure AlloScreen (LG Life Science, Korea), AdvanSure Allostation Smart II (LG Life Science), PROTIA Allergy-Q (ProteomeTech, Korea), and RIDA Allergy Screen (R-Biopharm, Germany). We compared not only total agreement percentages but also positive propensities among the four analyzers. Results: Evaluation of AdvanSure Allostation Smart II as an upgraded version of AdvanSure AlloScreen revealed good concordance, with total agreement percentages of 93.0% and 92.2% in the food and inhalant panels, respectively. Comparisons of AdvanSure Allostation Smart II or PROTIA Allergy-Q with RIDA Allergy Screen also showed good concordance, with positive propensities of the two new analyzers for common allergens (Dermatophagoides farinae and Dermatophagoides pteronyssinus). Changes of the cut-off level resulted in various total agreement percentage fluctuations among allergens by different analyzers, although the current cut-off level of class 2 appeared to be generally suitable. Conclusions: AdvanSure Allostation Smart II and PROTIA Allergy-Q presented favorable agreement with RIDA Allergy Screen, although positive propensities were noticed for common allergens. Keywords: Multiple allergen simultaneous test, Automated analyzer

  5. Semantic focusing allows fully automated single-layer slide scanning of cervical cytology slides.

    Directory of Open Access Journals (Sweden)

    Bernd Lahrmann

    Liquid-based cytology (LBC) in conjunction with Whole-Slide Imaging (WSI) enables the objective, sensitive, and quantitative evaluation of biomarkers in cytology. However, the complex three-dimensional distribution of cells on LBC slides requires manual focusing, long scanning times, and multi-layer scanning. Here, we present a solution that overcomes these limitations in two steps: first, we ensure that focus points are set only on cells; second, we check the total slide focus quality. From a first analysis we found that superficial dust can be separated from the cell layer (the thin layer of cells on the glass slide) itself. We then analyzed 2,295 individual focus points from 51 LBC slides stained for p16 and Ki67. Using the number of edges in a focus point image, specific color values, and size-inclusion filters, focus points detecting cells could be distinguished from focus points on artifacts (accuracy 98.6%). Sharpness, as the total focus quality of a virtual LBC slide, is computed from 5 sharpness features. We trained a multi-parameter SVM classifier on 1,600 images. On an independent validation set of 3,232 cell images we achieved an accuracy of 94.8% for classifying images as focused. Our results show that single-layer scanning of LBC slides is possible and how it can be achieved. We assembled focus point analysis and sharpness classification into a fully automatic, iterative workflow, free of user intervention, which performs repetitive slide scanning as necessary. On 400 LBC slides we achieved a scanning time of 13.9±10.1 min with 29.1±15.5 focus points. In summary, the integration of semantic focus information into whole-slide imaging allows automatic high-quality imaging of LBC slides and subsequent biomarker analysis.

  6. Semantic focusing allows fully automated single-layer slide scanning of cervical cytology slides.

    Science.gov (United States)

    Lahrmann, Bernd; Valous, Nektarios A; Eisenmann, Urs; Wentzensen, Nicolas; Grabe, Niels

    2013-01-01

    Liquid-based cytology (LBC) in conjunction with Whole-Slide Imaging (WSI) enables the objective, sensitive, and quantitative evaluation of biomarkers in cytology. However, the complex three-dimensional distribution of cells on LBC slides requires manual focusing, long scanning times, and multi-layer scanning. Here, we present a solution that overcomes these limitations in two steps: first, we ensure that focus points are set only on cells; second, we check the total slide focus quality. From a first analysis we found that superficial dust can be separated from the cell layer (the thin layer of cells on the glass slide) itself. We then analyzed 2,295 individual focus points from 51 LBC slides stained for p16 and Ki67. Using the number of edges in a focus point image, specific color values, and size-inclusion filters, focus points detecting cells could be distinguished from focus points on artifacts (accuracy 98.6%). Sharpness, as the total focus quality of a virtual LBC slide, is computed from 5 sharpness features. We trained a multi-parameter SVM classifier on 1,600 images. On an independent validation set of 3,232 cell images we achieved an accuracy of 94.8% for classifying images as focused. Our results show that single-layer scanning of LBC slides is possible and how it can be achieved. We assembled focus point analysis and sharpness classification into a fully automatic, iterative workflow, free of user intervention, which performs repetitive slide scanning as necessary. On 400 LBC slides we achieved a scanning time of 13.9±10.1 min with 29.1±15.5 focus points. In summary, the integration of semantic focus information into whole-slide imaging allows automatic high-quality imaging of LBC slides and subsequent biomarker analysis.
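The edge-count feature described above, used to separate focus points on cells from focus points on artifacts, can be sketched as a simple gradient-magnitude count. The thresholds here are illustrative placeholders, not the published values:

```python
import numpy as np

def count_edges(img: np.ndarray, grad_thresh: float = 0.2) -> int:
    """Count pixels whose gradient magnitude exceeds a threshold -- a
    simple stand-in for the edge feature used to tell focus points on
    cells apart from focus points on artifacts."""
    gy, gx = np.gradient(img.astype(float))
    return int((np.hypot(gx, gy) > grad_thresh).sum())

def is_cell_focus_point(img: np.ndarray, min_edges: int = 50) -> bool:
    """Classify a focus-point patch as 'cell' if it is edge-rich
    (threshold illustrative only)."""
    return count_edges(img) >= min_edges

rng = np.random.default_rng(0)
textured = rng.random((32, 32))  # edge-rich patch, stands in for cells
flat = np.zeros((32, 32))        # featureless patch, e.g. empty glass
```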

  7. Development and Demonstration of the Open Automated Demand Response Standard for the Residential Sector

    Energy Technology Data Exchange (ETDEWEB)

    Herter, Karen; Rasin, Josh; Perry, Tim

    2009-11-30

    The goal of this study was to demonstrate a demand response system that can signal nearly every customer in all sectors through the integration of two widely available and non-proprietary communications technologies--Open Automated Demand Response (OpenADR) over Internet protocol and Utility Messaging Channel (UMC) over FM radio. The outcomes of this project were as follows: (1) a software bridge to allow translation of pricing signals from OpenADR to UMC; and (2) a portable demonstration unit with an Internet-connected notebook computer, a portfolio of DR-enabling technologies, and a model home. The demonstration unit provides visitors the opportunity to send electricity-pricing information over the Internet (through OpenADR and UMC) and then watch as the model appliances and lighting respond to the signals. The integration of OpenADR and UMC completed and demonstrated in this study enables utilities to send hourly or sub-hourly electricity pricing information simultaneously to the residential, commercial and industrial sectors.
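The OpenADR-to-UMC software bridge is essentially a translation of a structured pricing event into a compact broadcast message. The sketch below is purely hypothetical: the field names and message format are invented for illustration and are not the actual OpenADR or UMC schemas:

```python
# Hypothetical bridge: flatten an OpenADR-style hourly pricing event into
# a compact string suitable for one-way FM broadcast. All field names
# ("start_hour", "hourly_prices_usd_kwh") and the "PRICE|..." format are
# illustrative assumptions, not the real OpenADR or UMC specifications.

def openadr_to_umc(event: dict) -> str:
    """Serialize an hourly pricing event as a pipe-delimited message."""
    prices = ",".join(f"{p:.3f}" for p in event["hourly_prices_usd_kwh"])
    return f"PRICE|{event['start_hour']}|{prices}"

event = {"start_hour": 14, "hourly_prices_usd_kwh": [0.12, 0.35, 0.35, 0.12]}
msg = openadr_to_umc(event)  # -> "PRICE|14|0.120,0.350,0.350,0.120"
```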

  8. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    Science.gov (United States)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must identify first the presence, and then the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. However, it can be difficult for a human observer to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose; for a progressive disease such as atherosclerosis, where multiple scans may be required, this is beneficial to the patient's health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of unwanted regions of the cardiac image slices, such as lungs, ribs, and vertebrae, is carried out using adaptive heart isolation; such regions cannot contain calcified plaques but can be of a similar intensity, and their removal aids detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground-truth scores averaged from three expert observers. The results presented here are intended to show the feasibility of, and requirement for, an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
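The traditional Agatston method mentioned above scores each calcified lesion by its area weighted by a peak-attenuation band. A minimal sketch using the standard Agatston density bands; the lesion values are illustrative, not from the paper:

```python
def agatston_weight(max_hu: float) -> int:
    """Density weight from the lesion's peak attenuation (HU), using the
    standard Agatston bands: 130-199 -> 1, 200-299 -> 2, 300-399 -> 3,
    >= 400 -> 4; below the 130 HU calcium threshold -> 0."""
    if max_hu < 130:
        return 0
    if max_hu < 200:
        return 1
    if max_hu < 300:
        return 2
    if max_hu < 400:
        return 3
    return 4

def agatston_score(lesions):
    """Sum of (lesion area in mm^2) x (density weight) over all detected
    coronary calcified lesions."""
    return sum(area * agatston_weight(max_hu) for area, max_hu in lesions)

# Two hypothetical lesions: 4.0 mm^2 peaking at 150 HU, 2.5 mm^2 at 320 HU
score = agatston_score([(4.0, 150), (2.5, 320)])  # 4*1 + 2.5*3 = 11.5
```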

  9. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2011-01-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both 15N-labeled and 13C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only 15N-labeled NMR data and avoid the added expense of using 13C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using 15N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.

  10. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard

    2011-03-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both 15N-labeled and 13C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only 15N-labeled NMR data and avoid the added expense of using 13C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using 15N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.
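The backbone assignment objective, matching spin systems to residues at minimum total mismatch cost, can be illustrated in miniature. The real system solves this with integer programming at much larger scale and with richer constraints; the brute-force search below only demonstrates the objective on a toy cost matrix:

```python
from itertools import permutations

def best_assignment(cost):
    """Exhaustive minimum-cost one-to-one assignment of spin systems
    (rows) to residues (columns). Purely illustrative: real instances
    are far too large for brute force and are solved with integer
    programming."""
    n = len(cost)
    best, best_perm = float("inf"), None
    for perm in permutations(range(n)):
        total = sum(cost[i][perm[i]] for i in range(n))
        if total < best:
            best, best_perm = total, perm
    return best, best_perm

# Hypothetical typing-mismatch penalties for 3 spin systems x 3 residues
cost = [[1, 5, 9],
        [4, 2, 8],
        [7, 6, 3]]
total, assignment = best_assignment(cost)  # -> 6, (0, 1, 2)
```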

  11. Automated Price and Demand Response Demonstration for Large Customers in New York City using OpenADR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joyce Jihyun; Yin, Rongxin; Kiliccote, Sila

    2013-10-01

    Open Automated Demand Response (OpenADR), an XML-based information exchange model, is used to facilitate continuous price-responsive operation and demand response participation for large commercial buildings in New York that are subject to the default day-ahead hourly pricing. We summarize the existing demand response programs in New York and discuss OpenADR communication, prioritization of demand response signals, and control methods. Building energy simulation models are developed and field tests are conducted to evaluate the continuous energy management and demand response capabilities of two commercial buildings in New York City. Preliminary results reveal that providing machine-readable prices to commercial buildings can facilitate both demand response participation and continuous energy cost savings. Hence, efforts should be made to develop more sophisticated algorithms for building control systems to minimize customers' utility bills based on price and reliability information from the electricity grid.
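Continuous price-responsive operation of the kind described above amounts to mapping the day-ahead hourly price onto pre-programmed shed levels. A minimal sketch, with illustrative thresholds and actions that are not taken from the paper:

```python
def shed_level(price_usd_kwh: float) -> str:
    """Map a day-ahead hourly price to a load-shed mode. Both the
    thresholds and the mode names are illustrative assumptions; a real
    building control system would prioritize DR signals and sequences
    as discussed in the study."""
    if price_usd_kwh >= 0.30:
        return "deep-shed"      # e.g. larger setpoint offsets, dimmed lighting
    if price_usd_kwh >= 0.15:
        return "moderate-shed"  # e.g. modest zone-setpoint offsets
    return "normal"

# Three hypothetical hourly prices ($/kWh)
modes = [shed_level(p) for p in (0.08, 0.18, 0.42)]
# -> ['normal', 'moderate-shed', 'deep-shed']
```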

  12. Fully automated reconstruction of three-dimensional vascular tree structures from two orthogonal views using computational algorithms and production rules

    Science.gov (United States)

    Liu, Iching; Sun, Ying

    1992-10-01

    A system for reconstructing 3-D vascular structure from two orthogonally projected images is presented. The formidable problem of matching segments between two views is solved using knowledge of the epipolar constraint and the similarity of segment geometry and connectivity. The knowledge is represented in a rule-based system, which also controls the operation of several computational algorithms for tracking segments in each image, representing 2-D segments with directed graphs, and reconstructing 3-D segments from matching 2-D segment pairs. Uncertain reasoning governs the interaction between segmentation and matching; it also provides a framework for resolving the matching ambiguities in an iterative way. The system was implemented in the C language and the C Language Integrated Production System (CLIPS) expert system shell. Using video images of a tree model, the standard deviation of reconstructed centerlines was estimated to be 0.8 mm (1.7 mm) when the view direction was parallel (perpendicular) to the epipolar plane. Feasibility of clinical use was shown using x-ray angiograms of a human chest phantom. The correspondence of vessel segments between two views was accurate. Computational time for the entire reconstruction process was under 30 s on a workstation. A fully automated system for two-view reconstruction that does not require the a priori knowledge of vascular anatomy is demonstrated.
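With two orthogonal views, the epipolar constraint reduces to a shared coordinate: the frontal view supplies (x, z), the lateral view supplies (y, z), and a valid segment-point match must agree in z. A minimal sketch of the point-level reconstruction (coordinates and tolerance are illustrative):

```python
def reconstruct_point(front_xz, side_yz, tol=1e-6):
    """Merge a point (x, z) from the frontal view with a point (y, z)
    from the orthogonal lateral view. The shared z coordinate encodes
    the epipolar constraint: a valid match must agree in z."""
    (x, z1), (y, z2) = front_xz, side_yz
    if abs(z1 - z2) > tol:
        raise ValueError("epipolar constraint violated: z mismatch")
    return (x, y, (z1 + z2) / 2.0)

p3d = reconstruct_point((10.0, 4.0), (-3.0, 4.0))  # -> (10.0, -3.0, 4.0)
```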

  13. Fully automated dual-frequency three-pulse-echo 2DIR spectrometer accessing spectral range from 800 to 4000 wavenumbers

    Energy Technology Data Exchange (ETDEWEB)

    Leger, Joel D.; Nyby, Clara M.; Varner, Clyde; Tang, Jianan; Rubtsova, Natalia I.; Yue, Yuankai; Kireev, Victor V.; Burtsev, Viacheslav D.; Qasim, Layla N.; Rubtsov, Igor V., E-mail: irubtsov@tulane.edu [Department of Chemistry, Tulane University, New Orleans, Louisiana 70118 (United States); Rubtsov, Grigory I. [Institute for Nuclear Research of the Russian Academy of Sciences, Moscow 117312 (Russian Federation)

    2014-08-15

    A novel dual-frequency two-dimensional infrared instrument is designed and built that permits three-pulse heterodyned echo measurements of any cross-peak within a spectral range from 800 to 4000 cm{sup −1} to be performed in a fully automated fashion. The superior sensitivity of the instrument is achieved by a combination of spectral interferometry, phase cycling, and closed-loop phase stabilization accurate to ∼70 as. Anharmonicities smaller than 10{sup −4} cm{sup −1} were recorded for strong carbonyl stretching modes using 800 laser shot accumulations. The novel design of the phase stabilization scheme permits tuning the polarizations of the mid-infrared (m-IR) pulses, thus supporting measurements of the angles between vibrational transition dipoles. The automatic frequency tuning is achieved by implementing beam direction stabilization schemes for each m-IR beam, providing better than 50 μrad beam stability, and a novel scheme for setting the phase-matching geometry for the m-IR beams at the sample. The errors in the cross-peak amplitudes associated with imperfect phase-matching conditions and alignment are found to be at the level of 20%. The instrument can be used by non-specialists in ultrafast spectroscopy.

  14. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    Science.gov (United States)

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. The relevant extraction parameters, such as the dispersive solvent, the proportion of aqueous/organic phase, pH, and flow rates, have been carefully evaluated. Caffeine quantification was linear from 2 to 75 mg/L, with detection and quantification limits of 0.46 mg/L and 1.54 mg/L, respectively. A coefficient of variation (n=8; 5 mg/L) of 2.1% and a sampling rate of 16 samples per hour were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant, and decaf coffee samples, and the results of the sample analysis were validated using high-performance liquid chromatography. Copyright © 2016 Elsevier Ltd. All rights reserved.
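The reported detection and quantification limits are consistent with the common 3σ/10σ calibration-line convention, though the abstract does not state which convention the authors used. A sketch with an illustrative blank standard deviation and slope chosen to reproduce values close to those reported:

```python
def detection_limits(sigma_blank: float, slope: float):
    """Limits of detection and quantification from a calibration line
    using the common 3-sigma / 10-sigma convention (one of several
    accepted conventions; assumed here, not confirmed by the paper)."""
    lod = 3 * sigma_blank / slope
    loq = 10 * sigma_blank / slope
    return lod, loq

# Illustrative inputs chosen to land near the reported 0.46 / 1.54 mg/L
lod, loq = detection_limits(sigma_blank=0.023, slope=0.15)
```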

  15. Validation of the fully automated A&D TM-2656 blood pressure monitor according to the British Hypertension Society Protocol.

    Science.gov (United States)

    Zeng, Wei-Fang; Liu, Ming; Kang, Yuan-Yuan; Li, Yan; Wang, Ji-Guang

    2013-08-01

    The present study aimed to evaluate the accuracy of the fully automated oscillometric upper-arm blood pressure monitor TM-2656 according to the British Hypertension Society (BHS) Protocol 1993. We recruited individuals until there were 85 eligible participants whose blood pressures met the distribution requirements specified by the BHS Protocol. For each individual, we sequentially measured the systolic and diastolic blood pressures using a mercury sphygmomanometer (two observers) and the TM-2656 device (one supervisor). Data analysis was carried out according to the BHS Protocol. The device achieved grade A. The percentages of blood pressure differences within 5, 10, and 15 mmHg were 62, 85, and 96%, respectively, for systolic blood pressure, and 71, 93, and 99%, respectively, for diastolic blood pressure. The average (±SD) of the device-observer differences was -2.1±7.8 mmHg (P<0.0001) and -1.1±5.8 mmHg (P<0.0001) for systolic and diastolic blood pressures, respectively. The A&D upper-arm blood pressure monitor TM-2656 has passed the requirements of the BHS Protocol and can thus be recommended for blood pressure measurement.
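The BHS Protocol grades a device by the cumulative percentages of readings falling within 5, 10, and 15 mmHg of the mercury standard. A sketch of the grading table (thresholds from the BHS 1993 protocol) applied to the percentages reported above:

```python
# BHS 1993 grading thresholds: minimum cumulative % of readings within
# 5, 10 and 15 mmHg of the mercury standard; all three must be met.
BHS_GRADES = [("A", (60, 85, 95)), ("B", (50, 75, 90)), ("C", (40, 65, 85))]

def bhs_grade(within5: float, within10: float, within15: float) -> str:
    """Return the best BHS grade whose three thresholds are all met."""
    for grade, (t5, t10, t15) in BHS_GRADES:
        if within5 >= t5 and within10 >= t10 and within15 >= t15:
            return grade
    return "D"

sbp = bhs_grade(62, 85, 96)  # -> 'A', matching the reported result
dbp = bhs_grade(71, 93, 99)  # -> 'A'
```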

  16. A fully automated and scalable timing probe-based method for time alignment of the LabPET II scanners

    Science.gov (United States)

    Samson, Arnaud; Thibaudeau, Christian; Bouchard, Jonathan; Gaudin, Émilie; Paulin, Caroline; Lecomte, Roger; Fontaine, Réjean

    2018-05-01

    A fully automated time alignment method based on a positron timing probe was developed to correct the channel-to-channel coincidence time dispersion of the LabPET II avalanche photodiode-based positron emission tomography (PET) scanners. The timing probe was designed to directly detect positrons and generate an absolute time reference. The probe-to-channel coincidences are recorded and processed using firmware embedded in the scanner hardware to compute the time differences between detector channels. The time corrections are then applied in real time to each event in every channel during PET data acquisition to align all coincidence time spectra, thus enhancing the scanner time resolution. When applied to the mouse version of the LabPET II scanner, the calibration of 6,144 channels was performed in less than 15 min and showed a 47% improvement in the overall time resolution of the scanner, decreasing from 7 ns to 3.7 ns full width at half maximum (FWHM).
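The channel-to-channel correction described above can be reduced to its arithmetic core: measure each channel's delay against the probe reference, then shift every channel onto a common reference. A minimal sketch with illustrative delays (channel names and values are made up):

```python
def time_corrections(probe_delays):
    """Per-channel offsets from probe-to-channel coincidence delays:
    each channel is shifted so that all coincidence time spectra align
    on a common reference (here: the mean channel delay)."""
    ref = sum(probe_delays.values()) / len(probe_delays)
    return {ch: ref - t for ch, t in probe_delays.items()}

# Hypothetical measured delays vs. the probe reference, in ns
delays_ns = {"ch0": 2.0, "ch1": -1.0, "ch2": 0.5}
corr = time_corrections(delays_ns)
aligned = {ch: delays_ns[ch] + corr[ch] for ch in delays_ns}
# after correction, every channel sits on the common reference
```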

  17. DEWS (DEep White matter hyperintensity Segmentation framework): A fully automated pipeline for detecting small deep white matter hyperintensities in migraineurs.

    Science.gov (United States)

    Park, Bo-Yong; Lee, Mi Ji; Lee, Seung-Hak; Cha, Jihoon; Chung, Chin-Sang; Kim, Sung Tae; Park, Hyunjin

    2018-01-01

    Migraineurs show an increased load of white matter hyperintensities (WMHs) and more rapid deep WMH progression. Previous methods for WMH segmentation have limited efficacy to detect small deep WMHs. We developed a new fully automated detection pipeline, DEWS (DEep White matter hyperintensity Segmentation framework), for small and superficially-located deep WMHs. A total of 148 non-elderly subjects with migraine were included in this study. The pipeline consists of three components: 1) white matter (WM) extraction, 2) WMH detection, and 3) false positive reduction. In WM extraction, we adjusted the WM mask to re-assign misclassified WMHs back to WM using many sequential low-level image processing steps. In WMH detection, the potential WMH clusters were detected using an intensity based threshold and region growing approach. For false positive reduction, the detected WMH clusters were classified into final WMHs and non-WMHs using the random forest (RF) classifier. Size, texture, and multi-scale deep features were used to train the RF classifier. DEWS successfully detected small deep WMHs with a high positive predictive value (PPV) of 0.98 and true positive rate (TPR) of 0.70 in the training and test sets. Similar performance of PPV (0.96) and TPR (0.68) was attained in the validation set. DEWS showed a superior performance in comparison with other methods. Our proposed pipeline is freely available online to help the research community in quantifying deep WMHs in non-elderly adults.
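The PPV and TPR figures reported for DEWS follow directly from the detection counts. A sketch of the two metrics; the counts below are illustrative values chosen only to reproduce PPV 0.98 and TPR 0.70:

```python
def ppv_tpr(tp: int, fp: int, fn: int):
    """Positive predictive value (precision) and true positive rate
    (sensitivity), the metrics used to report WMH detection quality."""
    return tp / (tp + fp), tp / (tp + fn)

# Hypothetical counts of detected WMH clusters
ppv, tpr = ppv_tpr(tp=98, fp=2, fn=42)  # -> (0.98, 0.70)
```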

  18. Experimental optimization of a direct injection homogeneous charge compression ignition gasoline engine using split injections with fully automated microgenetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Canakci, M. [Kocaeli Univ., Izmit (Turkey); Reitz, R.D. [Wisconsin Univ., Dept. of Mechanical Engineering, Madison, WI (United States)

    2003-03-01

    Homogeneous charge compression ignition (HCCI) is receiving attention as a new low-emission engine concept. Little is known about the optimal operating conditions for this engine operation mode. Combustion under homogeneous, low equivalence ratio conditions results in modest temperature combustion products, containing very low concentrations of NO{sub x} and particulate matter (PM) as well as providing high thermal efficiency. However, this combustion mode can produce higher HC and CO emissions than those of conventional engines. An electronically controlled Caterpillar single-cylinder oil test engine (SCOTE), originally designed for heavy-duty diesel applications, was converted to an HCCI direct injection (DI) gasoline engine. The engine features an electronically controlled low-pressure direct injection gasoline (DI-G) injector with a 60 deg spray angle that is capable of multiple injections. The use of double injection was explored for emission control and the engine was optimized using fully automated experiments and a microgenetic algorithm optimization code. The variables changed during the optimization include the intake air temperature, start of injection timing and the split injection parameters (per cent mass of fuel in each injection, dwell between the pulses). The engine performance and emissions were determined at 700 r/min with a constant fuel flowrate at 10 MPa fuel injection pressure. The results show that significant emissions reductions are possible with the use of optimal injection strategies. (Author)
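A micro-genetic algorithm of the kind used for the engine optimization keeps a very small population, preserves the elite individual, and periodically re-seeds the rest. The sketch below is a generic toy version on a quadratic objective, not the actual optimization code coupled to the engine experiments; population size, generation count, and the re-seeding rule are all illustrative:

```python
import random

def micro_ga(fitness, bounds, pop_size=5, generations=40, seed=1):
    """Minimal micro-genetic algorithm: tiny population, elitism, and
    re-seeding of the non-elite members each generation (illustrative
    toy, not the published optimization code)."""
    rng = random.Random(seed)

    def rand_ind():
        return [rng.uniform(lo, hi) for lo, hi in bounds]

    pop = [rand_ind() for _ in range(pop_size)]
    best = min(pop, key=fitness)
    for _ in range(generations):
        pop.sort(key=fitness)
        elite = pop[0]
        if fitness(elite) < fitness(best):
            best = elite
        # keep the elite; replace the rest with elite/random blends
        pop = [elite] + [
            [(e + x) / 2 for e, x in zip(elite, rand_ind())]
            for _ in range(pop_size - 1)
        ]
    return best

# Toy objective standing in for an emissions/efficiency trade-off,
# minimized at (3, -1)
best = micro_ga(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2,
                bounds=[(-10, 10), (-10, 10)])
```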

  19. Computer-aided liver volumetry: performance of a fully-automated, prototype post-processing solution for whole-organ and lobar segmentation based on MDCT imaging.

    Science.gov (United States)

    Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T

    2015-06-01

    To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess the accuracy of the post-processing applications by comparing phantom volumes determined via Archimedes' principle with MDCT-segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance was compared between the manual approach and the automated prototype, assessing intraobserver variability and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL; manually segmented datasets estimated 1628.0 ± 47.8 mL, representing a mean overestimation of 3.0%, while automatically segmented datasets estimated 1601.9 ± 0 mL, representing a mean overestimation of 1.3%. Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, automated measurement repetitions resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies than the remaining right hepatic lobe segments. Fully-automated whole-liver segmentation was non-inferior to manual approaches, with improved reproducibility and post-processing duration; automated dual-seed lobar segmentation showed slight tendencies for underestimating the right hepatic lobe
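Segmentation-based volumetry reduces to counting labelled voxels and scaling by the voxel volume, and the phantom over-estimation percentages in the abstract can be reproduced directly from the reported volumes. A minimal sketch:

```python
def volume_ml(voxel_count: int, voxel_dims_mm=(1.0, 1.0, 1.0)) -> float:
    """Segmented volume in mL: labelled-voxel count times the volume of
    one voxel (1 mL = 1000 mm^3). Voxel dimensions are illustrative."""
    vx, vy, vz = voxel_dims_mm
    return voxel_count * vx * vy * vz / 1000.0

def overestimation_pct(measured_ml: float, reference_ml: float) -> float:
    """Percent overestimation relative to the reference volume."""
    return 100.0 * (measured_ml - reference_ml) / reference_ml

# Reproduce the phantom comparison reported in the abstract
manual_err = overestimation_pct(1628.0, 1581.0)  # ~3.0 %
auto_err = overestimation_pct(1601.9, 1581.0)    # ~1.3 %
```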

  20. Fully automated SPE-based synthesis and purification of 2-[{sup 18}F]fluoroethyl-choline for human use

    Energy Technology Data Exchange (ETDEWEB)

    Schmaljohann, Joern [Department of Nuclear Medicine, University of Bonn, Bonn (Germany); Department of Nuclear Medicine, University of Aachen, Aachen (Germany); Schirrmacher, Esther [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Waengler, Bjoern; Waengler, Carmen [Department of Nuclear Medicine, Ludwig-Maximilians University, Munich (Germany); Schirrmacher, Ralf, E-mail: ralf.schirrmacher@mcgill.c [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Guhlke, Stefan, E-mail: stefan.guhlke@ukb.uni-bonn.d [Department of Nuclear Medicine, University of Bonn, Bonn (Germany)

    2011-02-15

    Introduction: 2-[{sup 18}F]Fluoroethyl-choline ([{sup 18}F]FECH) is a promising tracer for the detection of prostate cancer as well as brain tumors with positron emission tomography (PET). [{sup 18}F]FECH is actively transported into mammalian cells, is phosphorylated by choline kinase, and is incorporated into the cell membrane after being metabolized to phosphatidylcholine. So far, its synthesis has been a two-step procedure involving at least one HPLC purification step. To allow a wider dissemination of this tracer, finding a purification method that avoids HPLC is highly desirable and would result in easier accessibility and more reliable production of [{sup 18}F]FECH. Methods: [{sup 18}F]FECH was synthesized by reaction of 2-bromo-1-[{sup 18}F]fluoroethane ([{sup 18}F]BFE) with dimethylaminoethanol (DMAE) in DMSO. We applied a novel and very reliable work-up procedure for the synthesis of [{sup 18}F]BFE. Based on a combination of three different solid-phase cartridges, the purification of [{sup 18}F]BFE from its precursor 2-bromoethyl-4-nitrobenzenesulfonate (BENos) could be achieved without using HPLC. Following the subsequent reaction of the purified [{sup 18}F]BFE with DMAE, the final product [{sup 18}F]FECH was obtained as a sterile solution by passing the crude reaction mixture through a combination of two CM plus cartridges and a sterile filter. The fully automated synthesis was performed using either a Raytest SynChrom module (Raytest, Germany) or a Scintomics HotboxIII module (Scintomics, Germany). Results: The radiotracer [{sup 18}F]FECH can be synthesized in reliable radiochemical yields (RCY) of 37{+-}5% (SynChrom module) and 33{+-}5% (HotboxIII unit) in less than 1 h using these two fully automated, commercially available synthesis units without HPLC involvement for purification.
    Detailed quality control of the final injectable [{sup 18}F]FECH solution proved the high radiochemical purity and the absence of Kryptofix 2.2.2, DMAE and DMSO used in the

  1. Cholgate - a randomized controlled trial comparing the effect of automated and on-demand decision support on the management of cardiovascular disease factors in primary care

    NARCIS (Netherlands)

    J.T. van Wyk (Jacobus); M.A.M. van Wijk (Marc); P.W. Moorman (Peter); M. Mosseveld (Mees); J. van der Lei (Johan)

    2003-01-01

    Automated and on-demand decision support systems integrated into an electronic medical record have proven to be an effective implementation strategy for guidelines. Cholgate is a randomized controlled trial comparing the effect of automated and on-demand decision

  2. Analysis of the Effects of Connected–Automated Vehicle Technologies on Travel Demand

    Energy Technology Data Exchange (ETDEWEB)

    Auld, Joshua [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, IL 60439; Sokolov, Vadim [Department of Systems Engineering and Operations Research, Volgenau School of Engineering, George Mason University, MS 4A6, 4400 University Drive, Fairfax, VA 22030; Stephens, Thomas S. [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, IL 60439

    2017-01-01

    Connected–automated vehicle (CAV) technologies are likely to have significant effects not only on how vehicles operate in the transportation system, but also on how individuals behave and use their vehicles. While many CAV technologies—such as connected adaptive cruise control and eco-signals—have the potential to increase network throughput and efficiency, many of these same technologies have a secondary effect of reducing driver burden, which can drive changes in travel behavior. Such changes in travel behavior—in effect, lowering the cost of driving—have the potential to greatly increase the utilization of the transportation system, with concurrent negative externalities, such as congestion, energy use, and emissions, working against the positive effects on the transportation system resulting from increased capacity. To date, few studies have analyzed the potential effects of CAV technologies from a systems perspective; studies often focus on gains and losses to an individual vehicle, at a single intersection, or along a corridor. However, travel demand and traffic flow constitute a complex, adaptive, nonlinear system. Therefore, in this study, an advanced transportation systems simulation model—POLARIS—was used. POLARIS includes cosimulation of travel behavior and traffic flow to study the potential effects of several CAV technologies at the regional level. Various technology penetration levels and changes in travel time sensitivity have been analyzed to determine a potential range of effects on vehicle miles traveled from various CAV technologies.

  3. How do Air Traffic Controllers Use Automation and Tools Differently During High Demand Situations?

    Science.gov (United States)

    Kraut, Joshua M.; Mercer, Joey; Morey, Susan; Homola, Jeffrey; Gomez, Ashley; Prevot, Thomas

    2013-01-01

    In a human-in-the-loop simulation, two air traffic controllers managed identical airspace while burdened with higher-than-average workload, using advanced tools and automation designed to assist with scheduling aircraft on multiple arrival flows to a single meter fix. This paper compares the strategies employed by each controller and investigates how the controllers' strategies changed between a more normal workload condition and a higher workload condition. Each controller engaged in different methods of maneuvering aircraft to arrive on schedule and adapted their strategies to cope with the increased workload in different ways. Based on the conclusions, three suggestions are made: that quickly providing air traffic controllers with recommendations and information to assist with maneuvering and scheduling aircraft under increased workload will improve their effectiveness; that the tools should adapt to the strategy currently employed by a controller; and that training should emphasize which traffic management strategies are most effective given specific airspace demands.

  4. Association between fully automated MRI-based volumetry of different brain regions and neuropsychological test performance in patients with amnestic mild cognitive impairment and Alzheimer's disease.

    Science.gov (United States)

    Arlt, Sönke; Buchert, Ralph; Spies, Lothar; Eichenlaub, Martin; Lehmbeck, Jan T; Jahn, Holger

    2013-06-01

    Fully automated magnetic resonance imaging (MRI)-based volumetry may serve as biomarker for the diagnosis in patients with mild cognitive impairment (MCI) or dementia. We aimed at investigating the relation between fully automated MRI-based volumetric measures and neuropsychological test performance in amnestic MCI and patients with mild dementia due to Alzheimer's disease (AD) in a cross-sectional and longitudinal study. In order to assess a possible prognostic value of fully automated MRI-based volumetry for future cognitive performance, the rate of change of neuropsychological test performance over time was also tested for its correlation with fully automated MRI-based volumetry at baseline. In 50 subjects, 18 with amnestic MCI, 21 with mild AD, and 11 controls, neuropsychological testing and T1-weighted MRI were performed at baseline and at a mean follow-up interval of 2.1 ± 0.5 years (n = 19). Fully automated MRI volumetry of the grey matter volume (GMV) was performed using a combined stereotactic normalisation and segmentation approach as provided by SPM8 and a set of pre-defined binary lobe masks. Left and right hippocampus masks were derived from probabilistic cytoarchitectonic maps. Volumes of the inner and outer liquor space were also determined automatically from the MRI. Pearson's test was used for the correlation analyses. Left hippocampal GMV was significantly correlated with performance in memory tasks, and left temporal GMV was related to performance in language tasks. Bilateral frontal, parietal and occipital GMVs were correlated to performance in neuropsychological tests comprising multiple domains. Rate of GMV change in the left hippocampus was correlated with decline of performance in the Boston Naming Test (BNT), Mini-Mental Status Examination, and trail making test B (TMT-B). The decrease of BNT and TMT-A performance over time correlated with the loss of grey matter in multiple brain regions. We conclude that fully automated MRI
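The correlation analyses described above reduce to Pearson's product-moment coefficient; a minimal pure-Python sketch, with hypothetical grey-matter-volume and memory-score values (not data from the study):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation coefficient."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Illustrative (hypothetical) data: left-hippocampal GMV (mL) vs. memory score
gmv    = [2.1, 2.4, 1.9, 2.8, 2.6, 1.7]
memory = [11.0, 14.0, 9.0, 16.0, 15.0, 8.0]
r = pearson_r(gmv, memory)  # strong positive correlation for this toy data
```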

  5. Opportunities for Automated Demand Response in Wastewater Treatment Facilities in California - Southeast Water Pollution Control Plant Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goli, Sasank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Faulkner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-20

    This report details a study into the demand response potential of a large wastewater treatment facility in San Francisco. Previous research had identified wastewater treatment facilities as good candidates for demand response and automated demand response, and this study was conducted to investigate facility attributes that are conducive to demand response or that hinder its implementation. One year's worth of operational data were collected from the facility's control system, submetered process equipment, utility electricity demand records, and governmental weather stations. These data were analyzed to determine factors that affected facility power demand and demand response capabilities. The average baseline demand at the Southeast facility was approximately 4 MW. During the rainy season (October-March) the facility treated 40% more wastewater than in the dry season, but demand increased by only 4%. Submetering of the facility's lift pumps and centrifuges predicted load-shift capabilities of 154 kW and 86 kW, respectively, with large lift pump shifts in the rainy season. Analysis of demand data during maintenance events confirmed the magnitude of these possible load shifts and indicated other areas of the facility with demand response potential. Load sheds were shown to be possible by shutting down a portion of the facility's aeration trains (average shed of 132 kW). Load shifts were shown to be possible by shifting operation of centrifuges, the gravity belt thickener, lift pumps, and external pump stations. These load shifts were made possible by the storage capabilities of the facility and of the city's sewer system. Large load reductions (an average of 2,065 kW) were seen from operating the cogeneration unit, but normal practice is continuous operation, precluding its use for demand response. The study also identified potential demand response opportunities that warrant further study: modulating variable-demand aeration loads, shifting
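A shed estimate of this kind is the gap between a baseline demand profile and metered demand during the event window; a minimal sketch with hypothetical 15-minute interval readings (not the facility's data):

```python
# Demand-response shed estimate: average difference between a baseline
# profile and metered demand during an event. Readings are hypothetical
# 15-min interval values in kW, loosely scaled to the ~4 MW baseline.

def average_shed(baseline_kw, event_kw):
    """Mean kW reduction over paired baseline/event intervals."""
    return sum(b - e for b, e in zip(baseline_kw, event_kw)) / len(event_kw)

baseline = [4000, 4010, 3990, 4005]   # typical-day profile
event    = [3870, 3880, 3855, 3875]   # aeration train partly shut down
shed = average_shed(baseline, event)  # ~130 kW, cf. the 132 kW reported
```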

  6. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    International Nuclear Information System (INIS)

    Brown, Aaron W.; Simone, Paul S.; York, J.C.; Emmert, Gary L.

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L−1. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L−1. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Of more than 5200 samples analyzed, 95% were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.
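Method detection limits like those quoted are conventionally derived from replicate low-level measurements (EPA-style MDL = t × s); a sketch with hypothetical replicate values, using standard one-tailed 99% Student's t values:

```python
from math import sqrt

# EPA-style method detection limit: MDL = t * s, where s is the standard
# deviation of n replicate low-level spiked measurements and t is the
# one-tailed Student's t value at 99% confidence with n-1 degrees of freedom.
T_99 = {7: 3.143, 8: 2.998, 9: 2.896, 10: 2.821}  # common replicate counts

def mdl(replicates):
    n = len(replicates)
    mean = sum(replicates) / n
    s = sqrt(sum((x - mean) ** 2 for x in replicates) / (n - 1))
    return T_99[n] * s

# Hypothetical chloroform replicates near the detection limit (ug/L)
runs = [0.012, 0.010, 0.011, 0.013, 0.009, 0.012, 0.011]
limit = mdl(runs)  # on the order of the 0.01 ug/L limits reported
```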

  7. Fully-Automated μMRI Morphometric Phenotyping of the Tc1 Mouse Model of Down Syndrome.

    Directory of Open Access Journals (Sweden)

    Nick M Powell

    Full Text Available We describe a fully automated pipeline for the morphometric phenotyping of mouse brains from μMRI data, and show its application to the Tc1 mouse model of Down syndrome to identify new morphological phenotypes in the brain of this first transchromosomic animal carrying human chromosome 21. We incorporate an accessible approach for simultaneously scanning multiple ex vivo brains, requiring only a 3D-printed brain holder, and novel image processing steps for their separation and orientation. We employ clinically established multi-atlas techniques (superior to single-atlas methods) together with publicly available atlas databases for automatic skull-stripping and tissue segmentation, providing high-quality, subject-specific tissue maps. We follow these steps with group-wise registration, structural parcellation, and both Voxel- and Tensor-Based Morphometry, advantageous for their ability to highlight morphological differences without the laborious delineation of regions of interest. We show the application of freely available open-source software developed for clinical MRI analysis to mouse brain data: NiftySeg for segmentation and NiftyReg for registration, and discuss atlases and parameters suitable for the preclinical paradigm. We used this pipeline to compare 29 Tc1 brains with 26 wild-type littermate controls, imaged ex vivo at 9.4 T. We show an unexpected increase in Tc1 total intracranial volume and, controlling for this, local volume and grey matter density reductions in the Tc1 brain compared to the wild-types, most prominently in the cerebellum, in agreement with human DS and previous histological findings.

  8. Review: Behavioral signs of estrus and the potential of fully automated systems for detection of estrus in dairy cattle.

    Science.gov (United States)

    Reith, S; Hoy, S

    2018-02-01

    Efficient detection of estrus is a permanent challenge for successful reproductive performance in dairy cattle. In this context, comprehensive knowledge of estrus-related behaviors is fundamental to achieving optimal estrus detection rates. This review was designed to identify the characteristics of behavioral estrus as a necessary basis for developing strategies and technologies to improve reproductive management on dairy farms. The focus is on secondary symptoms of estrus (mounting, activity, aggressive and agonistic behaviors), which seem more indicative than standing behavior. The consequences of management, housing conditions, and cow- and environment-related factors impacting the expression and detection of estrus, as well as their relative importance, are described in order to increase the efficiency and accuracy of estrus detection. As traditional estrus detection via visual observation is time-consuming and ineffective, detection aids have advanced considerably during the last 10 years. By now, a number of fully automated technologies are available, including pressure-sensing systems, activity meters, video cameras, recordings of vocalization, and measurements of body temperature and milk progesterone concentration. These systems differ in many aspects of sustainability and efficiency that are key to their adoption for farm use. According to current research, the highest priority for practical estrus detection is given to sensor-supported activity monitoring, especially accelerometer systems. Because the intensity and duration of estrus vary between individuals, multivariate analysis can support herd managers in determining the onset of estrus. Currently, there is increasing interest in investigating the potential of combining activity-monitoring data with information from several other methods, which may lead to the best results concerning sensitivity and specificity of detection.
Future improvements will

  9. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Aaron W. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Simone, Paul S. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States); York, J.C. [City of Lebanon, TN Water Treatment Plant, 7 Gilmore Hill Rd., Lebanon, TN 37087 (United States); Emmert, Gary L., E-mail: gemmert@memphis.edu [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States)

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L−1. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap, followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L−1. Mean percent recoveries ranged from 77.1 to 86.5%, with percent relative standard deviation values ranging from 1.2 to 4.6%. Of more than 5200 samples analyzed, 95% were detectable and 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.

  10. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    International Nuclear Information System (INIS)

    Jochimsen, Thies H.; Zeisig, Vilia; Schulz, Jessica; Werner, Peter; Patt, Marianne; Patt, Jörg; Dreyer, Antje Y.; Boltze, Johannes; Barthel, Henryk; Sabri, Osama; Sattler, Bernhard

    2016-01-01

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with O15-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5- and 5-min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than in the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22 % higher than CBF obtained by using the AIF yielded by blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the
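The cross-calibration factor reported (0.98 ± 0.10) is, in essence, the mean ratio of paired steady-state activity concentrations from blood samples and the IDIF; a minimal sketch with hypothetical values:

```python
# Cross-calibration between arterial blood samples and the image-derived
# input function (IDIF) during steady state (2.5-5 min post injection).
# The factor is the mean ratio of paired activity concentrations; the
# values below are hypothetical (kBq/mL), not data from the study.

def cross_calibration(blood, idif):
    """Mean and sample SD of paired blood/IDIF activity ratios."""
    ratios = [b / i for b, i in zip(blood, idif)]
    mean = sum(ratios) / len(ratios)
    var = sum((r - mean) ** 2 for r in ratios) / (len(ratios) - 1)
    return mean, var ** 0.5

blood = [41.2, 39.8, 40.5, 38.9]
idif  = [42.0, 40.1, 41.7, 40.2]
factor, sd = cross_calibration(blood, idif)  # close to 1 when well matched
```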

  11. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    Energy Technology Data Exchange (ETDEWEB)

    Jochimsen, Thies H.; Zeisig, Vilia [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Schulz, Jessica [Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstr. 1a, Leipzig, D-04103 (Germany); Werner, Peter; Patt, Marianne; Patt, Jörg [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Dreyer, Antje Y. [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Boltze, Johannes [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Fraunhofer Research Institution of Marine Biotechnology and Institute for Medical and Marine Biotechnology, University of Lübeck, Lübeck (Germany); Barthel, Henryk; Sabri, Osama; Sattler, Bernhard [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany)

    2016-02-13

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with O15-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5- and 5-min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than in the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22 % higher than CBF obtained by using the AIF yielded by blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the

  12. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method, with its large field of applications and a high potential for automation, provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements, and software configurations. (orig.)

  13. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications

    DEFF Research Database (Denmark)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik

    2016-01-01

    was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reverting the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD......, a complete analytical workflow of purification, separation, and analysis of sample could be achieved within only 5.5 min. With the developed system large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool...

  14. Performance evaluation of a vertical-feed fully automated TLD badge reader using 0.8 and 0.4 mm Teflon-embedded CaSO4:Dy dosimeters

    International Nuclear Information System (INIS)

    Ratna, P.; More, Vinay; Kulkarni, M.S.

    2012-01-01

    The personnel monitoring of more than 80,000 radiation workers in India is at present carried out by semi-automated TLD badge reader systems (TLDBR-7B) developed by Radiation Safety Systems Division, Bhabha Atomic Research Centre. More than 60 such reader systems are in use in all the personnel monitoring centers in the country. Radiation Safety Systems Division also developed a fully automated TLD badge reader based on a new TLD badge having a built-in machine-readable ID code (in the form of a 16x3 hole pattern). This automated reader is designed with a minimum of changes to the electronics and mechanical hardware of the semi-automatic version (TLDBR-7B), so that such semi-automatic readers can be easily upgraded to the fully automated version by using the new TLD badge with ID code. The reader is capable of reading 50 TLD cards in 90 minutes. Based on feedback from users, a new model of fully automated TLD badge reader (model VEFFA-10) has been designed as an improved version of the previously reported fully automated TLD badge reader. This VEFFA-10 PC-based reader incorporates vertical loading of TLD cards having a machine-readable ID code. In this new reader, a vertical rack, which can hold 100 such cards, is mounted on the right side of the reader system. The TLD card falls into the channel by gravity, from where it is taken to the reading position by a rack-and-pinion mechanism. After the readout, the TLD card is dropped into an eject tray. The reader employs hot N2 gas heating, and the gas flow is controlled by a specially designed digital gas flow meter on the front panel of the reader system. The system design is compact and simple, and the card-jamming problem is entirely eliminated. The reader has a number of self-diagnostic features to ensure a high degree of reliability. This paper reports the performance evaluation of the reader using 0.4 mm thick Teflon-embedded CaSO4:Dy TLD cards instead of 0.8 mm cards.

  15. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analyzing strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive and negative predictive values and the accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. Positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. 
On the other hand, amyloid PET is extremely useful to
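The predictive-value figures above follow directly from a 2×2 confusion matrix; a sketch in Python, where the count split is back-calculated from the reported 0.50/1.00/0.68 visual (11)C-PiB values in 28 patients and is illustrative, not the study's raw data:

```python
# Positive/negative predictive value and accuracy from confusion counts.
# tp/fp/tn/fn below are inferred from the reported visual (11)C-PiB
# figures (PPV 0.50, NPV 1.00, accuracy 0.68 among 28 MCI patients,
# 9 of whom progressed) and are illustrative only.

def predictive_values(tp, fp, tn, fn):
    ppv = tp / (tp + fp)                   # reliability of a positive scan
    npv = tn / (tn + fn)                   # reliability of a negative scan
    acc = (tp + tn) / (tp + fp + tn + fn)  # overall accuracy
    return ppv, npv, acc

ppv, npv, acc = predictive_values(tp=9, fp=9, tn=10, fn=0)
# ppv = 0.5, npv = 1.0, acc = 19/28 (about 0.68)
```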

  16. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Burks, M.B. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Hoop, R.C.; Hoffman, E.P. [Univ. of Pittsburgh School of Medicine, PA (United States)

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  17. A fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    International Nuclear Information System (INIS)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-01-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest, quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summated to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, also when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed. (orig.)
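The volumetric step such a program performs can be sketched as counting voxels whose attenuation falls in the CSF density range inside an anatomically defined region, then converting the count to a volume; the thresholds, voxel size, and toy data below are assumptions, not values from the paper:

```python
import numpy as np

# Sketch of CSF volumetry: within an ROI mask, count voxels whose
# attenuation (in Hounsfield units) lies in an assumed CSF range,
# then convert voxel count to millilitres.

def csf_volume_ml(hu, roi_mask, voxel_mm3, lo=0.0, hi=15.0):
    """Volume (mL) of CSF-range voxels inside the ROI."""
    csf = (hu >= lo) & (hu <= hi) & roi_mask
    return csf.sum() * voxel_mm3 / 1000.0  # mm^3 -> mL

# Toy volume: brain-tissue density everywhere, a CSF-like core of 8 voxels
hu = np.full((4, 4, 4), 35.0)   # brain-tissue HU
hu[1:3, 1:3, 1:3] = 5.0         # CSF-like region (2x2x2 = 8 voxels)
roi = np.ones_like(hu, dtype=bool)
vol = csf_volume_ml(hu, roi, voxel_mm3=2.0)  # 8 voxels * 2 mm^3 = 0.016 mL
```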

  18. Fully automated one-pot radiosynthesis of O-(2-[18F]fluoroethyl)-L-tyrosine on the TracerLab FXFN module

    Energy Technology Data Exchange (ETDEWEB)

    Bourdier, Thomas, E-mail: bts@ansto.gov.au [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia)]; Greguric, Ivan [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia)]; Roselt, Peter [Centre for Molecular Imaging, Peter MacCallum Cancer Centre, 12 St Andrew's Place, East Melbourne, VIC, 3002 (Australia)]; Jackson, Tim; Faragalla, Jane; Katsifis, Andrew [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia)]

    2011-07-15

    Introduction: An efficient fully automated method for the radiosynthesis of enantiomerically pure O-(2-[18F]fluoroethyl)-L-tyrosine ([18F]FET) using the GE TracerLab FXFN synthesis module via the O-(2-tosyloxyethyl)-N-trityl-L-tyrosine tert-butyl ester precursor has been developed. Methods: The radiolabelling of [18F]FET involved a classical [18F]fluoride nucleophilic substitution performed in acetonitrile using potassium carbonate and Kryptofix 222, followed by acid hydrolysis using 2 N hydrochloric acid. Results: [18F]FET was produced in 35±5% (n=22) non-decay-corrected yield (55±5% decay-corrected), with radiochemical and enantiomeric purity of >99% and a specific activity of >90 GBq/μmol, after 63 min of radiosynthesis including HPLC purification and formulation. Conclusion: The automated radiosynthesis provides high and reproducible yields suitable for routine clinical use.
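The relation between the non-decay-corrected and decay-corrected yields follows from the fluorine-18 half-life; a quick check in Python (the half-life constant is standard nuclide data, not from the abstract):

```python
# Decay-correcting a radiochemical yield. For fluorine-18 (half-life
# ~109.77 min), a 35% non-decay-corrected yield over a 63-min synthesis
# corresponds to roughly 52% decay-corrected, consistent with the
# reported 55 +/- 5% figure.

F18_HALF_LIFE_MIN = 109.77  # standard F-18 half-life, in minutes

def decay_corrected_yield(ndc_percent, synthesis_min,
                          half_life=F18_HALF_LIFE_MIN):
    """Scale a non-decay-corrected yield back to end-of-bombardment."""
    return ndc_percent * 2 ** (synthesis_min / half_life)

dc = decay_corrected_yield(35.0, 63.0)  # ~52%
```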

  19. Fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    Energy Technology Data Exchange (ETDEWEB)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-03-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research, and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, drawing on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest, the program provides quantitative estimates of CSF content in each slice for the cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summed to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, as well as when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed.

  20. Fully-automated identification of fish species based on otolith contour: using short-time Fourier transform and discriminant analysis (STFT-DA).

    Science.gov (United States)

    Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder; Chong, Ving Ching

    2016-01-01

    Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully-automated species identification model with an accuracy higher than 80%. The purpose of the current study is to develop a fully-automated model, based on the otolith contours, to identify fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. Short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant Analysis (DA) was used as the classification technique to train and test the model on the extracted features. Results. Performance of the model was demonstrated using species from the three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% in all cases. In addition, the effects of STFT variables on the performance of the identification model were explored in this study. Conclusions. Short-time Fourier transform could determine important features of the otolith outlines. The fully-automated model proposed in this study (STFT-DA) could predict the species of an unknown specimen with acceptable identification accuracy. The model code can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has the flexibility to be used for more species and families in future studies.
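    The STFT-plus-discriminant-analysis pipeline described in this record can be sketched in a few lines. The following is a minimal illustration, not the authors' code: the contour signatures, harmonic amplitudes and species count are synthetic stand-ins, with scipy and scikit-learn supplying the STFT and DA steps.

    ```python
    import numpy as np
    from scipy.signal import stft
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)

    def contour_signature(n_points=256, species=0):
        # Radius-vs-angle signature of a synthetic otolith outline; each
        # invented "species" differs only in the amplitude of one harmonic.
        theta = np.linspace(0, 2 * np.pi, n_points, endpoint=False)
        r = 1.0 + 0.1 * (species + 1) * np.sin(3 * theta) + 0.05 * np.cos(7 * theta)
        return r + rng.normal(0.0, 0.01, n_points)

    def stft_features(signature, nperseg=64):
        # Magnitudes of the short-time Fourier transform, flattened into
        # one feature vector per specimen (the STFT step, schematically).
        _, _, Z = stft(signature, nperseg=nperseg)
        return np.abs(Z).ravel()

    # 30 synthetic specimens for each of 3 invented "species"
    X = np.array([stft_features(contour_signature(species=s))
                  for s in range(3) for _ in range(30)])
    y = np.repeat(np.arange(3), 30)

    # Discriminant analysis as the classifier, scored by cross-validation
    acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()
    print(f"cross-validated accuracy: {acc:.2f}")
    ```

    On real otolith images, the signature would come from the traced contour rather than a formula; the feature-extraction and classification steps are otherwise the same shape.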

  1. The MMP inhibitor (R)-2-(N-benzyl-4-(2-[18F]fluoroethoxy)phenylsulphonamido) -N-hydroxy-3-methylbutanamide: Improved precursor synthesis and fully automated radiosynthesis

    International Nuclear Information System (INIS)

    Wagner, Stefan; Faust, Andreas; Breyholz, Hans-Joerg; Schober, Otmar; Schaefers, Michael; Kopka, Klaus

    2011-01-01

    Summary: The CGS 25966 derivative (R)-2-(N-benzyl-4-(2-[18F]fluoroethoxy)phenyl-sulphonamido)-N-hydroxy-3-methylbutanamide [18F]9 represents a very potent radiolabelled matrix metalloproteinase inhibitor. For first human PET studies it is mandatory to have a fully automated radiosynthesis and a straightforward precursor synthesis available. The realisation of both requirements is reported herein. In particular, the corresponding precursor 8 was obtained in a reliable seven-step synthesis with an overall chemical yield of 2.3%. Furthermore, the target compound [18F]9 was prepared with a radiochemical yield of 14.8±3.9% (not corrected for decay).

  2. Advances toward fully automated in vivo assessment of oral epithelial dysplasia by nuclear endomicroscopy-A pilot study.

    Science.gov (United States)

    Liese, Jan; Winter, Karsten; Glass, Änne; Bertolini, Julia; Kämmerer, Peer Wolfgang; Frerich, Bernhard; Schiefke, Ingolf; Remmerbach, Torsten W

    2017-11-01

    Uncertainties in the detection of oral epithelial dysplasia (OED) frequently result from sampling error, especially in inflammatory oral lesions. Endomicroscopy allows non-invasive, "en face" imaging of the upper oral epithelium, but parameters of OED are unknown. Mucosal nuclei were imaged in 34 toluidine blue-stained oral lesions with a commercial endomicroscopy system. Histopathological diagnosis placed four biopsies in the "dys-/neoplastic", 23 in the "inflammatory", and seven in the "others" disease groups. The strength of different assessment strategies (nuclear scoring, nuclear count, and automated nuclear analysis) was measured by the area under the ROC curve (AUC) for identifying the histopathological "dys-/neoplastic" group. Nuclear objects from automated image analysis were visually corrected. The best-performing parameters of nuclear-to-image ratios were the count of large nuclei (AUC=0.986) and the 6-nearest neighborhood relation (AUC=0.896), and the best parameters of nuclear polymorphism were the count of atypical nuclei (AUC=0.996) and the compactness of nuclei (AUC=0.922). Excluding low-grade OED, nuclear scoring and count reached 100% sensitivity and 98% specificity for the detection of dys-/neoplastic lesions. In automated analysis, combining parameters enhanced diagnostic strength: sensitivity of 100% and specificity of 87% were seen for distances of 6-nearest neighbors and aspect ratios, even in uncorrected objects. Correction improved measures of nuclear polymorphism only. The hue of the background color was stronger than nuclear density (AUC=0.779 vs 0.687) at detecting the dys-/neoplastic group, indicating that the macroscopic aspect is biased. Nuclear-to-image ratios are applicable to automated optical in vivo diagnostics for oral potentially malignant disorders. Nuclear endomicroscopy may promote non-invasive, early detection of dys-/neoplastic lesions by reducing sampling error. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
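    As a concrete illustration of how a single nuclear parameter is scored by the area under the ROC curve, here is a minimal sketch. Only the group sizes (4 dys-/neoplastic vs 23 inflammatory) are taken from the abstract; the per-lesion counts are invented and chosen to be perfectly separated, so the resulting AUC is trivially 1.0.

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    # Hypothetical counts of atypical nuclei per lesion; group sizes follow
    # the abstract (4 dys-/neoplastic = label 1, 23 inflammatory = label 0),
    # but the values themselves are invented for illustration.
    y = np.array([1] * 4 + [0] * 23)
    atypical_counts = np.array(
        [28, 31, 25, 34]                                             # dys-/neoplastic
        + [3, 5, 4, 6, 2, 7, 5, 4, 3, 6, 5, 4,
           8, 2, 5, 6, 4, 3, 7, 5, 4, 6, 5]                          # inflammatory
    )
    auc = roc_auc_score(y, atypical_counts)
    print(auc)  # perfectly separated invented data give AUC = 1.0
    ```

    With real, overlapping data the same call yields values like the 0.996 reported for this parameter in the study.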

  3. Towards a fully automated lab-on-a-disc system integrating sample enrichment and detection of analytes from complex matrices

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga

    the technology on a large scale from fulfilling its potential for maturing into applied technologies and products. In this work, we have taken the first steps towards realizing a capable and truly automated “sample-to-answer” analysis system, aimed at small molecule detection and quantification from a complex...... sample matrix. The main result is a working prototype of a microfluidic system, integrating both centrifugal microfluidics for sample handling, supported liquid membrane extraction (SLM) for selective and effective sample treatment, as well as in-situ electrochemical detection. As a case study...

  4. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
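    The three-range dilution scheme above (1- to 10-, 11- to 100-, and 101- to 1000-fold) and the worklist-to-CSV transformation can be sketched as follows. This is a schematic reconstruction, not the validated RSPP: splitting a factor into equal serial steps of at most 10-fold each, and the CSV column names, are assumptions for illustration.

    ```python
    import csv
    import io
    import math

    def split_dilution(factor):
        """Split an overall dilution factor (1- to 1000-fold) into serial
        steps of at most 10-fold each, mirroring the three ranges in the
        abstract: 1-10x -> one step, 11-100x -> two, 101-1000x -> three."""
        if not 1 <= factor <= 1000:
            raise ValueError("supported dynamic range is 1- to 1000-fold")
        n_steps = max(1, math.ceil(math.log10(factor)))
        step = factor ** (1 / n_steps)  # equal serial steps (an assumption)
        return [round(step, 2)] * n_steps

    def to_csv(samples):
        """Render (sample_id, dilution_factor) pairs as CSV text, standing
        in for the Watson-worklist-to-CSV transformation described above."""
        buf = io.StringIO()
        writer = csv.writer(buf)
        writer.writerow(["sample_id", "step", "fold"])
        for sample_id, factor in samples:
            for i, fold in enumerate(split_dilution(factor), start=1):
                writer.writerow([sample_id, i, fold])
        return buf.getvalue()

    print(to_csv([("S001", 8), ("S002", 250)]))
    ```

    A liquid-handler import layer (like the Freedom EVO .gwl conversion the abstract mentions) would then consume rows of this shape; the actual file format of that software is not reproduced here.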

  5. Fully automated radiosynthesis of [11C]PBR28, a radiopharmaceutical for the translocator protein (TSPO) 18 kDa, using a GE TRACERlab FXC-Pro

    International Nuclear Information System (INIS)

    Hoareau, Raphaël; Shao, Xia; Henderson, Bradford D.; Scott, Peter J.H.

    2012-01-01

    In order to image the translocator protein (TSPO) 18 kDa in the clinic using positron emission tomography (PET) imaging, we had a cause to prepare [ 11 C]PBR28. In this communication we highlight our novel, recently developed, one-pot synthesis of the desmethyl-PBR28 precursor, as well as present an optimized fully automated preparation of [ 11 C]PBR28 using a GE TRACERlab FX C-Pro . Following radiolabelling, purification is achieved by HPLC and, to the best of our knowledge, the first reported example of reconstituting [ 11 C]PBR28 into ethanolic saline using solid-phase extraction (SPE). This procedure is operationally simple, and provides high quality doses of [ 11 C]PBR28 suitable for use in clinical PET imaging studies. Typical radiochemical yield using the optimized method is 3.6% yield (EOS, n=3), radiochemical and chemical purity are consistently >99%, and specific activities are 14,523 Ci/mmol. Highlights: ► This paper reports a fully automated synthesis of [ 11 C]PBR28 using a TRACERlab FXc-pro. ► We report a solid-phase extraction technique for the reconstitution of [ 11 C]PBR28. ► ICP-MS data for PBR28 precursor is reported confirming suitability for clinical use.

  6. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    Science.gov (United States)

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operator characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.

  7. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.

  8. Fully automated synthesis of ¹¹C-acetate as tumor PET tracer by simple modified solid-phase extraction purification.

    Science.gov (United States)

    Tang, Xiaolan; Tang, Ganghua; Nie, Dahong

    2013-12-01

    Automated synthesis of (11)C-acetate ((11)C-AC) as the most commonly used radioactive fatty acid tracer is performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Automated synthesis of (11)C-AC was implemented by carboxylation reaction of MeMgBr on a polyethylene Teflon loop ring with (11)C-CO2, followed by acidic hydrolysis with acid and SCX cartridge, and purification on SCX, AG11A8 and C18 SPE cartridges using a commercially available (11)C-tracer synthesizer. Quality control test and animals positron emission tomography (PET) imaging were also carried out. A high and reproducible decay-uncorrected radiochemical yield of (41.0 ± 4.6)% (n=10) was obtained from (11)C-CO2 within the whole synthesis time about 8 min. The radiochemical purity of (11)C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control test and PET imaging showed that (11)C-AC injection produced by the simple SPE procedure was safe and efficient, and was in agreement with the current Chinese radiopharmaceutical quality control guidelines. The novel, simple, and rapid method is readily adapted to the fully automated synthesis of (11)C-AC on several existing commercial synthesis module. The method can be used routinely to produce (11)C-AC for preclinical and clinical studies with PET imaging. Copyright © 2013 Elsevier Ltd. All rights reserved.

  9. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    Science.gov (United States)

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick-film resistors and an NTC as the heating and temperature-sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and amplification starts immediately after the eluate is purged into the chamber. The LabTube, including the microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can easily be parallelized within one LabTube and is deployable for a variety of heating and electrical applications.
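    The microcontroller's two jobs here, reading the NTC and switching the film resistors, reduce to a few lines of control logic. The sketch below assumes a generic 10 kΩ/β=3950 thermistor and a 65 °C setpoint (the usual LAMP temperature); the abstract names neither the part nor the setpoint, so treat all constants as placeholders.

    ```python
    import math

    def ntc_temperature_c(r_ohm, r0=10_000.0, t0_c=25.0, beta=3950.0):
        """Convert an NTC resistance reading to °C via the beta equation.
        r0 and beta are typical 10 kOhm-thermistor values, assumed here;
        the abstract does not specify the sensor used in the LabTube."""
        t0_k = t0_c + 273.15
        inv_t = 1.0 / t0_k + math.log(r_ohm / r0) / beta
        return 1.0 / inv_t - 273.15

    def heater_on(r_ohm, setpoint_c=65.0, hysteresis_c=0.5):
        """Bang-bang control as a microcontroller would run it: switch the
        film resistors on below the band, off above it. The 65 °C LAMP
        setpoint is an assumption, not stated in the abstract."""
        t = ntc_temperature_c(r_ohm)
        if t < setpoint_c - hysteresis_c:
            return True
        if t > setpoint_c + hysteresis_c:
            return False
        return None  # inside the hysteresis band: keep the previous state

    print(round(ntc_temperature_c(10_000.0), 1))  # at R0 the reading is T0: 25.0
    ```

    On the real device this loop would run on the microcontroller itself; the hysteresis band prevents rapid relay/transistor toggling around the setpoint.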

  10. Evaluation of a fully automated treponemal test and comparison with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Park, Yongjung; Park, Younhee; Joo, Shin Young; Park, Myoung Hee; Kim, Hyon-Suk

    2011-11-01

    We evaluated the analytical performance of an automated treponemal test and compared it with the Venereal Disease Research Laboratory test (VDRL) and the fluorescent treponemal antibody absorption test (FTA-ABS). The precision performance of the Architect Syphilis TP assay (TP; Abbott Japan, Tokyo, Japan) was assessed, and 150 serum samples were assayed with the TP before and after heat inactivation to estimate the effect of heat inactivation. A total of 616 specimens were tested with the FTA-ABS and TP, and 400 were examined with the VDRL. The TP showed good precision performance, with total imprecision of less than a 10% coefficient of variation. An excellent linear relationship between results before and after heat inactivation was observed (R² = 0.9961). The FTA-ABS and TP agreed well, with a κ coefficient of 0.981. The concordance rate between the FTA-ABS and TP was the highest (99.0%), followed by the rates between the FTA-ABS and VDRL (85.0%) and between the TP and VDRL (83.8%). The automated TP assay may be adequate for screening for syphilis in large volumes of samples and can be an alternative to the FTA-ABS.
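    Agreement statistics like the κ coefficient reported above come from a simple 2×2 computation. The sketch below uses invented cell counts chosen to total 616 samples with 610 concordant results; it happens to reproduce κ ≈ 0.981 and 99.0% concordance, but the study's actual cell counts are not given in the abstract.

    ```python
    def kappa_and_concordance(table):
        """Cohen's kappa and raw concordance from a 2x2 agreement table,
        where table[i][j] = number of samples rated i by test A and j by
        test B (0 = non-reactive, 1 = reactive)."""
        n = sum(sum(row) for row in table)
        p_obs = sum(table[i][i] for i in range(2)) / n          # observed agreement
        row = [sum(table[i]) / n for i in range(2)]             # marginals of test A
        col = [sum(table[i][j] for i in range(2)) / n for j in range(2)]  # test B
        p_exp = sum(row[i] * col[i] for i in range(2))          # chance agreement
        return (p_obs - p_exp) / (1 - p_exp), p_obs

    # Hypothetical counts, not the study's data: 610/616 concordant results.
    k, agree = kappa_and_concordance([[300, 4], [2, 310]])
    print(f"kappa={k:.3f} concordance={agree:.1%}")
    ```

    κ discounts the agreement expected by chance from the marginal rates, which is why near-perfect raw concordance can still yield κ slightly below 1.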

  11. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC) fluxes between the ocean and atmosphere

    Directory of Open Access Journals (Sweden)

    S. J. Andrews

    2015-04-01

    The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater-air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge-and-trap (AutoP&T) unit coupled to a commercial thermal desorption and gas chromatography-mass spectrometry (TD-GC-MS) system. The AutoP&T system has completed five research cruises, from the tropics to the poles, and has collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34-180 °C with Henry's law coefficients (kHcc, dimensionless gas/aqueous) of 0.018 (CH2I2) and greater, and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  12. The bright side of snow cover effects on PV production - How to lower the seasonal mismatch between electricity supply and demand in a fully renewable Switzerland

    Science.gov (United States)

    Kahl, Annelen; Dujardin, Jérôme; Dupuis, Sonia; Lehning, Michael

    2017-04-01

    One of the major problems with solar PV in the context of fully renewable electricity production at mid-latitudes is the tendency toward higher production in summer and lower production in winter. This pattern is most often exactly opposite to demand, causing a seasonal mismatch that requires extensive balancing power from other production sources or large storage capacities. What possibilities do we have to bring PV production into closer correlation with demand? This question motivated our research, and in response we investigated the effects of placing PV panels at different tilt angles in regions with extensive snow cover, to increase winter production from ground-reflected shortwave radiation. The aim of this project is therefore to quantify the effect of varying snow cover duration (SCD) and panel tilt angle on annual total production, and on production during the winter months when electricity is most needed. We chose Switzerland as an ideal test site because it offers a wide range of snow cover conditions and a high potential for renewable electricity production, but the methods can be applied to other regions with comparable snow cover and irradiance. Our analysis can be separated into two steps: 1. A systematic, GIS- and satellite-based analysis for all of Switzerland: we use time series of satellite-derived irradiance and snow cover characteristics, together with land surface cover types and elevation information, to quantify the environmental conditions and to estimate potential production and ideal tilt angles. 2. A scenario-based analysis that contrasts the production patterns of different placement scenarios for PV panels in urban, rural and mountainous areas. We invoke a model of a fully renewable electricity system (including Switzerland's large hydropower system) at the national level to compute the electricity import and storage capacity required to balance the remaining mismatch between production and demand to further illuminate

  13. Automation in trace-element chemistry - Development of a fully automated on-line preconcentration device for trace analysis of heavy metals with atomic spectroscopy

    International Nuclear Information System (INIS)

    Michaelis, M.R.A.

    1990-01-01

    The scope of this work was the development of an automated system for trace-element preconcentration to be used with, and integrated into, atomic spectroscopic methods such as flame atomic absorption spectrometry (FAAS), graphite furnace atomic absorption spectrometry (GFAAS) or atomic emission spectroscopy with inductively coupled plasma (ICP-AES). Based on the newly developed cellulose-based chelating cation exchangers ethylenediaminetriacetic acid cellulose (EDTrA-cellulose) and sulfonated-oxine cellulose, a flexible, computer-controlled instrument for automating the preconcentration and/or matrix separation of heavy metals is described. The most important properties of these materials are fast exchange kinetics, good selectivity against alkali and alkaline-earth elements, good flow characteristics, and good stability of the material and of the chelating functions against changes in the pH values of the reagents required in the process. The combination of the preconcentration device with on-line determination of Zn, Cd, Pb, Ni, Fe, Co, Mn, V, Cu, La, U and Th is described for FAAS and for ICP-AES with a simultaneous spectrometer. Signal enhancement factors of 70 are achieved by preconcentration of 10 ml and on-line determination with FAAS, owing to signal quantification in peak-height mode. For GFAAS and for sequential ICP, methods for off-line preconcentration are given. The optimization and adaptation of the interface to the different characteristics of the analytical instrumentation is emphasized. For evaluation and future developments with respect to the determination and/or preconcentration of anionic species such as As, Se and Sb, instrument modifications are proposed and a development software package is described. (Author)

  14. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    Science.gov (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided-wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified, and the degree of cure was monitored using metrics such as the amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four-ply unidirectional composite panel fabricated from Hexcel® IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and that the TOA curve possessed an inverse relationship with the degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to in-process cure monitoring in an autoclave, defect detection during cure, and ultimately closed-loop process control to maximize composite part quality and consistency.
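    A time of arrival like the TOA metric above is commonly estimated by cross-correlating the received waveform with the excitation. The sketch below is a generic illustration under assumed values (1 MHz sampling, a Hann-windowed 100 kHz tone burst, an injected 50-sample delay); it is not the NASA system's processing chain.

    ```python
    import numpy as np

    def time_of_arrival(excitation, received, fs):
        """Estimate time of arrival (in seconds) as the lag that maximises
        the cross-correlation between the excitation and received waves."""
        corr = np.correlate(received, excitation, mode="full")
        lag = corr.argmax() - (len(excitation) - 1)  # zero lag sits at len-1
        return lag / fs

    fs = 1_000_000                       # 1 MHz sampling rate, assumed
    t = np.arange(0, 200e-6, 1 / fs)
    tone = np.sin(2 * np.pi * 100e3 * t) * np.hanning(t.size)  # 100 kHz burst
    delay = 50                           # samples of propagation delay
    received = np.concatenate([np.zeros(delay), 0.3 * tone])   # attenuated copy

    print(time_of_arrival(tone, received, fs))  # recovers the 50-sample delay: 5e-05 s
    ```

    During cure, this TOA would be tracked over successive interrogations; its drift with degree of cure is the inverse relationship the abstract describes.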

  15. What externally presented information do VRUs require when interacting with fully Automated Road Transport Systems in shared space?

    Science.gov (United States)

    Merat, Natasha; Louw, Tyron; Madigan, Ruth; Wilbrink, Marc; Schieben, Anna

    2018-03-31

    As the desire for deploying automated ("driverless") vehicles increases, there is a need to understand how they might communicate with other road users in a mixed-traffic, urban setting. In the absence of an active and responsible human controller in the driving seat, who might currently communicate with other road users in uncertain or conflicting situations, a driverless car's behaviour and intentions will in the future need to be relayed via easily comprehensible, intuitive and universally intelligible means, perhaps presented externally via new vehicle interfaces. This paper reports the results of a questionnaire-based study, delivered to 664 participants recruited during live demonstrations of an Automated Road Transport System (ARTS; SAE Level 4) in three European cities. The questionnaire sought the views of pedestrians and cyclists, focussing on whether respondents felt safe interacting with ARTS in shared space, and on what externally presented travel behaviour information from the ARTS was important to them. Results showed that most pedestrians felt safer when the ARTS were travelling in designated lanes rather than in shared space, and the majority believed they had priority over the ARTS in the absence of such infrastructure. Regardless of lane demarcations, all respondents highlighted the importance of receiving some communication about the behaviour of the ARTS, with acknowledgement of their detection by the vehicle being the most important message. There were no clear patterns across the respondents regarding the preferred modality for these external messages, with cultural and infrastructural differences thought to govern responses. Generally, however, conventional signals (lights and beeps) were preferred to text-based messages and spoken words. The results suggest that until these driverless vehicles are able to provide universally comprehensible externally presented information or messages during interaction

  16. Automated Demand Response Approaches to Household Energy Management in a Smart Grid Environment

    Science.gov (United States)

    Adika, Christopher Otieno

    The advancement of renewable energy technologies and the deregulation of the electricity market have seen the emergence of demand response (DR) programs. Demand response is a cost-effective load management strategy which enables electricity suppliers to maintain the integrity of the power grid during high-peak periods, when the customers' electrical load is high. DR programs are designed to influence electricity users to alter their normal consumption patterns by offering them financial incentives. A well-designed, incentive-based DR scheme offering a competitive electricity pricing structure can result in numerous benefits to all the players in the electricity market. Lower power consumption during peak periods will significantly enhance the robustness of constrained networks by reducing the generation and transmission infrastructure needed to provide electric service. This eases the pressure to build new power networks and avoids costly energy procurements, translating into substantial financial savings for power suppliers. Peak load reduction will also reduce the inconveniences suffered by end users as a result of brownouts or blackouts. Demand response will also drastically lower the price peaks associated with wholesale markets, which in turn reduces electricity costs and risks for all players in the energy market. Additionally, DR is environmentally friendly, since it enhances the flexibility of the power grid through accommodation of renewable energy resources. Despite its many benefits, DR has not been embraced by most electricity networks. This can be attributed to the fact that existing programs do not provide enough incentives to end users, and most electricity users are therefore not willing to participate in them. To overcome these challenges, most utilities are coming up with innovative strategies that will be more attractive to their customers. Thus, this dissertation presents various

  17. Quantification of common carotid artery and descending aorta vessel wall thickness from MR vessel wall imaging using a fully automated processing pipeline.

    Science.gov (United States)

    Gao, Shan; van 't Klooster, Ronald; Brandts, Anne; Roes, Stijntje D; Alizadeh Dehnavi, Reza; de Roos, Albert; Westenberg, Jos J M; van der Geest, Rob J

    2017-01-01

    To develop and evaluate a method that can fully automatically identify the vessel wall boundaries and quantify wall thickness for both the common carotid artery (CCA) and the descending aorta (DAO) from axial magnetic resonance (MR) images. 3T MRI data acquired with a T1-weighted gradient-echo black-blood imaging sequence from carotid (39 subjects) and aorta (39 subjects) were used to develop and test the algorithm. Vessel wall segmentation was achieved by fitting a 3D cylindrical B-spline surface to the boundaries of the lumen and the outer wall, respectively. The tube fitting was based on edge detection performed on the signal intensity (SI) profile along the surface normal. To achieve a fully automated process, a Hough Transform (HT) was developed to estimate the lumen centerline and radii of the target vessel. Using the outputs of the HT, a tube model for lumen segmentation was initialized and deformed to fit the image data. Finally, the lumen segmentation was dilated to initialize the adaptation procedure of the outer wall tube. The algorithm was validated by determining: 1) its performance against manual tracing; 2) its interscan reproducibility in quantifying vessel wall thickness (VWT); 3) its capability of detecting VWT differences in hypertensive patients compared with healthy controls. Statistical analysis, including Bland-Altman analysis, t-tests, and sample size calculation, was performed for the purpose of algorithm evaluation. The mean distance between the manual and automatically detected lumen/outer wall contours was 0.00 ± 0.23/0.09 ± 0.21 mm for the CCA and 0.12 ± 0.24/0.14 ± 0.35 mm for the DAO. No significant difference was observed between the interscan VWT assessments using automated segmentation for either the CCA (P = 0.19) or the DAO (P = 0.94). Both manual and automated segmentation detected significantly higher carotid (P = 0.016 and P = 0.005) and aortic (P < 0.001 and P = 0.021) wall thickness in the hypertensive patients. A reliable and reproducible pipeline for fully
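    The Hough transform step (estimating a lumen centre and radius to seed the tube model) can be illustrated with a minimal voting implementation on synthetic edge points. This is a generic circular-HT sketch, not the paper's code; the image size, candidate radii and circle parameters are made up.

    ```python
    import numpy as np

    def hough_circle(edge_points, shape, radii):
        """Minimal circular Hough transform: each edge point votes for the
        centres of circles of every candidate radius passing through it.
        Stands in for the HT initialisation of the lumen tube model."""
        h, w = shape
        acc = np.zeros((len(radii), h, w), dtype=np.int32)
        thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
        for y, x in edge_points:
            for ri, r in enumerate(radii):
                cx = np.round(x - r * np.cos(thetas)).astype(int)
                cy = np.round(y - r * np.sin(thetas)).astype(int)
                ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
                np.add.at(acc, (ri, cy[ok], cx[ok]), 1)  # accumulate votes
        ri, cy, cx = np.unravel_index(acc.argmax(), acc.shape)
        return radii[ri], (cy, cx)

    # Synthetic "lumen": edge points on a circle of radius 10 centred at (32, 40)
    ang = np.linspace(0, 2 * np.pi, 60, endpoint=False)
    pts = [(int(round(32 + 10 * np.sin(a))), int(round(40 + 10 * np.cos(a))))
           for a in ang]
    r, (cy, cx) = hough_circle(pts, (64, 64), radii=[8, 10, 12])
    print(r, cy, cx)  # votes cluster at radius 10, centre near (32, 40)
    ```

    In the actual pipeline, the edge points would come from the black-blood image gradients, and the winning centre/radius would seed the deformable B-spline tube rather than be the final answer.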

  18. A fully automated non-external marker 4D-CT sorting algorithm using a serial cine scanning protocol.

    Science.gov (United States)

    Carnes, Greg; Gaede, Stewart; Yu, Edward; Van Dyk, Jake; Battista, Jerry; Lee, Ting-Yim

    2009-04-07

    Current 4D-CT methods require external marker data to retrospectively sort image data and generate CT volumes. In this work we develop an automated 4D-CT sorting algorithm that performs without the aid of data collected from an external respiratory surrogate. The sorting algorithm requires an overlapping cine scan protocol, which provides a spatial link between couch positions. Beginning with a starting scan position, images from the adjacent scan position (which spatially match the starting scan position) are selected by maximizing the normalized cross correlation (NCC) of the images at the overlapping slice position. The process is continued by 'daisy chaining' all couch positions using the selected images until an entire 3D volume is produced. The algorithm produced 16 phase volumes to complete a 4D-CT dataset. Additional 4D-CT datasets were also produced using external marker amplitude and phase angle sorting methods. The image quality of the volumes produced by the different methods was quantified by calculating the mean difference of the sorted overlapping slices from adjacent couch positions. The NCC-sorted images showed a significant decrease in the mean difference (p < 0.01) for the five patients.
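    The NCC-based daisy-chaining step can be sketched as follows, assuming each cine image exposes its two overlapping slices; `ncc`, `daisy_chain`, and the `'top'`/`'bottom'` dictionary layout are illustrative simplifications, not the authors' code:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation between two equally sized slices."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def daisy_chain(positions):
    """positions: one list of candidate cine images per couch position; each
    candidate is a dict {'top': ndarray, 'bottom': ndarray} holding the slices
    at the overlapping locations (a hypothetical simplification).
    Starting from the first couch position, pick at each subsequent position
    the candidate whose top overlap slice best matches the bottom overlap
    slice of the previously selected image."""
    chain = [positions[0][0]]
    for candidates in positions[1:]:
        prev = chain[-1]
        chain.append(max(candidates, key=lambda c: ncc(prev['bottom'], c['top'])))
    return chain
```

    Running the chain once per respiratory phase would yield the 16 phase volumes described above.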

  19. Fully Automated On-Chip Imaging Flow Cytometry System with Disposable Contamination-Free Plastic Re-Cultivation Chip

    Directory of Open Access Journals (Sweden)

    Tomoyuki Kaneko

    2011-06-01

    Full Text Available We have developed a novel imaging cytometry system using a poly(methyl methacrylate) (PMMA)-based microfluidic chip. The system was contamination-free, because sample suspensions contacted only the flammable PMMA chip and no other component of the system. The transparency and low fluorescence of PMMA were suitable for microscopic imaging of cells flowing through microchannels on the chip. Sample particles flowing through microchannels on the chip were discriminated by an image-recognition unit with a high-speed camera in real time at a rate of 200 events/s; e.g., microparticles 2.5 μm and 3.0 μm in diameter were differentiated with an error rate of less than 2%. Desired cells were separated automatically from other cells by electrophoretic or dielectrophoretic force one by one with a separation efficiency of 90%. Cells in suspension with fluorescent dye were separated using the same kind of microfluidic chip. A 5 μL sample containing 1 × 10^6 particles/mL was processed within 40 min. Separated cells could be cultured on the microfluidic chip without contamination. The whole sample-handling operation was automated using a 3D micropipetting system. These results show that the novel imaging flow cytometry system is practically applicable for biological research and clinical diagnostics.

  20. Inertial Microfluidic Cell Stretcher (iMCS): Fully Automated, High-Throughput, and Near Real-Time Cell Mechanotyping.

    Science.gov (United States)

    Deng, Yanxiang; Davis, Steven P; Yang, Fan; Paulsen, Kevin S; Kumar, Maneesh; Sinnott DeVaux, Rebecca; Wang, Xianhui; Conklin, Douglas S; Oberai, Assad; Herschkowitz, Jason I; Chung, Aram J

    2017-07-01

    Mechanical biomarkers associated with cytoskeletal structures have been reported as powerful label-free cell state identifiers. To measure cell mechanical properties, traditional biophysical approaches (e.g., atomic force microscopy, micropipette aspiration, optical stretchers) and microfluidic approaches have mainly been employed; however, they suffer from low throughput, low sensitivity, and/or time-consuming, labor-intensive processes, which prevents their practical use in cell biology research applications. Here, a novel inertial microfluidic cell stretcher (iMCS) capable of characterizing the deformability of large single-cell populations in near real time is presented. The platform inertially controls cell positions in microchannels and deforms cells with large strain upon collision at a T-junction. The cell elongation motions are recorded, and deformability information for thousands of cells is visualized in near real time, similar to traditional flow cytometry. With full automation, the entire cell mechanotyping process runs without any human intervention, realizing a user-friendly and robust operation. Through iMCS, distinct cell stiffness changes in breast cancer progression and the epithelial-mesenchymal transition are reported, and the use of the platform for rapid cancer drug discovery is shown as well. The platform returns quantitative mechanical properties (e.g., shear modulus) of large single-cell populations on the fly with high statistical significance, enabling practical use in clinical and biophysical studies. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    International Nuclear Information System (INIS)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M

    2016-01-01

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types, using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, yielding a similarity metric value for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the highest local mean similarity, whose neighboring slices meet the threshold requirement, is chosen as the classifier's matched slice (if one exists). The classifier whose matched slice has the highest local mean similarity is then chosen as the ensemble's best match. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include parallel
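    The ensemble matching logic described in Methods can be sketched roughly as below; `match_slice`, `classify_phantom`, the NCC similarity metric, and the threshold handling are assumptions for illustration, not the actual system:

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation, used here as the slice similarity metric."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float((a * b).mean())

def match_slice(similarities, thresh, w=1):
    """Return (index, local_mean) of the slice whose local neighborhood
    (w slices on each side) all exceed `thresh` and whose local mean
    similarity is highest, or None if no slice qualifies."""
    s = np.asarray(similarities, dtype=float)
    best = None
    for i in range(w, len(s) - w):
        window = s[i - w:i + w + 1]
        if window.min() >= thresh:
            m = window.mean()
            if best is None or m > best[1]:
                best = (i, m)
    return best

def classify_phantom(scan_slices, classifiers):
    """classifiers: list of (name, template_slice, thresh) tuples, one per
    phantom type. Each classifier scores every scan slice against its
    template; the ensemble returns (name, slice_index, local_mean) for the
    classifier with the highest qualifying local mean, or None."""
    winner = None
    for name, template, thresh in classifiers:
        sims = [ncc(sl, template) for sl in scan_slices]
        hit = match_slice(sims, thresh)
        if hit and (winner is None or hit[1] > winner[2]):
            winner = (name, hit[0], hit[1])
    return winner
```

    The winning classifier would then select the QA algorithm and ROI set used to extract the image quality measures.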

  2. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    Energy Technology Data Exchange (ETDEWEB)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M [UCLA Radiological Sciences, Los Angeles, CA (United States)

    2016-06-15

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types, using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, yielding a similarity metric value for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the highest local mean similarity, whose neighboring slices meet the threshold requirement, is chosen as the classifier's matched slice (if one exists). The classifier whose matched slice has the highest local mean similarity is then chosen as the ensemble's best match. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include parallel

  3. SU-D-BRD-06: Creating a Safety Net for a Fully Automated, Script Driven Electronic Medical Record

    Energy Technology Data Exchange (ETDEWEB)

    Sheu, R; Ghafar, R; Powers, A; Green, S; Lo, Y [Mount Sinai Medical Center, New York, NY (United States)

    2015-06-15

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the need for an initial physics chart check can be determined by screening all patients on treatment who have received their first fraction but have not yet had their first chart check. Monitoring such “real” events with our in-house software creates a safety net, since its propagation does not rely on individual users' input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 fractions, warning; >5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which gives the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notifications of treatment record inconsistencies and machine overrides have decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality alone is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks, providing a safer and nearly error-free working environment.
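    The chart-check screening example can be expressed as a simple SQL query. The schema below is entirely hypothetical (the real Mosaiq database uses different tables and columns) and only illustrates the safety-net logic of querying the database directly rather than relying on user-completed QCLs:

```python
import sqlite3

# Hypothetical, simplified schema -- NOT the real Mosaiq schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE treatments (patient_id INTEGER, fraction INTEGER);
CREATE TABLE chart_checks (patient_id INTEGER, check_type TEXT);
INSERT INTO treatments VALUES (1, 1), (1, 2), (2, 1), (3, 1);
INSERT INTO chart_checks VALUES (1, 'initial');
""")

# Patients who have received at least one fraction but have no initial
# physics chart check on record -- these would be flagged on the dashboard.
needs_check = [row[0] for row in conn.execute("""
    SELECT DISTINCT t.patient_id
    FROM treatments t
    LEFT JOIN chart_checks c
      ON c.patient_id = t.patient_id AND c.check_type = 'initial'
    WHERE c.patient_id IS NULL
    ORDER BY t.patient_id
""")]
print(needs_check)  # patients 2 and 3
```

    Because the query reads treatment records directly, a flag cannot be lost by accidentally deleting or prematurely completing a QCL.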

  4. SU-D-BRD-06: Creating a Safety Net for a Fully Automated, Script Driven Electronic Medical Record

    International Nuclear Information System (INIS)

    Sheu, R; Ghafar, R; Powers, A; Green, S; Lo, Y

    2015-01-01

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the need for an initial physics chart check can be determined by screening all patients on treatment who have received their first fraction but have not yet had their first chart check. Monitoring such “real” events with our in-house software creates a safety net, since its propagation does not rely on individual users' input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 fractions, warning; >5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which gives the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs have been eliminated. Additionally, all physics chart checks were completed on time. Prompt notifications of treatment record inconsistencies and machine overrides have decreased the time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality alone is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks, providing a safer and nearly error-free working environment.

  5. Clinical Evaluation of Fully Automated Elecsys® Syphilis Assay for the Detection of Antibodies of Treponema pallidum.

    Science.gov (United States)

    Li, Dongdong; An, Jingna; Wang, Tingting; Tao, Chuanmin; Wang, Lanlan

    2016-11-01

    The resurgence of syphilis in recent years has become a serious threat to public health worldwide, and the serological detection of specific antibodies against Treponema pallidum (TP) remains the most reliable method for laboratory diagnosis of syphilis. The performance of the Elecsys ® Syphilis assay, a new electrochemiluminescence immunoassay (ECLIA), was assessed with a large number of samples in this study. In comparison with the InTec assay, the Elecsys ® Syphilis assay was evaluated in 146 preselected samples from patients with syphilis, 1803 clinical routine samples, and 175 preselected samples from specific populations with reportedly increased rates of false-positive syphilis test results. Discrepant samples were further investigated with the Mikrogen Syphilis recomLine assay. There was an overall agreement of 99.58% between the two assays (Kappa = 0.975). The sensitivity and specificity of the Elecsys ® Syphilis assay were 100.0% (95% CI, 96.8-100.0%) and 99.8% (95% CI, 99.5-100.0%), respectively. The Elecsys ® Syphilis assay displayed better sensitivity (100%), specificity (99.8%), PPV (98.7%), and NPV (100%) in the 2124 samples enrolled, compared with the InTec assay. Considering its excellent ease of use and automation, high throughput, and superior sensitivity, especially in primary syphilis, the Elecsys ® Syphilis assay could represent an outstanding choice for syphilis screening in high-volume laboratories. However, caution is still needed, and results may require confirmation by other treponemal immunoassays, when the new Elecsys ® Syphilis assay is applied to patients with malignant neoplasms or HIV infection. © 2016 Wiley Periodicals, Inc.
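    The reported agreement statistics follow from the standard 2x2 contingency-table formulas, sketched here with illustrative counts (not the study's actual table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, NPV, and Cohen's kappa from a 2x2 table
    (tp/fp/fn/tn counted against the comparator assay or reference result)."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    po = (tp + tn) / n  # observed agreement between the two assays
    # Chance agreement from the marginal positive/negative rates of each assay.
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, ppv, npv, kappa
```

    With the study's counts, the same formulas yield the quoted 99.58% agreement and kappa of 0.975.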

  6. Fully automated synthesis of the M{sub 1} receptor agonist [{sup 11}C]GSK1034702 for clinical use on an Eckert and Ziegler Modular Lab system

    Energy Technology Data Exchange (ETDEWEB)

    Huiban, Mickael, E-mail: Mickael.x.huiban@gsk.com [GlaxoSmithKline, Clinical Imaging Centre, Imperial College London, Hammersmith Hospital, Du Cane Road, London, W12 0NN (United Kingdom); Pampols-Maso, Sabina; Passchier, Jan [GlaxoSmithKline, Clinical Imaging Centre, Imperial College London, Hammersmith Hospital, Du Cane Road, London, W12 0NN (United Kingdom)

    2011-10-15

    A fully automated and GMP-compatible synthesis has been developed to reliably label the M{sub 1} receptor agonist GSK1034702 with carbon-11. Stille reaction of the trimethylstannyl precursor with [{sup 11}C]methyl iodide afforded [{sup 11}C]GSK1034702 in an estimated 10{+-}3% decay-corrected yield. This method utilises commercially available modular laboratory equipment and provides high-purity [{sup 11}C]GSK1034702 in a formulation suitable for human use. - Highlights: > Preparation of [{sup 11}C]GSK1034702 through a Stille cross-coupling reaction. > Demonstration of the applicability of commercially available modules for the synthesis of non-standard PET tracers. > Definition of specifications for heavy-metal content in the final dose product. > Presentation of results from validation of the manufacturing process.

  7. FULLY AUTOMATED GIS-BASED INDIVIDUAL TREE CROWN DELINEATION BASED ON CURVATURE VALUES FROM A LIDAR DERIVED CANOPY HEIGHT MODEL IN A CONIFEROUS PLANTATION

    Directory of Open Access Journals (Sweden)

    R. J. L. Argamosa

    2016-06-01

    Full Text Available The generation of a high resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crowns by means of a fully-automated method using the CHM's curvature, derived through its slope. The local maxima are obtained by taking the maximum raster value in a 3 m x 3 m cell. These values are assumed to be tree tops and are therefore considered individual trees. Based on this assumption, Thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. The absence of field data on tree crown dimensions required careful visual assessment after the delineated tree crown polygons were superimposed on the hillshaded CHM.
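    The local-maxima step can be sketched as a window scan over the CHM raster; the function name, window handling, and the `min_height` cutoff are assumptions for illustration (the full method additionally uses Thiessen polygons and negative profile curvature for crown delineation):

```python
import numpy as np

def tree_tops(chm, cell=3, min_height=2.0):
    """Treat each per-window maximum of the canopy height model as a tree top.

    chm: 2D array of canopy heights (m), with pixel size chosen so that
    `cell` pixels correspond to the 3 m window; min_height filters out
    ground/shrub returns (an assumed cutoff). Returns (row, col, height)."""
    tops = []
    h, w = chm.shape
    for r0 in range(0, h - cell + 1, cell):
        for c0 in range(0, w - cell + 1, cell):
            block = chm[r0:r0 + cell, c0:c0 + cell]
            i = np.unravel_index(block.argmax(), block.shape)
            if block[i] >= min_height:
                tops.append((r0 + i[0], c0 + i[1], float(block[i])))
    return tops
```

    Each returned point would then seed a Thiessen polygon bounding that tree's possible crown extent.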

  8. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments

    Science.gov (United States)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2016-04-01

    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly) and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom, and along an elevational transect. The analysis of snow meltwater is also important. Because collecting snow meltwater samples in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon-ECRN-100 High Resolution Rain Gauges as the standard component, which allows monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are collected regularly and afterwards analyzed for their isotopic composition in the lab. Snowmelt events as well as system status can be monitored in real time.
In our presentation we describe the automatic snow lysimeter

  9. Rapid detection of enterovirus in cerebrospinal fluid by a fully-automated PCR assay is associated with improved management of aseptic meningitis in adult patients.

    Science.gov (United States)

    Giulieri, Stefano G; Chapuis-Taillard, Caroline; Manuel, Oriol; Hugli, Olivier; Pinget, Christophe; Wasserfallen, Jean-Blaise; Sahli, Roland; Jaton, Katia; Marchetti, Oscar; Meylan, Pascal

    2015-01-01

    Enterovirus (EV) is the most frequent cause of aseptic meningitis (AM). Lack of microbiological documentation results in unnecessary antimicrobial therapy and hospitalization. To assess the impact of rapid EV detection in cerebrospinal fluid (CSF) by a fully-automated PCR (GeneXpert EV assay, GXEA) on the management of AM. Observational study in adult patients with AM. Three groups were analyzed according to EV documentation in CSF: group A = no PCR or negative PCR (n=17), group B = positive real-time PCR (n = 20), and group C = positive GXEA (n = 22). Clinical, laboratory, and health-care cost data were compared. Clinical characteristics were similar in the 3 groups. Median turn-around time of EV PCR decreased from 60 h (IQR (interquartile range) 44-87) in group B to 5 h (IQR 4-11) in group C (p<0.0001). Median duration of antibiotics was 1 day (IQR 0-6), 1 day (0-1.9), and 0.5 days (single dose) in groups A, B, and C, respectively (p < 0.001). Median length of hospitalization was 4 days (2.5-7.5), 2 (1-3.7), and 0.5 (0.3-0.7), respectively (p < 0.001). Median hospitalization costs were $5458 (2676-6274) in group A, $2796 (2062-5726) in group B, and $921 (765-1230) in group C (p < 0.0001). Rapid EV detection in CSF by a fully-automated PCR improves management of AM by significantly reducing antibiotic use, hospitalization length and costs. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo [Nishi-Kobe Medical Center (Japan); Yonekura, Yoshiharu [Fukui Medical Univ., Matsuoka (Japan); Konishi, Junji [Kyoto Univ. (Japan). Graduate School of Medicine

    2003-03-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer ({sup 99m}Tc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T{sub 2}-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using {sup 99m}Tc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  11. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    International Nuclear Information System (INIS)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo; Yonekura, Yoshiharu; Konishi, Junji

    2003-01-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer ( 99m Tc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T 2 -weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using 99m Tc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  12. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Wu Binbin, E-mail: binbin.wu@gunet.georgetown.edu [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); McNutt, Todd [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Zahurak, Marianna [Department of Oncology Biostatistics, Johns Hopkins University, Baltimore, Maryland (United States); Simari, Patricio [Autodesk Research, Toronto, ON (Canada); Pang, Dalong [Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); Taylor, Russell [Department of Computer Science, Johns Hopkins University, Baltimore, Maryland (United States); Sanguineti, Giuseppe [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States)

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 Gy (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  13. Fully Automated Simultaneous Integrated Boosted–Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-01-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 Gy (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  14. Parameter evaluation and fully-automated radiosynthesis of [11C]harmine for imaging of MAO-A for clinical trials

    International Nuclear Information System (INIS)

    Philippe, C.; Zeilinger, M.; Mitterhauser, M.; Dumanic, M.; Lanzenberger, R.; Hacker, M.; Wadsak, W.

    2015-01-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [11C]harmine for clinical trials. The following parameters were investigated: amount of base, precursor concentration, solvent, reaction temperature, and time. The optimum reaction conditions were determined to be 2–3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80 °C reaction temperature, and 2 min reaction time. Under these conditions 6.1±1 GBq (51.0±11% based on [11C]CH3I, corrected for decay) of [11C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully-automated synthesis method can be used as a routine set-up. - Highlights: • Preparation of [11C]harmine on a commercially available synthesizer for routine application. • High reliability: only 4 of 72 syntheses failed (≈5%), due to technical problems. • High yields: 6.1±1 GBq overall yield (EOS). • High specific activities: 101.32±28.2 GBq/µmol

  15. Parameter evaluation and fully-automated radiosynthesis of [(11)C]harmine for imaging of MAO-A for clinical trials.

    Science.gov (United States)

    Philippe, C; Zeilinger, M; Mitterhauser, M; Dumanic, M; Lanzenberger, R; Hacker, M; Wadsak, W

    2015-03-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [(11)C]harmine for clinical trials. The following parameters were investigated: amount of base, precursor concentration, solvent, reaction temperature, and time. The optimum reaction conditions were determined to be 2-3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80 °C reaction temperature, and 2 min reaction time. Under these conditions 6.1±1 GBq (51.0±11% based on [(11)C]CH3I, corrected for decay) of [(11)C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully-automated synthesis method can be used as a routine set-up. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  16. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images.

    Science.gov (United States)

    Serag, Ahmed; Macnaught, Gillian; Denison, Fiona C; Reynolds, Rebecca M; Semple, Scott I; Boardman, James P

    2017-01-01

    Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of the fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of its containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved complete brain localization in 96% of cases on a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest-based regression methods, and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.
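The localization scheme described above (extend HOG to 3D, score every sliding window, keep the best) can be sketched as follows. This is a simplified illustration only: azimuth-only orientation binning rather than full 3D orientations, and a caller-supplied scoring function standing in for the paper's trained classifier:

```python
import numpy as np

def hog3d(vol, bins=8):
    """Histogram of oriented gradients for a 3D patch (simplified:
    bins only the in-plane azimuth angle, weighted by gradient magnitude)."""
    gz, gy, gx = np.gradient(vol.astype(float))
    mag = np.sqrt(gx**2 + gy**2 + gz**2)
    az = np.arctan2(gy, gx)  # azimuth in [-pi, pi]
    hist, _ = np.histogram(az, bins=bins, range=(-np.pi, np.pi), weights=mag)
    s = hist.sum()
    return hist / s if s > 0 else hist

def best_window(vol, win, step, score_fn):
    """Slide a 3D window over the volume; return the highest-scoring
    corner coordinate and its score. score_fn maps a HOG vector to a score
    (in the paper this role is played by a trained model)."""
    best = (None, -np.inf)
    for z in range(0, vol.shape[0] - win[0] + 1, step):
        for y in range(0, vol.shape[1] - win[1] + 1, step):
            for x in range(0, vol.shape[2] - win[2] + 1, step):
                patch = vol[z:z+win[0], y:y+win[1], x:x+win[2]]
                s = score_fn(hog3d(patch))
                if s > best[1]:
                    best = ((z, y, x), s)
    return best
```
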

  17. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images

    Directory of Open Access Journals (Sweden)

    Ahmed Serag

    2017-01-01

    Full Text Available Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved 96% complete brain localization using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest-based regression methods and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.

  18. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    Science.gov (United States)

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  19. Fully Automated Robust System to Detect Retinal Edema, Central Serous Chorioretinopathy, and Age Related Macular Degeneration from Optical Coherence Tomography Images

    Directory of Open Access Journals (Sweden)

    Samina Khalid

    2017-01-01

    Full Text Available Maculopathy is the excessive damage to the macula that leads to blindness. It mostly occurs due to retinal edema (RE), central serous chorioretinopathy (CSCR), or age-related macular degeneration (ARMD). Optical coherence tomography (OCT) imaging is the latest eye testing technique that can detect these syndromes in early stages. Many researchers have used OCT images to detect retinal abnormalities. However, to the best of our knowledge, no research that presents a fully automated system to detect all of these macular syndromes is reported. This paper presents the world’s first decision support system to automatically detect RE, CSCR, and ARMD retinal pathologies and healthy retina from OCT images. The automated disease diagnosis in our proposed system is based on a multilayered support vector machine (SVM) classifier trained on 40 labeled OCT scans (10 healthy, 10 RE, 10 CSCR, and 10 ARMD). After training, the SVM forms an accurate decision about the type of retinal pathology using 9 extracted features. We have tested our proposed system on 2819 OCT scans (1437 healthy, 640 RE, and 742 CSCR) of 502 patients from two different datasets, and our proposed system correctly diagnosed 2817/2819 subjects with accuracy, sensitivity, and specificity ratings of 99.92%, 100%, and 99.86%, respectively.
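The multilayered classification described above can be illustrated as a cascade: a first SVM separates healthy from diseased scans, and a second stage assigns the disease subtype. The sketch below uses scikit-learn SVMs on precomputed feature vectors; the class structure, toy features, and labels are hypothetical simplifications of the paper's 9-feature system, and the OCT feature extraction itself is not shown:

```python
import numpy as np
from sklearn.svm import SVC

class LayeredSVM:
    """Cascade of SVMs: first healthy (0) vs. diseased, then the disease
    subtype (1 = RE, 2 = CSCR, 3 = ARMD). Inputs are feature vectors."""
    def __init__(self):
        self.gate = SVC(kernel="linear")      # healthy vs. diseased
        self.subtype = SVC(kernel="linear")   # disease subtype

    def fit(self, X, y):
        X, y = np.asarray(X, float), np.asarray(y, int)
        self.gate.fit(X, (y > 0).astype(int))   # binary gate
        self.subtype.fit(X[y > 0], y[y > 0])    # subtype on diseased only
        return self

    def predict(self, X):
        X = np.asarray(X, float)
        out = np.zeros(len(X), dtype=int)
        sick = self.gate.predict(X) == 1
        if sick.any():
            out[sick] = self.subtype.predict(X[sick])
        return out
```
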

  20. Fully automated synthesis of (phospho)peptide arrays in microtiter plate wells provides efficient access to protein tyrosine kinase characterization

    Directory of Open Access Journals (Sweden)

    Goldstein David J

    2005-01-01

    Full Text Available Abstract Background Synthetic peptides have played a useful role in studies of protein kinase substrates and interaction domains. Synthetic peptide arrays and libraries, in particular, have accelerated the process. Several factors have hindered or limited the applicability of various techniques, such as the need for deconvolution of combinatorial libraries, the inability or impracticality of achieving full automation using two-dimensional or pin solid phases, the lack of convenient interfacing with standard analytical platforms, or the difficulty of compartmentalization of a planar surface when contact between assay components needs to be avoided. This paper describes a process for synthesis of peptides and phosphopeptides on microtiter plate wells that overcomes previous limitations and demonstrates utility in determination of the epitope of an autophosphorylation site phospho-motif antibody and utility in substrate utilization assays of the protein tyrosine kinase, p60c-src. Results The overall reproducibility of phosphopeptide synthesis and multiplexed EGF receptor (EGFR) autophosphorylation site (pY1173) antibody ELISA (9H2) was within 5.5 to 8.0%. Mass spectrometric analyses of the released (phospho)peptides showed homogeneous peaks of the expected molecular weights. An overlapping peptide array of the complete EGFR cytoplasmic sequence revealed a high redundancy of 9H2 reactive sites. The eight reactive phosphopeptides were structurally related and, interestingly, the most conserved antibody-reactive peptide motif coincided with a subset of other known EGFR autophosphorylation and SH2 binding motifs and an EGFR optimal substrate motif. Finally, peptides based on known substrate specificities of c-Src and related enzymes were synthesized in microtiter plate array format and were phosphorylated by c-Src with the predicted specificities. The level of phosphorylation was proportional to c-Src concentration with sensitivities below 0.1 Units of

  1. Automated Metadata Formatting for Cornell’s Print-on-Demand Books

    Directory of Open Access Journals (Sweden)

    Dianne Dietrich

    2009-11-01

    Full Text Available Cornell University Library has made Print-on-Demand (POD) books available for many of its digitized out-of-copyright books. The printer must be supplied with metadata from the MARC bibliographic record in order to produce book covers. Although the names of authors are present in MARC records, they are given in an inverted order suitable for alphabetical filing rather than the natural order that is desirable for book covers. This article discusses a process for parsing and manipulating the MARC author strings to identify their various component parts and to create natural-order strings. In particular, the article focuses on processing non-name information in author strings, such as titles that were commonly used in older works, e.g., baron or earl, and suffixes appended to names, e.g., "of Bolsena." Relevant patterns are identified and a Python script is used to manipulate the author name strings.
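The core transformation (inverted MARC filing order to natural cover order) can be sketched in Python. This is a hypothetical simplification of the article's script: it handles only comma-split surname/forename/suffix strings and trailing date subfields, whereas real records require many more patterns (titles such as "baron", "of ..." suffixes, etc.):

```python
import re

# Illustrative suffix list; real MARC data needs a much richer set.
SUFFIXES = {"Jr", "Jr.", "Sr", "Sr.", "II", "III", "IV"}

def natural_order(inverted):
    """Convert an inverted MARC 100$a-style name to natural order."""
    # Drop a trailing date subfield like ", 1812-1870." if present.
    s = re.sub(r",\s*\d{4}-(\d{4})?\.?$", "", inverted.strip())
    s = s.rstrip(",.")
    parts = [p.strip() for p in s.split(",")]
    if len(parts) == 1:
        return parts[0]                      # already a single token
    surname, forename = parts[0], parts[1]
    suffix = next((p for p in parts[2:] if p in SUFFIXES), None)
    name = f"{forename} {surname}"
    return f"{name}, {suffix}" if suffix else name

natural_order("Dickens, Charles, 1812-1870.")  # → "Charles Dickens"
```
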

  2. Comparison of two theory-based, fully automated telephone interventions designed to maintain dietary change in healthy adults: study protocol of a three-arm randomized controlled trial.

    Science.gov (United States)

    Wright, Julie A; Quintiliani, Lisa M; Turner-McGrievy, Gabrielle M; Migneault, Jeffrey P; Heeren, Timothy; Friedman, Robert H

    2014-11-10

    Health behavior change interventions have focused on obtaining short-term intervention effects; few studies have evaluated mid-term and long-term outcomes, and even fewer have evaluated interventions that are designed to maintain and enhance initial intervention effects. Moreover, behavior theory has not been developed for maintenance or applied to maintenance intervention design to the degree that it has for behavior change initiation. The objective of this paper is to describe a study that compared two theory-based interventions (social cognitive theory [SCT] vs goal systems theory [GST]) designed to maintain previously achieved improvements in fruit and vegetable (F&V) consumption. The interventions used tailored, interactive conversations delivered by a fully automated telephony system (Telephone-Linked Care [TLC]) over a 6-month period. The TLC maintenance intervention based on SCT used a skills-based approach to build self-efficacy. It assessed confidence in and barriers to eating F&V, and provided feedback on how to overcome barriers, plan ahead, and set goals. The TLC maintenance intervention based on GST used a cognitive-based approach. Conversations trained participants in goal management to help them integrate their newly acquired dietary behavior into their hierarchical system of goals. Content included goal facilitation, conflict, shielding, and redundancy, and reflection on personal goals and priorities. To evaluate and compare the two approaches, a sample of adults whose F&V consumption was below public health goal levels was recruited from a large urban area to participate in a fully automated telephony intervention (TLC-EAT) for 3-6 months. Participants who increase their daily intake of F&V by ≥1 serving/day will be eligible for the three-arm randomized controlled trial. A sample of 405 participants will be randomized to one of three arms: (1) an assessment-only control, (2) TLC-SCT, and (3) TLC-GST. The maintenance interventions last 6 months. All 405

  3. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    Science.gov (United States)

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a significant economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were with the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase in specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
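The threshold-cycle cutoff described above amounts to reporting late amplification as negative to suppress nonspecific vanB signals. A minimal sketch (the cutoff of 35 comes from the abstract; the function name and the None-for-no-amplification convention are illustrative):

```python
def call_vanB(ct, cutoff=35.0):
    """Interpret a vanB real-time PCR threshold cycle (Ct).

    Amplification later than the cutoff is reported negative, since very
    late signals are likely nonspecific. ct=None means no amplification.
    """
    if ct is None:
        return "negative"
    return "positive" if ct <= cutoff else "negative"

call_vanB(28.4)   # → "positive"
call_vanB(37.2)   # → "negative" (late, treated as nonspecific)
```
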

  4. Fully Automated Segmentation of Fluid/Cyst Regions in Optical Coherence Tomography Images With Diabetic Macular Edema Using Neutrosophic Sets and Graph Algorithms.

    Science.gov (United States)

    Rashno, Abdolreza; Koozekanani, Dara D; Drayna, Paul M; Nazari, Behzad; Sadri, Saeed; Rabbani, Hossein; Parhi, Keshab K

    2018-05-01

    This paper presents a fully automated algorithm to segment fluid-associated (fluid-filled) and cyst regions in optical coherence tomography (OCT) retina images of subjects with diabetic macular edema. The OCT image is segmented using a novel neutrosophic transformation and a graph-based shortest path method. In the neutrosophic domain, an image is transformed into three sets: T (true), I (indeterminate), which represents noise, and F (false). This paper makes four key contributions. First, a new method is introduced to compute the indeterminacy set I, and a new correction operation is introduced to compute the set T in the neutrosophic domain. Second, a graph shortest-path method is applied in the neutrosophic domain to segment the inner limiting membrane and the retinal pigment epithelium as regions of interest (ROI), and the outer plexiform layer and inner segment myeloid as middle layers, using a novel definition of the edge weights. Third, a new cost function for cluster-based fluid/cyst segmentation in the ROI is presented, which also includes a novel approach to estimating the number of clusters in an automated manner. Fourth, the final fluid regions are obtained by ignoring very small regions and the regions between middle layers. The proposed method is evaluated using two publicly available datasets (Duke and Optima) and a third local dataset from the UMN clinic, which is available online. The proposed algorithm outperforms the previously proposed Duke algorithm by 8% with respect to the dice coefficient and by 5% with respect to precision on the Duke dataset, while achieving about the same sensitivity. Also, the proposed algorithm outperforms a prior method for the Optima dataset by 6%, 22%, and 23% with respect to the dice coefficient, sensitivity, and precision, respectively. Finally, the proposed algorithm achieves sensitivities of 67.3%, 88.8%, and 76.7% for the Duke, Optima, and University of Minnesota (UMN) datasets, respectively.

  5. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    International Nuclear Information System (INIS)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M; Noo, F

    2016-01-01

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT_wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. 
Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range
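The batch operation described above (configuration files driving one job per dose/reconstruction combination) can be sketched as a simple job-expansion step. The config keys, module names, and parameters below are illustrative, not the authors' actual file format or API:

```python
import itertools

# Hypothetical configuration in the spirit of the pipeline's config files.
config = {
    "raw_data": "case001_projections.raw",
    "dose_levels": [1.0, 0.5, 0.25, 0.1],        # fraction of original dose
    "recons": [
        {"method": "wFBP", "kernel": "medium"},   # FreeCT_wFBP-style FBP
        {"method": "iterative", "iterations": 20} # model-based iterative
    ],
}

def expand_jobs(cfg):
    """One job per (dose level, reconstruction) combination, ready to be
    submitted to a batch queue or run in parallel on a cluster."""
    jobs = []
    for dose, recon in itertools.product(cfg["dose_levels"], cfg["recons"]):
        jobs.append({"input": cfg["raw_data"], "dose": dose, **recon})
    return jobs

jobs = expand_jobs(config)   # 4 dose levels x 2 recons = 8 jobs
```
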

  6. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    Energy Technology Data Exchange (ETDEWEB)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M [UCLA School of Medicine, Los Angeles, CA (United States); Noo, F [University of Utah, Salt Lake City, UT (United States)

    2016-06-15

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT_wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. 
Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range

  7. The possibility of a fully automated procedure for radiosynthesis of fluorine-18-labeled fluoromisonidazole using a simplified single, neutral alumina column purification procedure

    International Nuclear Information System (INIS)

    Nandy, Saikat; Rajan, M.G.R.; Korde, A.; Krishnamurthy, N.V.

    2010-01-01

    A novel fully automated radiosynthesis procedure for [18F]Fluoromisonidazole using a simple alumina cartridge-column for purification, instead of the conventionally used semi-preparative HPLC, was developed. [18F]FMISO was prepared via a one-pot, two-step synthesis procedure using a modified Nuclear Interface synthesis module. Nucleophilic fluorination of the precursor molecule 1-(2'-nitro-1'-imidazolyl)-2-O-tetrahydropyranyl-3-O-toluenesulphonylpropanediol (NITTP) with no-carrier-added [18F]fluoride was followed by hydrolysis of the protecting group with 1 M HCl. Purification was carried out using a single neutral alumina cartridge-column instead of semi-preparative HPLC. The maximum overall radiochemical yield obtained was 37.49±1.68% with 10 mg NITTP (n=3, without any decay correction), and the total synthesis time was 40±1 min. The radiochemical purity was greater than 95%, and the product was devoid of other chemical impurities, including residual aluminum and acetonitrile. The biodistribution study in a fibrosarcoma tumor model showed maximum uptake in tumor 2 h post injection. Finally, PET/CT imaging studies in a normal healthy rabbit showed clear uptake in the organs involved in the metabolic process of MISO. No bone uptake was observed, excluding the presence of free [18F]fluoride. The reported method can be easily adapted in any commercial FDG synthesis module.
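The yield above is reported without decay correction. Back-correcting a measured activity to the start of synthesis uses the fluorine-18 half-life (109.77 min); a minimal sketch with illustrative numbers, not the study's measurements:

```python
import math

F18_HALF_LIFE_MIN = 109.77  # fluorine-18 half-life in minutes

def decay_correct(activity_gbq, elapsed_min, half_life=F18_HALF_LIFE_MIN):
    """Back-correct a measured activity to a reference time elapsed_min
    earlier (e.g., end of synthesis back to start of synthesis)."""
    return activity_gbq * 2 ** (elapsed_min / half_life)

# After the ~40 min synthesis, a 10 GBq product corresponds to roughly
# 12.9 GBq referenced to start of synthesis.
decay_correct(10.0, 40)
```
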

  8. CSF biomarkers of Alzheimer's disease concord with amyloid-β PET and predict clinical progression: A study of fully automated immunoassays in BioFINDER and ADNI cohorts.

    Science.gov (United States)

    Hansson, Oskar; Seibyl, John; Stomrud, Erik; Zetterberg, Henrik; Trojanowski, John Q; Bittner, Tobias; Lifke, Valeria; Corradini, Veronika; Eichenlaub, Udo; Batrla, Richard; Buck, Katharina; Zink, Katharina; Rabe, Christina; Blennow, Kaj; Shaw, Leslie M

    2018-03-01

    We studied whether fully automated Elecsys cerebrospinal fluid (CSF) immunoassay results were concordant with positron emission tomography (PET) and predicted clinical progression, even with cutoffs established in an independent cohort. Cutoffs for Elecsys amyloid-β(1-42) (Aβ), total tau/Aβ(1-42), and phosphorylated tau/Aβ(1-42) were defined against [18F]flutemetamol PET in Swedish BioFINDER (n = 277) and validated against [18F]florbetapir PET in the Alzheimer's Disease Neuroimaging Initiative (n = 646). Clinical progression in patients with mild cognitive impairment (n = 619) was studied. CSF total tau/Aβ(1-42) and phosphorylated tau/Aβ(1-42) ratios were highly concordant with PET classification in BioFINDER (overall percent agreement: 90%; area under the curve: 94%). The CSF biomarker statuses established by predefined cutoffs were highly concordant with PET classification in the Alzheimer's Disease Neuroimaging Initiative (overall percent agreement: 89%-90%; areas under the curve: 96%) and predicted greater 2-year clinical decline in patients with mild cognitive impairment. Strikingly, tau/Aβ ratios were as accurate as semiquantitative PET image assessment in predicting visual read-based outcomes. Elecsys CSF biomarker assays may provide reliable alternatives to PET in Alzheimer's disease diagnosis. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
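The "overall percent agreement" metric used above is simply the fraction of subjects whose dichotomized CSF call matches the PET classification. A minimal sketch with illustrative calls, not study data:

```python
def percent_agreement(csf_status, pet_status):
    """Overall percent agreement between CSF biomarker calls and PET
    classification (both coded 1 = amyloid positive, 0 = negative)."""
    matches = sum(c == p for c, p in zip(csf_status, pet_status))
    return 100.0 * matches / len(csf_status)

# Illustrative dichotomized calls for 10 subjects
csf = [1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
pet = [1, 1, 0, 1, 1, 0, 1, 1, 0, 0]
percent_agreement(csf, pet)  # → 90.0
```
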

  9. Left Ventricle: Fully Automated Segmentation Based on Spatiotemporal Continuity and Myocardium Information in Cine Cardiac Magnetic Resonance Imaging (LV-FAST)

    Directory of Open Access Journals (Sweden)

    Lijia Wang

    2015-01-01

    Full Text Available CMR quantification of LV chamber volumes typically requires manual definition of the basal-most LV slice, which adds processing time and user-dependence. This study developed an LV segmentation method that is fully automated based on the spatiotemporal continuity of the LV (LV-FAST). An iteratively decreasing threshold region-growing approach was used, first from the midventricle to the apex, until the LV area and shape lost continuity, and then from the midventricle to the base, until less than 50% of the myocardium circumference was observable. Region growth was constrained by LV spatiotemporal continuity to improve the robustness of apical and basal segmentations. The LV-FAST method was compared with manual tracing on cardiac cine MRI data of 45 consecutive patients. Of the 45 patients, LV-FAST and manual selection identified the same apical slices at both ED and ES and the same basal slices at both ED and ES in 38, 38, 38, and 41 cases, respectively, and their measurements agreed within -1.6±8.7 mL, -1.4±7.8 mL, and 1.0±5.8% for EDV, ESV, and EF, respectively. LV-FAST allowed the LV volume-time course to be quantified within 3 seconds on a standard desktop computer, which is fast and accurate for processing cine volumetric cardiac MRI data, and enables quantification of LV filling over the cardiac cycle.
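The "iteratively decreasing threshold region growing" idea can be sketched in 2D: grow a connected region from a seed at each threshold, lowering the threshold until the region area jumps, which signals the blood pool leaking into surrounding tissue. This is an illustrative simplification (4-connectivity, a single slice, an arbitrary jump factor), not the LV-FAST implementation:

```python
import numpy as np
from collections import deque

def grow(img, seed, thr):
    """4-connected region growing: pixels >= thr reachable from seed."""
    mask = np.zeros(img.shape, bool)
    if img[seed] < thr:
        return mask
    q = deque([seed])
    mask[seed] = True
    while q:
        r, c = q.popleft()
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if (0 <= nr < img.shape[0] and 0 <= nc < img.shape[1]
                    and not mask[nr, nc] and img[nr, nc] >= thr):
                mask[nr, nc] = True
                q.append((nr, nc))
    return mask

def grow_decreasing(img, seed, thrs, max_jump=3.0):
    """Lower the threshold step by step; stop when the region area jumps,
    a proxy for losing the area/shape continuity mentioned above."""
    prev = grow(img, seed, thrs[0])
    for thr in thrs[1:]:
        cur = grow(img, seed, thr)
        if prev.sum() > 0 and cur.sum() > max_jump * prev.sum():
            break   # discontinuity: keep the last stable region
        prev = cur
    return prev
```
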

  10. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content.

    Directory of Open Access Journals (Sweden)

    Hongzhi Wang

    Full Text Available One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of double haploids. The conventional methods based on the color of the endosperm and embryo seeds are slow, manual, and prone to error. On the other hand, there exists a significant difference in oil content between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify haploids. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system is comprised of a sampler unit that selects a single kernel to feed for measurement of NMR and weight, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed.
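The sorting decision itself reduces to a threshold on the NMR-measured oil content, since haploid kernels lack the inducer's high-oil trait. A minimal sketch with an illustrative cutoff (in practice the cutoff would be calibrated per inducer line, and the system also records kernel weight):

```python
def classify_kernel(oil_percent, cutoff=2.5):
    """Classify a maize kernel as haploid or diploid from its
    NMR-measured oil content (percent by weight). The 2.5% cutoff is
    purely illustrative, not a value from the paper."""
    return "haploid" if oil_percent < cutoff else "diploid"

classify_kernel(1.8)  # → "haploid"  (low oil: no paternal high-oil gene)
classify_kernel(4.2)  # → "diploid"
```
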

  11. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content

    Science.gov (United States)

    Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

    One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. Rapid haploid identification method is critical for large-scale selections of double haploids. The conventional methods based on the color of the endosperm and embryo seeds are slow, manual and prone to error. On the other hand, there exists a significant difference between diploid and haploid seeds generated by high oil inducer, which makes it possible to use oil content to identify the haploid. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system is comprised of a sampler unit to select a single kernel to feed for measurement of NMR and weight, and a kernel sorter to distribute the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test result is described and the directions for future improvement are discussed. PMID:27454427

  12. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework

    International Nuclear Information System (INIS)

    Kesner, Adam L; Schleyer, Paul J; Büther, Florian; Walter, Martin A; Schäfers, Klaus P; Koo, Phillip J

    2014-01-01

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet for practical purposes PET remains a modality vulnerable to motion-induced image degradation, and respiratory motion control is not employed in routine clinical operations. In this article, we take the opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form an underpinning for what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big-picture challenges and goals. The online version of this article (doi:10.1186/2197-7364-1-8) contains supplementary material, which is available to authorized users.
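One widely used data-driven ingredient in such frameworks is deriving a respiratory surrogate signal directly from the PET data itself, for example the axial center of mass of time-binned counts, then gating frames by signal amplitude. The sketch below illustrates that generic idea only; it is not the framework proposed in the article:

```python
import numpy as np

def respiratory_surrogate(counts_tzyx):
    """Data-driven respiratory surrogate: the axial (z) center of mass of
    counts in each time frame. counts_tzyx has shape (t, z, y, x)."""
    t, z = counts_tzyx.shape[:2]
    per_slice = counts_tzyx.reshape(t, z, -1).sum(axis=2)   # (t, z)
    zs = np.arange(z)
    total = per_slice.sum(axis=1)
    return (per_slice * zs).sum(axis=1) / np.where(total > 0, total, 1)

def gate_frames(signal, n_gates=4):
    """Amplitude-based gating: assign each frame to a gate by quantile
    bins of the surrogate signal."""
    edges = np.quantile(signal, np.linspace(0, 1, n_gates + 1))
    idx = np.searchsorted(edges, signal, side="right") - 1
    return np.clip(idx, 0, n_gates - 1)
```
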

  13. The Set-Up and Implementation of Fully Virtualized Lessons with an Automated Workflow Utilizing VMC/Moodle at the Medical University of Graz

    Directory of Open Access Journals (Sweden)

    Herwig Erich Rehatschek

    2011-12-01

    Full Text Available With the start of the winter semester 2010/11, the Medical University of Graz (MUG) successfully introduced a new primary learning management system (LMS), Moodle. Moodle currently serves more than 4,300 students from three studies and holds more than 7,500 unique learning objects. At the beginning of the summer semester 2010 we decided to start a pilot with Moodle and 430 students. For the pilot we migrated the learning content of one module and two optional subjects to Moodle. The evaluation results were extremely promising: more than 92% of the students immediately wanted Moodle, and Moodle met our high expectations in terms of performance and scalability. In this paper we describe how we defined and set up a scalable and highly available platform for hosting Moodle and extended it with functionality for fully automated virtual lessons. We state our experiences and give valuable clues for universities and institutions that want to introduce Moodle in the near future.

  14. Fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging: Toward robust and reproducible metabolite measurements in human brain.

    Science.gov (United States)

    Bian, Wei; Li, Yan; Crane, Jason C; Nelson, Sarah J

    2018-02-01

    To implement a fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging (MRSI). The PRESS selected volume and outer-volume suppression bands were predefined on the MNI152 standard template image. The template image was aligned to the subject's T1-weighted image during a scan, and the resulting transformation was then applied to the predefined prescription. To evaluate the method, H-1 MRSI data were obtained in repeat scan sessions from 20 healthy volunteers. In each session, datasets were acquired twice without repositioning. The overlap ratio of the prescribed volume in the two sessions was calculated, and the reproducibility of inter- and intrasession metabolite peak height and area ratios was measured by the coefficient of variation (CoV). The CoVs from intra- and intersession were compared by a paired t-test. The average overlap ratio of the automatically prescribed selection volumes between two sessions was 97.8%. The average voxel-based intersession CoVs were less than 0.124 and 0.163 for peak height and area ratios, respectively. The paired t-test showed no significant difference between the intra- and intersession CoVs. The proposed method provides a time-efficient way to prescribe 3D PRESS MRSI with reproducible imaging positioning and metabolite measurements. Magn Reson Med 79:636-642, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
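    The reproducibility metric reported in this record, the coefficient of variation (CoV = SD/mean) of repeated metabolite ratio measurements, can be sketched directly. The ratio values below are hypothetical, purely to show the arithmetic.

```python
# Minimal sketch of the CoV reproducibility measure; the metabolite-ratio
# values for one voxel are invented, not from the study.
from statistics import mean, stdev

def cov(values):
    """Coefficient of variation: sample SD divided by the mean."""
    return stdev(values) / mean(values)

intra = [1.52, 1.49, 1.55]  # hypothetical intrasession peak-height ratios
inter = [1.52, 1.41, 1.60]  # hypothetical intersession peak-height ratios
print(round(cov(intra), 3), round(cov(inter), 3))
```

    In the study, such per-voxel CoVs were averaged and the intra- vs. intersession distributions compared with a paired t-test.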

  15. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3−/4− Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.; Simonov, Alexandr N.; Mashkina, Elena A.; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E.; Gavaghan, David J.; Bond, Alan M.

    2013-01-01

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered

  16. Delivering Cognitive Behavior Therapy to Young Adults With Symptoms of Depression and Anxiety Using a Fully Automated Conversational Agent (Woebot): A Randomized Controlled Trial.

    Science.gov (United States)

    Fitzpatrick, Kathleen Kara; Darcy, Alison; Vierhile, Molly

    2017-06-06

    Web-based cognitive-behavioral therapeutic (CBT) apps have demonstrated efficacy but are characterized by poor adherence. Conversational agents may offer a convenient, engaging way of getting support at any time. The objective of the study was to determine the feasibility, acceptability, and preliminary efficacy of a fully automated conversational agent to deliver a self-help program for college students who self-identify as having symptoms of anxiety and depression. In an unblinded trial, 70 individuals age 18-28 years were recruited online from a university community social media site and were randomized to receive either 2 weeks (up to 20 sessions) of self-help content derived from CBT principles in a conversational format with a text-based conversational agent (Woebot) (n=34) or were directed to the National Institute of Mental Health ebook, "Depression in College Students," as an information-only control group (n=36). All participants completed Web-based versions of the 9-item Patient Health Questionnaire (PHQ-9), the 7-item Generalized Anxiety Disorder scale (GAD-7), and the Positive and Negative Affect Scale at baseline and 2-3 weeks later (T2). Participants were on average 22.2 years old (SD 2.33), 67% female (47/70), mostly non-Hispanic (93%, 54/58), and Caucasian (79%, 46/58). Participants in the Woebot group engaged with the conversational agent an average of 12.14 (SD 2.23) times over the study period. No significant differences existed between the groups at baseline, and 83% (58/70) of participants provided data at T2 (17% attrition). Intent-to-treat univariate analysis of covariance revealed a significant group difference on depression such that those in the Woebot group significantly reduced their symptoms of depression over the study period as measured by the PHQ-9 (F=6.47; P=.01) while those in the information control group did not. 
In an analysis of completers, participants in both groups significantly reduced anxiety as measured by the GAD-7 (F 1

  17. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3−/4− Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3−/4− process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics, and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance, whereas the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive, pathologically variable [Fe(CN)6]3−/4− process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.
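    The Butler-Volmer model that every analysis method in this record assumes can be written down in a few lines. This is a generic sketch of the forward/backward rate constants in terms of the recovered parameters k0, α and E0; the numerical values are illustrative only.

```python
# Sketch of the Butler-Volmer electron-transfer rate expressions assumed by
# the analyses above. F/RT is evaluated at 298.15 K; inputs are illustrative.
import math

F_RT = 96485.33 / (8.314462 * 298.15)  # F/RT in V^-1 (~38.92 at 25 °C)

def bv_rate_constants(E, E0, k0, alpha):
    """Forward (reduction) and backward (oxidation) rate constants."""
    kf = k0 * math.exp(-alpha * F_RT * (E - E0))
    kb = k0 * math.exp((1 - alpha) * F_RT * (E - E0))
    return kf, kb

# At a potential negative of E0, reduction is strongly favored:
kf, kb = bv_rate_constants(E=0.0, E0=0.25, k0=1e-2, alpha=0.5)
```

    At E = E0 both rate constants reduce to k0, which is why k0 and α are separately recoverable from the AC harmonics.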

  18. Research on Simulation Requirements and Business Architecture of Automated Demand Response in Power Sales Side Market Liberalization

    Science.gov (United States)

    Liu, Yiqun; Zhou, Pengcheng; Zeng, Ming; Chen, Songsong

    2018-01-01

    With the gradual reform of the electricity market, power sale side liberalization has become the focus of attention as the key task of the reform. The open power market provides a good environment for DR (Demand Response). It is of great significance to research the simulation requirements and business architecture of ADR (Automatic Demand Response) under power sale side market liberalization. Firstly, this paper analyzes the simulation requirements of ADR. Secondly, it analyzes, from five aspects, the factors influencing the business development of ADR after power sale side market liberalization. Finally, based on the ADR technology support system, the business architecture of ADR after power sale side market liberalization is constructed.

  19. Automated evaluation of ultrasonic indications. State of the art -development trends. Pt. 1

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer-aided evaluation of defect indications. The ultrasonic method, with its large field of applications and high potential for automation, provides all the preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  20. Fully automated dissolution and separation methods for inductively coupled plasma atomic emission spectrometry rock analysis. Application to the determination of rare earth elements

    International Nuclear Information System (INIS)

    Govindaraju, K.; Mevelle, G.

    1987-01-01

    In rock analysis laboratories, sample preparation is a serious problem, or even an enormous bottleneck. Because this laboratory is production-oriented, the problem was attacked by progressively automating different steps in rock analysis for major, minor and trace elements. This effort has been considerably eased by the fact that all sample preparation schemes in this laboratory for the past three decades have been based on an initial lithium borate fusion of rock samples, and all analytical methods on multi-element atomic emission spectrometry, with a switch-over from solid analysis by arc/spark excitation to solution analysis by plasma excitation in 1974. The sample preparation steps which have been automated are: weighing of samples and fluxes, lithium borate fusion, dissolution and dilution of fusion products, and ion-exchange separation of difficult trace elements such as the rare earth elements (REE). During 1985 and 1986, these different unit operations were assembled together as peripheral units in the form of a workstation, called LabRobStation. A travelling robot is the master of LabRobStation, with all peripheral units within its reach in a 10 m² workspace. As an example of real application, the automated determination of REE, based on more than 8000 samples analysed between 1982 and 1986, is presented. (author)

  1. Labelling of 90Y- and 177Lu-DOTA-Bioconjugates for Targeted Radionuclide Therapy: A Comparison among Manual, Semiautomated, and Fully Automated Synthesis

    Directory of Open Access Journals (Sweden)

    Michele Iori

    2017-01-01

    Full Text Available In spite of the hazard due to radiation exposure, preparation of 90Y- and 177Lu-labelled radiopharmaceuticals is still mainly performed using manual procedures. In the present study the performance of a commercial automatic synthesizer based on disposable cassettes for the labelling of 177Lu- and 90Y-DOTA-conjugated biomolecules (namely, DOTATOC and PSMA-617) was evaluated and compared to a manual and a semiautomated approach. The dose exposure of the operators was evaluated as well. More than 300 clinical preparations of both 90Y- and 177Lu-labelled radiopharmaceuticals have been performed using the three different methods. The mean radiochemical yields for 90Y-DOTATOC were 96.2±4.9%, 90.3±5.6%, and 82.0±8.4%, while for 177Lu-DOTATOC they were 98.3±0.6%, 90.8±8.3%, and 83.1±5.7% when manual, semiautomated, and automated approaches were used, respectively. The mean whole-hand doses for yttrium-90 preparations were 0.15±0.4 mSv/GBq, 0.04±0.1 mSv/GBq, and 0.11±0.3 mSv/GBq for manual, semiautomated, and automated synthesis, respectively; for lutetium-177 preparations, they were 0.02±0.008 mSv/GBq, 0.01±0.03 mSv/GBq, and 0.01±0.02 mSv/GBq, respectively. In conclusion, the automated approach guaranteed reliable and reproducible preparations of pharmaceutical-grade therapeutic radiopharmaceuticals in decent RCY. The radiation exposure of the operators remained comparable to the manual approach, mainly because dedicated shielding was not yet available for the system.

  2. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    International Nuclear Information System (INIS)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L−1 Na2CO3) and the proton donor solution (1 mol L−1 CH3COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, in EA-DLLME the addition of a dispersive solvent, as well as the time-consuming centrifugation step for disruption of the cloudy state, is avoided. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min−1 for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L−1 of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L−1. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on stepwise injection analysis manifold in a flow batch system. • Counterflow injection of extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.
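    The calibration arithmetic implied by this record is a linear (Beer's law) fit of absorbance versus concentration, with the LOD taken as 3σ of blank readings divided by the slope. The standards and blank values below are hypothetical, purely to show the calculation.

```python
# Sketch of a Beer's-law calibration and a 3-sigma LOD estimate.
# All data here are invented illustrations, not the paper's measurements.
from statistics import mean, stdev

conc = [5, 20, 40, 60, 80, 100]                       # µmol/L standards
absorb = [0.041, 0.162, 0.322, 0.481, 0.643, 0.801]   # A at 345 nm

mx, my = mean(conc), mean(absorb)
slope = (sum((x - mx) * (y - my) for x, y in zip(conc, absorb))
         / sum((x - mx) ** 2 for x in conc))           # least-squares slope

blank = [0.0012, 0.0018, 0.0009, 0.0015, 0.0011]       # blank absorbances
lod = 3 * stdev(blank) / slope                         # LOD in µmol/L
```

    With real instrument data the same two quantities, slope and blank scatter, yield the reported 0.5 µmol/L detection limit.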

  3. Closed-Loop Real-Time Imaging Enables Fully Automated Cell-Targeted Patch-Clamp Neural Recording In Vivo.

    Science.gov (United States)

    Suk, Ho-Jun; van Welie, Ingrid; Kodandaramaiah, Suhasa B; Allen, Brian; Forest, Craig R; Boyden, Edward S

    2017-08-30

    Targeted patch-clamp recording is a powerful method for characterizing visually identified cells in intact neural circuits, but it requires skill to perform. We previously developed an algorithm that automates "blind" patching in vivo, but full automation of visually guided, targeted in vivo patching has not been demonstrated; currently available approaches require human intervention to compensate for cell movement as a patch pipette approaches a targeted neuron. Here we present a closed-loop real-time imaging strategy that automatically compensates for cell movement by tracking cell position and adjusting pipette motion while approaching a target. We demonstrate our system's ability to adaptively patch, under continuous two-photon imaging and real-time analysis, fluorophore-expressing neurons of multiple types in the living mouse cortex, without human intervention, with yields comparable to those of skilled human experimenters. Our "imagepatching" robot is easy to implement and will help enable scalable characterization of identified cell types in intact neural circuits. Copyright © 2017 Elsevier Inc. All rights reserved.
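    The closed-loop idea in this record reduces, at each imaging frame, to re-measuring the target cell's position and shifting the pipette's goal by the observed drift. The function name, coordinate convention, and drift values below are illustrative inventions, not the authors' implementation.

```python
# Toy sketch of per-frame drift compensation for a pipette approach.
# Positions are (x, y, z) in µm; all values are illustrative.

def corrected_target(initial_cell_pos, current_cell_pos, pipette_target):
    """Shift the pipette's goal by however far the cell has drifted."""
    drift = tuple(c - i for i, c in zip(initial_cell_pos, current_cell_pos))
    return tuple(p + d for p, d in zip(pipette_target, drift))

# The cell drifted +2 µm in x and -1 µm in z between frames:
new_goal = corrected_target((100, 50, 200), (102, 50, 199), (100, 50, 195))
print(new_goal)  # (102, 50, 194)
```

    Running this update inside the two-photon imaging loop is what lets the approach proceed without human intervention.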

  4. Market transformation lessons learned from an automated demand response test in the Summer and Fall of 2003

    Energy Technology Data Exchange (ETDEWEB)

    Shockman, Christine; Piette, Mary Ann; ten Hope, Laurie

    2004-08-01

    A recent pilot test to enable an Automatic Demand Response system in California has revealed several lessons that are important to consider for wider application in a regional or statewide Demand Response program. The six facilities involved in the site testing came from diverse areas of our economy. The test subjects included a major retail food marketer and one of their retail grocery stores, financial services buildings for a major bank, a postal services facility, a federal government office building, a state university site, and ancillary buildings of a pharmaceutical research company. Although these organizations all serve diverse purposes and customers, they share some underlying common characteristics that make their simultaneous study worthwhile from a market transformation perspective. They are large organizations. Energy efficiency is not their core business, nor are the decision makers who will enable this technology powerful players in their organizations. The management of buildings is perceived to be a small issue for top management, and unless something goes wrong, little attention is paid to the building manager's problems. All of these organizations contract out a major part of their technical building operating systems. Control systems and energy management systems are proprietary, and their systems do not easily interact with one another. Management is, with the exception of one site, not electronically or computer literate enough to understand the full dimensions of the technology they have purchased. Despite the research team's development of a simple, straightforward method of informing them about the features of the demand response program, they had significant difficulty enabling their systems to meet the needs of the research. The research team had to step in and work directly with their vendors and contractors at all but one location. All of the participants have volunteered to participate in the study for altruistic

  5. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    Energy Technology Data Exchange (ETDEWEB)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Andruch, Vasil, E-mail: vasil.andruch@upjs.sk [Department of Analytical Chemistry, University of P.J. Šafárik, SK-04154 Košice (Slovakia); Moskvin, Leonid [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Bulatov, Andrey, E-mail: bulatov_andrey@mail.ru [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation)

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and the mixture of the effervescence agent (0.5 mol L−1 Na2CO3) and the proton donor solution (1 mol L−1 CH3COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, in EA-DLLME the addition of a dispersive solvent, as well as the time-consuming centrifugation step for disruption of the cloudy state, is avoided. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min−1 for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L−1 of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L−1. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on stepwise injection analysis manifold in a flow batch system. • Counterflow injection of extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application to the determination of antipyrine in saliva samples.

  6. Cloud-based CT dose monitoring using the DICOM-structured report. Fully automated analysis in regard to national diagnostic reference levels

    International Nuclear Information System (INIS)

    Boos, J.; Rubbert, C.; Heusch, P.; Lanzman, R.S.; Aissa, J.; Antoch, G.; Kroepil, P.

    2016-01-01

    To implement automated CT dose data monitoring using the DICOM structured report (DICOM-SR) in order to monitor dose-related CT data with regard to national diagnostic reference levels (DRLs). Materials and Methods: We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data was automatically analyzed in accordance with body region, patient age and the corresponding DRLs for the volumetric computed tomography dose index (CTDIvol) and dose length product (DLP). Results: Data of 36,523 examinations (131,527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3% and 52.8% of the national DRLs, respectively. CTDIvol and DLP reached 43.8% and 43.1% for abdominal CT (n=10,590), 66.6% and 69.6% for cranial CT (n=16,098) and 37.8% and 44.0% for chest CT (n=10,387) of the compared national DRLs, respectively. Overall, the CTDIvol exceeded the national DRLs in 1.9% of the examinations, while the DLP exceeded the national DRLs in 2.9% of the examinations. Between different CT protocols of the same body region, radiation exposure varied by up to 50% of the DRLs. Conclusion: The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking with regard to national DRLs. Overall, the local dose exposure from CT reached approximately 50% of these DRLs, indicating that updating the DRLs, as well as establishing protocol-specific DRLs, is desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments.
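    The benchmarking step this record describes amounts to expressing each examination's CTDIvol and DLP as a percentage of the applicable national DRL and flagging exceedances. The reference values and the example exam below are placeholders, not the DRLs used in the study.

```python
# Sketch of per-exam DRL benchmarking. DRL values are hypothetical
# placeholders: (CTDIvol in mGy, DLP in mGy*cm) per body region.

DRLS = {
    "abdomen": (20.0, 900.0),
    "head": (60.0, 950.0),
    "chest": (10.0, 400.0),
}

def benchmark(region, ctdi_vol, dlp):
    """Return (% of CTDIvol DRL, % of DLP DRL, exceeds_either)."""
    ref_ctdi, ref_dlp = DRLS[region]
    pct_ctdi = 100 * ctdi_vol / ref_ctdi
    pct_dlp = 100 * dlp / ref_dlp
    return pct_ctdi, pct_dlp, pct_ctdi > 100 or pct_dlp > 100

print(benchmark("chest", 4.1, 165.0))  # well under the reference level
```

    Aggregating these percentages over all exams yields the study's headline figures (e.g. mean CTDIvol at ~51% of DRL) and its 1.9%/2.9% exceedance rates.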

  7. Performance of the fully automated progesterone assays on the Abbott AxSYM and the Technicon Immuno 1 Analyser compared with the radioimmunoassay progesterone MAIA

    International Nuclear Information System (INIS)

    Reinsberg, J.; Jost, E.; Van der Ven, H.

    1997-01-01

    Test performance of two automated progesterone assays, available on the immunoassay analysers Abbott AxSYM and Technicon Immuno 1, respectively, was evaluated in comparison with the radioimmunoassay Progesterone MAIA. For assessment of test performance, imprecision, functional sensitivity and linearity of dilution were examined. Correlation with the manual radioimmunoassay was assessed using 122 serum samples over the range 0-110 nmol/L. Imprecision studies revealed for the AxSYM Progesterone within-run CVs of 1.8-6.4% and day-to-day CVs of 3.5-9.7% (concentration range 2.3-75 nmol/L); for the Immuno 1 Progesterone: within-run CVs of 1.0-7.3%, day-to-day CVs of 2.3-7.7% (concentration range 1.2-60 nmol/L). The functional sensitivity was <1.7 nmol/L for the AxSYM Progesterone and <1.1 nmol/L for the Immuno 1 Progesterone. With the AxSYM Progesterone, the mean recovery after dilution from five samples was 102% (89-107%); from one sample only 69-80% was recovered. With the Immuno 1 Progesterone the mean recovery was 95% (80-105%). Despite a good overall correlation (coefficients 0.972 and 0.981), the relationship of both assays to the Progesterone MAIA deviated significantly from linearity, with a considerably higher slope in the lower concentration range. The relationship between the automated assays was linear over the entire concentration range (Immuno = 1.207 * AxSYM + 1; r = 0.986). The time to first result was 20 min for the AxSYM Progesterone, 45 min for the Immuno 1 Progesterone and 90 min for the Progesterone MAIA. The evaluated progesterone assays both exhibit excellent precision and a high degree of sensitivity. They offer a rapid and flexible method for progesterone determination which may be especially useful for the monitoring of ovarian stimulation during in-vitro fertilization. (author)

  8. Fully-automated approach to hippocampus segmentation using a graph-cuts algorithm combined with atlas-based segmentation and morphological opening.

    Science.gov (United States)

    Kwak, Kichang; Yoon, Uicheul; Lee, Dong-Kyun; Kim, Geon Ha; Seo, Sang Won; Na, Duk L; Shim, Hack-Joon; Lee, Jong-Min

    2013-09-01

    The hippocampus has been known to be an important structure as a biomarker for Alzheimer's disease (AD) and other neurological and psychiatric diseases. However, its use requires accurate, robust and reproducible delineation of hippocampal structures. In this study, an automated hippocampal segmentation method based on a graph-cuts algorithm combined with atlas-based segmentation and morphological opening was proposed. First, atlas-based segmentation was applied to define the initial hippocampal region as a priori information for the graph-cuts step. The definition of initial seeds was further elaborated by incorporating estimation of partial volume probabilities at each voxel. Finally, morphological opening was applied to reduce false positives in the graph-cuts result. In experiments with twenty-seven healthy normal subjects, the proposed method showed more reliable results (similarity index=0.81±0.03) than the conventional atlas-based segmentation method (0.72±0.04). Also, in terms of segmentation accuracy, measured via the false positive and false negative ratios, the proposed method (precision=0.76±0.04, recall=0.86±0.05) produced lower error ratios than the conventional method (0.73±0.05, 0.72±0.06), demonstrating its plausibility for accurate, robust and reliable segmentation of the hippocampus. Copyright © 2013 Elsevier Inc. All rights reserved.
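    The similarity index reported in this record is the Dice coefficient, and precision and recall follow from the same overlap counts. A minimal sketch on voxel index sets (the voxel sets below are toy examples):

```python
# Dice coefficient, precision and recall on voxel label sets.
# The voxel sets are illustrative, not from the study.

def dice(a: set, b: set) -> float:
    """Dice similarity: twice the overlap over the summed set sizes."""
    return 2 * len(a & b) / (len(a) + len(b))

def precision_recall(auto: set, manual: set):
    """Precision = TP/|auto|, recall = TP/|manual|."""
    tp = len(auto & manual)
    return tp / len(auto), tp / len(manual)

auto = {1, 2, 3, 4, 5}       # voxels labelled hippocampus by the algorithm
manual = {2, 3, 4, 5, 6, 7}  # reference (manual) delineation
print(dice(auto, manual), precision_recall(auto, manual))
```

    High precision corresponds to few false positives (the quantity the morphological opening step targets), while high recall corresponds to few false negatives.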

  9. A LabVIEW®-based software for the control of the AUTORAD platform. A fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis

    International Nuclear Information System (INIS)

    Barbesi, Donato; Vilas, Victor Vicente; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Heras, Laura Aldave de las

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors providing a flexible and fit for purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW®VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste. (author)

  10. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    Science.gov (United States)

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.

  11. Validation of a fully automated solid‐phase extraction and ultra‐high‐performance liquid chromatography–tandem mass spectrometry method for quantification of 30 pharmaceuticals and metabolites in post‐mortem blood and brain samples

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Nedahl, Michael; Johansen, Sys Stybe

    2018-01-01

    In this study, we present the validation of an analytical method capable of quantifying 30 commonly encountered pharmaceuticals and metabolites in whole blood and brain tissue from forensic cases. Solid‐phase extraction was performed by a fully automated robotic system, thereby minimising manual...... labour and human error while increasing sample throughput, robustness, and traceability. The method was validated in blood in terms of selectivity, linear range, matrix effect, extraction recovery, process efficiency, carry‐over, stability, precision, and accuracy. Deuterated analogues of each analyte....../kg. Thus, the linear range covered both therapeutic and toxic levels. The method showed acceptable accuracy and precision, with accuracies ranging from 80 to 118% and precision below 19% for the majority of the analytes. Linear range, matrix effect, extraction recovery, process efficiency, precision...

  12. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1 % ammonia (25 %) in water/0.1 % ammonia (25 %) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96 % of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.

  13. Fully automated preparation of [11C]choline and [18F]fluoromethylcholine using TracerLab synthesis modules and facilitated quality control using analytical HPLC

    International Nuclear Information System (INIS)

    Shao Xia; Hockley, Brian G.; Hoareau, Raphael; Schnau, Paul L.; Scott, Peter J.H.

    2011-01-01

    Modifications of a GE TracerLab FX C-Pro, which can be implemented for solid-phase [11C]methylation, are described. The simplified procedure for synthesis of [11C]choline uses a single Sep-Pak CM-Light cation-exchange cartridge for both the solid-supported reaction and purification. Compared with the commonly used two-Sep-Pak method, the low back-pressure of this Sep-Pak enables efficient and reliable production of [11C]choline using a TracerLab FX C-Pro without requiring any gas pressure adjustment. Typical radiochemical yields (RCY) are >60%, radiochemical purity (RCP) is 99.9%, and levels of residual precursor in the final product, which may inhibit the uptake of [11C]choline, are reduced to 1 μg/mL. Similarly, a modification of a GE TracerLab FX FN is reported which enables gas-phase production of [18F]fluoromethylcholine, suitable for pre-clinical use (in 4-6% RCY and >99.7% RCP), using a related Sep-Pak method. These modifications can be utilized for solid-phase [11C]methylation and [18F]fluoromethylation of other radiotracers, and allow straightforward switching to other module configurations for solution-phase radiochemistry or loop chemistry. In addition, we report a convenient HPLC ion chromatography method which can monitor residual precursor and the radiochemical purity of the product at the same time, providing highly efficient quality control for routine clinical application. The reported HPLC method is appropriate for analysis of doses of both [11C]choline and [18F]fluoromethylcholine, and eliminates the need for a GC method to determine residual precursor levels. -- Graphical abstract: Simplified procedures for the automated synthesis of [11C]choline and [18F]fluoromethylcholine using TracerLab FX C-Pro and TracerLab FX FN synthesis modules are presented. In addition, we report a convenient HPLC ion chromatography method which can monitor residual precursor and radiochemical purity of the product at the same time.

  14. Oh and by the way, you get meter readings too: a look at distribution automation, intelligent grids, and demand side management: getting the real value from your AMI

    Energy Technology Data Exchange (ETDEWEB)

    Summerlin, T. [Gestalt, Camden, NJ (United States); Ferguson, P.D. [Newmarket Hydro Ltd., Newmarket, ON (Canada)

    2006-07-01

    The full value of smart metering programs will be realized when information and communication capabilities are used to enable distribution automation, intelligent grids, and sufficient data management that will transform interval meter data into useful information. By leveraging and managing meter data effectively, utilities can increase their operational efficiency, improve their understanding of customers' needs, and develop more effective demand side management programs. This presentation examined some of the changing priorities of advanced metering infrastructure (AMI) strategies, and provided details of meter data management (MDM) technologies developed to help utilities increase efficiency, cut costs, and provide better service to their customers. An MDM system is the set of databases and applications required to provide utilities with a solution for the data retention, analysis, and storage repository gaps that will be created when monthly manual meter readings are replaced with AMI systems. In order to resolve storage, functionality, and legacy integration gaps, MDM systems must be scalable systems that can support large and small quantities of meter data, and must also conform to industry-standard data warehouse designs. Data structures in the systems must support both regulated and deregulated markets, and be capable of providing extensive graphical, tabular, and Excel export of the metered and totalized data from the interval to the system level. MDM systems can provide improved support for demand response decision-making processes; distribution planning and reliability; outage management; revenue assurance; forecasting; and curtailment. It was concluded that MDM systems can be used to improve processes and provide additional benefits well beyond the meter reading and billing process benefits originally identified by utilities as a primary goal of implementing AMI. refs., tabs., figs.
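The interval-to-system totalization that the record above describes can be sketched in a few lines. This is a hypothetical illustration (function name, readings, and 15-minute interval assumption are invented, not taken from the presentation): roll per-meter interval readings up to hourly, system-level totals.

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical illustration: roll 15-minute interval readings (kWh) from
# individual meters up to hourly, system-level totals -- the kind of
# "interval to system level" totalization an MDM system must support.
def totalize(readings):
    """readings: iterable of (meter_id, timestamp, kwh) tuples."""
    hourly = defaultdict(float)
    for meter_id, ts, kwh in readings:
        hour = ts.replace(minute=0, second=0, microsecond=0)
        hourly[hour] += kwh
    return dict(hourly)

readings = [
    ("m1", datetime(2006, 7, 1, 14, 0), 1.2),
    ("m1", datetime(2006, 7, 1, 14, 15), 1.1),
    ("m2", datetime(2006, 7, 1, 14, 30), 0.9),
    ("m2", datetime(2006, 7, 1, 15, 0), 1.0),
]
print(totalize(readings))
```

A production MDM system would of course persist these rollups in a data warehouse rather than a dictionary, but the aggregation step is the same.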

  15. A fully automated calibration method for an optical see-through head-mounted operating microscope with variable zoom and focus.

    Science.gov (United States)

    Figl, Michael; Ede, Christopher; Hummel, Johann; Wanschitz, Felix; Ewers, Rolf; Bergmann, Helmar; Birkfellner, Wolfgang

    2005-11-01

    Ever since the development of the first applications in image-guided therapy (IGT), the use of head-mounted displays (HMDs) has been considered an important extension of existing IGT technologies. Several approaches to utilizing HMDs and modified medical devices for augmented reality (AR) visualization have been implemented, including video see-through systems, semitransparent mirrors, modified endoscopes, and modified operating microscopes. Common to all these devices is that a precise calibration between the display and three-dimensional coordinates in the patient's frame of reference is compulsory. In optical see-through devices based on complex optical systems such as operating microscopes or operating binoculars, as in the case of the system presented in this paper, this procedure can become increasingly difficult, since precise camera calibration is required for every focus and zoom position. We present a method for fully automatic calibration of the operating binocular Varioscope M5 AR for the full range of zoom and focus settings available. Our method uses a special calibration pattern, a linear guide driven by a stepping motor, and special calibration software. The overlay error in the calibration plane was found to be 0.14-0.91 mm, which is less than 1% of the field of view. Using the motorized calibration rig presented in the paper, we were also able to assess the dynamic latency when viewing augmentation graphics on a mobile target; the spatial displacement due to latency was in the range of 1.1-2.8 mm maximum, and the disparity between the true object and its computed overlay represented a latency of 0.1 s. We conclude that the automatic calibration method presented in this paper is sufficient in terms of accuracy and time requirements for standard uses of optical see-through systems in a clinical environment.
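Calibrating "for the full range of zoom and focus settings" implies estimating parameters between the settings actually sampled. The sketch below is not the authors' implementation; it only illustrates one plausible ingredient, bilinear interpolation of a calibration parameter over a (zoom, focus) grid, with invented values.

```python
import bisect

# Hypothetical sketch, not the paper's method: given a camera parameter
# calibrated on a grid of (zoom, focus) settings, estimate its value at an
# intermediate setting by bilinear interpolation between the calibrations.
def bilinear(grid, zooms, focuses, z, f):
    """grid[i][j] holds the parameter calibrated at (zooms[i], focuses[j])."""
    i = max(1, min(bisect.bisect_left(zooms, z), len(zooms) - 1))
    j = max(1, min(bisect.bisect_left(focuses, f), len(focuses) - 1))
    z0, z1 = zooms[i - 1], zooms[i]
    f0, f1 = focuses[j - 1], focuses[j]
    tz = (z - z0) / (z1 - z0)
    tf = (f - f0) / (f1 - f0)
    top = grid[i - 1][j - 1] * (1 - tf) + grid[i - 1][j] * tf
    bot = grid[i][j - 1] * (1 - tf) + grid[i][j] * tf
    return top * (1 - tz) + bot * tz

# A focal-length-like parameter sampled at two zoom and two focus settings:
grid = [[100.0, 110.0], [120.0, 135.0]]
print(bilinear(grid, [1.0, 2.0], [0.0, 1.0], 1.5, 0.5))  # → 116.25
```

The motorized calibration rig in the paper makes dense sampling of this grid practical, which is what keeps interpolation errors small.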

  16. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on a Distribution Automation (DA) system that enhances the efficiency and productivity of a utility. It also provides intangible benefits such as an improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in O and M requirements.

  17. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    California utilities have been exploring the use of critical peak prices (CPP) to help reduce needle peaks in customer end-use loads. CPP is a form of price-responsive demand response (DR). Recent experience has shown that customers have limited knowledge of how to operate their facilities in order to reduce their electricity costs under CPP (Quantum 2004). While the lack of knowledge about how to develop and implement DR control strategies is a barrier to participation in DR programs like CPP, another barrier is the lack of automation of DR systems. During 2003 and 2004, the PIER Demand Response Research Center (DRRC) conducted a series of tests of fully automated electric demand response (Auto-DR) at 18 facilities. Overall, the average of the site-specific average coincident demand reductions was 8% across a variety of building types and facilities. Many electricity customers have suggested that automation will help them institutionalize their electric demand savings and improve their overall response and DR repeatability. This report focuses on the specific results of the Automated Critical Peak Pricing (Auto-CPP, a specific type of Auto-DR) tests that took place during 2005, which built on the Auto-DR research conducted through PIER and the DRRC in 2003 and 2004. The long-term goal of this project is to understand the technical opportunities of automating demand response and to remove technical and market impediments to large-scale implementation of Auto-DR in buildings and industry. A second goal of this research is to understand and identify best practices for DR strategies and opportunities. The specific objectives of the Automated Critical Peak Pricing test were as follows: (1) Demonstrate how an automated notification system for critical peak pricing can be used in large commercial facilities for demand response (DR). (2) Evaluate the effectiveness of such a system. (3) Determine how customers
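The Auto-DR pattern these tests exercised, an external signal triggering a pre-programmed shed strategy unless the facility manager opts out, can be sketched in a few lines. The price levels and shed actions below are invented for illustration, not taken from the report.

```python
# Illustrative sketch of the Auto-DR pattern: an external CPP price signal
# triggers a pre-programmed load-shed strategy, unless the facility manager
# has opted out. Signal levels and strategies are hypothetical examples.
SHED_STRATEGIES = {
    "moderate": ["raise zone setpoints 2F", "dim non-critical lighting 30%"],
    "high": ["raise zone setpoints 4F", "shut off fountain pumps"],
}

def handle_signal(price_level, opted_out=False):
    """Return the list of shed actions to execute for a CPP price level."""
    if opted_out or price_level == "normal":
        return []  # no action: normal price, or manager override
    return SHED_STRATEGIES.get(price_level, [])

print(handle_signal("high"))
print(handle_signal("high", opted_out=True))  # override: no actions
```

The point of the pattern is that the strategy is decided once, in advance, and the event itself requires no human in the loop.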

  18. A Fully Automated Web-Based Program Improves Lifestyle Habits and HbA1c in Patients With Type 2 Diabetes and Abdominal Obesity: Randomized Trial of Patient E-Coaching Nutritional Support (The ANODE Study).

    Science.gov (United States)

    Hansel, Boris; Giral, Philippe; Gambotti, Laetitia; Lafourcade, Alexandre; Peres, Gilbert; Filipecki, Claude; Kadouch, Diana; Hartemann, Agnes; Oppert, Jean-Michel; Bruckert, Eric; Marre, Michel; Bruneel, Arnaud; Duchene, Emilie; Roussel, Ronan

    2017-11-08

    The prevalence of abdominal obesity and type 2 diabetes mellitus (T2DM) is a public health challenge. New solutions need to be developed to help patients implement lifestyle changes. The objective of the study was to evaluate a fully automated Web-based intervention designed to help users improve their dietary habits and increase their physical activity. The Accompagnement Nutritionnel de l'Obésité et du Diabète par E-coaching (ANODE) study was a 16-week, 1:1 parallel-arm, open-label randomized clinical trial. Patients with T2DM and abdominal obesity (n=120, aged 18-75 years) were recruited. Patients in the intervention arm (n=60) had access to a fully automated program (ANODE) to improve their lifestyle. Patients were asked to log on at least once per week. Human contact was limited to hotline support in cases of technical issues. The dietetic tool provided personalized menus and a shopping list for the day or the week. Stepwise physical activity was prescribed. The control arm (n=60) received general nutritional advice. The primary outcome was the change in the dietary score (International Diet Quality Index; DQI-I) between baseline and the end of the study. Secondary endpoints included changes in body weight, waist circumference, hemoglobin A1c (HbA1c), and measured maximum oxygen consumption (VO2 max). The mean age of the participants was 57 years (standard deviation [SD] 9), mean body mass index was 33 kg/m² (SD 4), mean HbA1c was 7.2% (SD 1.1), and 66.7% (80/120) of participants were women. Using an intention-to-treat analysis, the DQI-I score (54.0, SD 5.7 in the ANODE arm; 52.8, SD 6.2 in the control arm at baseline; P=.28) increased significantly in the ANODE arm compared to the control arm (+4.55, SD 5.91 vs -1.68, SD 5.18 between arms). Other lifestyle measures improved significantly in the intervention arm. Among patients with T2DM and abdominal obesity, the use of a fully automated Web-based program resulted in a significant improvement in dietary habits and favorable clinical and

  19. Fully automated system for pulsed NMR measurements

    International Nuclear Information System (INIS)

    Cantor, D.M.

    1977-01-01

    A system is described which places many of the complex, tedious operations of pulsed NMR experiments under computer control. It automatically optimizes the experimental parameters of pulse length and phase, improving precision, accuracy, and measurement speed. The hardware interface between the computer and the NMR instrument is described. Design features, justification of the choices made between alternative design strategies, and details of the implementation of design goals are presented. Software features common to all the available experiments are discussed. Optimization of pulse lengths and phases is performed via a sequential search technique called Uniplex. Measurements of the spin-lattice and spin-spin relaxation times and of diffusion constants are automatic. Options for expansion of the system are explored, along with some of the limitations of the system.
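The record does not reproduce the Uniplex algorithm itself, but the general idea of a sequential search over pulse length and phase can be sketched with a generic stand-in: coordinate-wise hill climbing with a shrinking step, maximizing a measured signal. The toy "signal" function below is invented for illustration.

```python
# Generic stand-in for a sequential search (the actual Uniplex algorithm is
# not reproduced here): coordinate-wise hill climbing with a shrinking step,
# maximizing a measured signal over pulse length and pulse phase.
def sequential_search(measure, x0, step=1.0, shrink=0.5, iters=40):
    x = list(x0)
    best = measure(x)
    for _ in range(iters):
        improved = False
        for i in range(len(x)):
            for delta in (+step, -step):
                trial = list(x)
                trial[i] += delta
                val = measure(trial)
                if val > best:
                    x, best, improved = trial, val, True
        if not improved:
            step *= shrink  # no move helped: refine the search scale
    return x, best

# Toy "signal" peaked at pulse length 5.0 us and phase 90 degrees:
signal = lambda p: -((p[0] - 5.0) ** 2) - ((p[1] - 90.0) / 10) ** 2
params, value = sequential_search(signal, [1.0, 0.0], step=8.0)
print(params)
```

In the real instrument, `measure` would be a physical acquisition rather than a function call, which is why minimizing the number of evaluations matters.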

  20. Development of on-chip fully automated immunoassay system "μTASWako i30" to measure the changes in glycosylation profiles of alpha-fetoprotein in patients with hepatocellular carcinoma.

    Science.gov (United States)

    Kurosawa, Tatsuo; Watanabe, Mitsuo

    2016-12-01

    Glycosylation profiles change significantly during oncogenesis, and aberrant glycosylation can be used as a cancer biomarker in clinical settings. Different glycoforms can be separately detected using lectin affinity electrophoresis and lectin array-based methods. However, most methodologies and procedures require experienced technicians to perform the assays and expertise to interpret the results. To apply glycomarkers in clinical practice, a robust assay system with an easy-to-use workflow is required. Wako's μTASWako i30, a fully automated immunoanalyzer, was developed for in vitro diagnostics based on microfluidic technology. It utilizes the principles of the liquid-phase binding assay, where immunoreactions are performed in a liquid phase, and the electrokinetic analyte transport assay. Capillary electrophoresis on a microfluidic chip has enabled the detection of different glycoform types of alpha-fetoprotein (AFP), a serum biomarker for hepatocellular carcinoma. AFP with altered glycosylation can be separated based on its reactivity to Lens culinaris agglutinin on electrophoresis. The glycoform AFP-L3 is reportedly more specific for hepatocellular carcinoma. This assay system provides high sensitivity and rapid results within 9 min. The test results for the ratio of AFP-L3 to total AFP using the μTASWako i30 correlate with those of conventional methodology. The μTASWako assay system and its technology can be utilized for glycosylation analysis in the postgenomic era. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine.

    Science.gov (United States)

    León, Zacarías; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo

    2010-04-07

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL(-1), respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters. Copyright 2010 Elsevier B.V. All rights reserved.

  2. Development of a fully automated sequential injection solid-phase extraction procedure coupled to liquid chromatography to determine free 2-hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid in human urine

    International Nuclear Information System (INIS)

    Leon, Zacarias; Chisvert, Alberto; Balaguer, Angel; Salvador, Amparo

    2010-01-01

    2-Hydroxy-4-methoxybenzophenone and 2-hydroxy-4-methoxybenzophenone-5-sulphonic acid, commonly known as benzophenone-3 (BZ3) and benzophenone-4 (BZ4), respectively, are substances widely used as UV filters in cosmetic products in order to absorb UV radiation and protect human skin from direct exposure to the deleterious wavelengths of sunlight. As with other UV filters, there is evidence of their percutaneous absorption. This work describes an analytical method developed to determine trace levels of free BZ3 and BZ4 in human urine. The methodology is based on a solid-phase extraction (SPE) procedure for clean-up and pre-concentration, followed by the monitoring of the UV filters by liquid chromatography-ultraviolet spectrophotometry detection (LC-UV). In order to improve not only the sensitivity and selectivity, but also the precision of the method, the principle of sequential injection analysis was used to automate the SPE process and to transfer the eluates from the SPE to the LC system. The application of a six-channel valve as an interface for the switching arrangements successfully allowed the on-line connection of SPE sample processing with LC analysis. The SPE process for BZ3 and BZ4 was performed using octadecyl (C18) and diethylaminopropyl (DEA) modified silica microcolumns, respectively, in which the analytes were retained and eluted selectively. Due to the matrix effects, the determination was based on standard addition quantification and was fully validated. The relative standard deviations of the results were 13% and 6% for BZ3 and BZ4, respectively, whereas the limits of detection were 60 and 30 ng mL⁻¹, respectively. The method was satisfactorily applied to determine BZ3 and BZ4 in urine from volunteers that had applied a sunscreen cosmetic containing both UV filters.
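Both versions of this record quantify by standard addition to defeat matrix effects. The calculation that underlies it is short: fit detector response against spiked concentration and extrapolate the x-intercept to recover the analyte level in the unspiked sample. The readings below are invented for illustration, not taken from the paper.

```python
# Sketch of standard-addition quantification: fit response vs. spiked
# concentration by least squares, then extrapolate the x-intercept to get
# the concentration in the original (unspiked) sample. Data are invented.
def standard_addition(added, response):
    n = len(added)
    mx = sum(added) / n
    my = sum(response) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(added, response)) / \
            sum((x - mx) ** 2 for x in added)
    intercept = my - slope * mx
    # x-intercept is at -intercept/slope; its magnitude is the sample level
    return intercept / slope

added = [0.0, 50.0, 100.0, 150.0]          # ng/mL spiked into aliquots
response = [120.0, 220.0, 320.0, 420.0]    # arbitrary detector units
print(standard_addition(added, response))  # → 60.0 ng/mL
```

Because the calibration is built inside the sample's own matrix, matrix suppression or enhancement affects all points equally and cancels out of the extrapolation.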

  3. Screening for illicit and medicinal drugs in whole blood using fully automated SPE and ultra-high-performance liquid chromatography with TOF-MS with data-independent acquisition.

    Science.gov (United States)

    Pedersen, Anders Just; Dalsgaard, Petur Weihe; Rode, Andrej Jaroslav; Rasmussen, Brian Schou; Müller, Irene Breum; Johansen, Sys Stybe; Linnet, Kristian

    2013-07-01

    A broad forensic screening method for 256 analytes in whole blood based on a fully automated SPE robotic extraction and ultra-high-performance liquid chromatography (UHPLC) with TOF-MS with data-independent acquisition has been developed. The limit of identification was evaluated for all 256 compounds and 95 of these compounds were validated with regard to matrix effects, extraction recovery, and process efficiency. The limit of identification ranged from 0.001 to 0.1 mg/kg, and the process efficiency exceeded 50% for 73 of the 95 analytes. As an example of application, 1335 forensic traffic cases were analyzed with the presented screening method. Of these, 992 cases (74%) were positive for one or more traffic-relevant drugs above the Danish legal limits. Commonly abused drugs such as amphetamine, cocaine, and frequent types of benzodiazepines were the major findings. Nineteen less frequently encountered drugs were detected e.g. buprenorphine, butylone, cathine, fentanyl, lysergic acid diethylamide, m-chlorophenylpiperazine, 3,4-methylenedioxypyrovalerone, mephedrone, 4-methylamphetamine, p-fluoroamphetamine, and p-methoxy-N-methylamphetamine. In conclusion, using UHPLC-TOF-MS screening with data-independent acquisition resulted in the detection of common drugs of abuse as well as new designer drugs and more rarely occurring drugs. Thus, TOF-MS screening of blood samples constitutes a practical way for screening traffic cases, with the exception of δ-9-tetrahydrocannabinol, which should be handled in a separate method. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  5. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  6. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems to radioimmunoassay (RIA) is discussed. Automated systems can be useful in the second of the four basic steps of RIA, i.e., preparation of the sample for reaction. Two types of instrumentation are available: a semi-automatic pipette and a fully automated pipetting station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments are shown. (Mukohata, S.)

  7. A Preliminary Study of a Blockchain-Based Automated Demand Response System

    Institute of Scientific and Technical Information of China (English)

    李彬; 卢超; 曹望璋; 祁兵; 李德智; 陈宋宋; 崔高颖

    2017-01-01

    With the opening of the retail electricity market, the operation rules and business processes of automated demand response services are becoming more complicated, and demand response compensation settlement and penalties for default involve a large volume of capital transactions. Supporting technologies are therefore required to guarantee demand response programs in terms of security and the notarization of operational behavior. Based on a requirements analysis of existing automated demand response services, this paper proposes a blockchain-based application scheme and analyzes the key issues of blockchain in automated demand response systems with respect to the proof-of-work mechanism, interconnection consensus, smart contracts, and information security. Finally, the problems of applying blockchain to demand response systems at the current state of the technology are examined, and corresponding suggestions are offered.
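The core property the paper leans on, that compensation and penalty settlements become tamper-evident once chained, can be illustrated with a toy hash chain. This is not the paper's design; the event fields and amounts are invented, and a real system would add consensus, signatures, and smart-contract logic on top.

```python
import hashlib
import json

# Toy illustration: record demand response compensation/penalty settlements
# in a tamper-evident hash chain. Event fields and amounts are invented.
def add_block(chain, event):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"event": event, "prev": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Recompute every hash and check the prev-links; False if tampered."""
    for i, block in enumerate(chain):
        body = {"event": block["event"], "prev": block["prev"]}
        digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if digest != block["hash"]:
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
add_block(chain, {"type": "compensation", "customer": "c1", "kwh": 120, "amount": 36.0})
add_block(chain, {"type": "penalty", "customer": "c2", "amount": -10.0})
print(verify(chain))  # True for an untampered chain
```

Altering any recorded settlement invalidates that block's hash and every link after it, which is the behavioral-notarization property the paper is after.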

  8. Optimization of Human NK Cell Manufacturing: Fully Automated Separation, Improved Ex Vivo Expansion Using IL-21 with Autologous Feeder Cells, and Generation of Anti-CD123-CAR-Expressing Effector Cells.

    Science.gov (United States)

    Klöß, Stephan; Oberschmidt, Olaf; Morgan, Michael; Dahlke, Julia; Arseniev, Lubomir; Huppert, Volker; Granzin, Markus; Gardlowski, Tanja; Matthies, Nadine; Soltenborn, Stephanie; Schambach, Axel; Koehl, Ulrike

    2017-10-01

    depletion and CD56 enrichment steps. Manually performed experiments to test different culture media demonstrated significantly higher NK cell expansion rates and an approximately equal distribution of CD56(dim)CD16(pos) and CD56(bright)CD16(dim&neg) NK subsets on day 14 with cells cultivated in NK MACS® media. Moreover, effector cell expansion in manually performed experiments with NK MACS® containing IL-2 and irradiated autologous FCs and IL-21, both added at the initiation of the culture, induced an 85-fold NK cell expansion. Compared to freshly isolated NK cells, expanded NK cells expressed significantly higher levels of NKp30, NKp44, NKG2D, TRAIL, FasL, CD69, and CD137, and showed comparable cell viabilities and killing/degranulation activities against tumor and leukemic cell lines in vitro. NK cells used for CAR transduction showed the highest anti-CD123 CAR expression on day 3 after gene modification. These anti-CD123 CAR-engineered NK cells demonstrated improved cytotoxicity against the CD123(pos) AML cell line KG1a and primary AML blasts. In addition, CAR NK cells showed higher degranulation and enhanced secretion of tumor necrosis factor alpha, interferon gamma, and granzyme A and B. In fluorescence imaging, specific interactions that initiated apoptotic processes in the AML target cells were detected between CAR NK cells and KG1a. After the fully automated NK cell separation process on Prodigy, a new NK cell expansion protocol was generated that resulted in high numbers of NK cells with potent antitumor activity, which could be modified efficiently by novel third-generation, alpha-retroviral SIN vector constructs. Next steps are the integration of the manual expansion procedure in the fully integrated platform for a standardized GMP-compliant overall process in this closed system that also may include gene modification of NK cells to optimize target-specific antitumor activity.

  9. Evaluation of cell count and classification capabilities in body fluids using a fully automated Sysmex XN equipped with high-sensitive Analysis (hsA) mode and DI-60 hematology analyzer system.

    Science.gov (United States)

    Takemura, Hiroyuki; Ai, Tomohiko; Kimura, Konobu; Nagasaka, Kaori; Takahashi, Toshihiro; Tsuchiya, Koji; Yang, Haeun; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Tabe, Yoko; Ohsaka, Akimichi

    2018-01-01

    The XN series automated hematology analyzer has been equipped with a body fluid (BF) mode to count and differentiate leukocytes in BF samples including cerebrospinal fluid (CSF). However, its diagnostic accuracy is not reliable for CSF samples with low cell concentration at the border between normal and pathologic level. To overcome this limitation, a new flow cytometry-based technology, termed "high sensitive analysis (hsA) mode," has been developed. In addition, the XN series analyzer has been equipped with the automated digital cell imaging analyzer DI-60 to classify cell morphology including normal leukocytes differential and abnormal malignant cells detection. Using various BF samples, we evaluated the performance of the XN-hsA mode and DI-60 compared to manual microscopic examination. The reproducibility of the XN-hsA mode showed good results in samples with low cell densities (coefficient of variation; % CV: 7.8% for 6 cells/μL). The linearity of the XN-hsA mode was established up to 938 cells/μL. The cell number obtained using the XN-hsA mode correlated highly with the corresponding microscopic examination. Good correlation was also observed between the DI-60 analyses and manual microscopic classification for all leukocyte types, except monocytes. In conclusion, the combined use of cell counting with the XN-hsA mode and automated morphological analyses using the DI-60 mode is potentially useful for the automated analysis of BF cells.
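The two figures of merit this evaluation quotes, percent CV of replicate low-level counts and correlation of analyzer counts against microscopy, are simple to compute. The sketch below uses invented numbers purely to show the calculations; it is not data from the study.

```python
import statistics

# Illustrative calculation of the evaluation's figures of merit; all of the
# numbers below are invented for demonstration, not study data.
def percent_cv(values):
    """Coefficient of variation of replicate counts, in percent."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

def pearson_r(xs, ys):
    """Pearson correlation between analyzer and reference counts."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

replicates = [6, 6, 7, 5, 6, 7]        # cells/uL, low-level CSF replicates
analyzer = [6, 40, 120, 500, 938]      # hsA-mode counts across the range
microscope = [5, 42, 115, 510, 930]    # paired manual reference counts
print(round(percent_cv(replicates), 1))
print(round(pearson_r(analyzer, microscope), 4))
```

At cell counts this low, Poisson counting noise dominates, which is why a CV of several percent at 6 cells/μL is considered good reproducibility.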

  10. Energy Production System Management - Renewable energy power supply integration with Building Automation System

    International Nuclear Information System (INIS)

    Figueiredo, Joao; Martins, Joao

    2010-01-01

    Intelligent building, historically and technologically, refers to the integration of four distinctive systems: Building Automation Systems (BAS), Telecommunication Systems, Office Automation Systems, and Computer Building Management Systems. The increasingly sophisticated BAS has become the 'heart and soul' of modern intelligent buildings. Integrating energy supply and demand elements, often known as Demand-Side Management (DSM), has become an important energy efficiency policy concept. Nowadays, European countries have diversified their power supplies, reducing their dependence on OPEC and developing a broader mix of energy sources that maximizes the use of renewable domestic energy sources. In this way it makes sense to include a fifth system in the intelligent building group: Energy Production System Management (EPSM). This paper presents a Building Automation System in which Demand-Side Management is fully integrated with the building's Energy Production System, which incorporates a complete set of renewable energy production and storage systems.

  11. Microwave-Assisted Sample Treatment in a Fully Automated Flow-Based Instrument: Oxidation of Reduced Technetium Species in the Analysis of Total Technetium-99 in Caustic Aged Nuclear Waste Samples

    International Nuclear Information System (INIS)

    Egorov, Oleg B.; O'Hara, Matthew J.; Grate, Jay W.

    2004-01-01

    An automated flow-based instrument for microwave-assisted treatment of liquid samples has been developed and characterized. The instrument utilizes a flow-through reaction vessel design that facilitates the addition of multiple reagents during sample treatment and the removal of gaseous reaction products, and enables quantitative removal of liquids from the reaction vessel for carryover-free operation. Matrix modification and speciation control chemistries required for the radiochemical determination of total 99Tc in caustic aged nuclear waste samples have been investigated. A rapid and quantitative oxidation procedure using peroxydisulfate in acidic solution was developed to convert reduced technetium species to pertechnetate in samples with a high content of reducing organics. The effectiveness of the automated sample treatment procedures has been validated in the radiochemical analysis of total 99Tc in caustic aged nuclear waste matrixes from the Hanford site.

  12. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchase system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  13. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  14. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation and demand for customized products and services on one side, and the need to achieve a constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to evolve substantially.

  15. A Fully Automated Radiosynthesis of [18F]Fluoroethyl-Diprenorphine on a Single Module by Use of SPE Cartridges for Preparation of High Quality 2-[18F]Fluoroethyl Tosylate

    Directory of Open Access Journals (Sweden)

    Gjermund Henriksen

    2013-06-01

    Full Text Available We have developed a new method for the automated production of 2-[18F]fluoroethyl tosylate ([18F]FETos) that enables 18F-alkylation to provide PET tracers with high chemical purity. The method is based on the removal of excess ethylene glycol bistosylate precursor by precipitation, followed by filtration and purification of the filtrate by means of solid-phase extraction (SPE) cartridges. The method is integrated into a single synthesis module and thereby provides the advantage over previous methods of not requiring HPLC purification, as demonstrated by the full radiosynthesis of the potent opioid receptor PET tracer [18F]fluoroethyl-diprenorphine.

  16. Magnetic Resonance Parkinsonism Index: diagnostic accuracy of a fully automated algorithm in comparison with the manual measurement in a large Italian multicentre study in patients with progressive supranuclear palsy

    International Nuclear Information System (INIS)

    Nigro, Salvatore; Arabia, Gennarina; Antonini, Angelo; Weis, Luca; Marcante, Andrea; Tessitore, Alessandro; Cirillo, Mario; Tedeschi, Gioacchino; Zanigni, Stefano; Tonon, Caterina; Calandra-Buonaura, Giovanna; Pezzoli, Gianni; Cilia, Roberto; Zappia, Mario; Nicoletti, Alessandra; Cicero, Calogero Edoardo; Tinazzi, Michele; Tocco, Pierluigi; Cardobi, Nicolo; Quattrone, Aldo

    2017-01-01

    To investigate the reliability of a new in-house automatic algorithm for calculating the Magnetic Resonance Parkinsonism Index (MRPI), in a large multicentre study population of patients affected by progressive supranuclear palsy (PSP) or Parkinson's disease (PD), and healthy controls (HC), and to compare the diagnostic accuracy of the automatic and manual MRPI values. The study included 88 PSP patients, 234 PD patients and 117 controls. MRI was performed using both 3T and 1.5T scanners. Automatic and manual MRPI values were evaluated, and accuracy of both methods in distinguishing PSP from PD and controls was calculated. No statistical differences were found between automated and manual MRPI values in all groups. The automatic MRPI values differentiated PSP from PD with an accuracy of 95 % (manual MRPI accuracy 96 %) and 97 % (manual MRPI accuracy 100 %) for 1.5T and 3T scanners, respectively. Our study showed that the new in-house automated method for MRPI calculation was highly accurate in distinguishing PSP from PD. Our automatic approach allows a widespread use of MRPI in clinical practice and in longitudinal research studies. (orig.)
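The MRPI compared in this study is a product of two ratios taken from midbrain, pontine and cerebellar peduncle measurements; a sketch, assuming the commonly published definition MRPI = (pons area / midbrain area) × (MCP width / SCP width), with invented measurement values:

```python
def mrpi(pons_area, midbrain_area, mcp_width, scp_width):
    """Magnetic Resonance Parkinsonism Index:
    (pons area / midbrain area) * (MCP width / SCP width)."""
    return (pons_area / midbrain_area) * (mcp_width / scp_width)

# Invented measurements (areas in mm^2, widths in mm), not from the study.
value = mrpi(pons_area=540.0, midbrain_area=95.0, mcp_width=7.4, scp_width=2.8)
print(f"MRPI = {value:.2f}")
```

The automated algorithm described above replaces the four manual measurements with automatic ones; the index arithmetic itself is unchanged.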

  17. Magnetic Resonance Parkinsonism Index: diagnostic accuracy of a fully automated algorithm in comparison with the manual measurement in a large Italian multicentre study in patients with progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Nigro, Salvatore [National Research Council, Institute of Bioimaging and Molecular Physiology, Catanzaro (Italy); Arabia, Gennarina [University "Magna Graecia", Institute of Neurology, Department of Medical and Surgical Sciences, Catanzaro (Italy); Antonini, Angelo; Weis, Luca; Marcante, Andrea ["Fondazione Ospedale San Camillo" - I.R.C.C.S, Parkinson's Disease and Movement Disorders Unit, Venice-Lido (Italy); Tessitore, Alessandro; Cirillo, Mario; Tedeschi, Gioacchino [Second University of Naples, Department of Medical, Surgical, Neurological, Metabolic and Aging Sciences, Naples (Italy); Second University of Naples, MRI Research Center SUN-FISM, Naples (Italy); Zanigni, Stefano; Tonon, Caterina [Policlinico S. Orsola - Malpighi, Functional MR Unit, Bologna (Italy); University of Bologna, Department of Biomedical and Neuromotor Sciences, Bologna (Italy); Calandra-Buonaura, Giovanna [University of Bologna, Department of Biomedical and Neuromotor Sciences, Bologna (Italy); IRCCS Istituto delle Scienze Neurologiche di Bologna, Bologna (Italy); Pezzoli, Gianni; Cilia, Roberto [ASST G.Pini - CTO, ex ICP, Parkinson Institute, Milano (Italy); Zappia, Mario; Nicoletti, Alessandra; Cicero, Calogero Edoardo [University of Catania, Department "G.F. Ingrassia", Section of Neurosciences, Catania (Italy); Tinazzi, Michele; Tocco, Pierluigi [University Hospital of Verona, Department of Neurological and Movement Sciences, Verona (Italy); Cardobi, Nicolo [University Hospital of Verona, Institute of Radiology, Verona (Italy); Quattrone, Aldo [National Research Council, Institute of Bioimaging and Molecular Physiology, Catanzaro (Italy); University "Magna Graecia", Institute of Neurology, Department of Medical and Surgical Sciences, Catanzaro (Italy)

    2017-06-15

    To investigate the reliability of a new in-house automatic algorithm for calculating the Magnetic Resonance Parkinsonism Index (MRPI), in a large multicentre study population of patients affected by progressive supranuclear palsy (PSP) or Parkinson's disease (PD), and healthy controls (HC), and to compare the diagnostic accuracy of the automatic and manual MRPI values. The study included 88 PSP patients, 234 PD patients and 117 controls. MRI was performed using both 3T and 1.5T scanners. Automatic and manual MRPI values were evaluated, and accuracy of both methods in distinguishing PSP from PD and controls was calculated. No statistical differences were found between automated and manual MRPI values in all groups. The automatic MRPI values differentiated PSP from PD with an accuracy of 95 % (manual MRPI accuracy 96 %) and 97 % (manual MRPI accuracy 100 %) for 1.5T and 3T scanners, respectively. Our study showed that the new in-house automated method for MRPI calculation was highly accurate in distinguishing PSP from PD. Our automatic approach allows a widespread use of MRPI in clinical practice and in longitudinal research studies. (orig.)

  18. Flexible demand in the GB domestic electricity sector in 2030

    International Nuclear Information System (INIS)

    Drysdale, Brian; Wu, Jianzhong; Jenkins, Nick

    2015-01-01

    Highlights: • Annual domestic demand by category and daily flexible load profiles are shown to 2030. • Valuable flexible demand requires loads to be identifiable, accessible, and useful. • The extent of flexible demand varies significantly on a diurnal and seasonal basis. • Barriers to accessing domestic demand include multiple low value loads and apathy. • The existing market structure is a barrier to fully rewarding individual load flexibility. - Abstract: In order to meet greenhouse gas emissions targets, Great Britain's (GB) future electricity supply will include a higher fraction of non-dispatchable generation, increasing opportunities for demand side management to maintain a supply/demand balance. This paper examines the extent of flexible domestic demand (FDD) in GB, its usefulness in system balancing and appropriate incentives to encourage consumers to participate. FDD, classified as electric space and water heating (ESWH), and cold and wet appliances, amounted to 59 TW h in 2012 (113 TW h total domestic demand) and is calculated to increase to 67 TW h in 2030. Summer and winter daily load profiles for flexible loads show significant seasonal and diurnal variations in the total flexible load and between load categories. Low levels of reflective consumer engagement with electricity consumption and a resistance to automation present barriers to effective access to FDD. A value of £1.97/household/year has been calculated for cold appliance loads used for frequency response in 2030, using 2013 market rates. The introduction of smart meters in GB by 2020 will allow access to FDD for system balancing. The low commercial value of individual domestic loads increases the attractiveness of non-financial incentives to fully exploit FDD. It was shown that appliance loads have different characteristics which can contribute to an efficient power system in different ways.

  19. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck

    2013-01-01

    , and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operation for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase...... extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C18 column using a 6.5 min 0.1 % ammonia (25...

  20. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully-automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  1. A fully automated and reproducible level-set segmentation approach for generation of MR-based attenuation correction map of PET images in the brain employing single STE-MR imaging modality

    Energy Technology Data Exchange (ETDEWEB)

    Kazerooni, Anahita Fathi; Aarabi, Mohammad Hadi [Quantitative MR Imaging and Spectroscopy Group, Research Center for Cellular and Molecular Imaging, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Ay, Mohammadreza [Quantitative MR Imaging and Spectroscopy Group, Research Center for Cellular and Molecular Imaging, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Medical Imaging Systems Group, Research Center for Cellular and Molecular Imaging, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Rad, Hamidreza Saligheh [Quantitative MR Imaging and Spectroscopy Group, Research Center for Cellular and Molecular Imaging, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of); Department of Medical Physics and Biomedical Engineering, Tehran University of Medical Sciences, Tehran (Iran, Islamic Republic of)

    2014-07-29

    Generating an MR-based attenuation correction map (μ-map) for quantitative reconstruction of PET images still remains a challenge in hybrid PET/MRI systems, mainly because cortical bone structures are indistinguishable from proximal air cavities in conventional MR images. Recently, the development of short echo-time (STE) MR imaging sequences has shown promise in differentiating cortical bone from air. However, on STE-MR images, the bone appears with discontinuous boundaries. Therefore, segmentation techniques based on intensity classification, such as thresholding or fuzzy C-means, fail to homogeneously delineate bone boundaries, especially in the presence of intrinsic noise and intensity inhomogeneity. Consequently, they cannot be fully automated, must be fine-tuned on a case-by-case basis, and require additional morphological operations for segmentation refinement. To overcome these problems, in this study we introduce a new fully automatic and reproducible STE-MR segmentation approach exploiting level sets in a clustering-based intensity inhomogeneity correction framework to reliably delineate bone from soft tissue and air.

  2. Fully Automated Quantification of the Striatal Uptake Ratio of [99mTc]-TRODAT with SPECT Imaging: Evaluation of the Diagnostic Performance in Parkinson's Disease and the Temporal Regression of Striatal Tracer Uptake

    Science.gov (United States)

    Fang, Yu-Hua Dean; Chiu, Shao-Chieh; Lu, Chin-Song; Weng, Yi-Hsin

    2015-01-01

    Purpose. We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [99mTc]-TRODAT with SPECT imaging. Procedures. A normal [99mTc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. Results. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R² = 0.84. Conclusions. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients. PMID:26366413
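The abstract does not give the exact SRR formula, so the specific-binding form below is an assumption; the sensitivity/specificity/accuracy arithmetic, however, follows directly from a 2×2 confusion matrix, and the invented counts are chosen to be consistent with the reported 92%/90%/92% figures for 365 PD and 61 nPD subjects:

```python
def srr(striatal_mean, reference_mean):
    """Striatal-to-reference ratio. A common specific-binding form for
    dopamine-transporter SPECT is (striatal - reference) / reference;
    the abstract does not state the exact formula, so this is an assumption."""
    return (striatal_mean - reference_mean) / reference_mean

def diagnostic_metrics(tp, fn, tn, fp):
    """Sensitivity, specificity and accuracy from a 2x2 confusion matrix."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + fn + tn + fp)
    return sensitivity, specificity, accuracy

# Illustrative counts back-calculated to match the reported performance:
# 365 PD patients (positives), 61 nPD subjects (negatives).
sens, spec, acc = diagnostic_metrics(tp=336, fn=29, tn=55, fp=6)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}, accuracy={acc:.2f}")
```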

  3. Fully Automated Quantification of the Striatal Uptake Ratio of [(99m)Tc]-TRODAT with SPECT Imaging: Evaluation of the Diagnostic Performance in Parkinson's Disease and the Temporal Regression of Striatal Tracer Uptake.

    Science.gov (United States)

    Fang, Yu-Hua Dean; Chiu, Shao-Chieh; Lu, Chin-Song; Yen, Tzu-Chen; Weng, Yi-Hsin

    2015-01-01

    We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [(99m)Tc]-TRODAT with SPECT imaging. A normal [(99m)Tc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R² = 0.84. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients.

  4. Fully Automated Quantification of the Striatal Uptake Ratio of [99mTc]-TRODAT with SPECT Imaging: Evaluation of the Diagnostic Performance in Parkinson’s Disease and the Temporal Regression of Striatal Tracer Uptake

    Directory of Open Access Journals (Sweden)

    Yu-Hua Dean Fang

    2015-01-01

    Full Text Available Purpose. We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [99mTc]-TRODAT with SPECT imaging. Procedures. A normal [99mTc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. Results. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R² = 0.84. Conclusions. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients.

  5. A fully automated and reproducible level-set segmentation approach for generation of MR-based attenuation correction map of PET images in the brain employing single STE-MR imaging modality

    International Nuclear Information System (INIS)

    Kazerooni, Anahita Fathi; Aarabi, Mohammad Hadi; Ay, Mohammadreza; Rad, Hamidreza Saligheh

    2014-01-01

    Generating an MR-based attenuation correction map (μ-map) for quantitative reconstruction of PET images still remains a challenge in hybrid PET/MRI systems, mainly because cortical bone structures are indistinguishable from proximal air cavities in conventional MR images. Recently, the development of short echo-time (STE) MR imaging sequences has shown promise in differentiating cortical bone from air. However, on STE-MR images, the bone appears with discontinuous boundaries. Therefore, segmentation techniques based on intensity classification, such as thresholding or fuzzy C-means, fail to homogeneously delineate bone boundaries, especially in the presence of intrinsic noise and intensity inhomogeneity. Consequently, they cannot be fully automated, must be fine-tuned on a case-by-case basis, and require additional morphological operations for segmentation refinement. To overcome these problems, in this study we introduce a new fully automatic and reproducible STE-MR segmentation approach exploiting level sets in a clustering-based intensity inhomogeneity correction framework to reliably delineate bone from soft tissue and air.

  6. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models....... This retooling addresses several shortcomings. First, the imperfect correlation of demands reconciles the sales variation observed in and across destinations. Second, since demands for the firm's output are correlated across destinations, a firm can use previously realized demands to forecast unknown demands...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  7. Anesthesiology, automation, and artificial intelligence.

    Science.gov (United States)

    Alexander, John C; Joshi, Girish P

    2018-01-01

    There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized.

  8. Manual segmentation of the fornix, fimbria, and alveus on high-resolution 3T MRI: Application via fully-automated mapping of the human memory circuit white and grey matter in healthy and pathological aging.

    Science.gov (United States)

    Amaral, Robert S C; Park, Min Tae M; Devenyi, Gabriel A; Lynn, Vivian; Pipitone, Jon; Winterburn, Julie; Chavez, Sofia; Schira, Mark; Lobaugh, Nancy J; Voineskos, Aristotle N; Pruessner, Jens C; Chakravarty, M Mallar

    2018-04-15

    Recently, much attention has been focused on the definition and structure of the hippocampus and its subfields, while the projections from the hippocampus have been relatively understudied. Here, we derive a reliable protocol for manual segmentation of hippocampal white matter regions (alveus, fimbria, and fornix) using high-resolution magnetic resonance images that are complementary to our previous definitions of the hippocampal subfields, both of which are freely available at https://github.com/cobralab/atlases. Our segmentation methods demonstrated high inter- and intra-rater reliability, were validated as inputs in automated segmentation, and were used to analyze the trajectory of these regions in both healthy aging (OASIS), and Alzheimer's disease (AD) and mild cognitive impairment (MCI; using ADNI). We observed significant bilateral decreases in the fornix in healthy aging while the alveus and cornu ammonis (CA) 1 were well preserved (all p's<0.006). MCI and AD demonstrated significant decreases in fimbriae and fornices. Many hippocampal subfields exhibited decreased volume in both MCI and AD, yet no significant differences were found between MCI and AD cohorts themselves. Our results suggest a neuroprotective or compensatory role for the alveus and CA1 in healthy aging and suggest that an improved understanding of the volumetric trajectories of these structures is required. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Automated Demand Response for Energy Sustainability

    Science.gov (United States)

    2015-05-01

    technology), particularly when coupled with an installation's microgrid control systems, could provide much needed stabilization. By causing load to...include advanced energy control systems that provide load reduction services to non-critical loads. The microgrid system will use these controls to...signals from the grid operator. Thus, the technology creates a dual-use model for advanced microgrid controls.

  10. Criteria for demand response systems

    NARCIS (Netherlands)

    Lampropoulos, I.; Kling, W.L.; Bosch, van den P.P.J.; Ribeiro, P.F.; Berg, van den J.

    2013-01-01

    The topic of demand side management is currently becoming more important than ever, in parallel with the further deregulation of the electricity sector, and the increasing integration of renewable energy sources. A historical review of automation integration in power system control assists in

  11. Development of a fully automated, web-based, tailored intervention promoting regular physical activity among insufficiently active adults with type 2 diabetes: integrating the I-change model, self-determination theory, and motivational interviewing components.

    Science.gov (United States)

    Moreau, Michel; Gagnon, Marie-Pierre; Boudreau, François

    2015-02-17

    Type 2 diabetes is a major challenge for Canadian public health authorities, and regular physical activity is a key factor in the management of this disease. Given that fewer than half of people with type 2 diabetes in Canada are sufficiently active to meet the recommendations, effective programs targeting the adoption of regular physical activity (PA) are in demand for this population. Many researchers argue that Web-based, tailored interventions targeting PA are a promising and effective avenue for sedentary populations like Canadians with type 2 diabetes, but few have described the detailed development of this kind of intervention. This paper aims to describe the systematic development of the Web-based, tailored intervention, Diabète en Forme, promoting regular aerobic PA among adult Canadian francophones with type 2 diabetes. This paper can be used as a reference for health professionals interested in developing similar interventions. We also explored the integration of theoretical components derived from the I-Change Model, Self-Determination Theory, and Motivational Interviewing, which is a potential path for enhancing the effectiveness of tailored interventions on PA adoption and maintenance. The intervention development was based on the program-planning model for tailored interventions of Kreuter et al. An additional step was added to the model to evaluate the intervention's usability prior to the implementation phase. An 8-week intervention was developed. The key components of the intervention include a self-monitoring tool for PA behavior, a weekly action planning tool, and eight tailored motivational sessions based on attitude, self-efficacy, intention, type of motivation, PA behavior, and other constructs and techniques. Usability evaluation, a step added to the program-planning model, helped to make several improvements to the intervention prior to the implementation phase. The intervention development cost was about CDN $59,700 and took approximately

  12. Fully portable blood irradiator

    International Nuclear Information System (INIS)

    Hungate, F.P.; Riemath, W.F.; Bunnell, L.R.

    1980-01-01

    A fully portable blood irradiator was developed using the beta emitter thulium-170 as the radiation source and vitreous carbon as the body of the irradiator, matrix for isotope encapsulation, and blood interface material. These units were placed in exteriorized arteriovenous shunts in goats, sheep, and dogs and the effects on circulating lymphocytes and on skin allograft retention times measured. The present work extends these studies by establishing baseline data for skin graft rejection times in untreated animals

  13. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128
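Of the automated steps envisaged above, the meta-analysis calculation is the most mechanical; a fixed-effect inverse-variance pooling, one standard formulation of that step, can be sketched in a few lines (the effect sizes and standard errors below are invented):

```python
import math

def fixed_effect_meta(effects, std_errors):
    """Inverse-variance fixed-effect pooling of study effect sizes.
    Returns the pooled effect and its standard error."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Invented log-odds-ratio effects from three hypothetical trials
effects = [-0.30, -0.10, -0.25]
ses = [0.10, 0.15, 0.12]
pooled, se = fixed_effect_meta(effects, ses)
print(f"pooled effect = {pooled:.3f}, SE = {se:.4f}")
```

A real pipeline would feed extracted trial data into this step and add a heterogeneity check (e.g. a random-effects model) before reporting.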

  14. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the need for careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry drove the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals will be discussed. ► Generator eluate post-processing and the evolution to cassette-based systems were the major issues in automation. ► The impact of regulations on the technological development will also be considered

  15. Android Fully Loaded

    CERN Document Server

    Huddleston, Rob

    2012-01-01

    Fully loaded with the latest tricks and tips on your new Android! Android smartphones are so hot, they're soaring past iPhones on the sales charts. And the second edition of this muscular little book is equally impressive--it's packed with tips and tricks for getting the very most out of your latest-generation Android device. Start Facebooking and tweeting with your Android mobile, scan barcodes to get pricing and product reviews, download your favorite TV shows--the book is positively bursting with practical and fun how-tos. Topics run the gamut from using speech recognition, location-based m

  16. An expert system for automated robotic grasping

    International Nuclear Information System (INIS)

    Stansfield, S.A.

    1990-01-01

    Many US Department of Energy sites and facilities will be environmentally remediated during the next several decades. A number of the restoration activities (e.g., decontamination and decommissioning of inactive nuclear facilities) can only be carried out by remote means and will be manipulation-intensive tasks. Experience has shown that manipulation tasks are especially slow and fatiguing for the human operator of a remote manipulator. In this paper, the authors present a rule-based expert system for automated, dextrous robotic grasping. This system interprets the features of an object to generate hand shaping and wrist orientation for a robot hand and arm. The system can be used in several different ways to lessen the demands on the human operator of a remote manipulation system - either as a fully autonomous grasping system or one that generates grasping options for a human operator and then automatically carries out the selected option
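    The entry above describes rules that map object features to hand shaping and wrist orientation. Below is a minimal illustrative sketch of that idea; the feature names, grasp types, and rules are assumptions for demonstration, not Stansfield's actual rule base.

```python
# Hypothetical rule-based grasp selector: coarse object features in,
# hand preshape and wrist orientation out.

def select_grasp(features):
    """Map object features to a hand preshape and wrist orientation."""
    shape = features.get("shape")
    width = features.get("width_cm", 0.0)
    if shape == "cylinder" and width <= 8.0:
        return {"preshape": "wrap", "wrist": "side"}
    if shape == "cylinder":
        return {"preshape": "two_hand", "wrist": "side"}
    if shape == "flat_plate":
        return {"preshape": "pinch", "wrist": "top"}
    if shape == "sphere":
        return {"preshape": "spherical", "wrist": "top"}
    # Conservative fallback for unrecognized objects.
    return {"preshape": "power", "wrist": "top"}
```

Such a rule table can either act fully autonomously or present its candidate grasps to a human operator, matching the two usage modes described in the abstract.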

  17. Information management - Assessing the demand for information

    Science.gov (United States)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  18. A Fully Automated Stage for Optical Waveguide Measurements

    Science.gov (United States)

    1993-09-01

    the exit prism with the measurement prism off of the waveguide. This value, Peg, is used as the reference for all measurements taken along the points...

  19. Fully Automated Concentration Control of the Acidic Texturisation Process

    OpenAIRE

    Dannenberg, T.; Zimmer, M.; Rentsch, J.

    2012-01-01

    To enable concentration control in the acidic texturing process, we have closed the feedback loop from analytical data to the dosing mechanism of the process tool used. To analyze the process bath, we used near-infrared spectroscopy in an online setup as well as ion chromatography as an inline method in a second approach. The developed dosing algorithm allows optimization of the HF and HNO3 concentrations as a function of the Si concentration. This allows a further optimization o...

  20. Simple Fully Automated Group Classification on Brain fMRI

    International Nuclear Information System (INIS)

    Honorio, J.; Goldstein, R.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-01-01

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, imperfect registration, and subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averaging across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
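    The abstract above combines per-feature selection with majority voting. The sketch below illustrates the voting scheme only, using a simple midpoint threshold per feature as a stand-in for the authors' threshold-split region method; it is an assumption-laden toy, not their algorithm.

```python
# Toy majority-vote group classifier: one decision stump per feature,
# trained as the midpoint between the two group means.

def train_stumps(X_a, X_b):
    """For each feature, learn a midpoint threshold between group means."""
    stumps = []
    for j in range(len(X_a[0])):
        mean_a = sum(x[j] for x in X_a) / len(X_a)
        mean_b = sum(x[j] for x in X_b) / len(X_b)
        thr = (mean_a + mean_b) / 2.0
        a_above = mean_a > mean_b  # which side of thr votes for group A
        stumps.append((j, thr, a_above))
    return stumps

def majority_vote(stumps, x):
    """Classify one subject by majority vote across all feature stumps."""
    votes_a = sum(1 for j, thr, a_above in stumps
                  if (x[j] > thr) == a_above)
    return "a" if votes_a * 2 > len(stumps) else "b"
```

Treating each feature independently and aggregating many weak votes is what lets such a design cope with few subjects and high noise, as the abstract argues.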

  1. Simple Fully Automated Group Classification on Brain fMRI

    Energy Technology Data Exchange (ETDEWEB)

    Honorio, J.; Goldstein, R.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability, imperfect registration, and subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averaging across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.

  2. Rapid and fully automated Measurement of Water Vapor Sorption Isotherms

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Møldrup, Per

    2014-01-01

    Eminent environmental challenges such as remediation of contaminated sites, the establishment and maintenance of nuclear waste repositories, or the design of surface landfill covers all require accurate quantification of the soil water characteristic at low water contents. Furthermore, several...

  3. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I briefly present a real-time designed and implemented software- and hardware-oriented house automation research project, capable of automating a house's electricity and providing a security system to detect the presence of unexpected behavior.

  4. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system, (sustainability) specifications move top-down, which helps avoiding sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  5. An inventory control project in a major Danish company using compound renewal demand models

    DEFF Research Database (Denmark)

    Larsen, Christian; Seiding, Claus Hoe; Teller, Christian

    We describe the development of a framework to compute the optimal inventory policy for a large spare-parts' distribution centre operation in the RA division of the Danfoss Group in Denmark. The RA division distributes spare parts worldwide for cooling and A/C systems. The warehouse logistics operation is highly automated. However, the procedures for estimating demands and the policies for the inventory control system that were in use at the beginning of the project did not fully match the sophisticated technological standard of the physical system. During the initial phase of the project...

  6. Integrating Standard Operating Procedures with Spacecraft Automation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation can be used to greatly reduce the demands on crew member and flight controllers time and attention. Automation can monitor critical resources,...

  7. Fully electric waste collection

    CERN Multimedia

    Anaïs Schaeffer

    2015-01-01

    Since 15 June, Transvoirie, which provides waste collection services throughout French-speaking Switzerland, has been using a fully electric lorry for its collections on the CERN site – a first for the region!   Featuring a motor powered by electric batteries that charge up when the brakes are used, the new lorry that roams the CERN site is as green as can be. And it’s not only the motor that’s electric: its waste compactor and lifting mechanism are also electrically powered*, making it the first 100% electric waste collection vehicle in French-speaking Switzerland. Considering that a total of 15.5 tonnes of household waste and paper/cardboard are collected each week from the Meyrin and Prévessin sites, the benefits for the environment are clear. This improvement comes as part of CERN’s contract with Transvoirie, which stipulates that the firm must propose ways of becoming more environmentally friendly (at no extra cost to CERN). *The was...

  8. In demand

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, B. [Bridgestone Ltd. (United Kingdom)

    2005-11-01

    The paper explains how good relationships can help alleviate potential tyre shortages. Demand for large dump truck tyres (largely for China) has increased by 50% within 12 months. Bridgestone's manufacturing plants are operating at maximum capacity. The company supplies tyres to all vehicles at Scottish Coal's opencast coal mines. Its Tyre Management System (TMS) supplied free of charge to customers helps maximise tyre life and minimise downtime from data on pressure, tread and general conditions fed into the hand-held TMS computer. 3 photos.

  9. Safe interaction between cyclists, pedestrians and automated vehicles : what do we know and what do we need to know?

    NARCIS (Netherlands)

    Vissers, L.; Kint, S. van der; Schagen, I.N.L.G. van; Hagenzieker, M.P.

    2017-01-01

    Automated vehicles are gradually entering our roadway system. Before our roads will be solely used by fully automated vehicles, a long transition period is to be expected in which fully automated vehicles, partly automated vehicles and manually-driven vehicles have to share our roads. The current

  10. Smart Buildings and Demand Response

    Science.gov (United States)

    Kiliccote, Sila; Piette, Mary Ann; Ghatikar, Girish

    2011-11-01

    Advances in communications and control technology, the strengthening of the Internet, and the growing appreciation of the urgency to reduce demand-side energy use are motivating the development of improvements in both energy efficiency and demand response (DR) systems in buildings. This paper provides a framework linking continuous energy management and continuous communications for automated demand response (Auto-DR) at various time scales. We provide a set of concepts for monitoring and controls linked to standards and procedures such as the Open Automated Demand Response (OpenADR) communications standard. Basic building energy science and control issues in this approach begin with key building components, systems, end-uses and whole-building energy performance metrics. The paper presents a framework concerning when energy is used, levels of service by energy-using systems, granularity of control, and speed of telemetry. DR, when defined as a discrete event, requires a different set of building service levels than daily operations. We provide examples of lessons from DR case studies and links to energy efficiency.
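    The Auto-DR concept above (see also the report in the header) boils down to mapping an external DR signal to a preprogrammed shed strategy, with an operator opt-out. The sketch below illustrates that control logic; the signal levels, setpoint offsets, and lighting fractions are illustrative assumptions, not the OpenADR wire format or any published tariff.

```python
# Hypothetical building controller for an Auto-DR event: each signal level
# selects a preprogrammed load-shed strategy; the operator may opt out.

SHED_STRATEGIES = {
    "normal":   {"cooling_setpoint_offset_F": 0, "lighting_fraction": 1.00},
    "moderate": {"cooling_setpoint_offset_F": 2, "lighting_fraction": 0.85},
    "high":     {"cooling_setpoint_offset_F": 4, "lighting_fraction": 0.70},
}

def respond_to_signal(level, opt_out=False):
    """Return load-shed actions for a DR signal, honoring operator override."""
    if opt_out:
        # Operator chose to override this event; keep normal service levels.
        return SHED_STRATEGIES["normal"]
    # Unknown levels default to normal operation rather than shedding blindly.
    return SHED_STRATEGIES.get(level, SHED_STRATEGIES["normal"])
```

The opt-out branch mirrors the override capability the header report calls essential: automation initiates the shed, but the facility manager retains final control.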

  11. Multipartite fully nonlocal quantum states

    International Nuclear Information System (INIS)

    Almeida, Mafalda L.; Cavalcanti, Daniel; Scarani, Valerio; Acin, Antonio

    2010-01-01

    We present a general method for characterizing the quantum correlations obtained after local measurements on multipartite systems. Sufficient conditions for a quantum system to be fully nonlocal according to a given partition, as well as being (genuinely) multipartite fully nonlocal, are derived. These conditions allow us to identify all completely connected graph states as multipartite fully nonlocal quantum states. Moreover, we show that this feature can also be observed in mixed states: the tensor product of five copies of the Smolin state, a biseparable and bound entangled state, is multipartite fully nonlocal.

  12. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  13. A fully robust PARAFAC method for analyzing fluorescence data

    DEFF Research Database (Denmark)

    Engelen, Sanne; Frosch, Stina; Jørgensen, Bo

    2009-01-01

    and Rayleigh scatter. Recently, a robust PARAFAC method that circumvents the harmful effects of outlying samples has been developed. For removing the scatter effects on the final PARAFAC model, different techniques exist. More recently, an automated scatter identification tool has been constructed. However, there still exists no robust method for handling fluorescence data encountering both outlying EEM landscapes and scatter. In this paper, we present an iterative algorithm where the robust PARAFAC method and the scatter identification tool are alternately performed. A fully automated robust PARAFAC method...

  14. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  15. Automated optical assembly

    Science.gov (United States)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The increased cost savings inherent in the utilization of optical-grade polymers outweighs almost every advantage of using glass for high volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  16. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
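    The entry above automates the choice of hot and cold calibration pixels. The sketch below shows the simplest version of that idea: pick the coolest well-vegetated pixel and the hottest bare pixel from surface temperature and NDVI. The NDVI cutoffs are illustrative assumptions; the paper uses machine learning and search algorithms rather than fixed thresholds.

```python
# Simplified endmember selection for a SEBAL/METRIC-style calibration:
# cold pixel = coolest well-vegetated pixel, hot pixel = hottest bare pixel.

def select_endmembers(pixels, ndvi_cold=0.7, ndvi_hot=0.2):
    """pixels: list of (surface_temp_K, ndvi) tuples.
    Returns the (cold, hot) endmember pixels."""
    vegetated = [p for p in pixels if p[1] >= ndvi_cold]
    bare = [p for p in pixels if p[1] <= ndvi_hot]
    if not vegetated or not bare:
        raise ValueError("no candidate pixels satisfy the NDVI constraints")
    cold = min(vegetated, key=lambda p: p[0])  # coolest vegetated pixel
    hot = max(bare, key=lambda p: p[0])        # hottest bare pixel
    return cold, hot
```

Automating this step removes the operator judgment that the abstract identifies as a source of bias, which is why agreement with an experienced operator (R2 and NSE ≥ 0.92) is the headline result.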

  17. Varying Levels of Automation on UAS Operator Responses to Traffic Resolution Advisories in Civil Airspace

    Science.gov (United States)

    Kenny, Caitlin; Fern, Lisa

    2012-01-01

    Continuing demand for the use of Unmanned Aircraft Systems (UAS) has put increasing pressure on operations in civil airspace. The need to fly UAS in the National Airspace System (NAS) in order to perform missions vital to national security and defense, emergency management, and science is increasing at a rapid pace. In order to ensure safe operations in the NAS, operators of unmanned aircraft, like those of manned aircraft, may be required to maintain separation assurance and avoid loss of separation with other aircraft while performing their mission tasks. This experiment investigated the effects of varying levels of automation on UAS operator performance and workload while responding to conflict resolution instructions provided by the Tactical Collision Avoidance System II (TCAS II) during a UAS mission in high-density airspace. The purpose of this study was not to investigate the safety of using TCAS II on UAS, but rather to examine the effect of automation on the ability of operators to respond to traffic collision alerts. Six licensed pilots were recruited to act as UAS operators for this study. Operators were instructed to follow a specified mission flight path, while maintaining radio contact with Air Traffic Control and responding to TCAS II resolution advisories. Operators flew four, 45 minute, experimental missions with four different levels of automation: Manual, Knobs, Management by Exception, and Fully Automated. All missions included TCAS II Resolution Advisories (RAs) that required operator attention and rerouting. Operator compliance and reaction time to RAs was measured, and post-run NASA-TLX ratings were collected to measure workload. Results showed significantly higher compliance rates, faster responses to TCAS II alerts, as well as less preemptive operator actions when higher levels of automation are implemented. Physical and Temporal ratings of workload were significantly higher in the Manual condition than in the Management by Exception and

  18. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10^9 bits per 14 x 17 inch film, the equivalent of 2,200 computer floppy discs. Parts handling systems and robotics, already applied in manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer-aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 inch film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (approx. 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author) [pt

  19. Using microwave Doppler radar in automated manufacturing applications

    Science.gov (United States)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help

  20. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still require human intervention, improvements were focussed on an interactive processing section (data input and correcting operation) which necessitates a vast amount of work. As a result, human intervention was eliminated, the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as described below. (1) The interactive processing time required for generating drawings was improved. It was determined that introduction of the CAD system has reduced the time required for generating drawings. (2) The difference in skills between workers preparing drawings has been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  1. Access control for on-demand provisioned cloud infrastructure services

    NARCIS (Netherlands)

    Ngo, C.T.

    2016-01-01

    The evolution of Cloud Computing brings advantages to both customers and service providers to utilize and manage computing and network resources more efficiently with virtualization, service-oriented architecture technologies, and automated on-demand resource provisioning. However, these advantages

  2. CMS on the GRID: Toward a fully distributed computing architecture

    International Nuclear Information System (INIS)

    Innocente, Vincenzo

    2003-01-01

    The computing systems required to collect, analyse and store the physics data at LHC would need to be distributed and global in scope. CMS is actively involved in several grid-related projects to develop and deploy a fully distributed computing architecture. We present here recent developments of tools for automating job submission and for serving data to remote analysis stations. Plans for further test and deployment of a production grid are also described

  3. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  4. Dynamic adaptive policymaking for the sustainable city: The case of automated taxis

    Directory of Open Access Journals (Sweden)

    Warren E. Walker

    2017-06-01

    By 2050, about two-thirds of the world's people are expected to live in urban areas. But the economic viability and sustainability of city centers is threatened by problems related to transport, such as pollution, congestion, and parking. Much has been written about automated vehicles and demand-responsive transport. The combination of these potentially disruptive developments could reduce these problems. However, implementation is held back by uncertainties, including public acceptance, liability, and privacy. So, their potential to reduce urban transport problems may not be fully realized. We propose an adaptive approach to implementation that takes some actions right away and creates a framework for future actions that allows for adaptations over time as knowledge about performance and acceptance of the new system (called 'automated taxis') accumulates and critical events for implementation take place. The adaptive approach is illustrated in the context of a hypothetical large city.

  5. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    Science.gov (United States)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn, and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.

  6. Virtual automation.

    Science.gov (United States)

    Casis, E; Garrido, A; Uranga, B; Vives, A; Zufiaurre, C

    2001-01-01

    Total laboratory automation (TLA) can be substituted in mid-size laboratories by computer-controlled sample workflow (virtual automation). Such a solution has been implemented in our laboratory using PSM, software developed for this purpose in cooperation with Roche Diagnostics (Barcelona, Spain). This software is connected to the online analyzers and to the laboratory information system, and controls and directs the samples, working as an intermediate station. The only difference with TLA is the replacement of transport belts by laboratory personnel. The implementation of this virtual automation system has allowed us to achieve the main advantages of TLA: workload increase (64%) with reduction in the cost per test (43%), significant reduction in the number of biochemistry primary tubes (from 8 to 2), less aliquoting (from 600 to 100 samples/day), automation of functional testing, drastic reduction of preanalytical errors (from 11.7 to 0.4% of the tubes) and better total response time for both inpatients (from up to 48 hours to up to 4 hours) and outpatients (from up to 10 days to up to 48 hours). As an additional advantage, virtual automation could be implemented without hardware investment and significant headcount reduction (15% in our lab).

  7. Physics of fully ionized regions

    International Nuclear Information System (INIS)

    Flower, D.

    1975-01-01

    In this paper the term fully ionised regions is taken to embrace both planetary nebulae and the so-called 'H II' regions, referred to here as H+ regions. Whilst these two types of gaseous nebulae are very different from an evolutionary standpoint, they are physically very similar, being characterised by photoionisation of a low-density plasma by a hot star. (Auth.)

  8. Fully 3D GPU PET reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L., E-mail: joaquin@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Cal-Gonzalez, J. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Vaquero, J.J. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Desco, M. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but it is only with recent advances in GPU programmability that the best available reconstruction codes have begun to be implemented on GPUs. This work presents GPU-based fully 3D PET iterative reconstruction software. The new code can reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining the same images in both cases. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.
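
    As a concrete illustration of the iterative scheme such codes implement, the sketch below runs MLEM updates on a toy dense system matrix (the sizes, seed, and matrix are invented for illustration; the actual software models the scanner with PeneloPET and executes the projections on the GPU):

```python
import numpy as np

# Toy MLEM reconstruction: forward projection, back-projection of the
# data/model ratio, multiplicative update. All data here are synthetic.
rng = np.random.default_rng(0)
n_bins, n_vox = 64, 16
A = rng.random((n_bins, n_vox))       # toy system matrix (bin x voxel)
x_true = rng.random(n_vox)            # "true" activity image
y = A @ x_true                        # noise-free sinogram

x = np.ones(n_vox)                    # uniform starting image
sens = A.sum(axis=0)                  # sensitivity image (back-projected ones)
res0 = np.linalg.norm(A @ x - y)      # initial data misfit
for _ in range(100):
    fp = A @ x                                        # forward projection
    ratio = y / np.maximum(fp, 1e-12)                 # measured / estimated
    x *= (A.T @ ratio) / sens                         # back-project and update
res1 = np.linalg.norm(A @ x - y)
print(res1 < res0)                    # the fit to the data has improved
```

    Each iteration consists of one forward projection and one back-projection, which is exactly the structure that parallelizes well on a GPU.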

  9. Fully 3D GPU PET reconstruction

    International Nuclear Information System (INIS)

    Herraiz, J.L.; Espana, S.; Cal-Gonzalez, J.; Vaquero, J.J.; Desco, M.; Udias, J.M.

    2011-01-01

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but it is only with recent advances in GPU programmability that the best available reconstruction codes have begun to be implemented on GPUs. This work presents GPU-based fully 3D PET iterative reconstruction software. The new code can reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining the same images in both cases. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  10. Lidar Cloud Detection with Fully Convolutional Networks

    Science.gov (United States)

    Cromwell, E.; Flynn, D.

    2017-12-01

    The vertical distribution of clouds from active remote sensing instrumentation is a widely used data product from global atmospheric measuring sites. The presence of clouds can be expressed as a binary cloud mask and is a primary input for climate modeling efforts and cloud formation studies. Current cloud detection algorithms producing these masks do not accurately identify the cloud boundaries and tend to oversample or over-represent the cloud. This translates as uncertainty for assessing the radiative impact of clouds and tracking changes in cloud climatologies. The Atmospheric Radiation Measurement (ARM) program has over 20 years of micro-pulse lidar (MPL) and High Spectral Resolution Lidar (HSRL) instrument data and companion automated cloud mask products at the mid-latitude Southern Great Plains (SGP) and polar North Slope of Alaska (NSA) atmospheric observatories. Using these data, we train a fully convolutional network (FCN) with semi-supervised learning to segment lidar imagery into geometric time-height cloud locations for the SGP site and MPL instrument. We then use transfer learning to train a FCN for (1) the MPL instrument at the NSA site and (2) for the HSRL. In our semi-supervised approach, we pre-train the classification layers of the FCN with weakly labeled lidar data. Then, we facilitate end-to-end unsupervised pre-training and transition to fully supervised learning with ground truth labeled data. Our goal is to improve the cloud mask accuracy and precision for the MPL instrument to 95% and 80%, respectively, compared to the current cloud mask algorithms of 89% and 50%. For the transfer learning based FCN for the HSRL instrument, our goal is to achieve a cloud mask accuracy of 90% and a precision of 80%.
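
    The accuracy and precision figures quoted above can be made concrete with a toy binary-mask comparison (the arrays below are invented for illustration, not ARM data):

```python
import numpy as np

# Ground-truth cloud pixels (True = cloud) for a tiny time-height grid.
truth = np.array([[0, 1, 1, 0],
                  [0, 1, 1, 0],
                  [0, 0, 1, 0]], dtype=bool)
# Predicted cloud mask; it over-represents the cloud slightly.
pred  = np.array([[0, 1, 1, 1],
                  [0, 1, 1, 0],
                  [0, 1, 1, 0]], dtype=bool)

tp = np.sum(pred & truth)          # cloud correctly flagged
fp = np.sum(pred & ~truth)         # clear sky flagged as cloud (oversampling)
tn = np.sum(~pred & ~truth)        # clear sky correctly passed
fn = np.sum(~pred & truth)         # cloud missed

accuracy = (tp + tn) / (tp + tn + fp + fn)   # fraction of pixels correct
precision = tp / (tp + fp)                   # fraction of flagged pixels truly cloud
print(accuracy, precision)
```

    Precision rises only when the mask stops flagging clear-sky pixels as cloud, which is why it is the harder of the two targets for an oversampling detector.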

  11. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools that have taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  12. Library Automation.

    Science.gov (United States)

    Husby, Ole

    1990-01-01

    The challenges and potential benefits of automating university libraries are reviewed, with special attention given to cooperative systems. Aspects discussed include database size, the role of the university computer center, storage modes, multi-institutional systems, resource sharing, cooperative system management, networking, and intelligent…

  13. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible.

  14. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

    "DIAGNOSTICS BASED ON THE PRINCIPLES OF ARTIFICIAL INTELLIGENCE", 1984 International Test Conference ... Built-In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be

  15. Medical ADP Systems: Automated Medical Records Hold Promise to Improve Patient Care

    Science.gov (United States)

    1991-01-01

    automated medical records. The report discusses the potential benefits that automation could bring to the quality of patient care and the factors that impede ... information systems, but no organization has fully automated one of the most critical types of information, patient medical records. The patient medical record ... its review of automated medical records. GAO's objectives in this study were to identify the (1) benefits of automating patient records and (2) factors

  16. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein

    2013-10-01

    Full Text Available A two-component fully automated flood monitoring system is described and evaluated. This is a result of combining two individual flood services that are currently under development at DLR's (German Aerospace Center) Center for Satellite-based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations in different spatial resolutions and in the time-critical and on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web-client. The system is operationally demonstrated and evaluated via the monitoring of two recent flood events, in Russia (2013) and Albania/Montenegro (2013).
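
    A minimal sketch of the activation logic between the two components might look as follows (region names, thresholds, and the tasking interface are hypothetical; the operational system derives its threshold set from the MODIS classification):

```python
from dataclasses import dataclass

@dataclass
class Region:
    name: str
    flood_km2: float        # flood extent from the daily MODIS classification
    threshold_km2: float    # per-region activation threshold

def regions_to_task(regions):
    """Return the regions whose detected flood extent exceeds the threshold,
    i.e. where a TerraSAR-X acquisition should be requested."""
    return [r.name for r in regions if r.flood_km2 > r.threshold_km2]

regions = [Region("A", 12.0, 50.0), Region("B", 140.0, 50.0)]
print(regions_to_task(regions))   # -> ['B']
```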

  17. Flexible automated manufacturing for SMEs

    DEFF Research Database (Denmark)

    Grube Hansen, David; Bilberg, Arne; Madsen, Erik Skov

    2017-01-01

    SMEs are in general highly flexible and agile in order to accommodate customer demands in the paradigm of High Mix-Low Volume manufacturing. This flexibility and agility have mainly been enabled by manual labor, but as we enter the technology- and data-driven fourth industrial revolution, where augmented operators and machines work in cooperation in a highly flexible and productive manufacturing system, both an opportunity and a need have arisen for developing highly flexible and efficient automation.

  18. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In this age and time, when celerity is expected of all the sectors of the country, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To facilitate the speeding demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  19. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for all electrical substation protection and control systems. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading-edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a station LAN. This solution was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a LAN station that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.

  20. Axiomatisation of fully probabilistic design

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Kroupa, Tomáš

    2012-01-01

    Roč. 186, č. 1 (2012), s. 105-113 ISSN 0020-0255 R&D Projects: GA MŠk(CZ) 2C06001; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian decision making * Fully probabilistic design * Kullback–Leibler divergence * Unified decision making Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.643, year: 2012 http://library.utia.cas.cz/separaty/2011/AS/karny-0367271.pdf

  1. Automated spoof-detection for fingerprints using optical coherence tomography

    CSIR Research Space (South Africa)

    Darlow, LN

    2016-05-01

    Full Text Available that they are highly separable, resulting in 100% accuracy regarding spoof-detection, with no false rejections of real fingers. This is the first attempt at fully automated spoof-detection using OCT....

  2. Maximizing Your Investment in Building Automation System Technology.

    Science.gov (United States)

    Darnell, Charles

    2001-01-01

    Discusses how organizational issues and system standardization can be important factors that determine an institution's ability to fully exploit contemporary building automation systems (BAS). Further presented is management strategy for maximizing BAS investments. (GR)

  3. Physics of fully depleted CCDs

    International Nuclear Information System (INIS)

    Holland, S E; Bebek, C J; Kolbe, W F; Lee, J S

    2014-01-01

    In this work we present simple, physics-based models for two effects that have been noted in the fully depleted CCDs that are presently used in the Dark Energy Survey Camera. The first effect is the observation that the point-spread function increases slightly with the signal level. This is explained by considering the effect on charge-carrier diffusion due to the reduction in the magnitude of the channel potential as collected signal charge acts to partially neutralize the fixed charge in the depleted channel. The resulting reduced voltage drop across the carrier drift region decreases the vertical electric field and increases the carrier transit time. The second effect is the observation of low-level, concentric ring patterns seen in uniformly illuminated images. This effect is shown to be most likely due to lateral deflection of charge during the transit of the photo-generated carriers to the potential wells as a result of lateral electric fields. The lateral fields are a result of space charge in the fully depleted substrates arising from resistivity variations inherent to the growth of the high-resistivity silicon used to fabricate the CCDs

  4. Coordinated Demand Response and Distributed Generation Management in Residential Smart Microgrids

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Mokhtari, Ghassem; Guerrero, Josep M.

    2016-01-01

    Nowadays, with the emergence of small-scale integrated energy systems (IESs) in the form of residential smart microgrids (SMGs), a large portion of energy can be saved through coordinated scheduling of smart household devices and management of distributed energy resources (DERs). There are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy and typical implementation of building-level DERs by integrating them into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and an integrated communications architecture to efficiently manage energy and comfort at the end-use location. With the aid of such technologies, residential consumers also have the capability to mitigate their energy costs and satisfy their own requirements, paying less attention to the configuration of the energy...

  5. Human-automation collaboration in manufacturing: identifying key implementation factors

    OpenAIRE

    Charalambous, George; Fletcher, Sarah; Webb, Philip

    2013-01-01

    Human-automation collaboration refers to the concept of human operators and intelligent automation working together interactively within the same workspace without conventional physical separation. This concept has commanded significant attention in manufacturing because of the potential applications, such as the installation of large sub-assemblies. However, the key human factors relevant to human-automation collaboration have not yet been fully investigated. To maximise effective implement...

  6. Plant automation

    International Nuclear Information System (INIS)

    Christensen, L.J.; Sackett, J.I.; Dayal, Y.; Wagner, W.K.

    1989-01-01

    This paper describes work at EBR-II in the development and demonstration of new control equipment and methods and associated schemes for plant prognosis, diagnosis, and automation. The development work has attracted the interest of other national laboratories, universities, and commercial companies. New initiatives include use of new control strategies, expert systems, advanced diagnostics, and operator displays. The unique opportunity offered by EBR-II is as a test bed where a total integrated approach to automatic reactor control can be directly tested under real power plant conditions

  7. Automated tracking for advanced satellite laser ranging systems

    Science.gov (United States)

    McGarry, Jan F.; Degnan, John J.; Titterton, Paul J., Sr.; Sweeney, Harold E.; Conklin, Brion P.; Dunn, Peter J.

    1996-06-01

    NASA's Satellite Laser Ranging Network was originally developed during the 1970's to track satellites carrying corner cube reflectors. Today eight NASA systems, achieving millimeter ranging precision, are part of a global network of more than 40 stations that track 17 international satellites. To meet the tracking demands of a steadily growing satellite constellation within existing resources, NASA is embarking on a major automation program. While manpower on the current systems will be reduced to a single operator, the fully automated SLR2000 system is being designed to operate for months without human intervention. Because SLR2000 must be eyesafe and operate in daylight, tracking is often performed in a low probability of detection and high noise environment. The goal is to automatically select the satellite, set up the tracking and ranging hardware, verify acquisition, and close the tracking loop to optimize data yield. To accomplish the autotracking tasks, we are investigating (1) improved satellite force models, (2) more frequent updates of orbital ephemerides, (3) lunar laser ranging data processing techniques to distinguish satellite returns from noise, and (4) angular detection and search techniques to acquire the satellite. A Monte Carlo simulator has been developed to allow optimization of the autotracking algorithms by modeling the relevant system errors and then checking performance against system truth. A combination of simulator and preliminary field results will be presented.

  8. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
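
    The underlying selection problem can be illustrated on a toy item bank. A real assembler states this as a mixed integer program and hands it to a solver; the sketch below simply enumerates feasible selections (the items, information values, and constraints are invented):

```python
from itertools import combinations

# Each item: (information at the target ability level, content area).
bank = [(0.9, "algebra"), (0.5, "algebra"), (0.7, "geometry"),
        (0.6, "geometry"), (0.8, "stats"), (0.4, "stats")]

test_length = 3                      # pick exactly 3 items
best, best_info = None, -1.0
for picks in combinations(range(len(bank)), test_length):
    areas = {bank[i][1] for i in picks}
    if len(areas) < 3:               # constraint: cover all three content areas
        continue
    info = sum(bank[i][0] for i in picks)   # objective: total information
    if info > best_info:
        best, best_info = picks, info

print(best, best_info)               # the most informative feasible test
```

    Each in/out choice corresponds to a 0-1 decision variable in the MIP formulation, and the content-coverage check plays the role of a linear constraint; test-form generation adds further variables for item order and page layout.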

  9. AUTOMATED GEOSPATIAL WATERSHED ASSESSMENT ...

    Science.gov (United States)

    The Automated Geospatial Watershed Assessment tool (AGWA) is a GIS interface jointly developed by the USDA Agricultural Research Service, the U.S. Environmental Protection Agency, the University of Arizona, and the University of Wyoming to automate the parameterization and execution of the Soil Water Assessment Tool (SWAT) and KINEmatic Runoff and EROSion (KINEROS2) hydrologic models. The application of these two models allows AGWA to conduct hydrologic modeling and watershed assessments at multiple temporal and spatial scales. AGWA’s current outputs are runoff (volumes and peaks) and sediment yield, plus nitrogen and phosphorus with the SWAT model. AGWA uses commonly available GIS data layers to fully parameterize, execute, and visualize results from both models. Through an intuitive interface the user selects an outlet from which AGWA delineates and discretizes the watershed using a Digital Elevation Model (DEM) based on the individual model requirements. The watershed model elements are then intersected with soils and land cover data layers to derive the requisite model input parameters. The chosen model is then executed, and the results are imported back into AGWA for visualization. This allows managers to identify potential problem areas where additional monitoring can be undertaken or mitigation activities can be focused. AGWA also has tools to apply an array of best management practices. There are currently two versions of AGWA available; AGWA 1.5 for

  10. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin; Zhang, Fa; Gao, Xin

    2017-01-01

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner. In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers. The C/C++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the
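
    The geometric core of the first step, relating marker positions on two micrographs by an affine transformation, reduces to a small least-squares problem. The sketch below uses synthetic coordinates; the paper's full scheme additionally handles the tracking itself, the GMM acceleration, and lens distortion:

```python
import numpy as np

# Synthetic marker positions on micrograph 1 (x, y per row).
rng = np.random.default_rng(1)
P = rng.random((6, 2)) * 100

A_true = np.array([[0.98, -0.05],
                   [0.04,  1.02]])    # a "true" affine matrix
t_true = np.array([3.0, -2.0])        # a "true" translation
Q = P @ A_true.T + t_true             # the same markers on micrograph 2

# Solve Q ~ [P 1] @ M for the 3x2 parameter matrix M by least squares.
X = np.hstack([P, np.ones((len(P), 1))])
M, *_ = np.linalg.lstsq(X, Q, rcond=None)

A_est, t_est = M[:2].T, M[2]          # recovered affine matrix and translation
print(np.allclose(A_est, A_true), np.allclose(t_est, t_true))
```

    With noise-free correspondences the transform is recovered exactly; with noisy detections the residual of this fit is what a deviation bound such as the paper's constrains.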

  11. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin

    2017-10-20

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner. In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers. The C/C++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the

  12. WIDAFELS flexible automation systems

    International Nuclear Information System (INIS)

    Shende, P.S.; Chander, K.P.; Ramadas, P.

    1990-01-01

    After discussing the various aspects of automation, some typical examples of various levels of automation are given. One of the examples is of automated production line for ceramic fuel pellets. (M.G.B.)

  13. An Automation Planning Primer.

    Science.gov (United States)

    Paynter, Marion

    1988-01-01

    This brief planning guide for library automation incorporates needs assessment and evaluation of options to meet those needs. A bibliography of materials on automation planning and software reviews, library software directories, and library automation journals is included. (CLB)

  14. On-Demand Telemetry

    Data.gov (United States)

    National Aeronautics and Space Administration — AFRC has previously investigated the use of Network Based Telemetry. We will be building on that research to enable On-Demand Telemetry. On-Demand Telemetry is a way...

  15. Money Demand in Latvia

    OpenAIRE

    Ivars Tillers

    2004-01-01

    The econometric analysis of the demand for broad money in Latvia suggests a stable relationship of money demand. The analysis of parameter exogeneity indicates that the equilibrium adjustment is driven solely by the changes in the amount of money. The demand for money in Latvia is characterised by relatively high income elasticity typical for the economy in a monetary expansion phase. Due to stability, close fit of the money demand function and rapid equilibrium adjustment, broad money aggreg...

  16. Fully inkjet-printed microwave passive electronics

    KAUST Repository

    McKerricher, Garret

    2017-01-30

    Fully inkjet-printed three-dimensional (3D) objects with integrated metal provide exciting possibilities for on-demand fabrication of radio frequency electronics such as inductors, capacitors, and filters. To date, there have been several reports of printed radio frequency components metallized via the use of plating solutions, sputtering, and low-conductivity pastes. These metallization techniques require rather complex fabrication, and do not provide an easily integrated or versatile process. This work utilizes a novel silver ink cured with a low-cost infrared lamp at only 80 °C, and achieves a high conductivity of 1×10⁷ S m⁻¹. By inkjet printing the infrared-cured silver together with a commercial 3D inkjet ultraviolet-cured acrylic dielectric, a multilayer process is demonstrated. By using a smoothing technique, both the conductive ink and dielectric provide surface roughness values of <500 nm. A radio frequency inductor and capacitor exhibit state-of-the-art quality factors of 8 and 20, respectively, and match well with electromagnetic simulations. These components are implemented in a lumped element radio frequency filter with an impressive insertion loss of 0.8 dB at 1 GHz, proving the utility of the process for sensitive radio frequency applications.

  17. Fully inkjet-printed microwave passive electronics

    KAUST Repository

    McKerricher, Garret; Vaseem, Mohammad; Shamim, Atif

    2017-01-01

    Fully inkjet-printed three-dimensional (3D) objects with integrated metal provide exciting possibilities for on-demand fabrication of radio frequency electronics such as inductors, capacitors, and filters. To date, there have been several reports of printed radio frequency components metallized via the use of plating solutions, sputtering, and low-conductivity pastes. These metallization techniques require rather complex fabrication, and do not provide an easily integrated or versatile process. This work utilizes a novel silver ink cured with a low-cost infrared lamp at only 80 °C, and achieves a high conductivity of 1×10⁷ S m⁻¹. By inkjet printing the infrared-cured silver together with a commercial 3D inkjet ultraviolet-cured acrylic dielectric, a multilayer process is demonstrated. By using a smoothing technique, both the conductive ink and dielectric provide surface roughness values of <500 nm. A radio frequency inductor and capacitor exhibit state-of-the-art quality factors of 8 and 20, respectively, and match well with electromagnetic simulations. These components are implemented in a lumped element radio frequency filter with an impressive insertion loss of 0.8 dB at 1 GHz, proving the utility of the process for sensitive radio frequency applications.

  18. Low cost automation

    International Nuclear Information System (INIS)

    1987-03-01

    This book covers how to draw up an automation plan and design automation facilities; automation of the cutting process, including the basics of cutting, NC machining and chip handling; automation units such as drilling, tapping, boring, milling and slide units; hydraulics, covering its characteristics and basic hydraulic circuits; pneumatics; and kinds of automation and their application to processes such as assembly, transportation, automatic machines and factory automation.

  19. Automated Budget System -

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  20. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  1. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which make them an ideal basis for tools targeted at non...

  2. Automated tetraploid genotype calling by hierarchical clustering

    Science.gov (United States)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
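The hierarchical clustering approach named in the title can be sketched for a single marker: cluster the one-dimensional signal ratios into five groups and rank the clusters by mean intensity to assign dosages. The data, linkage method, and cut point below are illustrative assumptions, not the authors' method in detail.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Synthetic B-allele signal ratios for one marker across 100 samples.
# In an autotetraploid, five dosage classes (0..4 copies of the B allele)
# are expected near ratios 0.0, 0.25, 0.5, 0.75 and 1.0.
rng = np.random.default_rng(0)
true_dosage = np.repeat(np.arange(5), 20)
ratios = true_dosage / 4.0 + rng.normal(0.0, 0.02, true_dosage.size)

# Cluster the one-dimensional intensities and cut the tree at five groups.
Z = linkage(ratios.reshape(-1, 1), method="average")
labels = fcluster(Z, t=5, criterion="maxclust")

# Assign dosages by ranking clusters on their mean ratio (lowest -> 0).
cluster_means = {k: ratios[labels == k].mean() for k in np.unique(labels)}
rank = {k: d for d, k in enumerate(sorted(cluster_means, key=cluster_means.get))}
dosages = np.array([rank[k] for k in labels])
```

In practice the number of recoverable clusters varies per marker, which is why the abstract stresses inferring the intensity-dosage relationship independently for each one.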

  3. Rig automation: where it's been and where it's going

    Energy Technology Data Exchange (ETDEWEB)

    Rinaldi, R.

    1982-06-01

    For over 30 years dreamers, tinkerers and engineers have attempted to automate various drilling functions. Now this effort is paying off, and a partially automated rig is no longer a curiosity. Fully automated and computerized rigs are on the way. For the contractor this means higher productivity, but more maintenance and training responsibilities.

  4. Automated Critical PeakPricing Field Tests: 2006 Pilot ProgramDescription and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila

    2007-06-19

    During 2006 Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology evaluation for the Pacific Gas and Electric Company (PG&E) Emerging Technologies Programs. This report summarizes the design, deployment, and results from the 2006 Automated Critical Peak Pricing Program (Auto-CPP). The program was designed to evaluate the feasibility of deploying automation systems that allow customers to participate in critical peak pricing (CPP) with a fully-automated response. The 2006 program was in operation during the entire six-month CPP period from May through October. The methodology for this field study included site recruitment, control strategy development, automation system deployment, and evaluation of sites' participation in actual CPP events through the summer of 2006. LBNL recruited sites in PG&E's territory in northern California through contacts from PG&E account managers, conferences, and industry meetings. Each site contact signed a memorandum of understanding with LBNL that outlined the activities needed to participate in the Auto-CPP program. Each facility worked with LBNL to select and implement control strategies for demand response and developed automation system designs based on existing Internet connectivity and building control systems. Once the automation systems were installed, LBNL conducted communications tests to ensure that the Demand Response Automation Server (DRAS) correctly provided and logged the continuous communications of the CPP signals with the energy management and control system (EMCS) for each site. LBNL also observed and evaluated Demand Response (DR) shed strategies to ensure proper commissioning of controls. The communication system allowed sites to receive day-ahead as well as day-of signals for pre-cooling, a DR strategy used at a few sites. Measurement of demand response was conducted using two different baseline models for estimating peak load savings. One

  5. Electricity demand forecasting techniques

    International Nuclear Information System (INIS)

    Gnanalingam, K.

    1994-01-01

    Electricity demand forecasting plays an important role in power generation. The two areas of data that have to be forecasted in a power system are peak demand, which determines the capacity (MW) of the plant required, and annual energy demand (GWh). Methods used in electricity demand forecasting include time trend analysis and econometric methods. In forecasting, identification of manpower demand, identification of key planning factors, decision on planning horizon, differentiation between prediction and projection (i.e. development of different scenarios), and choosing among different forecasting techniques are important.
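As a minimal illustration of the time-trend approach mentioned above, annual peak demand can be regressed on time and the fitted line projected to the planning horizon. The demand history below is hypothetical.

```python
import numpy as np

# Illustrative annual peak demand history (MW); values are hypothetical.
years = np.array([2015, 2016, 2017, 2018, 2019, 2020])
peak_mw = np.array([980, 1010, 1050, 1075, 1110, 1150])

# Fit a linear time trend: peak = a * year + b.
a, b = np.polyfit(years, peak_mw, 1)

# Project the trend forward to estimate required capacity (MW).
forecast_2025 = a * 2025 + b
```

A projection like this only extrapolates history; econometric methods add explanatory drivers (income, prices) and support the scenario-based projections the abstract distinguishes from pure prediction.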

  6. Restaurant No. 1 fully renovated

    CERN Document Server

    2007-01-01

    The Restaurant No. 1 team. After several months of patience and goodwill on the part of our clients, we are delighted to announce that the major renovation work which began in September 2006 has now been completed. From 21 May 2007 we look forward to welcoming you to a completely renovated restaurant area designed with you in mind. The restaurant team wishes to thank all its clients for their patience and loyalty. Particular attention has been paid in the new design to creating a spacious serving area and providing a wider choice of dishes. The new restaurant area has been designed as an open-plan space to enable you to view all the dishes before making your selection and to move around freely from one food access point to another. It comprises user-friendly areas that fully comply with hygiene standards. From now on you will be able to pick and choose to your heart's content. We invite you to try out wok cooking or some other speciality. Or select a pizza or a plate of pasta with a choice of two sauces fr...

  7. Fully Employing Software Inspections Data

    Science.gov (United States)

    Shull, Forrest; Feldmann, Raimund L.; Seaman, Carolyn; Regardie, Myrna; Godfrey, Sally

    2009-01-01

    Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, test plans, among others. Common to all inspections is the aim of finding and fixing defects as early as possible, and thereby providing cost savings by minimizing the amount of rework necessary later in the lifecycle. Measurement data, such as the number and type of found defects and the effort spent by the inspection team, provide not only direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA's use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross project improvements. Such improvements may focus on the software development processes of the whole organization as well as improvements to the applied inspection process itself.

  8. Strategies for Demand Response in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Watson, David S.; Kiliccote, Sila; Motegi, Naoya; Piette, Mary Ann

    2006-06-20

    This paper describes strategies that can be used in commercial buildings to temporarily reduce electric load in response to electric grid emergencies in which supplies are limited or in response to high prices that would be incurred if these strategies were not employed. The demand response strategies discussed herein are based on the results of three years of automated demand response field tests in which 28 commercial facilities with an occupied area totaling over 11 million ft{sup 2} were tested. Although the demand response events in the field tests were initiated remotely and performed automatically, the strategies used could also be initiated by on-site building operators and performed manually, if desired. While energy efficiency measures can be used during normal building operations, demand response measures are transient; they are employed to produce a temporary reduction in demand. Demand response strategies achieve reductions in electric demand by temporarily reducing the level of service in facilities. Heating ventilating and air conditioning (HVAC) and lighting are the systems most commonly adjusted for demand response in commercial buildings. The goal of demand response strategies is to meet the electric shed savings targets while minimizing any negative impacts on the occupants of the buildings or the processes that they perform. Occupant complaints were minimal in the field tests. In some cases, ''reductions'' in service level actually improved occupant comfort or productivity. In other cases, permanent improvements in efficiency were discovered through the planning and implementation of ''temporary'' demand response strategies. The DR strategies that are available to a given facility are based on factors such as the type of HVAC, lighting and energy management and control systems (EMCS) installed at the site.
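As a concrete illustration of the kind of pre-programmed shed strategy described above, the sketch below raises a zone's cooling setpoint and dims lighting for the duration of a DR event, then restores normal operation. The structure, names, and offsets are hypothetical, not taken from the field tests.

```python
from dataclasses import dataclass, replace

@dataclass(frozen=True)
class ZoneState:
    cool_setpoint_f: float   # zone cooling setpoint, deg F
    lighting_level: float    # fraction of full lighting output

def shed(state: ZoneState, setpoint_offset_f: float = 4.0,
         lighting_factor: float = 0.7) -> ZoneState:
    """Apply a temporary DR shed: raise the cooling setpoint, dim lights."""
    return replace(state,
                   cool_setpoint_f=state.cool_setpoint_f + setpoint_offset_f,
                   lighting_level=state.lighting_level * lighting_factor)

def restore(state: ZoneState, normal: ZoneState) -> ZoneState:
    """Return the zone to its normal operating state after the event."""
    return normal

normal = ZoneState(cool_setpoint_f=74.0, lighting_level=1.0)
during_event = shed(normal)
```

Because the shed only changes service levels temporarily, the same logic can be triggered remotely by an automation server or manually by a building operator, as the paper notes.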

  9. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems.

  10. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1988-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3×10⁹ bits per 14×17 (inch) film, equivalent to 2200 computer floppy disks. Parts handling systems and robotics, already applied in manufacturing and in some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14×17 (inch) film in 6-8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer aided interpretation). The system, called FDRS (film digital radiography system), is moving toward 50 micron (16 lines/mm) resolution, which is believed to meet the majority of image content needs. (Author). 4 refs.; 21 figs

  11. Automated ultrasonic inspection using PULSDAT

    International Nuclear Information System (INIS)

    Naybour, P.J.

    1992-01-01

    PULSDAT (Portable Ultrasonic Data Acquisition Tool) is a system for recording the data from single probe automated ultrasonic inspections. It is one of a range of instruments and software developed by Nuclear Electric to carry out a wide variety of high quality ultrasonic inspections. These vary from simple semi-automated inspections through to multi-probe, highly automated ones. PULSDAT runs under the control of MIPS software, and collects data which is compatible with the GUIDE data display system. PULSDAT is therefore fully compatible with Nuclear Electric's multi-probe inspection systems and utilises all the reliability and quality assurance of the software. It is a rugged, portable system that can be used in areas of difficult access. The paper discusses the benefits of automated inspection and gives an outline of the main features of PULSDAT. Since April 1990 PULSDAT has been used in several applications within Nuclear Electric and this paper presents two examples: the first is a ferritic set-through nozzle and the second is an austenitic fillet weld. (Author)

  12. Demand Response Resource Quantification with Detailed Building Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Elaine; Horsey, Henry; Merket, Noel; Stoll, Brady; Nag, Ambarish

    2017-04-03

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  13. Framtagning av en utvecklingsprocess för automation - baserat på konceptet Lean Automation

    OpenAIRE

    Carnbo, Linda

    2012-01-01

    Due to today's globalization, competition in the market has increased, demanding flexibility and production according to customer demand. In order to reduce wage costs, industrial companies are now considering moving manufacturing to low-cost countries. To keep up with the competition in the market without moving manufacturing abroad, Lean Automation was developed. The concept of Lean Automation is to reduce the perceived complexity of automation and make automati...

  14. Electricity demand in Kazakhstan

    International Nuclear Information System (INIS)

    Atakhanova, Zauresh; Howie, Peter

    2007-01-01

    Properties of electricity demand in transition economies have not been sufficiently well researched mostly due to data limitations. However, information on the properties of electricity demand is necessary for policy makers to evaluate effects of price changes on different consumers and obtain demand forecasts for capacity planning. This study estimates Kazakhstan's aggregate demand for electricity as well as electricity demand in the industrial, service, and residential sectors using regional data. Firstly, our results show that price elasticity of demand in all sectors is low. This fact suggests that there is considerable room for price increases necessary to finance generation and distribution system upgrading. Secondly, we find that income elasticity of demand in the aggregate and all sectoral models is less than unity. Of the three sectors, electricity demand in the residential sector has the lowest income elasticity. This result indicates that policy initiatives to secure affordability of electricity consumption to lower income residential consumers may be required. Finally, our forecast shows that electricity demand may grow at either 3% or 5% per year depending on rates of economic growth and government policy regarding price increases and promotion of efficiency. We find that planned supply increases would be sufficient to cover growing demand only if real electricity prices start to increase toward long-run cost-recovery levels and policy measures are implemented to maintain the current high growth of electricity efficiency
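The elasticity findings above support a back-of-envelope growth forecast: under constant elasticities, demand growth is roughly income elasticity times income growth plus price elasticity times real price growth. The elasticity and growth values below are illustrative assumptions, not the paper's estimates.

```python
# Back-of-envelope demand growth under constant elasticities.
income_elasticity = 0.8   # below unity, as the study reports for all sectors
price_elasticity = -0.2   # "low" price elasticity (illustrative value)

income_growth = 0.06      # 6%/yr economic growth (illustrative)
price_growth = 0.10       # 10%/yr real price increase (illustrative)

demand_growth = (income_elasticity * income_growth
                 + price_elasticity * price_growth)
```

With these assumed inputs the result is about 2.8%/yr, in the neighbourhood of the 3-5% growth range the abstract forecasts depending on growth and pricing policy.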

  15. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  16. Demand Response and Energy Storage Integration Study

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Ookie; Cheung, Kerry; Olsen, Daniel J.; Matson, Nance; Sohn, Michael D.; Rose, Cody M.; Dudley, Junqiao Han; Goli, Sasank; Kiliccote, Sila; Cappers, Peter; MacDonald, Jason; Denholm, Paul; Hummon, Marissa; Jorgenson, Jennie; Palchak, David; Starke, Michael; Alkadi, Nasr; Bhatnagar, Dhruv; Currier, Aileen; Hernandez, Jaci; Kirby, Brendan; O' Malley, Mark

    2016-03-01

    Demand response and energy storage resources present potentially important sources of bulk power system services that can aid in integrating variable renewable generation. While renewable integration studies have evaluated many of the challenges associated with deploying large amounts of variable wind and solar generation technologies, integration analyses have not yet fully incorporated demand response and energy storage resources. This report represents an initial effort in analyzing the potential integration value of demand response and energy storage, focusing on the western United States. It evaluates two major aspects of increased deployment of demand response and energy storage: (1) Their operational value in providing bulk power system services and (2) Market and regulatory issues, including potential barriers to deployment.

  17. The fully Mobile City Government Project (MCity)

    DEFF Research Database (Denmark)

    Scholl, Hans; Fidel, Raya; Mai, Jens Erik

    2006-01-01

    The Fully Mobile City Government Project, also known as MCity, is an interdisciplinary research project on the premises, requirements, and effects of fully mobile, wirelessly connected applications (FWMC). The project will develop an analytical framework for interpreting the interaction and inter...

  18. Both Automation and Paper.

    Science.gov (United States)

    Purcell, Royal

    1988-01-01

    Discusses the concept of a paperless society and the current situation in library automation. Various applications of automation and telecommunications are addressed, and future library automation is considered. Automation at the Monroe County Public Library in Bloomington, Indiana, is described as an example. (MES)

  19. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Seivers, R.

    1991-01-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  20. Innovation and Demand

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    2007-01-01

    the demand-side of markets in the simplest possible way. This strategy has allowed a gradual increase in the sophistication of supply-side aspects of economic evolution, but the one-sided focus on supply is facing diminishing returns. Therefore, demand-side aspects of economic evolution have in recent years received increased attention. The present paper argues that the new emphasis on demand-side factors is quite crucial for a deepened understanding of economic evolution. The major reasons are the following: First, demand represents the core force of selection that gives direction to the evolutionary process. Second, firms' innovative activities relate, directly or indirectly, to the structure of expected and actual demand. Third, the demand side represents the most obvious way of turning to the much-needed analysis of macro-evolutionary change of the economic system.

  1. PERFECT DEMAND ILLUSION

    Directory of Open Access Journals (Sweden)

    Alexander Yu. Sulimov

    2015-01-01

    Full Text Available The article is devoted to the technique «Perfect demand illusion», which makes it possible to strengthen the competitive advantage of retailers. The paper also spells out the golden rules of visual merchandising. The method «Demand illusion» is defined, the conditions of its functioning are formulated, and the main hypothesis of the existence of this method is determined. Furthermore, the «Perfect demand illusion» is defined and its additional conditions are described. The advantages of the «Perfect demand illusion» over the «Demand illusion» are also spelled out.

  2. Drivers of Passenger Demand

    OpenAIRE

    Wittmer, Andreas

    2011-01-01

    An overview of the drivers of passenger demand: Driver 1, economic growth in developing countries; Driver 2, international business travel in developed countries; Driver 3, international leisure travel in developed countries.

  3. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial in improvement of a robot workcell. Design automation of multi-function fingers is highly demanded by robot industries to overcome the current iterative, time consuming and complex manual design process. However, the existing approaches for the multi-function finger design automation are unable to entirely meet the robot industries’ need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked with existing approaches.

  4. Quantum dots assisted photocatalysis for the chemiluminometric determination of chemical oxygen demand using a single interface flow system

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Cristina I.C.; Frigerio, Christian [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal); Lima, Jose L.F.C. [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal)

    2011-08-12

    Highlights: A novel flow method for the determination of chemical oxygen demand is proposed. CdTe nanocrystals are irradiated with UV light to generate strong oxidizing species. Reactive species promote a fast catalytic degradation of organic matter. Luminol is used as a chemiluminescence probe for indirect COD assessment. A single interface flow system was implemented to automate the assays. - Abstract: A novel flow method for the determination of chemical oxygen demand (COD) is proposed in this work. It relies on the combination of a fully automated single interface flow system, an on-line UV photocatalytic unit and quantum dot (QD) nanotechnology. The developed approach takes advantage of the capacity of CdTe nanocrystals to generate strong oxidizing species upon irradiation with UV light, which fosters a fast catalytic degradation of the organic compounds. Luminol was used as a chemiluminescence (CL) probe for indirect COD assessment, since it is easily oxidized by the QD-generated species, yielding a strong CL emission that is quenched in the presence of the organic matter. The proposed methodology allowed the determination of COD concentrations between 1 and 35 mg L⁻¹, with good precision (R.S.D. < 1.1%, n = 3) and a sampling frequency of about 33 h⁻¹. The procedure was applied to the determination of COD in wastewater certified reference materials and the obtained results showed excellent agreement with the certified values.
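The quenching-based determination described above amounts to an inverse calibration: CL intensity falls as COD rises, so a fitted calibration line is inverted to read COD from a measured signal. The intensities below are made-up illustrative numbers, not data from the paper, and a real quenching response may require a nonlinear model.

```python
import numpy as np

# Hypothetical calibration: chemiluminescence intensity (arbitrary units)
# is quenched as COD rises; assume an approximately linear response over
# the 1-35 mg/L working range reported above.
cod_std = np.array([1.0, 5.0, 10.0, 20.0, 35.0])           # mg/L standards
cl_signal = np.array([980.0, 880.0, 755.0, 505.0, 130.0])  # measured CL

slope, intercept = np.polyfit(cod_std, cl_signal, 1)

def cod_from_signal(signal):
    """Invert the calibration line to estimate COD (mg/L)."""
    return (signal - intercept) / slope
```

The negative slope encodes the quenching; any sample signal within the calibrated range maps back to a COD estimate.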

  5. [Automated anesthesia record systems].

    Science.gov (United States)

    Heinrichs, W; Mönk, S; Eberle, B

    1997-07-01

    The introduction of electronic anaesthesia documentation systems was attempted as early as 1979, although their efficient application has become reality only in the past few years. The advantages of the electronic protocol are apparent: continuous high quality documentation, comparability of data due to the availability of a data bank, reduction in the workload of the anaesthetist and availability of additional data. Disadvantages of the electronic protocol have also been discussed in the literature. By going through the process of entering data on the course of the anaesthetic procedure on the protocol sheet, the information is mentally absorbed and evaluated by the anaesthetist. This information may, however, be lost when the data are recorded fully automatically, without active involvement on the part of the anaesthetist. Recent publications state that by using intelligent alarms and/or integrated displays manual record keeping is no longer necessary for anaesthesia vigilance. The technical design of automated anaesthesia records depends on an integration of network technology into the hospital. It will be appropriate to connect the systems to the internet, but safety requirements have to be followed strictly. Concerning the database, client server architecture as well as language standards like SQL should be used. Object oriented databases will be available in the near future. Another future goal of automated anaesthesia record systems will be the use of knowledge based technologies within these systems. Drug interactions, disease related anaesthetic techniques and other information sources can be integrated. At this time, almost none of the commercially available systems has matured to a point where their purchase can be recommended without reservation. There is still a lack of standards for the subsequent exchange of data and a solution to a number of ergonomic problems still remains to be found. Nevertheless, electronic anaesthesia protocols will be required in

  6. On-Demand Mobility (ODM) Technical Pathway: Enabling Ease of Use and Safety

    Science.gov (United States)

    Goodrich, Ken; Moore, Mark

    2015-01-01

    On-demand mobility (ODM) through aviation refers to the ability to quickly and easily move people or equivalent cargo without delays introduced by a lack of, or infrequent, scheduled service. A necessary attribute of ODM is that it be easy to use, requiring a minimum of special training, skills, or workload. Fully-autonomous vehicles would provide the ultimate in ease-of-use (EU) but are currently unproven for safety-critical applications outside of a few situationally constrained applications (e.g. automated trains operating in segregated systems). Applied to aviation, the current and near-future state of the art in full autonomy may entail undesirable trade-offs such as very conservative operational margins, resulting in reduced trip reliability and transportation utility. Furthermore, acceptance by potential users and regulatory authorities will be challenging without confidence in autonomous systems developed in less critical, but still challenging, applications. A question for the aviation community is how we can best develop practical ease-of-use for aircraft that are sized to carry a small number of passengers (e.g. 1-9) or equivalent cargo. Such development is unlikely to be a single event, but rather a managed, evolutionary process in which responsibility and authority transition from human to automation agents as operational experience is gained with increasingly intelligent systems. This talk presents a technology road map being developed at NASA Langley, as part of an overall strategy to foster ODM, for the development of ease-of-use for ODM aviation.

  7. Uranium supply and demand

    Energy Technology Data Exchange (ETDEWEB)

    Spriggs, M J

    1976-01-01

    Papers were presented on the pattern of uranium production in South Africa; Australian uranium--will it ever become available; North American uranium resources, policies, prospects, and pricing; economic and political environment of the uranium mining industry; alternative sources of uranium supply; whither North American demand for uranium; and uranium demand and security of supply--a consumer's point of view. (LK)

  8. Wood supply and demand

    Science.gov (United States)

    Peter J. Ince; David B. McKeever

    2011-01-01

    At times in history, there have been concerns that demand for wood (timber) would be greater than the ability to supply it, but that concern has recently dissipated. The wood supply and demand situation has changed because of market transitions, economic downturns, and continued forest growth. This article provides a concise overview of this change as it relates to the...

  9. Modelling UK energy demand to 2000

    International Nuclear Information System (INIS)

    Thomas, S.D.

    1980-01-01

    A recent long-term demand forecast for the UK was made by Cheshire and Surrey (SPRU Occasional Paper Series No. 5, Science Policy Research Unit, Univ. of Sussex, 1978). Although they adopted a sectoral approach, their study leaves some questions unanswered. Do they succeed in their aim of making all their assumptions fully explicit? How sensitive are their estimates to changes in assumptions and policies? Are important problems and 'turning points' fully identified in the period up to and immediately beyond their time horizon of 2000? The author addresses these questions by using a computer model based on the study by Cheshire and Surrey. This article is a shortened version of the report, S.D. Thomas, 'Modelling UK Energy Demand to 2000', Operational Research, Univ. of Sussex, Brighton, UK, 1979, in which full details of the author's model are given. Copies are available from the author. (author)

  10. Modelling UK energy demand to 2000

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, S D [Sussex Univ., Brighton (UK)

    1980-03-01

    A recent long-term demand forecast for the UK was made by Cheshire and Surrey (SPRU Occasional Paper Series No. 5, Science Policy Research Unit, Univ. of Sussex, 1978). Although they adopted a sectoral approach, their study leaves some questions unanswered. Do they succeed in their aim of making all their assumptions fully explicit? How sensitive are their estimates to changes in assumptions and policies? Are important problems and 'turning points' fully identified in the period up to and immediately beyond their time horizon of 2000? The author addresses these questions by using a computer model based on the study by Cheshire and Surrey. This article is a shortened version of the report, S.D. Thomas, 'Modelling UK Energy Demand to 2000', Operational Research, Univ. of Sussex, Brighton, UK, 1979, in which full details of the author's model are given. Copies are available from the author.

  11. A SURVEY OF AUTOMATION TECHNIQUES COMING FORTH IN SHEET-FED OFFSET PRINTING ORGANIZATIONS

    OpenAIRE

    Mr. Ramesh Kumar*, Mr. Bijender & Mr. Sandeep Boora

    2017-01-01

    Sheet-fed offset is one of the premier printing processes in India as well as abroad. To cope with customers' large-quantity demands, automation has become mandatory. From prepress to post-press, a wide range of automation techniques exist and are coming forth for sheet-fed offset presses. The objective of this paper is to throw light on the various sheet-fed offset automation techniques existing today and their futuristic implications. The data related to automation was collected with the help of a survey conducte...

  12. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against the ICP, considered the gold standard for automated rigid registration. Furthermore ... to point distance. T-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed rank test). The performance influence of sampling density, sampling quantity, and norms is analyzed using a similar method.
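    The ICP baseline mentioned above alternates between matching point pairs and solving a closed-form rigid alignment of the matched pairs. As an illustration (not the paper's implementation), a minimal sketch of that closed-form inner step, the Kabsch/Procrustes solution via SVD:

    ```python
    import numpy as np

    def rigid_align(P, Q):
        """Find rotation R and translation t minimising ||R @ p + t - q||
        over matched point sets P, Q of shape (n, 3) (Kabsch algorithm)."""
        cp, cq = P.mean(axis=0), Q.mean(axis=0)
        H = (P - cp).T @ (Q - cq)                # 3x3 cross-covariance
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = cq - R @ cp
        return R, t
    ```

    Re-matching nearest neighbours and repeating this alignment until convergence gives a basic ICP loop.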

  13. Asian oil demand

    International Nuclear Information System (INIS)

    Fesharaki, F.

    2005-01-01

    This conference presentation examined global oil market development and the role of Asian demand. It discussed plateau change versus cyclical movement in the global oil market; supply and demand issues of OPEC and non-OPEC oil; whether high oil prices reduce demand; and the Asian oil picture in the global context. Asian oil demand has accounted for about 50 per cent of global incremental oil market growth. The presentation provided data charts in graphical format on global and Asia-Pacific incremental oil demand from 1990-2005; Asian oil demand growth for selected nations; real GDP growth in selected Asian countries; and Asia-Pacific oil production and net import requirements. It also included charts of petroleum product demand for the Asia-Pacific, China, India, Japan, and South Korea. Other data charts included key indicators for China's petroleum sector; China's crude production and net oil import requirements; China's imports and the share of the Middle East; China's oil exports and imports; China's crude imports by source for 2004; China's imports of main oil products for 2004; India's refining capacity; India's product balance for net imports and net exports; and India's trade pattern of oil products. tabs., figs

  14. Uranium supply and demand

    International Nuclear Information System (INIS)

    1984-05-01

    This report covers the period 1983 to 1995. It draws together the industry's latest views on future trends in supply and demand, and sets them in their historical context. It devotes less discussion than its predecessors to the technical influences underpinning the Institute's supply and demand forecasts, and more to the factors which influence the market behaviour of the industry's various participants. As the last decade has clearly shown, these latter influences can easily be overlooked when undue attention is given to physical imbalances between supply and demand. (author)

  15. On energy demand

    International Nuclear Information System (INIS)

    Haefele, W.

    1977-01-01

    Since the energy crisis, a number of energy plans have been proposed, and almost all of these envisage some kind of energy demand adaptation or conservation measures, hoping thus to escape the anticipated problems of energy supply. However, there seems to be no clear explanation of the basis on which our foreseeable future energy problems could be eased. And in fact, a first attempt at a more exact definition of energy demand and its interaction with other objectives, such as economic ones, shows that it is a highly complex concept which we still hardly understand. The article explains in some detail why it is so difficult to understand energy demand.

  16. Automated identification of insect vectors of Chagas disease in Brazil and Mexico: the Virtual Vector Lab

    Directory of Open Access Journals (Sweden)

    Rodrigo Gurgel-Gonçalves

    2017-04-01

    Full Text Available Identification of arthropods important in disease transmission is a crucial, yet difficult, task that can demand considerable training and experience. An important case in point is that of the 150+ species of Triatominae, vectors of Trypanosoma cruzi, the causative agent of Chagas disease across the Americas. We present a fully automated system that is able to identify triatomine bugs from Mexico and Brazil with an accuracy consistently above 80%, and with considerable potential for further improvement. The system processes digital photographs from a photo apparatus into landmarks, and uses ratios of measurements among those landmarks, as well as (in a preliminary exploration) two measurements that approximate aspects of coloration, as the basis for classification. This project has thus produced a working prototype that achieves reasonably robust correct identification rates, although many more developments can and will be added, and, more broadly, the project illustrates the value of multidisciplinary collaborations in resolving difficult and complex challenges.
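    As a rough illustration of the landmark-ratio idea (the function and feature choice are hypothetical, not the Virtual Vector Lab's actual pipeline), pairwise distances between landmarks can be normalised into scale-invariant features suitable for a downstream classifier:

    ```python
    import itertools
    import math

    def landmark_ratio_features(landmarks):
        """Turn a list of (x, y) landmarks into scale-invariant features:
        every pairwise distance divided by the largest pairwise distance."""
        dists = [math.dist(a, b)
                 for a, b in itertools.combinations(landmarks, 2)]
        longest = max(dists)
        return [d / longest for d in dists]
    ```

    Because every distance is divided by the longest one, photographing the same specimen at a different magnification yields the same feature vector.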

  17. Advances in Automated QA/QC for TRISO Fuel Particle Production

    International Nuclear Information System (INIS)

    Hockey, Ronald L.; Bond, Leonard J.; Batishko, Charles R.; Gray, Joseph N.; Saurwein, John J.; Lowden, Richard A.

    2004-01-01

    Fuel in most Generation IV reactor designs typically encompasses billions of TRISO particles. Present-day QA/QC methods, performed manually and in many cases destructively, cannot economically test a statistically significant fraction of the large number of individual fuel particles required. Fully automated inspection technologies are essential to economical TRISO fuel particle production. A combination of in-line nondestructive evaluation (NDE) measurements employing electromagnetic induction and digital optical imaging analysis is currently under investigation, and preliminary data indicate the potential for meeting the demands of this application. To calibrate high-speed NDE methods, surrogate fuel particle samples are being coated with layers containing a wide array of defect types found to degrade fuel performance, and these are being characterized via high-resolution CT and digital radiographic images.

  18. Sizewell: UK power demand

    International Nuclear Information System (INIS)

    Anon.

    1986-01-01

    The Sizewell Inquiry was about whether the next power stations to be built in the UK should be nuclear or coal-fired and, if nuclear, PWRs or AGRs. During the period of the Inquiry, forecasts of demand for electricity were low. Now, however, it seems that the forecast demand is much increased. This uncertainty in demand and the wide regional variations are examined in some detail. Facts and figures on electricity sales (area by area) are presented, as are the minutes of supply lost per consumer per year; these show that security of supply is also a problem. It is also shown that the way electricity is used has changed. While electricity generation has been changing to large-scale, centralised power stations, the demand patterns may make smaller-scale, quickly constructed units more sensible. The questions considered at the Sizewell Inquiry may, indeed, no longer be the right ones. (UK)

  19. In Orbit Performance of a Fully Autonomous Star Tracker

    DEFF Research Database (Denmark)

    Jørgensen, John Leif

    1999-01-01

    The Department of Automation at DTU has developed the Advanced Stellar Compass (ASC), a fully autonomous star tracker, for use as a high-precision attitude reference onboard spacecraft. The ASC is composed of a CCD-based camera and a powerful microprocessor containing a star catalogue, image-analysis software and a search engine. The unit autonomously performs all tasks necessary to calculate the inertial attitude from a star image. To allow for flexible attitude manoeuvres, the ASC can simultaneously drive from one to four cameras, efficiently removing dropouts from, e.g., sun blinding of one camera. ... It is difficult to test and verify the true robustness and accuracy of a star tracker on the ground. This is caused by the fact that only real-sky tests offer high-fidelity stimulation of the sensor, while atmospheric instabilities result in a dominant noise source intrinsically limiting the achievable accuracy...

  20. A demanding market

    International Nuclear Information System (INIS)

    Thomas, M.

    1997-01-01

    The article relates to the oil and natural gas market, and it gives a survey of proved reserves at the end of 1996 worldwide. The long term trend of increasing world energy demand has seen a major rise during 1996 when global consumption grew by 3%. But worldwide demand, excluding the Former Soviet Union, shows this figure increasing further to 3.7% for the whole of last year according to statistics. 3 figs

  1. Maximum power demand cost

    International Nuclear Information System (INIS)

    Biondi, L.

    1998-01-01

    The charging for a service is the supplier's remuneration for the expenses incurred in providing it. There are currently two charges for electricity: consumption and maximum demand. While no problem arises with the former, the issue is more complicated for the latter, and the analysis in this article tends to show that the annual charge for maximum demand arbitrarily discriminates among consumer groups, to the disadvantage of some.
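    As a toy illustration of the two-part tariff being discussed (the rates below are invented, not from the article), a bill combines a consumption charge per kWh with a maximum-demand charge per kW of the highest demand recorded in the billing period:

    ```python
    def electricity_bill(energy_kwh, peak_demand_kw,
                         energy_rate=0.12, demand_rate=10.0):
        """Two-part tariff: consumption charge per kWh plus a charge per kW
        of maximum demand. Both rates are illustrative placeholders."""
        return energy_kwh * energy_rate + peak_demand_kw * demand_rate
    ```

    With these placeholder rates, a consumer using 1000 kWh with a 50 kW peak pays 120 + 500 = 620, which shows how strongly the demand component can dominate for peaky consumers.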

  2. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.

  3. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  4. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  5. Automated driving and its effects on the safety ecosystem: How do compatibility issues affect the transition period?

    NARCIS (Netherlands)

    van Loon, R.J.; Martens, Marieke Hendrikje

    2015-01-01

    Different components of automated vehicles are being made available commercially as we speak. Much research has been conducted into these components and many of these have been studied with respect to their effects on safety, but the transition period from non-automated driving to fully automated

  6. Automated driving and its effect on the safety ecosystem: how do compatibility issues affect the transition period?

    NARCIS (Netherlands)

    Loon, R.J. van; Martens, M.H.

    2015-01-01

    Different components of automated vehicles are being made available commercially as we speak. Much research has been conducted into these components and many of these have been studied with respect to their effects on safety, but the transition period from non-automated driving to fully automated

  7. Towards a fully automatic and robust DIMM (DIMMA)

    International Nuclear Information System (INIS)

    Varela, A M; Muñoz-Tuñón, C; Del Olmo-García, A M; Rodríguez, L F; Delgado, J M; Castro-Almazán, J A

    2015-01-01

    Quantitative seeing measurements have been provided at the Canarian Observatories since 1990 by differential image motion monitors (DIMMs). Image quality needs to be studied in long-term (routine) measurements. This is important, for instance, in deciding on the siting of large telescopes or in the development of adaptive optics programmes, not to mention the development and design of new instruments. On the other hand, continuous real-time monitoring is essential in the day-to-day operation of telescopes. These routine measurements have to be carried out by standard, easy-to-operate and cross-calibrated instruments that are required to be operational with minimum intervention over many years. The DIMMA (Automatic Differential Image Motion Monitor) is the next step: a fully automated seeing monitor that is capable of providing data without manual operation and in remote locations. Currently, the IAC has two DIMMs working at the Roque de los Muchachos Observatory (ORM) and the Teide Observatory (OT). They are robotic and require an operator to start and initialize the program, focus the telescope, change the star when needed and turn off the system at the end of the night, all of which is done remotely. With a view to automation, we have designed a code for monitoring image quality (avoiding spurious data) and a program for autofocus, which is presented here. The data quality control protocol is also given. (paper)

  8. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, and provided improved passenger comfort since their introduction in the late 80's. However, the original automation benefits, including reduced flight crew workload, human errors and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin software v7.8. The effects of automation on the flight crew are incorporated into the model, including flight skill degradation and increased cognitive demand and training requirements, along with their interactions. Besides flight crew deficiencies, automation system
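    As a minimal illustration of the Bayesian Belief Network machinery a model like FLAP relies on (the two-node network and every probability below are hypothetical, not taken from the model), inference on a single parent-child pair reduces to Bayes' rule:

    ```python
    def posterior_reliance_given_degradation(
            p_reliance=0.3, p_deg_given_rel=0.6, p_deg_given_no_rel=0.1):
        """Toy two-node network: over-reliance on automation -> manual skill
        degradation. Returns P(reliance | degradation observed) by Bayes'
        rule. All probabilities are illustrative placeholders."""
        p_deg = (p_reliance * p_deg_given_rel
                 + (1 - p_reliance) * p_deg_given_no_rel)
        return p_reliance * p_deg_given_rel / p_deg
    ```

    Tools such as Hugin perform this same conditioning automatically across networks with many interacting nodes.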

  9. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  10. Architecture of a fully integrated communication infrastructure for the smart home; Architektur einer vollintegrierten Kommunikationsinfrastruktur fuer das Smart Home

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Falk-Moritz; Kays, Ruediger [TU Dortmund (Germany). Lehrstuhl fuer Kommunikationstechnik

    2012-07-01

    For some time, applications in the areas of home automation, ambient assisted living and e-health have been discussed. These require reliable and energy-efficient communication solutions in the home environment. In addition, new concepts that go hand in hand with the smart grid need access to devices within the home environment. In the realization of smart homes, the diversity of market participants involved, the business models existing in parallel, the application requirements and the available communication systems place special demands on the underlying network infrastructure. Different solutions should be compatible and able to communicate with each other. In addition, the user expects simple operation and configuration as well as long-term support of the products. In the best case, the user is confronted with a single, integrated network infrastructure. Instead of separate systems for reading out smart meters, for monitoring the solar system, for health monitoring and for the settings of multimedia devices, the telephone system or the computer network, a fully integrated smart home communications infrastructure should come into operation. This smart home infrastructure should be free of unnecessary duplication of structures; all equipment with a communication interface should be taken into account. The authors of the contribution under consideration report on a possible architecture of such a network infrastructure. Different levels of integration are identified. A protocol stack for different technologies and the linking of different network hierarchies are described.

  11. Using a fully automatic mass spectrometer for fissile material control

    International Nuclear Information System (INIS)

    Wilhelmi, M.

    1978-08-01

    The demand for higher accuracy and a shorter delay in analysis, together with the better objectifiability and data security needed in safeguards, led to the automation of a mass spectrometer. Starting with continuous feeding of samples via a high-vacuum lock, and including the subsequent heating, focussing and scanning of the samples as well as the final evaluation of the source data (taking alpha spectrometry and the weights required for the isotope dilution technique into account), the mass spectrometric procedure was completely automated. For this purpose, a serial CH-5 instrument from Varian MAT was modified to be operated by a Varian 620/i computer. A newly developed three-chamber high-vacuum lock was attached to this system, and the final evaluation is made with an IBM 370. The system has been in operation for the isotope analysis of U, Pu and Nd for one year. Major breakdowns of the hardware did not occur; however, the computer programmes had to be steadily improved according to the changing characteristics of the samples. Compared to manual operation, the automated system is superior in its throughput and speed when analysing series of similar samples. The automatic procedure objectifies the analysis, and the complete evaluation ensures better data security. (Orig./HP). (author)

  12. Automated visual fruit detection for harvest estimation and robotic harvesting

    OpenAIRE

    Puttemans, Steven; Vanbrabant, Yasmin; Tits, Laurent; Goedemé, Toon

    2016-01-01

    Fully automated detection and localisation of fruit in orchards is a key component in creating automated robotic harvesting systems, a dream of many farmers around the world to cope with large production and personnel costs. In recent years a lot of research on this topic has been performed, using basic computer vision techniques, like colour based segmentation, as a suggested solution. When not using standard RGB cameras, research tends to resort to other sensors, like hyper spectral or 3D. ...

  13. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduce a systematic approach for planning access network infrastructure. GIS data and a set of algorithms were employed to make the planning process more automatic. The method explains ... method. The method, however, does not fully automate the planning but makes the planning process significantly faster. The results and discussion are presented and a conclusion is given in the end.
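    Automated access-network planning of this kind often reduces trenching cost to a minimum spanning tree over candidate duct routes extracted from GIS data. A sketch of that step using Prim's algorithm (the graph abstraction is an assumption for illustration, not the paper's exact method):

    ```python
    import heapq

    def minimum_spanning_tree_cost(n, edges):
        """Prim's algorithm on an undirected weighted graph.
        n: number of nodes; edges: list of (u, v, weight) tuples.
        Returns the total weight of a spanning tree (graph assumed
        connected); weights would model trenching distances."""
        adj = [[] for _ in range(n)]
        for u, v, w in edges:
            adj[u].append((w, v))
            adj[v].append((w, u))
        seen = [False] * n
        heap = [(0, 0)]          # (edge weight, node), grow from node 0
        total = 0
        while heap:
            w, u = heapq.heappop(heap)
            if seen[u]:
                continue
            seen[u] = True
            total += w
            for wv, v in adj[u]:
                if not seen[v]:
                    heapq.heappush(heap, (wv, v))
        return total
    ```

    Real planners add constraints (duct capacity, splitter placement, existing infrastructure), which is why full automation remains hard.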

  14. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  15. UK Nuclear Workforce Demand

    International Nuclear Information System (INIS)

    Roberts, John

    2017-01-01

    UK Nuclear Sites: DECOMMISSIONING - 26 Magnox Reactors, 2 Fast Reactors; OPERATIONAL - 14 AGRs, 1 PWR; 9.6 GWe Total Capacity. Nuclear Workforce Demand • Total workforce demand is expected to grow from ~88,000 in 2017 to ~101,000 in 2021 • Average “inflow” is ~7,000 FTEs per annum • 22% of the workforce is female (28% in civil, 12% in defence) • 81% generic skills, 18% nuclear skills, 1% subject matter experts • 3300 trainees total in SLCs and Defence Enterprise (16% graduate trainees) • At peak demand on Civils Construction, over 4,000 workers will be required on each nuclear new build site • Manufacturing workforce is expected to rise from around 4,000 in 2014 to 8,500 at the peak of onsite activity in 2025

  16. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations
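    The optimization framing described above can be sketched as scoring each trial heavy-atom solution against several quality criteria and ranking the candidates; the scoring callback here is a hypothetical stand-in for SOLVE's actual criteria:

    ```python
    def rank_trial_solutions(solutions, criteria):
        """Rank trial solutions by a composite score, turning structure
        solution into an optimization problem. `criteria` maps a candidate
        to a list of quality scores (illustrative, not SOLVE's criteria)."""
        scored = [(sum(criteria(s)), s) for s in solutions]
        scored.sort(reverse=True)
        return [s for _, s in scored]
    ```

    Automating the ranking removes the subjective, case-by-case evaluation of partial structures that the abstract identifies as the bottleneck.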

  17. Demand Modelling in Telecommunications

    Directory of Open Access Journals (Sweden)

    M. Chvalina

    2009-01-01

    Full Text Available This article analyses the existing possibilities for using Standard Statistical Methods and Artificial Intelligence Methods for a short-term forecast and simulation of demand in the field of telecommunications. The most widespread methods are based on Time Series Analysis. Nowadays, approaches based on Artificial Intelligence Methods, including Neural Networks, are booming. Separate approaches will be used in the study of Demand Modelling in Telecommunications, and the results of these models will be compared with actual guaranteed values. Then we will examine the quality of Neural Network models. 
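    Among the Time Series Analysis methods such studies compare against, one of the simplest is exponential smoothing; a minimal sketch (illustrative only, not the article's actual models):

    ```python
    def exponential_smoothing_forecast(series, alpha=0.5):
        """Simple exponential smoothing: each new level is a weighted blend
        of the latest observation and the previous level; the final level
        serves as the one-step-ahead demand forecast."""
        level = series[0]
        for x in series[1:]:
            level = alpha * x + (1 - alpha) * level
        return level
    ```

    Neural-network approaches replace this fixed recurrence with learned, nonlinear mappings from past demand to the forecast.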

  18. DemandStat

    International Nuclear Information System (INIS)

    2003-01-01

    DemandStat is an accurate and up-to-date international statistics database dedicated to energy demand, with an unrivalled level of detail for powerful market analysis. It provides detailed consumption statistics (30 sectors) on all energies, detailed 2003 data and historical annual data since 1970, frequent data revision and update (two update options), and 150 data sources gathered and expertized, with all data on a single database. The statistics are consistent and homogeneous, in line with all major data providers (IEA, Eurostat, ADB, OLADE, etc.), with no breaks in time series, easy request building and data analysis, and reactive support from data experts. (A.L.B.)

  19. Education on Demand

    DEFF Research Database (Denmark)

    Boysen, Lis; Hende, Merete

    2015-01-01

    This note describes some of the results from the programme 'Education on Demand' in the project Det erhvervsrettede Uddannelseslaboratorium. The programme has focused on challenges and needs for change in educational institutions and the education system. In particular, it has dealt with the two themat...

  20. Progress report on a fully automatic Gas Tungsten Arc Welding (GTAW) system development

    Energy Technology Data Exchange (ETDEWEB)

    Daumeyer, G.J. III

    1994-12-01

    A plan to develop a fully automatic gas tungsten arc welding (GTAW) system that will utilize a vision-sensing computer (which will provide in-process feedback control) is presently in work. Evaluations of different technological aspects and system design requirements continue. This report summarizes major activities in the plan's successful progress. The technological feasibility of producing the fully automated GTAW system has been proven. The goal of this process development project is to provide a production-ready system within the shortest reasonable time frame.

  1. Responsiveness of residential electricity demand to dynamic tariffs : experiences from a large field test in the Netherlands

    NARCIS (Netherlands)

    Klaassen, E.A.M.; Kobus, C.B.A.; Frunt, J.; Slootweg, J.G.

    2016-01-01

    To efficiently facilitate the energy transition it is essential to evaluate the potential of demand response in practice. Based on the results of a Dutch smart grid pilot, this paper assesses the potential of both manual and semi-automated demand response in residential areas. To stimulate demand

  2. Responsiveness of residential electricity demand to dynamic tariffs : Experiences from a large field test in the Netherlands

    NARCIS (Netherlands)

    Klaassen, EAM; Kobus, C.B.A.; Frunt, J; Slootweg, JG

    2016-01-01

    To efficiently facilitate the energy transition it is essential to evaluate the potential of demand response in practice. Based on the results of a Dutch smart grid pilot, this paper assesses the potential of both manual and semi-automated demand response in residential areas. To stimulate demand

  3. Reactor pressure vessel stud management automation strategies

    International Nuclear Information System (INIS)

    Biach, W.L.; Hill, R.; Hung, K.

    1992-01-01

    The adoption of hydraulic tensioner technology as the standard for bolting and unbolting the reactor pressure vessel (RPV) head 35 yr ago represented an incredible commitment to new technology, but the existing technology was so primitive as to be clearly unacceptable. Today, a variety of approaches for improvement make the decision more difficult. Automation in existing installations must meet complex physical, logistic, and financial parameters while addressing the demands of reduced exposure, reduced critical path, and extended plant life. There are two generic approaches to providing automated RPV stud engagement and disengagement: the multiple stud tensioner and automated individual tools. A variation of the latter would include the handling system. Each has its benefits and liabilities

  4. Energy Assessment of Automated Mobility Districts

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This project examines such a concept for displacing privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). The project reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed in terms of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of the mobility, energy, and emissions impacts anticipated with AMDs.

  5. Automated rapid chemistry in heavy element research

    International Nuclear Information System (INIS)

    Schaedel, M.

    1994-01-01

    With the increasingly short half-lives of the heavy element isotopes in the transition region from the heaviest actinides to the transactinide elements, the demand for automated rapid chemistry techniques is also increasing. Separation times of significantly less than one minute, high chemical yields, high repetition rates, and an adequate detection system are prerequisites for many successful experiments in this field. The development of techniques for gas-phase and aqueous-phase separations for chemical and nuclear studies of the heaviest elements is briefly outlined. Typical examples of results obtained with automated techniques are presented for studies up to element 105, especially those obtained with the Automated Rapid Chemistry Apparatus, ARCA. The prospects for investigating the properties of even heavier elements with chemical techniques are discussed

  6. Understanding reliance on automation: effects of error type, error distribution, age and experience

    Science.gov (United States)

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over-relying on it during non-alarm states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  7. Causality in demand

    DEFF Research Database (Denmark)

    Nielsen, Max; Jensen, Frank; Setälä, Jari

    2011-01-01

    to fish demand. On the German market for farmed trout and substitutes, it is found that supply sources, i.e. aquaculture and fishery, are not the only determinant of causality. Storing, tightness of management and aggregation level of integrated markets might also be important. The methodological...

  8. Oil supply and demand

    Energy Technology Data Exchange (ETDEWEB)

    Babusiaux, D

    2004-07-01

    Following the military intervention in Iraq, it is taking longer than expected for Iraqi exports to make a comeback on the market. Demand is sustained by economic growth in China and in the United States. OPEC is modulating production to prevent inventory build-up. Prices have stayed high despite increased production by non-OPEC countries, especially Russia. (author)

  9. Oil supply and demand

    Energy Technology Data Exchange (ETDEWEB)

    Rech, O

    2006-07-01

    The year 2004 saw a change in the oil market paradigm that was confirmed in 2005. Despite a calmer geopolitical context, prices continued to rise vigorously. Driven by world demand, they remain high as a result of the saturation of production and refining capacity. The market is still seeking its new equilibrium. (author)

  10. Oil supply and demand

    International Nuclear Information System (INIS)

    Rech, O.

    2006-01-01

    The year 2004 saw a change in the oil market paradigm that was confirmed in 2005. Despite a calmer geopolitical context, prices continued to rise vigorously. Driven by world demand, they remain high as a result of the saturation of production and refining capacity. The market is still seeking its new equilibrium. (author)

  11. The demand for euros

    NARCIS (Netherlands)

    Arnold, I.J.M.; Roelands, S.

    2010-01-01

    This paper investigates the demand for euros using panel data for 10 euro area countries covering the period from 1999 to 2008. Monetary aggregates are constructed to ensure that money is a national concept by excluding deposits owned by non-residents and including external deposits owned by

  12. Oil supply and demand

    International Nuclear Information System (INIS)

    Babusiaux, D.

    2004-01-01

    Following the military intervention in Iraq, it is taking longer than expected for Iraqi exports to make a comeback on the market. Demand is sustained by economic growth in China and in the United States. OPEC is modulating production to prevent inventory build-up. Prices have stayed high despite increased production by non-OPEC countries, especially Russia. (author)

  13. Oil supply and demand

    International Nuclear Information System (INIS)

    Rech, O.

    2004-01-01

    World oil demand, driven by economic development in China, posted the highest growth rate in 20 years. In a context of geopolitical uncertainty, prices are soaring, encouraged by low inventory and the low availability of residual production capacity. Will 2004 bring a change in the oil market paradigm? (author)

  14. Textbook Factor Demand Curves.

    Science.gov (United States)

    Davis, Joe C.

    1994-01-01

    Maintains that teachers and textbook graphics follow the same basic pattern in illustrating changes in demand curves when product prices increase. Asserts that the use of computer graphics will enable teachers to be more precise in their graphic presentation of price elasticity. (CFR)

  15. Oil supply and demand

    Energy Technology Data Exchange (ETDEWEB)

    Rech, O

    2004-07-01

    World oil demand, driven by economic development in China, posted the highest growth rate in 20 years. In a context of geopolitical uncertainty, prices are soaring, encouraged by low inventory and the low availability of residual production capacity. Will 2004 bring a change in the oil market paradigm? (author)

  16. Implementation and development of an automated, ultra-high-capacity, acoustic, flexible dispensing platform for assay-ready plate delivery.

    Science.gov (United States)

    Griffith, Dylan; Northwood, Roger; Owen, Paul; Simkiss, Ellen; Brierley, Andrew; Cross, Kevin; Slaney, Andrew; Davis, Miranda; Bath, Colin

    2012-10-01

    Compound management faces the daily challenge of providing high-quality samples to drug discovery. The advent of new screening technologies has seen demand for liquid samples move toward nanoliter ranges, dispensed by contactless acoustic droplet ejection. Within AstraZeneca, a totally integrated assay-ready plate production platform has been created to fully exploit the advantages of this technology. This enables compound management to efficiently deliver large throughputs demanded by high-throughput screening while maintaining regular delivery of smaller numbers of compounds in varying plate formats for cellular or biochemical concentration-response curves in support of hit and lead optimization (structure-activity relationship screening). The automation solution, CODA, has the capability to deliver compounds on demand for single- and multiple-concentration ranges, in batch sizes ranging from 1 sample to 2 million samples, integrating seamlessly into local compound and test management systems. The software handles compound orders intelligently, grouping test requests together dependent on output plate type and serial dilution ranges so that source compound vessels are shared among numerous tests, ensuring conservation of sample, reduced labware and costs, and efficiency of work cell logistics. We describe the development of CODA to address the customer demand, challenges experienced, learning made, and subsequent enhancements.

  17. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  18. Energy demand in Portuguese manufacturing: a two-stage model

    International Nuclear Information System (INIS)

    Borges, A.M.; Pereira, A.M.

    1992-01-01

    We use a two-stage model of factor demand to estimate the parameters determining energy demand in Portuguese manufacturing. In the first stage, a capital-labor-energy-materials framework is used to analyze the substitutability between energy as a whole and other factors of production. In the second stage, total energy demand is decomposed into oil, coal and electricity demands. The two stages are fully integrated since the energy composite used in the first stage and its price are obtained from the second stage energy sub-model. The estimates obtained indicate that energy demand in manufacturing responds significantly to price changes. In addition, estimation results suggest that there are important substitution possibilities among energy forms and between energy and other factors of production. The role of price changes in energy-demand forecasting, as well as in energy policy in general, is clearly established. (author)
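
    The two-stage logic described in the abstract can be illustrated with a deliberately simplified sketch: stage two aggregates oil, coal and electricity into an energy composite via an assumed Cobb-Douglas price index, and stage one applies an assumed own-price elasticity to set total energy demand. The prices, cost shares and elasticity below are invented for illustration; the paper's econometric specification is richer.

```python
# Illustrative two-stage factor-demand sketch (assumed numbers, not estimates).
fuel_prices = {"oil": 8.0, "coal": 3.0, "electricity": 20.0}
fuel_shares = {"oil": 0.5, "coal": 0.2, "electricity": 0.3}  # assumed cost shares

# Stage two: composite energy price (Cobb-Douglas exact price index).
p_energy = 1.0
for f, s in fuel_shares.items():
    p_energy *= fuel_prices[f] ** s

# Stage one: total energy demand, with an assumed own-price elasticity of -0.5.
output, elasticity = 100.0, -0.5
energy_demand = output * p_energy ** elasticity

# Back to stage two: fuel quantities implied by the cost shares.
expenditure = p_energy * energy_demand
fuel_quantities = {f: fuel_shares[f] * expenditure / fuel_prices[f]
                   for f in fuel_prices}
print(round(p_energy, 3), round(energy_demand, 2))
```

    The integration the abstract mentions is visible here: the composite price used in stage one is built from the stage-two fuel prices, and fuel expenditures add up exactly to total energy expenditure.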

  19. Electricity demand in Tunisia

    International Nuclear Information System (INIS)

    Gam, Imen; Ben Rejeb, Jaleleddine

    2012-01-01

    This paper examines global electricity demand in Tunisia as a function of gross domestic product in constant prices, the degree of urbanization, the average annual temperature, and the real electricity price per kWh. This demand is examined using annual data over a period spanning thirty-one years, from 1976 to 2006. A long-run relationship between the variables under consideration is determined using a vector autoregressive (VAR) model. The empirical results suggest that electricity demand in Tunisia is sensitive to its past value and to changes in gross domestic product and the electricity price. The electricity price has a negative impact on long-run electricity consumption, whereas gross domestic product and the past value of electricity consumption have a positive effect. Moreover, the causality test reveals a unidirectional relationship between price and electricity consumption. These empirical findings can help policy makers manage electricity consumption in Tunisia through an appropriate strategy. - Highlights: ► This paper examined the electricity demand in Tunisia in the long-run. ► The empirical analysis revealed that in the long-run the electricity demand is affected by changes in its past value, GDP in constant price and real electricity price. ► There is a unidirectional relationship between price and electricity consumption, that is to say, the electricity price causes the consumption. ► These results suggest that a pricing policy can be an effective instrument to rationalize electricity consumption in Tunisia in the long-run.
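
    The kind of long-run elasticity reading the paper reports can be illustrated with a toy log-log regression. The sketch below uses synthetic data with assumed elasticities (+0.8 for GDP, -0.3 for price) and shows how they are recovered as OLS slopes; it is not the paper's estimation and the numbers are not Tunisian data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 31  # annual observations, mirroring the paper's 1976-2006 sample length
log_gdp = rng.normal(3.75, 0.4, n)    # synthetic log GDP
log_price = rng.normal(1.30, 0.2, n)  # synthetic log real electricity price
# Assumed "true" elasticities for the synthetic data: +0.8 (GDP), -0.3 (price).
log_demand = 1.0 + 0.8 * log_gdp - 0.3 * log_price + rng.normal(0, 0.01, n)

# OLS on the log-log specification; the slopes are the elasticities.
X = np.column_stack([np.ones(n), log_gdp, log_price])
beta, *_ = np.linalg.lstsq(X, log_demand, rcond=None)
income_elasticity, price_elasticity = beta[1], beta[2]
print(round(income_elasticity, 2), round(price_elasticity, 2))
```

    A negative price slope alongside a positive income slope is exactly the qualitative pattern the abstract reports for the long run.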

  20. Emerging technologies for demand side management. Demand side management jitsugen no tame no saishin gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, H; Iyoda, I [Mitsubishi Electric Corp., Tokyo (Japan)

    1993-11-05

    This paper surveys the state of hardware technologies for realizing demand side management, grouped into communications, measurement, customer information system, load control, home automation, and energy storage and conservation technologies. In communications, information exchange between the supply side and the demand side is central to demand side management, and technologies for distribution automation and automatic meter reading are under active development. Transmission media range from power lines and telephone lines to optical cables and wireless links. Power line communications, which use power transmission lines as communication channels, are simple and economical but vulnerable to noise and unsuitable for long distances. Wireless communications have been attracting attention along with advances in mobile communication devices. These technologies will initially benefit electric power companies, for example in load surveys and general distribution automation. They are expected to shift toward benefiting users around 2010, covering everything from security information, such as power interruption notices, to public information and education. 8 refs., 8 figs., 1 tab.

  1. Implementation of a demand elasticity model in the building energy management system

    NARCIS (Netherlands)

    Ożadowicz, A.; Grela, J.; Babar, M.

    2016-01-01

    Nowadays, a crucial part of modern Building Automation and Control Systems (BACS) is electric energy management. Active demand side management is a very important feature of a Building Energy Management System (BEMS) integrated within the BACS. Since the demand value changes in time and depends on

  2. Android based security and home automation system

    OpenAIRE

    Khan, Sadeque Reza; Dristy, Farzana Sultana

    2015-01-01

    The smart mobile terminal operating platform Android is getting popular all over the world with its wide variety of applications and enormous use in numerous spheres of our daily life. Considering the increasing demand for home security and automation, an Android-based control system is presented in this paper, where the proposed system can maintain the security of the home main entrance and also the car door lock. Another important feature of the designed system is that it can control the o...

  3. Automation of solar plants

    Energy Technology Data Exchange (ETDEWEB)

    Yebra, L.J.; Romero, M.; Martinez, D.; Valverde, A. [CIEMAT - Plataforma Solar de Almeria, Tabernas (Spain); Berenguel, M. [Almeria Univ. (Spain). Departamento de Lenguajes y Computacion

    2004-07-01

    This work overviews some of the main activities and research lines being carried out within the scope of the specific collaboration agreement between the Plataforma Solar de Almeria-CIEMAT (PSA-CIEMAT) and the Automatic Control, Electronics and Robotics research group of the Universidad de Almeria (TEP197), titled ''Development of control systems and tools for thermosolar plants'', and the projects financed by the MCYT (DPI2001-2380-C02-02 and DPI2002-04375-C03). The research is driven by the need to improve the efficiency of processes in which the energy provided by the sun is totally or partially used as an energy source, and to reduce the costs associated with the operation and maintenance of the installations that use this energy source. The final objective is to develop different automatic control systems and techniques aimed at improving the competitiveness of solar plants. The paper summarizes different objectives and automatic control approaches that are being implemented in different facilities at the PSA-CIEMAT: central receiver systems and the solar furnace. For each of these facilities, a systematic procedure is being followed, composed of several steps: (i) development of dynamic models using the newest modeling technologies (both for simulation and control purposes); (ii) development of fully automated data acquisition and control systems, including software tools facilitating the analysis of data and the application of knowledge to the controlled plants; and (iii) synthesis of advanced controllers using techniques successfully applied in the process industry, and development of new and optimized control algorithms for solar plants. These aspects are summarized in this work. (orig.)

  4. Will the future of knowledge work automation transform personalized medicine?

    OpenAIRE

    Gauri Naik; Sanika S. Bhide

    2014-01-01

    Today, we live in a world of "information overload" which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate "routine cognitive tasks" for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments, and that have the ability to learn "on the fly", is a significant step towards the automation of knowledge work. The applications of this t...

  5. Demand-Driven Success: Designing Your PDA Experiment

    OpenAIRE

    Hillen, Charles; Johnson-Grau, Glenn

    2012-01-01

    Initiating demand-driven acquisition is daunting. Implications for developing a sustainable budget model, choosing a vendor, controlling metadata, monitoring purchases and developing invoice workflows are significant areas of concern that require careful planning. From mid-February through August 2011, Loyola Marymount University conducted a pilot using demand-driven acquisition; the result of this successful experiment was the library’s decision to fully integrate this purchasing mode...

  6. Aggregated Demand Modelling Including Distributed Generation, Storage and Demand Response

    OpenAIRE

    Marzooghi, Hesamoddin; Hill, David J.; Verbic, Gregor

    2014-01-01

    It is anticipated that the penetration of renewable energy sources (RESs) in power systems will increase further in the next decades, mainly due to environmental issues. Over the long term of several decades, which we refer to as the future grid (FG), balancing between supply and demand will become dependent on demand-side actions, including demand response (DR) and energy storage. So far, FG feasibility studies have not considered these new demand-side developments when modelling future demand. I...

  7. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Breakers belong to the electric power system equipment whose reliability strongly influences the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure to clear a short circuit by a breaker, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breakers' reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil and air-break circuit breakers systematically increase. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. This, however, demands a great amount of statistical information about breakers' nameplate data and operating conditions, their failures, testing and repairs, advanced software and computer technologies, and a specific automated information system (AIS). A new AIS, with the AISV logo, was developed at the "Reliability of power equipment" department of the AzRDSI of Energy. The main features of AISV are: to ensure the security and accuracy of the database; to carry out systematic control of breakers' conformity with operating conditions; to estimate individual reliability values and their changes for a given combination of characteristics; and to provide the personnel responsible for technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for their realization.

  8. Architecture Views Illustrating the Service Automation Aspect of SOA

    Science.gov (United States)

    Gu, Qing; Cuadrado, Félix; Lago, Patricia; Duenãs, Juan C.

    Earlier in this book, Chapter 8 provided a detailed analysis of service engineering, including a review of service engineering techniques and methodologies. This chapter is closely related to Chapter 8, as it shows how such approaches can be used to develop a service, with particular emphasis on the identification of three views (the automation decision view, the degree of service automation view, and the service automation related data view) that structure and ease the elicitation and documentation of stakeholders' concerns. This is carried out through two large case studies used to learn the industrial needs in illustrating service deployment and configuration automation. This set of views adds to more traditional notations like UML the visual power of attracting the attention of their users to the addressed concerns, and assists them in their work. This is especially crucial in service-oriented architecting, where service automation is highly demanded.

  9. Sulphur demand growing [Alberta]

    Energy Technology Data Exchange (ETDEWEB)

    1975-01-20

    Sulfur markets look better going into 1975 than they have for several years, as North American demand growth is being filled largely by elemental sulfur producers and overseas. Demand is rising as fast as the capacity of Canadian transportation and handling facilities. It will take a long time to make much of a dent in the total Alberta stockpile of 14 million long tons at the end of 1974, with involuntary production from sour gas plants exceeding sales volume since 1972. However, there is some encouragement in the approaching peakout of production combined with a substantial increase in price since the low point of the cycle at the beginning of 1973, and a predicted rise of at least 20% in domestic (North American) sales this year over 1974.

  10. Ontario demand response scenarios

    International Nuclear Information System (INIS)

    Rowlands, I.H.

    2005-09-01

    Strategies for demand management in Ontario were examined via 2 scenarios for a commercial/institutional building with a normal summertime peak load of 300 kW between 14:00 and 18:00 during a period of high electricity demand and high electricity prices. The first scenario involved the deployment of a 150 kW on-site generator fuelled by either diesel or natural gas. The second scenario involved curtailing load by 60 kW during the same periods. Costs and benefits of both scenarios were evaluated for 3 groups: consumers, system operators and society. Benefits included electricity cost savings, deferred transmission capacity development, lower system prices for electricity, as well as environmental changes, economic development, and a greater sense of corporate social responsibility. It was noted that while significant benefits were observed for all 3 groups, they were not substantial enough to encourage action, as the savings arising from deferred generation capacity development do not accrue to individual players. The largest potential benefit was identified as lower prices, spread across all users of electricity in Ontario. It was recommended that representative bodies cooperate so that the system-wide benefits can be reaped. It was noted that if 10 municipal utilities were able to have 250 commercial or institutional customers engaged in distributed response, then a total peak demand reduction of 375 MW could be achieved, representing more than 25 per cent of Ontario's target for energy conservation. It was concluded that demand response often involves the investment of capital and new on-site procedures, which may affect reactions to various incentives. 78 refs., 10 tabs., 5 figs
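
    The 375 MW aggregate cited at the end of the abstract follows directly from its scenario numbers. A quick check, assuming each of the 250 customers per utility contributes the 150 kW of the first scenario (on-site generation); the 60 kW curtailment case scales down accordingly:

```python
# Reproducing the aggregate peak-reduction arithmetic from the abstract.
utilities = 10
customers_per_utility = 250
kw_per_customer = 150  # scenario one: 150 kW on-site generator per customer

total_mw = utilities * customers_per_utility * kw_per_customer / 1000
print(total_mw)  # 375.0 MW, matching the abstract's figure
```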

  11. Household electricity demand profiles

    DEFF Research Database (Denmark)

    Marszal, Anna Joanna; Heiselberg, Per Kvols; Larsen, Olena Kalyanova

    2016-01-01

    Highlights: •A 1-min resolution household electricity load model is presented. •The model adopts a bottom-up approach with the single appliance as the main building block. •Load profiles are used to analyse the flexibility potential of household appliances. •Load profiles can be applied in other domains, e.g. building energy simulations. •The demand level of houses with different numbers of occupants is well captured.
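
    The bottom-up idea, with the single appliance as the main building block at 1-min resolution, can be sketched as follows. The appliance ratings and schedules are invented for illustration and are not the paper's calibrated model.

```python
import numpy as np

MINUTES = 24 * 60  # one day at 1-min resolution

def appliance_profile(rating_w, start_min, duration_min):
    """Rectangular on/off profile for a single appliance cycle."""
    p = np.zeros(MINUTES)
    p[start_min:start_min + duration_min] = rating_w
    return p

# Household load = sum of individual appliance profiles plus a base load.
household = (
    appliance_profile(2000, 7 * 60, 30)    # hypothetical morning kettle/toaster
    + appliance_profile(800, 18 * 60, 90)  # hypothetical evening oven cycle
    + np.full(MINUTES, 100.0)              # base load (fridge, standby)
)
peak_w = household.max()
daily_kwh = household.sum() / 60 / 1000  # W-minutes -> kWh
print(peak_w, round(daily_kwh, 2))
```

    Summing many such (randomized) household profiles is what produces the aggregate load shapes used in flexibility analyses.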

  12. Energy demand patterns

    Energy Technology Data Exchange (ETDEWEB)

    Hoffmann, L; Schipper, L; Meyers, S; Sathaye, J; Hara, Y

    1984-05-01

    This report brings together three papers on energy demand presented at the Energy Research Priorities Seminar held in Ottawa on 8-10 August 1983. The first paper suggests a framework in which energy demand studies may be organized if they are to be useful in policy-making. Disaggregation and the analysis of the chain of energy transformations are possible paths toward more stable and reliable parameters. The second paper points to another factor that leads to instability in sectoral parameters, namely a changeover from one technology to another; insofar as technologies producing a product (or service) vary in their energy intensity, a technological shift will also change the energy intensity of the product. Rapid technological change is characteristic of some sectors in developing countries, and may well account for the high aggregate GDP-elasticities of energy consumption observed. The third paper begins with estimates of these elasticities, which were greater than one for all the member countries of the Asian Development Bank in 1961-78. The high elasticities, together with extreme oil dependence, made them vulnerable to the drastic rise in the oil price after 1973. The author distinguishes three diverging patterns of national experience. The oil-surplus countries naturally gained from the rise in the oil price. Among oil-deficit countries, the newly industrialized countries expanded their exports so rapidly that the oil crisis no longer worried them. For the rest, balance of payments adjustments became a prime concern of policy. Whether they dealt with the oil bill by borrowing, by import substitution, or by demand restraint, the impact of energy on their growth was unmistakable. The paper also shows why energy-demand studies, and energy studies in general, deserve to be taken seriously. 16 refs., 4 figs., 18 tabs.

  13. Migration and Tourism Demand

    Directory of Open Access Journals (Sweden)

    Nuno Carlos LEITÃO

    2012-02-01

    This study considers the relationship between immigration and Portuguese tourism demand for the period 1995-2008, using a dynamic panel data approach. The findings indicate that Portuguese tourism increased significantly during the period, in accordance with the values expected for a developed country. The regression results show that income, the immigration shock, population, and the geographical distance between Portugal and the countries of origin are the main determinants of Portuguese tourism.

  14. Demand scenarios, worldwide

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, A [Massachusetts Inst. of Technology, Center for Technology, Policy and Industrial Development and the MIT Joint Program on the Science and Policy of Global Change, Cambridge, MA (United States)

    1996-11-01

    Existing methods are inadequate for developing aggregate (regional and global) and long-term (several decades) passenger transport demand scenarios, since they are mainly based on simple extensions of current patterns rather than on causal relationships that account for the competition among transport modes (aircraft, automobiles, buses and trains) to provide transport services. The demand scenario presented in this paper is based on two empirically proven invariances of human behavior. First, transport accounts for 10 to 15 percent of total household expenditures for those owning an automobile, and around 5 percent on average for non-motorized households (the travel money budget). Second, the mean time spent traveling is approximately one hour per capita per day (the travel time budget). These two budget constraints determine the dynamics of the scenario: rising income increases per capita expenditure on travel, which, in turn, increases demand for mobility. The limited travel time budget constrains travelers to shift to faster transport systems. The scenario is initi
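
    The mechanism the abstract describes, a fixed travel money budget plus a fixed travel time budget driving a shift to faster modes as income rises, can be sketched as follows. The mode costs and speeds are invented for illustration; only the two budget shares (about 10% of income, about one hour per day) come from the abstract.

```python
# Sketch of the two budget constraints driving the scenario.
modes = [  # (name, cost per km, speed km/h) - hypothetical values
    ("bus", 0.05, 20.0),
    ("car", 0.25, 40.0),
    ("air/high-speed", 0.60, 160.0),
]
TIME_BUDGET_H = 1.0  # mean travel time: ~one hour per capita per day

def daily_travel(income_per_day, budget_share=0.10):
    """Daily distance: each mode is limited by both the money budget
    (affordable km) and the time budget (reachable km); travelers use
    whichever mode lets them go furthest."""
    money = budget_share * income_per_day  # travel money budget
    best_km = 0.0
    for name, cost_per_km, speed in modes:
        km = min(money / cost_per_km, speed * TIME_BUDGET_H)
        best_km = max(best_km, km)
    return best_km

for income in (10.0, 50.0, 200.0):
    print(income, round(daily_travel(income), 1))
```

    At low income the bus already exhausts the time budget, so extra money first buys nothing; once the money budget can afford a faster mode, daily distance jumps, which is the income-driven modal shift the scenario builds on.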