WorldWideScience

Sample records for fully automated demand

  1. Development and evaluation of fully automated demand response in large facilities

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost of service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal--facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to "opt out" or "override" an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing
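
    The distinction between these automation levels, and the operator's ability to opt out, can be illustrated with a minimal sketch (hypothetical signal values and function names, not the project's actual software):

    ```python
    from enum import Enum

    class Mode(Enum):
        NORMAL = "normal"
        SHED = "shed"          # pre-programmed load-shedding strategy

    def handle_dr_signal(signal_active: bool, operator_override: bool) -> Mode:
        """Fully automated DR: an external signal triggers a pre-programmed
        shed strategy unless the facility manager has opted out."""
        if signal_active and not operator_override:
            return Mode.SHED
        return Mode.NORMAL

    # Example: a DR event arrives but the manager has pressed "override".
    print(handle_dr_signal(signal_active=True, operator_override=True))  # Mode.NORMAL
    ```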

  2. Automated Demand Response and Commissioning

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Bourassa, Norman

    2005-04-01

    This paper describes the results from the second season of research to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability and manage electricity costs. Fully-Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. We refer to this as Auto-DR. For evaluation, the control and communications systems must be properly configured and must pass through a set of test stages: Readiness, Approval, Price Client/Price Server Communication, Internet Gateway/Internet Relay Communication, Control of Equipment, and DR Shed Effectiveness. New commissioning tests are needed for such systems to improve the connection of demand-responsive building systems to the electric grid's demand response systems.
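
    A sketch of how these test stages might be sequenced in a commissioning script (stage names are taken from the abstract; the site name and the pass/fail check are placeholders):

    ```python
    STAGES = [
        "Readiness",
        "Approval",
        "Price Client/Price Server Communication",
        "Internet Gateway/Internet Relay Communication",
        "Control of Equipment",
        "DR Shed Effectiveness",
    ]

    def commission(site: str, run_check) -> bool:
        """Run each stage in order and stop at the first failure."""
        for stage in STAGES:
            if not run_check(site, stage):   # run_check is a site-specific test hook
                print(f"{site}: failed at stage '{stage}'")
                return False
        print(f"{site}: all commissioning stages passed")
        return True

    # Example with a stub check that always passes.
    commission("Building A", lambda site, stage: True)
    ```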

  3. Fully automated (operational) modal analysis

    Science.gov (United States)

    Reynders, Edwin; Houbrechts, Jeroen; De Roeck, Guido

    2012-05-01

    Modal parameter estimation requires a lot of user interaction, especially when parametric system identification methods are used and the modes are selected in a stabilization diagram. In this paper, a fully automated, generally applicable three-stage clustering approach is developed for interpreting such a diagram. It does not require any user-specified parameter or threshold value, and it can be used in an experimental, operational, and combined vibration testing context and with any parametric system identification algorithm. The three stages of the algorithm correspond to the three stages in a manual analysis: setting stabilization thresholds for clearing out the diagram, detecting columns of stable modes, and selecting a representative mode from each column. An extensive validation study illustrates the accuracy and robustness of this automation strategy.
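
    For illustration only, the three stages can be mimicked with a toy clustering routine. Unlike the paper's method, this sketch uses fixed tolerances (tol, gap) and ignores damping ratios and mode shapes; it is not the authors' algorithm:

    ```python
    import numpy as np

    def auto_pick_modes(freqs_by_order, tol=0.01, gap=0.02):
        """Toy version of the three stages: (1) keep poles that reappear at the
        next model order within a relative tolerance, (2) group the surviving
        poles into 'columns' separated by relative frequency gaps, (3) return
        one representative (median) frequency per column.
        freqs_by_order: list of 1-D arrays of identified natural frequencies,
        one array per model order."""
        # Stage 1: stability check between consecutive model orders
        stable = []
        for lo, hi in zip(freqs_by_order[:-1], freqs_by_order[1:]):
            for f in lo:
                if np.any(np.abs(hi - f) / f < tol):
                    stable.append(f)
        stable = np.sort(np.array(stable))
        if stable.size == 0:
            return []
        # Stage 2: split the sorted stable poles into columns at large gaps
        splits = np.where(np.diff(stable) / stable[:-1] > gap)[0] + 1
        columns = np.split(stable, splits)
        # Stage 3: pick a representative mode from each column
        return [float(np.median(col)) for col in columns]

    orders = [np.array([1.02, 3.10]), np.array([1.01, 3.08, 5.4]), np.array([1.00, 3.09, 5.5])]
    print(auto_pick_modes(orders))  # approximately [1.015, 3.09]
    ```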

  4. Automated fully-stressed design with NASTRAN

    Science.gov (United States)

    Wallerstein, D. V.; Haggenmacher, G. W.

    1976-01-01

    An automated strength sizing capability is described. The technique determines the distribution of material among the elements of a structural model. The sizing is based on either a fully stressed design or a scaled feasible fully stressed design. Results obtained from the application of the strength sizing to the structural sizing of a composite material wing box using material strength allowables are presented. These results demonstrate the rapid convergence of the structural sizes to a usable design.
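
    The underlying resizing rule is the classic stress-ratio recursion: each element's area is scaled by the ratio of its computed stress to the allowable stress, and the analysis is repeated until the sizes converge. A sketch under that textbook form (the stress analysis here is a stand-in for the NASTRAN solution, not the paper's code):

    ```python
    def fully_stressed_resize(areas, compute_stresses, allowable, iters=20, tol=1e-3):
        """Fully stressed design iteration: A_new = A_old * sigma / sigma_allow.
        compute_stresses(areas) stands in for a finite-element analysis."""
        for _ in range(iters):
            stresses = compute_stresses(areas)
            new_areas = [a * s / allowable for a, s in zip(areas, stresses)]
            if max(abs(n - a) / a for a, n in zip(areas, new_areas)) < tol:
                return new_areas
            areas = new_areas
        return areas

    # Toy example: two members with fixed internal forces, so stress = force / area
    # and the iteration converges immediately to the fully stressed sizes.
    forces = [10000.0, 2500.0]  # N (hypothetical)
    stresses = lambda areas: [f / a for f, a in zip(forces, areas)]
    print(fully_stressed_resize([100.0, 100.0], stresses, allowable=250.0))
    # -> [40.0, 10.0]: both members carry exactly the 250 N/mm^2 allowable
    ```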

  5. A Fully Automated Penumbra Segmentation Tool

    DEFF Research Database (Denmark)

    Nagenthiraja, Kartheeban; Ribe, Lars Riisgaard; Hougaard, Kristina Dupont

    2012-01-01

    salvageable tissue, quickly and accurately. We present a fully Automated Penumbra Segmentation (APS) algorithm using PWI and DWI images. We compare the automatically generated PWI-DWI mismatch masks to masks outlined manually by experts in 168 patients. Method: The algorithm initially identifies PWI lesions...
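
    A minimal sketch of the underlying mismatch idea (not the APS algorithm itself): the penumbra candidate is the perfusion (PWI) lesion minus the diffusion (DWI) lesion. The Tmax and ADC thresholds below are illustrative assumptions, not values from the paper:

    ```python
    import numpy as np

    def pwi_dwi_mismatch(tmax, adc, tmax_thresh=6.0, adc_thresh=600.0):
        """Binary mismatch mask: hypoperfused (Tmax above a threshold, in seconds)
        but without restricted diffusion (ADC above a threshold, in 1e-6 mm^2/s)."""
        pwi_lesion = tmax > tmax_thresh
        dwi_lesion = adc < adc_thresh
        return pwi_lesion & ~dwi_lesion

    # Tiny synthetic example (2 x 3 "images")
    tmax = np.array([[2.0, 8.0, 9.0], [1.0, 7.0, 3.0]])
    adc = np.array([[800.0, 500.0, 750.0], [900.0, 820.0, 780.0]])
    print(pwi_dwi_mismatch(tmax, adc).astype(int))
    # [[0 0 1]
    #  [0 1 0]]
    ```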

  6. Fully integrated, fully automated generation of short tandem repeat profiles

    Science.gov (United States)

    2013-01-01

    Background The generation of short tandem repeat profiles, also referred to as ‘DNA typing,’ is not currently performed outside the laboratory because the process requires highly skilled technical operators and a controlled laboratory environment and infrastructure with several specialized instruments. The goal of this work was to develop a fully integrated system for the automated generation of short tandem repeat profiles from buccal swab samples, to improve forensic laboratory process flow as well as to enable short tandem repeat profile generation to be performed in police stations and in field-forward military, intelligence, and homeland security settings. Results An integrated system was developed consisting of an injection-molded microfluidic BioChipSet cassette, a ruggedized instrument, and expert system software. For each of five buccal swabs, the system purifies DNA using guanidinium-based lysis and silica binding, amplifies 15 short tandem repeat loci and the amelogenin locus, electrophoretically separates the resulting amplicons, and generates a profile. No operator processing of the samples is required, and the time from swab insertion to profile generation is 84 minutes. All required reagents are contained within the BioChipSet cassette; these consist of a lyophilized polymerase chain reaction mix and liquids for purification and electrophoretic separation. Profiles obtained from fully automated runs demonstrate that the integrated system generates concordant short tandem repeat profiles. The system exhibits single-base resolution from 100 to greater than 500 bases, with inter-run precision with a standard deviation of ±0.05 - 0.10 bases for most alleles. The reagents are stable for at least 6 months at 22°C, and the instrument has been designed and tested to Military Standard 810F for shock and vibration ruggedization. A nontechnical user can operate the system within or outside the laboratory. Conclusions The integrated system represents the

  7. Automation of Capacity Bidding with an Aggregator Using Open Automated Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann

    2008-10-01

    This report summarizes San Diego Gas & Electric Company's collaboration with the Demand Response Research Center to develop and test automation capability for the Capacity Bidding Program in 2007. The report describes the Open Automated Demand Response architecture and summarizes the history of technology development and pilot studies. It also outlines the Capacity Bidding Program and technology being used by an aggregator that participated in this demand response program. Due to delays, the program was not fully operational for summer 2007. However, a test event on October 3, 2007, showed that the project successfully achieved the objective to develop and demonstrate how an open, Web-based interoperable automated notification system for capacity bidding can be used by aggregators for demand response. The system was effective in initiating a fully automated demand response shed at the aggregated sites. This project also demonstrated how aggregators can integrate their demand response automation systems with San Diego Gas & Electric Company's Demand Response Automation Server and capacity bidding program.

  8. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
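
    The data-model idea can be illustrated with a hedged sketch of a client that maps a received DR signal level to a preprogrammed control action. The endpoint, JSON shape, and signal names below are assumptions for illustration, not the OpenADR wire format:

    ```python
    import json
    import urllib.request

    # Hypothetical mapping from a DR signal level to a preprogrammed strategy.
    STRATEGIES = {
        "normal": "no action",
        "moderate": "raise cooling setpoints by 2 F",
        "high": "raise setpoints by 4 F and dim lighting 30%",
    }

    def poll_and_act(server_url: str) -> str:
        """Poll a demand response automation server for the current signal and
        return the preprogrammed action for that signal level."""
        with urllib.request.urlopen(server_url, timeout=10) as resp:
            event = json.load(resp)              # e.g. {"signal_level": "moderate"}
        level = event.get("signal_level", "normal")
        return STRATEGIES.get(level, "no action")

    # Example (requires a reachable test server):
    # print(poll_and_act("http://dras.example.org/api/current-event"))
    ```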

  9. Participation through Automation: Fully Automated Critical Peak Pricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote, Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff design to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. They refer to this as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. They present a sample Auto-CPP load shape case study, and a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum saving simultaneously, a total of approximately 2 MW of DR is available from these twelve sites that represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. They are continuing field demonstrations and economic evaluations to pursue increasing penetrations of automated DR, which has a demonstrated ability to provide a valuable DR resource for California.
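
    The reported demand-reduction intensities follow directly from the aggregate figures; a worked check:

    ```latex
    \frac{2\,\mathrm{MW}}{2\times 10^{6}\,\mathrm{ft^2}} = 1.0\,\mathrm{W/ft^2},
    \qquad
    \frac{1\,\mathrm{MW}\ (\text{average})}{2\times 10^{6}\,\mathrm{ft^2}} = 0.5\,\mathrm{W/ft^2}.
    ```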

  10. Fully Mechanically Controlled Automated Electron Microscopic Tomography

    Science.gov (United States)

    Liu, Jinxin; Li, Hongchang; Zhang, Lei; Rames, Matthew; Zhang, Meng; Yu, Yadong; Peng, Bo; Celis, César Díaz; Xu, April; Zou, Qin; Yang, Xu; Chen, Xuefeng; Ren, Gang

    2016-07-01

    Knowledge of the three-dimensional (3D) structure of each individual particle of asymmetric and flexible proteins is essential in understanding those proteins' functions, but their structures are difficult to determine. Electron tomography (ET) provides a tool for imaging a single and unique biological object from a series of tilted angles, but it is challenging to image a single protein for three-dimensional (3D) reconstruction due to the imperfect mechanical control capability of the specimen goniometer under both a medium to high magnification (approximately 50,000–160,000×) and an optimized beam coherence condition. Here, we report a fully mechanical control method for automating ET data acquisition without using beam tilt/shift processes. This method could reduce the accumulation of the beam tilt/shift adjustments that were previously used to compensate for errors in the mechanical control but that degraded the beam coherence. Our method was developed by minimizing the error of the target object center during the tilting process through a closed-loop proportional-integral (PI) control algorithm. The validations by both negative staining (NS) and cryo-electron microscopy (cryo-EM) suggest that this method has a comparable capability to other ET methods in tracking target proteins while maintaining optimized beam coherence conditions for imaging.
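
    The closed-loop idea can be sketched with a discrete proportional-integral controller that drives the measured offset of the target center toward zero at each tilt step. The gains, units, and constant-drift "plant" below are illustrative assumptions, not the paper's values:

    ```python
    class PIController:
        """Discrete proportional-integral controller for the tracking error."""
        def __init__(self, kp: float, ki: float, dt: float):
            self.kp, self.ki, self.dt = kp, ki, dt
            self.integral = 0.0

        def update(self, error: float) -> float:
            self.integral += error * self.dt
            return self.kp * error + self.ki * self.integral

    # Toy simulation: the stage drifts a fixed amount per tilt step; the PI
    # correction learns the drift so the target offset settles near zero.
    pi = PIController(kp=0.5, ki=0.2, dt=1.0)
    offset = 0.0                    # target-center offset in nm (hypothetical)
    drift_per_step = 20.0           # mechanical drift per tilt step (hypothetical)
    for step in range(12):
        error = -offset             # we want the target centered (offset = 0)
        correction = pi.update(error)
        offset += drift_per_step + correction
        print(f"tilt step {step}: offset = {offset:.1f} nm")
    ```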

  11. A home-built, fully automated observatory

    Science.gov (United States)

    Beales, M.

    2010-12-01

    This paper describes the design of an automated observatory making use of off-the-shelf components and software. I make no claims for originality in the design but it has been an interesting and rewarding exercise to get all the components to work together.

  12. Home Network Technologies and Automating Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2009-12-01

    Over the past several years, interest in large-scale control of peak energy demand and total consumption has increased. While motivated by a number of factors, this interest has primarily been spurred on the demand side by the increasing cost of energy and, on the supply side, by the limited ability of utilities to build sufficient electricity generation capacity to meet unrestrained future demand. To address peak electricity use, Demand Response (DR) systems are being proposed to motivate reductions in electricity use through the use of price incentives. DR systems are also being designed to shift or curtail energy demand at critical times when the generation, transmission, and distribution systems (i.e. the 'grid') are threatened with instabilities. To be effectively deployed on a large scale, these proposed DR systems need to be automated. Automation will require robust and efficient data communications infrastructures across geographically dispersed markets. The present availability of widespread Internet connectivity and inexpensive, reliable computing hardware combined with the growing confidence in the capabilities of distributed, application-level communications protocols suggests that now is the time for designing and deploying practical systems. Centralized computer systems that are capable of providing continuous signals to automate customers' reduction of power demand are known as Demand Response Automation Servers (DRAS). The deployment of prototype DRAS systems has already begun, with most initial deployments targeting large commercial and industrial (C & I) customers. An examination of the current overall energy consumption by economic sector shows that the C & I market is responsible for roughly half of all energy consumption in the US. On a per customer basis, large C & I customers clearly have the most to offer, and to gain, by participating in DR programs to reduce peak demand. And, by concentrating on a small number of relatively

  13. A fully automated TerraSAR-X based flood service

    Science.gov (United States)

    Martinis, Sandro; Kersten, Jens; Twele, André

    2015-06-01

    In this paper, a fully automated processing chain for near real-time flood detection using high-resolution TerraSAR-X Synthetic Aperture Radar (SAR) data is presented. The processing chain, which includes SAR data pre-processing, computation and adaptation of global auxiliary data, unsupervised initialization of the classification, and post-classification refinement using a fuzzy logic-based approach, is automatically triggered after satellite data delivery. The dissemination of flood maps resulting from this service is performed through an online service which can be activated on-demand for emergency response purposes (i.e., when a flood situation evolves). The classification methodology is based on previous work of the authors but was substantially refined and extended for robustness and transferability to guarantee high classification accuracy under different environmental conditions and sensor configurations. With respect to accuracy and computational effort, experiments performed on a data set of 175 different TerraSAR-X scenes acquired during flooding all over the world with different sensor configurations confirm the robustness and effectiveness of the proposed flood mapping service. These promising results have been further confirmed by means of an in-depth validation performed for three study sites in Germany, Thailand, and Albania/Montenegro.
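
    The structure of such a chain can be sketched as a sequence of steps triggered on data delivery. The function bodies below are placeholders and the 0.2 backscatter threshold is an arbitrary illustration; this is not the authors' implementation:

    ```python
    def flood_mapping_pipeline(sar_scene, aux_data):
        """Skeleton of an automatic flood-mapping chain: each step is a stub
        standing in for the real processing described in the abstract."""
        calibrated = preprocess(sar_scene)                    # calibration, geocoding
        reference = adapt_auxiliary(aux_data, calibrated)     # global auxiliary data
        initial = unsupervised_init(calibrated)               # unsupervised classification
        refined = fuzzy_refinement(initial, calibrated, reference)  # post-classification
        return refined

    # Stub implementations so the skeleton runs end to end.
    preprocess = lambda scene: scene
    adapt_auxiliary = lambda aux, scene: aux
    unsupervised_init = lambda scene: [px < 0.2 for px in scene]  # low backscatter -> water
    fuzzy_refinement = lambda mask, scene, ref: mask

    print(flood_mapping_pipeline([0.1, 0.5, 0.05], aux_data=None))  # [True, False, True]
    ```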

  14. Technology for the fully automated milking of cows

    Directory of Open Access Journals (Sweden)

    J. Gouws

    1994-07-01

    Full Text Available Since dairy farming is a very labour-intensive, seven-days-per-week activity, increasing emphasis is being placed on the use of advanced technology in dairying throughout the world. Dairy mechanisation has been well established for many years, whereas dairy automation has only started to gain momentum fairly recently. An important milestone was the introduction of systems for automatic animal identification in the 1970s. That paved the way for all further dairy automation activities. An analysis of the current status of the fully automated milking of cows shows that the automated attachment of a milking machine's teat cups to a cow's teats is the most important task in dairying that remains to be automated.

  15. Northwest Open Automated Demand Response Technology Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Dudley, Junqiao

    2010-03-17

    The Lawrence Berkeley National Laboratory (LBNL) Demand Response Research Center (DRRC) demonstrated and evaluated open automated demand response (OpenADR) communication infrastructure to reduce winter morning and summer afternoon peak electricity demand in commercial buildings in the Seattle area. LBNL performed this demonstration for the Bonneville Power Administration (BPA) in the Seattle City Light (SCL) service territory at five sites: Seattle Municipal Tower, Seattle University, McKinstry, and two Target stores. This report describes the process and results of the demonstration. OpenADR is an information exchange model that uses a client-server architecture to automate demand-response (DR) programs. These field tests evaluated the feasibility of deploying fully automated DR during both winter and summer peak periods. DR savings were evaluated for several building systems and control strategies. This project studied DR during hot summer afternoons and cold winter mornings, both periods when electricity demand is typically high. This is the DRRC project team's first experience using automation for year-round DR resources and evaluating the flexibility of commercial buildings' end-use loads to participate in DR in dual-peaking climates. The lessons learned contribute to understanding end-use loads that are suitable for dispatch at different times of the year. The project was funded by BPA and SCL. BPA is a U.S. Department of Energy agency headquartered in Portland, Oregon, and serving the Pacific Northwest. BPA operates an electricity transmission system and markets wholesale electrical power at cost from federal dams, one non-federal nuclear plant, and other non-federal hydroelectric and wind energy generation facilities. Created by the citizens of Seattle in 1902, SCL is the second-largest municipal utility in America. SCL purchases approximately 40% of its electricity and the majority of its transmission from BPA through a preference contract. SCL also

  16. Automated Demand Response Opportunities in Wastewater Treatment Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Lisa; Song, Katherine; Lekov, Alex; McKane, Aimee

    2008-11-19

    Wastewater treatment is an energy intensive process which, together with water treatment, comprises about three percent of U.S. annual energy use. Yet, since wastewater treatment facilities are often peripheral to major electricity-using industries, they are frequently an overlooked area for automated demand response opportunities. Demand response is a set of actions taken to reduce electric loads when contingencies, such as emergencies or congestion, occur that threaten supply-demand balance, and/or market conditions occur that raise electric supply costs. Demand response programs are designed to improve the reliability of the electric grid and to lower the use of electricity during peak times to reduce the total system costs. Open automated demand response is a set of continuous, open communication signals and systems provided over the Internet to allow facilities to automate their demand response activities without the need for manual actions. Automated demand response strategies can be implemented as an enhanced use of upgraded equipment and facility control strategies installed as energy efficiency measures. Conversely, installation of controls to support automated demand response may result in improved energy efficiency through real-time access to operational data. This paper argues that the implementation of energy efficiency opportunities in wastewater treatment facilities creates a base for achieving successful demand reductions. This paper characterizes energy use and the state of demand response readiness in wastewater treatment facilities and outlines automated demand response opportunities.

  17. FASTER: an unsupervised fully automated sleep staging method for mice

    OpenAIRE

    Sunagawa, GA; Sei, H; Shimba, S; Urade, Y; Ueda, HR

    2013-01-01

    Identifying the stages of sleep, or sleep staging, is an unavoidable step in sleep research and typically requires visual inspection of electroencephalography (EEG) and electromyography (EMG) data. Currently, scoring is slow, biased and prone to error by humans and thus is the most important bottleneck for large-scale sleep research in animals. We have developed an unsupervised, fully automated sleep staging method for mice that allows less subjective and high-throughput evaluation of sleep. ...

  18. A fully automated multicapillary electrophoresis device for DNA analysis.

    Science.gov (United States)

    Behr, S; Mätzig, M; Levin, A; Eickhoff, H; Heller, C

    1999-06-01

    We describe the construction and performance of a fully automated multicapillary electrophoresis system for the analysis of fluorescently labeled biomolecules. A special detection system allows the simultaneous spectral analysis of all 96 capillaries. The main features are true parallel detection without any moving parts, high robustness, and full compatibility to existing protocols. The device can process up to 40 microtiter plates (96 and 384 well) without human interference, which means up to 15,000 samples before it has to be reloaded.

  19. Fully automated apparatus for the proximate analysis of coals

    Energy Technology Data Exchange (ETDEWEB)

    Fukumoto, K.; Ishibashi, Y.; Ishii, T.; Maeda, K.; Ogawa, A.; Gotoh, K.

    1985-01-01

    The authors report the development of fully-automated equipment for the proximate analysis of coals, a development undertaken with the twin aims of labour-saving and developing robot applications technology. This system comprises a balance, electric furnaces, a sulfur analyzer, etc., arranged concentrically around a multi-jointed robot which automatically performs all the necessary operations, such as sampling and weighing the materials for analysis, and inserting and removing them from the furnaces. 2 references.

  20. Fully automated setup for high temperature Seebeck coefficient measurement

    CERN Document Server

    Patel, Ashutosh

    2016-01-01

    In this work, we report the fabrication of a fully automated experimental setup for high-temperature Seebeck coefficient ($\alpha$) measurement. K-type thermocouples are used to measure the average temperature of the sample and the Seebeck voltage (SV) across it. The temperature dependence of the Seebeck coefficients of the thermocouple and its negative leg is taken care of by using the integration method. A steady-state-based differential technique is used for the $\alpha$ measurement. The use of a limited number of components and a thin heater simplifies the sample holder design and minimizes heat loss. The power supplied to the heater sets the temperature difference across the sample, and the measurement is carried out once steady state is reached. A LabVIEW-based program automates the whole measurement process. The complete setup is fabricated from commonly available materials. This instrument is standardized for materials with a wide range of $\alpha$ and for a wide range of $\Delta T$ across the specimen...
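
    Schematically, the steady-state differential technique recovers the sample coefficient from the measured Seebeck voltage and temperature difference after correcting for the measurement leads; the exact sign convention and lead correction depend on the wiring, so the relation below is only indicative:

    ```latex
    \alpha_{\mathrm{sample}}(\bar{T}) \approx \frac{\Delta V}{\Delta T} + \alpha_{\mathrm{lead}}(\bar{T}),
    \qquad
    \bar{T} = \tfrac{1}{2}\,(T_{\mathrm{hot}} + T_{\mathrm{cold}}),\quad
    \Delta T = T_{\mathrm{hot}} - T_{\mathrm{cold}}.
    ```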

  1. Description and calibration of a fully automated infrared scatterometer

    Science.gov (United States)

    Mainguy, Stephane; Olivier, Michel; Josse, Michel A.; Guidon, Michel

    1991-12-01

    A fully automated scatterometer, designed for BRDF measurements in the IR at about 10 micrometers, is described. Basically, it works around a reflecting parabola (464 mm diameter, F/0.25) and permits measurements in and out of the plane of incidence. Optical properties of the parabolic mirror are emphasized by a ray-tracing technique which permits determination of the correct illumination on the sample and detection conditions of scattered light. Advantages and drawbacks of such an instrument are discussed, as well as calibration procedures. In conclusion, we present experimental results to illustrate the instrument capabilities.

  2. A fully automated high-throughput training system for rodents.

    Directory of Open Access Journals (Sweden)

    Rajesh Poddar

    Full Text Available Addressing the neural mechanisms underlying complex learned behaviors requires training animals in well-controlled tasks, an often time-consuming and labor-intensive process that can severely limit the feasibility of such studies. To overcome this constraint, we developed a fully computer-controlled general purpose system for high-throughput training of rodents. By standardizing and automating the implementation of predefined training protocols within the animal's home-cage, our system dramatically reduces the efforts involved in animal training while also removing human errors and biases from the process. We deployed this system to train rats in a variety of sensorimotor tasks, achieving learning rates comparable to existing, but more laborious, methods. By incrementally and systematically increasing the difficulty of the task over weeks of training, rats were able to master motor tasks that, in complexity and structure, resemble ones used in primate studies of motor sequence learning. By enabling fully automated training of rodents in a home-cage setting, this low-cost and modular system increases the utility of rodents for studying the neural underpinnings of a variety of complex behaviors.

  3. FASTER: an unsupervised fully automated sleep staging method for mice.

    Science.gov (United States)

    Sunagawa, Genshiro A; Séi, Hiroyoshi; Shimba, Shigeki; Urade, Yoshihiro; Ueda, Hiroki R

    2013-06-01

    Identifying the stages of sleep, or sleep staging, is an unavoidable step in sleep research and typically requires visual inspection of electroencephalography (EEG) and electromyography (EMG) data. Currently, scoring is slow, biased and prone to error by humans and thus is the most important bottleneck for large-scale sleep research in animals. We have developed an unsupervised, fully automated sleep staging method for mice that allows less subjective and high-throughput evaluation of sleep. Fully Automated Sleep sTaging method via EEG/EMG Recordings (FASTER) is based on nonparametric density estimation clustering of comprehensive EEG/EMG power spectra. FASTER can accurately identify sleep patterns in mice that have been perturbed by drugs or by genetic modification of a clock gene. The overall accuracy is over 90% in every group. 24-h data are staged by a laptop computer in 10 min, which is faster than an experienced human rater. Dramatically improving the sleep staging process in both quality and throughput, FASTER will open the door to quantitative and comprehensive animal sleep research. © 2013 The Authors Genes to Cells © 2013 by the Molecular Biology Society of Japan and Wiley Publishing Asia Pty Ltd.

  4. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Elżbieta Pociask

    2016-01-01

    Full Text Available Background. Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, thus triggering the need to develop a new tool, NIRS-IVUS, which can characterize plaque in terms of its chemical and morphologic properties. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection in NIRS images is proposed. Method. The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results. A total of 31 NIRS chemograms were analyzed by two methods. Three metrics (total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks) were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the two methods. Conclusions. The proposed algorithm performs fully automated lipid pool detection in near-infrared spectroscopy images. It is a tool developed for offline data analysis, which could be easily augmented for newer functions and projects.
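
    For context, the Lipid Core Burden Index is conventionally computed from the binary lipid map of the chemogram. The sketch below assumes the usual convention (lipid-core probability above 0.6, scaled to 0-1000, with the block metric taken over a window of pullback length); these are assumptions, not necessarily the exact definitions used in this paper:

    ```python
    import numpy as np

    def lcbi(lipid_prob, valid_mask, prob_thresh=0.6):
        """LCBI: fraction of valid chemogram pixels whose lipid-core probability
        exceeds the threshold, scaled to the range 0-1000."""
        lipid = (lipid_prob > prob_thresh) & valid_mask
        return 1000.0 * lipid.sum() / valid_mask.sum()

    def max_lcbi_block(lipid_prob, valid_mask, mm_per_row=0.1, block_mm=4.0):
        """Maximum LCBI over any contiguous block of rows spanning block_mm of
        pullback length (rows are assumed to map to pullback positions)."""
        rows = int(round(block_mm / mm_per_row))
        best = 0.0
        for start in range(lipid_prob.shape[0] - rows + 1):
            block = slice(start, start + rows)
            best = max(best, lcbi(lipid_prob[block], valid_mask[block]))
        return best

    # Synthetic 100-row x 36-column chemogram with all pixels valid.
    rng = np.random.default_rng(0)
    prob = rng.random((100, 36))
    valid = np.ones((100, 36), dtype=bool)
    print(round(lcbi(prob, valid)), round(max_lcbi_block(prob, valid)))
    ```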

  5. Fully Automated Deep Learning System for Bone Age Assessment.

    Science.gov (United States)

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-08-01

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32% and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned a BAA within 1 year 94.18% and within 2 years 99.00% of the time. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision support system for more accurate and efficient BAAs at a much faster interpretation time (<2 s) than the conventional method.

  6. Fully automated algorithm for wound surface area assessment.

    Science.gov (United States)

    Deana, Alessandro Melo; de Jesus, Sérgio Henrique Costa; Sampaio, Brunna Pileggi Azevedo; Oliveira, Marcelo Tavares; Silva, Daniela Fátima Teixeira; França, Cristiane Miranda

    2013-01-01

    Worldwide, clinicians, dentists, nurses, researchers, and other health professionals need to monitor the wound healing progress and to quantify the rate of wound closure. The aim of this study is to demonstrate, step by step, a fully automated numerical method to estimate the size of the wound and the percentage damaged relative to the body surface area (BSA) in images, without the requirement for human intervention. We included the formula for BSA in rats in the algorithm. The methodology was validated in experimental wounds and human ulcers and was compared with the analysis of an experienced pathologist, with good agreement. Therefore, this algorithm is suitable for experimental wounds and burns and human ulcers, as they have a high contrast with adjacent normal skin.

  7. Challenges and Demands on Automated Software Revision

    Science.gov (United States)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become commonplace in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct by construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  8. Northwest Open Automated Demand Response Technology Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann

    2009-08-01

    Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology demonstration and evaluation for Bonneville Power Administration (BPA) in Seattle City Light's (SCL) service territory. This report summarizes the process and results of deploying open automated demand response (OpenADR) in the Seattle area in winter-morning-peaking commercial buildings. The field tests were designed to evaluate the feasibility of deploying fully automated demand response (DR) in four to six sites in the winter and the savings from various building systems. The project started in November of 2008 and lasted 6 months. The methodology for the study included site recruitment, control strategy development, automation system deployment and enhancements, and evaluation of sites' participation in DR test events. LBNL subcontracted McKinstry and Akuacom for this project. McKinstry assisted with recruitment, site survey collection, strategy development and overall participant and control vendor management. Akuacom established a new server and enhanced its operations to allow for scheduling winter morning day-of and day-ahead events. Each site signed a Memorandum of Agreement with SCL. SCL offered each site $3,000 for agreeing to participate in the study and an additional $1,000 for each event in which they participated. Each facility and its control vendor worked with LBNL and McKinstry to select and implement control strategies for DR and developed their automation based on the existing Internet connectivity and building control system. Once the DR strategies were programmed, McKinstry commissioned them before actual test events. McKinstry worked with LBNL to identify control points that could be archived at each facility. For each site, LBNL collected meter data and trend logs from the energy management and control system. The communication system allowed the sites to receive day-ahead as well as day-of DR test event signals. Measurement of DR was

  9. Opportunities for Automated Demand Response in California Agricultural Irrigation

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-01

    Pumping water for agricultural irrigation represents a significant share of California’s annual electricity use and peak demand. It also represents a large source of potential flexibility, as farms possess a form of storage in their wetted soil. By carefully modifying their irrigation schedules, growers can participate in demand response without adverse effects on their crops. This report describes the potential for participation in demand response and automated demand response by agricultural irrigators in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use in California. Typical on-farm controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Case studies of demand response programs in California and across the country are reviewed, and their results along with overall California demand estimates are used to estimate statewide demand response potential. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  10. Opportunities for Automated Demand Response in California Wastewater Treatment Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wray, Craig [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    Previous research over a period of six years has identified wastewater treatment facilities as good candidates for demand response (DR), automated demand response (Auto-DR), and Energy Efficiency (EE) measures. This report summarizes that work, including the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and automated demand response opportunities. Furthermore, this report summarizes the DR potential of three wastewater treatment facilities. In particular, Lawrence Berkeley National Laboratory (LBNL) has collected data at these facilities from control systems, submetered process equipment, utility electricity demand records, and governmental weather stations. The collected data were then used to generate a summary of wastewater power demand, factors affecting that demand, and demand response capabilities. These case studies show that facilities that have implemented energy efficiency measures and that have centralized control systems are well suited to shed or shift electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. In summary, municipal wastewater treatment energy demand in California is large, and energy-intensive equipment offers significant potential for automated demand response. In particular, large load reductions were achieved by targeting effluent pumps and centrifuges. One of the limiting factors to implementing demand response is the reaction of effluent turbidity to reduced aeration at an earlier stage of the process. Another limiting factor is that cogeneration capabilities of municipal facilities, including existing power purchase agreements and utility receptiveness to purchasing electricity from cogeneration facilities, limit a facility’s potential to participate in other DR activities.

  11. Fully automated calculation of cardiothoracic ratio in digital chest radiographs

    Science.gov (United States)

    Cong, Lin; Jiang, Luan; Chen, Gang; Li, Qiang

    2017-03-01

    The calculation of Cardiothoracic Ratio (CTR) in digital chest radiographs would be useful for cardiac anomaly assessment and heart-enlargement-related disease indication. The purpose of this study was to develop and evaluate a fully automated scheme for calculation of CTR in digital chest radiographs. Our automated method consisted of three steps, i.e., lung region localization, lung segmentation, and CTR calculation. We manually annotated the lung boundary with 84 points in 100 digital chest radiographs, and calculated an average lung model for the subsequent work. Firstly, in order to localize the lung region, a generalized Hough transform was employed to identify the upper, lower, and outer boundaries of the lungs by use of Sobel gradient information. The average lung model was aligned to the localized lung region to obtain the initial lung outline. Secondly, we separately applied a dynamic programming method to detect the upper, lower, outer and inner boundaries of the lungs, and then linked the four boundaries to segment the lungs. Based on the identified outer boundaries of the left and right lungs, we corrected the center and the declination of the original radiograph. Finally, CTR was calculated as a ratio of the transverse diameter of the heart to the internal diameter of the chest, based on the segmented lungs. The preliminary results on 106 digital chest radiographs showed that the proposed method could obtain accurate lung segmentation based on subjective observation, and achieved a sensitivity of 88.9% (40 of 45 abnormalities) and a specificity of 100% (i.e. 61 of 61 normal) for the identification of heart enlargement.
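
    Once the lungs are segmented, the final step reduces to a simple ratio. A sketch of the CTR computation from boundary coordinates (the pixel coordinates below are hypothetical, not taken from the paper):

    ```python
    def cardiothoracic_ratio(heart_left_x, heart_right_x, chest_left_x, chest_right_x):
        """CTR = maximal transverse diameter of the heart divided by the
        internal diameter of the chest, both measured horizontally."""
        heart_width = heart_right_x - heart_left_x
        chest_width = chest_right_x - chest_left_x
        return heart_width / chest_width

    # Example with pixel coordinates from a segmented radiograph (hypothetical).
    ctr = cardiothoracic_ratio(heart_left_x=820, heart_right_x=1310,
                               chest_left_x=450, chest_right_x=1620)
    print(f"CTR = {ctr:.2f}")  # ratios above about 0.5 are commonly read as enlargement
    ```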

  12. Fully automated stroke tissue estimation using random forest classifiers (FASTER).

    Science.gov (United States)

    McKinley, Richard; Häni, Levin; Gralla, Jan; El-Koussy, M; Bauer, S; Arnold, M; Fischer, U; Jung, S; Mattmann, Kaspar; Reyes, Mauricio; Wiest, Roland

    2017-08-01

    Several clinical trials have recently proven the efficacy of mechanical thrombectomy for treating ischemic stroke, within a six-hour window for therapy. To move beyond treatment windows and toward personalized risk assessment, it is essential to accurately identify the extent of tissue-at-risk ("penumbra"). We introduce a fully automated method to estimate the penumbra volume using multimodal MRI (diffusion-weighted imaging, a T2w- and T1w contrast-enhanced sequence, and dynamic susceptibility contrast perfusion MRI). The method estimates tissue-at-risk by predicting tissue damage in the cases of both persistent occlusion and complete recanalization. When applied to 19 test cases with a thrombolysis in cerebral infarction grading of 1-2a, mean overestimation of final lesion volume was 30 ml, compared with 121 ml for manually corrected thresholding. Predicted tissue-at-risk volume was positively correlated with final lesion volume. The method may therefore serve as an alternative means of identifying tissue-at-risk that could aid in treatment selection in ischemic stroke.

  13. Fully Automated Portable Comprehensive 2-Dimensional Gas Chromatography Device.

    Science.gov (United States)

    Lee, Jiwon; Zhou, Menglian; Zhu, Hongbo; Nidetz, Robert; Kurabayashi, Katsuo; Fan, Xudong

    2016-10-06

    We developed a fully automated portable 2-dimensional (2-D) gas chromatography (GC×GC) device, which had dimensions of 60 cm × 50 cm × 10 cm and weighed less than 5 kg. The device incorporated a micropreconcentrator/injector, commercial columns, micro-Deans switches, microthermal injectors, microphotoionization detectors, data acquisition cards, and power supplies, as well as computer control and user interface. It employed multiple channels (4 channels) in the second dimension (²D) to increase the ²D separation time (up to 32 s) and hence ²D peak capacity. In addition, a nondestructive flow-through vapor detector was installed at the end of the ¹D column to monitor the eluent from ¹D and assist in reconstructing ¹D elution peaks. With the information obtained jointly from the ¹D and ²D detectors, ¹D elution peaks could be reconstructed with significantly improved ¹D resolution. In this article, we first discuss the details of the system operating principle and the algorithm to reconstruct ¹D elution peaks, followed by the description and characterization of each component. Finally, 2-D separation of 50 analytes, including alkane (C6-C12), alkene, alcohol, aldehyde, ketone, cycloalkane, and aromatic hydrocarbon, in 14 min is demonstrated, showing a peak capacity of 430-530 and a peak capacity production of 40-80/min.

  14. Fully Automated Operational Modal Analysis using multi-stage clustering

    Science.gov (United States)

    Neu, Eugen; Janser, Frank; Khatibi, Akbar A.; Orifici, Adrian C.

    2017-02-01

    Interest in robust automatic modal parameter extraction techniques has increased significantly over recent years, together with the rising demand for continuous health monitoring of critical infrastructure like bridges, buildings and wind turbine blades. In this study a novel, multi-stage clustering approach for Automated Operational Modal Analysis (AOMA) is introduced. In contrast to existing approaches, the procedure works without any user-provided thresholds, is applicable within large system order ranges, can be used with very small sensor numbers and does not place any limitations on the damping ratio or the complexity of the system under investigation. The approach works with any parametric system identification algorithm that uses the system order n as its sole parameter. Here a data-driven Stochastic Subspace Identification (SSI) method is used. Measurements from a wind tunnel investigation with a composite cantilever equipped with Fiber Bragg Grating Sensors (FBGSs) and piezoelectric sensors are used to assess the performance of the algorithm with a highly damped structure and low signal-to-noise ratio conditions. The proposed method was able to identify all physical system modes in the investigated frequency range from over 1000 individual datasets, using FBGSs under challenging signal-to-noise ratio conditions and from only two sensors under better signal conditions.

  15. Findings from Seven Years of Field Performance Data for Automated Demand Response in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Mathieu, Johanna; Parrish, Kristen

    2010-05-14

    California is a leader in automating demand response (DR) to promote low-cost, consistent, and predictable electric grid management tools. Over 250 commercial and industrial facilities in California participate in fully-automated programs providing over 60 MW of peak DR savings. This paper presents a summary of Open Automated DR (OpenADR) implementation by each of the investor-owned utilities in California. It provides a summary of participation, DR strategies and incentives. Commercial buildings can reduce peak demand from 5 to 15 percent, with an average of 13 percent. Industrial facilities shed much higher loads. For buildings with multi-year savings, we evaluate their load variability and shed variability. We provide a summary of control strategies deployed, along with costs to install automation. We report on how the electric DR control strategies perform over many years of events. We benchmark the peak demand of this sample of buildings against their past baselines to understand the differences in building performance over the years. This is done with peak demand intensities and load factors. The paper also describes the importance of these data in helping to understand possible techniques to reach net zero energy using peak day dynamic control capabilities in commercial buildings. We present an example in which the electric load shape changed as a result of a lighting retrofit.
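
    The two benchmarking metrics mentioned here are straightforward to compute from interval meter data; a small sketch under the usual definitions (peak demand intensity as peak demand per unit floor area, load factor as average over peak demand), with hypothetical readings:

    ```python
    def peak_demand_intensity(peak_kw: float, floor_area_ft2: float) -> float:
        """Peak whole-building demand normalized by floor area, in W/ft^2."""
        return peak_kw * 1000.0 / floor_area_ft2

    def load_factor(interval_kw: list) -> float:
        """Ratio of average demand to peak demand over the metering period."""
        return sum(interval_kw) / len(interval_kw) / max(interval_kw)

    # Hypothetical 15-minute demand readings (kW) for a 200,000 ft^2 building.
    demand = [310, 420, 560, 610, 590, 480, 350]
    print(round(peak_demand_intensity(max(demand), 200_000), 2))  # 3.05 W/ft^2
    print(round(load_factor(demand), 2))                          # 0.78
    ```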

  16. Detection of virus-specific intrathecally synthesised immunoglobulin G with a fully automated enzyme immunoassay system

    Directory of Open Access Journals (Sweden)

    Weissbrich Benedikt

    2007-05-01

    Full Text Available Abstract Background The determination of virus-specific immunoglobulin G (IgG) antibodies in cerebrospinal fluid (CSF) is useful for the diagnosis of virus-associated diseases of the central nervous system (CNS) and for the detection of a polyspecific intrathecal immune response in patients with multiple sclerosis. Quantification of virus-specific IgG in the CSF is frequently performed by calculation of a virus-specific antibody index (AI). Determination of the AI is a demanding and labour-intensive technique and therefore automation is desirable. We evaluated the precision and the diagnostic value of a fully automated enzyme immunoassay for the detection of virus-specific IgG in serum and CSF using the analyser BEP2000 (Dade Behring). Methods The AI for measles, rubella, varicella-zoster, and herpes simplex virus IgG was determined from pairs of serum and CSF samples of patients with viral CNS infections, multiple sclerosis and of control patients. CSF and serum samples were tested simultaneously with reference to a standard curve. Starting dilutions were 1:6 and 1:36 for CSF and 1:1386 and 1:8316 for serum samples. Results The interassay coefficient of variation was below 10% for all parameters tested. There was good agreement between AIs obtained with the BEP2000 and AIs derived from the semi-automated reference method. Conclusion Determination of virus-specific IgG in serum-CSF-pairs for calculation of AI has been successfully automated on the BEP2000. Current limitations of the assay layout imposed by the analyser software should be solved in future versions to offer more convenience in comparison to manual or semi-automated methods.

  17. Enabling Automated Dynamic Demand Response: From Theory to Practice

    Energy Technology Data Exchange (ETDEWEB)

    Frincu, Marc; Chelmis, Charalampos; Aman, Saima; Saeed, Rizwan; Zois, Vasileios; Prasanna, Viktor

    2015-07-14

    Demand response (DR) is a technique used in smart grids to shape customer load during peak hours. Automated DR offers utilities fine-grained control and a high degree of confidence in the outcome. However, the impact on the customer's comfort means this technique is more suited for industrial and commercial settings than for residential homes. In this paper, we propose a system for achieving automated controlled DR in a heterogeneous environment. We present some of the main issues arising in building such a system, including privacy, customer satisfiability, reliability, and fast decision turnaround, with emphasis on the solutions we proposed. Based on the lessons we learned from empirical results, we describe an integrated automated system for controlled DR on the USC microgrid. Results show that while the accuracy of our prediction and customer selection techniques varies on a per-building, per-event basis, they perform well on average when considering several events and buildings.

  18. Open Automated Demand Response for Small Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Dudley, June Han; Piette, Mary Ann; Koch, Ed; Hennage, Dan

    2009-05-01

    This report characterizes small commercial buildings by market segments, systems and end-uses; develops a framework for identifying demand response (DR) enabling technologies and communication means; and reports on the design and development of a low-cost OpenADR enabling technology that delivers demand reductions as a percentage of the total predicted building peak electric demand. The results show that small offices, restaurants and retail buildings are the major contributors, making up over one third of the small commercial peak demand. The majority of the small commercial buildings in California are located in southern inland areas and the Central Valley. Single-zone packaged units with manual and programmable thermostat controls make up the majority of heating, ventilation, and air conditioning (HVAC) systems for small commercial buildings with less than 200 kW peak electric demand. Fluorescent tubes with magnetic ballasts and manual controls dominate this customer group's lighting systems. There are various ways, each with its pros and cons for a particular application, to communicate with these systems, and three methods to enable automated DR in small commercial buildings using the Open Automated Demand Response (or OpenADR) communications infrastructure. Development of DR strategies must consider building characteristics, such as weather sensitivity and load variability, as well as system design (e.g., under-sizing, under-lighting, over-sizing, etc.). Finally, field tests show that requesting demand reductions as a percentage of the total building predicted peak electric demand is feasible using the OpenADR infrastructure.

  19. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Voet, Peter W.J., E-mail: p.voet@erasmusmc.nl [Department of Radiation Oncology, Erasmus Medical Center–Daniel den Hoed Cancer Center, Groene Hilledijk 301, Rotterdam 3075EA (Netherlands); Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M. [Department of Radiation Oncology, Erasmus Medical Center–Daniel den Hoed Cancer Center, Groene Hilledijk 301, Rotterdam 3075EA (Netherlands)

    2013-03-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  20. Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cheung, Iris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Li, Becky Z [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion of residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per kilowatt (kW) of DR available from automated systems, and applies a standard naming convention and classification, or taxonomy, for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include the costs to the building or facility manager as well as to the utility or third-party program manager.

  1. ATLAS from Data Research Associates: A Fully Integrated Automation System.

    Science.gov (United States)

    Mellinger, Michael J.

    1987-01-01

    This detailed description of a fully integrated, turnkey library system includes a complete profile of the system (functions, operational characteristics, hardware, operating system, minimum memory and pricing); history of the technologies involved; and descriptions of customer services and availability. (CLB)

  2. Fully automated spectrometric protocols for determination of antioxidant activity: advantages and disadvantages

    National Research Council Canada - National Science Library

    Sochor, Jiri; Ryvolova, Marketa; Krystofova, Olga; Salas, Petr; Hubalek, Jaromir; Adam, Vojtech; Trnkova, Libuse; Havel, Ladislav; Beklova, Miroslava; Zehnalek, Josef; Provaznik, Ivo; Kizek, Rene

    2010-01-01

    The aim of this study was to describe behaviour, kinetics, time courses and limitations of the six different fully automated spectrometric methods--DPPH, TEAC, FRAP, DMPD, Free Radicals and Blue CrO5...

  3. Fully automated assessment of inflammatory cell counts and cytokine expression in bronchial tissue.

    NARCIS (Netherlands)

    Sont, J.K.; Boer, W.I.; Schadewijk, W.A. van; Grunberg, K.; Krieken, J.H.J.M. van; Hiemstra, P.S.; Sterk, P.J.

    2003-01-01

    Automated image analysis of bronchial tissue offers the opportunity to quantify stained area and staining intensity in a standardized way to obtain robust estimates of inflammatory cell counts and cytokine expression from multiple large areas of histopathologic sections. We compared fully automated

  5. Results and commissioning issues from an automated demand response pilot

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, Dave; Sezgen, Osman; Motegi, Naoya

    2004-08-05

    This paper describes a research project to develop and test Automated Demand Response hardware and software technology in large facilities. We describe the overall project and some of the commissioning and system design problems that took place. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. There were a number of specific commissioning challenges in conducting this test, including software compatibility, incorrect time zones, IT and EMCS failures, and hardware issues. The knowledge needed for this type of system commissioning combines expertise in building controls with network management and emerging information technologies.

  7. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA){sub n} repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
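
    As a rough illustration of the deconvolution idea only (not the authors' actual method), the observed band pattern can be modeled as a linear mixture of per-allele stutter patterns and inverted with non-negative least squares; the stutter decay, bin layout, and allele positions below are invented for the example.

        import numpy as np
        from scipy.optimize import nnls

        # Hypothetical stutter pattern: a true allele contributes fully at its own length
        # bin plus decaying stutter bands at shorter lengths (values are made up).
        stutter = np.array([1.0, 0.45, 0.20, 0.08])

        n_bins = 12
        # Mixing matrix A: column j is the band pattern produced by an allele in bin j.
        A = np.zeros((n_bins, n_bins))
        for j in range(n_bins):
            for k, s in enumerate(stutter):
                if j - k >= 0:
                    A[j - k, j] = s

        # Simulate the overlapping bands of two closely spaced alleles (bins 7 and 9).
        true_alleles = np.zeros(n_bins)
        true_alleles[[7, 9]] = 1.0
        observed = A @ true_alleles

        # Deconvolve: recover allele weights by non-negative least squares.
        recovered, _ = nnls(A, observed)
        print(np.flatnonzero(recovered > 0.5))   # -> [7 9], the called alleles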

  8. Role of Standard Demand Response Signals for Advanced Automated Aggregation

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence Berkeley National Laboratory; Kiliccote, Sila

    2011-11-18

    Emerging standards such as OpenADR enable Demand Response (DR) Resources to interact directly with Utilities and Independent System Operators, allowing their facility automation equipment to respond to a variety of DR signals ranging from day-ahead to real-time ancillary services. In addition, there are Aggregators in today’s markets who are capable of bringing together collections of aggregated DR assets and selling them to the grid as a single resource. However, in most cases these aggregated resources are not automated, and when they are, they typically use proprietary technologies. There is a need for a framework for dealing with aggregated resources that supports the following requirements:
    • Allow demand-side resources to participate in multiple DR markets, ranging from wholesale ancillary services to retail tariffs, without being completely committed to a single entity such as an Aggregator;
    • Allow aggregated groups of demand-side resources to be formed in an ad hoc fashion to address specific grid-side issues, and support the optimization of the collective response of an aggregated group along a number of different dimensions. This is important in order to tailor the aggregated performance envelope to the needs of the grid;
    • Allow aggregated groups to be formed in a hierarchical fashion so that each group can participate in a variety of markets, from wholesale ancillary services to distribution-level retail tariffs.
    This paper explores the issues of aggregated groups of DR resources as described above, especially within the context of emerging smart grid standards and the role they will play in both the management and interaction of various grid-side entities with those resources.

  9. Open Automated Demand Response Communications in Demand Response for Wholesale Ancillary Services

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Ghatikar, Girish; Koch, Ed; Hennage, Dan; Hernandez, John; Chiu, Albert; Sezgen, Osman; Goodin, John

    2009-11-06

    The Pacific Gas and Electric Company (PG&E) is conducting a pilot program to investigate the technical feasibility of bidding certain demand response (DR) resources into the California Independent System Operator's (CAISO) day-ahead market for ancillary services non-spinning reserve. Three facilities (a retail store, a local government office building, and a bakery) are recruited into the pilot program. For each facility, hourly demand and load curtailment potential are forecast two days ahead and submitted to the CAISO the day before the operation as an available resource. These DR resources are optimized against all other generation resources in the CAISO ancillary services market. Each facility is equipped with four-second real-time telemetry equipment to ensure resource accountability and visibility to CAISO operators. When CAISO requests DR resources, PG&E's OpenADR (Open Automated DR) communications infrastructure is utilized to deliver DR signals to the facilities' energy management and control systems (EMCS). The pre-programmed DR strategies are triggered without a human in the loop. This paper describes the automated system architecture and the flow of information to trigger and monitor the performance of the DR events. We outline the DR strategies at each of the participating facilities. At one site a real-time electric measurement feedback loop is implemented to assure the delivery of CAISO-dispatched demand reductions. Finally, we present results from each of the facilities and discuss findings.
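
    A minimal sketch of the "no human in the loop" pattern described above: a client polls a DR signal source and, when an event becomes active, invokes a pre-programmed shed strategy. The signal source, payload fields, and shed actions are hypothetical placeholders, not PG&E's or CAISO's actual interfaces.

        import time

        def read_signal():
            # Stand-in for polling a DR automation server over the OpenADR infrastructure;
            # it returns a canned value here so the sketch runs without any network access.
            return {"event": "active"}

        def shed_load():
            # Placeholder for a pre-programmed EMCS strategy (e.g. a cooling setpoint offset).
            print("DR event active: triggering pre-programmed shed strategy")

        def restore_load():
            print("DR event ended: restoring normal operation")

        shedding = False
        for _ in range(3):                 # a real client would run continuously as a service
            state = read_signal().get("event")
            if state == "active" and not shedding:
                shed_load()
                shedding = True
            elif state != "active" and shedding:
                restore_load()
                shedding = False
            time.sleep(1)                  # poll interval, shortened for the sketch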

  10. A Distributed Intelligent Automated Demand Response Building Management System

    Energy Technology Data Exchange (ETDEWEB)

    Auslander, David; Culler, David; Wright, Paul; Lu, Yan; Piette, Mary

    2013-12-30

    The goal of the 2.5 year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (OpenADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load

  11. A Distributed Intelligent Automated Demand Response Building Management System

    Energy Technology Data Exchange (ETDEWEB)

    Auslander, David [Univ. of California, Berkeley, CA (United States); Culler, David [Univ. of California, Berkeley, CA (United States); Wright, Paul [Univ. of California, Berkeley, CA (United States); Lu, Yan [Siemens Corporate Research Inc., Princeton, NJ (United States); Piette, Mary [Univ. of California, Berkeley, CA (United States)

    2013-03-31

    The goal of the 2.5 year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (OpenADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load

  12. A fully automated system for adherent cells microinjection.

    Science.gov (United States)

    Becattini, Gabriele; Mattos, Leonardo S; Caldwell, Darwin G

    2014-01-01

    This paper proposes an automated robotic system to perform cell microinjections to relieve human operators from this highly difficult and tedious manual procedure. The system, which uses commercial equipment currently found in most biomanipulation laboratories, consists of a multitask software framework combining computer vision and robotic control elements. The vision part features an injection pipette tracker and an automatic cell targeting system that is responsible for defining injection points within the contours of adherent cells in culture. The main challenge is the use of bright-field microscopy only, without the need for chemical markers normally employed to highlight the cells. Here, cells are identified and segmented using a threshold-based image processing technique working on defocused images. Fast and precise microinjection pipette positioning over the automatically defined targets is performed by a two-stage robotic system which achieves an average injection rate of 7.6 cells/min with a pipette positioning precision of 0.23 μm. The consistency of these microinjections and the performance of the visual targeting framework were experimentally evaluated using two cell lines (CHO-K1 and HEK) and over 500 cells. In these trials, the cells were automatically targeted and injected with a fluorescent marker, resulting in a correct cell detection rate of 87% and a successful marker delivery rate of 67.5%. These results demonstrate that the new system is capable of better performance than expert operators, highlighting its benefits and potential for large-scale application.
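
    The cell-targeting step described above (threshold-based segmentation of defocused bright-field images, then choosing injection points inside each detected cell) can be sketched roughly as follows; the smoothing scale, threshold rule, and use of region centroids as injection targets are illustrative assumptions, not the authors' exact pipeline.

        import numpy as np
        from scipy import ndimage as ndi

        def find_injection_targets(defocused_img, min_area=200):
            """Toy threshold-based cell targeting on a defocused bright-field image."""
            smoothed = ndi.gaussian_filter(defocused_img.astype(float), sigma=3)
            # Defocused cells appear darker than the background in this toy setup; a global
            # threshold (mean minus one standard deviation) separates them.
            mask = smoothed < smoothed.mean() - smoothed.std()
            labels, n = ndi.label(mask)
            targets = []
            for region in range(1, n + 1):
                blob = labels == region
                if blob.sum() >= min_area:                   # ignore small debris
                    cy, cx = ndi.center_of_mass(blob)
                    targets.append((float(cx), float(cy)))   # point the pipette would be driven to
            return targets

        # Synthetic image: two dark blobs (cells) on a bright background.
        img = np.full((200, 200), 200.0)
        img[40:80, 40:80] = 80.0
        img[120:170, 100:160] = 90.0
        print(find_injection_targets(img))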

  13. Canadian macromolecular crystallography facility: a suite of fully automated beamlines.

    Science.gov (United States)

    Grochulski, Pawel; Fodje, Michel; Labiuk, Shaunivan; Gorin, James; Janzen, Kathryn; Berg, Russ

    2012-06-01

    The Canadian light source is a 2.9 GeV national synchrotron radiation facility located on the University of Saskatchewan campus in Saskatoon. The small-gap in-vacuum undulator illuminated beamline, 08ID-1, together with the bending magnet beamline, 08B1-1, constitute the Canadian Macromolecular Crystallography Facility (CMCF). The CMCF provides service to more than 50 Principal Investigators in Canada and the United States. Up to 25% of the beam time is devoted to commercial users and the general user program is guaranteed up to 55% of the useful beam time through a peer-review process. CMCF staff provides "Mail-In" crystallography service to users with the highest scored proposals. Both beamlines are equipped with very robust end-stations including on-axis visualization systems, Rayonix 300 CCD series detectors and Stanford-type robotic sample auto-mounters. MxDC, an in-house developed beamline control system, is integrated with a data processing module, AutoProcess, allowing full automation of data collection and data processing with minimal human intervention. Sample management and remote monitoring of experiments is enabled through interaction with a Laboratory Information Management System developed at the facility.

  14. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for automated mixing, exposure control on a beamline, and automated data reduction and preliminary analysis are presented. Three mixing systems that have been the cornerstones of the development process are presented, including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested

  15. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  16. Fully automated software for quantitative measurements of mitochondrial morphology.

    Science.gov (United States)

    McClatchey, P Mason; Keller, Amy C; Bouchard, Ron; Knaub, Leslie A; Reusch, Jane E B

    2016-01-01

    Mitochondria undergo dynamic changes in morphology in order to adapt to changes in nutrient and oxygen availability, communicate with the nucleus, and modulate intracellular calcium dynamics. Many recent papers have been published assessing mitochondrial morphology endpoints. Although these studies have yielded valuable insights, contemporary assessment of mitochondrial morphology is typically subjective and qualitative, precluding direct comparison of outcomes between different studies and likely missing many subtle effects. In this paper, we describe a novel software technique for measuring the average length, average width, spatial density, and intracellular localization of mitochondria from a fluorescent microscope image. This method was applied to distinguish baseline characteristics of Human Umbilical Vein Endothelial Cells (HUVECs), primary Goto-Kakizaki rat aortic smooth muscle cells (GK SMCs), primary Wistar rat aortic smooth muscle cells (Wistar SMCs), and SH-SY5Ys (human neuroblastoma cell line). Consistent with direct observation, our algorithms found SH-SY5Ys to have the greatest mitochondrial density, while HUVECs were found to have the longest mitochondria. Mitochondrial morphology responses to temperature, nutrient, and oxidative stressors were characterized to test algorithm performance. Large morphology changes recorded by the software agreed with direct observation, and subtle but consistent morphology changes were found that would not otherwise have been detected. Endpoints were consistent between experimental repetitions (R=0.93 for length, R=0.93 for width, R=0.89 for spatial density, and R=0.74 for localization), and maintained reasonable agreement even when compared to images taken with compromised microscope resolution or in an alternate imaging plane. These results indicate that the automated software described herein allows quantitative and objective characterization of mitochondrial morphology from fluorescent microscope images.
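
    The endpoints named here (average length, average width, spatial density) can be approximated from a thresholded fluorescence image with standard image-analysis primitives; the sketch below uses scikit-image and a crude area-over-skeleton-length width estimate, and is only a stand-in for the authors' software, with the threshold choice being an assumption.

        import numpy as np
        from skimage import filters, measure, morphology

        def mito_metrics(fluor_img, pixel_size_um=0.1):
            """Rough mitochondrial morphology metrics from one fluorescence image."""
            mask = fluor_img > filters.threshold_otsu(fluor_img)      # segment mitochondria
            labels = measure.label(mask)
            lengths, widths = [], []
            for region in measure.regionprops(labels):
                skel = morphology.skeletonize(labels == region.label)
                length_px = max(int(skel.sum()), 1)          # skeleton length ~ mitochondrion length
                lengths.append(length_px * pixel_size_um)
                widths.append(region.area / length_px * pixel_size_um)   # area / length ~ width
            density = mask.sum() / mask.size                 # fraction of the field covered
            return {
                "mean_length_um": float(np.mean(lengths)) if lengths else 0.0,
                "mean_width_um": float(np.mean(widths)) if widths else 0.0,
                "spatial_density": float(density),
            }

        # Example on a synthetic image containing one bright elongated structure.
        img = np.zeros((64, 64))
        img[30:34, 10:50] = 1.0
        print(mito_metrics(img))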

  17. Improving reticle defect disposition via fully automated lithography simulation

    Science.gov (United States)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns, as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image, such as the Automated Defect Analysis System (ADAS) defect simulation system [1]. Up until now, using ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we created a scanner parameter database that is automatically identified from the mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold, waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in
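
    The two automation steps described (looking up scanner conditions from mask product and level names, and deriving a printability threshold from a reference image with a known CD) amount to a lookup plus a scaling calculation; the sketch below is a generic illustration with invented product names, parameters, and threshold rule, not the ADAS interface.

        # Hypothetical scanner parameter database keyed by (mask product, level).
        SCANNER_DB = {
            ("PRODUCT_A", "METAL1"): {"wavelength_nm": 193, "NA": 1.35, "illumination": "annular"},
            ("PRODUCT_A", "VIA1"):   {"wavelength_nm": 193, "NA": 1.35, "illumination": "quasar"},
        }

        def simulation_setup(product, level, reference_cd_nm, measured_cd_nm, max_cd_change=0.10):
            """Assemble simulation inputs: scanner conditions plus a printability threshold."""
            params = SCANNER_DB[(product, level)]        # auto-selected scanner conditions
            # Calibrate against the reference image: scale the allowed CD change by the ratio
            # of the measured reference CD to its nominal value (an illustrative rule only).
            calibration = measured_cd_nm / reference_cd_nm
            threshold_nm = max_cd_change * reference_cd_nm * calibration
            return {**params, "cd_change_threshold_nm": round(threshold_nm, 2)}

        print(simulation_setup("PRODUCT_A", "METAL1", reference_cd_nm=90.0, measured_cd_nm=88.5))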

  18. Automated Dynamic Demand Response Implementation on a Micro-grid

    Energy Technology Data Exchange (ETDEWEB)

    Kuppannagari, Sanmukh R.; Kannan, Rajgopal; Chelmis, Charalampos; Prasanna, Viktor K.

    2016-11-16

    In this paper, we describe a system for real-time automated Dynamic and Sustainable Demand Response with sparse-data consumption prediction, implemented on the University of Southern California campus microgrid. Supply-side approaches to resolving energy supply-load imbalance do not work at high levels of renewable energy penetration. Dynamic Demand Response (D2R) is a widely used demand-side technique to dynamically adjust electricity consumption during peak load periods. Our D2R system consists of accurate machine-learning-based energy consumption forecasting models that work with sparse data, coupled with fast and sustainable load curtailment optimization algorithms that provide the ability to dynamically adapt to changing supply-load imbalances in near real-time. Our Sustainable DR (SDR) algorithms attempt to distribute customer curtailment evenly across sub-intervals during a DR event and avoid expensive demand peaks during a few sub-intervals. They also ensure that each customer is penalized fairly in order to achieve the targeted curtailment. We develop near linear-time constant-factor approximation algorithms along with Polynomial Time Approximation Schemes (PTAS) for SDR curtailment that minimize the curtailment error, defined as the difference between the target and achieved curtailment values. Our SDR curtailment problem is formulated as an Integer Linear Program that optimally matches customers to curtailment strategies during a DR event while also explicitly accounting for customer strategy switching overhead as a constraint. We demonstrate the results of our D2R system using real data from experiments performed on the USC smartgrid and show that 1) our prediction algorithms can very accurately predict energy consumption even with noisy or missing data and 2) our curtailment algorithms deliver DR with extremely low curtailment errors in the 0.01-0.05 kWh range.
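
    A toy version of the curtailment-matching idea: each customer is assigned one curtailment strategy so that the total achieved curtailment tracks the target in every sub-interval, minimizing the curtailment error. The brute-force search and the strategies and targets below are purely illustrative; the paper's actual formulation is an Integer Linear Program with switching-overhead constraints and approximation algorithms.

        from itertools import product

        # Hypothetical per-customer strategies: curtailment (kWh) delivered in each of 3 sub-intervals.
        strategies = {
            "cust1": [(0.0, 0.0, 0.0), (0.4, 0.4, 0.3)],
            "cust2": [(0.0, 0.0, 0.0), (0.2, 0.5, 0.2)],
            "cust3": [(0.0, 0.0, 0.0), (0.3, 0.2, 0.4)],
        }
        target = (0.6, 0.7, 0.6)   # desired curtailment per sub-interval (kWh)

        def error(assignment):
            """Curtailment error: total |target - achieved| over all sub-intervals."""
            achieved = [sum(strategies[c][s][i] for c, s in assignment.items())
                        for i in range(len(target))]
            return sum(abs(t - a) for t, a in zip(target, achieved))

        # Exhaustive search over strategy assignments (fine for a toy example; the paper
        # uses ILP/approximation algorithms to make this scale).
        best = min(
            (dict(zip(strategies, choice)) for choice in
             product(*[range(len(s)) for s in strategies.values()])),
            key=error,
        )
        print(best, round(error(best), 3))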

  19. A Fully Automated Classification for Mapping the Annual Cropland Extent

    Science.gov (United States)

    Waldner, F.; Defourny, P.

    2015-12-01

    Mapping the global cropland extent is of paramount importance for food security. Indeed, accurate and reliable information on cropland and the location of major crop types is required to make future policy, investment, and logistical decisions, as well as for production monitoring. Timely cropland information directly feeds early warning systems such as GIEWS and FEWS NET. In Africa, and particularly in the arid and semi-arid region, food security is at the center of debate (at least 10% of the population remains undernourished) and accurate cropland estimation is a challenge. Space-borne Earth Observation provides opportunities for global cropland monitoring in a spatially explicit, economic, efficient, and objective fashion. In both agriculture monitoring and climate modelling, cropland maps serve as a mask to isolate agricultural land (i) for time-series analysis for crop condition monitoring and (ii) to investigate how the cropland responds to climatic evolution. A large diversity of mapping strategies, ranging from the local to the global scale and associated with various degrees of accuracy, can be found in the literature. At the global scale, despite efforts, cropland is generally one of the classes with the poorest accuracy, which makes the maps difficult to use for agricultural applications. This research aims at improving the cropland delineation from the local scale to the regional and global scales as well as allowing near-real-time updates. To that aim, five temporal features were designed to target the key characteristics of crop spectral-temporal behavior. To ensure a high degree of automation, training data is extracted from available baseline land cover maps. The method delivers cropland maps with a high accuracy over contrasted agro-systems in Ukraine, Argentina, China and Belgium. The accuracies reached are comparable to those obtained with classifiers trained with in-situ data. Besides, it was found that the cropland class is associated with a low uncertainty. The temporal features
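
    The abstract does not spell out the five temporal features, so the sketch below merely illustrates the kind of spectral-temporal descriptors commonly derived from an NDVI time series (maximum, amplitude, timing of the peak, and so on); the specific features and the synthetic profile are assumptions for illustration, not the authors' definitions.

        import numpy as np

        def temporal_features(ndvi, dates_doy):
            """Example spectral-temporal descriptors from one pixel's NDVI time series."""
            ndvi = np.asarray(ndvi, dtype=float)
            peak_idx = int(np.argmax(ndvi))
            return {
                "max_ndvi": float(ndvi.max()),
                "min_ndvi": float(ndvi.min()),
                "amplitude": float(ndvi.max() - ndvi.min()),
                "peak_doy": int(dates_doy[peak_idx]),          # timing of the greenness peak
                "mean_growing_season": float(ndvi[ndvi > 0.3].mean()) if (ndvi > 0.3).any() else 0.0,
            }

        # Synthetic single-season crop-like NDVI profile sampled every 16 days.
        doy = np.arange(1, 366, 16)
        ndvi = 0.2 + 0.6 * np.exp(-((doy - 200) / 60.0) ** 2)
        print(temporal_features(ndvi, doy))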

  20. Accurate, fully-automated NMR spectral profiling for metabolomics.

    Directory of Open Access Journals (Sweden)

    Siamak Ravanbakhsh

    Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra, involving > 50 compounds, show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error) in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic publicly-accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of

  1. Improving GPR Surveys Productivity by Array Technology and Fully Automated Processing

    Science.gov (United States)

    Morello, Marco; Ercoli, Emanuele; Mazzucchelli, Paolo; Cottino, Edoardo

    2016-04-01

    The realization of network infrastructures with lower environmental impact and the tendency to use digging technologies that are less invasive in terms of time and space of road occupation and restoration play a key role in the development of communication networks. However, pre-existing buried utilities must be detected and located in the subsurface to exploit the high productivity of modern digging apparatus. According to SUE quality level B+, both position and depth of subsurface utilities must be accurately estimated, demanding 3D GPR surveys. In fact, the advantages of 3D GPR acquisitions (obtained either by multiple 2D recordings or by an antenna array) versus 2D acquisitions are well known. Nonetheless, the amount of acquired data for such 3D acquisitions does not usually allow processing and interpretation to be completed directly in the field and in real time, thus limiting the overall efficiency of the GPR acquisition. As an example, the "low impact mini-trench" technique (addressed in ITU - International Telecommunication Union - L.83 recommendation) requires that non-destructive mapping of buried services enhance its productivity to match the improvements of new digging equipment. Nowadays multi-antenna and multi-pass GPR acquisitions demand new processing techniques that can obtain high quality subsurface images, taking full advantage of 3D data: the development of a fully automated and real-time 3D GPR processing system plays a key role in overall optical network deployment profitability. Furthermore, currently available computing power suggests the feasibility of processing schemes that incorporate better focusing algorithms. A novel processing scheme, whose goal is the automated processing and detection of buried targets that can be applied in real-time to 3D GPR array systems, has been developed and fruitfully tested with two different GPR arrays (16 antennas, 900 MHz central frequency, and 34 antennas, 600 MHz central frequency). The proposed processing

  2. Demand Response and Open Automated Demand Response Opportunities for Data Centers

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish; Piette, Mary Ann; Fujita, Sydny; McKane, Aimee; Dudley, Junqiao Han; Radspieler, Anthony; Mares, K.C.; Shroyer, Dave

    2009-12-30

    This study examines data center characteristics, loads, control systems, and technologies to identify demand response (DR) and automated DR (Open Auto-DR) opportunities and challenges. The study was performed in collaboration with technology experts, industrial partners, and data center facility managers, and existing research on commercial and industrial DR was collected and analyzed. The results suggest that data centers, with significant and rapidly growing energy use, have significant DR potential. Because data centers are highly automated, they are excellent candidates for Open Auto-DR. 'Non-mission-critical' data centers are the most likely candidates for early adoption of DR. Data center site infrastructure DR strategies have been well studied for other commercial buildings; however, DR strategies for information technology (IT) infrastructure have not been studied extensively. The largest opportunity for DR or load reduction in data centers is in the use of virtualization to reduce IT equipment energy use, which correspondingly reduces facility cooling loads. DR strategies could also be deployed for data center lighting and for heating, ventilation, and air conditioning. Additional studies and demonstrations are needed to quantify the benefits to data centers of participating in DR and to address concerns about DR's possible impact on data center performance or quality of service and equipment life span.
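
    As a rough worked example of why IT-side measures such as virtualization dominate the DR opportunity: reducing IT load also reduces the cooling and power-distribution overhead that scales with it, so the facility-level reduction is roughly the IT reduction multiplied by a PUE-like overhead factor. The numbers below are invented for illustration.

        # Illustrative only: facility-level impact of shedding IT load in a data center.
        it_load_kw = 1000.0        # hypothetical IT equipment load
        pue = 1.6                  # hypothetical power usage effectiveness (total / IT power)

        it_reduction_kw = 0.20 * it_load_kw          # e.g. consolidate VMs to idle 20% of servers
        # If overhead (cooling, distribution) scales with IT load, the facility sees roughly:
        facility_reduction_kw = it_reduction_kw * pue
        print(f"IT shed: {it_reduction_kw:.0f} kW -> facility shed: ~{facility_reduction_kw:.0f} kW")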

  3. Clinical validation of fully automated computation of ejection fraction from gated equilibrium blood-pool scintigrams

    NARCIS (Netherlands)

    J.H.C. Reiber (Johan); S.P. Lie; M.L. Simoons (Maarten); C. Hoek; J.J. Gerbrands (Jan); W. Wijns (William); W.H. Bakker (Willem); P.P.M. Kooij (Peter)

    1983-01-01

    A fully automated procedure for the computation of left-ventricular ejection fraction (EF) from cardiac-gated Tc-99m blood-pool (GBP) scintigrams with fixed, dual, and variable ROI methods is described. By comparison with EF data from contrast ventriculography in 68 patients, the dual-RO

  4. ProDeGe: A Computational Protocol for fully Automated Decontamination of Genomic Data

    Energy Technology Data Exchange (ETDEWEB)

    2015-12-01

    The Single Cell Data Decontamination Pipeline is a fully-automated software tool which classifies unscreened contigs from single cell datasets through a combination of homology- and feature-based methodologies using the organism's nucleotide sequences and the known NCBI taxonomy. The software is freely available to download and install, and can be run on any system.

  5. Open Automated Demand Response Dynamic Pricing Technologies and Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish; Mathieu, Johanna L.; Piette, Mary Ann; Koch, Ed; Hennage, Dan

    2010-08-02

    This study examines the use of the OpenADR communications specification, related data models, technologies, and strategies to send dynamic prices (e.g., real-time prices and peak prices) and Time of Use (TOU) rates to commercial and industrial electricity customers. OpenADR v1.0 is a flexible, open, Web services-based information model that has been used in California utilities' commercial automated demand response programs since 2007. We find that the data models can be used to send real-time prices. These same data models can also be used to support peak pricing and TOU rates. We present a data model that can accommodate all three types of rates. For demonstration purposes, the data models were generated from the California Independent System Operator's real-time wholesale market prices and a California utility's dynamic prices and TOU rates. Customers can respond to dynamic prices by either using the actual prices, or prices can be mapped into "operation modes," which can act as inputs to control systems. We present several different methods for mapping actual prices. Some of these methods were implemented in demonstration projects. The study results demonstrate that OpenADR allows interoperability with existing and future systems and technologies and can be used within related dynamic pricing activities in the Smart Grid.
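
    The price-to-"operation mode" mapping described here is essentially a set of thresholds that turns a published price into a discrete input for the control system; the sketch below shows one such mapping with invented price breakpoints and mode names (the report discusses several mapping methods, none of which is reproduced exactly here).

        def price_to_mode(price_per_kwh):
            """Map a dynamic electricity price to a discrete operation mode (illustrative thresholds)."""
            if price_per_kwh < 0.10:
                return "NORMAL"            # no change to building operation
            elif price_per_kwh < 0.25:
                return "MODERATE_SHED"     # e.g. modest setpoint offsets, dim some lighting
            else:
                return "HIGH_SHED"         # e.g. aggressive pre-programmed curtailment

        # A control system could poll the published price periodically and act on mode changes.
        for price in (0.08, 0.18, 0.42):
            print(price, "->", price_to_mode(price))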

  6. Blind testing of routine, fully automated determination of protein structures from NMR data.

    Science.gov (United States)

    Rosato, Antonio; Aramini, James M; Arrowsmith, Cheryl; Bagaria, Anurag; Baker, David; Cavalli, Andrea; Doreleijers, Jurgen F; Eletsky, Alexander; Giachetti, Andrea; Guerry, Paul; Gutmanas, Aleksandras; Güntert, Peter; He, Yunfen; Herrmann, Torsten; Huang, Yuanpeng J; Jaravine, Victor; Jonker, Hendrik R A; Kennedy, Michael A; Lange, Oliver F; Liu, Gaohua; Malliavin, Thérèse E; Mani, Rajeswari; Mao, Binchen; Montelione, Gaetano T; Nilges, Michael; Rossi, Paolo; van der Schot, Gijs; Schwalbe, Harald; Szyperski, Thomas A; Vendruscolo, Michele; Vernon, Robert; Vranken, Wim F; Vries, Sjoerd de; Vuister, Geerten W; Wu, Bin; Yang, Yunhuang; Bonvin, Alexandre M J J

    2012-02-08

    The protocols currently used for protein structure determination by nuclear magnetic resonance (NMR) depend on the determination of a large number of upper distance limits for proton-proton pairs. Typically, this task is performed manually by an experienced researcher rather than automatically by using a specific computer program. To assess whether it is indeed possible to generate in a fully automated manner NMR structures adequate for deposition in the Protein Data Bank, we gathered 10 experimental data sets with unassigned nuclear Overhauser effect spectroscopy (NOESY) peak lists for various proteins of unknown structure, computed structures for each of them using different, fully automatic programs, and compared the results to each other and to the manually solved reference structures that were not available at the time the data were provided. This constitutes a stringent "blind" assessment similar to the CASP and CAPRI initiatives. This study demonstrates the feasibility of routine, fully automated protein structure determination by NMR.

  7. Direct versus Facility Centric Load Control for Automated Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Ed; Piette, Mary Ann

    2009-11-06

    Direct load control (DLC) refers to the scenario where third-party entities outside the home or facility are responsible for deciding how and when specific customer loads will be controlled in response to Demand Response (DR) events on the electric grid. Examples of third parties responsible for performing DLC may be Utilities, Independent System Operators (ISOs), Aggregators, or third-party control companies. DLC can be contrasted with facility centric load control (FCLC), where the decisions for how loads are controlled are made entirely within the facility or enterprise control systems. In FCLC the facility owner has more freedom of choice in how to respond to DR events on the grid. Both approaches are in use today in automation of DR, and both will continue to be used in future market segments including industrial, commercial and residential facilities. This paper presents a framework which can be used to differentiate between DLC and FCLC based upon where decisions are made on how specific loads are controlled in response to DR events. This differentiation is then used to compare and contrast DLC and FCLC and to identify the impact each has on: (1) Utility/ISO and third-party systems for managing demand response, (2) facility systems for implementing load control, (3) communications networks for interacting with the facility, and (4) facility operators and managers. Finally, a survey of some of the existing DR-related specifications and communications standards is given, along with their applicability to DLC or FCLC. In general, FCLC adds more cost and responsibility for the facilities, whereas DLC represents higher costs and complexity for the Utility/ISO. This difference is primarily due to where the DR logic is implemented and the consequences that this creates. DLC may be more certain than FCLC because it is more predictable; however, as more loads have the capability to respond to DR signals, people may prefer to have their own control of end-use loads

  8. Validation of fully automated VMAT plan generation for library-based plan-of-the-day cervical cancer radiotherapy

    NARCIS (Netherlands)

    A.W.M. Sharfo (Abdul Wahab M.); S. Breedveld (Sebastiaan); P.W.J. Voet (Peter W.J.); S.T. Heijkoop (Sabrina); J.W.M. Mens (Jan); M.S. Hoogeman (Mischa); B.J.M. Heijmen (Ben)

    2016-01-01

    Purpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-

  9. Design and Implementation of an Open, Interoperable Automated Demand Response Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Kiliccote, Sila; Ghatikar, Girish

    2007-10-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automating demand response (DR). Automating DR allows greater levels of participation and improves the reliability and repeatability of the demand response delivered by customer facilities. Automated DR systems have been deployed for critical peak pricing and demand bidding and are being designed for real-time pricing. The system is designed to generate, manage, and track DR signals from utilities and Independent System Operators (ISOs) to aggregators and end-use customers and their control systems.

  10. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Directory of Open Access Journals (Sweden)

    Phlypo Ronald

    2010-01-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  11. Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy

    Science.gov (United States)

    Bucht, Curry; Söderberg, Per; Manneberg, Göran

    2010-02-01

    The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may reduce the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image were used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained, using the fully automated analysis software on 292 images captured by CSM. The cell density obtained by the
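
    The Fourier-based idea (a quasi-regular cell mosaic produces a characteristic peak in the power spectrum whose radius corresponds to the dominant spatial frequency, roughly cells per unit length) can be sketched as below; the synthetic test pattern, radial-peak heuristic, and density conversion are simplifying assumptions, not the study's calibrated diffraction-theory calculation.

        import numpy as np

        def dominant_spatial_frequency(img):
            """Estimate the dominant spatial frequency (cycles/pixel) from an image's power spectrum."""
            f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
            power = np.abs(f) ** 2
            cy, cx = np.array(power.shape) // 2
            y, x = np.indices(power.shape)
            r = np.hypot(y - cy, x - cx).astype(int)             # radius in frequency bins
            radial = np.bincount(r.ravel(), weights=power.ravel())
            radial[:2] = 0                                        # ignore the DC neighbourhood
            peak_bin = int(np.argmax(radial[: min(img.shape) // 2]))
            return peak_bin / img.shape[0]                        # cycles per pixel (square image assumed)

        # Synthetic stand-in for a cell mosaic: a 2-D cosine grid with ~1 cell per 16 pixels.
        n = 256
        yy, xx = np.mgrid[0:n, 0:n]
        img = np.cos(2 * np.pi * xx / 16) + np.cos(2 * np.pi * yy / 16)
        freq = dominant_spatial_frequency(img)
        cells_per_mm = freq * 1000                    # assuming 1 pixel = 1 μm (illustrative)
        print(round(freq, 3), "cycles/pixel ->", round(cells_per_mm ** 2), "cells/mm^2 (rough)")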

  12. A Robust and Fully-Automated Chromatographic Method for the Quantitative Purification of Ca and Sr for Isotopic Analysis

    Science.gov (United States)

    Smith, H. B.; Kim, H.; Romaniello, S. J.; Field, P.; Anbar, A. D.

    2014-12-01

    High throughput methods for sample purification are required to effectively exploit new opportunities in the study of non-traditional stable isotopes. Many geochemical isotopic studies would benefit from larger data sets, but these are often impractical with manual drip chromatography techniques, which can be time-consuming and demand the attention of skilled laboratory staff. Here we present a new, fully-automated single-column method suitable for the purification of both Ca and Sr for stable and radiogenic isotopic analysis. The method can accommodate a wide variety of sample types, including carbonates, bones, and teeth; silicate rocks and sediments; fresh and marine waters; and biological samples such as blood and urine. Protocols for these isotopic analyses are being developed for use on the new prepFAST-MC™ system from Elemental Scientific (ESI). The system is highly adaptable and processes up to 24-60 samples per day by reusing a single chromatographic column. Efficient column cleaning between samples and an all-Teflon flow path ensure that sample carryover is maintained at the level of background laboratory blanks typical for manual drip chromatography. This method is part of a family of new fully-automated chromatographic methods being developed to address many different isotopic systems, including B, Ca, Fe, Cu, Zn, Sr, Cd, Pb, and U. These methods are designed to be rugged and transferable, and to allow the preparation of large, diverse sample sets via a highly repeatable process with minimal effort.

  13. A Fully Automated Sequential-Injection Analyser for Dual Electrogenerated Chemiluminescence/Amperometric Detection

    OpenAIRE

    Economou, Anastasios; Nika, Maria

    2006-01-01

    This work describes the development of a dedicated, fully automated sequential-injection analysis (SIA) apparatus suitable for simultaneous electrogenerated chemiluminescence (ECL) and amperometric detection. The instrument is composed of a peristaltic pump, a multiposition selection valve, a home-made potentiostat, a thin-layer electrochemical/optical flow-through cell, and a light detector. Control of the experimental sequence and simultaneous data acquisition of the light and the current i...

  14. A fully automated flow-based approach for accelerated peptide synthesis.

    Science.gov (United States)

    Mijalis, Alexander J; Thomas, Dale A; Simon, Mark D; Adamo, Andrea; Beaumont, Ryan; Jensen, Klavs F; Pentelute, Bradley L

    2017-05-01

    Here we report a fully automated, flow-based approach to solid-phase polypeptide synthesis, with amide bond formation in 7 seconds and total synthesis times of 40 seconds per amino acid residue. Crude peptide purities and isolated yields were comparable to those for standard batch solid-phase peptide synthesis. At full capacity, this approach can yield tens of thousands of individual 30-mer peptides per year.
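
    The throughput claim can be sanity-checked with simple arithmetic: at roughly 40 seconds per residue, a 30-mer takes about 20 minutes of synthesis time, so a single instrument running continuously could in principle produce on the order of tens of thousands of 30-mers per year (ignoring setup, cleavage, and downtime, which is an optimistic assumption).

        # Back-of-the-envelope check of the stated throughput (illustrative assumptions).
        seconds_per_residue = 40
        peptide_length = 30
        minutes_per_peptide = seconds_per_residue * peptide_length / 60          # = 20 min
        peptides_per_year = 365 * 24 * 60 / minutes_per_peptide                  # continuous operation
        print(f"{minutes_per_peptide:.0f} min per 30-mer, ~{peptides_per_year:,.0f} peptides/year")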

  15. ClusPro: a fully automated algorithm for protein–protein docking

    OpenAIRE

    Comeau, Stephen R.; Gatchell, David W.; Vajda, Sandor; Camacho, Carlos J.

    2004-01-01

    ClusPro (http://nrc.bu.edu/cluster) represents the first fully automated, web-based program for the computational docking of protein structures. Users may upload the coordinate files of two protein structures through ClusPro's web interface, or enter the PDB codes of the respective structures, which ClusPro will then download from the PDB server (http://www.rcsb.org/pdb/). The docking algorithms evaluate billions of putative complexes, retaining a preset number with favorable surface compleme...

  16. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  17. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate a commercial software package, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  18. Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery.

    Science.gov (United States)

    Payre, William; Cestac, Julien; Delhomme, Patricia

    2016-03-01

    An experiment was performed in a driving simulator to investigate the impacts of practice, trust, and interaction on manual control recovery (MCR) when employing fully automated driving (FAD). To increase the use of partially or highly automated driving efficiency and to improve safety, some studies have addressed trust in driving automation and training, but few studies have focused on FAD. FAD is an autonomous system that has full control of a vehicle without any need for intervention by the driver. A total of 69 drivers with a valid license practiced with FAD. They were distributed evenly across two conditions: simple practice and elaborate practice. When examining emergency MCR, a correlation was found between trust and reaction time in the simple practice group (i.e., higher trust meant a longer reaction time), but not in the elaborate practice group. This result indicated that to mitigate the negative impact of overtrust on reaction time, more appropriate practice may be needed. Drivers should be trained in how the automated device works so as to improve MCR performance in case of an emergency. The practice format used in this study could be used for the first interaction with an FAD car when acquiring such a vehicle. © 2015, Human Factors and Ergonomics Society.

  19. Grcarma: A fully automated task-oriented interface for the analysis of molecular dynamics trajectories.

    Science.gov (United States)

    Koukos, Panagiotis I; Glykos, Nicholas M

    2013-10-05

    We report the availability of grcarma, a program encoding for a fully automated set of tasks aiming to simplify the analysis of molecular dynamics trajectories of biological macromolecules. It is a cross-platform, Perl/Tk-based front-end to the program carma and is designed to facilitate the needs of the novice as well as those of the expert user, while at the same time maintaining a user-friendly and intuitive design. Particular emphasis was given to the automation of several tedious tasks, such as extraction of clusters of structures based on dihedral and Cartesian principal component analysis, secondary structure analysis, calculation and display of root-mean-square deviation (RMSD) matrices, calculation of entropy, calculation and analysis of variance–covariance matrices, calculation of the fraction of native contacts, etc. The program is free, open-source software available immediately for download.

  20. Reagent preparation and storage for amplification of microarray hybridization targets with a fully automated system.

    Science.gov (United States)

    Zhou, Mingjie; Marlowe, Jon; Graves, Jaime; Dahl, Jason; Riley, Zackery; Tian, Lena; Duenwald, Sven; Tokiwa, George; Fare, Thomas L

    2007-08-01

    The advent of automated systems for gene expression profiling has accentuated the need for the development of convenient and cost-effective methods for reagent preparation. We have developed a method for the preparation and storage of pre-aliquoted cocktail plates that contain all reagents required for amplification of nucleic acid by reverse transcription and in vitro transcription reactions. Plates can be stored at -80 degrees C for at least 1 month and kept in a hotel at 4 degrees C for at least 24 h prior to use. Microarray data quality generated from these pre-aliquoted reagent plates is not statistically different between cRNA amplified with stored cocktails and cRNA amplified with freshly prepared cocktails. Deployment of pre-aliquoted, stored cocktail plates in a fully automated system not only increases the throughput of amplifying cRNA targets from thousands of RNA samples, but could also considerably reduce reagent costs and potentially improve process robustness.

  1. A fully automated linear polyacrylamide coating and regeneration method for capillary electrophoresis of proteins.

    Science.gov (United States)

    Bodnar, Judit; Hajba, Laszlo; Guttman, Andras

    2016-12-01

    Surface modification of the inner capillary wall in CE of proteins is frequently required to alter EOF and to prevent protein adsorption. Manual protocols for such coating techniques are cumbersome. In this paper, an automated covalent linear polyacrylamide coating and regeneration process is described to support long-term stability of fused-silica capillaries for protein analysis. The stability of the resulting capillary coatings was evaluated by a large number of separations using a three-protein test mixture in pH 6 and 3 buffer systems. The results were compared to those obtained with the use of bare fused-silica capillaries. If necessary, the fully automated capillary coating process was easily applied to regenerate the capillary to extend its useful lifetime.

  2. MAGNUM project: four years of operation of the fully automated observatory

    Science.gov (United States)

    Kobayashi, Y.; Yoshii, Y.; Minezaki, T.

    2004-10-01

    We present an overview of the MAGNUM (Multicolor Active Galactic Nuclei Monitoring) project and its current status. The MAGNUM project is designed to carry out long-term monitoring observations of many AGN in the optical and near-infrared wavelength regions. For this purpose, we built a 2 m automated telescope as well as a multicolor imaging photometer (MIP). The telescope is located near the Haleakala summit, at a height of 3050 m, within the area of the University of Hawaii's Haleakala Observatory on the Hawaiian Island of Maui. We have been continuously carrying out observations since early 2001 when the preliminary observations were commenced. We have realized a fully automated operation that is suitable for relatively simple and stable observations over a period of several years.

  3. TreeRipper web application: towards a fully automated optical tree recognition software

    Directory of Open Access Journals (Sweden)

    Hughes Joseph

    2011-05-01

    Full Text Available Abstract Background Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners and detect line edges. The edge contour is then determined to detect the branch lengths, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised; the largest tree had 115 leaves. Conclusions Despite the diversity of ways in which phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.

  4. Fully automated quantitative analysis of breast cancer risk in DCE-MR images

    Science.gov (United States)

    Jiang, Luan; Hu, Xiaoxin; Gu, Yajia; Li, Qiang

    2015-03-01

    Amount of fibroglandular tissue (FGT) and background parenchymal enhancement (BPE) in dynamic contrast enhanced magnetic resonance (DCE-MR) images are two important indices for breast cancer risk assessment in clinical practice. The purpose of this study is to develop and evaluate a fully automated scheme for quantitative analysis of FGT and BPE in DCE-MR images. Our fully automated method consists of three steps, i.e., segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues. Based on the volume of interest extracted automatically, a dynamic programming method was applied in each 2-D slice of a 3-D MR scan to delineate the chest wall and breast skin line for segmenting the whole breast. This step took advantage of the continuity of the chest wall and breast skin line across adjacent slices. We then used a fuzzy c-means clustering method with automatic selection of the cluster number for segmenting the fibroglandular tissues within the segmented whole breast area. Finally, a statistical method was used to set a threshold based on the estimated noise level for segmenting the enhanced fibroglandular tissues in the subtraction images of pre- and post-contrast MR scans. Based on the segmented whole breast, fibroglandular tissues, and enhanced fibroglandular tissues, FGT and BPE were automatically computed. Preliminary results of technical evaluation and clinical validation showed that our fully automated scheme could obtain good segmentation of the whole breast, fibroglandular tissues, and enhanced fibroglandular tissues to achieve accurate assessment of FGT and BPE for quantitative analysis of breast cancer risk.
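
    The fibroglandular tissue step above is described as fuzzy c-means clustering applied within the segmented breast region. As a point of reference only, the sketch below implements the standard fuzzy c-means iteration on a 1-D array of voxel intensities; the automatic selection of the cluster number, the dynamic-programming chest-wall delineation and all MR-specific pre-processing from the abstract are not reproduced, and every name in the code is illustrative.

        import numpy as np

        def fuzzy_cmeans(x, n_clusters=3, m=2.0, n_iter=100, tol=1e-5, seed=0):
            """Standard fuzzy c-means on a 1-D array of intensities.

            Returns (cluster_centers, memberships); memberships has shape
            (len(x), n_clusters). Sketch only, not the authors' pipeline.
            """
            rng = np.random.default_rng(seed)
            x = np.asarray(x, dtype=float).reshape(-1, 1)        # (N, 1)
            u = rng.random((x.shape[0], n_clusters))
            u /= u.sum(axis=1, keepdims=True)                    # initial memberships
            for _ in range(n_iter):
                um = u ** m
                centers = (um.T @ x) / um.sum(axis=0)[:, None]   # weighted means, (C, 1)
                dist = np.abs(x - centers.T) + 1e-12             # (N, C) distances
                inv = dist ** (-2.0 / (m - 1.0))
                u_new = inv / inv.sum(axis=1, keepdims=True)     # membership update
                if np.max(np.abs(u_new - u)) < tol:
                    u = u_new
                    break
                u = u_new
            return centers.ravel(), u

        # Example: two synthetic intensity populations -> hard labels for a mask
        rng = np.random.default_rng(1)
        intensities = np.concatenate([rng.normal(200, 20, 500), rng.normal(600, 50, 300)])
        centers, memberships = fuzzy_cmeans(intensities, n_clusters=2)
        labels = memberships.argmax(axis=1)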

  5. Fully Automated Non-Native Speech Recognition Using Confusion-Based Acoustic Model Integration

    OpenAIRE

    Bouselmi, Ghazi; Fohr, Dominique; Illina, Irina; Haton, Jean-Paul

    2005-01-01

    This paper presents a fully automated approach for the recognition of non-native speech based on acoustic model modification. For a native language (L1) and a spoken language (L2), pronunciation variants of the phones of L2 are automatically extracted from an existing non-native database as a confusion matrix with sequences of phones of L1. This is done using L1's and L2's ASR systems. This confusion concept deals with the problem of non-existence of a match between some L2 and L1 phones. The c...

  6. Fully Automated Fluorescent in situ Hybridization (FISH) Staining and Digital Analysis of HER2 in Breast Cancer : A Validation Study

    NARCIS (Netherlands)

    van der Logt, Elise M. J.; Kuperus, Deborah A. J.; van Setten, Jan W.; van den Heuvel, Marius C.; Boers, James. E.; Schuuring, Ed; Kibbelaar, Robby E.

    2015-01-01

    HER2 assessment is routinely used to select patients with invasive breast cancer that might benefit from HER2-targeted therapy. The aim of this study was to validate a fully automated in situ hybridization (ISH) procedure that combines the automated Leica HER2 fluorescent ISH system for Bond with su

  7. Opportunities for Energy Efficiency and Automated Demand Response in Industrial Refrigerated Warehouses in California

    Energy Technology Data Exchange (ETDEWEB)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Rockoff, Alexandra; Piette, Mary Ann

    2009-05-11

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and open automated demand response opportunities for industrial refrigerated warehouses in California. The report describes refrigerated warehouses characteristics, energy use and demand, and control systems. It also discusses energy efficiency and open automated demand response opportunities and provides analysis results from three demand response studies. In addition, several energy efficiency, load management, and demand response case studies are provided for refrigerated warehouses. This study shows that refrigerated warehouses can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for open automated demand response (OpenADR) at little additional cost. These improved controls may prepare facilities to be more receptive to OpenADR due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  8. A new fully automated FTIR system for total column measurements of greenhouse gases

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of the column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany, show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  9. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at PF macromolecular crystallography beamlines: BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchanges, loop-centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are developed as part of the TPRP.

  10. A new fully automated FTIR system for total column measurements of greenhouse gases

    Directory of Open Access Journals (Sweden)

    M. C. Geibel

    2010-10-01

    Full Text Available This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics.

    Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control.

    First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months.

    After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  11. MAGNETIC RESONANCE IMAGING COMPATIBLE ROBOTIC SYSTEM FOR FULLY AUTOMATED BRACHYTHERAPY SEED PLACEMENT

    Science.gov (United States)

    Muntener, Michael; Patriciu, Alexandru; Petrisor, Doru; Mazilu, Dumitru; Bagga, Herman; Kavoussi, Louis; Cleary, Kevin; Stoianovici, Dan

    2011-01-01

    Objectives To introduce the development of the first magnetic resonance imaging (MRI)-compatible robotic system capable of automated brachytherapy seed placement. Methods An MRI-compatible robotic system was conceptualized and manufactured. The entire robot was built of nonmagnetic and dielectric materials. The key technology of the system is a unique pneumatic motor that was specifically developed for this application. Various preclinical experiments were performed to test the robot for precision and imager compatibility. Results The robot was fully operational within all closed-bore MRI scanners. Compatibility tests in scanners of up to 7 Tesla field intensity showed no interference of the robot with the imager. Precision tests in tissue mockups yielded a mean seed placement error of 0.72 ± 0.36 mm. Conclusions The robotic system is fully MRI compatible. The new technology allows for automated and highly accurate operation within MRI scanners and does not deteriorate the MRI quality. We believe that this robot may become a useful instrument for image-guided prostate interventions. PMID:17169653

  12. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. Cardiac function can be evaluated using global and regional parameters of the left ventricle (LV) of the heart. The purpose of this study is to develop and evaluate a fully automated scheme for segmentation of the LV in short axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at the end-diastolic phase, and LV segmentation propagation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of the LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, the end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of the LV in each slice at the ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, taking advantage of the continuity of the LV boundaries across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of the dual dynamic programming technique. The preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of the LV based on subjective evaluation.
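
    The delineation step above relies on dynamic programming through image-derived cost terms. Purely as an illustration of that idea, the sketch below finds a single minimum-cost boundary across the columns of a 2-D cost matrix with a bounded row step; the coupled (dual) endocardial/epicardial formulation and the gradient-based costs described in the abstract are not reproduced.

        import numpy as np

        def min_cost_path(cost, max_step=1):
            """Minimum-cost left-to-right path through a 2-D cost matrix.

            The path visits one row per column and may move at most
            `max_step` rows between adjacent columns; returns one row
            index per column. Illustrative single-boundary version only.
            """
            n_rows, n_cols = cost.shape
            acc = np.full((n_rows, n_cols), np.inf)      # accumulated cost
            back = np.zeros((n_rows, n_cols), dtype=int)
            acc[:, 0] = cost[:, 0]
            for c in range(1, n_cols):
                for r in range(n_rows):
                    lo, hi = max(0, r - max_step), min(n_rows, r + max_step + 1)
                    prev = acc[lo:hi, c - 1]
                    k = int(np.argmin(prev))
                    acc[r, c] = cost[r, c] + prev[k]
                    back[r, c] = lo + k
            path = np.zeros(n_cols, dtype=int)           # backtrack from cheapest end point
            path[-1] = int(np.argmin(acc[:, -1]))
            for c in range(n_cols - 1, 0, -1):
                path[c - 1] = back[path[c], c]
            return path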

  13. Designs and concept reliance of a fully automated high-content screening platform.

    Science.gov (United States)

    Radu, Constantin; Adrar, Hosna Sana; Alamir, Ab; Hatherley, Ian; Trinh, Trung; Djaballah, Hakim

    2012-10-01

    High-content screening (HCS) is becoming an accepted platform in academic and industry screening labs and does require slightly different logistics for execution. To automate our stand-alone HCS microscopes, namely, an alpha IN Cell Analyzer 3000 (INCA3000), originally a Praelux unit hooked to a Hudson Plate Crane with a maximum capacity of 50 plates per run, and the IN Cell Analyzer 2000 (INCA2000), in which up to 320 plates could be fed per run using the Thermo Fisher Scientific Orbitor, we opted for a 4 m linear track system harboring both microscopes, plate washer, bulk dispensers, and a high-capacity incubator allowing us to perform both live and fixed cell-based assays while accessing both microscopes on deck. Considerations in design were given to the integration of the alpha INCA3000, a new gripper concept to access the onboard nest, and peripheral locations on deck to ensure a self-reliant system capable of achieving higher throughput. The resulting system, referred to as Hestia, has been fully operational since the new year, has an onboard capacity of 504 plates, and harbors the only fully automated alpha INCA3000 unit in the world.

  14. Wine analysis to check quality and authenticity by fully-automated 1H-NMR

    Directory of Open Access Journals (Sweden)

    Spraul Manfred

    2015-01-01

    Full Text Available Fully-automated high resolution 1H-NMR spectroscopy offers unique screening capabilities for food quality and safety by combining non-targeted and targeted screening in one analysis (15–20 min from acquisition to report). The advantage of high resolution 1H-NMR is its absolute reproducibility and transferability from laboratory to laboratory, which is not equaled by any other method currently used in food analysis. NMR reproducibility allows statistical investigations, e.g. for the detection of variety, geographical origin and adulterations, where the smallest changes in many ingredients must be recorded at the same time. Reproducibility and transferability of the solutions shown are user-, instrument- and laboratory-independent. Sample preparation, measurement and processing are based on strict standard operating procedures, which are essential for this fully automated solution. The non-targeted approach to the data allows detecting even unknown deviations, if they are visible in the 1H-NMR spectra of e.g. fruit juice, wine or honey. The same data acquired in high-throughput mode are also subjected to quantification of multiple compounds. This 1H-NMR methodology will shortly be introduced, then results on wine will be presented and the advantages of the solutions shown. The method has been proven on juice, honey and wine, where so far unknown frauds could be detected, while targeted parameters are obtained at the same time.

  15. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)

    2013-02-15

    To assess the feasibility of commercially available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes, by comparison with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated the right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with the Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and the measured ex-vivo liver volume (converted from weight) using the analysis of variance test and Pearson's or Spearman's correlation test. Processing time for both the automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for the total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between the automated volume and manual volume of the total liver (p = 0.011). There were good correlations between the automated volume and ex-vivo liver volume (γ = 0.637 for the total liver and γ = 0.767 for the right hemiliver). Both correlation coefficients were higher than those with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  16. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of the DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and a description of the demand response automation server (DRAS), the client/server architecture-based middleware used to automate the interactions between the utilities, or any DR serving entity, and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between the utility/ISO and the clients at the facilities.
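
    The DRAS described above mediates between the utility/ISO and clients at the facilities. The sketch below is only a schematic illustration of a facility-side client that periodically polls a server for a DR signal and maps it to a pre-programmed shed strategy; the URL, JSON fields and strategy names are hypothetical and are not taken from the DRAS or OpenADR specifications.

        import json
        import time
        import urllib.request

        # Hypothetical endpoint and payload; illustrative only.
        DRAS_URL = "https://dras.example.org/api/client/events"

        def poll_dr_event():
            """Fetch the current demand-response signal from the (hypothetical) server."""
            with urllib.request.urlopen(DRAS_URL, timeout=10) as resp:
                return json.loads(resp.read().decode("utf-8"))

        def apply_shed_strategy(event):
            """Map a DR signal level to a pre-programmed load-shed action (stub)."""
            level = event.get("signal_level", "normal")
            actions = {"normal": "no action",
                       "moderate": "raise cooling set points by 2 F",
                       "high": "raise set points and dim non-essential lighting"}
            print("DR level:", level, "->", actions.get(level, "no action"))

        if __name__ == "__main__":
            while True:
                try:
                    apply_shed_strategy(poll_dr_event())
                except OSError as exc:          # network errors: log and retry
                    print("poll failed:", exc)
                time.sleep(300)                 # poll every 5 minutes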

  17. The worldwide NORM production and a fully automated gamma-ray spectrometer for their characterization

    CERN Document Server

    Xhixha, G; Broggini, C; Buso, GP; Caciolli, A; Callegari, I; De Bianchi, S; Fiorentini, G; Guastaldi, E; Xhixha, M Kaçeli; Mantovani, F; Massa, G; Menegazzo, R; Mou, L; Pasquini, A; Alvarez, C Rossi; Shyti, M

    2012-01-01

    Materials containing radionuclides of natural origin that are modified by human-made processes and are subject to regulation because of their radioactivity are known as NORM. We present a brief review of the main categories of non-nuclear industries together with the levels of activity concentration in feed raw materials, products and waste, including mechanisms of radioisotope enrichment. The global management of NORM shows a high level of complexity, mainly due to the different degrees of radioactivity enhancement and the huge amount of worldwide waste production. Future guidelines concerning environmental protection will require both systematic monitoring based on ever-increasing sampling and high-performance gamma-ray spectroscopy. On the basis of these requirements, a new low-background, fully automated, high-resolution gamma-ray spectrometer, MCA_Rad, has been developed. The design of the Pb and Cu shielding allowed a background reduction of two orders of magnitude with respect ...

  18. A fully automated in vitro diagnostic system based on magnetic tunnel junction arrays and superparamagnetic particles

    Science.gov (United States)

    Lian, Jie; Chen, Si; Qiu, Yuqin; Zhang, Suohui; Shi, Stone; Gao, Yunhua

    2012-04-01

    A fully automated in vitro diagnostic (IVD) system for diagnosing acute myocardial infarction was developed using high-sensitivity MTJ arrays as sensors and nano-magnetic particles as tags. On the chip is an array of 12 × 10⁶ MTJ devices integrated onto a three-metal-layer CMOS circuit. The array is divided into 48 detection areas; therefore, 48 different types of bio targets can be analyzed simultaneously if needed. The chip is assembled with a micro-fluidic cartridge which contains all the reagents necessary for completing the assaying process. Integrated with electrical, mechanical and micro-fluidic pumping devices and with the reaction protocol programmed in a microprocessor, the system only requires a simple one-step analyte application procedure to operate and yields results for the three major AMI bio-markers (cTnI, MYO, CK-MB) in 15 min.

  19. Simple and fully automated preparation of [carbonyl-¹¹C]WAY-100635

    Energy Technology Data Exchange (ETDEWEB)

    Wadsak, W.; Ettlinger, D.E; Dudczak, R.; Kletter, K. [Medical Univ. of Vienna (Austria). Dept. of Nuclear Medicine; Mien, L.K. [Medical Univ. of Vienna (Austria). Dept. of Nuclear Medicine; Medical Univ. of Vienna (Austria). Dept. of Psychiatry; Vienna Univ. (Austria). Dept. of Pharmaceutical Technology and Biopharmaceutics; Lanzenberger, R.R. [Medical Univ. of Vienna (Austria). Dept. of Psychiatry; Haeusler, D. [Medical Univ. of Vienna (Austria). Dept. of Nuclear Medicine; Vienna Univ. (Austria). Dept. of Pharmaceutical Technology and Biopharmaceutics; Mitterhauser, M. [Medical Univ. of Vienna (Austria). Dept. of Nuclear Medicine; Vienna Univ. (Austria). Dept. of Pharmaceutical Technology and Biopharmaceutics; General Hospital of Vienna (Austria). Hospital Pharmacy

    2007-07-01

    So far, [carbonyl-¹¹C]WAY-100635 is the PET tracer of choice for 5-HT₁A receptor imaging. Since the preparation is still a challenge, we aimed at (1) the evaluation of various essential parameters for the successful preparation, (2) the simplification of the radiosynthesis and (3) the establishment of a safe and fully automated system. The preparation is based on a commercial synthesizer and all chemicals are used without further processing. We found a low failure rate (7.7%), high average yield (4.0 ± 1.0 GBq) and a specific radioactivity of 292 ± 168 GBq/µmol (both at the end of synthesis, EOS). (orig.)

  20. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    Science.gov (United States)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
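
    Once the breast, FGT and enhancing-tissue segmentations are available, the three reported quantities reduce to simple voxel arithmetic. The sketch below shows that calculation under an assumed voxel volume; the names and the example threshold are illustrative, and the segmentation itself is taken as given.

        import numpy as np

        def bpe_measures(pre, post, fgt_mask, breast_mask, voxel_ml, threshold=0.2):
            """Compute BPEabs, BPErf and BPErb from pre-/post-contrast volumes.

            fgt_mask, breast_mask : boolean arrays of the same shape as the images
            threshold             : relative-enhancement cut-off (0.2 = 20%); the
                                    study sweeps such thresholds between 1% and 100%
            """
            rel_enh = (post - pre) / np.maximum(pre, 1e-6)   # relative enhancement map
            enhancing = fgt_mask & (rel_enh > threshold)     # enhancing FGT voxels
            bpe_abs = enhancing.sum() * voxel_ml             # enhancing volume (ml)
            fgt_vol = fgt_mask.sum() * voxel_ml
            breast_vol = breast_mask.sum() * voxel_ml
            return bpe_abs, bpe_abs / fgt_vol, bpe_abs / breast_vol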

  1. Significance of fully automated tests for the diagnosis of antiphospholipid syndrome.

    Science.gov (United States)

    Oku, Kenji; Amengual, Olga; Kato, Masaru; Bohgaki, Toshiyuki; Horita, Tetsuya; Yasuda, Shinsuke; Sakamoto, Naoya; Ieko, Masahiro; Norman, Gary L; Atsumi, Tatsuya

    2016-10-01

    Antiphospholipid antibodies (aPLs) can vary both immunologically and functionally; thus, it is important to effectively and correctly identify their presence when diagnosing antiphospholipid syndrome. Furthermore, since many immunological/functional tests are necessary to measure aPLs, complete examinations are often not performed due to the significant burden on the testing departments. To address this issue, we measured the aPLs defined in the classification criteria (anticardiolipin antibody [aCL] IgG/IgM and anti-β2 glycoprotein I antibody [aβ2GPI] IgG/IgM) as well as non-criteria antibodies (aCL IgA, aβ2GPI IgA and aβ2GPI domain I) in a cohort of 211 patients (61 APS, 140 disease controls and 10 healthy individuals). aPLs were measured using a fully automated chemiluminescent immunoassay instrument (BIO-FLASH®/ACL AcuStar®) and with conventional ELISA tests. We demonstrated that both the sensitivity and accuracy of diagnosis of aCL IgG and aβ2GPI IgG were high, in agreement with past reports. When multiple aPLs were examined, the accuracy of diagnosis increased. The proportion of APS patients that were positive for 2 or more types of aPLs (47/61, 77%) was higher than that of patients with systemic lupus erythematosus (SLE) (3/37, 9%), those with non-SLE connective tissue diseases (1/53, 2%), or those with other diseases or healthy volunteers. Based on these findings, it was concluded that the fully automated chemiluminescent immunoassay instrument, which allows the simultaneous evaluation of many types of aPLs, offers a more complete, more rapid and less labor-intensive alternative to running multiple ELISAs and could help improve diagnosis for suspected APS patients.

  2. "Smart" RCTs: Development of a Smartphone App for Fully Automated Nutrition-Labeling Intervention Trials.

    Science.gov (United States)

    Volkova, Ekaterina; Li, Nicole; Dunford, Elizabeth; Eyles, Helen; Crino, Michelle; Michie, Jo; Ni Mhurchu, Cliona

    2016-03-17

    There is substantial interest in the effects of nutrition labels on consumer food-purchasing behavior. However, conducting randomized controlled trials on the impact of nutrition labels in the real world presents a significant challenge. The Food Label Trial (FLT) smartphone app was developed to enable conducting fully automated trials, delivering intervention remotely, and collecting individual-level data on food purchases for two nutrition-labeling randomized controlled trials (RCTs) in New Zealand and Australia. Two versions of the smartphone app were developed: one for a 5-arm trial (Australian) and the other for a 3-arm trial (New Zealand). The RCT protocols guided requirements for app functionality, that is, obtaining informed consent, two-stage eligibility check, questionnaire administration, randomization, intervention delivery, and outcome assessment. Intervention delivery (nutrition labels) and outcome data collection (individual shopping data) used the smartphone camera technology, where a barcode scanner was used to identify a packaged food and link it with its corresponding match in a food composition database. Scanned products were either recorded in an electronic list (data collection mode) or allocated a nutrition label on screen if matched successfully with an existing product in the database (intervention delivery mode). All recorded data were transmitted to the RCT database hosted on a server. In total approximately 4000 users have downloaded the FLT app to date; 606 (Australia) and 1470 (New Zealand) users met the eligibility criteria and were randomized. Individual shopping data collected by participants currently comprise more than 96,000 (Australia) and 229,000 (New Zealand) packaged food and beverage products. The FLT app is one of the first smartphone apps to enable conducting fully automated RCTs. Preliminary app usage statistics demonstrate large potential of such technology, both for intervention delivery and data collection. Australian

  3. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python.

  4. Photochemical-chemiluminometric determination of aldicarb in a fully automated multicommutation based flow-assembly

    Energy Technology Data Exchange (ETDEWEB)

    Palomeque, M.; Garcia Bautista, J.A.; Catala Icardo, M.; Garcia Mateo, J.V.; Martinez Calatayud, J

    2004-06-04

    A sensitive and fully automated method for the determination of aldicarb in technical formulations (Temik) and mineral waters is proposed. The automation of the flow-assembly is based on the multicommutation approach, which uses a set of solenoid valves acting as independent switches. The operating cycle for obtaining a typical analytical transient signal can be easily programmed by means of home-made software running in the Windows environment. The manifold is provided with a photoreactor consisting of a 150 cm long × 0.8 mm i.d. piece of PTFE tubing coiled around a 20 W low-pressure mercury lamp. The determination of aldicarb is performed on the basis of the iron(III)-catalyzed mineralization of the pesticide by UV irradiation (150 s) and the chemiluminescent (CL) behavior of the photodegraded pesticide in the presence of potassium permanganate, with quinine sulphate as sensitizer. UV irradiation of aldicarb turns the very weakly chemiluminescent pesticide into a strongly chemiluminescent photoproduct. The method is linear over the range 2.2-100.0 µg l⁻¹ of aldicarb; the limit of detection is 0.069 µg l⁻¹; the reproducibility (as the R.S.D. of 20 peaks of a 24 µg l⁻¹ solution) is 3.7% and the sample throughput is 17 h⁻¹.
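
    The abstract notes that the operating cycle of the solenoid valves is programmed in software. The sketch below illustrates the general idea of such a programmed cycle as a table of valve states and durations executed in sequence; the valve names, timings and the `set_valves` driver stub are all hypothetical and do not correspond to the authors' home-made Windows software.

        import time

        # Hypothetical operating cycle: which valves are ON and for how long (seconds).
        CYCLE = [
            ({"carrier": True,  "sample": False, "oxidant": False}, 2.0),  # baseline
            ({"carrier": False, "sample": True,  "oxidant": False}, 1.5),  # insert sample plug
            ({"carrier": True,  "sample": False, "oxidant": True},  0.5),  # merge with reagent
            ({"carrier": True,  "sample": False, "oxidant": False}, 3.0),  # flush to detector
        ]

        def set_valves(states):
            """Stub for the driver call that switches the solenoid valves."""
            print("valves:", states)

        def run_cycle(cycle):
            """Execute one operating cycle, producing a single transient signal."""
            for states, duration in cycle:
                set_valves(states)
                time.sleep(duration)

        run_cycle(CYCLE)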

  5. Fabrication of fully automated setup for high temperature thermal conductivity measurement

    CERN Document Server

    Patel, Ashutosh

    2016-01-01

    In this work, we report the fabrication of a fully automated experimental setup for high-temperature thermal conductivity ($\kappa$) measurement. A steady-state axial heat flow technique is used for the $\kappa$ measurement, and heat loss is measured using the parallel thermal conductance technique. A simple, lightweight and compact sample holder is developed using a thin heater and a limited number of components. A low heat loss is achieved by using an insulator block with a small cross-section and very low thermal conductivity. The power delivered to the heater is measured accurately using the 4-wire technique; for this purpose, the heater is built with four wires. A program is written in the graphical programming language LabVIEW to automate the whole measurement process. The setup is validated using $Bi_{0.36}Sb_{1.45}Te_3$, polycrystalline bismuth, gadolinium, and alumina samples. The data obtained for these samples are found to be in good agreement with reported data, and a maximum deviation of 6% in the value of $\kappa$ is observed. ...
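
    In the steady-state axial heat flow technique named above, the thermal conductivity follows from Fourier's law once the parasitic heat loss (here measured by the parallel thermal conductance step) is subtracted from the heater power. A minimal calculation with illustrative values might look like this:

        def thermal_conductivity(power_in, heat_loss, length, area, delta_t):
            """Steady-state axial heat flow estimate of kappa (W/m/K).

            power_in  : electrical power delivered to the heater (W)
            heat_loss : parasitic heat loss under the same conditions (W)
            length    : sample length along the heat-flow axis (m)
            area      : sample cross-sectional area (m^2)
            delta_t   : temperature difference across the sample (K)
            """
            q_net = power_in - heat_loss      # heat actually flowing through the sample
            return q_net * length / (area * delta_t)

        # Example: 5.0 mW delivered, 0.5 mW lost, 10 mm long, 4 mm x 4 mm bar, 2.0 K drop
        kappa = thermal_conductivity(5.0e-3, 0.5e-3, 0.010, 16e-6, 2.0)   # about 1.4 W/m/K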

  6. A fully automated FTIR system for remote sensing of greenhouse gases in the tropics

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-07-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network. It will provide continuous ground-based measurements of the column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container, it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept, which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. First results of total column measurements at Jena, Germany, show that the instrument works well and can provide the diurnal as well as the seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  7. A fully automated FTIR system for remote sensing of greenhouse gases in the tropics

    Directory of Open Access Journals (Sweden)

    M. C. Geibel

    2010-07-01

    Full Text Available This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network. It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics.

    Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail.

    First results of total column measurements at Jena, Germany show that the instrument works well and can provide diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months.

    After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  8. A fully adaptive forecasting model for short-term drinking water demand

    NARCIS (Netherlands)

    Bakker, M.; Vreeburg, J.H.G.; Schagen, van K.M.; Rietveld, L.C.

    2013-01-01

    For the optimal control of a water supply system, a short-term water demand forecast is necessary. We developed a model that forecasts the water demand for the next 48 h with 15-min time steps. The model uses measured water demands and static calendar data as single input. Based on this input, the m
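
    The record above is truncated, but the stated inputs (measured demands plus static calendar data, 15-min resolution, 48-h horizon) suggest a day-type profile approach. The sketch below is only a naive same-weekday baseline in that spirit, not the authors' adaptive model: each 15-min slot of the next two days is forecast as the mean of the same slot on the same weekday over recent weeks.

        import numpy as np

        SLOTS_PER_DAY = 96   # 15-min time steps

        def naive_forecast(history, weekday_of, target_weekdays, n_weeks=4):
            """Baseline 48-h water demand forecast at 15-min resolution.

            history         : array (n_days, 96) of measured demand
            weekday_of      : array (n_days,) with weekday index 0-6 per day
            target_weekdays : weekday indices of the days to forecast (e.g. two days)
            Returns an array (len(target_weekdays), 96) of per-slot averages.
            """
            forecast = np.zeros((len(target_weekdays), SLOTS_PER_DAY))
            for i, wd in enumerate(target_weekdays):
                same_day = history[weekday_of == wd][-n_weeks:]   # last n matching days
                forecast[i] = same_day.mean(axis=0)
            return forecast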

  9. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow, cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms, including a piecewise continuous regression method to determine the bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT values estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area on motion-corrected MTT maps and compared them with time-to-peak (TTP) maps generated using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± s.d.) without motion correction and 267 ± 80 s (mean ± s.d.) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise of the PWI when compared with the TTP map. Results of image
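
    One of the processing steps named above is gamma-variate curve fitting of the arterial input function. The sketch below fits the standard gamma-variate bolus model C(t) = K (t - t0)^alpha exp(-(t - t0)/beta) to a concentration-time curve with SciPy; the bolus-arrival-time regression, deconvolution and motion correction from the abstract are not reproduced, and the starting values and bounds are illustrative.

        import numpy as np
        from scipy.optimize import curve_fit

        def gamma_variate(t, k, t0, alpha, beta):
            """Standard gamma-variate bolus model; zero before arrival time t0."""
            dt = np.clip(t - t0, 0.0, None)
            return k * dt ** alpha * np.exp(-dt / beta)

        def fit_aif(t, conc):
            """Least-squares gamma-variate fit of a concentration-time curve."""
            p0 = [conc.max(), max(t[np.argmax(conc)] - 5.0, 0.1), 2.0, 2.0]
            bounds = ([0.0, 0.0, 0.1, 0.1], [np.inf, t[-1], 10.0, 60.0])
            popt, _ = curve_fit(gamma_variate, t, conc, p0=p0, bounds=bounds)
            return popt   # k, t0, alpha, beta

        # Example with a synthetic, noisy AIF-like curve
        rng = np.random.default_rng(0)
        t = np.arange(0.0, 60.0, 1.0)                       # seconds
        noisy = gamma_variate(t, 8.0, 10.0, 3.0, 1.5) + rng.normal(0, 0.2, t.size)
        k, t0, alpha, beta = fit_aif(t, noisy)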

  10. Towards fully automated structure-based function prediction in structural genomics: a case study.

    Science.gov (United States)

    Watson, James D; Sanderson, Steve; Ezersky, Alexandra; Savchenko, Alexei; Edwards, Aled; Orengo, Christine; Joachimiak, Andrzej; Laskowski, Roman A; Thornton, Janet M

    2007-04-13

    As the global Structural Genomics projects have picked up pace, the number of structures annotated in the Protein Data Bank as hypothetical protein or unknown function has grown significantly. A major challenge now involves the development of computational methods to assign functions to these proteins accurately and automatically. As part of the Midwest Center for Structural Genomics (MCSG) we have developed a fully automated functional analysis server, ProFunc, which performs a battery of analyses on a submitted structure. The analyses combine a number of sequence-based and structure-based methods to identify functional clues. After the first stage of the Protein Structure Initiative (PSI), we review the success of the pipeline and the importance of structure-based function prediction. As a dataset, we have chosen all structures solved by the MCSG during the 5 years of the first PSI. Our analysis suggests that two of the structure-based methods are particularly successful and provide examples of local similarity that is difficult to identify using current sequence-based methods. No one method is successful in all cases, so, through the use of a number of complementary sequence and structural approaches, the ProFunc server increases the chances that at least one method will find a significant hit that can help elucidate function. Manual assessment of the results is a time-consuming process and subject to individual interpretation and human error. We present a method based on the Gene Ontology (GO) schema using GO-slims that can allow the automated assessment of hits with a success rate approaching that of expert manual assessment.

  11. Difference Tracker: ImageJ plugins for fully automated analysis of multiple axonal transport parameters.

    Science.gov (United States)

    Andrews, Simon; Gilley, Jonathan; Coleman, Michael P

    2010-11-30

    Studies of axonal transport are critical, not only to understand its normal regulation, but also to determine the roles of transport impairment in disease. Exciting new resources have recently become available allowing live imaging of axonal transport in physiologically relevant settings, such as mammalian nerves. Thus the effects of disease, ageing and therapies can now be assessed directly in nervous system tissue. However, these imaging studies present new challenges. Manual or semi-automated analysis of the range of transport parameters required for a suitably complete evaluation is very time-consuming and can be subjective due to the complexity of the particle movements in axons in ex vivo explants or in vivo. We have developed Difference Tracker, a program combining two new plugins for the ImageJ image-analysis freeware, to provide fast, fully automated and objective analysis of a number of relevant measures of trafficking of fluorescently labeled particles so that axonal transport in different situations can be easily compared. We confirm that Difference Tracker can accurately track moving particles in highly simplified, artificial simulations. It can also identify and track multiple motile fluorescently labeled mitochondria simultaneously in time-lapse image stacks from live imaging of tibial nerve axons, reporting values for a number of parameters that are comparable to those obtained through manual analysis of the same axons. Difference Tracker therefore represents a useful free resource for the comparative analysis of axonal transport under different conditions, and could potentially be used and developed further in many other studies requiring quantification of particle movements.
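
    Difference Tracker itself is a pair of ImageJ plugins written for that platform; purely to illustrate the underlying idea of difference-based detection of moving particles, the Python sketch below subtracts consecutive frames, thresholds the absolute change and returns blob centroids. It is not a port of the plugins, and the threshold is an assumed parameter.

        import numpy as np
        from scipy import ndimage

        def moving_particle_centroids(frame_prev, frame_next, threshold):
            """Centroids of regions whose frame-to-frame change exceeds `threshold`.

            frame_prev, frame_next : 2-D arrays (consecutive time-lapse frames)
            Returns a list of (row, col) centroids of the connected regions.
            """
            diff = np.abs(frame_next.astype(float) - frame_prev.astype(float))
            mask = diff > threshold
            labels, n = ndimage.label(mask)                  # connected components
            return ndimage.center_of_mass(diff, labels, range(1, n + 1))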

  12. Fully Automated Home Tofu Machine

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    This paper introduces a miniature, fast, easy-to-wash, low-cost, fully automated household tofu machine. After dry soybeans and a measured amount of water are added, the machine automatically carries out ultra-fine grinding, slurry-residue separation, steam heating, addition of the gypsum coagulant, and the remaining steps of the process. Using a single working cup, it integrates grinding, slurry-residue separation, heating, coagulation, curd setting and breaking, transfer, and press-filter molding. The press-filter molding stage uses a rigid structure (no filter cloth), so that the pressing and molding of the tofu is fully automated.

  13. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  14. Fully automated synthesis module for the high yield one-pot preparation of 6-[F-18]fluoro-L-DOPA

    NARCIS (Netherlands)

    de Vries, EFJ; Luurtsema, G; Brussermann, M; Elsinga, PH; Vaalburg, W

    1999-01-01

    A fully automated one-pot synthesis of 6-[F-18]fluoro-L-DOPA, an important radiopharmaceutical for studies on the presynaptic dopamine metabolism with positron emission tomography, is described. 6-[F-18]Fluoro-L-DOPA was prepared in high radiochemical yield (33 ± 4%, c.f.d.) and radiochemical

  15. Fully automated objective-based method for master recession curve separation.

    Science.gov (United States)

    Posavec, Kristijan; Parlov, Jelena; Nakić, Zoran

    2010-01-01

    The fully automated objective-based method for master recession curve (MRC) separation was developed using a Microsoft Excel spreadsheet and Visual Basic for Applications (VBA) code. The core of the program code is used to construct an MRC by using the adapted matching strip method (Posavec et al. 2006). Criteria for separating the MRC into two or three segments are determined from the flow-duration curve and are represented as the probable range of percent of flow rate duration. Successive separations are performed automatically on two and three MRCs using sets of percent of flow rate duration from the selected ranges, and the optimal separation model scenario, having the highest average coefficient of determination R², is selected as the most appropriate one. The resulting separated master recession curves are presented graphically, whereas the statistics are presented numerically, all in separate sheets. Examples of field data obtained from two springs in Istria, Croatia, are used to illustrate its application. The freely available Excel spreadsheet and VBA program ensure ease of use and applicability to larger data sets.
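
    The MRC assembly itself uses the adapted matching strip method of Posavec et al. (2006), which is not reproduced here. As a minimal building block only, the sketch below fits a single recession segment to the classical exponential model Q(t) = Q0 exp(-alpha t) by log-linear least squares, the kind of per-segment fit an MRC analysis ultimately rests on.

        import numpy as np

        def recession_fit(t, q):
            """Fit Q(t) = Q0 * exp(-alpha * t) to one recession segment.

            t, q : arrays of elapsed time and (positive) discharge
            Returns (Q0, alpha) from a log-linear least-squares fit.
            """
            slope, intercept = np.polyfit(np.asarray(t, dtype=float), np.log(q), 1)
            return float(np.exp(intercept)), float(-slope)

        # Example: a 10-step segment receding from 2.0 with alpha = 0.1
        t = np.arange(10.0)
        q = 2.0 * np.exp(-0.1 * t)
        q0, alpha = recession_fit(t, q)   # -> (2.0, 0.1)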

  16. Ex vivo encapsulation of dexamethasone sodium phosphate into human autologous erythrocytes using fully automated biomedical equipment.

    Science.gov (United States)

    Mambrini, Giovanni; Mandolini, Marco; Rossi, Luigia; Pierigè, Francesca; Capogrossi, Giovanni; Salvati, Patricia; Serafini, Sonja; Benatti, Luca; Magnani, Mauro

    2017-01-30

    Erythrocyte-based drug delivery systems are emerging as potential new solutions for the release of drugs into the bloodstream. The aim of the present work was to assess the performance of a fully automated process (EDS) for the ex-vivo encapsulation of the pro-drug dexamethasone sodium phosphate (DSP) into autologous erythrocytes in compliance with regulatory requirements. The loading method was based on reversible hypotonic hemolysis, which opens transient pores in the cell membrane that can be crossed by DSP. The efficiency of encapsulation and the biochemical and physiological characteristics of the processed erythrocytes were investigated in blood samples from 34 healthy donors. It was found that the processed erythrocytes maintained their fundamental properties and the encapsulation process was reproducible. The EDS under study showed greater loading efficiency and reduced variability compared to previous EDS versions. Notably, these results were confirmed using blood samples from Ataxia Telangiectasia (AT) patients, with 9.33 ± 1.40 and 19.41 ± 2.10 mg of DSP loaded (mean ± SD, n = 134) using 62.5 and 125 mg DSP loading quantities, respectively. These results support the use of the new EDS version 3.2.0 to investigate the effect of erythrocyte-delivered dexamethasone in regulatory trials in patients with AT.

  17. Fully Automated and Robust Tracking of Transient Waves in Structured Anatomies Using Dynamic Programming.

    Science.gov (United States)

    Akkus, Zeynettin; Bayat, Mahdi; Cheong, Mathew; Viksit, Kumar; Erickson, Bradley J; Alizad, Azra; Fatemi, Mostafa

    2016-10-01

    Tissue stiffness is often linked to underlying pathology and can be quantified by measuring the mechanical transient transverse wave speed (TWS) within the medium. Time-of-flight methods based on correlation of the transient signals or tracking of peaks have been used to quantify the TWS from displacement maps obtained with ultrasound pulse-echo techniques. However, it is challenging to apply these methods to in vivo data because of tissue inhomogeneity, noise and artifacts that produce outliers. In this study, we introduce a robust and fully automated method based on dynamic programming to estimate TWS in tissues with known geometries. The method is validated using ultrasound bladder vibrometry data from an in vivo study. We compared the results of our method with those of time-of-flight techniques. Our method performs better than time-of-flight techniques. In conclusion, we present a robust and accurate TWS detection method that overcomes the difficulties of time-of-flight methods.
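
    The baseline the authors compare against is a time-of-flight estimate obtained by correlating transient displacement signals recorded a known distance apart. The sketch below shows that baseline calculation only (not the dynamic-programming tracker itself); the sampling rate, spacing and synthetic pulses are illustrative.

        import numpy as np
        from scipy.signal import correlate

        def tof_wave_speed(sig_a, sig_b, spacing_m, fs_hz):
            """Time-of-flight transverse wave speed between two tracking points.

            sig_a, sig_b : displacement waveforms at two lateral positions
            spacing_m    : distance between the positions (m)
            fs_hz        : temporal sampling rate of the waveforms (Hz)
            """
            xc = correlate(sig_b, sig_a, mode="full")
            lag_samples = np.argmax(xc) - (len(sig_a) - 1)   # delay of b relative to a
            return spacing_m / (lag_samples / fs_hz)

        # Example: a pulse arriving 2 ms later at a point 4 mm away -> ~2 m/s
        fs = 10000.0
        t = np.arange(0.0, 0.05, 1.0 / fs)
        pulse = np.exp(-((t - 0.010) / 0.002) ** 2)
        delayed = np.exp(-((t - 0.012) / 0.002) ** 2)
        speed = tof_wave_speed(pulse, delayed, 0.004, fs)    # about 2.0 m/s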

  18. Analysis of xanthines in beverages using a fully automated SPE-SFC-DAD hyphenated system

    Energy Technology Data Exchange (ETDEWEB)

    Medvedovici, A. [Bucarest Univ., Bucarest (Romania). Faculty of Chemistry, Dept. of Analytical Chemistry; David, F.; David, V.; Sandra, P. [Research Institute of Chromatography, Kortrijk (Belgium)

    2000-08-01

    Analysis of some xanthines (caffeine, theophylline and theobromine) in beverages has been achieved by fully automated on-line Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD). Three adsorbents were tested for the SPE procedure: octadecyl-modified silica gel (ODS) and two types of styrene-divinylbenzene copolymer-based materials, of which Porapak proved to be the most suitable adsorbent. Optimisation and correlation of both the SPE and SFC operational parameters are also discussed. Using this technique, caffeine was determined in iced tea and Coca-Cola at a concentration of 0.15 ppm, theobromine at 1.5 ppb, and theophylline at 0.15 ppb. [Translated from Italian] The analysis of some xanthines (caffeine, theophylline and theobromine) was carried out with a fully automated on-line system based on Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD). Three substrates were evaluated for the SPE procedure: octadecyl silica (ODS) and two types of styrene-divinylbenzene polymeric materials, of which the one named PRP-1 proved to be the most efficient. Both the optimisation and the correlation of the SPE and SFC operational parameters are discussed. With this technique, caffeine, theobromine and theophylline were determined in iced tea and Coca-Cola at concentrations of 0.15, 1.5 and 0.15 ppm.

  19. Methodology for fully automated segmentation and plaque characterization in intracoronary optical coherence tomography images.

    Science.gov (United States)

    Athanasiou, Lambros S; Bourantas, Christos V; Rigas, George; Sakellarios, Antonis I; Exarchos, Themis P; Siogkas, Panagiotis K; Ricciardi, Andrea; Naka, Katerina K; Papafaklis, Michail I; Michalis, Lampros K; Prati, Francesco; Fotiadis, Dimitrios I

    2014-02-01

    Optical coherence tomography (OCT) is a light-based intracoronary imaging modality that provides high-resolution cross-sectional images of the luminal and plaque morphology. Currently, the segmentation of OCT images and identification of the composition of plaque are mainly performed manually by expert observers. However, this process is laborious and time consuming and its accuracy relies on the expertise of the observer. To address these limitations, we present a methodology that is able to process the OCT data in a fully automated fashion. The proposed methodology is able to detect the lumen borders in the OCT frames, identify the plaque region, and detect four tissue types: calcium (CA), lipid tissue (LT), fibrous tissue (FT), and mixed tissue (MT). The efficiency of the developed methodology was evaluated using annotations from 27 OCT pullbacks acquired from 22 patients. High Pearson's correlation coefficients were obtained between the output of the developed methodology and the manual annotations (from 0.96 to 0.99), while no significant bias with good limits of agreement was shown in the Bland-Altman analysis. The overlapping areas ratio between experts' annotations and methodology in detecting CA, LT, FT, and MT was 0.81, 0.71, 0.87, and 0.81, respectively.

  20. Fully automated dialysis system based on the central dialysis fluid delivery system.

    Science.gov (United States)

    Kawanishi, Hideki; Moriishi, Misaki; Sato, Takashi; Taoka, Masahiro

    2009-01-01

    The fully automated dialysis system (FADS) was developed as an improvement over previous patient monitors used in hemodialysis treatment, with the aim of standardizing such treatment and reducing the labor it requires. The system uses backfiltration dialysis fluid to perform priming, blood rinse-back and rapid fluid replenishment, and guides blood into the dialyzer by means of the drainage pump used for ultrafiltration. This requires that the dialysis fluid be purified to a high level. The central dialysis fluid delivery system (CDDS) combines the creation and supply of dialysis water and dialysis fluid to achieve a level of purity equivalent to ultrapure dialysis fluid. FADS has the further advantages of greater efficiency and streamlined operation, reducing human error and the risk of infection without requiring the storage or disposal of normal saline solution. The simplification of hemodialysis allows for more frequent or extended dialysis, enabling treatment to be tailored to the patient's particular situation. FADS thus markedly improves the reliability, safety and standardization of dialysis procedures while reducing the labor they require, making it particularly useful for institutions dealing with dialysis on a large scale.

  1. Thin film production with a new fully automated optical thickness monitoring system (Invited Paper)

    Science.gov (United States)

    Lardon, M.; Selhofer, H.

    1986-10-01

    The increasing demand for complex multilayer optical coatings requires equipment with a completely automated process control system. The new optical thickness monitor GSM 420, which is part of the deposition control system BPU 420, allows remotely controlled wavelength selection either with a grating monochromator combined with the appropriate order-sorting filters or with a set of six narrow bandpass filters. Endpoint detection is based on digital processing of the signal corresponding to the light intensity after transmission through, or reflection from, a test glass located side by side with a quartz crystal microbalance at the center of the coating plant. Turning-value monitoring and termination of the process at an arbitrary predetermined point are both possible. Single and multiple layers of silicon dioxide and titanium dioxide, and combinations thereof, were deposited. An excellent linear correlation between the optical thickness on the test glass and the geometrical layer thickness measured by the quartz crystal microbalance was observed. The reproducibility for single layers of quarter-wave thickness was found to be between +/- 0.7 and +/- 1.7% of the center wavelength of the spectral extremum measured on the test glass, depending on wavelength (350-3200 nm) and coating material (SiO2 or TiO2 on glass).

  2. Open Automated Demand Response Technologies for Dynamic Pricing and Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish; Mathieu, Johanna L.; Piette, Mary Ann; Kiliccote, Sila

    2010-06-02

    We present an Open Automated Demand Response Communications Specifications (OpenADR) data model capable of communicating real-time prices to electricity customers. We also show how the same data model could be used for other types of dynamic pricing tariffs (including peak pricing tariffs, which are common throughout the United States). Customers participating in automated demand response programs with building control systems can respond to dynamic prices by using the actual prices as inputs to their control systems. Alternatively, prices can be mapped into "building operation modes," which can act as inputs to control systems. We present several different strategies customers could use to map prices to operation modes. Our results show that OpenADR can be used to communicate dynamic pricing within the Smart Grid and that OpenADR allows for interoperability with existing and future systems, technologies, and electricity markets.
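
    A hypothetical illustration of the price-to-operation-mode mapping idea described above; the thresholds and mode names are invented for the example and are not part of the OpenADR specification itself.

```python
# Sketch only: map a real-time electricity price to a discrete building
# operation mode that a control system can act on. Thresholds are assumptions.
PRICE_MODES = [
    (0.10, "normal"),       # below $0.10/kWh: no action
    (0.20, "moderate"),     # $0.10-0.20/kWh: modest global temperature setback
    (float("inf"), "high"), # above $0.20/kWh: pre-programmed shed strategy
]

def price_to_mode(price_per_kwh: float) -> str:
    for threshold, mode in PRICE_MODES:
        if price_per_kwh < threshold:
            return mode
    return "high"

print(price_to_mode(0.17))  # -> "moderate"
```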

  3. A novel fully automated molecular diagnostic system (AMDS) for colorectal cancer mutation detection.

    Directory of Open Access Journals (Sweden)

    Shiro Kitano

    Full Text Available BACKGROUND: KRAS, BRAF and PIK3CA mutations are frequently observed in colorectal cancer (CRC). In particular, KRAS mutations are strong predictors of clinical outcomes of EGFR-targeted treatments such as cetuximab and panitumumab in metastatic colorectal cancer (mCRC). For mutation analysis, the current methods are time-consuming and not readily available to all oncologists and pathologists. We have developed a novel, simple, sensitive and fully automated molecular diagnostic system (AMDS) for point of care testing (POCT). Here we report the results of a comparison study between AMDS and direct sequencing (DS) in the detection of KRAS, BRAF and PIK3CA somatic mutations. METHODOLOGY/PRINCIPAL FINDINGS: DNA was extracted from a slice of either frozen (n = 89) or formalin-fixed and paraffin-embedded (FFPE) CRC tissue (n = 70), and then used for mutation analysis by AMDS and DS. All mutations detected by DS (n = 41 among frozen and 27 among FFPE samples) were also successfully (100%) detected by the AMDS. However, 8 frozen and 6 FFPE samples detected as wild-type in the DS analysis were shown as mutants in the AMDS analysis. By cloning-sequencing assays, these discordant samples were confirmed as true mutants. One sample had simultaneous "hot spot" mutations of KRAS and PIK3CA, and the cloning assay confirmed that E542K and E545K were not on the same allele. Genotyping call rates for DS were 100.0% (89/89) and 74.3% (52/70) in frozen and FFPE samples, respectively, for the first attempt, whereas that of AMDS was 100.0% for both sample sets. For automated DNA extraction and mutation detection by AMDS, all mutations in frozen tissues (n = 41) were successfully detected within 70 minutes. CONCLUSIONS/SIGNIFICANCE: AMDS has superior sensitivity and accuracy over DS, and is much easier to execute than conventional labor-intensive manual mutation analysis. AMDS has great potential as POCT equipment for mutation analysis.

  4. Fully automated prostate magnetic resonance imaging and transrectal ultrasound fusion via a probabilistic registration metric

    Science.gov (United States)

    Sparks, Rachel; Bloch, B. Nicholas; Feleppa, Ernest; Barratt, Dean; Madabhushi, Anant

    2013-03-01

    In this work, we present a novel, automated, registration method to fuse magnetic resonance imaging (MRI) and transrectal ultrasound (TRUS) images of the prostate. Our methodology consists of: (1) delineating the prostate on MRI, (2) building a probabilistic model of prostate location on TRUS, and (3) aligning the MRI prostate segmentation to the TRUS probabilistic model. TRUS-guided needle biopsy is the current gold standard for prostate cancer (CaP) diagnosis. Up to 40% of CaP lesions appear isoechoic on TRUS, hence TRUS-guided biopsy cannot reliably target CaP lesions and is associated with a high false negative rate. MRI is better able to distinguish CaP from benign prostatic tissue, but requires special equipment and training. MRI-TRUS fusion, whereby MRI is acquired pre-operatively and aligned to TRUS during the biopsy procedure, allows for information from both modalities to be used to help guide the biopsy. The use of MRI and TRUS in combination to guide biopsy at least doubles the yield of positive biopsies. Previous work on MRI-TRUS fusion has involved aligning manually determined fiducials or prostate surfaces to achieve image registration. The accuracy of these methods is dependent on the reader's ability to determine fiducials or prostate surfaces with minimal error, which is a difficult and time-consuming task. Our novel, fully automated MRI-TRUS fusion method represents a significant advance over the current state-of-the-art because it does not require manual intervention after TRUS acquisition. All necessary preprocessing steps (i.e. delineation of the prostate on MRI) can be performed offline prior to the biopsy procedure. We evaluated our method on seven patient studies, with B-mode TRUS and a 1.5 T surface coil MRI. Our method has a root mean square error (RMSE) for expertly selected fiducials (consisting of the urethra, calcifications, and the centroids of CaP nodules) of 3.39 +/- 0.85 mm.
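
    The fiducial-based error reported above reduces to a root-mean-square distance over corresponding landmark pairs; a small numpy sketch is shown below, assuming N x 3 arrays of fiducial coordinates in millimetres (the array names are illustrative).

```python
import numpy as np

def fiducial_rmse(points_mri, points_trus):
    """Root mean square distance (mm) between corresponding fiducials.

    points_mri, points_trus: arrays of shape (N, 3) with matched landmarks
    such as the urethra, calcifications, and lesion centroids.
    """
    d = np.linalg.norm(np.asarray(points_mri) - np.asarray(points_trus), axis=1)
    return float(np.sqrt(np.mean(d ** 2)))

# Example with two toy landmark pairs:
print(fiducial_rmse([[0, 0, 0], [10, 0, 0]], [[1, 0, 0], [10, 2, 0]]))
```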

  5. Evaluation of three fully automated immunoassay systems for detection of IgA anti-beta 2-glycoprotein I antibodies.

    Science.gov (United States)

    Pérez, D; Martínez-Flores, J A; Serrano, M; Lora, D; Paz-Artal, E; Morales, J M; Serrano, A

    2016-10-01

    In recent years, we have been witnessing increased clinical interest in the determination of IgA anti-beta 2-glycoprotein I (aB2GPI) antibodies as well as increased demand for this test. Some ELISA-based diagnostic systems for the detection of IgA aB2GPI antibodies are suboptimal. The aim of our study was to determine whether the diagnostic yield of modern detection systems based on automatic platforms to measure IgA aB2GPI is equivalent to that of well-optimized ELISA-based assays. In total, 130 patients were analyzed for IgA aB2GPI by three fully automated immunoassays using an ELISA-based assay as reference. The three systems were also analyzed for IgG aB2GPI in 58 patients. System 1 was able to detect IgA aB2GPI with good sensitivity and kappa index (99% and 0.72, respectively). The other two systems showed poor sensitivity (20% and 15%) and kappa indices (0.10 and 0.07), respectively. On the other hand, the kappa index for IgG aB2GPI was >0.89 in all three systems. Some analytical methods to detect IgA aB2GPI are suboptimal, as are some ELISA-based diagnostic systems. It is important that the scientific community work to standardize analytical methods to determine IgA aB2GPI antibodies. © 2016 John Wiley & Sons Ltd.
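
    For reference, sensitivity against the ELISA-based assay and an unweighted Cohen's kappa, the two summary statistics quoted above, can be computed from paired binary calls as in the sketch below (the array names are assumptions).

```python
import numpy as np

def sensitivity_and_kappa(reference, test):
    """Sensitivity and unweighted Cohen's kappa of a binary test against a
    binary reference (both given as 0/1 arrays of equal length)."""
    reference = np.asarray(reference)
    test = np.asarray(test)
    tp = np.sum((reference == 1) & (test == 1))
    fn = np.sum((reference == 1) & (test == 0))
    sensitivity = tp / (tp + fn)
    observed_agreement = np.mean(reference == test)
    expected_agreement = (np.mean(reference) * np.mean(test)
                          + (1 - np.mean(reference)) * (1 - np.mean(test)))
    kappa = (observed_agreement - expected_agreement) / (1 - expected_agreement)
    return sensitivity, kappa
```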

  6. Fully automated intrinsic respiratory and cardiac gating for small animal CT

    Energy Technology Data Exchange (ETDEWEB)

    Kuntz, J; Baeuerle, T; Semmler, W; Bartling, S H [Department of Medical Physics in Radiology, German Cancer Research Center, Heidelberg (Germany); Dinkel, J [Department of Radiology, German Cancer Research Center, Heidelberg (Germany); Zwick, S [Department of Diagnostic Radiology, Medical Physics, Freiburg University (Germany); Grasruck, M [Siemens Healthcare, Forchheim (Germany); Kiessling, F [Chair of Experimental Molecular Imaging, RWTH-Aachen University, Medical Faculty, Aachen (Germany); Gupta, R [Department of Radiology, Massachusetts General Hospital, Boston, MA (United States)], E-mail: j.kuntz@dkfz.de

    2010-04-07

    A fully automated, intrinsic gating algorithm for small animal cone-beam CT is described and evaluated. A parameter representing organ motion, derived from the raw projection images, is used for both cardiac and respiratory gating. The proposed algorithm makes it possible to reconstruct motion-corrected still images as well as to generate four-dimensional (4D) datasets representing the cardiac and pulmonary anatomy of free-breathing animals without the use of electrocardiogram (ECG) or respiratory sensors. Variation analysis of projections from several rotations is used to place a region of interest (ROI) on the diaphragm. The ROI is cranially extended to include the heart. The centre-of-mass (COM) variation within this ROI, the filtered frequency response and the local maxima are used to derive a binary motion-gating parameter for phase-sensitive gated reconstruction. The algorithm was implemented on a flat-panel-based cone-beam CT scanner and evaluated using a moving phantom and animal scans (seven rats and eight mice). Volumes were determined using a semiautomatic segmentation. In all cases robust gating signals could be obtained. The maximum volume error in the phantom studies was less than 6%. Compared with extrinsic gating via externally placed cardiac and respiratory sensors (the current gold standard), the functional parameters (e.g. cardiac ejection fraction) and image quality were equivalent. This algorithm obviates the need for both gating hardware and user interaction. The simplicity of the proposed algorithm enables adoption in a wide range of small animal cone-beam CT scanners.
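
    A much-simplified sketch of the intrinsic-gating idea (centre-of-mass variation inside a diaphragm ROI turned into a binary gate). The drift-removal window and threshold are assumptions, and the published algorithm additionally uses a filtered frequency response and local-maxima detection that are not reproduced here.

```python
import numpy as np

def gating_signal(projections, row_slice, col_slice, drift_window=31):
    """Very simplified intrinsic gating from raw projections.

    projections: array of shape (n_proj, rows, cols).
    row_slice, col_slice: slices defining the diaphragm/heart ROI.
    Returns a boolean gate per projection based on the craniocaudal
    centre of mass inside the ROI, after removing slow drift.
    """
    rows = np.arange(projections.shape[1])[row_slice]
    com = []
    for p in projections:
        patch = p[row_slice, col_slice]
        weights = patch.sum(axis=1) + 1e-12       # row-wise intensity weights
        com.append(np.sum(rows * weights) / weights.sum())
    com = np.asarray(com)
    drift = np.convolve(com, np.ones(drift_window) / drift_window, mode="same")
    motion = com - drift                           # respiratory/cardiac component
    return motion > 0.5 * motion.std()             # crude binary gating parameter
```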

  7. A CAD of fully automated colonic polyp detection for contrasted and non-contrasted CT scans.

    Science.gov (United States)

    Tulum, Gökalp; Bolat, Bülent; Osman, Onur

    2017-04-01

    Computer-aided detection (CAD) systems are developed to help radiologists detect colonic polyps on CT scans. Using CAD systems, it is possible to reduce detection time and increase detection accuracy. In this paper, we aimed to develop a fully integrated CAD system for automated detection of polyps that yields a high polyp detection rate with a reasonable number of false positives. The proposed CAD system is a multistage implementation whose main components are: automatic colon segmentation, candidate detection, feature extraction and classification. The first element of the algorithm includes a discrete segmentation for both air and fluid regions. Colon-air regions were determined based on adaptive thresholding, and the volume/length measure was used to detect air regions. To extract the colon-fluid regions, a rule-based connectivity test was used to detect the regions belonging to the colon. Potential polyp candidates were detected based on the 3D Laplacian of Gaussian filter. Geometrical features were used to reduce false-positive detections. A 2D projection image was generated to extract discriminative features as the inputs of an artificial neural network classifier. Our CAD system performs at 100% sensitivity for polyps larger than 9 mm, 95.83% sensitivity for polyps of 6-10 mm and 85.71% sensitivity for polyps smaller than 6 mm, with 5.3 false positives per dataset. Also, clinically relevant polyps (≥6 mm) were identified with 96.67% sensitivity at 1.12 FP/dataset. To the best of our knowledge, the novel polyp candidate detection system, which determines polyp candidates with LoG filters, is one of the main contributions. We also propose a new 2D projection image calculation scheme to determine the distinctive features. We believe that our CAD system is highly effective for assisting radiologists in interpreting CT.
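
    As a rough sketch of the candidate-detection step (3D Laplacian-of-Gaussian responses restricted to the segmented colon), the snippet below uses scipy.ndimage; the scale, neighbourhood size, and candidate count are assumptions, and the geometric filtering, 2D projection features, and neural-network classification stages of the published system are not shown.

```python
import numpy as np
from scipy import ndimage

def polyp_candidates(colon_volume, colon_mask, sigma=2.0, top_k=200):
    """Candidate polyp locations as strong local maxima of a 3D
    Laplacian-of-Gaussian response restricted to a boolean colon mask."""
    log = -ndimage.gaussian_laplace(colon_volume.astype(np.float32), sigma=sigma)
    log[~colon_mask] = -np.inf
    # Local maxima: a voxel equals the maximum of its 3x3x3 neighbourhood.
    local_max = (log == ndimage.maximum_filter(log, size=3)) & colon_mask
    coords = np.argwhere(local_max)
    scores = log[local_max]
    order = np.argsort(scores)[::-1][:top_k]
    return coords[order], scores[order]
```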

  8. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection-geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of the laser-ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot-controlled pipette. The sampling spot size with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 µm x 160 µm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied to the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent-resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent-resistant surfaces.

  9. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Science.gov (United States)

    Huang, Yu; Parra, Lucas C

    2015-01-01

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.

  10. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Directory of Open Access Journals (Sweden)

    Yu Huang

    Full Text Available Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.

  11. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method

    Science.gov (United States)

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.

    2017-08-01

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate dose distributions comparable to clinical plans. Overall, automated plans achieved an average of 0.6% higher dose for the target coverage evaluation criteria and 2.4% lower dose at the organ-at-risk criteria levels evaluated compared with clinical plans. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical plans across the 12 patients tested, and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose for one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment planning.
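
    A toy version of the dose-mimicking step described above: given a predicted per-voxel dose and a dose influence matrix, non-negative beamlet weights are fitted by projected gradient descent on a weighted least-squares objective. This is only a conceptual sketch under stated assumptions; the clinical system uses a collapsed cone convolution dose engine and a far more sophisticated optimizer, and the step size here would need tuning.

```python
import numpy as np

def mimic_dose(predicted_dose, influence, weights, iters=200, lr=1e-3):
    """Fit non-negative beamlet intensities x so that the delivered dose
    D = influence @ x matches the predicted per-voxel dose in a weighted
    least-squares sense.

    influence: (n_voxels, n_beamlets) dose influence matrix.
    weights:   (n_voxels,) importance weights (targets vs. organs at risk).
    """
    x = np.zeros(influence.shape[1])
    for _ in range(iters):
        residual = influence @ x - predicted_dose
        grad = influence.T @ (weights * residual)
        x = np.maximum(0.0, x - lr * grad)   # projected gradient step (x >= 0)
    return x
```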

  12. “Smart” RCTs: Development of a Smartphone App for Fully Automated Nutrition-Labeling Intervention Trials

    OpenAIRE

    Volkova, Ekaterina; Li, Nicole; Dunford, Elizabeth; Eyles, Helen; Crino, Michelle; Michie, Jo; Ni Mhurchu, Cliona

    2016-01-01

    Background There is substantial interest in the effects of nutrition labels on consumer food-purchasing behavior. However, conducting randomized controlled trials on the impact of nutrition labels in the real world presents a significant challenge. Objective The Food Label Trial (FLT) smartphone app was developed to enable conducting fully automated trials, delivering intervention remotely, and collecting individual-level data on food purchases for two nutrition-labeling randomized controlled...

  13. Fully automated subchondral bone segmentation from knee MR images: Data from the Osteoarthritis Initiative.

    Science.gov (United States)

    Gandhamal, Akash; Talbar, Sanjay; Gajre, Suhas; Razak, Ruslan; Hani, Ahmad Fadzil M; Kumar, Dileep

    2017-09-01

    Knee osteoarthritis (OA) progression can be monitored by measuring changes in the subchondral bone structure, such as area and shape, from MR images as an imaging biomarker. However, measurement of these minute changes is highly dependent on accurate segmentation of bone tissue from MR images, which is a challenging task due to the complex tissue structure and inadequate image contrast/brightness. In this paper, a fully automated method for segmenting subchondral bone from knee MR images is proposed. Here, the contrast of knee MR images is enhanced using a gray-level S-curve transformation, followed by automatic seed point detection using a three-dimensional multi-edge overlapping technique. Successively, bone regions are initially extracted using distance-regularized level-set evolution, followed by identification and correction of leakages along the bone boundary regions using a boundary displacement technique. The performance of the developed technique is evaluated against ground truths by measuring sensitivity, specificity, dice similarity coefficient (DSC), average surface distance (AvgD) and root mean square surface distance (RMSD). An average sensitivity (91.14%), specificity (99.12%) and DSC (90.28%) with 95% confidence intervals (CI) in the ranges 89.74-92.54%, 98.93-99.31% and 88.68-91.88%, respectively, are achieved for femur bone segmentation in 8 datasets. For the tibia bone, average sensitivity (90.69%), specificity (99.65%) and DSC (91.35%) with 95% CI in the ranges 88.59-92.79%, 99.50-99.80% and 88.68-91.88%, respectively, are achieved. AvgD and RMSD values for the femur are 1.43 ± 0.23 mm and 2.10 ± 0.35 mm respectively, while for the tibia the values are 0.95 ± 0.28 mm and 1.30 ± 0.42 mm respectively, demonstrating acceptable error between the proposed method and the ground truths. In conclusion, the results obtained in this work demonstrate substantial performance with consistency and robustness that led the proposed method to be
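
    Two of the ingredients mentioned above, the gray-level S-curve contrast enhancement and the Dice similarity coefficient used for evaluation, can be written compactly as below; the sigmoid gain and midpoint are assumptions, not the paper's parameters.

```python
import numpy as np

def s_curve_enhance(image, alpha=10.0, midpoint=0.5):
    """Gray-level S-curve (sigmoid) contrast stretch on an image rescaled to [0, 1]."""
    img = image.astype(np.float64)
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)
    return 1.0 / (1.0 + np.exp(-alpha * (img - midpoint)))

def dice_similarity(segmentation, ground_truth):
    """Dice similarity coefficient (DSC) between two binary masks."""
    seg, truth = segmentation.astype(bool), ground_truth.astype(bool)
    return 2.0 * np.logical_and(seg, truth).sum() / (seg.sum() + truth.sum())
```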

  14. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    Science.gov (United States)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology or biological function and on human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, i.e. a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions.

  15. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    Science.gov (United States)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. Each step in the measurement process (lysis, nucleic acid extraction, purification, and hybridization to an array) is assessed through comparison of the results obtained using the instrument with

  16. Analysis of Open Automated Demand Response Deployments in California and Guidelines to Transition to Industry Standards

    Energy Technology Data Exchange (ETDEWEB)

    Ghatikar, Girish; Riess, David; Piette, Mary Ann

    2014-01-02

    This report reviews the Open Automated Demand Response (OpenADR) deployments within the territories served by California's investor-owned utilities (IOUs) and the transition from the OpenADR 1.0 specification to the formal standard, OpenADR 2.0. As demand response service providers and customers start adopting OpenADR 2.0, it is necessary to ensure that the existing Automated Demand Response (AutoDR) infrastructure investment continues to be useful and takes advantage of the formal standard and its many benefits. This study focused on OpenADR deployments and systems used by the California IOUs and included a summary of the OpenADR deployment from the U.S. Department of Energy-funded demonstration conducted by the Sacramento Municipal Utility District (SMUD). Lawrence Berkeley National Laboratory collected and analyzed data about OpenADR 1.0 deployments, categorized architectures, developed a data model mapping to understand the technical compatibility of each version, and compared the capabilities and features of the two specifications. The findings, for the first time, provided evidence of the total enabled load shed and the average first cost of system enablement in the IOU and SMUD service territories. The OpenADR 2.0a profile specification semantically supports AutoDR system architectures and data propagation, with a testing and certification program that promotes interoperability and scaled deployments by multiple vendors, and provides additional features that support future services.

  17. Results from the first fully automated PBS-mask process and pelliclization

    Science.gov (United States)

    Oelmann, Andreas B.; Unger, Gerd M.

    1994-02-01

    Automation is widely discussed in IC and mask manufacturing and is partially realized everywhere. The idea for this automation goes back to 1978, when it became clear that the operators of the then newly installed PBS process line (the first in Europe) would have to be trained to behave like robots in order to reduce particle counts and achieve lower defect densities on the masks. More than this goal has been achieved. It also turned out recently that the automation, with its dedicated work routes and detailed documentation of every lot (individual mask or reticle), made it easy to obtain the CEEC certificate, which includes ISO 9001.

  18. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    Energy Technology Data Exchange (ETDEWEB)

    Della Gala, Giuseppe [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Lanconelli, Nico [Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Petit, Steven F. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Massachusetts General Hospital - Harvard Medical School, Department of Radiation Oncology, Boston, MA (United States)

    2017-05-15

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated, clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to manually generated VMAT plans created in the absence of time pressure. For 16 patients with reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organ-at-risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significant improved PTV coverage (V95% increased by 1.1% ± 1.1%), higher dose conformity (R50 reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, and 3.6 Gy ± 2.8 Gy, respectively, all p < 0.001). To render the six remaining autoVMAT plans clinically acceptable, a dosimetrist needed less than 10 min of hands-on time for fine-tuning. AutoVMAT plans were also considered equivalent or superior to manually optimized VMAT plans. For 6/16 patients, autoVMAT allowed tumor dose escalation of 5-10 Gy. Clinically deliverable, high-quality autoVMAT plans can be generated fully automatically for the vast majority of advanced-stage NSCLC patients. For a subset of patients, autoVMAT allowed for tumor dose escalation. (orig.) [Translated from German] Development of a fully automated, multicriteria-based volumetric modulated arc therapy (VMAT) treatment planning procedure (autoVMAT) for curatively treated patients with stage III/IV non-small cell lung cancer (NSCLC). After configuration of our auto
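
    For orientation, the two plan-quality numbers quoted above (PTV V95% coverage and the R50 conformity index) can be computed from a 3D dose grid and binary masks as in the sketch below; the exact clinical definitions, grids, and masks are assumptions for illustration.

```python
import numpy as np

def coverage_and_conformity(dose, ptv_mask, body_mask, prescription, voxel_volume):
    """V95% of the PTV and an R50-style conformity index from a 3D dose grid.

    V95%: fraction of PTV voxels receiving at least 95% of the prescription.
    R50 : volume receiving at least 50% of the prescription divided by the
          PTV volume (lower values indicate a steeper dose fall-off).
    """
    v95 = float(np.mean(dose[ptv_mask] >= 0.95 * prescription))
    volume_50 = np.sum((dose >= 0.5 * prescription) & body_mask) * voxel_volume
    ptv_volume = np.sum(ptv_mask) * voxel_volume
    return v95, volume_50 / ptv_volume
```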

  19. Fully automated atlas-based hippocampal volumetry for detection of Alzheimer's disease in a memory clinic setting.

    Science.gov (United States)

    Suppa, Per; Anker, Ulrich; Spies, Lothar; Bopp, Irene; Rüegger-Frey, Brigitte; Klaghofer, Richard; Gocke, Carola; Hampel, Harald; Beck, Sacha; Buchert, Ralph

    2015-01-01

    Hippocampal volume is a promising biomarker to enhance the accuracy of the diagnosis of dementia due to Alzheimer's disease (AD). However, whereas hippocampal volume is well studied in patient samples from clinical trials, its value in clinical routine patient care is still rather unclear. The aim of the present study, therefore, was to evaluate fully automated atlas-based hippocampal volumetry for detection of AD in the setting of a secondary care expert memory clinic for outpatients. One-hundred consecutive patients with memory complaints were clinically evaluated and categorized into three diagnostic groups: AD, intermediate AD, and non-AD. A software tool based on open source software (Statistical Parametric Mapping SPM8) was employed for fully automated tissue segmentation and stereotactical normalization of high-resolution three-dimensional T1-weighted magnetic resonance images. Predefined standard masks were used for computation of grey matter volume of the left and right hippocampus which then was scaled to the patient's total grey matter volume. The right hippocampal volume provided an area under the receiver operating characteristic curve of 84% for detection of AD patients in the whole sample. This indicates that fully automated MR-based hippocampal volumetry fulfills the requirements for a relevant core feasible biomarker for detection of AD in everyday patient care in a secondary care memory clinic for outpatients. The software used in the present study has been made freely available as an SPM8 toolbox. It is robust and fast so that it is easily integrated into routine workflow.
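
    The reported diagnostic performance corresponds to the area under the ROC curve for a single scalar marker (scaled hippocampal grey-matter volume); a small rank-based (Mann-Whitney) sketch for computing such an AUC is given below, with the array names as assumptions.

```python
import numpy as np

def roc_auc(scores_ad, scores_non_ad):
    """Area under the ROC curve for separating AD from non-AD subjects with a
    scalar marker, oriented so that higher scores indicate AD (for hippocampal
    volume, the sign would be flipped before calling this function)."""
    ad = np.asarray(scores_ad, dtype=float)
    non = np.asarray(scores_non_ad, dtype=float)
    # Mann-Whitney U statistic normalised by the number of subject pairs.
    greater = (ad[:, None] > non[None, :]).sum()
    ties = (ad[:, None] == non[None, :]).sum()
    return (greater + 0.5 * ties) / (len(ad) * len(non))
```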

  20. Can fully automated detection of corticospinal tract damage be used in stroke patients?

    OpenAIRE

    Leff, Alexander P.; Seghier, Mohamed L.; Kou, Nancy; Park, Chang-hyun; Ward, Nick S.

    2013-01-01

    We compared manual infarct definition, which is time-consuming and open to bias, with an automated abnormal tissue detection method in measuring corticospinal tract-infarct overlap volumes in chronic stroke patients to help predict motor outcome.

  1. Automated drop-on-demand system with real-time gravimetric control for precise dosage formulation.

    Science.gov (United States)

    Sahay, A; Brown, M; Muzzio, F; Takhistov, Paul

    2013-04-01

    Many of the therapies for personalized medicine have few dosage options, and the successful translation of these therapies to the clinic depends significantly on the drug/formulation delivery platform. We have developed a lab-scale integrated system for microdosing of drug formulations with high accuracy and precision that is capable of feedback control. The designed modular drug dispensing system includes a microdispensing valve unit and is fully automated with a LabVIEW-controlled computer interface. The system is capable of dispensing drug droplets with volumes ranging from nanoliters to microliters with high accuracy (low relative standard deviation) under real-time gravimetric control.
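
    A purely hypothetical sketch of the feedback idea (dispense, weigh, adjust, repeat). The hardware callbacks `read_mass_mg` and `fire_valve` are stand-ins, not part of the described LabVIEW system, and the adaptation rule and limits are invented for illustration.

```python
def dispense_to_target(read_mass_mg, fire_valve, target_mg,
                       pulse_ms=5.0, tol_mg=0.05, max_steps=500):
    """Dispense-weigh-adjust loop with hypothetical hardware callbacks:
    read_mass_mg() returns the cumulative dispensed mass in mg,
    fire_valve(pulse_ms) opens the microdispensing valve for pulse_ms."""
    dispensed = read_mass_mg()
    for _ in range(max_steps):
        remaining = target_mg - dispensed
        if remaining <= tol_mg:
            break
        fire_valve(pulse_ms)                  # dispense one droplet burst
        new_mass = read_mass_mg()
        delta = new_mass - dispensed          # mass added by the last burst
        if delta > 0:
            # Gently scale the pulse width towards the remaining mass.
            ratio = (target_mg - new_mass) / delta
            pulse_ms = min(50.0, max(1.0, pulse_ms * min(2.0, max(0.5, ratio))))
        dispensed = new_mass
    return dispensed
```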

  2. Application of existing technology to meet increasing demands for automated sample handling.

    Science.gov (United States)

    Chow, A T; Kegelman, J E; Kohli, C; McCabe, D D; Moore, J F

    1990-09-01

    As the clinical laboratory advances toward total automation, the marketplace is now demanding more-efficient sample-handling systems. These demands have arisen over a relatively short period of time, in part because of heightened concern over laboratory safety and the resulting manpower shortages. Adding sample-handling capabilities to existing instrumentation is often a challenge, because usually mechanical or system constraints are present that interfere. This challenge has been overcome in the DuPont Sample Management System (SMS), a second-generation general chemistry analyzer that incorporates the latest barcode and computer-interfacing technology. The development of the SMS system relies heavily on recent advances in technology, e.g., software modeling and computer-aided design. The SMS system includes a barcode scanner based on "charge-coupled device" technology, a random-access sample wheel, and new software that oversees the various functions.

  3. A fully automated method for quantifying and localizing white matter hyperintensities on MR images.

    Science.gov (United States)

    Wu, Minjie; Rosano, Caterina; Butters, Meryl; Whyte, Ellen; Nable, Megan; Crooks, Ryan; Meltzer, Carolyn C; Reynolds, Charles F; Aizenstein, Howard J

    2006-12-01

    White matter hyperintensities (WMH), commonly found on T2-weighted FLAIR brain MR images in the elderly, are associated with a number of neuropsychiatric disorders, including vascular dementia, Alzheimer's disease, and late-life depression. Previous MRI studies of WMHs have primarily relied on subjective and global (i.e., full-brain) ratings of WMH grade. In the current study we implement and validate an automated method for quantifying and localizing WMHs. We adapt a fuzzy-connected algorithm to automate the segmentation of WMHs and use a demons-based image registration to automate the anatomic localization of the WMHs using the Johns Hopkins University White Matter Atlas. The method is validated using brain MR images acquired from eleven elderly subjects with late-onset late-life depression (LLD) and eight elderly controls. This dataset was chosen because LLD subjects are known to have significant WMH burden. The volumes of WMH identified by our automated method are compared with the accepted gold standard (manual ratings). A significant correlation between the automated method and the manual ratings is found. Consistent with a previous study of WMHs in late-life depression [Progress in Neuro-Psychopharmacology and Biological Psychiatry, 27(3), 539-544], we found a significantly greater WMH burden in the LLD subjects versus the controls for both the manual and the automated method. The effect size was greater for the automated method, suggesting that it is a more specific measure. Additionally, we describe the anatomic localization of the WMHs in LLD subjects as well as in the control subjects, and detect the regions of interest (ROIs) specific for the WMH burden of LLD patients. Given the emergence of large NeuroImage databases, techniques such as that described here will allow for a better understanding of the relationship between WMHs and neuropsychiatric disorders.

  4. Opportunities for Energy Efficiency and Open Automated Demand Response in Wastewater Treatment Facilities in California -- Phase I Report

    Energy Technology Data Exchange (ETDEWEB)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Song, Katherine; Piette, Mary Ann

    2009-04-01

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and automated demand response opportunities for wastewater treatment facilities in California. The report describes the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and energy efficiency and automated demand response opportunities. In addition, several energy efficiency and load management case studies are provided for wastewater treatment facilities. This study shows that wastewater treatment facilities can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for automated demand response at little additional cost. These improved controls may prepare facilities to be more receptive to open automated demand response due to both increased confidence in the opportunities for controlling energy cost/use and access to real-time data.

  5. Automated Demand Response Technology Demonstration Project for Small and Medium Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Page, Janie; Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann; Chiu, Albert K.; Kellow, Bashar; Koch, Ed; Lipkin, Paul

    2011-07-01

    Small and medium commercial customers in California make up about 20-25% of the state's electric peak load. With the roll-out of smart meters to this customer group, which enable granular measurement of electricity consumption, the investor-owned utilities will offer dynamic prices as default tariffs by the end of 2011. Pacific Gas and Electric Company, which successfully deployed Automated Demand Response (AutoDR) programs to its large commercial and industrial customers, started investigating the application of the same infrastructure to small and medium commercial customers. This project aims to identify available technologies suitable for automating demand response in small and medium commercial buildings, to validate the extent to which those technologies do what they claim to do, and to determine the extent to which customers find the technology useful for DR purposes. Ten sites, enabled by eight vendors, participated in at least four test AutoDR events per site in the summer of 2010. The results showed that existing technology can reliably receive OpenADR signals and translate them into pre-programmed response strategies; however, larger load sheds than those reported here would likely be achievable with a better understanding of the building systems and with DR response strategies carefully designed and optimized for each site.

  6. Effects of Granular Control on Customers’ Perspective and Behavior with Automated Demand Response Systems

    Energy Technology Data Exchange (ETDEWEB)

    Schetrit, Oren; Kim, Joyce; Yin, Rongxin; Kiliccote, Sila

    2014-08-01

    Automated demand response (Auto-DR) is expected to close the loop between buildings and the grid by providing machine-to-machine communications to curtail loads without the need for human intervention. Hence, it can offer more reliable and repeatable demand response results to the grid than the manual approach and make demand response participation a hassle-free experience for customers. However, many building operators misunderstand Auto-DR and are afraid of losing control over their building operation. To ease the transition from manual to Auto-DR, we designed and implemented granular control of Auto-DR systems so that building operators could modify or opt out of individual load-shed strategies whenever they wanted. This paper reports the research findings from this effort demonstrated through a field study in large commercial buildings located in New York City. We focused on (1) understanding how providing granular control affects building operators’ perspective on Auto-DR, and (2) evaluating the usefulness of granular control by examining their interaction with the Auto-DR user interface during test events. Through trend log analysis, interviews, and surveys, we found that: (1) the opt-out capability during Auto-DR events can remove the feeling of being forced into load curtailments and increase their willingness to adopt Auto-DR; (2) being able to modify individual load-shed strategies allows flexible Auto-DR participation that meets the building’s changing operational requirements; (3) a clear display of automation strategies helps building operators easily identify how Auto-DR is functioning and can build trust in Auto-DR systems.

  7. Automated Demand Response: The Missing Link in the Electricity Value Chain

    Energy Technology Data Exchange (ETDEWEB)

    McKane, Aimee; Rhyne, Ivin; Piette, Mary Ann; Thompson, Lisa; Lekov, Alex

    2008-08-01

    In 2006, the Public Interest Energy Research Program (PIER) Demand Response Research Center (DRRC) at Lawrence Berkeley National Laboratory initiated research into Automated Demand Response (OpenADR) applications in California industry. The goal is to improve electric grid reliability and lower electricity use during periods of peak demand. The purpose of this research is to begin to define the relationship among a portfolio of actions that industrial facilities can undertake relative to their electricity use. This 'electricity value chain' defines energy management and demand response (DR) at six levels of service, distinguished by the magnitude, type, and rapidity of response. One element in the electricity supply chain is OpenADR, an open-standards-based communications system to send signals to customers that allow them to manage their electric demand in response to supply conditions, such as prices or reliability, through a set of standard, open communications. Initial DRRC research suggests that industrial facilities that have undertaken energy efficiency measures are probably more, not less, likely to initiate other actions within this value chain, such as daily load management and demand response. Moreover, OpenADR appears to afford some facilities the opportunity to develop the supporting control structure and to 'demo' potential reductions in energy use that can later be applied to either more effective load management or a permanent reduction in use via energy efficiency. Under the right conditions, some types of industrial facilities can shift or shed loads, with no or minimal disruption to operations, to protect their energy supply reliability and to take advantage of financial incentives. In 2007 and 2008, 35 industrial facilities agreed to implement OpenADR, representing a total capacity of nearly 40 MW. This paper describes how integrated or centralized demand management and system-level network controls are linked to OpenADR systems.

  8. Automated Demand Response: The Missing Link in the Electricity Value Chain

    Energy Technology Data Exchange (ETDEWEB)

    McKane, Aimee; Rhyne, Ivin; Lekov, Alex; Thompson, Lisa; Piette, MaryAnn

    2009-08-01

    In 2006, the Public Interest Energy Research Program (PIER) Demand Response Research Center (DRRC) at Lawrence Berkeley National Laboratory initiated research into Automated Demand Response (OpenADR) applications in California industry. The goal is to improve electric grid reliability and lower electricity use during periods of peak demand. The purpose of this research is to begin to define the relationship among a portfolio of actions that industrial facilities can undertake relative to their electricity use. This "electricity value chain" defines energy management and demand response (DR) at six levels of service, distinguished by the magnitude, type, and rapidity of response. One element in the electricity supply chain is OpenADR, an open-standards-based communications system to send signals to customers that allow them to manage their electric demand in response to supply conditions, such as prices or reliability, through a set of standard, open communications. Initial DRRC research suggests that industrial facilities that have undertaken energy efficiency measures are probably more, not less, likely to initiate other actions within this value chain, such as daily load management and demand response. Moreover, OpenADR appears to afford some facilities the opportunity to develop the supporting control structure and to "demo" potential reductions in energy use that can later be applied to either more effective load management or a permanent reduction in use via energy efficiency. Under the right conditions, some types of industrial facilities can shift or shed loads, with no or minimal disruption to operations, to protect their energy supply reliability and to take advantage of financial incentives. In 2007 and 2008, 35 industrial facilities agreed to implement OpenADR, representing a total capacity of nearly 40 MW. This paper describes how integrated or centralized demand management and system-level network controls are linked to OpenADR systems. Case studies

  9. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for prevention and control of such diseases. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor system installed in bathtub, toilet and bed, respectively.

  10. Comparison of subjective and fully automated methods for measuring mammographic density.

    Science.gov (United States)

    Moshina, Nataliia; Roman, Marta; Sebuødegård, Sofie; Waade, Gunvor G; Ursin, Giske; Hofvind, Solveig

    2017-01-01

    Background: Breast radiologists in the Norwegian Breast Cancer Screening Program classified mammographic density subjectively on a three-point scale between 1996 and 2012 and switched to the fourth edition of the BI-RADS classification in 2013. In 2015, automated volumetric breast density assessment software was installed at two screening units. Purpose: To compare volumetric breast density measurements from the automated method with two subjective methods: the three-point scale and the BI-RADS density classification. Material and Methods: Information on subjective and automated density assessment was obtained from screening examinations of 3635 women recalled for further assessment due to positive screening mammography between 2007 and 2015. The score of the three-point scale (I = fatty; II = medium dense; III = dense) was available for 2310 women. The BI-RADS density score was provided for 1325 women. Mean volumetric breast density was estimated for each category of the subjective classifications. The automated software assigned volumetric breast density to four categories. The agreement between BI-RADS and volumetric breast density categories was assessed using weighted kappa (kw). Results: Mean volumetric breast density was 4.5%, 7.5%, and 13.4% for categories I, II, and III of the three-point scale, and 4.4%, 7.5%, 9.9%, and 13.9% for the four BI-RADS density categories, respectively. The agreement between BI-RADS and volumetric breast density categories was kw = 0.5 (95% CI = 0.47-0.53). Conclusion: Mean volumetric breast density increased with increasing density category of the subjective classifications, and the agreement between BI-RADS and volumetric breast density categories was moderate.
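
    For readers unfamiliar with the agreement statistic used here, weighted kappa can be computed directly from paired category assignments; the two raters' scores below are invented example data, not the study's measurements.

    ```python
    # Toy illustration of the weighted kappa (kw) agreement statistic; the
    # category assignments are made-up example data.
    from sklearn.metrics import cohen_kappa_score

    birads     = [1, 2, 2, 3, 4, 1, 3, 2, 4, 3]  # hypothetical BI-RADS categories
    volumetric = [1, 2, 3, 3, 4, 1, 2, 2, 4, 4]  # hypothetical automated categories

    kw = cohen_kappa_score(birads, volumetric, weights="linear")
    print(f"weighted kappa = {kw:.2f}")
    ```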

  11. Fully Automated Prostate Magnetic Resonance Imaging and Transrectal Ultrasound Fusion via a Probabilistic Registration Metric

    OpenAIRE

    Sparks, Rachel; Bloch, B. Nicolas; Feleppa, Ernest; Barratt, Dean; Madabhushi, Anant

    2013-01-01

    In this work, we present a novel, automated, registration method to fuse magnetic resonance imaging (MRI) and transrectal ultrasound (TRUS) images of the prostate. Our methodology consists of: (1) delineating the prostate on MRI, (2) building a probabilistic model of prostate location on TRUS, and (3) aligning the MRI prostate segmentation to the TRUS probabilistic model. TRUS-guided needle biopsy is the current gold standard for prostate cancer (CaP) diagnosis. Up to 40% of CaP lesions appea...

  12. Fully automated radiocarbon AMS measurements with elemental analyser and gas ion source

    Energy Technology Data Exchange (ETDEWEB)

    Ruff, Matthias; Gaeggeler, Heinz [University of Berne (Switzerland)]|[Paul Scherrer Institute (Switzerland); Suter, Martin [ETH Zurich (Switzerland)]|[PSI/ETH Zurich (Switzerland); Synal, Hans-Arno [PSI/ETH Zurich (Switzerland); Szidat, Soenke [University of Berne (Switzerland); Lukas, Wacker [ETH Zurich (Switzerland)

    2008-07-01

    The MICADAS gas ion source in Zurich for measuring radiocarbon in small samples in the range of 2-50 μg of carbon has now been running routinely in semi-automated mode for more than one and a half years. So far, the carbon dioxide to be measured is supplied in glass ampoules and released in an ampoule cracker. The gas is flushed into a syringe with helium and transported onto the surface of a titanium gas target in the Cs sputter ion source. The syringe thereby acts as an adjustable volume matched to the sample size and can also be moved by a stepping motor to keep a constant flow into the source. For full automation of this system, an elemental analyser has been connected for combustion of the sample and separation of the combustion gases. The isolated carbon dioxide leaves the elemental analyser in a helium stream of about 80 ml/min and first has to be concentrated on a small trap before being fed into the syringe. Some technical solutions and first results of this automated online system are discussed.

  13. Towards Automated Lecture Capture, Navigation and Delivery System for Web-Lecture on Demand

    CERN Document Server

    Kannan, Rajkumar

    2010-01-01

    Institutions all over the world are continuously exploring ways to use ICT to improve teaching and learning effectiveness. The use of course web pages, discussion groups, bulletin boards, and e-mail has shown considerable impact on teaching and learning in significant ways, across all disciplines. E-learning has emerged as an alternative to traditional classroom-based education and training, and web lectures can be a powerful addition to traditional lectures. They can even serve as a main content source for learning, provided users can quickly navigate and locate relevant pages in a web lecture. A web lecture consists of video and audio of the presenter and slides complemented with screen capturing. In this paper, an automated approach for recording live lectures and for browsing available web lectures for on-demand applications by end users is presented.

  14. Integrating Electrochemical Detection with Centrifugal Microfluidics for Real-Time and Fully Automated Sample Testing

    DEFF Research Database (Denmark)

    Andreasen, Sune Zoëga; Kwasny, Dorota; Amato, Letizia

    2015-01-01

    experiments, even when the microfluidic disc is spinning at high velocities. Automated sample handling is achieved by designing a microfluidic system to release analyte sequentially, utilizing on-disc passive valving. In addition, the microfluidic system is designed to trap and keep the liquid sample...... stationary during analysis. In this way it is possible to perform cyclic voltammetry (CV) measurements at varying spin speeds, without altering the electrochemical response. This greatly simplifies the interpretation and quantification of data. Finally, real-time and continuous monitoring of an entire...

  15. Fully automated synthesis of [(18) F]fluoro-dihydrotestosterone ([(18) F]FDHT) using the FlexLab module.

    Science.gov (United States)

    Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M

    2016-08-01

    Imaging of androgen receptor expression in prostate cancer using F-18 FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [(18) F] FDHT is important. We have fully automated the synthesis of F-18 FDHT on the iPhase FlexLab module using only commercially available components. Total synthesis time was 90 min, and radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99% and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, thus making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has now been validated at Austin Health and is currently used for [(18) F]FDHT studies in patients. We believe that this method can easily be adapted to other modules to further widen the availability of [(18) F]FDHT.
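
    For orientation, the non-decay-corrected yields quoted above can be related to decay-corrected values using the fluorine-18 half-life; the sketch below assumes, for simplicity, that all decay losses accrue over the stated 90 min synthesis time.

    ```python
    # Relating non-decay-corrected to decay-corrected radiochemical yields,
    # assuming all decay occurs over the 90 min synthesis time (simplification).
    T_HALF_F18 = 109.77   # minutes, physical half-life of fluorine-18
    t_synthesis = 90.0    # minutes, total synthesis time from the abstract

    correction = 2 ** (t_synthesis / T_HALF_F18)   # about 1.77
    for uncorrected in (0.25, 0.33):
        print(f"{uncorrected:.0%} uncorrected -> {uncorrected * correction:.0%} decay-corrected")
    ```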

  16. Preliminary study of a new, fully automated system for liquid-based cytology: the NovaPrep® processor system.

    Science.gov (United States)

    Esquivias López-Cuervo, Javier; Montalbán Beltran, Estanislao; Cuadros Lopez, Jose Luis; Alonso Castillo, Angeles; Nieto Sanchez, Teresa

    2011-01-01

    To evaluate a fully automated system for liquid-based cytology (LBC): the NovaPrep® Processor System (NPS), which is based on the new concept of double decantation, versus conventional cytology (CC), the gold standard for cytology. We performed a preliminary comparative study involving 1,129 female patients who underwent sampling for a Pap test; the sample was first smeared for CC and then, using the remaining specimen on the brush, for LBC with the NPS. The performances of CC and NPS were evaluated for accuracy and compared using the gold standard of a combination of one of the two methods of pathological cytology with screening for positive human papilloma virus, quantification of cells (normal and pathological), and improvement in the quality of samples and reading time. The results showed improvement in sensitivity (3.81% for CC vs. 4.52% for NPS) with a specificity superior to 90% for both, a markedly decreased number of unsatisfactory specimens, notably samples containing too many inflamed cells (7.4% for CC vs. 0.5% for NPS), and a shortening of the reading time, which was three times less using NPS. This preliminary study showed a gain in sensitivity, a drop in the number of unsatisfactory specimens and a reduction in reading time with NPS. The results achieved using this fully automated LBC procedure are very promising and will hopefully reduce the overall cost of cervical cancer screening in the future. Copyright © 2011 S. Karger AG, Basel.

  17. Lab on valve-multisyringe flow injection system (LOV-MSFIA) for fully automated uranium determination in environmental samples.

    Science.gov (United States)

    Avivar, Jessica; Ferrer, Laura; Casas, Montserrat; Cerdà, Víctor

    2011-06-15

    The hyphenation of lab-on-valve (LOV) and multisyringe flow analysis (MSFIA), coupled to a long path length liquid waveguide capillary cell (LWCC), allows the spectrophotometric determination of uranium in different types of environmental sample matrices, without any manual pre-treatment, while achieving high selectivity and sensitivity. On-line separation and preconcentration of uranium is carried out by means of UTEVA resin. The LOV-MSFIA arrangement makes possible the full automation of the system through in-line regeneration of the column. After elution, uranium(VI) is spectrophotometrically detected after reaction with arsenazo-III. Determination of uranium levels in environmental samples is required for environmental monitoring. Thus, we propose a rapid, cheap and fully automated method to determine uranium(VI) in environmental samples. The limit of detection reached is 1.9 ng of uranium which, depending on the preconcentrated volume, corresponds to ppt levels (10.3 ng L(-1)). Different water sample matrices (seawater, well water, freshwater, tap water and mineral water) and a phosphogypsum sample (with natural uranium content) were satisfactorily analyzed.
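
    The absolute and concentration detection limits quoted above are linked by the volume preconcentrated on the column; the short sketch below illustrates the arithmetic, with the list of volumes being illustrative choices rather than values from the paper.

    ```python
    # Concentration detection limit vs. preconcentrated volume for a fixed
    # absolute limit of detection (1.9 ng U, from the abstract).
    LOD_MASS_NG = 1.9  # ng of uranium

    for volume_l in (0.05, 0.1, 0.184, 0.5):   # illustrative sample volumes
        print(f"{volume_l * 1000:.0f} mL preconcentrated -> LOD {LOD_MASS_NG / volume_l:.1f} ng/L")
    ```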

  18. Revisiting the Fully Automated Double-ring Infiltrometer using Open-source Electronics

    Science.gov (United States)

    Ong, J.; Werkema, D., Jr.; Lane, J. W.

    2012-12-01

    The double-ring infiltrometer (DRI) is commonly used for measuring soil hydraulic conductivity. However, constant-head DRI tests typically involve the use of Mariotte tubes, which can be problematic to set up and time-consuming to maintain and monitor during infiltration tests. Maheshwari (1996, Australian Journal of Soil Research, v. 34, p. 709-714) developed a method for eliminating Mariotte tubes for constant-head tests using a computer-controlled combination of water-level indicators and solenoids to maintain a near-constant head in the DRI. A pressure transducer mounted on a depth-to-volume calibrated tank measures the water delivery rates during the test and data are saved on a hard drive or floppy disk. Here we use an inexpensive combination of pressure transducers, a microcontroller, and open-source electronics that eliminates the need for Mariotte tubes. The system automates DRI water delivery and data recording for both constant- and falling-head infiltration tests. Water can be supplied to the DRI by a pressurized water system, a pump, or gravity feed. An LCD screen enables user interaction and observation of data for quality analysis in the field. The digital data are stored on a micro-SD card in standard column format for future retrieval and easy importing into conventional processing and plotting software. We show the results of infiltrometer tests using the automated system and a conventional Mariotte tube system conducted over test beds of uniform soils.
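
    As a rough sketch of the control-and-logging loop such an open-source-electronics infiltrometer could run, the snippet below holds a near-constant head with a solenoid and records supply-tank volume over time. The hardware wrapper functions, setpoint, deadband, and intervals are hypothetical stand-ins (simulated in software here), not the authors' firmware.

    ```python
    # Simplified constant-head control loop for an automated double-ring
    # infiltrometer. Hardware reads are simulated; a real build would talk to
    # a pressure transducer and a solenoid driver on the microcontroller.
    import random
    import time

    HEAD_SETPOINT_CM = 5.0   # assumed target ponded head in the inner ring
    DEADBAND_CM = 0.2        # assumed control deadband
    LOG_INTERVAL_S = 1       # assumed logging cadence

    _tank_cm3 = 10_000.0     # simulated supply-tank volume

    def read_head_cm() -> float:
        """Stand-in for the inner-ring pressure transducer (simulated)."""
        return HEAD_SETPOINT_CM + random.uniform(-0.4, 0.4)

    def read_supply_tank_cm3() -> float:
        """Stand-in for the depth-to-volume calibrated supply tank (simulated)."""
        global _tank_cm3
        _tank_cm3 -= random.uniform(5.0, 15.0)   # pretend water is being delivered
        return _tank_cm3

    def set_solenoid(open_valve: bool) -> None:
        """Stand-in for the solenoid driver; a real build would toggle a GPIO pin."""

    def run_constant_head_test(duration_s: float):
        log, start = [], time.monotonic()
        while time.monotonic() - start < duration_s:
            head = read_head_cm()
            # Open the valve when the head drops below the deadband, close it above.
            set_solenoid(head < HEAD_SETPOINT_CM - DEADBAND_CM)
            log.append((time.monotonic() - start, read_supply_tank_cm3()))
            time.sleep(LOG_INTERVAL_S)
        return log  # delivered volume vs. time yields the infiltration rate

    print(run_constant_head_test(3.0)[:2])
    ```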

  19. Fully automated high-performance liquid chromatographic assay for the analysis of free catecholamines in urine.

    Science.gov (United States)

    Said, R; Robinet, D; Barbier, C; Sartre, J; Huguet, C

    1990-08-24

    A totally automated and reliable high-performance liquid chromatographic method is described for the routine determination of free catecholamines (norepinephrine, epinephrine and dopamine) in urine. The catecholamines were isolated from urine samples using small alumina columns. A standard automated method for pH adjustment of urine before the extraction step has been developed. The extraction was performed on an ASPEC (Automatic Sample Preparation with Extraction Columns, Gilson). The eluate was collected in a separate tube and then automatically injected into the chromatographic column. The catecholamines were separated by reversed-phase ion-pair liquid chromatography and quantified by fluorescence detection. No manual intervention was required during the extraction and separation procedure. One sample may be run every 15 min, ca. 96 samples in 24 h. Analytical recoveries for all three catecholamines are 63-87%, and the detection limits are 0.01, 0.01, and 0.03 microM for norepinephrine, epinephrine and dopamine, respectively, which is highly satisfactory for urine. Day-to-day coefficients of variation were less than 10%.

  20. Development of a Fully Automated Flow Injection Analyzer Implementing Bioluminescent Biosensors for Water Toxicity Assessment

    Directory of Open Access Journals (Sweden)

    Constantinos Georgiou

    2010-07-01

    This paper describes the development of an automated Flow Injection analyzer for water toxicity assessment. The analyzer is validated by assessing the toxicity of heavy metal (Pb2+, Hg2+ and Cu2+) solutions. One hundred μL of a Vibrio fischeri suspension are injected into a carrier solution containing different heavy metal concentrations. Biosensor cells are mixed with the toxic carrier solution in the mixing coil on the way to the detector. The registered response is the % inhibition of biosensor bioluminescence due to heavy metal toxicity, in comparison to that resulting from injecting the Vibrio fischeri suspension into deionised water. Carrier solutions of mercury showed higher toxicity than the other heavy metals, whereas all metals showed concentration-related levels of toxicity. The biosensor's response to carrier solutions of different pHs was tested; Vibrio fischeri's bioluminescence is promoted in the pH 5–10 range. Experiments indicate that the whole cell biosensor, as applied in the automated fluidic system, responds to various toxic solutions.
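
    The response metric described above is a simple ratio against the deionised-water control; a minimal sketch with invented luminometer readings (arbitrary units) is shown below.

    ```python
    # Percent inhibition of bioluminescence relative to a deionised-water
    # control injection. The light readings are invented example values.
    def percent_inhibition(sample_light: float, control_light: float) -> float:
        return 100.0 * (1.0 - sample_light / control_light)

    control = 1200.0                      # V. fischeri injected into deionised water
    for metal, light in {"Cu2+": 900.0, "Pb2+": 650.0, "Hg2+": 150.0}.items():
        print(f"{metal}: {percent_inhibition(light, control):.1f}% inhibition")
    ```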

  1. Fully automated image-guided needle insertion: application to small animal biopsies.

    Science.gov (United States)

    Ayadi, A; Bour, G; Aprahamian, M; Bayle, B; Graebling, P; Gangloff, J; Soler, L; Egly, J M; Marescaux, J

    2007-01-01

    The study of biological process evolution in small animals requires time-consuming and expensive analyses of a large population of animals. Serial analysis of the same animal is potentially a great alternative. However, non-invasive procedures must be set up to retrieve valuable tissue samples from precisely defined areas in living animals. Taking advantage of the high resolution of in vivo molecular imaging, we defined a procedure to perform image-guided needle insertion and automated biopsy using a micro CT scanner, a robot and a vision system. Workspace limitations in the scanner require the animal to be removed and laid in front of the robot. A vision system composed of a grid projector and a camera is used to register the dedicated animal bed with respect to the robot and to automatically calibrate the needle position and orientation. Automated biopsy is then synchronised with respiration and performed with a pneumatic translation device, at high velocity, to minimize organ deformation. We have experimentally tested our biopsy system with different needles.

  2. Fully automated fluorescent in situ hybridization (FISH) staining and digital analysis of HER2 in breast cancer: a validation study.

    Directory of Open Access Journals (Sweden)

    Elise M J van der Logt

    HER2 assessment is routinely used to select patients with invasive breast cancer that might benefit from HER2-targeted therapy. The aim of this study was to validate a fully automated in situ hybridization (ISH) procedure that combines the automated Leica HER2 fluorescent ISH system for Bond with supervised automated analysis on the Visia imaging D-Sight digital imaging platform. HER2 assessment was performed on 328 formalin-fixed/paraffin-embedded invasive breast cancer tumors on tissue microarrays (TMA) and on 100 full-sized slides (50 selected IHC 2+ and 50 with random IHC scores) of resections/biopsies previously obtained for diagnostic purposes. For digital analysis, slides were pre-screened at 20x and 100x magnification for all fluorescent signals, and supervised automated scoring was performed on at least two pictures (in total at least 20 nuclei were counted) with the D-Sight HER2 FISH analysis module by two observers independently. Results were compared to data obtained previously with the manual Abbott FISH test. The overall agreement with Abbott FISH data among TMA samples and the 50 selected IHC 2+ cases was 98.8% (κ = 0.94) and 93.8% (κ = 0.88), respectively. The results of the 50 additionally tested unselected IHC cases were concordant with previously obtained IHC and/or FISH data. The combination of the Leica FISH system with the D-Sight digital imaging platform is a feasible method for HER2 assessment in routine clinical practice for patients with invasive breast cancer.

  3. Development and Demonstration of the Open Automated Demand Response Standard for the Residential Sector

    Energy Technology Data Exchange (ETDEWEB)

    Herter, Karen; Rasin, Josh; Perry, Tim

    2009-11-30

    The goal of this study was to demonstrate a demand response system that can signal nearly every customer in all sectors through the integration of two widely available and non-proprietary communications technologies--Open Automated Demand Response (OpenADR) over Internet protocol and Utility Messaging Channel (UMC) over FM radio. The outcomes of this project were as follows: (1) a software bridge to allow translation of pricing signals from OpenADR to UMC; and (2) a portable demonstration unit with an Internet-connected notebook computer, a portfolio of DR-enabling technologies, and a model home. The demonstration unit provides visitors the opportunity to send electricity-pricing information over the Internet (through OpenADR and UMC) and then watch as the model appliances and lighting respond to the signals. The integration of OpenADR and UMC completed and demonstrated in this study enables utilities to send hourly or sub-hourly electricity pricing information simultaneously to the residential, commercial and industrial sectors.

  4. AutoRoot: open-source software employing a novel image analysis approach to support fully-automated plant phenotyping.

    Science.gov (United States)

    Pound, Michael P; Fozard, Susan; Torres Torres, Mercedes; Forde, Brian G; French, Andrew P

    2017-01-01

    then be carefully manually inspected if the nature of the precise differences is required. We suggest such flexible measurement approaches are necessary for fully automated, high throughput systems such as the Microphenotron.

  5. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low cost vibrating sample magnetometer (VSM) has been constructed by using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of some ferrite samples measured by two calibrated commercial magnetometers, a Lake Shore model 7410 and an LDJ Electronics Inc. (Troy, MI) instrument. Our new lab-built VSM design proved successful and reliable. - Highlights: • A low cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using some measured ferrite samples. • Our new lab-built VSM design proved successful and reliable.

  6. Fully automated contour detection algorithm: the preliminary step for scatter and attenuation compensation in SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Younes, R.B.; Mas, J.; Bidet, R.

    1988-12-01

    Contour detection is an important step in information extraction from nuclear medicine images. In order to perform accurate quantitative studies in single photon emission computed tomography (SPECT), a new procedure is described which can rapidly derive the best-fit contour of an attenuated medium. Several authors have evaluated the influence of the detected contour on images reconstructed with various attenuation correction techniques, and most of the methods are strongly affected by inaccurately detected contours. The present approach uses the Compton window to redetermine the convex contour and seems simpler and more practical for clinical SPECT studies. The main advantages of this procedure are the high speed of computation, the accuracy of the contour found and the programme's automation. Results obtained using computer-simulated and real phantoms as well as clinical studies demonstrate the reliability of the present algorithm.

  7. Fully automated hybrid diode laser assembly using high precision active alignment

    Science.gov (United States)

    Böttger, Gunnar; Weber, Daniel; Scholz, Friedemann; Schröder, Henning; Schneider-Ramelow, Martin; Lang, Klaus-Dieter

    2016-03-01

    Fraunhofer IZM, Technische Universität Berlin and eagleyard Photonics present various implementations of current micro-optical assemblies for high quality free space laser beam forming and efficient fiber coupling. The laser modules shown are optimized for fast and automated assembly in small form factor packages via state-of-the-art active alignment machinery, using alignment and joining processes that have been developed and established in various industrial research projects. Operational wavelengths and optical powers ranging from 600 to 1600 nm and from 1 mW to several W respectively are addressed, for application in high-resolution laser spectroscopy, telecom and optical sensors, up to the optical powers needed in industrial and medical laser treatment.

  8. UFCORIN: A Fully Automated Predictor of Solar Flares in GOES X-Ray Flux

    CERN Document Server

    Muranushi, Takayuki; Muranushi, Yuko Hada; Isobe, Hiroaki; Nemoto, Shigeru; Komazaki, Kenji; Shibata, Kazunari

    2015-01-01

    We have developed UFCORIN, a platform for studying and automating space weather prediction. Using our system we have tested 6,160 different combinations of SDO/HMI data as input data, and simulated the prediction of GOES X-ray flux for 2 years (2011-2012) with one-hour cadence. We have found that direct comparison of the true skill statistic (TSS) is ill-posed, and used the standard scores (z) of the TSS to compare the performance of the various prediction strategies. The best strategies we have found for predicting X, ≥M and ≥C class flares are better than the average of the 6,160 strategies by 2.3σ, 2.1σ, and 3.8σ confidence levels, respectively. The TSS values of the best three were 0.745 ± 0.072, 0.481 ± 0.017, and 0.557 ± 0.043, respectively.
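
    For reference, the two statistics compared here, the true skill statistic of a binary flare forecast and its standard score relative to a pool of strategies, can be computed in a few lines; the contingency counts and pool values below are invented examples.

    ```python
    # True skill statistic (TSS) of a binary forecast and its standard score (z)
    # relative to a pool of strategies; all numbers are invented for illustration.
    import statistics

    def true_skill_statistic(tp: int, fn: int, fp: int, tn: int) -> float:
        return tp / (tp + fn) - fp / (fp + tn)

    tss = true_skill_statistic(tp=40, fn=10, fp=30, tn=620)   # hypothetical counts
    pool = [0.35, 0.42, 0.38, 0.50, 0.33, 0.41]               # hypothetical TSS pool
    z = (tss - statistics.mean(pool)) / statistics.stdev(pool)
    print(f"TSS = {tss:.2f}, z = {z:.1f}")
    ```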

  9. A Fully Automated Pipeline for Classification Tasks with an Application to Remote Sensing

    Science.gov (United States)

    Suzuki, K.; Claesen, M.; Takeda, H.; De Moor, B.

    2016-06-01

    Deep learning has recently been in the spotlight owing to its victories at major competitions, which has pushed 'shallow' machine learning methods, the relatively simple and handy algorithms commonly used by industrial engineers, into the background despite their advantages, such as the small amount of training time and data they require. Taking a practical point of view, we used shallow learning algorithms to construct a learning pipeline that lets operators apply machine learning without special knowledge, an expensive computing environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting, and the selection of the most suitable classifier with optimized hyperparameters. The configuration employs particle swarm optimization, a well-known metaheuristic algorithm that is generally fast and precise, which enables us not only to optimize (hyper)parameters but also to determine the features and classifier appropriate to the problem; these choices have conventionally been made a priori from domain knowledge or handled with naive algorithms such as grid search. In experiments with the MNIST and CIFAR-10 datasets, common computer vision datasets for character recognition and object recognition respectively, our automated learning approach provides high performance considering its simple, non-specialized setting, the small amount of training data, and its practical learning time. Moreover, compared to deep learning, the performance stays robust with almost no modification even on a remote sensing object recognition problem, which indicates that our approach is likely to contribute to general classification problems.
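
    A compressed, hand-rolled sketch of the idea (not the authors' implementation): a tiny particle swarm tunes an SVM's hyperparameters by cross-validated accuracy on a subset of a small dataset. Swarm size, coefficients, bounds, and the data subset are illustrative choices only.

    ```python
    # Minimal particle swarm optimization of SVM hyperparameters (log10 C,
    # log10 gamma) by 3-fold cross-validated accuracy. Everything numeric here
    # (swarm size, iterations, coefficients, bounds) is an illustrative choice.
    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    X, y = load_digits(return_X_y=True)
    X, y = X[:500], y[:500]                    # small subset to keep the demo fast
    rng = np.random.default_rng(0)

    def fitness(p):                            # p = (log10 C, log10 gamma)
        clf = SVC(C=10 ** p[0], gamma=10 ** p[1])
        return cross_val_score(clf, X, y, cv=3).mean()

    bounds = np.array([[-2.0, 3.0], [-5.0, -1.0]])          # search box
    pos = rng.uniform(bounds[:, 0], bounds[:, 1], (8, 2))   # 8 particles
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmax()]

    for _ in range(10):
        r1, r2 = rng.random((2, *pos.shape))
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, bounds[:, 0], bounds[:, 1])
        vals = np.array([fitness(p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()]

    print("best (log10 C, log10 gamma):", gbest, "CV accuracy:", pbest_val.max())
    ```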

  10. Comparisons of fully automated syphilis tests with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Choi, Seung Jun; Park, Yongjung; Lee, Eun Young; Kim, Sinyoung; Kim, Hyon-Suk

    2013-06-01

    Serologic tests are widely used for the diagnosis of syphilis. However, conventional methods require well-trained technicians to produce reliable results. We compared automated nontreponemal and treponemal tests with conventional methods. The HiSens Auto Rapid Plasma Reagin (AutoRPR) and Treponema pallidum particle agglutination (AutoTPPA) tests, which utilize latex turbidimetric immunoassay, were assessed. A total of 504 sera were assayed by AutoRPR, AutoTPPA, conventional VDRL and FTA-ABS. Among them, 250 samples were also tested by conventional TPPA. The concordance rate between the results of VDRL and AutoRPR was 67.5%; the 164 discrepant cases were all VDRL reactive but AutoRPR negative. Of these 164 cases, 133 showed FTA-ABS reactivity. Medical records were reviewed for 106 of the 133 cases, and 82 of the 106 specimens were found to have been collected from patients already treated for syphilis. The concordance rate between the results of AutoTPPA and FTA-ABS was 97.8%. The results of conventional TPPA and AutoTPPA for the 250 samples were concordant in 241 cases (96.4%). AutoRPR showed higher specificity than VDRL, while VDRL demonstrated higher sensitivity than AutoRPR, regardless of whether the patients had already been treated for syphilis. Both FTA-ABS and AutoTPPA showed high sensitivities and specificities greater than 98.0%. Automated RPR and TPPA tests could be alternatives to conventional syphilis tests, and AutoRPR would be particularly suitable for treatment monitoring, since results by AutoRPR in treated cases became negative more rapidly than by VDRL. Copyright © 2013. Published by Elsevier Inc.

  11. Automated Price and Demand Response Demonstration for Large Customers in New York City using OpenADR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joyce Jihyun; Yin, Rongxin; Kiliccote, Sila

    2013-10-01

    Open Automated Demand Response (OpenADR), an XML-based information exchange model, is used to facilitate continuous price-responsive operation and demand response participation for large commercial buildings in New York that are subject to the default day-ahead hourly pricing. We summarize the existing demand response programs in New York and discuss OpenADR communication, prioritization of demand response signals, and control methods. Building energy simulation models are developed and field tests are conducted to evaluate the continuous energy management and demand response capabilities of two commercial buildings in New York City. Preliminary results reveal that providing machine-readable prices to commercial buildings can facilitate both demand response participation and continuous energy cost savings. Hence, efforts should be made to develop more sophisticated algorithms for building control systems to minimize customers' utility bills based on price and reliability information from the electricity grid.

  12. Validation of a Fully Automated HER2 Staining Kit in Breast Cancer

    Directory of Open Access Journals (Sweden)

    Cathy B. Moelans

    2010-01-01

    Background: Testing for HER2 amplification and/or overexpression is currently routine practice to guide Herceptin therapy in invasive breast cancer. At present, HER2 status is most commonly assessed by immunohistochemistry (IHC). Standardization of HER2 IHC assays is of utmost clinical and economic importance. HER2 IHC is most commonly performed with the HercepTest, which contains a polyclonal antibody and applies a manual staining procedure. Analytical variability in HER2 IHC testing could be diminished by a fully automatic staining system with a monoclonal antibody.

  13. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    Energy Technology Data Exchange (ETDEWEB)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A., E-mail: willi.kalender@imp.uni-erlangen.de [Institute of Medical Physics, University of Erlangen-Nürnberg, Henkestraße 91, 91052 Erlangen, Germany and CT Imaging GmbH, 91052 Erlangen (Germany)

    2014-03-15

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also
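
    One of the metrics the framework computes, the MTF from an edge profile, can be sketched in a few lines: the edge spread function is differentiated into a line spread function whose normalized Fourier magnitude is the MTF. The synthetic edge and pixel spacing below are illustrative, not data from the described phantom.

    ```python
    # Edge-profile-to-MTF computation: ESF -> derivative (LSF) -> |FFT| (MTF).
    # The blurred edge and pixel spacing are synthetic, for illustration only.
    import numpy as np

    pixel_mm = 0.2                                 # assumed voxel spacing
    x = np.arange(-64, 64) * pixel_mm
    esf = 1.0 / (1.0 + np.exp(-x / 0.3))           # synthetic blurred edge profile

    lsf = np.gradient(esf, pixel_mm)               # differentiate ESF -> LSF
    lsf /= lsf.sum()                               # normalize so MTF(0) = 1
    mtf = np.abs(np.fft.rfft(lsf))                 # MTF = |FFT(LSF)|
    freqs = np.fft.rfftfreq(lsf.size, d=pixel_mm)  # spatial frequencies, cycles/mm

    # Report the 10% MTF point, a common spatial-resolution figure of merit.
    f10 = freqs[np.argmax(mtf < 0.1)]
    print(f"MTF drops below 10% at about {f10:.2f} cycles/mm")
    ```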

  14. UFCORIN: A fully automated predictor of solar flares in GOES X-ray flux

    Science.gov (United States)

    Muranushi, Takayuki; Shibayama, Takuya; Muranushi, Yuko Hada; Isobe, Hiroaki; Nemoto, Shigeru; Komazaki, Kenji; Shibata, Kazunari

    2015-11-01

    We have developed UFCORIN, a platform for studying and automating space weather prediction. Using our system we have tested 6160 different combinations of Solar Dynamic Observatory/Helioseismic and Magnetic Imager data as input data, and simulated the prediction of GOES X-ray flux for 2 years (2011-2012) with 1 h cadence. We have found that direct comparison of the true skill statistic (TSS) from small cross-validation sets is ill posed and used the standard scores (z) of the TSS to compare the performance of the various prediction strategies. The z of a strategy is a stochastic variable of the stochastically chosen cross-validation data set, and the z for the three strategies best at predicting X-, ≥M-, and ≥C-class flares are better than the average z of the 6160 strategies by 2.3σ, 2.1σ, and 3.8σ confidence levels, respectively. The best three TSS values were 0.75 ± 0.07, 0.48 ± 0.02, and 0.56 ± 0.04, respectively.

  15. Fully Automated Detection of Corticospinal Tract Damage in Chronic Stroke Patients

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2014-01-01

    Structural integrity of the corticospinal tract (CST) after stroke is closely linked to the degree of motor impairment. However, current methods for measurement of fractional anisotropy (FA) of the CST based on regions of interest (ROI) are time-consuming and open to bias. Here, we used tract-based spatial statistics (TBSS) together with a CST template derived from healthy volunteers to quantify the structural integrity of the CST automatically. Two groups of patients after ischemic stroke were enrolled: group 1 (10 patients, 7 men, Fugl-Meyer assessment (FMA) scores ⩽ 50) and group 2 (12 patients, 12 men, FMA scores = 100). FAipsi, FAcontra, and FAratio of the CST were compared between the two groups. Relative to group 2, FA was decreased in group 1 in the ipsilesional CST (P<0.01), as was the FAratio (P<0.01). There was no significant difference between the two groups in the contralesional CST (P=0.23). Compared with the contralesional CST, FA of the ipsilesional CST was decreased in group 1 (P<0.01). These results suggest that the automated method used in our study could provide a surrogate biomarker to quantify the CST after stroke, which would facilitate implementation in clinical practice.

  16. Grid-Competitive Residential and Commercial Fully Automated PV Systems Technology: Final technical Report, August 2011

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Katie E.; Cousins, Peter; Culligan, Matt; Jonathan Botkin; DeGraaff, David; Bunea, Gabriella; Rose, Douglas; Bourne, Ben; Koehler, Oliver

    2011-08-26

    Under DOE's Technology Pathway Partnership program, SunPower Corporation developed turn-key, high-efficiency residential and commercial systems that are cost effective. Key program objectives include a reduction in LCOE values to 9-12 cents/kWh and 13-18 cents/kWh respectively for the commercial and residential markets. Target LCOE values for the commercial ground, commercial roof, and residential markets are 10, 11, and 13 cents/kWh. For this effort, SunPower collaborated with a variety of suppliers and partners to complete the tasks below. Subcontractors included: Solaicx, SiGen, Ribbon Technology, Dow Corning, Xantrex, Tigo Energy, and Solar Bridge. SunPower's TPP addressed nearly the complete PV value chain: from ingot growth through system deployment. Throughout the award period of performance, SunPower has made progress toward achieving these reduced costs through the development of 20%+ efficient modules, increased cell efficiency through the understanding of loss mechanisms and improved manufacturing technologies, novel module development, automated design tools and techniques, and reduced system development and installation time. Based on an LCOE assessment using NREL's Solar Advisor Model, SunPower achieved the 2010 target range, as well as progress toward 2015 targets.
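
    The cents-per-kWh targets above come from a levelized cost of energy (LCOE) calculation; a simplified discounted-cash-flow version is sketched below with invented capital, O&M, output, discount rate, and lifetime figures, not SunPower's inputs or NREL's Solar Advisor Model.

    ```python
    # Simplified LCOE: discounted lifetime costs divided by discounted lifetime
    # energy. All input numbers are invented for illustration.
    def lcoe(capital: float, annual_om: float, annual_kwh: float,
             rate: float, years: int) -> float:
        disc_costs = capital + sum(annual_om / (1 + rate) ** y for y in range(1, years + 1))
        disc_energy = sum(annual_kwh / (1 + rate) ** y for y in range(1, years + 1))
        return disc_costs / disc_energy   # $/kWh

    print(f"LCOE ~ {100 * lcoe(20000, 300, 16000, 0.07, 25):.1f} cents/kWh")
    ```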

  17. Fully automated, gas sensing, and electronic parameter measurement setup for miniaturized nanoparticle gas sensors

    Science.gov (United States)

    Kennedy, M. K.; Kruis, F. E.; Fissan, H.; Mehta, B. R.

    2003-11-01

    In this study, a measurement setup has been designed and fabricated for the measurement of gas sensor characteristics and electronic parameters of nanostructured thin layers in the temperature range from room temperature to 450 °C in controlled gas environments. The setup consists of: (i) a gas environment chamber, (ii) a specially designed substrate and substrate holder, and (iii) control, supply, and measurement electronics. The buried geometry of the contacts is specially designed for the deposition of nanoparticles from the gas phase to guarantee uniform thin layers, and the setup can be used to make measurements on high-resistivity (10^10 Ω cm) nanoparticle samples. The gas inlet, operating temperature, and electronic control of the measurement system are automated by means of a personal computer. Coupling the measurements of interdependent gas sensing and electronic parameters at identical conditions, in a single setup encompassing a wide range of sensing gas levels and substrate temperatures, makes this system ideally suited for carrying out the multiple measurements required for optimizing sensor configuration and understanding the size-dependent properties of nanoparticle sensors.

  18. A Fully Automated and Robust Method to Incorporate Stamping Data in Crash, NVH and Durability Analysis

    Science.gov (United States)

    Palaniswamy, Hariharasudhan; Kanthadai, Narayan; Roy, Subir; Beauchesne, Erwan

    2011-08-01

    Crash, NVH (Noise, Vibration, Harshness), and durability analyses are commonly deployed in structural CAE analysis for the mechanical design of components, especially in the automotive industry. Components manufactured by stamping constitute a major portion of the automotive structure. In CAE analysis they are modeled in a nominal state with uniform thickness and no residual stresses and strains. In reality, however, stamped components have non-uniformly distributed thickness and residual stresses and strains resulting from stamping. It is essential to consider this stamping information in CAE analysis to accurately model the behavior of sheet metal structures under different loading conditions. Especially with the current emphasis on weight reduction by replacing conventional steels with aluminum and advanced high-strength steels, it is imperative to avoid over-design. Considering this growing need in industry, a highly automated and robust method has been integrated within Altair Hyperworks® to initialize sheet metal components in CAE models with stamping data. This paper demonstrates this new feature and the influence of stamping data for a full-car frontal crash analysis.

  19. HATSouth: a global network of fully automated identical wide-field telescopes

    CERN Document Server

    Bakos, G Á; Penev, K; Bayliss, D; Jordán, A; Afonso, C; Hartman, J D; Henning, T; Kovács, G; Noyes, R W; Béky, B; Suc, V; Csák, B; Rabus, M; Lázár, J; Papp, I; Sári, P; Conroy, P; Zhou, G; Sackett, P D; Schmidt, B; Mancini, L; Sasselov, D D; Ueltzhoeffer, K

    2012-01-01

    HATSouth is the world's first network of automated and homogeneous telescopes that is capable of year-round 24-hour monitoring of positions over an entire hemisphere of the sky. The primary scientific goal of the network is to discover and characterize a large number of transiting extrasolar planets, reaching out to long periods and down to small planetary radii. HATSouth achieves this by monitoring extended areas on the sky, deriving high precision light curves for a large number of stars, searching for the signature of planetary transits, and confirming planetary candidates with larger telescopes. HATSouth employs 6 telescope units spread over 3 locations with large longitude separation in the southern hemisphere (Las Campanas Observatory, Chile; HESS site, Namibia; Siding Spring Observatory, Australia). Each of the HATSouth units holds four 0.18m diameter f/2.8 focal ratio telescope tubes on a common mount producing an 8.2x8.2 arcdeg field, imaged using four 4Kx4K CCD cameras and Sloan r filters, to give a...

  20. Introducing Powell's Direction Set Method to a Fully Automated Analysis of Eclipsing Binary Stars

    CERN Document Server

    Prsa, A

    2006-01-01

    With recent observational advancements, substantial amounts of photometric and spectroscopic eclipsing binary data have been acquired. As part of an ongoing effort to assemble a reliable pipeline for fully automatic data analysis, we put Powell's direction set method to the test. The method does not depend on numerical derivatives, only on function evaluations, and as such it cannot diverge. Compared to differential corrections (DC) and Nelder & Mead's downhill simplex (NMS) method, Powell's method proves to be more efficient in terms of solution determination and the required number of iterations. However, its application is still not optimal in terms of time cost. Causes for this deficiency are identified and two steps toward the solution are proposed: non-orthogonality of the parameter set should be removed and better initial directions should be determined before the minimization is initiated. Once these setbacks are worked out, Powell's method will probably replace DC and NMS as the default minimizing...
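
    Powell's direction set method needs only function evaluations, no derivatives; the sketch below applies SciPy's implementation to a toy two-parameter least-squares fit standing in for an eclipsing-binary light-curve solution (the data and model are invented).

    ```python
    # Powell's direction set method on a toy two-parameter light-curve fit.
    # The synthetic "eclipse" model and data are illustrative stand-ins.
    import numpy as np
    from scipy.optimize import minimize

    t = np.linspace(0.0, 1.0, 200)
    true = 1.0 - 0.3 * np.exp(-((t - 0.5) / 0.05) ** 2)        # synthetic dip
    obs = true + np.random.default_rng(0).normal(0, 0.01, t.size)

    def chi2(params):
        depth, width = params
        model = 1.0 - depth * np.exp(-((t - 0.5) / width) ** 2)
        return np.sum((obs - model) ** 2)

    result = minimize(chi2, x0=[0.1, 0.1], method="Powell")
    print(result.x)   # recovered depth and width, close to (0.3, 0.05)
    ```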

  1. An alternative method for monitoring carbonyls, and the development of a 24-port fully automated carbonyl sampler for PAMS program

    Energy Technology Data Exchange (ETDEWEB)

    Parmar, S.S.; Ugarova, L. [Atmospheric Analysis and Consulting, Ventura, CA (United States); Fernandes, C.; Guyton, J.; Lee, C.P. [Arizona Dept. of Environmental Quality, Phoenix, AZ (United States)

    1994-12-31

    The authors have investigated the possibility of collecting different aldehydes and ketones on different sorbents such as silica gel, molecular sieve and charcoal followed by solvent extraction, DNPH derivatization and HPLC/UV analysis. Carbonyl collection efficiencies for these sorbents were calculated relative to a DNPH coated C{sub 18} sep-pak cartridge. From a limited number of laboratory experiments, at various concentrations, it appears that silica gel tubes can be used for sampling aldehydes (collection efficiencies {approximately} 1), whereas charcoal tubes are suitable for collecting ketones. Molecular sieve was found to be unsuitable for collecting most of the carbonyls studied. The authors also report the development of a fully automated 24-port carbonyl sampler specially designed for EPA's PAMS program.

  2. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use

    DEFF Research Database (Denmark)

    Arnaud, Nicolas; Baldus, Christiane; Elgán, Tobias H.

    2016-01-01

    among adolescents screened for at-risk substance use in four European countries. Methods: In an open-access, purely Web-based randomized controlled trial, a convenience sample of adolescents aged 16-18 years from Sweden, Germany, Belgium, and the Czech Republic was recruited using online and offline.......5%) provided follow-up data. Compared to the control group, results from linear mixed models revealed significant reductions in self-reported past-month drinking in favor of the intervention group in both the non-imputed (P=.010) and the EM-imputed sample (P=.022). Secondary analyses revealed a significant......).Conclusions: Although the study is limited by a large drop-out, significant between-group effects for alcohol use indicate that targeted brief motivational intervention in a fully automated Web-based format can be effective to reduce drinking and lessen existing substance use service barriers for at...

  3. Screening for Anabolic Steroids in Urine of Forensic Cases Using Fully Automated Solid Phase Extraction and LC–MS-MS

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards...... and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids....... Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic...

  4. Evaluation of a Fully Automated Research Prototype for the Immediate Identification of Microorganisms from Positive Blood Cultures under Clinical Conditions

    Directory of Open Access Journals (Sweden)

    Jay M. Hyman

    2016-04-01

    A clinical laboratory evaluation of an intrinsic fluorescence spectroscopy (IFS)-based identification system paired with a BacT/Alert Virtuo microbial detection system (bioMérieux, Inc., Durham, NC) was performed to assess the potential for fully automated identification of positive blood cultures. The prototype IFS system incorporates a novel method combining a simple microbial purification procedure with rapid in situ identification via spectroscopy. Results were available within 15 min of a bottle signaling positive and required no manual intervention. Among cultures positive for organisms contained within the database and producing acceptable spectra, 75 of 88 (85.2%) and 79 of 88 (89.8%) were correctly identified to the species and genus level, respectively. These results are similar to the performance of existing rapid methods.

  5. “Smart” RCTs: Development of a Smartphone App for Fully Automated Nutrition-Labeling Intervention Trials

    Science.gov (United States)

    Li, Nicole; Dunford, Elizabeth; Eyles, Helen; Crino, Michelle; Michie, Jo; Ni Mhurchu, Cliona

    2016-01-01

    Background There is substantial interest in the effects of nutrition labels on consumer food-purchasing behavior. However, conducting randomized controlled trials on the impact of nutrition labels in the real world presents a significant challenge. Objective The Food Label Trial (FLT) smartphone app was developed to enable conducting fully automated trials, delivering intervention remotely, and collecting individual-level data on food purchases for two nutrition-labeling randomized controlled trials (RCTs) in New Zealand and Australia. Methods Two versions of the smartphone app were developed: one for a 5-arm trial (Australian) and the other for a 3-arm trial (New Zealand). The RCT protocols guided requirements for app functionality, that is, obtaining informed consent, two-stage eligibility check, questionnaire administration, randomization, intervention delivery, and outcome assessment. Intervention delivery (nutrition labels) and outcome data collection (individual shopping data) used the smartphone camera technology, where a barcode scanner was used to identify a packaged food and link it with its corresponding match in a food composition database. Scanned products were either recorded in an electronic list (data collection mode) or allocated a nutrition label on screen if matched successfully with an existing product in the database (intervention delivery mode). All recorded data were transmitted to the RCT database hosted on a server. Results In total approximately 4000 users have downloaded the FLT app to date; 606 (Australia) and 1470 (New Zealand) users met the eligibility criteria and were randomized. Individual shopping data collected by participants currently comprise more than 96,000 (Australia) and 229,000 (New Zealand) packaged food and beverage products. Conclusions The FLT app is one of the first smartphone apps to enable conducting fully automated RCTs. Preliminary app usage statistics demonstrate large potential of such technology, both for

  6. Fully automated [{sup 18}F]fluorocholine synthesis in the TracerLab MX{sub FDG} Coincidence synthesizer

    Energy Technology Data Exchange (ETDEWEB)

    Kryza, David [Laboratoire CREATIS-ANIMAGE, UMR 5515 Cnrs-U630 Inserm-Insa de Lyon (France); Hospices Civils de Lyon, Hopital E Herriot, Radiopharmacie, 69437 Lyon (France); Departement de Biophysique, Universite Lyon 1, domaine Rockfeller, 69008 Lyon (France)], E-mail: david.kryza@chu-lyon.fr; Tadino, Vincent [Optimized Radiochemical Applications, Saint-Nicolas (Belgium); Filannino, Maria Azzurra; Villeret, Guillaume; Lemoucheux, Laurent [Advanced Accelerator Applications, 01630 Saint Genis Pouilly (France)

    2008-02-15

    Introduction: We developed a new fully automated method for the radiosynthesis of [{sup 18}F]fluorocholine by modifying the commercial 2-[{sup 18}F]fluoro-2-D-deoxy-glucose ([{sup 18}F]FDG) synthesizer module (GE TracerLab MX, formerly Coincidence). Methods: [{sup 18}F]Fluorocholine was synthesized by {sup 18}F-fluoroalkylation of N,N-dimethylaminoethanol using [{sup 18}F]fluorobromomethane as the fluoromethylating agent. [{sup 18}F]Fluorobromomethane was produced by reaction of dibromomethane with [{sup 18}F]fluoride, assisted by Kryptofix 2.2.2. Results: After purification on solid-phase extraction cartridges, the [{sup 18}F]fluorocholine was obtained in 15-25% radiochemical yields (decay not corrected), with more than 99% radiochemical purity. Specific activity was more than 37 GBq/{mu}mol. Synthesis time was less than 35 min. Conclusion: This new automated synthesis technique provides high and reproducible yields and could be dedicated to routine use with the same [{sup 18}F]FDG disposable cassette system.

  7. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    Science.gov (United States)

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse were seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids.
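
    The relationship between the three figures of merit quoted above can be made explicit: on the common post-extraction-spike definitions, extraction recovery and matrix effect multiply (per compound) into the overall process efficiency. The peak areas below are invented example values for a single steroid, not data from the study.

    ```python
    # Matrix effect, extraction recovery, and process efficiency from the usual
    # three-sample spiking scheme; the peak areas are invented example values.
    neat_standard_area = 100_000   # A: analyte spiked into pure solvent
    post_spiked_area   =  60_000   # B: analyte spiked into blank extract
    pre_spiked_area    =  48_000   # C: analyte spiked into urine before extraction

    matrix_effect      = 100 * post_spiked_area / neat_standard_area   # 60%
    recovery           = 100 * pre_spiked_area / post_spiked_area      # 80%
    process_efficiency = 100 * pre_spiked_area / neat_standard_area    # 48%
    print(matrix_effect, recovery, process_efficiency)
    assert abs(process_efficiency - matrix_effect * recovery / 100) < 1e-9
    ```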

  8. Fully automated screening of immunocytochemically stained specimens for early cancer detection

    Science.gov (United States)

    Bell, André A.; Schneider, Timna E.; Müller-Frank, Dirk A. C.; Meyer-Ebrecht, Dietrich; Böcking, Alfred; Aach, Til

    2007-03-01

    Cytopathological cancer diagnoses can be obtained less invasively than histopathological investigations. Cell-containing specimens can be obtained without pain or discomfort, bloody biopsies are avoided, and the diagnosis can, in some cases, even be made earlier. Since no tissue biopsies are necessary, these methods can also be used in screening applications, e.g., for cervical cancer. Among the cytopathological methods, a diagnosis based on the analysis of the amount of DNA in individual cells achieves high sensitivity and specificity. Yet this analysis is time consuming, which is prohibitive for a screening application. Hence, it is advantageous to retain, by a preceding selection step, only a subset of suspicious specimens. This can be achieved using highly sensitive immunocytochemical markers like p16INK4a for preselection of suspicious cells and specimens. We present a method to fully automatically acquire images at distinct positions on cytological specimens using a conventional computer-controlled microscope and an autofocus algorithm. Based on the images thus obtained, we automatically detect p16INK4a-positive objects. This detection in turn is based on an analysis of the color distribution of the p16INK4a marker in the Lab colorspace. A Gaussian mixture model is used to describe this distribution, and the method described in this paper so far achieves a sensitivity of up to 90%.
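
    A minimal sketch of the color-model step described above: pixels are mapped to the Lab colorspace and a two-component Gaussian mixture separates marker-stained from background chromaticity. The synthetic image, component count, and chroma-based labeling rule are illustrative assumptions, not the authors' trained model.

    ```python
    # Gaussian mixture model over Lab pixel values to flag stained regions.
    # The synthetic RGB image and the labeling heuristic are illustrative only.
    import numpy as np
    from skimage.color import rgb2lab
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    # Synthetic field: pale background with a brownish "marker-positive" blob.
    img = np.full((64, 64, 3), 0.9)
    img[20:40, 20:40] = [0.55, 0.35, 0.2]
    img += rng.normal(0, 0.02, img.shape)

    lab = rgb2lab(np.clip(img, 0, 1)).reshape(-1, 3)
    gmm = GaussianMixture(n_components=2, random_state=0).fit(lab)
    labels = gmm.predict(lab).reshape(64, 64)

    # Take the component with the larger mean chroma (a*, b*) as marker-positive.
    positive = int(np.argmax(np.abs(gmm.means_[:, 1:]).sum(axis=1)))
    print("marker-positive pixel fraction:", (labels == positive).mean())
    ```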

  9. Fully automated software for mitral annulus evaluation in chronic mitral regurgitation by 3-dimensional transesophageal echocardiography.

    Science.gov (United States)

    Aquila, Iolanda; Fernández-Golfín, Covadonga; Rincon, Luis Miguel; González, Ariana; García Martín, Ana; Hinojar, Rocio; Jimenez Nacher, Jose Julio; Indolfi, Ciro; Zamorano, Jose Luis

    2016-12-01

    Three-dimensional (3D) transesophageal echocardiography (TEE) is the gold standard for mitral valve (MV) anatomic and functional evaluation. Currently, dedicated MV analysis software has limitations for its use in clinical practice. Thus, we performed a complete and reproducible evaluation of a new fully automatic software package to characterize MV anatomy in different forms of mitral regurgitation (MR) by 3D TEE. Sixty patients were included: 45 with more than moderate MR (28 organic MR [OMR] and 17 functional MR [FMR]) and 15 controls. All patients underwent TEE. 3D MV images obtained using 3D zoom were imported into the new software for automatic analysis. Different MV parameters were obtained and compared. Anatomic and dynamic differences between FMR and OMR were detected, including a significant increase in systolic (859.75 vs 801.83 vs 607.78 mm; P = 0.002) and diastolic (1040.60 vs. 1217.83 and 859.74 mm) annular measurements. The software automatically calculates several significant parameters that provide a correct and complete assessment of the anatomy and dynamic mitral annulus geometry and displacement in 3D space. This analysis allows a better characterization of MR pathophysiology and could be useful in designing new devices for MR repair or replacement.

  10. Evaluation of a New Fully Automated Assay for Plasma Intact FGF23.

    Science.gov (United States)

    Souberbielle, Jean-Claude; Prié, Dominique; Piketty, Marie-Liesse; Rothenbuhler, Anya; Delanaye, Pierre; Chanson, Philippe; Cavalier, Etienne

    2017-07-31

    Several FGF23 immunoassays are available. However, they are reserved for research purposes as none have been approved for clinical use. We evaluated the performance of a new automated assay for intact FGF23 on the DiaSorin Liaison platform, which is approved for clinical use. We established reference values in 908 healthy French subjects aged 18-89 years, and measured iFGF23 in patients with disorders of phosphate metabolism and in patients with chronic kidney disease (CKD). Intra-assay CV was 1.04-2.86% and inter-assay CV was 4.01-6.3%. The limit of quantification was <10 ng/L. Serum iFGF23 concentrations were considerably lower than EDTA values, highlighting the importance of using exclusively EDTA plasma. Liaison iFGF23 values were approximately 25% higher than Immutopics values. In the 908 healthy subjects, the distribution of Liaison iFGF23 values was Gaussian with a mean ± 2SD interval of 22.7-93.1 ng/L. Men had slightly higher levels than women (60.3 ± 17.6 and 55.2 ± 17.2 ng/L, respectively). Plasma iFGF23 concentrations in 11 patients with tumour-induced osteomalacia, 8 patients with X-linked hypophosphatemic rickets, 43 stage 3a, 43 stage 3b, 43 stage 4, 44 stage 5 CKD patients, and 44 dialysis patients were 217.2 ± 144.0, 150.9 ± 28.6, 98.5 ± 42.0, 130.8 ± 88.6, 130.8 ± 88.6, 331.7 ± 468.2, 788.8 ± 1306.6 and 6103.9 ± 11,178.8 ng/L, respectively. This new iFGF23 assay, available on a platform that already allows the measurement of other important parameters of mineral metabolism, is a real improvement for the laboratories and clinicians/researchers involved in this field.

  11. Opportunities for Automated Demand Response in Wastewater Treatment Facilities in California - Southeast Water Pollution Control Plant Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goli, Sasank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Faulkner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-20

    This report details a study into the demand response potential of a large wastewater treatment facility in San Francisco. Previous research had identified wastewater treatment facilities as good candidates for demand response and automated demand response, and this study was conducted to investigate facility attributes that are conducive to demand response or which hinder its implementation. One year's worth of operational data were collected from the facility's control system, submetered process equipment, utility electricity demand records, and governmental weather stations. These data were analyzed to determine factors which affected facility power demand and demand response capabilities. The average baseline demand at the Southeast facility was approximately 4 MW. During the rainy season (October-March) the facility treated 40% more wastewater than in the dry season, but demand only increased by 4%. Submetering of the facility's lift pumps and centrifuges predicted load-shift capabilities of 154 kW and 86 kW, respectively, with large lift pump shifts in the rainy season. Analysis of demand data during maintenance events confirmed the magnitude of these possible load shifts, and indicated other areas of the facility with demand response potential. Load sheds were seen to be possible by shutting down a portion of the facility's aeration trains (average shed of 132 kW). Load shifts were seen to be possible by shifting operation of centrifuges, the gravity belt thickener, lift pumps, and external pump stations. These load shifts were made possible by the storage capabilities of the facility and of the city's sewer system. Large load reductions (an average of 2,065 kW) were seen from operating the cogeneration unit, but normal practice is continuous operation, precluding its use for demand response. The study also identified potential demand response opportunities that warrant further study: modulating variable-demand aeration loads, shifting
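
    The following pandas sketch illustrates the kind of seasonal baseline and submeter analysis described above; the CSV layout and the column names total_kw and lift_pump_kw are hypothetical assumptions, not the report's data schema.

        # Illustrative sketch (not the report's methodology): average baseline demand and
        # seasonal submetered lift-pump load from interval data with assumed column names.
        import pandas as pd

        demand = pd.read_csv("facility_demand.csv", parse_dates=["timestamp"])
        demand["season"] = demand["timestamp"].dt.month.map(
            lambda m: "rainy" if m in (10, 11, 12, 1, 2, 3) else "dry")

        baseline_kw = demand["total_kw"].mean()                     # whole-facility baseline
        lift_pump_by_season = demand.groupby("season")["lift_pump_kw"].mean()

        print(f"Average baseline demand: {baseline_kw:.0f} kW")
        print("Average lift-pump load by season (kW):")
        print(lift_pump_by_season.round(0))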

  12. On-demand droplet loading for automated organic chemistry on digital microfluidics.

    Science.gov (United States)

    Shah, Gaurav J; Ding, Huijiang; Sadeghi, Saman; Chen, Supin; Kim, Chang-Jin C J; van Dam, R Michael

    2013-07-21

    Organic chemistry applications on digital microfluidic devices often involve reagents that are volatile or sensitive and must be introduced to the chip immediately before use. We present a new technique for automated, on-demand loading of ~1 μL droplets from large (~1 mL), sealed, off-chip reservoirs to a digital microfluidic chip in order to address this challenge. Unlike aqueous liquids which generally are non-wetting to the hydrophobic surface and must be actively drawn into the electrowetting-on-dielectric (EWOD) chip by electrode activation, organic liquids tend to be wetting and can spontaneously flood the chip, and hence require a retracting force for controlled liquid delivery. Using a combination of compressed inert gas and gravity to exert driving and retracting forces on the liquid, the simple loading technique enables precise loading of droplets of both wetting and non-wetting liquids in a reliable manner. A key feature from a practical point of view is that all of the wetted parts are inexpensive and potentially disposable, thus avoiding cross-contamination in chemical and biochemical applications. We provide a theoretical treatment of the underlying physics, discuss the effect of geometry and liquid properties on its performance, and show repeatable reagent loading using the technique. Its versatility is demonstrated with the loading of several aqueous and non-aqueous liquids on an EWOD digital microfluidic device.

  13. A preliminary study for fully automated quantification of psoriasis severity using image mapping

    Science.gov (United States)

    Mukai, Kazuhiro; Iyatomi, Hitoshi

    2014-03-01

    Psoriasis is a common chronic skin disease that seriously detracts from patients' QoL. Since there is no known permanent cure so far, maintaining an appropriate disease condition is necessary, and quantification of severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose; however, it is often subjective and troublesome. A fully automatic computer-assisted area and severity index (CASI) was proposed to provide an objective quantification of skin disease. It investigates the size and density of erythema based on digital image analysis; however, it does not account for the confounding effects of differing geometrical conditions during clinical follow-up (i.e., variability in the direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantification of the severity of psoriasis under clinical follow-up combined with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (ROI) between images using the Scale Invariant Feature Transform (SIFT) and applies an affine transform to map the pixel values of one image onto the other. In this study, clinical images from 7 patients with psoriasis lesions on their trunk under clinical follow-up were used. In each series, our image alignment algorithm aligns images to the geometry of the first image. Our proposed method aligned images appropriately on visual assessment and confirmed that psoriasis areas were properly extracted using the approach of CASI. Although PASI and CASI cannot be compared directly due to their different definitions of the ROI, we confirmed a strong correlation between the scores obtained with our image quantification method.
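
    A minimal OpenCV sketch of the SIFT-plus-affine alignment step described above; the file names, ratio-test threshold and use of RANSAC (OpenCV's default robust estimator) are assumptions rather than the authors' exact settings.

        # Sketch: align a follow-up image to a reference image with SIFT matches and an
        # affine transform; inputs and parameters are illustrative.
        import cv2
        import numpy as np

        ref = cv2.imread("visit1_trunk.jpg", cv2.IMREAD_GRAYSCALE)   # reference visit
        mov = cv2.imread("visit2_trunk.jpg", cv2.IMREAD_GRAYSCALE)   # follow-up image to align

        sift = cv2.SIFT_create()
        kp_ref, des_ref = sift.detectAndCompute(ref, None)
        kp_mov, des_mov = sift.detectAndCompute(mov, None)

        # Ratio-test matching of SIFT descriptors (Lowe's criterion, 0.75 assumed).
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        good = [m for m, n in matcher.knnMatch(des_mov, des_ref, k=2)
                if m.distance < 0.75 * n.distance]

        src = np.float32([kp_mov[m.queryIdx].pt for m in good])
        dst = np.float32([kp_ref[m.trainIdx].pt for m in good])

        # Robustly estimate the affine transform and warp the follow-up image.
        A, inliers = cv2.estimateAffine2D(src, dst)
        aligned = cv2.warpAffine(mov, A, (ref.shape[1], ref.shape[0]))
        cv2.imwrite("visit2_aligned.jpg", aligned)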

  14. Fully automated software for mitral annulus evaluation in chronic mitral regurgitation by 3-dimensional transesophageal echocardiography

    Science.gov (United States)

    Aquila, Iolanda; Fernández-Golfín, Covadonga; Rincon, Luis Miguel; González, Ariana; García Martín, Ana; Hinojar, Rocio; Jimenez Nacher, Jose Julio; Indolfi, Ciro; Zamorano, Jose Luis

    2016-01-01

    Abstract Three-dimensional (3D) transesophageal echocardiography (TEE) is the gold standard for mitral valve (MV) anatomic and functional evaluation. Currently, dedicated MV analysis software has limitations for its use in clinical practice. Thus, we tested here a complete and reproducible evaluation of a new fully automatic software to characterize MV anatomy in different forms of mitral regurgitation (MR) by 3D TEE. Sixty patients were included: 45 with more than moderate MR (28 organic MR [OMR] and 17 functional MR [FMR]) and 15 controls. All patients underwent TEE. 3D MV images obtained using 3D zoom were imported into the new software for automatic analysis. Different MV parameters were obtained and compared. Anatomic and dynamic differences between FMR and OMR were detected. A significant increase in systolic (859.75 vs 801.83 vs 607.78 mm2; P = 0.002) and diastolic (1040.60 vs. 1217.83 and 859.74 mm2; P < 0.001) annular sizes was observed in both OMR and FMR compared to that in controls. FMR had a reduced mitral annular contraction compared to degenerative cases of OMR and to controls (17.14% vs 32.78% and 29.89%; P = 0.007). Good reproducibility was demonstrated along with a short analysis time (mean 4.30 minutes). Annular characteristics and dynamics are abnormal in both FMR and OMR. Full 3D software analysis automatically calculates several significant parameters that provide a correct and complete assessment of anatomy and dynamic mitral annulus geometry and displacement in the 3D space. This analysis allows a better characterization of MR pathophysiology and could be useful in designing new devices for MR repair or replacement. PMID:27930514

  15. Fully-automated left ventricular mass and volume MRI analysis in the UK Biobank population cohort: evaluation of initial results.

    Science.gov (United States)

    Suinesiaputra, Avan; Sanghvi, Mihir M; Aung, Nay; Paiva, Jose Miguel; Zemrak, Filip; Fung, Kenneth; Lukaschuk, Elena; Lee, Aaron M; Carapella, Valentina; Kim, Young Jin; Francis, Jane; Piechnik, Stefan K; Neubauer, Stefan; Greiser, Andreas; Jolly, Marie-Pierre; Hayes, Carmel; Young, Alistair A; Petersen, Steffen E

    2017-08-23

    UK Biobank, a large cohort study, plans to acquire 100,000 cardiac MRI studies by 2020. Although fully-automated left ventricular (LV) analysis was performed in the original acquisition, this was not designed for unsupervised incorporation into epidemiological studies. We sought to evaluate automated LV mass and volume (Siemens syngo InlineVF versions D13A and E11C) against manual analysis in a substantial sub-cohort of UK Biobank participants. Eight readers from two centers, trained to give consistent results, manually analyzed 4874 UK Biobank cases for LV end-diastolic volume (EDV), end-systolic volume (ESV), stroke volume (SV), ejection fraction (EF) and LV mass (LVM). Agreement between manual and InlineVF automated analyses was evaluated using Bland-Altman analysis and the intra-class correlation coefficient (ICC). Tenfold cross-validation was used to establish a linear regression calibration between manual and InlineVF results. InlineVF D13A returned results in 4423 cases, whereas InlineVF E11C returned results in 4775 cases and also reported LVM. Rapid visual assessment of the E11C results found 178 cases (3.7%) with grossly misplaced contours or landmarks. In the remaining 4597 cases, LV function showed good agreement (mean ± SD of the differences, ICC): ESV -6.4 ± 9.0 ml, 0.853; EDV -3.0 ± 11.6 ml, 0.937; SV 3.4 ± 9.8 ml, 0.855; and EF 3.5 ± 5.1%, 0.586. Although LV mass was consistently overestimated (29.9 ± 17.0 g, 0.534) due to larger epicardial contours on all slices, linear regression could be used to correct the bias and improve accuracy. Automated InlineVF results can be used for case-control studies in UK Biobank, provided visual quality control and linear bias correction are performed. Improvements between InlineVF D13A and InlineVF E11C show the field is rapidly advancing, with further improvements expected in the near future.
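
    A short, illustrative sketch of the Bland-Altman agreement statistics and the linear-regression bias correction described above, on synthetic data; the study's ten-fold cross-validation of the calibration is omitted here for brevity.

        # Sketch: Bland-Altman bias/limits of agreement and a linear calibration of
        # automated LV mass against manual analysis; values are synthetic, not study data.
        import numpy as np
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(0)
        manual = rng.normal(90, 20, 500)                    # hypothetical manual LVM (g)
        auto = manual + 30 + rng.normal(0, 17, 500)         # automated values with a bias

        diff = auto - manual
        loa = (diff.mean() - 1.96 * diff.std(ddof=1), diff.mean() + 1.96 * diff.std(ddof=1))
        print(f"Bias {diff.mean():.1f} g, limits of agreement [{loa[0]:.1f}, {loa[1]:.1f}] g")

        # Linear calibration: predict the manual value from the automated one.
        model = LinearRegression().fit(auto.reshape(-1, 1), manual)
        corrected = model.predict(auto.reshape(-1, 1))
        print(f"Residual bias after calibration: {(corrected - manual).mean():.2f} g")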

  16. The next-generation Hybrid Capture High-Risk HPV DNA assay on a fully automated platform.

    Science.gov (United States)

    Eder, Paul S; Lou, Jianrong; Huff, John; Macioszek, Jerzy

    2009-07-01

    A next-generation diagnostic system has been developed at QIAGEN. The QIAensemble system consists of an analytical subsystem (JE2000) that utilizes a re-engineered Hybrid Capture chemistry (NextGen) to maintain the high level of clinical sensitivity established by the digene High-Risk HPV DNA Test (HC2), while creating improved analytical specificity as shown both in plasmid-based analyses and in processing of clinical specimens. Limit-of-detection and cross-reactivity experiments were performed using plasmid DNA constructs containing multiple high-risk (HR) and low-risk (LR) HPV types. Cervical specimens collected into a novel specimen collection medium, DCM, were used to measure stability of specimens, as well as analytical specificity. Signal carryover, instrument precision, and specimen reproducibility were measured on the prototype JE2000 system using the automated NextGen assay. The limit of detection (LOD) was determined using HPV 16 plasmid in the automated assay. No cross-reactivity (signal above cutoff) was detected on the automated system from any of 13 LR types tested at 10(7) copies per assay. Within-plate, plate-to-plate, and day-to-day performance in the prototype system yielded a CV of 20%. No indication of target carryover was found when samples containing up to 10(9) copies/ml of HPV DNA type 16 were processed on the JE2000 instrument. In an agreement study with HC2, 1038 donor cervical specimens were tested in both the manual NextGen assay and HC2 to evaluate agreement between the two tests. After eliminating discrepant specimens that were adjudicated by HR-HPV genotyping, the adjudicated positive agreement was 98.5% (95% CI: 94.6, 99.6). The JE2000 prototype system automates NextGen assay processing, yielding accurate, reproducible, and highly specific results with both plasmid analytical model tests and cervical specimens collected in DCM. The final system will process more than 2000 specimens in an 8-hour shift, with fully continuous loading.

  17. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    Science.gov (United States)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must identify first the presence and then the location of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. However, it can be difficult for a human observer to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, fast, low-dose multi-slice CT imaging can acquire the entire heart within a single breath hold, exposing the patient to a lower radiation dose, which is beneficial to their health given that a progressive disease such as atherosclerosis may require multiple scans. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity, and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground-truth scores averaged from three expert observers. The results presented here are intended to show the feasibility of, and requirement for, an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
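
    The sketch below applies the standard Agatston weighting and volume scoring to a pre-isolated calcium mask, assuming the heart isolation and aorta removal steps described above have already been performed; the 130 HU threshold, 1 mm² minimum lesion area, pixel spacing and synthetic data are illustrative conventions, not the paper's specific implementation.

        # Sketch of Agatston and volume scoring of a coronary calcium mask (synthetic data).
        import numpy as np
        from scipy import ndimage

        def agatston_and_volume(ct_hu, calcium_mask, pixel_area_mm2, slice_thickness_mm):
            """Return (Agatston score, calcium volume in mm^3) for an axial slice stack."""
            agatston = 0.0
            for z in range(ct_hu.shape[0]):                      # per-slice, per-lesion scoring
                labels, n = ndimage.label(calcium_mask[z])
                for lesion in range(1, n + 1):
                    hu = ct_hu[z][labels == lesion]
                    area = hu.size * pixel_area_mm2
                    if area < 1.0:                               # ignore tiny specks (common convention)
                        continue
                    peak = hu.max()                              # density weight from peak HU
                    weight = 1 if peak < 200 else 2 if peak < 300 else 3 if peak < 400 else 4
                    agatston += area * weight
            volume = calcium_mask.sum() * pixel_area_mm2 * slice_thickness_mm
            return agatston, volume

        if __name__ == "__main__":
            ct = np.random.randint(-100, 500, size=(4, 64, 64))  # synthetic HU volume
            mask = ct >= 130                                     # conventional 130 HU threshold
            print(agatston_and_volume(ct, mask, pixel_area_mm2=0.49, slice_thickness_mm=3.0))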

  18. Simultaneous analysis of cortisol and cortisone in saliva using XLC-MS/MS for fully automated online solid phase extraction.

    Science.gov (United States)

    Jones, Rachel L; Owen, Laura J; Adaway, Joanne E; Keevil, Brian G

    2012-01-15

    Salivary cortisol measurements are increasingly being used in the investigation of disorders of the hypothalamic-pituitary-adrenal axis. In the salivary gland, cortisol is metabolised to cortisone by the action of 11β-hydroxysteroid dehydrogenase type 2, and cortisone is partly responsible for the variable interference observed in current salivary cortisol immunoassays. The aim of this study was to validate an assay for the simultaneous analysis of salivary cortisol and cortisone using the Spark Holland Symbiosis™ in eXtraction liquid chromatography-tandem mass spectrometry (XLC-MS/MS) mode for fully automated online solid phase extraction (SPE). Saliva samples were diluted in water with the addition of internal standard (d4-cortisol and d7-cortisone). Online SPE was performed using the Spark Holland Symbiosis™ with HySphere™ C18 SPE cartridges, and compounds were eluted onto a Phenomenex® C18 guard column attached to a Phenomenex® Onyx monolithic C18 column for chromatography. Mass spectrometry used the Waters® Xevo™ TQ MS in electrospray positive mode. Cortisol and cortisone eluted with their internal standards at 1.95 and 2.17 min, respectively, with a total run time of four minutes. No evidence of ion suppression was observed. The assay was linear up to 3393 nmol/L for cortisol and 3676 nmol/L for cortisone, with lower limits of quantitation of 0.75 nmol/L and 0.50 nmol/L, respectively. Intra- and inter-assay imprecision for cortisol and cortisone was within accepted limits across three levels of internal quality control, as were accuracy and recovery. High specificity was demonstrated following interference studies which assessed 29 structurally related steroids at supra-physiological concentrations. We have successfully validated an assay for the simultaneous analysis of salivary cortisol and cortisone using XLC-MS/MS and fully automated online SPE. The assay benefits from increased specificity compared to immunoassay and minimal sample preparation, which allows high sample throughput.

  19. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard

    2011-03-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both 15N-labeled and 13C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only 15N-labeled NMR data and avoid the added expense of using 13C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using 15N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg's contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.
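
    As a greatly simplified stand-in for the integer-programming assignment step (not the authors' formulation, which adds typing, connectivity and NOESY constraints), the sketch below matches spin systems to residues by maximising a score matrix with a linear assignment solver; the score matrix here is synthetic.

        # Simplified illustration: assign spin systems to residues by maximising scores.
        import numpy as np
        from scipy.optimize import linear_sum_assignment

        rng = np.random.default_rng(1)
        score = rng.random((8, 8))          # score[i, j]: how well spin system i fits residue j

        rows, cols = linear_sum_assignment(-score)   # negate to maximise total score
        for spin, residue in zip(rows, cols):
            print(f"spin system {spin} -> residue {residue} (score {score[spin, residue]:.2f})")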

  20. Computer-aided liver volumetry: performance of a fully-automated, prototype post-processing solution for whole-organ and lobar segmentation based on MDCT imaging.

    Science.gov (United States)

    Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T

    2015-06-01

    To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess the accuracy of the post-processing application, comparing phantom volumes determined via Archimedes' principle with MDCT segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance compared the manual approach with the automated prototype, assessing intraobserver variability and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL; manually segmented datasets estimated 1628.0 ± 47.8 mL, representing a mean overestimation of 3.0%, while automatically segmented datasets estimated 1601.9 ± 0 mL, representing a mean overestimation of 1.3%. Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, automated measurement repetitions resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies compared to the remaining right hepatic lobe segments. Fully-automated whole-liver segmentation showed non-inferiority compared to manual approaches, with improved reproducibility and post-processing duration; automated dual-seed lobar segmentation showed slight tendencies for underestimating the right hepatic lobe.

  1. Fully-automated radiosynthesis and in vitro uptake investigation of [N-methyl-¹¹C]methylene blue.

    Science.gov (United States)

    Schweiger, Lutz F; Smith, Tim A D

    2013-10-01

    Malignant melanoma is a type of skin cancer which can spread rapidly if not detected early and treated. Positron Emission Tomography (PET) is a powerful imaging technique for detecting cancer, but with only a limited number of radiotracers available, the development of novel PET probes for the detection and prevention of cancer is imperative. In the present study we present the fully-automated radiosynthesis of [N-methyl-(11)C]methylene blue and an in vitro uptake study in metastatic melanoma cell lines. Using the GE TRACERlab FXc Pro module, [N-methyl-(11)C]methylene blue was isolated via solid-phase extraction in an average time of 36 min after end of bombardment and formulated with a radiochemical purity greater than 95%. The in vitro uptake study of [N-methyl-(11)C]methylene blue in the SK-MEL28 melanin-expressing melanoma cell line demonstrated site-specific binding of 51%, supporting it as a promising melanoma PET imaging agent.

  2. A Closed-Loop Proportional-Integral (PI) Control Software for Fully Mechanically Controlled Automated Electron Microscopic Tomography

    Energy Technology Data Exchange (ETDEWEB)

    2016-06-23

    A closed-loop proportional-integral (PI) control software is provided for fully mechanically controlled automated electron microscopic tomography. The software is developed based on Gatan DigitalMicrograph and is compatible with the Zeiss LIBRA 120 transmission electron microscope; however, it can be extended to other TEM instruments with modification. The software consists of a graphical user interface, a digital PI controller, an image analyzing unit, and other drive units (i.e., an image acquisition unit and a goniometer drive unit). During a tomography data collection process, the image analyzing unit analyzes both the accumulated shift and the defocus value of the latest acquired image, and provides the results to the digital PI controller. The digital PI controller compares the results with the preset values and determines the optimum adjustments of the goniometer. The goniometer drive unit adjusts the spatial position of the specimen according to the instructions given by the digital PI controller for the next tilt angle and image acquisition. The goniometer drive unit achieves high-precision positioning by using a backlash elimination method. The major benefits of the software are: 1) the goniometer drive unit keeps pre-aligned/optimized beam conditions unchanged and achieves position tracking solely through mechanical control; 2) the image analyzing unit relies only on historical data and therefore does not require additional images/exposures; 3) the PI controller enables the system to dynamically track the imaging target with extremely low system error.
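
    A minimal sketch of a discrete PI controller of the kind described above, applied to a made-up sequence of residual specimen shifts; the gains, setpoint and units are illustrative assumptions, not the software's actual tuning.

        # Sketch of a discrete PI controller producing correction commands from measurements.
        class PIController:
            def __init__(self, kp, ki, setpoint=0.0):
                self.kp, self.ki, self.setpoint = kp, ki, setpoint
                self.integral = 0.0

            def update(self, measurement, dt):
                error = self.setpoint - measurement      # e.g. residual specimen shift (nm)
                self.integral += error * dt              # accumulate the integral term
                return self.kp * error + self.ki * self.integral

        if __name__ == "__main__":
            pi = PIController(kp=0.8, ki=0.1)
            for shift in [120.0, 40.0, 12.0, 3.0, 0.5]:  # hypothetical measured shifts (nm)
                correction = pi.update(shift, dt=1.0)    # goniometer adjustment command
                print(f"measured {shift:6.1f} nm -> correction {correction:7.1f} nm")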

  3. Mutation Profile of B-Raf Gene Analyzed by fully Automated System and Clinical Features in Japanese Melanoma Patients.

    Science.gov (United States)

    Ide, Masaru; Koba, Shinichi; Sueoka-Aragane, Naoko; Sato, Akemi; Nagano, Yuri; Inoue, Takuya; Misago, Noriyuki; Narisawa, Yutaka; Kimura, Shinya; Sueoka, Eisaburo

    2017-01-01

    BRAF gene mutations have been observed in 30-50% of malignant melanoma patients. The recent development of therapeutic interventions using BRAF inhibitors requires an accurate and rapid detection system for BRAF mutations. In addition, the clinical characteristics of melanoma associated with BRAF mutations in Japanese patients have not been investigated in a large-scale evaluation. We recently established a quenching probe (QP) system for detection of an activating BRAF mutation, V600E, and evaluated 113 melanoma samples diagnosed in Saga University Hospital from 1982 to 2011. The QP system provides fully automated genotyping based on analysis of the melting curve of a fluorescent guanine-quenched probe that binds the target mutated site. BRAF mutations were detected in 54 of 115 (47%) Japanese melanoma cases, including 51 V600E and 3 V600K. Among clinical subtypes of melanoma, nodular melanoma showed a high frequency of mutation (12 of 15; 80%), followed by superficial spreading melanoma (13 of 26; 50%). The QP system is a simple and sensitive method to determine the BRAF V600E mutation and will be a useful tool for patient-oriented therapy with BRAF inhibitors.

  4. Development and laboratory-scale testing of a fully automated online flow cytometer for drinking water analysis.

    Science.gov (United States)

    Hammes, Frederik; Broger, Tobias; Weilenmann, Hans-Ulrich; Vital, Marius; Helbing, Jakob; Bosshart, Ulrich; Huber, Pascal; Odermatt, Res Peter; Sonnleitner, Bernhard

    2012-06-01

    Accurate and sensitive online detection tools would benefit both fundamental research and practical applications in aquatic microbiology. Here, we describe the development and testing of an online flow cytometer (FCM), with a specific use foreseen in the field of drinking water microbiology. The system incorporated fully automated sampling and fluorescent labeling of bacterial nucleic acids with analysis at 5-min intervals for periods in excess of 24 h. The laboratory scale testing showed sensitive detection (< 5% error) of bacteria over a broad concentration range (1 × 10(3) -1 × 10(6) cells mL(-1) ) and particularly the ability to track both gradual changes and dramatic events in water samples. The system was tested with bacterial pure cultures as well as indigenous microbial communities from natural water samples. Moreover, we demonstrated the possibility of using either a single fluorescent dye (e.g., SYBR Green I) or a combination of two dyes (SYBR Green I and Propidium Iodide), thus broadening the application possibilities of the system. The online FCM approach described herein has considerable potential for routine and continuous monitoring of drinking water, optimization of specific drinking water processes such as biofiltration or disinfection, as well as aquatic microbiology research in general.

  5. Determination of 18 beta-glycyrrhetinic acid in human serum using the fully automated ALCA-system.

    Science.gov (United States)

    Heilmann, P; Heide, J; Schöneshöfer, M

    1997-07-01

    We report a method for the determination of 18 beta-glycyrrhetinic acid (glycyrrhetinic acid) in human serum using the ALCA-system. The technology of the ALCA-system is based on the principles of adsorptive and desorptive processes between liquid and solid phases. The assay runs fully automated and is selective. Procedural losses throughout the analysis are negligible, thereby allowing for external calibration. The calibration curve is linear up to 10 mg/l and concentrations as low as 10 micrograms/l are detectable. CV is 2.5% for within- and 7.5% for between-assay precision at a level of 50 micrograms/l, and 1.2% for within- and 8.5% for between-assay precision at a level of 500 micrograms/l. Specific and expensive reagents are not necessary and time-consuming manual operations are not involved. This assay can be selected from a wide spectrum of methods at any time. Thus, the present method is well suited for drug monitoring purposes in the routine laboratory. In a pharmacokinetic study we measured serum levels of glycyrrhetinic acid in ten healthy young volunteers after ingestion of 500 mg glycyrrhetinic acid. Maximum levels of glycyrrhetinic acid were 6.3 mg/l 2 to 4 hours after ingestion. Twenty-four (24) hours after ingestion, seven probands still had glycyrrhetinic acid levels above the detection limit, with a mean level of 0.33 mg/l.

  6. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    Science.gov (United States)

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. The relevant extraction parameters, such as the dispersive solvent, the proportion of aqueous/organic phase, pH and flow rates, have been carefully evaluated. Caffeine quantification was linear from 2 to 75 mg L(-1), with detection and quantification limits of 0.46 mg L(-1) and 1.54 mg L(-1), respectively. A coefficient of variation of 2.1% (n=8; 5 mg L(-1)) and a sampling rate of 16 h(-1) were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant and decaf coffee samples, with the results validated using high-performance liquid chromatography. Copyright © 2016 Elsevier Ltd. All rights reserved.
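
    A small illustrative sketch of estimating detection and quantification limits from a calibration line using the common 3.3·sigma/slope and 10·sigma/slope convention; this is not necessarily the authors' procedure, and the calibration data below are synthetic.

        # Sketch: LOD/LOQ from a linear calibration (synthetic standards and response).
        import numpy as np

        conc = np.array([2, 5, 10, 25, 50, 75], float)            # caffeine standards, mg/L
        signal = 120.0 * conc + np.array([3, -4, 6, -2, 5, -5])   # hypothetical detector response

        slope, intercept = np.polyfit(conc, signal, 1)
        residual_sd = np.std(signal - (slope * conc + intercept), ddof=2)

        print(f"LOD ~ {3.3 * residual_sd / slope:.2f} mg/L")
        print(f"LOQ ~ {10 * residual_sd / slope:.2f} mg/L")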

  7. Fully automated SPE-based synthesis and purification of 2-[{sup 18}F]fluoroethyl-choline for human use

    Energy Technology Data Exchange (ETDEWEB)

    Schmaljohann, Joern [Department of Nuclear Medicine, University of Bonn, Bonn (Germany); Department of Nuclear Medicine, University of Aachen, Aachen (Germany); Schirrmacher, Esther [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Waengler, Bjoern; Waengler, Carmen [Department of Nuclear Medicine, Ludwig-Maximilians University, Munich (Germany); Schirrmacher, Ralf, E-mail: ralf.schirrmacher@mcgill.c [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Guhlke, Stefan, E-mail: stefan.guhlke@ukb.uni-bonn.d [Department of Nuclear Medicine, University of Bonn, Bonn (Germany)

    2011-02-15

    Introduction: 2-[{sup 18}F]Fluoroethyl-choline ([{sup 18}F]FECH) is a promising tracer for the detection of prostate cancer as well as brain tumors with positron emission tomography (PET). [{sup 18}F]FECH is actively transported into mammalian cells, becomes phosphorylated by choline kinase and gets incorporated into the cell membrane after being metabolized to phosphatidylcholine. So far, its synthesis is a two-step procedure involving at least one HPLC purification step. To allow a wider dissemination of this tracer, finding a purification method avoiding HPLC is highly desirable and would result in easier accessibility and more reliable production of [{sup 18}F]FECH. Methods: [{sup 18}F]FECH was synthesized by reaction of 2-bromo-1-[{sup 18}F]fluoroethane ([{sup 18}F]BFE) with dimethylaminoethanol (DMAE) in DMSO. We applied a novel and very reliable work-up procedure for the synthesis of [{sup 18}F]BFE. Based on a combination of three different solid-phase cartridges, the purification of [{sup 18}F]BFE from its precursor 2-bromoethyl-4-nitrobenzenesulfonate (BENos) could be achieved without using HPLC. Following the subsequent reaction of the purified [{sup 18}F]BFE with DMAE, the final product [{sup 18}F]FECH was obtained as a sterile solution by passing the crude reaction mixture through a combination of two CM plus cartridges and a sterile filter. The fully automated synthesis was performed using either a Raytest SynChrom module (Raytest, Germany) or a Scintomics HotboxIII module (Scintomics, Germany). Results: The radiotracer [{sup 18}F]FECH can be synthesized in reliable radiochemical yields (RCY) of 37{+-}5% (Synchrom module) and 33{+-}5% (Hotbox III unit) in less than 1 h using these two fully automated commercially available synthesis units without HPLC involvement for purification. Detailed quality control of the final injectable [{sup 18}F]FECH solution proved the high radiochemical purity and the absence of Kryptofix 2.2.2, DMAE and DMSO used in the synthesis.

  8. Association between fully automated MRI-based volumetry of different brain regions and neuropsychological test performance in patients with amnestic mild cognitive impairment and Alzheimer's disease.

    Science.gov (United States)

    Arlt, Sönke; Buchert, Ralph; Spies, Lothar; Eichenlaub, Martin; Lehmbeck, Jan T; Jahn, Holger

    2013-06-01

    Fully automated magnetic resonance imaging (MRI)-based volumetry may serve as biomarker for the diagnosis in patients with mild cognitive impairment (MCI) or dementia. We aimed at investigating the relation between fully automated MRI-based volumetric measures and neuropsychological test performance in amnestic MCI and patients with mild dementia due to Alzheimer's disease (AD) in a cross-sectional and longitudinal study. In order to assess a possible prognostic value of fully automated MRI-based volumetry for future cognitive performance, the rate of change of neuropsychological test performance over time was also tested for its correlation with fully automated MRI-based volumetry at baseline. In 50 subjects, 18 with amnestic MCI, 21 with mild AD, and 11 controls, neuropsychological testing and T1-weighted MRI were performed at baseline and at a mean follow-up interval of 2.1 ± 0.5 years (n = 19). Fully automated MRI volumetry of the grey matter volume (GMV) was performed using a combined stereotactic normalisation and segmentation approach as provided by SPM8 and a set of pre-defined binary lobe masks. Left and right hippocampus masks were derived from probabilistic cytoarchitectonic maps. Volumes of the inner and outer liquor space were also determined automatically from the MRI. Pearson's test was used for the correlation analyses. Left hippocampal GMV was significantly correlated with performance in memory tasks, and left temporal GMV was related to performance in language tasks. Bilateral frontal, parietal and occipital GMVs were correlated to performance in neuropsychological tests comprising multiple domains. Rate of GMV change in the left hippocampus was correlated with decline of performance in the Boston Naming Test (BNT), Mini-Mental Status Examination, and trail making test B (TMT-B). The decrease of BNT and TMT-A performance over time correlated with the loss of grey matter in multiple brain regions. We conclude that fully automated MRI

  9. Towards improved estimation of the unsaturated soil hydraulic conductivity in the near saturated range by a fully automated, pressure controlled unit gradient experiment.

    Science.gov (United States)

    Werisch, Stefan; Müller, Marius

    2017-04-01

    Determination of soil hydraulic properties has always been an important part of soil physical research and model applications. While several experiments are available to measure the water retention of soil samples, the determination of the unsaturated hydraulic conductivity is often more complicated, bound to strong assumptions, and time consuming. Although the application of unit gradient experiments has been recommended since the middle of the last century as one method for a direct (assumption-free) measurement of the unsaturated hydraulic conductivity, data from unit gradient experiments are seldom if ever reported in the literature. We developed and built a fully automated, pressure controlled, unit gradient experiment, which allows a precise determination of the unsaturated soil hydraulic conductivity K(h) and water retention VWC(h), especially in the highly dynamic near-saturated range. The measurement apparatus applies the concept of hanging water columns and imposes the required soil water pressure by dual porous plates. This concept allows the simultaneous and direct measurement of water retention and hydraulic conductivity. Moreover, this approach results in a technically less demanding experiment than related flux-controlled experiments, and virtually any flux can be measured. Thus, both soil properties can be measured in mm resolution, for wetting and drying processes, between saturation and field capacity for all soil types. Our results show that it is important to establish separate measurements of the unsaturated hydraulic conductivity in the near-saturated range, as the shapes of the retention function and hydraulic conductivity curve do not necessarily match. Consequently, the prediction of the hydraulic conductivity curve from measurements of the water retention behavior in combination with a value for the saturated hydraulic conductivity can be misleading. Thus, separate parameterizations of the individual functions might be necessary and are
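
    The key property of a unit gradient experiment is that the Darcy-Buckingham law q = -K(h)·dH/dz reduces to q = K(h) when the hydraulic gradient equals one, so the measured steady outflow per unit area directly gives the conductivity at the imposed pressure head. The sketch below illustrates this calculation with hypothetical sample geometry and outflow values.

        # Sketch: K(h) evaluated directly from steady outflow under a unit gradient.
        def hydraulic_conductivity(outflow_cm3, area_cm2, duration_s):
            """K(h) in cm/s for a unit-gradient, steady-state step (q equals K when dH/dz = 1)."""
            return outflow_cm3 / (area_cm2 * duration_s)   # Darcy flux q

        # head (cm): (collected outflow in cm^3, collection time in s) -- hypothetical values
        steps = {-5.0: (14.4, 3600), -20.0: (2.9, 3600), -60.0: (0.4, 3600)}
        for head, (volume, seconds) in steps.items():
            k = hydraulic_conductivity(volume, area_cm2=50.0, duration_s=seconds)
            print(f"h = {head:6.1f} cm  ->  K(h) = {k:.2e} cm/s")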

  10. Multicenter evaluation of fully automated BACTEC Mycobacteria Growth Indicator Tube 960 system for susceptibility testing of Mycobacterium tuberculosis.

    Science.gov (United States)

    Bemer, Pascale; Palicova, Frantiska; Rüsch-Gerdes, Sabine; Drugeon, Henri B; Pfyffer, Gaby E

    2002-01-01

    The reliability of the BACTEC Mycobacteria Growth Indicator Tube (MGIT) 960 system for testing of Mycobacterium tuberculosis susceptibility to the three front-line drugs (isoniazid [INH], rifampin [RIF], and ethambutol [EMB]) plus streptomycin (STR) was compared to that of the BACTEC 460 TB system. The proportion method was used to resolve discrepant results by an independent arbiter. One hundred and ten strains were tested with an overall agreement of 93.5%. Discrepant results were obtained for seven strains (6.4%) with INH (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for one strain (0.9%) with RIF (resistant by BACTEC MGIT 960; susceptible by BACTEC 460 TB), for seven strains (6.4%) with EMB (six resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB; one susceptible by BACTEC MGIT 960 and resistant by BACTEC 460 TB), and for 19 strains (17.3%) with STR (resistant by BACTEC MGIT 960 and susceptible by BACTEC 460 TB). After resolution of discrepant results, the sensitivity of the BACTEC MGIT 960 system was 100% for all four drugs and specificity ranged from 89.8% for STR to 100% for RIF. Turnaround times were 4.6 to 11.7 days (median, 6.5 days) for BACTEC MGIT 960 and 4.0 to 10.0 days (median, 7.0 days) for BACTEC 460 TB. These data demonstrate that the fully automated and nonradiometric BACTEC MGIT 960 system is an accurate method for rapid susceptibility testing of M. tuberculosis.

  11. The Allocation of Automated Test Equipment Capacity with Variability in Demand and Processing Rates

    Science.gov (United States)

    2010-12-01

  12. Fully Automated Pulmonary Lobar Segmentation: Influence of Different Prototype Software Programs onto Quantitative Evaluation of Chronic Obstructive Lung Disease.

    Directory of Open Access Journals (Sweden)

    Hyun-ju Lim

    Full Text Available Surgical or bronchoscopic lung volume reduction (BLVR) techniques can be beneficial for heterogeneous emphysema. Post-processing software tools for lobar emphysema quantification are useful for patient and target lobe selection, treatment planning and post-interventional follow-up. We aimed to evaluate the inter-software variability of emphysema quantification using fully automated lobar segmentation prototypes. 66 patients with moderate to severe COPD who underwent CT for planning of BLVR were included. Emphysema quantification was performed using 2 modified versions of in-house software (without and with prototype advanced lung vessel segmentation; programs 1 [YACTA v.2.3.0.2] and 2 [YACTA v.2.4.3.1]), as well as 1 commercial program 3 [Pulmo3D VA30A_HF2] and 1 pre-commercial prototype 4 [CT COPD ISP ver7.0]. The following parameters were computed for each segmented anatomical lung lobe and the whole lung: lobar volume (LV), mean lobar density (MLD), 15th percentile of lobar density (15th), emphysema volume (EV) and emphysema index (EI). Bland-Altman analysis (limits of agreement, LoA) and linear random effects models were used for comparison between the software. Segmentation using programs 1, 3 and 4 was unsuccessful in 1 (1%), 7 (10%) and 5 (7%) patients, respectively. Program 2 could analyze all datasets. The 53 patients with successful segmentation by all 4 programs were included for further analysis. For LV, programs 1 and 4 showed the largest mean difference of 72 ml and the widest LoA of [-356, 499 ml] (p<0.05). Programs 3 and 4 showed the largest mean difference of 4% and the widest LoA of [-7, 14%] for EI (p<0.001). Only a single software program was able to successfully analyze all scheduled datasets. Although the mean bias of LV and EV was relatively low in lobar quantification, the ranges of disagreement were substantial for both. For longitudinal emphysema monitoring, not only the scanning protocol but also the quantification software needs to be kept consistent.

  13. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    Energy Technology Data Exchange (ETDEWEB)

    Jochimsen, Thies H.; Zeisig, Vilia [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Schulz, Jessica [Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstr. 1a, Leipzig, D-04103 (Germany); Werner, Peter; Patt, Marianne; Patt, Jörg [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Dreyer, Antje Y. [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Boltze, Johannes [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Fraunhofer Research Institution of Marine Biotechnology and Institute for Medical and Marine Biotechnology, University of Lübeck, Lübeck (Germany); Barthel, Henryk; Sabri, Osama; Sattler, Bernhard [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany)

    2016-02-13

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with (15)O-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5 and 5 min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22% higher than CBF obtained using the AIF from blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the
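
    A simplified sketch of the core IDIF step, averaging the activity concentration over an arterial mask for each frame; the vessel-enhancement filtering and the PVE/PSF geometrical deconvolution described above are omitted, and all arrays and the calibration stand-in are synthetic placeholders.

        # Sketch: extract a time-activity curve over an arterial mask from dynamic PET data.
        import numpy as np

        rng = np.random.default_rng(0)
        pet = rng.random((30, 64, 64, 64))          # hypothetical dynamic PET: frames x volume
        artery_mask = np.zeros((64, 64, 64), bool)
        artery_mask[30:34, 30:34, :] = True         # toy "artery" segment

        idif = pet[:, artery_mask].mean(axis=1)     # frame-wise mean over the arterial mask

        # Cross-calibration against (hypothetical) blood samples during steady state.
        blood_steady = idif[15:].mean() * 0.98      # stand-in for measured blood activity
        print("cross-calibration factor:", blood_steady / idif[15:].mean())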

  14. The Intelligent Automation Demands of Taiwanese Companies with Businesses in Taiwan and China

    Directory of Open Access Journals (Sweden)

    Ying-Mei Tai

    2014-08-01

    Full Text Available Most Taiwanese companies invest in Mainland China because of its cheap and abundant labor force and its production flexibility. As Mainland China's labor conditions undergo sudden discontinuities, Taiwanese firms with businesses in Taiwan and Mainland China have to change their business models in order to overcome the current difficulties and create new opportunities. This study shows that the "manufacturing process of the manufacturing industry" is the most important automation item for Taiwanese firms with businesses in Taiwan and Mainland China, followed by "system integration of the manufacturing industry" and "product design of the manufacturing industry", etc. Nearly 60% of these companies have further intelligent automation requirements, and they may combine several approaches such as "self-develop" and "collaborate with other companies".

  15. Optimized Energy Management of a Single-House Residential Micro-Grid With Automated Demand Response

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Monsef, Hassan; Rahimi-Kian, Ashkan

    2015-01-01

    to take part in demand response (DR) programs. The superior performance and efficiency of the proposed system is studied through several scenarios and case studies and validated in comparison with the conventional models. The simulation results demonstrate that the proposed MOEMS has the capability...

  16. Accuracy of fully automated, quantitative, volumetric measurement of the amount of fibroglandular breast tissue using MRI: correlation with anthropomorphic breast phantoms.

    Science.gov (United States)

    Wengert, Georg J; Pinker, Katja; Helbich, Thomas H; Vogl, Wolf-Dieter; Spijker, Sylvia M; Bickel, Hubert; Polanec, Stephan H; Baltzer, Pascal A

    2017-06-01

    To demonstrate the accuracy of fully automated, quantitative, volumetric measurement of the amount of fibroglandular breast tissue (FGT), using MRI, and to investigate the impact of different MRI sequences using anthropomorphic breast phantoms as the ground truth. In this study, 10 anthropomorphic breast phantoms that consisted of different known fractions of adipose and protein tissue, which closely resembled normal breast parenchyma, were developed. Anthropomorphic breast phantoms were imaged with a 1.5 T unit (Siemens, Avantofit) using an 18-channel breast coil. The sequence protocol consisted of an isotropic Dixon sequence (Di), an anisotropic Dixon sequence (Da), and T1 3D FLASH sequences with and without fat saturation (T1). Fully automated, quantitative, volumetric measurement of FGT for all anthropomorphic phantoms and sequences was performed and correlated with the amounts of fatty and protein components in the phantoms as the ground truth. Fully automated, quantitative, volumetric measurements of FGT with MRI for all sequences ranged from 5.86 to 61.05% (mean 33.36%). The isotropic Dixon sequence yielded the highest accuracy (median 0.51%-0.78%) and precision (median range 0.19%) compared with anisotropic Dixon (median 1.92%-2.09%; median range 0.55%) and T1 -weighted sequences (median 2.54%-2.46%; median range 0.82%). All sequences yielded good correlation with the FGT content of the anthropomorphic phantoms. The best correlation of FGT measurements was identified for Dixon sequences (Di, R(2)  = 0.999; Da, R(2)  = 0.998) compared with conventional T1 -weighted sequences (R(2)  = 0.971). MRI yields accurate, fully automated, quantitative, volumetric measurements of FGT, an increasingly important and sensitive imaging biomarker for breast cancer risk. Compared with conventional T1 sequences, Dixon-type sequences show the highest correlation and reproducibility for automated, quantitative, volumetric FGT measurements using anthropomorphic breast

  17. Design of Automated Demand Response Information Exchange Interface

    Institute of Scientific and Technical Information of China (English)

    祁兵; 张荣; 李彬; 陈宋宋

    2014-01-01

    The automated demand response information exchange interface, one of the key technologies for automated demand response, is the entity connecting the customer automation systems, the demand response service systems on the power grid side, and the third-party aggregation service systems that participate in automated demand response. The interface realizes the information exchange capabilities between the demand response systems. Starting from an analysis of demand response business processes, this paper designs the architecture, hierarchical model and functions of the interface. Through close cooperation of the function-layer, information-layer and communication-layer entities, and a reasonable mapping onto the physical network, the interface can carry out efficient, flexible, convenient and secure demand response information exchange, which is of great significance for realizing automated demand response.
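
    Purely for illustration, the sketch below shows the shape a demand response event message might take as it passes between a utility DR server and a customer automation system; the field names and payload are hypothetical and are not taken from the article or from any standard such as OpenADR.

        # Hypothetical DR event payload: the information layer defines the content, and the
        # communication layer would transport the serialized message to the client system.
        import json
        from dataclasses import dataclass, asdict
        from datetime import datetime

        @dataclass
        class DemandResponseEvent:
            event_id: str
            start: str                 # ISO-8601 start of the curtailment window
            duration_min: int
            target_reduction_kw: float
            signal_level: str          # e.g. "moderate" or "high"

        event = DemandResponseEvent(
            event_id="dr-2014-07-15-001",
            start=datetime(2014, 7, 15, 14, 0).isoformat(),
            duration_min=120,
            target_reduction_kw=500.0,
            signal_level="high",
        )
        payload = json.dumps(asdict(event))
        print(payload)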

  18. A Fully Automated Method of Locating Building Shadows for Aerosol Optical Depth Calculations in High-Resolution Satellite Imagery

    Science.gov (United States)

    2010-09-01

    An automated approach, based on trigonometry, is created using the shadow technique for identifying usable shadows and calculating AOD. Correcting for the offset appears to be the biggest obstacle to implementing this algorithm operationally.

  19. Rapid screening test for primary hyperaldosteronism: ratio of plasma aldosterone to renin concentration determined by fully automated chemiluminescence immunoassays.

    NARCIS (Netherlands)

    Perschel, F.H.; Schemer, R.; Seiler, L.; Reincke, M.; Deinum, J.; Maser-Gluth, C.; Mechelhoff, D.; Tauber, R.; Diederich, S.

    2004-01-01

    BACKGROUND: The ratio of plasma aldosterone concentration to plasma renin activity (PAC/PRA) is the most common screening test for primary hyperaldosteronism (PHA), but it is not standardized among laboratories. We evaluated new automated assays for the simultaneous measurement of PAC and plasma renin concentration.

  20. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Burks, M.B. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Hoop, R.C.; Hoffman, E.P. [Univ. of Pittsburgh School of Medicine, PA (United States)

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  1. Automated Demand Response Approaches to Household Energy Management in a Smart Grid Environment

    Science.gov (United States)

    Adika, Christopher Otieno

    The advancement of renewable energy technologies and the deregulation of the electricity market have seen the emergence of demand response (DR) programs. Demand response is a cost-effective load management strategy which enables electricity suppliers to maintain the integrity of the power grid during peak periods, when the customers' electrical load is high. DR programs are designed to influence electricity users to alter their normal consumption patterns by offering them financial incentives. A well-designed incentive-based DR scheme that offers a competitive electricity pricing structure can result in numerous benefits to all the players in the electricity market. Lower power consumption during peak periods will significantly enhance the robustness of constrained networks by reducing the amount of generation and transmission infrastructure needed to provide electric service. This will ease the pressure to build new power networks while avoiding costly energy procurements, translating into huge financial savings for the power suppliers. Peak load reduction will also reduce the inconveniences suffered by end users as a result of brownouts or blackouts. Demand response will also drastically lower the price peaks associated with wholesale markets. This will in turn reduce the electricity costs and risks for all the players in the energy market. Additionally, DR is environmentally friendly since it enhances the flexibility of the power grid through accommodation of renewable energy resources. Despite its many benefits, DR has not been embraced by most electricity networks. This can be attributed to the fact that the existing programs do not provide enough incentives to the end users and, therefore, most electricity users are not willing to participate in them. To overcome these challenges, most utilities are coming up with innovative strategies that will be more attractive to their customers. Thus, this dissertation presents various

  2. Total cost of ownership of pH-measurements - what is the benefit of fully automated systems?; Total Cost of Ownership von pH-Messeinrichtungen oder: Wann lohnen sich vollautomatisierte pH-Messsysteme?

    Energy Technology Data Exchange (ETDEWEB)

    Symietz, I.; Stieler, S. [Infraserv Hoechst Technik GmbH und Co.KG, Frankfurt am Main (Germany). Business Unit MSR- und Analysentechnik; Lenz, M. [Aventis Pharma Deutschland GmbH (Germany)

    2005-07-01

    Fully automated pH-measuring systems offer sensor cleaning, sensor calibration and reduced sensor wear in corrosive media through intermittent measurements. The cost of ownership of these systems was compared with that of conventional pH measurements. (orig.)

  3. Preliminary evaluation of the publicly available Laboratory for Breast Radiodensity Assessment (LIBRA) software tool: comparison of fully automated area and volumetric density measures in a case-control study with digital mammography

    National Research Council Canada - National Science Library

    Keller, Brad M; Chen, Jinbo; Daye, Dania; Conant, Emily F; Kontos, Despina

    2015-01-01

    .... We investigated associations between breast cancer and fully automated measures of breast density made by a new publicly available software tool, the Laboratory for Individualized Breast Radiodensity Assessment (LIBRA...

  4. Screening for illicit and medicinal drugs in whole blood using fully automated SPE and UHPLC-TOF-MS with data-independent acquisition

    DEFF Research Database (Denmark)

    Pedersen, Anders Just; Dalsgaard, Petur Weihe; Rode, Andrej Jaroslav

    2013-01-01

    A broad forensic screening method for 256 analytes in whole blood based on a fully automated SPE robotic extraction and UHPLC-TOF-MS with data-independent acquisition has been developed. The limit of identification was evaluated for all 256 compounds and 95 of these compounds were validated with ......-MS screening of blood samples constitutes a practical way for screening traffic cases, with the exception of THC, which should be handled in a separate method.

  5. A fully automated method for simultaneous determination of aflatoxins and ochratoxin A in dried fruits by pressurized liquid extraction and online solid-phase extraction cleanup coupled to ultra-high-pressure liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Campone, Luca; Piccinelli, Anna Lisa; Celano, Rita; Russo, Mariateresa; Valdés, Alberto; Ibáñez, Clara; Rastrelli, Luca

    2015-04-01

    According to current demands and future perspectives in food safety, this study reports a fast and fully automated analytical method for the simultaneous analysis of the highly toxic and widespread mycotoxins aflatoxins (AFs) and ochratoxin A (OTA) in dried fruits, a high-risk foodstuff. The method is based on pressurized liquid extraction (PLE), with aqueous methanol (30%) at 110 °C, of the slurried dried fruit and online solid-phase extraction (online SPE) cleanup of the PLE extracts with a C18 cartridge. The purified sample was directly analysed by ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) for sensitive and selective determination of AFs and OTA. The proposed analytical procedure was validated for different dried fruits (vine fruit, fig and apricot), providing method detection and quantification limits much lower than the AFs and OTA maximum levels imposed by EU regulation in dried fruit for direct human consumption. Also, recoveries (83-103%) and repeatability (RSD < 8%, n = 3) meet the performance criteria required by EU regulation for the determination of the levels of mycotoxins in foodstuffs. The main advantage of the proposed method is full automation of the whole analytical procedure, which reduces the time and cost of the analysis, sample manipulation and solvent consumption, enabling high-throughput analysis and highly accurate and precise results.

  6. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism

    OpenAIRE

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-01-01

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels,...

  7. Assessment of Pain Response in Capsaicin-Induced Dynamic Mechanical Allodynia Using a Novel and Fully Automated Brushing Device

    Directory of Open Access Journals (Sweden)

    Kristian G du Jardin

    2013-01-01

    Full Text Available BACKGROUND: Dynamic mechanical allodynia is traditionally induced by manual brushing of the skin. Brushing force and speed have been shown to influence the intensity of brush-evoked pain. There are still limited data available with respect to the optimal stroke number, length, force, angle and speed. Therefore, an automated brushing device (ABD was developed, for which brushing angle and speed could be controlled to enable quantitative assessment of dynamic mechanical allodynia.

  8. Automated Metadata Formatting for Cornell’s Print-on-Demand Books

    Directory of Open Access Journals (Sweden)

    Dianne Dietrich

    2009-11-01

    Full Text Available Cornell University Library has made Print-On Demand (POD books available for many of its digitized out-of-copyright books. The printer must be supplied with metadata from the MARC bibliographic record in order to produce book covers. Although the names of authors are present in MARC records, they are given in an inverted order suitable for alphabetical filing rather than the natural order that is desirable for book covers. This article discusses a process for parsing and manipulating the MARC author strings to identify their various component parts and to create natural order strings. In particular, the article focuses on processing non-name information in author strings, such as titles that were commonly used in older works, e.g., baron or earl, and suffixes appended to names, e.g., "of Bolsena." Relevant patterns are identified and a Python script is used to manipulate the author name strings.
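
    The article's Python script is not reproduced in this record; the short sketch below, using simplified and assumed heading patterns, illustrates the basic inversion step (surname-first MARC heading to natural order) together with one of the special cases mentioned, an "of Bolsena"-style suffix.

      # Illustrative sketch only (not Cornell's script): convert an inverted
      # MARC 100$a author heading to natural order for a book cover. The
      # suffix handling below is a simplified assumption.
      import re

      TRAILING_PUNCT = re.compile(r"[,.\s]+$")

      def natural_order(inverted):
          """'Russell, Bertrand, 3rd Earl' -> 'Bertrand Russell, 3rd Earl'."""
          inverted = TRAILING_PUNCT.sub("", inverted)
          parts = [p.strip() for p in inverted.split(",")]
          if len(parts) == 1:               # single-name heading, nothing to invert
              return parts[0]
          surname, second, *rest = parts
          # Headings such as "Giovanni, of Bolsena" keep their order: the
          # second element is an epithet/suffix, not a forename.
          if second and second[0].islower():
              return " ".join([surname, second] + rest)
          name = f"{second} {surname}"
          return ", ".join([name] + rest) if rest else name

      if __name__ == "__main__":
          for heading in ["Twain, Mark,", "Russell, Bertrand, 3rd Earl", "Giovanni, of Bolsena"]:
              print(natural_order(heading))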

  9. Fully Automated One-Step Production of Functional 3D Tumor Spheroids for High-Content Screening.

    Science.gov (United States)

    Monjaret, François; Fernandes, Mathieu; Duchemin-Pelletier, Eve; Argento, Amelie; Degot, Sébastien; Young, Joanne

    2016-04-01

    Adoption of spheroids within high-content screening (HCS) has lagged behind high-throughput screening (HTS) due to issues with running complex assays on large three-dimensional (3D) structures.To enable multiplexed imaging and analysis of spheroids, different cancer cell lines were grown in 3D on micropatterned 96-well plates with automated production of nine uniform spheroids per well. Spheroids achieve diameters of up to 600 µm, and reproducibility was experimentally validated (interwell and interplate CV(diameter) integration of micropatterned spheroid models within fundamental research and drug discovery applications.

  10. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    Science.gov (United States)

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick film resistors and a NTC as heating and temperature sensing elements. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.
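
    The firmware of the LabTube heating system is not given in the abstract; the Python sketch below only simulates the control logic it describes (an NTC read through a voltage divider and resistive heaters switched around the LAMP temperature), with assumed component values.

      # Sketch of the control logic only, not the authors' firmware. The NTC
      # beta parameters, divider resistor, setpoint and battery voltage are assumptions.
      import math

      BETA, R0, T0 = 3950.0, 10_000.0, 298.15   # assumed NTC: beta, R at 25 degC, 25 degC in K
      R_DIVIDER, V_SUPPLY = 10_000.0, 3.0       # assumed divider resistor and battery voltage

      def ntc_temperature_c(v_ntc):
          """Convert the measured voltage across the NTC to degrees Celsius."""
          r_ntc = R_DIVIDER * v_ntc / (V_SUPPLY - v_ntc)
          inv_t = 1.0 / T0 + math.log(r_ntc / R0) / BETA     # beta equation
          return 1.0 / inv_t - 273.15

      def heater_command(temp_c, heater_on, setpoint=65.0, hysteresis=0.5):
          """Hysteresis (on/off) control: return the new heater state."""
          if temp_c < setpoint - hysteresis:
              return True
          if temp_c > setpoint + hysteresis:
              return False
          return heater_on          # inside the dead band: keep the previous state

      if __name__ == "__main__":
          heater = False
          for v in (1.50, 0.55, 0.40):          # example divider voltages
              t = ntc_temperature_c(v)
              heater = heater_command(t, heater)
              print(f"{t:5.1f} degC -> heater {'ON' if heater else 'OFF'}")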

  11. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    Science.gov (United States)

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operator characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
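
    The ADNI data are not reproduced here; the snippet below only sketches the evaluation step described above (area under the ROC curve of the corrected hippocampal volume for discriminating converters from stable MCI subjects) on synthetic numbers.

      # Sketch of the evaluation step on synthetic volumes, not the ADNI data.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      vol_stable = rng.normal(3.0, 0.4, 120)        # corrected HGMV, arbitrary units
      vol_converter = rng.normal(2.6, 0.4, 60)

      volumes = np.concatenate([vol_stable, vol_converter])
      converted = np.concatenate([np.zeros(120), np.ones(60)])

      # Smaller hippocampi indicate higher conversion risk, so score = -volume.
      print("AUC:", round(roc_auc_score(converted, -volumes), 2))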

  12. Eco-HAB as a fully automated and ecologically relevant assessment of social impairments in mouse models of autism.

    Science.gov (United States)

    Puścian, Alicja; Łęski, Szymon; Kasprowicz, Grzegorz; Winiarski, Maciej; Borowska, Joanna; Nikolaev, Tomasz; Boguszewski, Paweł M; Lipp, Hans-Peter; Knapska, Ewelina

    2016-10-12

    Eco-HAB is an open source, RFID-based system for automated measurement and analysis of social preference and in-cohort sociability in mice. The system closely follows murine ethology. It requires no contact between a human experimenter and tested animals, overcoming the confounding factors that lead to irreproducible assessment of murine social behavior between laboratories. In Eco-HAB, group-housed animals live in a spacious, four-compartment apparatus with shadowed areas and narrow tunnels, resembling natural burrows. Eco-HAB allows for assessment of the tendency of mice to voluntarily spend time together in ethologically relevant mouse group sizes. Custom-made software for automated tracking, data extraction, and analysis enables quick evaluation of social impairments. The developed protocols and standardized behavioral measures demonstrate high replicability. Unlike classic three-chambered sociability tests, Eco-HAB provides measurements of spontaneous, ecologically relevant social behaviors in group-housed animals. Results are obtained faster, with less manpower, and without confounding factors.

  13. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications.

    Science.gov (United States)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik; Rand, Kasper D; Honoré Hansen, Steen; Petersen, Nickolaj Jacob

    2016-07-05

    The current work describes the implementation of electro membrane extraction (EME) into an autosampler for high-throughput analysis of samples by EME-LC-MS. The extraction probe was built into a luer lock adapter connected to an HTC PAL autosampler syringe. As the autosampler drew sample solution, analytes were extracted into the lumen of the extraction probe and transferred to an LC-MS system for further analysis. Various parameters affecting extraction efficacy were investigated including syringe fill strokes, syringe pull-up volume, pull-up delay and volume in the sample vial. The system was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reversing the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD), and owing to the high extraction speed of EME, a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool for a wide range of applications where high-throughput extractions are required before sample analysis.

  14. Evaluation of a fully automated treponemal test and comparison with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Park, Yongjung; Park, Younhee; Joo, Shin Young; Park, Myoung Hee; Kim, Hyon-Suk

    2011-11-01

    We evaluated analytic performances of an automated treponemal test and compared this test with the Venereal Disease Research Laboratory test (VDRL) and fluorescent treponemal antibody absorption test (FTA-ABS). Precision performance of the Architect Syphilis TP assay (TP; Abbott Japan, Tokyo, Japan) was assessed, and 150 serum samples were assayed with the TP before and after heat inactivation to estimate the effect of heat inactivation. A total of 616 specimens were tested with the FTA-ABS and TP, and 400 were examined with the VDRL. The TP showed good precision performance with total imprecision of less than a 10% coefficient of variation. An excellent linear relationship between results before and after heat inactivation was observed (R(2) = 0.9961). The FTA-ABS and TP agreed well with a κ coefficient of 0.981. The concordance rate between the FTA-ABS and TP was the highest (99.0%), followed by the rates between FTA-ABS and VDRL (85.0%) and between TP and VDRL (83.8%). The automated TP assay may be adequate for screening for syphilis in a large volume of samples and can be an alternative to FTA-ABS.

  15. Steps Towards a Fully Automated Classification and -measurement Pipeline for LAMOST Spectra I. Continuum level and wavelength estimation for galaxies

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    The Large Sky-Area Multi-Object Spectroscopic Telescope (LAMOST) under construction by the National Astronomical Observatories will yield up to four thousand multi-fiber spectra of stars and galaxies per field. The present series of papers describes the automated data-reduction pipeline currently being designed in order to cope with the anticipated flood of spectrographic data. In this preliminary paper, we present an automated method for estimating the continuum level, the positions of strong lines and the 4000 A break in galaxy spectra. In order to obtain detailed information on the continuum, we use a wavelet filter bank. After continuum fitting, our software searches for a 4000 A break and distinguishes between emission-line galaxies (ELGs) and non-ELGs according to whether the break is small or large. It then searches for strong lines and measures the intensities of emission lines and the equivalent widths of absorption lines. For non-ELGs, the absorption lines are identified automatically, yielding redshift measurements.
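
    The LAMOST pipeline itself is not shown in this record; the sketch below illustrates the two ingredients described, a wavelet low-pass estimate of the continuum and a 4000 A break index, using PyWavelets on a synthetic spectrum. The break is computed as a mean-flux ratio with the common D4000 band limits, which is an assumption here.

      # Not the LAMOST code: wavelet continuum estimate plus a D4000-style
      # 4000 A break index on a synthetic galaxy spectrum.
      import numpy as np
      import pywt

      def wavelet_continuum(flux, wavelet="sym8", level=6):
          """Keep only the coarse approximation coefficients -> smooth continuum."""
          coeffs = pywt.wavedec(flux, wavelet, level=level, mode="smooth")
          coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
          return pywt.waverec(coeffs, wavelet, mode="smooth")[:len(flux)]

      def break_4000(wavelength, flux):
          """Ratio of mean flux redward/blueward of 4000 A (assumed band limits)."""
          blue = (wavelength >= 3750) & (wavelength <= 3950)
          red = (wavelength >= 4050) & (wavelength <= 4250)
          return flux[red].mean() / flux[blue].mean()

      if __name__ == "__main__":
          wl = np.linspace(3600, 9000, 4000)
          rng = np.random.default_rng(1)
          flux = 1.0 + 0.5 * (wl > 4000) + 0.05 * rng.normal(size=wl.size)
          cont = wavelet_continuum(flux)
          print("4000 A break on the continuum:", round(break_4000(wl, cont), 2))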

  16. Field performance of a low-cost and fully-automated blood counting system operated by trained and untrained users (Conference Presentation)

    Science.gov (United States)

    Xie, Dengling; Xie, Yanjun; Liu, Peng; Tong, Lieshu; Chu, Kaiqin; Smith, Zachary J.

    2017-02-01

    Current flow-based blood counting devices require expensive and centralized medical infrastructure and are not appropriate for field use. In this paper we report a method to count red blood cells, white blood cells as well as platelets through a low-cost and fully-automated blood counting system. The approach consists of using a compact, custom-built microscope with large field-of-view to record bright-field and fluorescence images of samples that are diluted with a single, stable reagent mixture and counted using automatic algorithms. Sample collection is performed manually using a spring loaded lancet, and volume-metering capillary tubes. The capillaries are then dropped into a tube of pre-measured reagents and gently shaken for 10-30 seconds. The sample is loaded into a measurement chamber and placed on a custom 3D printed platform. Sample translation and focusing is fully automated, and a user has only to press a button for the measurement and analysis to commence. Cost of the system is minimized through the use of custom-designed motorized components. We performed a series of comparative experiments by trained and untrained users on blood from adults and children. We compare the performance of our system, as operated by trained and untrained users, to the clinical gold standard using a Bland-Altman analysis, demonstrating good agreement of our system to the clinical standard. The system's low cost, complete automation, and good field performance indicate that it can be successfully translated for use in low-resource settings where central hematology laboratories are not accessible.
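
    The study's counts are not reproduced here; the snippet below shows the generic Bland-Altman computation (bias and 95% limits of agreement) of the kind used to compare the device with the clinical reference, on synthetic paired white-cell counts.

      # Generic Bland-Altman agreement computation on synthetic paired counts.
      import numpy as np

      def bland_altman(device, reference):
          diff = np.asarray(device, float) - np.asarray(reference, float)
          bias = diff.mean()
          sd = diff.std(ddof=1)
          return bias, bias - 1.96 * sd, bias + 1.96 * sd   # bias and 95% limits of agreement

      if __name__ == "__main__":
          rng = np.random.default_rng(3)
          ref = rng.uniform(4.0, 11.0, 40)           # reference WBC, 10^9 cells/L
          dev = ref + rng.normal(0.1, 0.4, 40)       # device with small bias and noise
          bias, lo, hi = bland_altman(dev, ref)
          print(f"bias = {bias:.2f}, 95% LoA = [{lo:.2f}, {hi:.2f}]")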

  17. The bright side of snow cover effects on PV production - How to lower the seasonal mismatch between electricity supply and demand in a fully renewable Switzerland

    Science.gov (United States)

    Kahl, Annelen; Dujardin, Jérôme; Dupuis, Sonia; Lehning, Michael

    2017-04-01

    One of the major problems with solar PV in the context of fully renewable electricity production at mid-latitudes is the trend of higher production in summer and lower production in winter. This trend is most often exactly opposite to demand patterns, causing a seasonal mismatch that requires extensive balancing power from other production sources or large storage capacities. What possibilities do we have to bring PV production into closer correlation with demand? This question motivated our research and, in response, we investigated the effects of placing PV panels at different tilt angles in regions with extensive snow cover to increase winter production from ground-reflected shortwave radiation. The aim of this project is therefore to quantify the effect of varying snow cover duration (SCD) and of panel tilt angle on the annual total production and on production during winter months when electricity is most needed. We chose Switzerland as an ideal test site because it has a wide range of snow cover conditions and a high potential for renewable electricity production, but the methods can be applied to other regions with comparable snow cover and irradiance conditions. Our analysis can be separated into two steps: 1. A systematic, GIS and satellite-based analysis for all of Switzerland: We use time series of satellite-derived irradiance and snow cover characteristics together with land surface cover types and elevation information to quantify the environmental conditions and to estimate potential production and ideal tilt angles. 2. A scenario-based analysis that contrasts the production patterns of different placement scenarios for PV panels in urban, rural and mountainous areas. We invoke a model of a fully renewable electricity system (including Switzerland's large hydropower system) at the national level to compute the electricity import and storage capacity that will be required to balance the remaining mismatch between production and demand to further illuminate

  18. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC fluxes between the ocean and atmosphere

    Directory of Open Access Journals (Sweden)

    S. J. Andrews

    2015-04-01

    Full Text Available The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater–air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP & T) unit coupled to a commercial thermal desorption and gas chromatograph mass spectrometer (TD-GC-MS). The AutoP & T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34–180 °C with Henry's law coefficients of 0.018 and greater (CH2I2; kHcc, dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.

  19. Fully Automated Treatment Planning for Head and Neck Radiotherapy using a Voxel-Based Dose Prediction and Dose Mimicking Method

    CERN Document Server

    McIntosh, Chris; McNiven, Andrea; Jaffray, David A; Purdie, Thomas G

    2016-01-01

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present an atlas-based approach which learns a dose prediction model for each patient (atlas) in a training database, and then learns to match novel patients to the most relevant atlases. The method creates a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces any requirement for specifying dose-volume objectives for conveying the goals of treatment planning. A probabilistic dose distribution is inferred from the most relevant atlases, and is scalarized using a conditional random field to determine the most likely spatial distribution of dose to yield a specific dose prior (histogram) for relevant regions of interest. Voxel-based dose mimicking then converts the predicted dose distribution to a deliverable treatment plan dose distribution. In this study, we ...

  20. Fully automated, high speed, tomographic phase object reconstruction using the transport of intensity equation in transmission and reflection configurations.

    Science.gov (United States)

    Nguyen, Thanh; Nehmetallah, George; Tran, Dat; Darudi, Ahmad; Soltani, Peyman

    2015-12-10

    While traditional transport of intensity equation (TIE)-based phase retrieval of a phase object is performed through axial translation of the CCD, in this work a tunable lens TIE is employed in both transmission and reflection configurations. These configurations are extended to a 360° tomographic 3D reconstruction through multiple illuminations from different angles by a custom-fabricated rotating assembly of the phase object. Synchronization circuitry is developed to control the CCD camera and the Arduino board, which in turn controls the tunable lens and the stepper motor to automate the tomographic reconstruction process. Finally, a MATLAB-based, user-friendly graphical user interface is developed to control the whole system and perform tomographic reconstruction using both multiplicative and inverse Radon-based techniques.
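
    The reconstruction software described above is MATLAB-based and is not reproduced here; the sketch below shows an equivalent filtered back-projection step in Python with scikit-image, using a synthetic phantom as a stand-in for the retrieved phase projections.

      # Equivalent sketch of the inverse-Radon step using scikit-image, on a
      # synthetic phantom rather than measured phase projections.
      import numpy as np
      from skimage.data import shepp_logan_phantom
      from skimage.transform import radon, iradon, resize

      phantom = resize(shepp_logan_phantom(), (128, 128))    # stand-in phase object
      theta = np.linspace(0.0, 180.0, 180, endpoint=False)   # projection angles (degrees)

      sinogram = radon(phantom, theta=theta)                 # simulate projections
      reconstruction = iradon(sinogram, theta=theta)         # filtered back-projection

      rms_error = np.sqrt(np.mean((reconstruction - phantom) ** 2))
      print("RMS reconstruction error:", round(float(rms_error), 4))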

  1. Development of a Fully Automated Guided Wave System for In-Process Cure Monitoring of CFRP Composite Laminates

    Science.gov (United States)

    Hudson, Tyler B.; Hou, Tan-Hung; Grimsley, Brian W.; Yaun, Fuh-Gwo

    2016-01-01

    A guided wave-based in-process cure monitoring technique for carbon fiber reinforced polymer (CFRP) composites was investigated at NASA Langley Research Center. A key cure transition point (vitrification) was identified and the degree of cure was monitored using metrics such as amplitude and time of arrival (TOA) of guided waves. Using an automated system preliminarily developed in this work, high-temperature piezoelectric transducers were utilized to interrogate a twenty-four ply unidirectional composite panel fabricated from Hexcel (Registered Trademark) IM7/8552 prepreg during cure. It was shown that the amplitude of the guided wave increased sharply around vitrification and the TOA curve possessed an inverse relationship with degree of cure. The work is a first step in demonstrating the feasibility of transitioning the technique to perform in-process cure monitoring in an autoclave, defect detection during cure, and ultimately a closed-loop process control to maximize composite part quality and consistency.
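
    The abstract does not specify the exact TOA algorithm; a common choice for guided-wave signals is the lag of the peak cross-correlation between the excitation tone burst and the received waveform. The sketch below demonstrates that estimate on synthetic data; all signal parameters are assumed.

      # Cross-correlation TOA estimate on a synthetic tone burst (assumed values).
      import numpy as np
      from scipy.signal import correlate, correlation_lags

      fs = 5e6                                     # sample rate, Hz (assumed)
      t = np.arange(0, 200e-6, 1 / fs)
      burst = np.sin(2 * np.pi * 300e3 * t) * np.exp(-((t - 10e-6) / 5e-6) ** 2)

      true_delay = 42e-6                           # simulated propagation delay
      received = 0.3 * np.interp(t - true_delay, t, burst, left=0.0)
      received += 0.01 * np.random.default_rng(7).normal(size=t.size)

      xcorr = correlate(received, burst, mode="full")
      lags = correlation_lags(received.size, burst.size, mode="full")
      toa = lags[np.argmax(xcorr)] / fs
      print(f"estimated TOA: {toa * 1e6:.1f} us (true {true_delay * 1e6:.1f} us)")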

  2. Cooperative Control and Active Interfaces for Vehicle Assistance and Automation

    OpenAIRE

    Flemisch, Frank; Kelsch, Johann; Löper, Christan; Schieben, Anna; Schindler, Julian; Heesen, Matthias

    2008-01-01

    Enabled by scientific, technological and societal progress, and pulled by human demands, more and more aspects of our life can be assisted or automated by technical artefacts. One example is the transportation domain, where commercial aircraft in the sky fly highly automated most of the time and where, on the roads, a gradual revolution is taking place towards assisted, highly automated or even fully automated cars and trucks. Automobiles and mobility are changing gradually t...

  3. Comparison of Cobas 6500 and Iris IQ200 fully-automated urine analyzers to manual urine microscopy.

    Science.gov (United States)

    Bakan, Ebubekir; Ozturk, Nurinnisa; Baygutalp, Nurcan Kilic; Polat, Elif; Akpinar, Kadriye; Dorman, Emrullah; Polat, Harun; Bakan, Nuri

    2016-10-15

    Urine screening is achieved by either automated or manual microscopic analysis. The aim of the study was to compare Cobas 6500 and Iris IQ200 urine analyzers, and manual urine microscopic analysis. A total of 540 urine samples sent to the laboratory for chemical and sediment analysis were analyzed on Cobas 6500 and Iris IQ200 within 1 hour from sampling. One hundred and fifty-three samples were found to have pathological sediment results and were subjected to manual microscopic analysis performed by laboratory staff blinded to the study. Spearman's and Gamma statistics were used for correlation analyses, and the McNemar test for the comparison of the two automated analyzers. The comparison of Cobas u701 to the manual method yielded the following regression equations: y = - 0.12 (95% CI: - 1.09 to 0.67) + 0.78 (95% CI: 0.65 to 0.95) x for WBC and y = 0.06 (95% CI: - 0.09 to 0.25) + 0.66 (95% CI: 0.57 to 0.73) x for RBC. The comparison of IQ200 Elite to the manual method yielded the following equations: y = 0.03 (95% CI: - 1.00 to 1.00) + 0.88 (95% CI: 0.66 to 1.00) x for WBC and y = - 0.22 (95% CI: - 0.80 to 0.20) + 0.40 (95% CI: 0.32 to 0.50) x for RBC. IQ200 Elite compared to Cobas u701 yielded the following equations: y = - 0.95 (95% CI: - 2.13 to 0.11) + 1.25 (95% CI: 1.08 to 1.44) x for WBC and y = - 1.20 (95% CI: - 1.80 to -0.30) + 0.80 (95% CI: 0.55 to 1.00) x for RBC. The two analyzers showed similar performances and good compatibility with manual microscopy. However, they are still inadequate in the determination of WBC, RBC, and EC in highly pathological samples. Thus, confirmation by manual microscopic analysis may be useful.

  4. Fully Automated On-Chip Imaging Flow Cytometry System with Disposable Contamination-Free Plastic Re-Cultivation Chip

    Directory of Open Access Journals (Sweden)

    Tomoyuki Kaneko

    2011-06-01

    Full Text Available We have developed a novel imaging cytometry system using a poly(methyl methacrylate) (PMMA)-based microfluidic chip. The system was contamination-free, because sample suspensions came into contact only with the flammable PMMA chip and with no other component of the system. The transparency and low fluorescence of PMMA were suitable for microscopic imaging of cells flowing through microchannels on the chip. Sample particles flowing through microchannels on the chip were discriminated by an image-recognition unit with a high-speed camera in real time at a rate of 200 events/s, e.g., microparticles 2.5 μm and 3.0 μm in diameter were differentiated with an error rate of less than 2%. Desired cells were separated automatically from other cells by electrophoretic or dielectrophoretic force one by one with a separation efficiency of 90%. Cells in suspension with fluorescent dye were separated using the same kind of microfluidic chip. A sample of 5 μL containing 1 × 10^6 particles/mL was processed within 40 min. Separated cells could be cultured on the microfluidic chip without contamination. The whole operation of sample handling was automated using a 3D micropipetting system. These results showed that the novel imaging flow cytometry system is practically applicable for biological research and clinical diagnostics.

  5. [Condition setting for the measurement of blood coagulation factor XIII activity using a fully automated blood coagulation analyzer, COAGTRON-350].

    Science.gov (United States)

    Kanno, Nobuko; Kaneko, Makoto; Tanabe, Kumiko; Jyona, Masahiro; Yokota, Hiromitsu; Yatomi, Yutaka

    2012-12-01

    The automated laboratory analyzer COAGTRON-350 (Trinity Biotech) is used for routine and specific coagulation testing, detecting fibrin formation by either mechanical (ball method) or photo-optical principles and offering chromogenic kinetic enzyme analysis and immunoturbidimetric detection in one benchtop unit. In this study, we demonstrated and established a parameter for the measurement of factor XIII (FXIII) activity using Berichrom FXIII reagent and the COAGTRON-350 analyzer. The usual protocol used for this reagent, based on the handling method, was slightly modified for this device. The analysis showed that the fundamental performance of FXIII activity measurement under our condition setting was favorable in terms of reproducibility, linearity, and correlation with other assays. Since FXIII is the key enzyme that plays important roles in hemostasis by stabilizing fibrin formation, the measurement of FXIII is essential for the diagnosis of bleeding disorders. Therefore, FXIII activity assessment as well as routine coagulation testing can be conducted simultaneously on one instrument, which is useful in coagulopathy assessment.

  6. Fully automated macular pathology detection in retina optical coherence tomography images using sparse coding and dictionary learning

    Science.gov (United States)

    Sun, Yankui; Li, Shan; Sun, Zhongyang

    2017-01-01

    We propose a framework for automated detection of dry age-related macular degeneration (AMD) and diabetic macular edema (DME) from retina optical coherence tomography (OCT) images, based on sparse coding and dictionary learning. The study aims to improve the classification performance of state-of-the-art methods. First, our method presents a general approach to automatically align and crop retina regions; then it obtains global representations of images by using sparse coding and a spatial pyramid; finally, a multiclass linear support vector machine classifier is employed for classification. We apply two datasets for validating our algorithm: the Duke spectral domain OCT (SD-OCT) dataset, consisting of volumetric scans acquired from 45 subjects (15 normal subjects, 15 AMD patients, and 15 DME patients); and a clinical SD-OCT dataset, consisting of 678 OCT retina scans acquired from clinics in Beijing (168, 297, and 213 OCT images for AMD, DME, and normal retinas, respectively). For the former dataset, our classifier correctly identifies 100%, 100%, and 93.33% of the volumes with DME, AMD, and normal subjects, respectively, and thus performs much better than the conventional method; for the latter dataset, our classifier leads to a correct classification rate of 99.67%, 99.67%, and 100.00% for DME, AMD, and normal images, respectively.
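
    The OCT datasets and feature extraction are not reproduced here; the sketch below only illustrates the classification stage described above (sparse coding of descriptors against a learned dictionary, pooling per image, multiclass linear SVM) on random stand-in vectors.

      # Classification-stage sketch on random stand-in descriptors, not OCT data.
      import numpy as np
      from sklearn.decomposition import MiniBatchDictionaryLearning
      from sklearn.svm import LinearSVC

      rng = np.random.default_rng(0)
      n_images, patches_per_image, dim, n_atoms = 60, 50, 64, 32

      descriptors = rng.normal(size=(n_images, patches_per_image, dim))
      labels = rng.integers(0, 3, size=n_images)          # 0=normal, 1=AMD, 2=DME

      dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                         transform_algorithm="lasso_lars",
                                         random_state=0)
      dico.fit(descriptors.reshape(-1, dim))

      # Max-pool the sparse codes over each image's patches (one pyramid level).
      codes = dico.transform(descriptors.reshape(-1, dim))
      features = codes.reshape(n_images, patches_per_image, n_atoms).max(axis=1)

      clf = LinearSVC(dual=False).fit(features, labels)
      print("training accuracy:", round(clf.score(features, labels), 2))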

  7. Improved synthesis of [(18)F]FLETT via a fully automated vacuum distillation method for [(18)F]2-fluoroethyl azide purification.

    Science.gov (United States)

    Ackermann, Uwe; Plougastel, Lucie; Goh, Yit Wooi; Yeoh, Shinn Dee; Scott, Andrew M

    2014-12-01

    The synthesis of [(18)F]2-fluoroethyl azide and its subsequent click reaction with 5-ethynyl-2'-deoxyuridine (EDU) to form [(18)F]FLETT was performed using an iPhase FlexLab module. The implementation of a vacuum distillation method afforded [(18)F]2-fluoroethyl azide in 87±5.3% radiochemical yield. The use of Cu(CH3CN)4PF6 and TBTA as catalyst enabled us to fully automate the [(18)F]FLETT synthesis without the need for the operator to enter the radiation field. [(18)F]FLETT was produced in higher overall yield (41.3±6.5%) and shorter synthesis time (67min) than with our previously reported manual method (32.5±2.5% in 130min).

  8. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework.

    Science.gov (United States)

    Kesner, Adam L; Schleyer, Paul J; Büther, Florian; Walter, Martin A; Schäfers, Klaus P; Koo, Phillip J

    2014-12-01

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet PET remains for practical considerations a modality vulnerable to motion-induced image degradation. Respiratory motion control is not employed in routine clinical operations. In this article, we take an opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form an underpinning for what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big picture challenges and goals.

  9. A fully automated two-step synthesis of an (18)F-labelled tyrosine kinase inhibitor for EGFR kinase activity imaging in tumors

    Energy Technology Data Exchange (ETDEWEB)

    Kobus, D.; Giesen, Y.; Ullrich, R.; Backes, H. [Max Planck Institute for Neurological Research with Klaus-Joachim-Zuelch Laboratories of the Max Planck Society and the Faculty of Medicine of the University of Cologne, Cologne (Germany); Neumaier, B. [Max Planck Institute for Neurological Research with Klaus-Joachim-Zuelch Laboratories of the Max Planck Society and the Faculty of Medicine of the University of Cologne, Cologne (Germany)], E-mail: bernd.neumaier@nf.mpg.de

    2009-11-15

    Radiolabelled epidermal growth factor receptor (EGFR) tyrosine kinase (TK) inhibitors potentially facilitate the assessment of EGFR overexpression in tumors. Since elaborate multi-step radiosyntheses are required for (18)F-labelling of EGFR-specific anilinoquinazolines we report on the development of a two-step click labelling approach that was adapted to a fully automated synthesis module. 6-(4-N,N-Dimethylaminocrotonyl)amido-4-(3-chloro-4-fluoro)phenylamino-7-{3-[4-(2-[(18)F]fluoroethyl)-2,3,4-triazol-1-yl]propoxy}quinazoline ([(18)F]6) was synthesized via Huisgen 1,3-dipolar cycloaddition between 2-[(18)F]fluoroethylazide ([(18)F]4) and the alkyne modified anilinoquinazoline precursor 5. PET images of PC9 tumor xenograft using the novel biomarker showed promising results to visualize EGFR overexpression.

  10. FULLY AUTOMATED GIS-BASED INDIVIDUAL TREE CROWN DELINEATION BASED ON CURVATURE VALUES FROM A LIDAR DERIVED CANOPY HEIGHT MODEL IN A CONIFEROUS PLANTATION

    Directory of Open Access Journals (Sweden)

    R. J. L. Argamosa

    2016-06-01

    Full Text Available The generation of a high resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crowns by means of a fully-automated method using the CHM’s curvature through its slope. The local maxima are obtained by taking the maximum raster value in a 3 m x 3 m cell. These values are assumed to be tree tops and therefore considered as individual trees. Based on these assumptions, Thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. The absence of field data on tree crown dimensions required accurate visual assessment after the delineated tree crown polygons were superimposed on the hillshaded CHM.
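
    The exact GIS workflow is not reproduced in this record; the Python sketch below shows the tree-top detection step on which the delineation is seeded, namely local maxima of the CHM in a 3 x 3 cell window, under the assumption of a 1 m raster grid and a minimum tree height.

      # Local-maxima tree-top detection on a stand-in CHM raster (assumptions:
      # 1 m grid so a 3x3 window spans 3 m x 3 m, and a 5 m minimum tree height).
      import numpy as np
      from scipy.ndimage import maximum_filter

      def tree_tops(chm, window=3, min_height=5.0):
          """Boolean mask of cells that are the maximum of their neighbourhood."""
          neighbourhood_max = maximum_filter(chm, size=window, mode="nearest")
          return (chm == neighbourhood_max) & (chm >= min_height)

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          chm = rng.uniform(0, 2, (100, 100))              # synthetic CHM, metres
          for r, c in rng.integers(5, 95, size=(40, 2)):   # plant synthetic crowns
              chm[r - 2:r + 3, c - 2:c + 3] += 10.0 * rng.uniform(0.8, 1.2)
              chm[r, c] += 2.0                             # crown apex
          rows, cols = np.nonzero(tree_tops(chm))
          print("detected tree tops:", rows.size)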

  11. Platform-Independent Cirrus and Spectralis Thickness Measurements in Eyes with Diabetic Macular Edema Using Fully Automated Software.

    Science.gov (United States)

    Willoughby, Alex S; Chiu, Stephanie J; Silverman, Rachel K; Farsiu, Sina; Bailey, Clare; Wiley, Henry E; Ferris, Frederick L; Jaffe, Glenn J

    2017-02-01

    We determine whether the automated segmentation software, Duke Optical Coherence Tomography Retinal Analysis Program (DOCTRAP), can measure, in a platform-independent manner, retinal thickness on Cirrus and Spectralis spectral domain optical coherence tomography (SD-OCT) images in eyes with diabetic macular edema (DME) under treatment in a clinical trial. Automatic segmentation software was used to segment the internal limiting membrane (ILM), inner retinal pigment epithelium (RPE), and Bruch's membrane (BM) in SD-OCT images acquired by Cirrus and Spectralis commercial systems, from the same eye, on the same day during a clinical interventional DME trial. Mean retinal thickness differences were compared across commercial and DOCTRAP platforms using intraclass correlation (ICC) and Bland-Altman plots. The mean 1 mm central subfield thickness difference (standard error [SE]) comparing segmentation of Spectralis images with DOCTRAP versus HEYEX was 0.7 (0.3) μm (0.2 pixels). The corresponding values comparing segmentation of Cirrus images with DOCTRAP versus Cirrus software was 2.2 (0.7) μm. The mean 1 mm central subfield thickness difference (SE) comparing segmentation of Cirrus and Spectralis scan pairs with DOCTRAP using BM as the outer retinal boundary was -2.3 (0.9) μm compared to 2.8 (0.9) μm with inner RPE as the outer boundary. DOCTRAP segmentation of Cirrus and Spectralis images produces validated thickness measurements that are very similar to each other, and very similar to the values generated by the corresponding commercial software in eyes with treated DME. This software enables automatic total retinal thickness measurements across two OCT platforms, a process that is impractical to perform manually.

  12. SU-D-BRD-06: Creating a Safety Net for a Fully Automated, Script Driven Electronic Medical Record

    Energy Technology Data Exchange (ETDEWEB)

    Sheu, R; Ghafar, R; Powers, A; Green, S; Lo, Y [Mount Sinai Medical Center, New York, NY (United States)

    2015-06-15

    Purpose: Demonstrate the effectiveness of in-house software in ensuring EMR workflow efficiency and safety. Methods: A web-based dashboard system (WBDS) was developed to monitor clinical workflow in real time using web technology (WAMP) through ODBC (Open Database Connectivity). Within Mosaiq (Elekta Inc), operational workflow is driven and indicated by Quality Check Lists (QCLs), which are triggered by the automation software IQ Scripts (Elekta Inc); QCLs rely on user completion to propagate. The WBDS retrieves data directly from the Mosaiq SQL database and tracks clinical events in real time. For example, the necessity of a physics initial chart check can be determined by screening all patients on treatment who have received their first fraction and who have not yet had their first chart check. Monitoring similar “real” events with our in-house software creates a safety net, as its propagation does not rely on individual users' input. Results: The WBDS monitors the following: patient care workflow (initial consult to end of treatment), daily treatment consistency (scheduling, technique, charges), physics chart checks (initial, EOT, weekly), new starts, missing treatments (>3 warning/>5 fractions, action required), and machine overrides. The WBDS can be launched from any web browser, which allows the end user complete transparency and timely information. Since the creation of the dashboards, workflow interruptions due to accidental deletion or completion of QCLs were eliminated. Additionally, all physics chart checks were completed on time. Prompt notifications of treatment record inconsistency and machine overrides have decreased the amount of time between occurrence and execution of corrective action. Conclusion: Our clinical workflow relies primarily on QCLs and IQ Scripts; however, this functionality is not a panacea for safety and efficiency. The WBDS creates a more thorough system of checks to provide a safer and nearly error-free working environment.
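
    The Mosaiq schema is proprietary and not described in the abstract, so the sketch below uses placeholder table and column names throughout; it only illustrates the screening idea stated above (patients with a first treated fraction but no completed initial physics check), queried over ODBC as in the text.

      # Illustrative only: every table/column name is a placeholder, not the
      # real Mosaiq schema. Shows a dashboard-style screening query over ODBC.
      import pyodbc

      PLACEHOLDER_SQL = """
      SELECT p.patient_id, p.last_name, MIN(tx.tx_datetime) AS first_fraction
      FROM treatments AS tx                         -- placeholder table names
      JOIN patients AS p ON p.patient_id = tx.patient_id
      LEFT JOIN chart_checks AS cc
             ON cc.patient_id = p.patient_id AND cc.check_type = 'INITIAL'
      WHERE cc.patient_id IS NULL                   -- no initial check recorded yet
      GROUP BY p.patient_id, p.last_name
      """

      def patients_needing_initial_check(dsn="MosaiqDB"):   # hypothetical DSN name
          """Rows for an 'initial physics chart check pending' dashboard panel."""
          with pyodbc.connect(f"DSN={dsn}") as conn:
              return conn.cursor().execute(PLACEHOLDER_SQL).fetchall()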

  13. Platform-Independent Cirrus and Spectralis Thickness Measurements in Eyes with Diabetic Macular Edema Using Fully Automated Software

    Science.gov (United States)

    Willoughby, Alex S.; Chiu, Stephanie J.; Silverman, Rachel K.; Farsiu, Sina; Bailey, Clare; Wiley, Henry E.; Ferris, Frederick L.; Jaffe, Glenn J.

    2017-01-01

    Purpose We determine whether the automated segmentation software, Duke Optical Coherence Tomography Retinal Analysis Program (DOCTRAP), can measure, in a platform-independent manner, retinal thickness on Cirrus and Spectralis spectral domain optical coherence tomography (SD-OCT) images in eyes with diabetic macular edema (DME) under treatment in a clinical trial. Methods Automatic segmentation software was used to segment the internal limiting membrane (ILM), inner retinal pigment epithelium (RPE), and Bruch's membrane (BM) in SD-OCT images acquired by Cirrus and Spectralis commercial systems, from the same eye, on the same day during a clinical interventional DME trial. Mean retinal thickness differences were compared across commercial and DOCTRAP platforms using intraclass correlation (ICC) and Bland-Altman plots. Results The mean 1 mm central subfield thickness difference (standard error [SE]) comparing segmentation of Spectralis images with DOCTRAP versus HEYEX was 0.7 (0.3) μm (0.2 pixels). The corresponding values comparing segmentation of Cirrus images with DOCTRAP versus Cirrus software was 2.2 (0.7) μm. The mean 1 mm central subfield thickness difference (SE) comparing segmentation of Cirrus and Spectralis scan pairs with DOCTRAP using BM as the outer retinal boundary was −2.3 (0.9) μm compared to 2.8 (0.9) μm with inner RPE as the outer boundary. Conclusions DOCTRAP segmentation of Cirrus and Spectralis images produces validated thickness measurements that are very similar to each other, and very similar to the values generated by the corresponding commercial software in eyes with treated DME. Translational Relevance This software enables automatic total retinal thickness measurements across two OCT platforms, a process that is impractical to perform manually. PMID:28180033

  14. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Wu Binbin, E-mail: binbin.wu@gunet.georgetown.edu [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); McNutt, Todd [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Zahurak, Marianna [Department of Oncology Biostatistics, Johns Hopkins University, Baltimore, Maryland (United States); Simari, Patricio [Autodesk Research, Toronto, ON (Canada); Pang, Dalong [Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); Taylor, Russell [Department of Computer Science, Johns Hopkins University, Baltimore, Maryland (United States); Sanguineti, Giuseppe [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States)

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.
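
    The trial data are not reproduced here; the snippet below sketches the type of GEE model described above (paired AP/CP dose metrics clustered within patients, exchangeable working correlation) with statsmodels on synthetic numbers.

      # GEE sketch on synthetic paired plan metrics, not the trial data.
      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(4)
      n_patients = 40
      patient = np.repeat(np.arange(n_patients), 2)
      plan_is_auto = np.tile([0, 1], n_patients)               # 0 = CP, 1 = AP
      baseline = rng.normal(30.0, 5.0, n_patients)             # per-patient mean dose
      dose = baseline[patient] - 1.2 * plan_is_auto + rng.normal(0, 1.0, 2 * n_patients)

      data = pd.DataFrame({"dose": dose, "auto": plan_is_auto, "patient": patient})
      model = sm.GEE.from_formula("dose ~ auto", groups="patient", data=data,
                                  family=sm.families.Gaussian(),
                                  cov_struct=sm.cov_struct.Exchangeable())
      print(model.fit().summary().tables[1])   # 'auto' coefficient ~ dose change in APs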

  15. Multi-center evaluation of the novel fully-automated PCR-based Idylla™ BRAF Mutation Test on formalin-fixed paraffin-embedded tissue of malignant melanoma.

    Science.gov (United States)

    Melchior, Linea; Grauslund, Morten; Bellosillo, Beatriz; Montagut, Clara; Torres, Erica; Moragón, Ester; Micalessi, Isabel; Frans, Johan; Noten, Veerle; Bourgain, Claire; Vriesema, Renske; van der Geize, Robert; Cokelaere, Kristof; Vercooren, Nancy; Crul, Katrien; Rüdiger, Thomas; Buchmüller, Diana; Reijans, Martin; Jans, Caroline

    2015-12-01

    The advent of BRAF-targeted therapies led to increased survival in patients with metastatic melanomas harboring a BRAF V600 mutation (implicated in 46-48% of malignant melanomas). The Idylla(™) System (Idylla(™)), i.e., the real-time-PCR-based Idylla(™) BRAF Mutation Test performed on the fully-automated Idylla(™) platform, enables detection of the most frequent BRAF V600 mutations (V600E/E2/D, V600K/R/M) in tumor material within approximately 90 min and with a 1% detection limit. Idylla(™) performance was determined in a multi-center study by analyzing the BRAF mutational status of 148 archival formalin-fixed paraffin-embedded (FFPE) tumor samples from malignant melanoma patients, and comparing Idylla(™) results with assessments made by commercial or in-house routine diagnostic methods. Of the 148 samples analyzed, Idylla(™) initially recorded 7 insufficient DNA input calls and 15 results discordant with routine method results. Further analysis showed that the quality of 8 samples was insufficient for Idylla(™) testing, 1 sample had an invalid routine test result, and Idylla(™) results were confirmed in 10 samples. Hence, Idylla(™) identified all mutations present, including 7 not identified by routine methods. Idylla(™) enables fully automated BRAF V600 testing directly on FFPE tumor tissue with increased sensitivity, ease of use, and much shorter turnaround time compared to existing diagnostic tests, making it a tool for rapid, simple and highly reliable analysis of therapeutically relevant BRAF mutations, in particular for diagnostic units without molecular expertise and infrastructure.

  16. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo [Nishi-Kobe Medical Center (Japan); Yonekura, Yoshiharu [Fukui Medical Univ., Matsuoka (Japan); Konishi, Junji [Kyoto Univ. (Japan). Graduate School of Medicine

    2003-03-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer (99mTc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T2-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using 99mTc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  17. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    Science.gov (United States)

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  18. A fully automated meltwater monitoring and collection system for spatially distributed isotope analysis in snowmelt-dominated catchments

    Science.gov (United States)

    Rücker, Andrea; Boss, Stefan; Von Freyberg, Jana; Zappa, Massimiliano; Kirchner, James

    2016-04-01

    In many mountainous catchments the seasonal snowpack stores a significant volume of water, which is released as streamflow during the melting period. The predicted change in future climate will bring new challenges in water resource management in snow-dominated headwater catchments and their receiving lowlands. To improve predictions of hydrologic extreme events, particularly summer droughts, it is important to characterize the relationship between winter snowpack and summer (low) flows in such areas (e.g., Godsey et al., 2014). In this context, stable water isotopes (18O, 2H) are a powerful tool for fingerprinting the sources of streamflow and tracing water flow pathways. For this reason, we have established an isotope sampling network in the Alptal catchment (46.4 km2) in central Switzerland as part of the SREP-Drought project (Snow Resources and the Early Prediction of hydrological DROUGHT in mountainous streams). Samples of precipitation (daily), snow cores (weekly) and runoff (daily) are analyzed for their isotopic signature in a regular cycle. Precipitation is also sampled along a horizontal transect at the valley bottom, and along an elevational transect. Additionally, the analysis of snow meltwater is of importance. As the sample collection of snow meltwater in mountainous terrain is often impractical, we have developed a fully automatic snow lysimeter system, which measures meltwater volume and collects samples for isotope analysis at daily intervals. The system consists of three lysimeters built from Decagon ECRN-100 high-resolution rain gauges as standard components, which allow monitoring of meltwater flow. Each lysimeter leads the meltwater into a 10-liter container that is automatically sampled and then emptied daily. These water samples are replaced regularly and afterwards analyzed for their isotopic composition in the lab. Snow melt events as well as system status can be monitored in real time. In our presentation we describe the automatic snow lysimeter

  19. Fully automated synthesis of (phospho)peptide arrays in microtiter plate wells provides efficient access to protein tyrosine kinase characterization

    Directory of Open Access Journals (Sweden)

    Goldstein David J

    2005-01-01

    Full Text Available Abstract Background Synthetic peptides have played a useful role in studies of protein kinase substrates and interaction domains. Synthetic peptide arrays and libraries, in particular, have accelerated the process. Several factors have hindered or limited the applicability of various techniques, such as the need for deconvolution of combinatorial libraries, the inability or impracticality of achieving full automation using two-dimensional or pin solid phases, the lack of convenient interfacing with standard analytical platforms, or the difficulty of compartmentalization of a planar surface when contact between assay components needs to be avoided. This paper describes a process for synthesis of peptides and phosphopeptides on microtiter plate wells that overcomes previous limitations and demonstrates utility in determination of the epitope of an autophosphorylation site phospho-motif antibody and utility in substrate utilization assays of the protein tyrosine kinase, p60c-src. Results The overall reproducibility of phospho-peptide synthesis and the multiplexed EGF receptor (EGFR) autophosphorylation site (pY1173) antibody ELISA (9H2) was within 5.5 to 8.0%. Mass spectrometric analyses of the released (phospho)peptides showed homogeneous peaks of the expected molecular weights. An overlapping peptide array of the complete EGFR cytoplasmic sequence revealed a high redundancy of 9H2 reactive sites. The eight reactive phosphopeptides were structurally related and, interestingly, the most conserved antibody-reactive peptide motif coincided with a subset of other known EGFR autophosphorylation and SH2 binding motifs and an EGFR optimal substrate motif. Finally, peptides based on known substrate specificities of c-src and related enzymes were synthesized in microtiter plate array format and were phosphorylated by c-Src with the predicted specificities. The level of phosphorylation was proportional to c-Src concentration with sensitivities below 0.1 Units of

  20. Assessment of pain response in capsaicin-induced dynamic mechanical allodynia using a novel and fully automated brushing device.

    Science.gov (United States)

    du Jardin, Kristian Gaarn; Gregersen, Lise Skøtt; Røsland, Turid; Uggerhøj, Kathrine Hebo; Petersen, Lars Jelstrup; Arendt-Nielsen, Lars; Gazerani, Parisa

    2013-01-01

    Dynamic mechanical allodynia is traditionally induced by manual brushing of the skin. Brushing force and speed have been shown to influence the intensity of brush-evoked pain. There are still limited data available with respect to the optimal stroke number, length, force, angle and speed. Therefore, an automated brushing device (ABD) was developed, for which brushing angle and speed could be controlled to enable quantitative assessment of dynamic mechanical allodynia. To compare the ABD with manual brushing using capsaicin-induced allodynia, and to investigate the role of stroke angle and speed on pain intensity. Experimental dynamic mechanical allodynia was induced by an intradermal injection of capsaicin (100 µg) into the volar forearm of 12 healthy, male volunteers. Dynamic mechanical allodynia was rated on a 10 cm visual analogue scale (VAS) after each set of strokes at angles of 30°, 60° and 90° with speeds of 17 mm⁄s, 21 mm⁄s and 25 mm⁄s for each angle. A two-way ANOVA with repeated measures was performed to assess the influence of brushing parameters. To evaluate test-retest reliability, Bland-Altman 95% limits of agreement, including a coefficient of repeatability and an intraclass correlation coefficient (ICC), were determined. The angle and speed exhibited a significant impact on pain intensity (P<0.001 and P<0.015, respectively). Post hoc analysis showed that the highest pain intensity was recorded with an angle of 30° regardless of brushing speed. The ABD demonstrated superior test-retest reliability (coefficient of repeatability = 1.9 VAS; ICC=0.91) compared with manual brushing (coefficient of repeatability = 2.8 VAS; ICC=0.80; P<0.05). The most reliable combination of parameters (coefficient of repeatability = 1.3 VAS; ICC=0.97) was an angle of 60° and a speed of 21 mm⁄s. A controlled, automatic brushing method can be used for quantitative investigations of allodynic reactions, and is more reliable for quantitative assessment of dynamic
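
    As a side note, the agreement statistics reported above (Bland-Altman 95% limits of agreement and the coefficient of repeatability) reduce to simple summaries of the test-retest differences. A minimal sketch, assuming two sessions of hypothetical VAS ratings rather than the study's actual data:

```python
import numpy as np

# Hypothetical test-retest VAS ratings (0-10 cm) for one brushing condition.
session1 = np.array([3.2, 4.1, 5.0, 2.8, 6.3, 4.7, 3.9, 5.5, 4.2, 3.1, 5.8, 4.4])
session2 = np.array([3.5, 3.8, 5.4, 2.6, 6.0, 5.1, 4.2, 5.2, 4.0, 3.4, 5.5, 4.6])

diff = session1 - session2
bias = diff.mean()                 # mean difference between sessions
sd_diff = diff.std(ddof=1)         # SD of the differences
cr = 1.96 * sd_diff                # coefficient of repeatability (1.96 x SD of differences; definitions vary)
loa = (bias - cr, bias + cr)       # Bland-Altman 95% limits of agreement

print(f"bias={bias:.2f} VAS, CR={cr:.2f} VAS, LoA={loa[0]:.2f} to {loa[1]:.2f}")
```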

  1. Fully Automated Surveillance of Healthcare-Associated Infections with MONI-ICU: A Breakthrough in Clinical Infection Surveillance.

    Science.gov (United States)

    Blacky, A; Mandl, H; Adlassnig, K-P; Koller, W

    2011-01-01

    Expert surveillance of healthcare-associated infections (HCAIs) is a key parameter for good clinical practice, especially in intensive care medicine. Assessment of clinical entities such as HCAIs is a time-consuming task for highly trained experts. Such are neither available nor affordable in sufficient numbers for continuous surveillance services. Intelligent information technology (IT) tools are in urgent demand. MONI-ICU (monitoring of nosocomial infections in intensive care units (ICUs)) has been developed methodologically and practically in a stepwise manner and is a reliable surveillance IT tool for clinical experts. It uses information from the patient data management systems in the ICUs, the laboratory information system, and the administrative hospital information system of the Vienna General Hospital as well as medical expert knowledge on infection criteria applied in a multilevel approach which includes fuzzy logic rules. We describe the use of this system in clinical routine and compare the results generated automatically by MONI-ICU with those generated in parallel by trained surveillance staff using patient chart reviews and other available information ("gold standard"). A total of 99 ICU patient admissions representing 1007 patient days were analyzed. MONI-ICU identified correctly the presence of an HCAI condition in 28/31 cases (sensitivity, 90.3%) and their absence in 68/68 of the non-HCAI cases (specificity, 100%), the latter meaning that MONI-ICU produced no "false alarms". The 3 missed cases were due to correctable technical errors. The time taken for conventional surveillance at the 52 ward visits was 82.5 hours. MONI-ICU analysis of the same patient cases, including careful review of the generated results, required only 12.5 hours (15.2%). Provided structured and sufficient information on clinical findings is online available, MONI-ICU provides an almost real-time view of clinical indicators for HCAI - at the cost of almost no additional time
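
    The headline accuracy figures follow directly from the case counts given above; as a quick arithmetic check (only the counts reported in the abstract are used):

```python
# MONI-ICU vs. chart-review gold standard, per patient admission.
tp, fn = 28, 3   # HCAI present: correctly identified / missed (correctable technical errors)
tn, fp = 68, 0   # HCAI absent: correctly identified / false alarms

sensitivity = tp / (tp + fn)   # 28/31 ≈ 90.3%
specificity = tn / (tn + fp)   # 68/68 = 100%
time_fraction = 12.5 / 82.5    # MONI-ICU review time vs. conventional surveillance ≈ 15.2%

print(f"sensitivity {sensitivity:.1%}, specificity {specificity:.1%}, time {time_fraction:.1%}")
```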

  2. Validation of a fully automated high throughput liquid chromatographic/tandem mass spectrometric method for roxithromycin quantification in human plasma. Application to a bioequivalence study.

    Science.gov (United States)

    Kousoulos, Constantinos; Tsatsou, Georgia; Dotsikas, Yannis; Apostolou, Constantinos; Loukas, Yannis L

    2008-05-01

    A fully automated high-throughput liquid chromatography/tandem mass spectrometry (LC-MS/MS) method was developed for the determination of roxithromycin in human plasma. The plasma samples were treated by liquid-liquid extraction (LLE) in 2.2 mL 96-deep-well plates. Roxithromycin and the internal standard clarithromycin were extracted from 100 microL of human plasma by LLE, using methyl t-butyl ether as the organic solvent. All liquid transfer steps were performed automatically using robotic liquid handling workstations. After vortexing, centrifugation and freezing, the supernatant organic solvent was evaporated and reconstituted. Sample analysis was performed by reversed-phase LC-MS/MS, with positive ion electrospray ionization, using multiple-reaction monitoring. The method had a very short chromatographic run time of 1.6 min. The calibration curve was linear for the range of concentrations 50.0-20.0x10(3) ng mL(-1). The proposed method was fully validated and it was proven to be selective, accurate, precise, reproducible and suitable for the determination of roxithromycin in human plasma. Therefore, it was applied to the rapid and reliable determination of roxithromycin in a bioequivalence study after per os administration of 300 mg tablet formulations of roxithromycin.
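
    Calibration over such a wide range (50.0 to 20.0x10(3) ng mL(-1)) is typically handled by regressing the analyte/internal-standard peak-area ratio on concentration, often with 1/x weighting. A generic sketch with made-up responses, not the validated method's own data:

```python
import numpy as np

# Nominal calibration levels (ng/mL) and hypothetical roxithromycin/clarithromycin
# peak-area ratios; the real assay would use measured responses.
conc = np.array([50.0, 150.0, 500.0, 1500.0, 5000.0, 10000.0, 20000.0])
ratio = np.array([0.012, 0.037, 0.121, 0.360, 1.22, 2.41, 4.85])

# 1/x weighting: np.polyfit weights the unsquared residuals, so pass 1/sqrt(x).
slope, intercept = np.polyfit(conc, ratio, 1, w=1.0 / np.sqrt(conc))

# Back-calculate an unknown plasma sample from its measured area ratio.
unknown_ratio = 0.75
print(f"estimated concentration ≈ {(unknown_ratio - intercept) / slope:.0f} ng/mL")
```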

  3. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    Science.gov (United States)

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a significant economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were by the use of the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase of specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents.

  4. Opportunities for Open Automated Demand Response in Wastewater Treatment Facilities in California - Phase II Report. San Luis Rey Wastewater Treatment Plant Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, Lisa; Lekov, Alex; McKane, Aimee; Piette, Mary Ann

    2010-08-20

    This case study enhances the understanding of open automated demand response opportunities in municipal wastewater treatment facilities. The report summarizes the findings of a 100-day submetering project at the San Luis Rey Wastewater Treatment Plant, a municipal wastewater treatment facility in Oceanside, California. The report reveals that key energy-intensive equipment such as pumps and centrifuges can be targeted for large load reductions. Demand response tests on the effluent pumps resulted in a 300 kW load reduction and tests on centrifuges resulted in a 40 kW load reduction. Although tests on the facility's blowers resulted in peak-period load reductions of 78 kW, sharp, short-lived increases in the turbidity of the wastewater effluent were experienced within 24 hours of the test. The results of these tests, which were conducted on blowers without variable speed drive capability, would not be acceptable and warrant further study. This study finds that wastewater treatment facilities have significant open automated demand response potential. However, limiting factors to implementing demand response are the reaction of effluent turbidity to reduced aeration load, along with the cogeneration capabilities of municipal facilities, including existing power purchase agreements and utility receptiveness to purchasing electricity from cogeneration facilities.

  5. [Development and Validation of a Fully Automated, Experimental Set-Up for Ex-Vivo Burst Pressure Testing after Surgical Vessel Closure].

    Science.gov (United States)

    Wallimann, Herbert; Menges, Pia; Hausen, Bernard; Linder, Albert

    2017-06-20

    Background A growing number of operations are performed using minimally invasive techniques. Therefore, a lot of new requirements must be met by the staplers currently available. At the present time, the most widely used methods of minimally invasive vascular occlusion involve high-frequency energy, clips, and staplers. The most important quality parameter is burst pressure, which is measured with a variety of experimental set-ups, all of which are subject to criticism. With this study, we want to introduce a fully automated vascular burst pressure measuring system that largely mimics physiological conditions. An important feature of this set-up is the detection of very early leakage from the staple line (FAIR Leakage = First Appearance of Leakage requiring Intervention). Material and Methods Burst pressure was measured in vessel segments of porcine common carotid arteries. For vascular occlusion, we used the stapler device Micro Cutter XCHANGE(®) by DexteraSurgical. Prior to closure, the vessel was filled to a pressure of 80 mmHg. The pressure was increased at a defined flow rate. Burst pressure was defined as staple line leakage requiring intervention. Results and Validation 30 staple lines were examined. The average burst pressure visually determined by two independent investigators was 515.8 mmHg ± 236.3 mmHg. Maximal burst pressure was 911 mmHg, and minimal burst pressure 80 mmHg. The average burst pressure detected electronically was 511.8 mmHg ± 239.1 mmHg. Statistically, there was a highly significant correlation of visually and electronically detected burst pressures. Conclusion This is the first experimental set-up for a systematic burst pressure test that is fully automated and therefore eliminates any bias related to the investigator. The experimental set-up with a defined intravascular pressure prior to closure and the use of a liquid with blood-like viscosity enabled us to largely mimic intraoperative conditions. Since burst

  6. A fully automated simultaneous single-stage separation of Sr, Pb, and Nd using DGA Resin for the isotopic analysis of marine sediments.

    Science.gov (United States)

    Retzmann, A; Zimmermann, T; Pröfrock, D; Prohaska, T; Irrgeher, J

    2017-07-04

    A novel, fast and reliable sample preparation procedure for the simultaneous separation of Sr, Pb, and Nd has been developed for subsequent isotope ratio analysis of sediment digests. The method applying a fully automated, low-pressure chromatographic system separates all three analytes in a single-stage extraction step using self-packed columns filled with DGA Resin. The fully automated set-up allows the unattended processing of three isotopic systems from one sediment digest every 2 h, offering high sample throughput of up to 12 samples per day and reducing substantially laboratory manpower as compared to conventional manual methods. The developed separation method was validated using the marine sediment GBW-07313 as matrix-matched certified reference material and combines quantitative recoveries (>90% for Sr, >93% for Pb, and >91% for Nd) with low procedural blank levels following the sample separation (0.07 μg L(-1) Sr, 0.03 μg L(-1) Pb, and 0.57 μg L(-1) Nd). The average δ values for Sr, Pb, and Nd of the separated reference standards were within the certified ranges (δ ((87)Sr/(86)Sr)NIST SRM 987 of -0.05(28) ‰, δ((208)Pb/(206)Pb)NIST SRM 981 of -0.21(14) ‰, and δ((143)Nd/(144)Nd)JNdi-1 of 0.00(7) ‰). The DGA Resin proved to be reusable for the separation of >10 sediment digests with no significant carry-over or memory effects, as well as no significant on-column fractionation of Sr, Pb, and Nd isotope ratios. Additional spike experiments of NIST SRM 987 with Pb, NIST SRM 981 with Sr, and JNdi-1 with Ce revealed no significant impact on the measured isotopic ratios, caused by potential small analyte peak overlaps during the separation of Sr and Pb, as well as Ce and Nd.
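
    The δ values quoted for the separated reference standards use the standard delta notation, i.e. the per-mil deviation of the measured isotope ratio from the corresponding certified value (written here for Sr against NIST SRM 987; the Pb and Nd deltas are defined analogously):

```latex
\delta\left(^{87}\mathrm{Sr}/^{86}\mathrm{Sr}\right) =
\left(
  \frac{\left(^{87}\mathrm{Sr}/^{86}\mathrm{Sr}\right)_{\text{sample}}}
       {\left(^{87}\mathrm{Sr}/^{86}\mathrm{Sr}\right)_{\text{NIST SRM 987}}}
  - 1
\right) \times 1000~\text{‰}
```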

  7. Fully automated determination of 74 pharmaceuticals in environmental and waste waters by online solid phase extraction-liquid chromatography-electrospray-tandem mass spectrometry.

    Science.gov (United States)

    López-Serna, Rebeca; Pérez, Sandra; Ginebreda, Antoni; Petrović, Mira; Barceló, Damià

    2010-12-15

    The present work describes the development of a fully automated method, based on on-line solid-phase extraction (SPE)-liquid chromatography-electrospray-tandem mass spectrometry (LC-MS-MS), for the determination of 74 pharmaceuticals in environmental waters (superficial water and groundwater) as well as sewage waters. On-line SPE is performed by passing 2.5 mL of the water sample through a HySphere Resin GP cartridge. For unequivocal identification and confirmation, two selected reaction monitoring (SRM) transitions are monitored per compound, thus four identification points are achieved. Quantification is performed by the internal standard approach, indispensable to correct for losses during the solid-phase extraction as well as for matrix effects. The main advantages of the method developed are high sensitivity (limits of detection in the low ng L(-1) range), selectivity due to the use of tandem mass spectrometry and reliability due to the use of 51 surrogates and minimum sample manipulation. As a part of the validation procedure, the method developed has been applied to the analysis of various environmental and sewage samples from a Spanish river and a sewage treatment plant. Copyright © 2010 Elsevier B.V. All rights reserved.
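
    The internal-standard quantification mentioned above boils down to normalising each analyte response to an isotope-labelled surrogate spiked at a constant level into every sample, so that SPE losses and matrix effects largely cancel. A minimal sketch with hypothetical numbers (not the paper's calibration data):

```python
def concentration_from_is(area_analyte, area_surrogate, slope, intercept=0.0):
    """Concentration from the analyte/surrogate peak-area ratio, using a
    calibration line (ratio vs. concentration) built from standards that
    went through the same on-line SPE step as the samples."""
    ratio = area_analyte / area_surrogate
    return (ratio - intercept) / slope

# Hypothetical example: calibration slope 0.0042 per (ng/L), measured areas below.
print(concentration_from_is(area_analyte=1.8e5, area_surrogate=6.0e5, slope=0.0042))
# -> about 71 ng/L
```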

  8. The Set-Up and Implementation of Fully Virtualized Lessons with an Automated Workflow Utilizing VMC/Moodle at the Medical University of Graz

    Directory of Open Access Journals (Sweden)

    Herwig Erich Rehatschek

    2011-12-01

    Full Text Available At the start of the winter semester 2010/11, the Medical University of Graz (MUG) successfully introduced a new primary learning management system (LMS), Moodle. Moodle currently serves more than 4,300 students from three degree programs and holds more than 7,500 unique learning objects. At the beginning of the summer semester 2010 we decided to start a pilot with Moodle and 430 students. For the pilot we migrated the learning content of one module and two optional subjects to Moodle. The evaluation results were extremely promising – more than 92% of the students immediately wanted Moodle – and Moodle also met our high expectations in terms of performance and scalability. Within this paper we describe how we defined and set up a scalable and highly available platform for hosting Moodle and extended it with functionality for fully automated virtual lessons. We report our experiences and give valuable pointers for universities and institutions that want to introduce Moodle in the near future.

  9. Black tea volatiles fingerprinting by comprehensive two-dimensional gas chromatography - Mass spectrometry combined with high concentration capacity sample preparation techniques: Toward a fully automated sensomic assessment.

    Science.gov (United States)

    Magagna, Federico; Cordero, Chiara; Cagliero, Cecilia; Liberto, Erica; Rubiolo, Patrizia; Sgorbini, Barbara; Bicchi, Carlo

    2017-06-15

    Tea, prepared by infusion of dried leaves of Camellia sinensis (L.) Kuntze, is the world's second most popular beverage after water. Its consumption is closely linked to its chemical composition, which determines its sensory and nutritional quality, and hence consumer preference and potential health benefits. This study aims to obtain an informative chemical signature of the volatile fraction of black tea samples from Ceylon by applying the principles of sensomics. In particular, several high concentration capacity (HCC) sample preparation techniques were tested in combination with GC×GC-MS to investigate chemical signatures of black tea volatiles. This platform, using headspace solid phase microextraction (HS-SPME) with a multicomponent fiber as the sampling technique, recovers 95% of the key odorants in a fully automated workflow. A group of 123 components, including key odorants and technological and botanical tracers, was mapped. The resulting 2D fingerprints were interpreted by pattern recognition tools (i.e. template matching fingerprinting and scripting), providing highly informative chemical signatures for quality assessment. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. Fully automated lobe-based airway taper index calculation in a low dose MDCT CF study over 4 time-points

    Science.gov (United States)

    Weinheimer, Oliver; Wielpütz, Mark O.; Konietzke, Philip; Heussel, Claus P.; Kauczor, Hans-Ulrich; Brochhausen, Christoph; Hollemann, David; Savage, Dasha; Galbán, Craig J.; Robinson, Terry E.

    2017-02-01

    Cystic Fibrosis (CF) results in severe bronchiectasis in nearly all cases. Bronchiectasis is a disease in which parts of the airways are permanently dilated. The development and progression of bronchiectasis are not evenly distributed across the lungs; rather, individual functional units are affected differently. We developed a fully automated method for the precise calculation of lobe-based airway taper indices. To calculate taper indices, some preparatory algorithms are needed. The airway tree is segmented, skeletonized and transformed to a rooted acyclic graph. This graph is used to label the airways. Then a modified version of the previously validated integral-based method (IBM) for airway geometry determination is utilized. The rooted graph, the airway lumen and wall information are then used to calculate the airway taper indices. Using a computer-generated phantom simulating 10 cross sections of airways, we present results showing a high accuracy of the modified IBM. The new taper index calculation method was applied to 144 volumetric inspiratory low-dose MDCT scans. The scans were acquired from 36 children with mild CF at 4 time-points (baseline, 3 months, 1 year, 2 years). We found a moderate correlation with the visual lobar Brody bronchiectasis scores by three raters (r2 = 0.36, p < .0001). The taper index has the potential to be a precise imaging biomarker but further improvements are needed. In combination with other imaging biomarkers, taper index calculation can be an important tool for monitoring the progression and the individual treatment of patients with bronchiectasis.
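
    The abstract does not spell out the taper index formula; one common way to express airway tapering is the slope of (log) lumen area against path length along the labelled airway centreline. The following is only a generic illustration of that idea, not the authors' validated IBM-based pipeline:

```python
import numpy as np

def taper_index(path_mm, lumen_area_mm2):
    """Generic taper estimate: slope of log(lumen area) vs. centreline distance.
    More negative values indicate stronger tapering; values near zero suggest
    dilated, non-tapering (bronchiectatic) airway segments."""
    slope, _ = np.polyfit(path_mm, np.log(lumen_area_mm2), 1)
    return slope

# Hypothetical cross sections measured every 2 mm along one labelled airway path.
rng = np.random.default_rng(0)
dist = np.arange(0.0, 40.0, 2.0)
area = 12.0 * np.exp(-0.03 * dist) + rng.normal(0.0, 0.2, dist.size)
print(f"taper index ≈ {taper_index(dist, area):.3f} per mm")
```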

  11. Antioxidant effects of carnitine supplementation on 14-3-3 protein isoforms in the aged rat hippocampus detected using fully automated two-dimensional chip gel electrophoresis.

    Science.gov (United States)

    Iwamoto, M; Miura, Y; Tsumoto, H; Tanaka, Y; Morisawa, H; Endo, T; Toda, T

    2014-12-01

    We describe here the antioxidant effects of carnitine supplementation on 14-3-3 protein isoforms in the aged rat hippocampus detected using the fully automated two-dimensional chip gel electrophoresis system (Auto2D). This system was easy and convenient to use, and it offered higher resolution and greater sensitivity than conventional two-dimensional polyacrylamide gel electrophoresis (2-D PAGE). We separated and identified five isoforms of the 14-3-3 protein (beta/alpha, gamma, epsilon, zeta/delta, and eta) using the Auto2D system. We then examined the antioxidant effects of carnitine supplementation on the protein profiles of the cytosolic fraction in the aged rat hippocampus, demonstrating that carnitine supplementation suppressed the oxidation of methionine residues in these isoforms. Since methionine residues are easily oxidized to methionine sulfoxide, this convenient, high-resolution 2-D PAGE system can be used to analyze methionine oxidation while avoiding artifactual oxidation. We show here that the Auto2D system is a very useful tool for studying antioxidant effects through proteomic analysis of protein oxidation.

  12. Left Ventricle: Fully Automated Segmentation Based on Spatiotemporal Continuity and Myocardium Information in Cine Cardiac Magnetic Resonance Imaging (LV-FAST)

    Directory of Open Access Journals (Sweden)

    Lijia Wang

    2015-01-01

    Full Text Available CMR quantification of LV chamber volumes typically requires manual definition of the basal-most LV, which adds processing time and user dependence. This study developed an LV segmentation method that is fully automated based on the spatiotemporal continuity of the LV (LV-FAST). An iteratively decreasing threshold region growing approach was used first from the midventricle to the apex, until continuity of the LV area and shape was lost, and then from midventricle to the base, until less than 50% of the myocardium circumference was observable. Region growth was constrained by LV spatiotemporal continuity to improve robustness of apical and basal segmentations. The LV-FAST method was compared with manual tracing on cardiac cine MRI data of 45 consecutive patients. Of the 45 patients, LV-FAST and manual selection identified the same apical slices at both ED and ES and the same basal slices at both ED and ES in 38, 38, 38, and 41 cases, respectively, and their measurements agreed within -1.6±8.7 mL, -1.4±7.8 mL, and 1.0±5.8% for EDV, ESV, and EF, respectively. LV-FAST allowed the LV volume-time course to be measured quantitatively within 3 seconds on a standard desktop computer; it is fast and accurate for processing cine volumetric cardiac MRI data and enables quantification of the LV filling course over the cardiac cycle.

  13. Web-Based Fully Automated Self-Help With Different Levels of Therapist Support for Individuals With Eating Disorder Symptoms: A Randomized Controlled Trial

    Science.gov (United States)

    Dingemans, Alexandra E; Spinhoven, Philip; van Ginkel, Joost R; de Rooij, Mark; van Furth, Eric F

    2016-01-01

    Background Despite the disabling nature of eating disorders (EDs), many individuals with ED symptoms do not receive appropriate mental health care. Internet-based interventions have potential to reduce the unmet needs by providing easily accessible health care services. Objective This study aimed to investigate the effectiveness of an Internet-based intervention for individuals with ED symptoms, called “Featback.” In addition, the added value of different intensities of therapist support was investigated. Methods Participants (N=354) were aged 16 years or older with self-reported ED symptoms, including symptoms of anorexia nervosa, bulimia nervosa, and binge eating disorder. Participants were recruited via the website of Featback and the website of a Dutch pro-recovery–focused e-community for young women with ED problems. Participants were randomized to: (1) Featback, consisting of psychoeducation and a fully automated self-monitoring and feedback system, (2) Featback supplemented with low-intensity (weekly) digital therapist support, (3) Featback supplemented with high-intensity (3 times a week) digital therapist support, and (4) a waiting list control condition. Internet-administered self-report questionnaires were completed at baseline, post-intervention (ie, 8 weeks after baseline), and at 3- and 6-month follow-up. The primary outcome measure was ED psychopathology. Secondary outcome measures were symptoms of depression and anxiety, perseverative thinking, and ED-related quality of life. Statistical analyses were conducted according to an intent-to-treat approach using linear mixed models. Results The 3 Featback conditions were superior to a waiting list in reducing bulimic psychopathology (d=−0.16, 95% confidence interval (CI)=−0.31 to −0.01), symptoms of depression and anxiety (d=−0.28, 95% CI=−0.45 to −0.11), and perseverative thinking (d=−0.28, 95% CI=−0.45 to −0.11). No added value of therapist support was found in terms of symptom
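
    Intent-to-treat analyses with linear mixed models of this kind can be set up in standard statistical software; a schematic sketch (hypothetical column and file names, not the trial's actual analysis script) using statsmodels:

```python
import pandas as pd
import statsmodels.formula.api as smf

# Long-format data: one row per participant per assessment point, with columns
# ed_score (ED psychopathology), condition (4 arms), time (4 waves), subject id.
df = pd.read_csv("featback_long.csv")  # placeholder file name

# Random intercept per participant; the condition-by-time interaction carries
# the intervention effect relative to the waiting list.
model = smf.mixedlm("ed_score ~ C(condition) * C(time)", data=df, groups=df["subject"])
print(model.fit().summary())
```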

  14. A Survey on Theory and Practice of Automated Demand Response

    Institute of Scientific and Technical Information of China (English)

    高赐威; 梁甜甜; 李扬

    2014-01-01

    Automated demand response (ADR) is one of the core technical functions of the smart grid, yet with respect to ADR itself and its important role in the smart grid there is a large gap between academic research and industrial practice both in China and abroad. Because it does not depend on any manual operation, ADR can greatly improve the timeliness, reliability, flexibility and cost-effectiveness of demand response (DR). ADR overturns the traditional understanding of DR and expands its main function from optimizing the allocation of electric energy to providing real-time ancillary services to the power system; it thereby truly brings DR into the scope of real-time dispatch, fully exploits the real-time adjustable potential of loads, and greatly enhances the grid's ability to accommodate intermittent energy sources and to operate securely and stably. The paper also discusses how new smart grid technologies support ADR functions and, based on an international survey, introduces the research projects, standards and pilot projects related to ADR. Finally, the obstacles to implementing ADR in China are analyzed, and the potential of ADR and the directions for its development in China are pointed out.

  15. Development and evaluation of a real-time PCR assay for detection of Pneumocystis jirovecii on the fully automated BD MAX platform.

    Science.gov (United States)

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2013-07-01

    Pneumocystis jirovecii is an opportunistic pathogen in immunocompromised and AIDS patients. Detection by quantitative PCR is faster and more sensitive than microscopic diagnosis yet requires specific infrastructure. We adapted a real-time PCR amplifying the major surface glycoprotein (MSG) target from Pneumocystis jirovecii for use on the new BD MAX platform. The assay allowed fully automated DNA extraction and multiplex real-time PCR. The BD MAX assay was evaluated against manual DNA extraction and conventional real-time PCR. The BD MAX was used in the research mode running a multiplex PCR (MSG, internal control, and sample process control). The assay had a detection limit of 10 copies of an MSG-encoding plasmid per PCR that equated to 500 copies/ml in respiratory specimens. We observed accurate quantification of MSG targets over a 7- to 8-log range. Prealiquoting and sealing of the complete PCR reagents in conical tubes allowed easy and convenient handling of the BD MAX PCR. In a retrospective analysis of 54 positive samples, the BD MAX assay showed good quantitative correlation with the reference PCR method (R(2) = 0.82). Cross-contamination was not observed. Prospectively, 278 respiratory samples were analyzed by both molecular assays. The positivity rate overall was 18.3%. The BD MAX assay identified 46 positive samples, compared to 40 by the reference PCR. The BD MAX assay required liquefaction of highly viscous samples with dithiothreitol as the only manual step, thus offering advantages for timely availability of molecular-based detection assays.

  16. Evaluation of fully automated assays for the detection of Rubella IgM and IgG antibodies by the Elecsys(®) immunoassay system.

    Science.gov (United States)

    van Helden, Josef; Grangeot-Keros, Liliane; Vauloup-Fellous, Christelle; Vleminckx, Renaud; Masset, Frédéric; Revello, Maria-Grazia

    2014-04-01

    Screening for acute rubella infection in pregnancy is an important element of antenatal care. This study compared the sensitivity, specificity and reproducibility of two new, fully automated Elecsys(®) Rubella IgM and IgG immunoassays designed for the Elecsys 2010, Modular Analytics E170, COBAS e-411 and COBAS e-601 and e602 analytical platforms, with current assays using serum from patients with primary rubella infections, vaccinated patients, patients with potentially cross-reacting infections and on routine samples in clinical laboratories in France, Germany and Italy. Both assays showed good within-run and within-laboratory precision. A sensitivity of 79.8-96.0% was demonstrated for Elecsys IgM in primary, early acute infection, consistent with existing assays. In samples obtained from routine antenatal screening, the Elecsys Rubella IgM assay revealed high specificity (98.7-99.0%). Specificity was significantly higher in samples in which rubella infection was excluded, and the incidence of false positives in patients with potentially cross-reacting infections was lower with Elecsys Rubella IgM compared with other assays. The Elecsys Rubella IgG assay exhibited a relative sensitivity of 99.9-100.0% and specificity of 97.4-100.0% in samples from routine antenatal screening. The Elecsys Rubella IgM and IgG assays allow convenient, rapid and reliable determination of anti-rubella antibodies. Sensitivity, specificity and reproducibility were comparable with existing assay systems. Assay results were available in approximately half the time required for currently employed methods and the assays are compatible with widely used analytical platforms.

  17. Fully automated determination of parabens, triclosan and methyl triclosan in wastewater by microextraction by packed sorbents and gas chromatography-mass spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez-Marino, Iria, E-mail: iria.gonzalez@usc.es [Department of Analytical Chemistry, Nutrition and Food Sciences, Institute for Food Analysis and Research-IIAA, University of Santiago de Compostela, 15782 Santiago de Compostela (Spain); Benito Quintana, Jose; Rodriguez, Isaac [Department of Analytical Chemistry, Nutrition and Food Sciences, Institute for Food Analysis and Research-IIAA, University of Santiago de Compostela, 15782 Santiago de Compostela (Spain); Schrader, Steffi; Moeder, Monika [Department of Analytical Chemistry, Helmholtz Centre for Environmental Research-UFZ, Permoserstrasse 15, D-04318 Leipzig (Germany)

    2011-01-17

    A fully automated method for the determination of triclosan (TCS), its derivative methyl triclosan (MeTCS) and six parabens (esters of 4-hydroxybenzoic acid) including branched and linear isomers of propyl (i-PrP and n-PrP) and butyl paraben (i-BuP and n-BuP) in sewage water samples is presented. The procedure includes analyte enrichment by microextraction by packed sorbent (MEPS) coupled at-line to large volume injection-gas chromatography-mass spectrometry (LVI-GC-MS). Under optimised conditions, compounds were extracted from 2 mL samples, adjusted to pH 3, using a C18 MEPS-sorbent. Adsorbed analytes were eluted directly into the Programmable Temperature Vaporizer (PTV) injector of the chromatograph with 2 × 25 µL of ethyl acetate. They were quantified using standard solutions in ultrapure water submitted to the same sample enrichment process as real sewage water samples. After signal normalisation using isotopically labelled species as internal surrogates, no differences were observed in the extraction efficiency between sewage and ultrapure water; moreover, the proposed method gave linear calibration curves from 0.1 to 10 ng mL⁻¹, relative standard deviations (%RSD) between 2 and 7.1% and limits of detection (LODs) varying from 0.001 to 0.015 ng mL⁻¹ in ultrapure water and from 0.02 to 0.59 ng mL⁻¹ in the most complex sample (raw wastewater).

  18. Predicting survival in heart failure case and control subjects by use of fully automated methods for deriving nonlinear and conventional indices of heart rate dynamics

    Science.gov (United States)

    Ho, K. K.; Moody, G. B.; Peng, C. K.; Mietus, J. E.; Larson, M. G.; Levy, D.; Goldberger, A. L.

    1997-01-01

    BACKGROUND: Despite much recent interest in quantification of heart rate variability (HRV), the prognostic value of conventional measures of HRV and of newer indices based on nonlinear dynamics is not universally accepted. METHODS AND RESULTS: We have designed algorithms for analyzing ambulatory ECG recordings and measuring HRV without human intervention, using robust methods for obtaining time-domain measures (mean and SD of heart rate), frequency-domain measures (power in the bands of 0.001 to 0.01 Hz [VLF], 0.01 to 0.15 Hz [LF], and 0.15 to 0.5 Hz [HF] and total spectral power [TP] over all three of these bands), and measures based on nonlinear dynamics (approximate entropy [ApEn], a measure of complexity, and detrended fluctuation analysis [DFA], a measure of long-term correlations). The study population consisted of chronic congestive heart failure (CHF) case patients and sex- and age-matched control subjects in the Framingham Heart Study. After exclusion of technically inadequate studies and those with atrial fibrillation, we used these algorithms to study HRV in 2-hour ambulatory ECG recordings of 69 participants (mean age, 71.7+/-8.1 years). By use of separate Cox proportional-hazards models, the conventional measure SD, among other indices, was a significant predictor of survival over a mean follow-up period of 1.9 years; other measures, including ApEn (P>.3), were not. In multivariable models, DFA was of borderline predictive significance (P=.06) after adjustment for the diagnosis of CHF and SD. CONCLUSIONS: These results demonstrate that HRV analysis of ambulatory ECG recordings based on fully automated methods can have prognostic value in a population-based study and that nonlinear HRV indices may contribute prognostic value to complement traditional HRV measures.
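
    The frequency-domain measures listed above amount to integrating the power spectrum of the heart-rate series over fixed bands. A rough sketch of that step on an evenly resampled signal (illustrative only; the study's own robust algorithms are not reproduced here):

```python
import numpy as np
from scipy.signal import welch

BANDS = {"VLF": (0.001, 0.01), "LF": (0.01, 0.15), "HF": (0.15, 0.5)}

def band_powers(hr, fs):
    """Welch spectrum of an evenly resampled heart-rate series, integrated over
    the VLF/LF/HF bands; TP is the total over all three bands."""
    freqs, psd = welch(hr, fs=fs, nperseg=min(len(hr), 4096))
    df = freqs[1] - freqs[0]
    powers = {name: psd[(freqs >= lo) & (freqs < hi)].sum() * df
              for name, (lo, hi) in BANDS.items()}
    powers["TP"] = sum(powers.values())
    return powers

# Example: 2 hours of synthetic heart rate resampled at 2 Hz.
fs = 2.0
t = np.arange(0, 7200, 1 / fs)
hr = 70 + 3 * np.sin(2 * np.pi * 0.1 * t) + np.random.default_rng(1).normal(0, 1, t.size)
print(band_powers(hr, fs))
```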

  19. Safeguarding Schiphol airports accessibility for freight transport : the design of a fully automated underground transport system with an extensive use of simulation

    OpenAIRE

    Heijden, van der, Hans; Harten, van, A.; Ebben, M.J.R.; Saanen, Y.A.; Valentin, E.C.; Verbraeck, A

    2001-01-01

    Automated, underground freight transport should enable sustainable economic growth in the Amsterdam area in the Netherlands. An innovative transport system, which guarantees reliable logistics and which avoids congestion problems, is currently being developed. This logistics system will be highly automated, using AGVs (Automatic Guided Vehicles) for transport and automated loading and unloading equipment. It is unique in its scale, covering a 15-25 km tube system, and in its complexity, using...

  20. Contaminant analysis automation demonstration proposal

    Energy Technology Data Exchange (ETDEWEB)

    Dodson, M.G.; Schur, A.; Heubach, J.G.

    1993-10-01

    The nation-wide and global need for environmental restoration and waste remediation (ER&WR) presents significant challenges to the analytical chemistry laboratory. The expansion of ER&WR programs forces an increase in the volume of samples processed and the demand for analysis data. To handle this expanding volume, productivity must be increased. However, the need for significantly increased productivity runs up against a contaminant analysis process which is costly in time, labor, equipment, and safety protection. Laboratory automation offers a cost-effective approach to meeting current and future contaminant analytical laboratory needs. The proposed demonstration will present a proof-of-concept automated laboratory conducting varied sample preparations. This automated process also highlights a graphical user interface that provides supervisory control and monitoring of the automated process. The demonstration seeks to provide affirmative answers to the following questions about laboratory automation: Can preparation of contaminants be successfully automated?; Can a full-scale working proof-of-concept automated laboratory be developed that is capable of preparing contaminant and hazardous chemical samples?; Can the automated processes be seamlessly integrated and controlled?; Can the automated laboratory be customized through readily convertible design? and Can automated sample preparation concepts be extended to the other phases of the sample analysis process? To fully reap the benefits of automation, four human factors areas should be studied and the outputs used to increase the efficiency of laboratory automation. These areas include: (1) laboratory configuration, (2) procedures, (3) receptacles and fixtures, and (4) human-computer interface for the fully automated system and complex laboratory information management systems.

  1. Fully automated ionic liquid-based headspace single drop microextraction coupled to GC-MS/MS to determine musk fragrances in environmental water samples.

    Science.gov (United States)

    Vallecillos, Laura; Pocurull, Eva; Borrull, Francesc

    2012-09-15

    A fully automated ionic liquid-based headspace single drop microextraction (IL-HS-SDME) procedure has been developed for the first time to preconcentrate trace amounts of ten musk fragrances extensively used in personal care products (six polycyclic musks, three nitro musks and one polycyclic musk degradation product) from wastewater samples prior to analysis by gas chromatography and ion trap tandem mass spectrometry (GC-IT-MS/MS). Due to the low volatility of the ILs, a large internal diameter liner (3.4 mm i.d.) was used to improve the ILs evaporation. Furthermore, a piece of glass wool was introduced into the liner to prevent the ILs from entering the GC column and a guard column was used to prevent damage to the analytical column. The main factors influencing the IL-HS-SDME were optimized. For all species, the highest enrichment factors were achieved using 1 μL of 1-octyl-3-methylimidazolium hexafluorophosphate ([OMIM][PF(6)]) ionic liquid exposed in the headspace of 10 mL water samples containing 300 g L(-1) of NaCl and stirred at 750 rpm and 60 °C for 45 min. All compounds were determined by direct injection GC-IT-MS/MS with a chromatographic time of 19 min. Method detection limits were found in the low ng mL(-1) range between 0.010 ng mL(-1) and 0.030 ng mL(-1) depending on the target analytes. Also, under optimized conditions, the method gave good levels of intra-day and inter-day repeatabilities in wastewater samples with relative standard deviations varying between 3% and 6% and between 5% and 11%, respectively (n=3, 1 ng mL(-1)). The applicability of the method was tested with different wastewater samples from influent and effluent urban wastewater treatment plants (WWTPs) and one potable treatment plant (PTP). The analysis of influent urban wastewater revealed the presence of galaxolide and tonalide at concentrations between 0.29 ng mL(-1) and 2.10 ng mL(-1), whereas in waters from the PTP only galaxolide was found at a concentration higher than the MQL.

  2. Studies on Requirements and Architecture for Automated Demand Response System

    Institute of Scientific and Technical Information of China (English)

    张晶; 孙万珺; 王婷

    2015-01-01

    The customer is the ultimate object of power system services, and demand response (DR) is closely tied to the customer. With the development of the smart grid, the traditional interaction mode can no longer meet the needs of the DR field. This paper reviews the progress of automated demand response (ADR) research at home and abroad and proposes a conceptual model, logical architecture and physical deployment for an ADR system, thereby clarifying the roles that participate in the system and their functions. It then analyzes the business, functional and non-functional requirements of an ADR system as well as the key technologies needed to build it, including use-case analysis, information modeling, communication and transmission, visualization, demand response measurement and evaluation, and conformance and security requirements, all of which are important for the construction of ADR systems. Finally, ADR-related pilot projects in China are introduced and prospects for the next steps are outlined.

  3. Safeguarding Schiphol airports accessibility for freight transport : the design of a fully automated underground transport system with an extensive use of simulation

    NARCIS (Netherlands)

    Heijden, van der M.C.; Harten, van A.; Ebben, M.J.R.; Saanen, Y.A.; Valentin, E.C.; Verbraeck, A.

    2001-01-01

    Automated, underground freight transport should enable sustainable economic growth in the Amsterdam area in the Netherlands. An innovative transport system, which guarantees reliable logistics and which avoids congestion problems, is currently being developed. This logistics system will be highly au

  4. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN)6]3-/4- Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6]3-/4- process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.
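
    For reference, the Butler-Volmer model assumed by all of the analysis approaches relates the faradaic current to the surface concentrations and the recovered parameters k0, α and E0 through the standard textbook expression (with f = F/RT):

```latex
i = F A k^{0}\left[\,c_{\mathrm{O}}(0,t)\,e^{-\alpha f (E - E^{0})}
      \;-\; c_{\mathrm{R}}(0,t)\,e^{(1-\alpha) f (E - E^{0})}\right],
\qquad f = \frac{F}{RT}
```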

  5. Market transformation lessons learned from an automated demand response test in the Summer and Fall of 2003

    Energy Technology Data Exchange (ETDEWEB)

    Shockman, Christine; Piette, Mary Ann; ten Hope, Laurie

    2004-08-01

    A recent pilot test to enable an Automatic Demand Response system in California has revealed several lessons that are important to consider for a wider application of a regional or statewide Demand Response Program. The six facilities involved in the site testing were from diverse areas of our economy. The test subjects included a major retail food marketer and one of their retail grocery stores, financial services buildings for a major bank, a postal services facility, a federal government office building, a state university site, and ancillary buildings to a pharmaceutical research company. Although these organizations are all serving diverse purposes and customers, they share some underlying common characteristics that make their simultaneous study worthwhile from a market transformation perspective. These are large organizations. Energy efficiency is neither their core business nor are the decision makers who will enable this technology powerful players in their organizations. The management of buildings is perceived to be a small issue for top management and unless something goes wrong, little attention is paid to the building manager's problems. All of these organizations contract out a major part of their technical building operating systems. Control systems and energy management systems are proprietary. Their systems do not easily interact with one another. Management is, with the exception of one site, not electronically or computer literate enough to understand the full dimensions of the technology they have purchased. Despite the research team's development of a simple, straightforward method of informing them about the features of the demand response program, they had significant difficulty enabling their systems to meet the needs of the research. The research team had to step in and work directly with their vendors and contractors at all but one location. All of the participants have volunteered to participate in the study for altruistic

  6. Toward an Integrated Framework for Automated Development and Optimization of Online Advertising Campaigns

    OpenAIRE

    Thomaidou, Stamatina; Vazirgiannis, Michalis; Liakopoulos, Kyriakos

    2012-01-01

    Creating and monitoring competitive and cost-effective pay-per-click advertisement campaigns through the web-search channel is a resource demanding task in terms of expertise and effort. Assisting or even automating the work of an advertising specialist will have an unrivaled commercial value. In this paper we propose a methodology, an architecture, and a fully functional framework for semi- and fully- automated creation, monitoring, and optimization of cost-efficient pay-per-click campaigns ...

  7. Fully automated GMP production of [(68)Ga]Ga-DO3A-VS-Cys(40)-Exendin-4 for clinical use.

    Science.gov (United States)

    Velikyan, Irina; Rosenstrom, Ulrika; Eriksson, Olof

    2017-01-01

    [(68)Ga]Ga-DO3A-VS-Cys(40)-Exendin-4/PET-CT targeting glucagon like peptide-1 receptor (GLP-1R) has previously demonstrated its potential clinical value for the detection of insulinomas. The production and accessibility of this radiopharmaceutical is one of the critical factors in realization of clinical trials and routine clinical examinations. Previously, the radiopharmaceutical was prepared manually, however larger scale of clinical trials and healthcare requires automation of the production process in order to limit the operator radiation dose as well as improve tracer manufacturing robustness and on-line documentation for enhanced good manufacturing practice (GMP) compliance. A method for (68)Ga-labelling of DO3A-VS-Cys(40)-Exendin-4 on a commercially available synthesis platform was developed. Equipment such as (68)Ge/(68)Ga generator, synthesis platform, and disposable cassettes for (68)Ga-labelling used in the study was purchased from Eckert & Ziegler. DO3A-VS-Cys(40)-Exendin-4 was synthesized in-house. The parameters such as time, temperature, precursor concentration, radical scavenger, buffer concentration, pH, product purification step were investigated and optimised. Reproducible and GMP compliant automated production of [(68)Ga]Ga-DO3A-VS-Cys(40)-Exendin-4 was developed. Exendin-4 comprising methionine amino acid residue was prone to oxidation which was strongly influenced by the elevated temperature, radioactivity amount, and precursor concentration. The suppression of the oxidative radiolysis was achieved by addition of ethanol, dihydroxybenzoic acid and ascorbic acid to the reaction buffer as well as by optimizing heating temperature. The non-decay corrected radiochemical yield was 43±2% with radiochemical purity of over 90% wherein the individual impurity signals in HPLC chromatogram did not exceed 5%. Automated production and quality control methods were established for paving the pathway for broader clinical use of [(68)Ga]Ga-DO3A-VS-Cys(40

  8. Determination of Organochlorine Pesticides in Water Samples by Fully Automated Quantitative Concentrator-Gas Chromatography

    Institute of Scientific and Technical Information of China (English)

    曹旭静

    2016-01-01

    Organochlorine pesticides in surface water were extracted with n-hexane, and the extract was concentrated to 1 mL with a fully automated quantitative evaporation concentrator at a water bath temperature of 35 °C and a vacuum of 300 mbar; a single sample requires only 25 min. After this liquid-liquid extraction and automated concentration step, the organochlorine pesticides were determined by gas chromatography. The method detection limits for the organochlorine pesticides were in the range of 0.001-0.008 μg/L and the average recoveries ranged from 78.6% to 104%. The method offers low detection limits, good accuracy and precision, saves time and labor, has a high degree of automation, and is suitable for monitoring large batches of samples.

  9. A fully automated effervescence assisted dispersive liquid–liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples

    Energy Technology Data Exchange (ETDEWEB)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Andruch, Vasil, E-mail: vasil.andruch@upjs.sk [Department of Analytical Chemistry, University of P.J. Šafárik, SK-04154 Košice (Slovakia); Moskvin, Leonid [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation); Bulatov, Andrey, E-mail: bulatov_andrey@mail.ru [Department of Analytical Chemistry, Institute of Chemistry, Saint Petersburg State University, RU-198504 Saint Petersburg (Russian Federation)

    2016-01-01

    A first attempt to automate the effervescence assisted dispersive liquid–liquid microextraction (EA-DLLME) has been reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane), the mixture of the effervescence agent (0.5 mol L⁻¹ Na₂CO₃) and the proton donor solution (1 mol L⁻¹ CH₃COOH). Formation of carbon dioxide microbubbles generated in situ leads to the dispersion of the extraction solvent in the whole aqueous sample and extraction of the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids both the addition of a dispersive solvent and the time-consuming centrifugation step for disruption of the cloudy state. The phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min⁻¹ for 2 min). The performance of the suggested approach is demonstrated by determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite-ion followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV–Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at the wavelength of 345 nm obeys Beer's law in the range of 1.5–100 µmol L⁻¹ of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L⁻¹. - Highlights: • First attempt to automate the effervescence assisted dispersive liquid–liquid microextraction. • Automation based on Stepwise injection analysis manifold in flow batch system. • Counterflow injection of extraction solvent and the effervescence agent. • Phase separation performed by gentle bubbling of nitrogen. • Application for the determination of antipyrine in saliva samples.
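
    For completeness, the quantification rests on the linearity of the absorbance at 345 nm with concentration (Beer's law), and the quoted LOD follows the usual 3σ blank criterion; in compact form (with ε the molar absorptivity of the colored derivative, ℓ the optical path length, and b the calibration slope):

```latex
A_{345} = \varepsilon\,\ell\,c
\qquad\text{(linear for } 1.5\text{--}100~\mu\mathrm{mol\,L^{-1}}\text{)},
\qquad
\mathrm{LOD} = \frac{3\,\sigma_{\mathrm{blank}}}{b}
```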

  10. Fully automated analysis of chemically induced γH2AX foci in human peripheral blood mononuclear cells by indirect immunofluorescence.

    Science.gov (United States)

    Willitzki, Annika; Lorenz, Sebastian; Hiemann, Rico; Guttek, Karina; Goihl, Alexander; Hartig, Roland; Conrad, Karsten; Feist, Eugen; Sack, Ulrich; Schierack, Peter; Heiserich, Lisa; Eberle, Caroline; Peters, Vanessa; Roggenbuck, Dirk; Reinhold, Dirk

    2013-11-01

    Analysis of phosphorylated histone protein H2AX (γH2AX) foci is currently the most sensitive method to detect DNA double-strand breaks (DSB). This protein modification has the potential to become an individual biomarker of cellular stress, especially in the diagnosis and monitoring of neoplastic diseases. To make γH2AX foci analysis available as a routine screening method, different software approaches for automated immunofluorescence pattern evaluation have recently been developed. In this study, we used novel pattern recognition algorithms on the AKLIDES® platform to automatically analyze immunofluorescence images of γH2AX foci and compared the results with visual assessments. Dose- and time-dependent γH2AX foci formation was investigated in human peripheral blood mononuclear cells (PBMCs) treated with the chemotherapeutic drug etoposide (ETP). Moreover, the AKLIDES system was used to analyze the impact of different immunomodulatory reagents on γH2AX foci formation in PBMCs. Apart from γH2AX foci counting the use of novel pattern recognition algorithms allowed the measurement of their fluorescence intensity and size, as well as the analysis of overlapping γH2AX foci. The comparison of automated and manual foci quantification showed overall a good correlation. After ETP exposure, a clear dose-dependent increase of γH2AX foci formation was evident using the AKLIDES as well as Western blot analysis. Kinetic experiments on PBMCs incubated with 5 μM ETP demonstrated a peak in γH2AX foci formation after 4 to 8 h, while a removal of ETP resulted in a strong reduction of γH2AX foci after 1 to 4 h. In summary, this study demonstrated that the AKLIDES system can be used as an efficient automatic screening tool for γH2AX foci analysis by providing new evaluation features and facilitating the identification of drugs which induce or modulate DNA damage.
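
    Automated foci scoring of this kind generally combines nucleus segmentation with spot detection inside each nucleus. Purely as a simplified illustration of the spot-detection step (hypothetical file name, generic scikit-image blob detection; not the AKLIDES pattern recognition algorithms):

```python
import numpy as np
from skimage import filters, io
from skimage.feature import blob_log

# Single-channel γH2AX immunofluorescence image (hypothetical file name).
img = io.imread("gh2ax_channel.tif").astype(float)
img /= img.max()  # normalise intensities to [0, 1]

# Laplacian-of-Gaussian blob detection; each row is (row, col, sigma) of a candidate focus.
blobs = blob_log(img, min_sigma=1, max_sigma=4, num_sigma=8, threshold=0.05)

# Crude stand-in for restricting counts to nuclear regions: keep blobs lying on
# foreground pixels according to a global Otsu threshold.
foreground = img > filters.threshold_otsu(img)
foci = [b for b in blobs if foreground[int(b[0]), int(b[1])]]

mean_sigma = float(np.mean([b[2] for b in foci])) if foci else 0.0
print(f"{len(foci)} candidate foci, mean blob sigma ≈ {mean_sigma:.2f} px")
```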

  11. Automated Critical Peak Pricing Field Tests: Program Description and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    California utilities have been exploring the use of critical peak prices (CPP) to help reduce needle peaks in customer end-use loads. CPP is a form of price-responsive demand response (DR). Recent experience has shown that customers have limited knowledge of how to operate their facilities in order to reduce their electricity costs under CPP (Quantum 2004). While the lack of knowledge about how to develop and implement DR control strategies is a barrier to participation in DR programs like CPP, another barrier is the lack of automation of DR systems. During 2003 and 2004, the PIER Demand Response Research Center (DRRC) conducted a series of tests of fully automated electric demand response (Auto-DR) at 18 facilities. Overall, the average of the site-specific average coincident demand reductions was 8% from a variety of building types and facilities. Many electricity customers have suggested that automation will help them institutionalize their electric demand savings and improve their overall response and DR repeatability. This report focuses on and discusses the specific results of the Automated Critical Peak Pricing (Auto-CPP, a specific type of Auto-DR) tests that took place during 2005, which build on the automated demand response (Auto-DR) research conducted through PIER and the DRRC in 2003 and 2004. The long-term goal of this project is to understand the technical opportunities of automating demand response and to remove technical and market impediments to large-scale implementation of automated demand response (Auto-DR) in buildings and industry. A second goal of this research is to understand and identify best practices for DR strategies and opportunities. The specific objectives of the Automated Critical Peak Pricing test were as follows: (1) Demonstrate how an automated notification system for critical peak pricing can be used in large commercial facilities for demand response (DR). (2) Evaluate effectiveness of such a system. (3) Determine how customers

  13. Cloud-based CT dose monitoring using the DICOM-structured report. Fully automated analysis in regard to national diagnostic reference levels

    Energy Technology Data Exchange (ETDEWEB)

    Boos, J.; Rubbert, C.; Heusch, P.; Lanzman, R.S.; Aissa, J.; Antoch, G.; Kroepil, P. [Univ. Duesseldorf (Germany). Dept. of Diagnostic and Interventional Radiology]; Meineke, A. [Cerner Health Services, Idstein (Germany)

    2016-03-15

    To implement automated CT dose data monitoring using the DICOM-Structured Report (DICOM-SR) in order to monitor dose-related CT data with regard to national diagnostic reference levels (DRLs). Materials and Methods: We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data were automatically analyzed according to body region, patient age and the corresponding DRL for the volumetric computed tomography dose index (CTDIvol) and dose length product (DLP). Results: Data from 36 523 examinations (131 527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3 % and 52.8 % of the national DRLs, respectively. CTDIvol and DLP reached 43.8 % and 43.1 % for abdominal CT (n = 10 590), 66.6 % and 69.6 % for cranial CT (n = 16 098) and 37.8 % and 44.0 % for chest CT (n = 10 387) of the corresponding national DRLs, respectively. Overall, the CTDIvol exceeded national DRLs in 1.9 % of the examinations, while the DLP exceeded national DRLs in 2.9 % of the examinations. Between different CT protocols of the same body region, radiation exposure varied by up to 50 % of the DRLs. Conclusion: The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking against national DRLs. Overall, the local dose exposure from CT reached approximately 50 % of these DRLs, indicating that updated as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments.
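    The benchmarking step described above reduces to looking up the DRL for the examined body region and expressing CTDIvol and DLP as fractions of it; the sketch below illustrates that logic only, with placeholder DRL values that are not the actual national reference levels and no connection to the DICOM-SR software itself.

      # Illustrative dose-vs-DRL check; the DRL numbers below are placeholders,
      # not actual national diagnostic reference levels.
      DRLS = {
          # body_region: (CTDIvol DRL in mGy, DLP DRL in mGy*cm) -- hypothetical values
          "abdomen": (20.0, 900.0),
          "head":    (60.0, 950.0),
          "chest":   (12.0, 400.0),
      }

      def check_exam(body_region: str, ctdi_vol: float, dlp: float) -> dict:
          """Return dose values as fractions of the DRL and whether they exceed it."""
          drl_ctdi, drl_dlp = DRLS[body_region]
          return {
              "ctdi_fraction_of_drl": ctdi_vol / drl_ctdi,
              "dlp_fraction_of_drl": dlp / drl_dlp,
              "exceeds_drl": ctdi_vol > drl_ctdi or dlp > drl_dlp,
          }

      print(check_exam("chest", ctdi_vol=5.1, dlp=180.0))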

  14. Preliminary evaluation of a fully automated quantitative framework for characterizing general breast tissue histology via color histogram and color texture analysis

    Science.gov (United States)

    Keller, Brad M.; Gastounioti, Aimilia; Batiste, Rebecca C.; Kontos, Despina; Feldman, Michael D.

    2016-03-01

    Visual characterization of histologic specimens is known to suffer from intra- and inter-observer variability. To help address this, we developed an automated framework for characterizing digitized histology specimens based on a novel application of color histogram and color texture analysis. We perform a preliminary evaluation of this framework using a set of 73 trichrome-stained, digitized slides of normal breast tissue which were visually assessed by an expert pathologist in terms of the percentage of collagenous stroma, stromal collagen density, duct-lobular unit density and the presence of elastosis. For each slide, our algorithm automatically segments the tissue region based on the lightness channel in CIELAB colorspace. Within each tissue region, a color histogram feature vector is extracted using a common color palette for trichrome images generated with a previously described method. Then, using a whole-slide, lattice-based methodology, color texture maps are generated using a set of color co-occurrence matrix statistics: contrast, correlation, energy and homogeneity. The extracted feature sets are compared to the visually assessed tissue characteristics. Overall, the extracted texture features show high correlations with the visually assessed characteristics, including the percentage of collagenous stroma (r=0.95, p<0.001), suggesting that color histogram and color texture analysis can quantitatively characterize histological processes in digitized histology specimens.
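    The co-occurrence statistics listed above (contrast, correlation, energy, homogeneity) can be computed for a single image tile with scikit-image as sketched below; the tile is synthetic, and this is the standard gray-level variant, not the color co-occurrence analysis used in the study.

      # Gray-level co-occurrence texture statistics for one image tile (illustrative;
      # the study used a color co-occurrence variant on trichrome-stained slides).
      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      rng = np.random.default_rng(0)
      tile = rng.integers(0, 64, size=(128, 128), dtype=np.uint8)   # synthetic 6-bit tile

      # Co-occurrence matrix over four directions at a 1-pixel offset.
      glcm = graycomatrix(tile, distances=[1],
                          angles=[0, np.pi/4, np.pi/2, 3*np.pi/4],
                          levels=64, symmetric=True, normed=True)

      for prop in ("contrast", "correlation", "energy", "homogeneity"):
          # Average each property over the four directions.
          print(prop, graycoprops(glcm, prop).mean())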

  15. A fully automated effervescence assisted dispersive liquid-liquid microextraction based on a stepwise injection system. Determination of antipyrine in saliva samples.

    Science.gov (United States)

    Medinskaia, Kseniia; Vakh, Christina; Aseeva, Darina; Andruch, Vasil; Moskvin, Leonid; Bulatov, Andrey

    2016-01-01

    A first attempt to automate effervescence-assisted dispersive liquid-liquid microextraction (EA-DLLME) is reported. The method is based on the aspiration of a sample and all required aqueous reagents into the stepwise injection analysis (SWIA) manifold, followed by simultaneous counterflow injection of the extraction solvent (dichloromethane) and a mixture of the effervescence agent (0.5 mol L(-1) Na2CO3) and the proton donor solution (1 mol L(-1) CH3COOH). Carbon dioxide microbubbles generated in situ disperse the extraction solvent throughout the aqueous sample and extract the analyte into the organic phase. Unlike conventional DLLME, EA-DLLME avoids the addition of a dispersive solvent as well as the time-consuming centrifugation step needed to disrupt the cloudy state. Phase separation was achieved by gentle bubbling of a nitrogen stream (2 mL min(-1) for 2 min). The performance of the suggested approach is demonstrated by the determination of antipyrine in saliva samples. The procedure is based on the derivatization of antipyrine by nitrite ion, followed by EA-DLLME of 4-nitrosoantipyrine and subsequent UV-Vis detection using the SWIA manifold. The absorbance of the yellow-colored extract at a wavelength of 345 nm obeys Beer's law in the range of 1.5-100 µmol L(-1) of antipyrine in saliva. The LOD, calculated from a blank test based on 3σ, was 0.5 µmol L(-1).
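    The calibration and detection-limit arithmetic implied above (a Beer's law line over the working range and an LOD from 3σ of the blank divided by the slope) can be sketched as follows; all absorbance values are invented for illustration and do not reproduce the reported figures.

      # Illustrative linear calibration and 3-sigma LOD estimate (invented data).
      import numpy as np

      conc = np.array([1.5, 5, 10, 25, 50, 100])          # standards, umol/L
      absorbance = np.array([0.012, 0.041, 0.080, 0.201, 0.405, 0.809])
      blank_absorbance = np.array([0.0009, 0.0012, 0.0010, 0.0011, 0.0013,
                                   0.0008, 0.0012, 0.0010, 0.0011, 0.0009])

      slope, intercept = np.polyfit(conc, absorbance, 1)   # Beer's law: A = slope*C + intercept
      r = np.corrcoef(conc, absorbance)[0, 1]

      lod = 3 * blank_absorbance.std(ddof=1) / slope       # 3-sigma criterion

      print(f"slope={slope:.4f} AU per umol/L, r={r:.4f}, LOD={lod:.2f} umol/L")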

  16. Fully-automated approach to hippocampus segmentation using a graph-cuts algorithm combined with atlas-based segmentation and morphological opening.

    Science.gov (United States)

    Kwak, Kichang; Yoon, Uicheul; Lee, Dong-Kyun; Kim, Geon Ha; Seo, Sang Won; Na, Duk L; Shim, Hack-Joon; Lee, Jong-Min

    2013-09-01

    The hippocampus has been known to be an important structure as a biomarker for Alzheimer's disease (AD) and other neurological and psychiatric diseases. However, its use requires accurate, robust and reproducible delineation of hippocampal structures. In this study, an automated hippocampal segmentation method based on a graph-cuts algorithm combined with atlas-based segmentation and morphological opening was proposed. First, atlas-based segmentation was applied to define the initial hippocampal region as a priori information for graph-cuts. The definition of initial seeds was further elaborated by incorporating an estimation of partial volume probabilities at each voxel. Finally, morphological opening was applied to reduce false positives in the graph-cuts result. In experiments with twenty-seven healthy normal subjects, the proposed method showed more reliable results (similarity index=0.81±0.03) than the conventional atlas-based segmentation method (0.72±0.04). In terms of segmentation accuracy, measured through the ratios of false positives and false negatives, the proposed method (precision=0.76±0.04, recall=0.86±0.05) also outperformed the conventional method (0.73±0.05, 0.72±0.06), demonstrating its plausibility for accurate, robust and reliable segmentation of the hippocampus.
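    The evaluation metrics quoted above (similarity index, precision and recall between automatic and reference segmentations) follow from simple overlap counts on two binary masks, as in the sketch below; the masks here are random stand-ins rather than hippocampal labels.

      # Dice similarity index, precision and recall for two binary segmentations.
      import numpy as np

      def segmentation_scores(auto_mask: np.ndarray, manual_mask: np.ndarray) -> dict:
          auto_mask = auto_mask.astype(bool)
          manual_mask = manual_mask.astype(bool)
          tp = np.logical_and(auto_mask, manual_mask).sum()
          fp = np.logical_and(auto_mask, ~manual_mask).sum()
          fn = np.logical_and(~auto_mask, manual_mask).sum()
          return {
              "dice": 2 * tp / (2 * tp + fp + fn),
              "precision": tp / (tp + fp),
              "recall": tp / (tp + fn),
          }

      # Toy example with random volumes standing in for segmentation masks.
      rng = np.random.default_rng(0)
      a = rng.random((64, 64, 64)) > 0.7
      m = rng.random((64, 64, 64)) > 0.7
      print(segmentation_scores(a, m))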

  17. Validation of a novel, fully automated high throughput high-performance liquid chromatographic/tandem mass Spectrometric method for quantification of pantoprazole in human plasma.

    Science.gov (United States)

    Dotsikas, Yannis; Apostolou, Constantinos; Soumelas, Stefanos; Kolocouri, Filomila; Ziaka, Afroditi; Kousoulos, Constantinos; Loukas, Yannis L

    2010-01-01

    An automated high-throughput HPLC/MS/MS method was developed for the quantitative determination of pantoprazole in human plasma. Only 100 microL plasma was placed in 2.2 mL 96 deep-well plates, and both pantoprazole and omeprazole (IS) were extracted from human plasma by liquid-liquid extraction, using diethyl ether-dichloromethane (70:30, v/v) as the organic solvent. Robotic liquid-handling workstations were used for all liquid transfer and solution preparation steps and resulted in a short sample preparation time. After vortexing, centrifugation, and freezing, the supernatant organic solvent was evaporated and reconstituted in a small volume of reconstitution solution. Sample analysis was performed by utilizing the combination of RP-HPLC/MS/MS, with positive-ion electrospray ionization and multiple reaction monitoring detection. The chromatographic run time was set at 1.8 min with a flow rate of 0.6 mL/min on a Nucleosil octylsilyl (C8) analytical column. The method was proven to be sensitive, specific, accurate, and precise for the determination of pantoprazole in human plasma. The method was applied to a bioequivalence study after per os administration of a 40 mg pantoprazole gastric retentive tablet.

  18. Platotex: an innovative and fully automated device for cell growth scale-up of agar-supported solid-state fermentation.

    Science.gov (United States)

    Adelin, Emilie; Slimani, Noureddine; Cortial, Sylvie; Schmitz-Alfonso, Isabelle; Ouazzani, Jamal

    2011-02-01

    Among various factors that influence the production of microbial secondary metabolites (MSM), the method of cultivation is an important one that has not been thoroughly investigated. In order to increase microbial throughput and simplify the extraction and workup steps, we performed a study to compare liquid-state fermentation (LSF) with agar-supported solid-state fermentation (AgSF). We found that AgSF is not only more suitable for our applications but offers, for some microbial strains, a higher yield and broader diversity of secondary metabolites. The main limitation of AgSF is the lack of a system to allow production scale-up. In order to overcome this obstacle we developed Platotex, an original fermentation unit offering 2 m(2) of cultivation surface that combines automatic sterilization, cultivation, and drying steps. Platotex is also able to support both LSF and solid-state fermentation (SSF). Platotex conforms to international security and quality requirements and benefits from total remote automation through industrial communication and control standards.

  19. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry

    DEFF Research Database (Denmark)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck

    2013-01-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and short ultra-high-performance liquid chromatography–tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids......-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C18 column using a 6.5 min 0.1 % ammonia (25...... %) in water/0.1 % ammonia (25 %) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity...

  20. A LabVIEW(®)-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    Science.gov (United States)

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    LabVIEW(®)-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD, which performs radiochemical analysis, is described. The analytical platform interfaces with an Arduino(®)-based device that triggers multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW(®) VI software using USB and RS232 interfaces, both for sending commands and for receiving confirmation or error responses. The AUTORAD platform has been successfully applied to the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
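    The command/acknowledgement exchange over USB and RS232 described above can be illustrated with a short pySerial analogue in Python (the actual control software is LabVIEW(®)-based); the port name, baud rate and command strings are assumptions.

      # Illustrative serial command/acknowledge exchange (pySerial), standing in
      # for the LabVIEW(R)-driven device control; port and commands are hypothetical.
      import serial

      def send_command(port: serial.Serial, command: str) -> str:
          """Send a text command and return the device's one-line reply."""
          port.reset_input_buffer()
          port.write((command + "\r\n").encode("ascii"))
          reply = port.readline().decode("ascii", errors="replace").strip()
          if not reply:
              raise TimeoutError(f"no response to {command!r}")
          if reply.startswith("ERR"):
              raise RuntimeError(f"device error for {command!r}: {reply}")
          return reply

      if __name__ == "__main__":
          # Hypothetical Arduino-style trigger box on a USB serial port.
          with serial.Serial("/dev/ttyUSB0", 9600, timeout=2) as dev:
              print(send_command(dev, "TRIGGER DET1"))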

  1. Toward fully automated high performance computing drug discovery: a massively parallel virtual screening pipeline for docking and molecular mechanics/generalized Born surface area rescoring to improve enrichment.

    Science.gov (United States)

    Zhang, Xiaohua; Wong, Sergio E; Lightstone, Felice C

    2014-01-27

    In this work we announce and evaluate a high throughput virtual screening pipeline for in silico screening of virtual compound databases using high performance computing (HPC). Notable features of this pipeline include an automated receptor preparation scheme with unsupervised binding site identification. The pipeline includes receptor/target preparation, ligand preparation, VinaLC docking calculation, and molecular mechanics/generalized Born surface area (MM/GBSA) rescoring using the GB model by Onufriev and co-workers [J. Chem. Theory Comput. 2007, 3, 156-169]. Furthermore, we leverage HPC resources to perform an unprecedented, comprehensive evaluation of MM/GBSA rescoring when applied to the DUD-E data set (Directory of Useful Decoys: Enhanced), in which we selected 38 protein targets and a total of ∼0.7 million actives and decoys. The computer wall time for virtual screening has been reduced drastically on HPC machines, which increases the feasibility of extremely large ligand database screening with more accurate methods. HPC resources allowed us to rescore 20 poses per compound and evaluate the optimal number of poses to rescore. We find that keeping 5-10 poses is a good compromise between accuracy and computational expense. Overall, the results demonstrate that MM/GBSA rescoring has higher average receiver operating characteristic (ROC) area under curve (AUC) values and consistently better early recovery of actives than Vina docking alone. Specifically, the enrichment performance is target-dependent. MM/GBSA rescoring significantly outperforms Vina docking for the folate enzymes, kinases, and several other enzymes. The more accurate energy function and solvation terms of the MM/GBSA method allow MM/GBSA to achieve better enrichment, but the rescoring is still limited by the ability of the docking method to generate poses with the correct binding modes.
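    The enrichment evaluation mentioned above (ROC AUC and early recovery of actives from ranked scores) can be reproduced in miniature as below; the scores and labels are invented, and this is a generic metric sketch rather than part of the authors' pipeline.

      # Illustrative virtual-screening enrichment metrics on invented scores.
      import numpy as np
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      # 1 = active, 0 = decoy; scores are binding energies (more negative = better).
      labels = np.concatenate([np.ones(50), np.zeros(950)]).astype(int)
      scores = np.concatenate([rng.normal(-9.0, 1.5, 50), rng.normal(-7.0, 1.5, 950)])

      # ROC AUC: negate scores so that "higher = more likely active".
      auc = roc_auc_score(labels, -scores)

      # Early enrichment factor at the top 1% of the ranked list.
      top = np.argsort(scores)[: max(1, len(scores) // 100)]
      ef1 = labels[top].mean() / labels.mean()

      print(f"ROC AUC = {auc:.3f}, EF(1%) = {ef1:.1f}")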

  2. Fully automated determination of nicotine and its major metabolites in whole blood by means of a DBS online-SPE LC-HR-MS/MS approach for sports drug testing.

    Science.gov (United States)

    Tretzel, Laura; Thomas, Andreas; Piper, Thomas; Hedeland, Mikael; Geyer, Hans; Schänzer, Wilhelm; Thevis, Mario

    2016-05-10

    Dried blood spots (DBS) represent a sample matrix collected under minimal-invasive, straightforward and robust conditions. DBS specimens have been shown to provide appropriate test material for different analytical disciplines, e.g., preclinical drug development, therapeutic drug monitoring, forensic toxicology and diagnostic analysis of metabolic disorders in newborns. However, the sample preparation has occasionally been reported as laborious and time consuming. In order to minimize the manual workload and to substantiate the suitability of DBS for high sample throughput, the automation of sample preparation processes is of paramount interest. In the current study, the development and validation of a fully automated DBS extraction method coupled to online solid-phase extraction is presented, using the example of nicotine, its major metabolites nornicotine, cotinine and trans-3'-hydroxycotinine and the tobacco alkaloids anabasine and anatabine, based on the rationale that the use of nicotine-containing products for performance-enhancing purposes has been monitored by the World Anti-Doping Agency (WADA) for several years. Automation-derived DBS sample extracts were directed online to liquid chromatography high resolution/high mass accuracy tandem mass spectrometry, and target analytes were determined with the support of four deuterated internal standards. Validation of the method yielded precise results and good linearity (correlation coefficients > 0.998). The limit of detection was established at 5 ng mL(-1) for all studied compounds, the extraction recovery ranged from 25 to 44%, and no matrix effects were observed. To exemplify the applicability of the DBS online-SPE LC-MS/MS approach for sports drug testing purposes, the method was applied to authentic DBS samples obtained from smokers, snus users, and e-cigarette users. Statistical evaluation of the obtained results indicated differences in metabolic behavior depending on the route of administration (inhalative versus buccal absorption) in terms of the

  3. Detection of BRAF Mutations Using a Fully Automated Platform and Comparison with High Resolution Melting, Real-Time Allele Specific Amplification, Immunohistochemistry and Next Generation Sequencing Assays, for Patients with Metastatic Melanoma.

    Directory of Open Access Journals (Sweden)

    Alexandre Harlé

    Full Text Available Metastatic melanoma is a severe disease with one of the highest mortality rates among skin diseases. Overall survival has significantly improved with immunotherapy and targeted therapies. Kinase inhibitors targeting BRAF V600 showed promising results. BRAF genotyping is mandatory for the prescription of anti-BRAF therapies. Fifty-nine formalin-fixed paraffin-embedded melanoma samples were assessed using High-Resolution Melting (HRM) PCR, real-time allele-specific amplification (RT-ASA) PCR, next generation sequencing (NGS), immunohistochemistry (IHC) and the fully-automated molecular diagnostics platform IdyllaTM. Sensitivity, specificity, positive predictive value and negative predictive value were calculated using NGS as the reference standard to compare the different assays. BRAF mutations were found in 28 (47.5%), 29 (49.2%), 31 (52.5%), 29 (49.2%) and 27 (45.8%) samples with HRM, RT-ASA, NGS, IdyllaTM and IHC, respectively. Twenty-six (81.2%) samples were found bearing a c.1799T>A (p.Val600Glu) mutation, three (9.4%) a c.1798_1799delinsAA (p.Val600Lys) mutation and one a c.1789_1790delinsTC (p.Leu597Ser) mutation. Two samples were found bearing complex mutations. HRM appears to be the least sensitive assay for the detection of BRAF V600 mutations. The RT-ASA, IdyllaTM and IHC assays are suitable for routine molecular diagnostics aiming at the prescription of anti-BRAF therapies. The IdyllaTM assay is fully automated, requires less than 2 minutes for sample preparation and is the fastest of the tested assays.
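    The comparison statistics referred to above (sensitivity, specificity, PPV and NPV of an assay against NGS as the reference standard) come straight from a 2x2 contingency table; the sketch below shows the calculation on an invented set of calls.

      # Diagnostic performance of an assay against a reference standard (illustrative).
      def diagnostic_performance(assay_calls, reference_calls) -> dict:
          """Both inputs are sequences of booleans: True = mutation detected."""
          tp = sum(a and r for a, r in zip(assay_calls, reference_calls))
          fp = sum(a and not r for a, r in zip(assay_calls, reference_calls))
          fn = sum((not a) and r for a, r in zip(assay_calls, reference_calls))
          tn = sum((not a) and (not r) for a, r in zip(assay_calls, reference_calls))
          return {
              "sensitivity": tp / (tp + fn),
              "specificity": tn / (tn + fp),
              "ppv": tp / (tp + fp),
              "npv": tn / (tn + fn),
          }

      # Invented example: 10 samples, reference (e.g. NGS) vs. a test assay.
      reference = [True, True, True, True, False, False, False, False, False, False]
      assay     = [True, True, True, False, False, False, False, False, True, False]
      print(diagnostic_performance(assay, reference))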

  4. Clinical value of fully automated p16/Ki-67 dual staining in the triage of HPV-positive women in the Norwegian Cervical Cancer Screening Program.

    Science.gov (United States)

    Ovestad, Irene T; Dalen, Ingvild; Hansen, Elisabeth; Loge, Janne L D; Dybdahl, Britt Mona; Dirdal, Marius B; Moltu, Pia; Berland, Jannicke M

    2017-04-01

    More accurate biomarkers in cervical cytology screening could reduce the number of women unnecessarily referred for biopsy. This study investigated the ability of p16/Ki-67 dual staining to predict high-grade cervical intraepithelial neoplasia (CIN) in human papillomavirus (HPV)-positive women from the Norwegian Cervical Cancer Screening Program. Automated p16/Ki-67 dual staining was performed on liquid-based cytology samples from 266 women who were HPV-positive at their secondary screening. At a mean of 184 days after p16/Ki-67 staining, 201 women had a valid staining result and a conclusive follow-up diagnosis (histological diagnosis or HPV-negative diagnosis with normal cytology findings). The sensitivity and specificity for predicting the follow-up diagnosis were compared for cytology, p16/Ki-67 dual staining, and their combination. Sixty-seven percent of the study sample was p16/Ki-67-positive. The sensitivity of p16/Ki-67 staining for predicting CIN-2/3 was statistically significantly higher than the sensitivity of cytology (0.88 vs 0.79; P = .008), but this was not true for the prediction of CIN-3 (0.94 vs 0.88; P = .23). The specificity of cytology for predicting CIN-3 was significantly higher than the specificity of p16/Ki-67 staining (0.35 vs 0.28; P = .002), but this was not true for CIN-2/3 (0.35 vs 0.31; P = .063). For predicting CIN-2/3 and CIN-3, combination testing gave potentially better sensitivity (0.95 and 0.96, respectively) and better specificity (0.49 and 0.50, respectively). In a population of HPV-positive women, p16/Ki-67 dual staining was more sensitive but less specific than cytology for predicting high-grade CIN. The advantage of using both tests in different combinations is the potential for increasing the specificity or sensitivity in comparison with both methods performed individually. Cancer Cytopathol 2017;125:283-291. © 2016 American Cancer Society.

  5. Automation or De-automation

    Science.gov (United States)

    Gorlach, Igor; Wessel, Oliver

    2008-09-01

    In the global automotive industry, for decades, vehicle manufacturers have continually increased the level of automation of production systems in order to be competitive. However, there is a new trend to decrease the level of automation, especially in final car assembly, for reasons of economy and flexibility. In this research, the final car assembly lines at three production sites of Volkswagen are analysed in order to determine the best level of automation for each, in terms of manufacturing costs, productivity, quality and flexibility. The case study is based on the methodology proposed by the Fraunhofer Institute. The results of the analysis indicate that fully automated assembly systems are not necessarily the best option in terms of cost, productivity and quality combined, which is attributed to high complexity of final car assembly systems; some de-automation is therefore recommended. On the other hand, the analysis shows that low automation can result in poor product quality due to reasons related to plant location, such as inadequate workers' skills, motivation, etc. Hence, the automation strategy should be formulated on the basis of analysis of all relevant aspects of the manufacturing process, such as costs, quality, productivity and flexibility in relation to the local context. A more balanced combination of automated and manual assembly operations provides better utilisation of equipment, reduces production costs and improves throughput.

  6. Fully automated trace level determination of parent and alkylated PAHs in environmental waters by online SPE-LC-APPI-MS/MS.

    Science.gov (United States)

    Ramirez, Cesar E; Wang, Chengtao; Gardinali, Piero R

    2014-01-01

    Polycyclic aromatic hydrocarbons (PAHs) are ubiquitous compounds that enter the environment from natural and anthropogenic sources, often used as markers to determine the extent, fate, and potential effects on natural resources after an accidental crude oil release. Gas chromatography-mass spectrometry (GC-MS) after liquid-liquid extraction (LLE+GC-MS) has been extensively used to isolate and quantify both parent and alkylated PAHs. However, it requires labor-intensive extraction and cleanup steps and generates large amounts of toxic solvent waste. Therefore, there is a clear need for greener, faster techniques with enough reproducibility and sensitivity to quantify many PAHs in large numbers of water samples in a short period of time. This study combines online solid-phase extraction followed by liquid chromatography (LC) separation with dopant-assisted atmospheric pressure photoionization (APPI) and tandem MS detection, to provide a one-step protocol that detects PAHs at low nanogram-per-liter levels with almost no sample preparation and with a significantly lower consumption of toxic halogenated solvents. Water samples were amended with methanol, fortified with isotopically labeled PAHs, and loaded onto an online SPE column, using a large-volume sample loop with an auxiliary LC pump for sample preconcentration and salt removal. The loaded SPE column was connected to a UPLC pump and analytes were backflushed to a Thermo Hypersil Green PAH analytical column where a 20-min gradient separation was performed at a variable flow rate. Detection was performed by a triple-quadrupole MS equipped with a gas-phase dopant delivery system, using 1.50 mL of chlorobenzene dopant per run. In contrast, LLE+GC-MS typically uses 150 mL of organic solvents per sample, and methylene chloride is preferred because of its low boiling point. However, this solvent has a higher environmental persistence than chlorobenzene and is considered a carcinogen. The automated system is capable of

  7. ON-DEMAND SERIAL DILUTION USING QUANTIZED NANO/PICOLITER-SCALE DROPLETS

    Energy Technology Data Exchange (ETDEWEB)

    Jambovane, Sachin R.; Prost, Spencer A.; Sheen, Allison M.; Magnuson, Jon K.; Kelly, Ryan T.

    2014-10-29

    This paper describes a fully automated droplet-based microfluidic device for on-demand serial dilution that is capable of achieving a dilution ratio of >6000 (concentrations ranging from 1 mM to 160 nM) over 35 nanoliter-scale droplets. This serial diluter can be applied to high-throughput and label-free kinetic assays by integration with our previously developed on-demand droplet-based microfluidic platform with mass spectrometry detection.
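    The quoted dilution range (1 mM down to 160 nM, i.e. a ratio of 6250, across 35 droplets) corresponds to a fixed per-droplet dilution factor if the series is geometric; the sketch below only works out that arithmetic and says nothing about the device's actual metering scheme.

      # Geometric serial-dilution arithmetic for the quoted range (illustrative only).
      c_start = 1e-3      # 1 mM starting concentration (mol/L)
      c_end = 160e-9      # 160 nM final concentration (mol/L)
      n_droplets = 35

      overall_ratio = c_start / c_end                     # ~6250-fold
      step_factor = overall_ratio ** (1 / (n_droplets - 1))

      concentrations = [c_start / step_factor**i for i in range(n_droplets)]

      print(f"overall dilution ratio: {overall_ratio:.0f}")
      print(f"per-droplet dilution factor: {step_factor:.3f}")
      print(f"last droplet: {concentrations[-1] * 1e9:.0f} nM")   # should be ~160 nM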

  8. Application of fully automated blood grouping analyzer in the blood donor testing

    Institute of Scientific and Technical Information of China (English)

    周国平; 周结; 向东; 谢云峥; 杨军; 郑岚; 曹斌; 吴蓉晖

    2011-01-01

    Objective To evaluate the performance of a fully automated blood grouping analyzer for ABO and RhD screening and the detection of red blood cell IgM unexpected antibodies. Methods A total of 25 554 samples were collected from blood donors. ABO, RhD, and IgM unexpected antibodies were tested simultaneously by the fully automated blood grouping analyzer and by a manual colorimetric method with a semi-automated sampler. Samples with discrepancies between forward and reverse ABO grouping, agglutination of O cells, or RhD-negative results were referred to the reference laboratory of Shanghai Blood Center for further identification. Results The accuracy rates of ABO grouping by the fully automated blood grouping analyzer and the manual colorimetric method with semi-automated sampler were 99.93% (25 535/25 554) and 99.95% (25 542/25 554), respectively (P > 0.05); the rates of agglutination of O cells were 0.18% (46/25 554) and 0.10% (26/25 554), respectively (P < 0.05); ABO forward and reverse grouping discrepancies were 17 (0.06%) and 10 (0.04%), respectively. The reference laboratory confirmed that 5 subgroups were discovered by both methods, 2 subgroups were missed by each method (0.01%), and the rest were normal ABO blood group specimens (10/17 vs 3/10, P > 0.05). Conclusion The fully automated blood grouping analyzer can perform blood donor testing with high accuracy and a high degree of standardization in operation, and makes identification of IgM irregular antibodies easier.

  9. Manufacturing and automation

    Directory of Open Access Journals (Sweden)

    Ernesto Córdoba Nieto

    2010-04-01

    Full Text Available The article presents concepts and definitions from different sources concerning automation. The work approaches automation by virtue of the author's experience in manufacturing production; why and how automation projects are embarked upon is considered. Technological reflection regarding the progressive advances or stages of automation in the production area is stressed. Coriat and Freyssenet's thoughts about and approaches to the problem of automation and its current state are taken and examined, especially those referring to reconciling the level of automation with the flexibility and productivity demanded by competitive, worldwide manufacturing.

  10. Fully automated analysis of four tobacco-specific N-nitrosamines in mainstream cigarette smoke using two-dimensional online solid phase extraction combined with liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Zhang, Jie; Bai, Ruoshi; Yi, Xiaoli; Yang, Zhendong; Liu, Xingyu; Zhou, Jun; Liang, Wei

    2016-01-01

    A fully automated method for the detection of four tobacco-specific nitrosamines (TSNAs) in mainstream cigarette smoke (MSS) has been developed. The newly developed method is based on two-dimensional online solid-phase extraction-liquid chromatography-tandem mass spectrometry (SPE/LC-MS/MS). The two-dimensional SPE was performed utilizing two cartridges with different extraction mechanisms to clean up disturbances of different polarity and minimize sample matrix effects on each analyte. Chromatographic separation was achieved using a UPLC C18 reversed phase analytical column. Under the optimum online SPE/LC-MS/MS conditions, N'-nitrosonornicotine (NNN), N'-nitrosoanatabine (NAT), N'-nitrosoanabasine (NAB), and 4-(methylnitrosamino)-1-(3-pyridyl)-1-butanone (NNK) were baseline separated with good peak shapes. This method appears to be the most sensitive yet reported for the determination of TSNAs in mainstream cigarette smoke. The limits of quantification for NNN, NNK, NAT and NAB reached the levels of 6.0, 1.0, 3.0 and 0.6 pg/cig, respectively, which were well below the lowest levels of TSNAs in MSS of current commercial cigarettes. The accuracy of the measurement of the four TSNAs ranged from 92.8 to 107.3%. The relative standard deviations of intra- and inter-day analysis were less than 5.4% and 7.5%, respectively. The main advantages of the developed method are fairly high sensitivity, selectivity and accuracy of results, minimal sample pre-treatment, full automation, and high throughput. As part of the validation procedure, the developed method was applied to evaluate TSNA yields for 27 top-selling commercial cigarettes in China. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. Fully automated system for pulsed NMR measurements

    Energy Technology Data Exchange (ETDEWEB)

    Cantor, David Milton

    1977-01-01

    A system is described which places many of the complex, tedious operations for pulsed NMR experiments under computer control. It automatically optimizes the experiment parameters of pulse length and phase, and precision, accuracy, and measurement speed are improved. The hardware interface between the computer and the NMR instrument is described. Design features, justification of the choices made between alternative design strategies, and details of the implementation of design goals are presented. Software features common to all the available experiments are discussed. Optimization of pulse lengths and phases is performed via a sequential search technique called Uniplex. Measurements of the spin-lattice and spin-spin relaxation times and of diffusion constants are automatic. Options for expansion of the system are explored along with some of the limitations of the system.

  12. Fully Automated Anesthesia, Analgesia and Fluid Management

    Science.gov (United States)

    2016-09-05

    General Anesthetic Drug Overdose; Adverse Effect of Intravenous Anesthetics, Sequela; Complication of Anesthesia; Drug Delivery System Malfunction; Hemodynamic Instability; Underdosing of Other General Anesthetics

  13. Marketing automation

    OpenAIRE

    Raluca Dania TODOR

    2017-01-01

    The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to the...

  14. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  15. Automated paleomagnetic and rock magnetic data acquisition with an in-line horizontal “2G” system

    NARCIS (Netherlands)

    Mullender, T.A.T.; Frederichs, T.; Hilgenfeldt, C.; de Groot, L.V.; Fabian, K.; Dekkers, M.J.

    Today's paleomagnetic and magnetic proxy studies involve processing of large sample collections while simultaneously demanding high quality data and high reproducibility. Here we describe a fully automated interface based on a commercial horizontal pass-through “2G” DC-SQUID magnetometer. This

  16. Methodological Evaluation of Fully Automated Electrochemiluminescence Immunoassay for Determination of Serum Thyrotropin Receptor Antibody

    Institute of Scientific and Technical Information of China (English)

    王亚萍; 肖锦华; 朱华燕; 潘宇红; 黄璇

    2011-01-01

    Objective To evaluate an automatic immunological analyzer for the determination of serum thyrotropin receptor antibody (TRAb) by the electrochemiluminescence immunoassay (ECLIA) technique. Methods TRAb concentrations were analyzed by ECLIA in 62 patients with Graves' disease (GD), 65 patients with thyroid diseases without a diagnosis of GD, and 41 healthy controls. The methodology of the ECLIA assay, including precision, recovery, sensitivity, specificity and its correlation with the ELISA assay, was fully evaluated. Results The intra- and inter-assay precision and the recovery of the ECLIA assay were 0.78%~3.30%, 1.25%~5.42% and 96.8%~101.5%, respectively. The diagnostic sensitivity and specificity for GD were 95.1% and 96.2%, respectively. There was a good correlation between ECLIA and ELISA for the detection of serum TRAb (r = 0.9815, P < 0.01). Conclusion The ECLIA assay is an immunological technique with high precision and accuracy for the determination of TRAb concentration, and shows good specificity and sensitivity in GD diagnosis. Furthermore, the ECLIA assay is convenient, fast and suitable for automated analysis, and can meet the needs of clinical testing.

  17. Energy Production System Management - Renewable energy power supply integration with Building Automation System

    Energy Technology Data Exchange (ETDEWEB)

    Figueiredo, Joao [Centre of Mechatronics Engineering - CEM/Institut of Mechanical Engineering - IDMEC, University of Evora, R. Romao Ramalho, 59, 7000-671 Evora (Portugal); Martins, Joao [Centre of Technology and Systems/Faculdade de Ciencias e Tecnologia, Universidade Nova Lisboa, 1049-001 Lisboa (Portugal)

    2010-06-15

    Historically and technologically, intelligent buildings refer to the integration of four distinctive systems: Building Automation Systems (BAS), Telecommunication Systems, Office Automation Systems and Computer Building Management Systems. The increasingly sophisticated BAS has become the "heart and soul" of modern intelligent buildings. Integrating energy supply and demand elements - often known as Demand-Side Management (DSM) - has become an important energy efficiency policy concept. Nowadays, European countries have diversified their power supplies, reducing their dependence on OPEC and developing a broader mix of energy sources that maximizes the use of domestic renewable energy sources. In this way it makes sense to include a fifth system in the intelligent building group: Energy Production System Management (EPSM). This paper presents a Building Automation System where Demand-Side Management is fully integrated with the building's Energy Production System, which incorporates a complete set of renewable energy production and storage systems. (author)

  18. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems to be, nowadays, the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress, marketing fragmentation and the demand for customized products and services on one side, and the need for a constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other side, the classical marketing approach has changed and continues to improve substantially.

  19. Intraperitoneal insulin delivery provides superior glycaemic regulation to subcutaneous insulin delivery in model predictive control-based fully-automated artificial pancreas in patients with type 1 diabetes: a pilot study.

    Science.gov (United States)

    Dassau, Eyal; Renard, Eric; Place, Jérôme; Farret, Anne; Pelletier, Marie-José; Lee, Justin; Huyett, Lauren M; Chakrabarty, Ankush; Doyle, Francis J; Zisser, Howard C

    2017-05-05

    To compare intraperitoneal (IP) to subcutaneous (SC) insulin delivery in an artificial pancreas (AP). Ten adults with type 1 diabetes participated in a non-randomized, non-blinded sequential AP study using the same SC glucose sensing and Zone Model Predictive Control (ZMPC) algorithm adjusted for insulin clearance. On first admission, subjects underwent closed-loop control with SC delivery of a fast-acting insulin analogue for 24 hours. Following implantation of a DiaPort IP insulin delivery system, the identical 24-hour trial was performed with IP regular insulin delivery. The clinical protocol included 3 unannounced meals with 70, 40 and 70 g carbohydrate, respectively. The primary endpoint was time spent with blood glucose (BG) in the range of 80 to 140 mg/dL (4.4-7.7 mmol/L). Percent of time spent within the 80 to 140 mg/dL range was significantly higher for IP delivery than for SC delivery: 39.8 ± 7.6 vs 25.6 ± 13.1 (P = .03). Mean BG (mg/dL) and percent of time spent within the broader 70 to 180 mg/dL range were also significantly better for IP insulin: 151.0 ± 11.0 vs 190.0 ± 31.0 (P = .004) and 65.7 ± 9.2 vs 43.9 ± 14.7 (P = .001), respectively. The superiority of glucose control with IP insulin came from the reduced time spent in hyperglycaemia (>180 mg/dL: 32.4 ± 8.9 vs 53.5 ± 17.4, P = .014; >250 mg/dL: 5.9 ± 5.6 vs 23.0 ± 11.3, P = .0004). Higher daily doses of insulin (IU) were delivered with the IP route (43.7 ± 0.1 vs 32.3 ± 0.1), while time spent <70 mg/dL did not differ significantly (IP: 2.5 ± 2.9 vs SC: 4.1 ± 5.3, P = .42). Glycaemic regulation with a fully-automated AP delivering IP insulin was superior to that with SC insulin delivery. This pilot study provides proof-of-concept for an AP system combining a ZMPC algorithm with IP insulin delivery. © 2017 John Wiley & Sons Ltd.
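    The primary endpoint above (percent of time with BG between 80 and 140 mg/dL, plus the broader 70-180 mg/dL range) is simply the fraction of evenly spaced glucose samples falling inside the range; the sketch below computes it on an invented 24-hour trace.

      # Percent time-in-range from an evenly sampled glucose trace (illustrative).
      import numpy as np

      def percent_time_in_range(glucose_mg_dl, low, high):
          g = np.asarray(glucose_mg_dl, dtype=float)
          return 100.0 * np.mean((g >= low) & (g <= high))

      # Invented 24 h trace sampled every 5 minutes (288 points).
      rng = np.random.default_rng(2)
      trace = 140 + 35 * np.sin(np.linspace(0, 6 * np.pi, 288)) + rng.normal(0, 15, 288)

      print(f"80-140 mg/dL: {percent_time_in_range(trace, 80, 140):.1f}%")
      print(f"70-180 mg/dL: {percent_time_in_range(trace, 70, 180):.1f}%")
      print(f">180 mg/dL:   {100 - percent_time_in_range(trace, 0, 180):.1f}%")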

  20. Optimization of Human NK Cell Manufacturing: Fully Automated Separation, Improved Ex Vivo Expansion Using IL-21 with Autologous Feeder Cells, and Generation of Anti-CD123-CAR-Expressing Effector Cells.

    Science.gov (United States)

    Klöß, Stephan; Oberschmidt, Olaf; Morgan, Michael; Dahlke, Julia; Arseniev, Lubomir; Huppert, Volker; Granzin, Markus; Gardlowski, Tanja; Matthies, Nadine; Soltenborn, Stephanie; Schambach, Axel; Koehl, Ulrike

    2017-10-01

    CD3 depletion and CD56 enrichment steps. Manually performed experiments to test different culture media demonstrated significantly higher NK cell expansion rates and an approximately equal distribution of CD56(dim)CD16(pos) and CD56(bright)CD16(dim&neg) NK subsets on day 14 with cells cultivated in NK MACS(®) media. Moreover, effector cell expansion in manually performed experiments with NK MACS(®) containing IL-2 and irradiated autologous FCs and IL-21, both added at the initiation of the culture, induced an 85-fold NK cell expansion. Compared to freshly isolated NK cells, expanded NK cells expressed significantly higher levels of NKp30, NKp44, NKG2D, TRAIL, FasL, CD69, and CD137, and showed comparable cell viabilities and killing/degranulation activities against tumor and leukemic cell lines in vitro. NK cells used for CAR transduction showed the highest anti-CD123 CAR expression on day 3 after gene modification. These anti-CD123 CAR-engineered NK cells demonstrated improved cytotoxicity against the CD123(pos) AML cell line KG1a and primary AML blasts. In addition, CAR NK cells showed higher degranulation and enhanced secretion of tumor necrosis factor alpha, interferon gamma, and granzyme A and B. In fluorescence imaging, specific interactions that initiated apoptotic processes in the AML target cells were detected between CAR NK cells and KG1a. After the fully automated NK cell separation process on Prodigy, a new NK cell expansion protocol was generated that resulted in high numbers of NK cells with potent antitumor activity, which could be modified efficiently by novel third-generation, alpha-retroviral SIN vector constructs. Next steps are the integration of the manual expansion procedure in the fully integrated platform for a standardized GMP-compliant overall process in this closed system that also may include gene modification of NK cells to optimize target-specific antitumor activity.

  1. Energy Demand

    NARCIS (Netherlands)

    Stehfest, E. et al.

    2014-01-01

    Key policy issues – How will energy demand evolve particularly in emerging and medium- and low- income economies? – What is the mix of end-use energy carriers to meet future energy demand? – How can energy efficiency contribute to reducing the growth rate of energy demand and mitigate pressures on t

  2. An inventory control project in a major Danish company using compound renewal demand models

    DEFF Research Database (Denmark)

    Larsen, Christian; Seiding, Claus Hoe; Teller, Christian

    operation is highly automated. However, the procedures for estimating demands and the policies for the inventory control system that were in use at the beginning of the project did not fully match the sophisticated technological standard of the physical system. During the initial phase of the project...... inventory control variables based on the fitted demand distributions and a service level requirement stated in terms of an order fill rate. Finally, we validated the results of our models against the procedures that had been in use in the company. It was concluded that the new procedures were considerably...

  3. An inventory control project in a major Danish company using compound renewal demand models

    DEFF Research Database (Denmark)

    Larsen, Christian; Seiding, Claus Hoe; Teller, Christian

    2008-01-01

    -conditioning systems. The warehouse logistics operation is highly automated. However, the procedures for estimating demands and the policies for the inventory control system that were in use at the beginning of the project did not fully match the sophisticated technological standard of the physical system. During...... procedures for determining suitable inventory control variables based on the fitted demand distributions and a service-level requirement stated in terms of an order fill rate. Finally, we validated the results of our models against the procedures that had been in use in the company. It was concluded...

  4. An inventory control project in a major Danish company using compound renewal demand models

    DEFF Research Database (Denmark)

    Larsen, Christian; Seiding, Claus Hoe; Teller, Christian

    2008-01-01

    -conditioning systems. The warehouse logistics operation is highly automated. However, the procedures for estimating demands and the policies for the inventory control system that were in use at the beginning of the project did not fully match the sophisticated technological standard of the physical system. During...... that the new procedures provided a better fit with the actual demand processes and were more consistent with the stated objectives for the distribution centre. We also initiated the implementation and integration of the new procedures into the company's inventory management system....
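    The fill-rate requirement running through these records (choosing inventory control variables so that a target order fill rate is met under a fitted demand distribution) can be illustrated with a small simulation that searches for the smallest base-stock level meeting the target; this uses an assumed compound-Poisson demand and a deliberately crude replenishment rule, not the company's actual model.

      # Find the smallest base-stock level meeting a target order fill rate under
      # a simulated compound-Poisson demand (illustrative; not the project's model).
      import numpy as np

      rng = np.random.default_rng(3)

      def simulate_fill_rate(base_stock: int, n_periods: int = 20_000,
                             arrival_rate: float = 1.2, mean_order_size: float = 3.0) -> float:
          """Fraction of customer orders filled completely from stock on hand.

          Demand per period is compound Poisson: a Poisson number of orders,
          each with a geometric order size. Replenishment is assumed to restore
          the base stock at the start of every period (a deliberately crude policy).
          """
          filled = total = 0
          for _ in range(n_periods):
              on_hand = base_stock
              for _ in range(rng.poisson(arrival_rate)):
                  size = rng.geometric(1.0 / mean_order_size)
                  total += 1
                  if size <= on_hand:
                      filled += 1
                      on_hand -= size
          return filled / max(total, 1)

      target = 0.95
      for s in range(1, 40):
          if simulate_fill_rate(s) >= target:
              print(f"smallest base stock meeting a {target:.0%} order fill rate: {s}")
              break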

  5. Automated Contingency Management for Propulsion Systems

    Data.gov (United States)

    National Aeronautics and Space Administration — Increasing demand for improved reliability and survivability of mission-critical systems is driving the development of health monitoring and Automated Contingency...

  6. 21 CFR 864.5240 - Automated blood cell diluting apparatus.

    Science.gov (United States)

    2010-04-01

    § 864.5240 Automated blood cell diluting apparatus. (a) Identification. An automated blood cell diluting apparatus is a fully automated or semi-automated device used to make appropriate dilutions of a blood...

  7. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  8. Automation in immunohematology.

    Science.gov (United States)

    Bajpai, Meenu; Kaur, Ravneet; Gupta, Ekta

    2012-07-01

    There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on the quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes, and archiving of results are other major advantages of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service seeking to provide quality patient care with a shorter turnaround time for an ever-increasing workload. This article discusses the various issues involved in the process.

  9. Automation synthesis modules review.

    Science.gov (United States)

    Boschi, S; Lodi, F; Malizia, C; Cicoria, G; Marengo, M

    2013-06-01

    The introduction of (68)Ga labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived (68)Ge/(68)Ga generator has been at the basis of the development of (68)Ga radiopharmacy. The huge increase in clinical demand, the impact of regulatory issues and the need for careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for (68)Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of automation on compliance with regulations. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. A Fully Automated Radiosynthesis of [18F]Fluoroethyl-Diprenorphine on a Single Module by Use of SPE Cartridges for Preparation of High Quality 2-[18F]Fluoroethyl Tosylate

    Directory of Open Access Journals (Sweden)

    Gjermund Henriksen

    2013-06-01

    Full Text Available We have developed a new method for automated production of 2-[18F]fluoroethyl tosylate ([18F]FETos) that enables 18F-alkylation to provide PET tracers with high chemical purity. The method is based on the removal of the excess ethylene glycol bistosylate precursor by precipitation and subsequent filtration, and purification of the filtrate by means of solid phase extraction (SPE) cartridges. The method is integrated into a single synthesis module and thereby provides the advantage over previous methods of not requiring HPLC purification, as demonstrated by the full radiosynthesis of the potent opioid receptor PET tracer [18F]fluoroethyldiprenorphine.

  11. Automated urinalysis.

    Science.gov (United States)

    Carlson, D A; Statland, B E

    1988-09-01

    Many sources of variation affect urinalysis testing. These are due to physiologic changes in the patient, therapeutic interventions, and collection, transportation, and storage of urine specimens. There are problems inherent to the manual performance of this high-volume test. Procedures are poorly standardized across the United States, and even within the same laboratory there can be significant technologist-to-technologist variability. The methods used can perturb the specimen so that recovery of analytes is less than 100 per cent in the aliquot examined. The absence of significant automation of the entire test, with the one exception of the Yellow IRIS, is unusual in the clinical laboratory setting, where most other hematology and chemistry testing has been fully automated. Our evaluation of the Yellow IRIS found that this system is an excellent way to improve the quality of the results and thereby physician acceptance. There is a positive impact for those centers using this instrument, both for the laboratory and for the hospital.

  12. Magnetic Resonance Parkinsonism Index: diagnostic accuracy of a fully automated algorithm in comparison with the manual measurement in a large Italian multicentre study in patients with progressive supranuclear palsy

    Energy Technology Data Exchange (ETDEWEB)

    Nigro, Salvatore [National Research Council, Institute of Bioimaging and Molecular Physiology, Catanzaro (Italy); Arabia, Gennarina [University "Magna Graecia", Institute of Neurology, Department of Medical and Surgical Sciences, Catanzaro (Italy); Antonini, Angelo; Weis, Luca; Marcante, Andrea ["Fondazione Ospedale San Camillo" - I.R.C.C.S, Parkinson's Disease and Movement Disorders Unit, Venice-Lido (Italy); Tessitore, Alessandro; Cirillo, Mario; Tedeschi, Gioacchino [Second University of Naples, Department of Medical, Surgical, Neurological, Metabolic and Aging Sciences, Naples (Italy); Second University of Naples, MRI Research Center SUN-FISM, Naples (Italy); Zanigni, Stefano; Tonon, Caterina [Policlinico S. Orsola - Malpighi, Functional MR Unit, Bologna (Italy); University of Bologna, Department of Biomedical and Neuromotor Sciences, Bologna (Italy); Calandra-Buonaura, Giovanna [University of Bologna, Department of Biomedical and Neuromotor Sciences, Bologna (Italy); IRCCS Istituto delle Scienze Neurologiche di Bologna, Bologna (Italy); Pezzoli, Gianni; Cilia, Roberto [ASST G.Pini - CTO, ex ICP, Parkinson Institute, Milano (Italy); Zappia, Mario; Nicoletti, Alessandra; Cicero, Calogero Edoardo [University of Catania, Department "G.F. Ingrassia", Section of Neurosciences, Catania (Italy); Tinazzi, Michele; Tocco, Pierluigi [University Hospital of Verona, Department of Neurological and Movement Sciences, Verona (Italy); Cardobi, Nicolo [University Hospital of Verona, Institute of Radiology, Verona (Italy); Quattrone, Aldo [National Research Council, Institute of Bioimaging and Molecular Physiology, Catanzaro (Italy); University "Magna Graecia", Institute of Neurology, Department of Medical and Surgical Sciences, Catanzaro (Italy)

    2017-06-15

    To investigate the reliability of a new in-house automatic algorithm for calculating the Magnetic Resonance Parkinsonism Index (MRPI), in a large multicentre study population of patients affected by progressive supranuclear palsy (PSP) or Parkinson's disease (PD), and healthy controls (HC), and to compare the diagnostic accuracy of the automatic and manual MRPI values. The study included 88 PSP patients, 234 PD patients and 117 controls. MRI was performed using both 3T and 1.5T scanners. Automatic and manual MRPI values were evaluated, and accuracy of both methods in distinguishing PSP from PD and controls was calculated. No statistical differences were found between automated and manual MRPI values in all groups. The automatic MRPI values differentiated PSP from PD with an accuracy of 95 % (manual MRPI accuracy 96 %) and 97 % (manual MRPI accuracy 100 %) for 1.5T and 3T scanners, respectively. Our study showed that the new in-house automated method for MRPI calculation was highly accurate in distinguishing PSP from PD. Our automatic approach allows a widespread use of MRPI in clinical practice and in longitudinal research studies. (orig.)
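    For reference, the MRPI is commonly computed as the product of the pons-to-midbrain area ratio and the MCP-to-SCP width ratio; the sketch below implements only that ratio on invented measurements and is unrelated to the authors' automated segmentation.

      # Magnetic Resonance Parkinsonism Index from manual-style measurements
      # (example numbers are invented; this is not the authors' automated algorithm).
      def mrpi(pons_area_mm2: float, midbrain_area_mm2: float,
               mcp_width_mm: float, scp_width_mm: float) -> float:
          """MRPI = (pons/midbrain area ratio) * (MCP/SCP width ratio)."""
          return (pons_area_mm2 / midbrain_area_mm2) * (mcp_width_mm / scp_width_mm)

      # Invented example values for a single subject.
      value = mrpi(pons_area_mm2=520.0, midbrain_area_mm2=95.0,
                   mcp_width_mm=8.1, scp_width_mm=3.0)
      print(f"MRPI = {value:.2f}")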

  13. Automated Demand Response for Energy Sustainability

    Science.gov (United States)

    2015-05-01

    potential cost disadvantages (such as increased first cost, installation cost, and/or operations & maintenance costs) are expected. 2.3.5 Social Acceptance...temperature data was measured at the KBYS Fort Irwin / Barstow station, which is located on the Bicycle Lake Army Airfield about three miles from the

  14. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models...... in untested destinations. The option to forecast demands causes firms to delay exporting in order to gather more information about foreign demand. Third, since uncertainty is resolved after entry, many firms enter a destination and then exit after learning that they cannot profit. This prediction reconciles...

  15. STUDY ON C2B E-BUSINESS AUTOMATED NEGOTIATION SYSTEM FOR TOURISM SUPPORTING DEMANDS AGGREGATION

    Institute of Scientific and Technical Information of China (English)

    刘晓文; 韩冰; 蒋永辉; 于瑾

    2015-01-01

    We propose a new C2B e-business automated negotiation system for tourism that supports demand aggregation, in order to better match the transaction activities of tourism-service suppliers and demanders by combining the C2B e-business pattern with an automated negotiation system. Agent technology is introduced to automatically aggregate a large number of scattered personal tourism-service demands; the aggregated demands are then passed to the C2B e-business automated negotiation system, whose negotiation Agent automatically negotiates with the negotiation Agent of the tourism-service companies so as to match transactions between the suppliers and demanders of travel services. The paper describes the overall framework and operating mechanism of the system and studies the strategy and method of demand aggregation. Based on the FIPA specification, the corresponding automated negotiation protocol and the control algorithm of the automated negotiation process are designed. A prototype system was developed on JADE, and the feasibility of the design was validated.
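
    The record describes a JADE/FIPA multi-agent implementation; the Python sketch below is only a schematic stand-in that illustrates the two steps named in the abstract (aggregating scattered demands, then negotiating a bulk price). The grouping key, the concession rule and all numbers are illustrative assumptions, not the paper's protocol.

        def aggregate_demands(requests):
            """Group scattered tourist requests by (service, date) so one bulk
            deal can be negotiated per group (stand-in for the Agent-based
            demand-aggregation step)."""
            groups = {}
            for r in requests:
                groups.setdefault((r["service"], r["date"]), []).append(r)
            return groups

        def negotiate(buyer_bid, seller_ask, max_rounds=10, concession=0.05):
            """Toy alternating-offers loop: both sides concede a fixed fraction
            per round until the offers cross (a loose analogue of an iterated
            FIPA negotiation, not the actual protocol designed in the paper)."""
            for _ in range(max_rounds):
                if buyer_bid >= seller_ask:
                    return round((buyer_bid + seller_ask) / 2, 2)  # agreed price
                buyer_bid *= 1 + concession
                seller_ask *= 1 - concession
            return None  # no agreement reached

        requests = [{"service": "city tour", "date": "2015-05-01"}] * 25
        print(len(aggregate_demands(requests)), negotiate(40.0, 55.0))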

  16. Demand forecasting

    OpenAIRE

    Gregor, Belčec

    2011-01-01

    Companies operate in an increasingly challenging environment that requires them to continuously improve all areas of the business process. Demand forecasting is one area in manufacturing companies where we can hope to gain great advantages. Improvements in forecasting can result in cost savings throughout the supply chain, improve the reliability of information and the quality of the service for our customers. In the company Danfoss Trata, d. o. o. we did not have a system for demand forecast...

  17. Warehouse automation

    OpenAIRE

    Pogačnik, Jure

    2017-01-01

    An automated high-bay warehouse is commonly used for storing a large number of materials with a high throughput. In an automated warehouse, pallet movements are mainly performed by a number of automated devices such as conveyor systems, trolleys, and stacker cranes. From the introduction of the material into the automated warehouse system to its dispatch, the system requires no operator input or intervention, since all material movements are done automatically. This allows the automated warehouse to op...

  18. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed literature describing informatics systems that support or automate the processes of systematic review or each of the tasks of the systematic review. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed for realizing automation for the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  19. Manual segmentation of the fornix, fimbria, and alveus on high-resolution 3T MRI: Application via fully-automated mapping of the human memory circuit white and grey matter in healthy and pathological aging.

    Science.gov (United States)

    Amaral, Robert S C; Park, Min Tae M; Devenyi, Gabriel A; Lynn, Vivian; Pipitone, Jon; Winterburn, Julie; Chavez, Sofia; Schira, Mark; Lobaugh, Nancy J; Voineskos, Aristotle N; Pruessner, Jens C; Chakravarty, M Mallar

    2016-10-18

    Recently, much attention has been focused on the definition and structure of the hippocampus and its subfields, while the projections from the hippocampus have been relatively understudied. Here, we derive a reliable protocol for manual segmentation of hippocampal white matter regions (alveus, fimbria, and fornix) using high-resolution magnetic resonance images that are complementary to our previous definitions of the hippocampal subfields, both of which are freely available at https://github.com/cobralab/atlases. Our segmentation methods demonstrated high inter- and intra-rater reliability, were validated as inputs in automated segmentation, and were used to analyze the trajectory of these regions in both healthy aging (OASIS), and Alzheimer's disease (AD) and mild cognitive impairment (MCI; using ADNI). We observed significant bilateral decreases in the fornix in healthy aging while the alveus and cornu ammonis (CA) 1 were well preserved (all p's<0.006). MCI and AD demonstrated significant decreases in fimbriae and fornices. Many hippocampal subfields exhibited decreased volume in both MCI and AD, yet no significant differences were found between MCI and AD cohorts themselves. Our results suggest a neuroprotective or compensatory role for the alveus and CA1 in healthy aging and suggest that an improved understanding of the volumetric trajectories of these structures is required.

  20. Information management - Assessing the demand for information

    Science.gov (United States)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  1. Demanding Satisfaction

    Science.gov (United States)

    Oguntoyinbo, Lekan

    2010-01-01

    It was the kind of crisis most universities dread. In November 2006, a group of minority student leaders at Indiana University-Purdue University Indianapolis (IUPUI) threatened to sue the university if administrators did not heed demands that included providing more funding for multicultural student groups. This article discusses how this threat…

  2. Clinical application of fully automated multicapillary zone electrophoresis in hemoglobin analysis

    Institute of Scientific and Technical Information of China (English)

    卢业成; 郑师陵; 肖艳华; 陈星; 初德强

    2009-01-01

    Objective: To explore the clinical value of fully automated multicapillary zone electrophoresis in hemoglobin analysis. Methods: Using the French Sebia Capillarys 2 electrophoresis system (Version 5.50) and its matched reagents, whole-blood samples were processed automatically by the system, and hemoglobin electrophoresis was performed in eight parallel capillaries (17.5 cm × 25 μm) at 9.8 kV and 34 °C in pH 9.4 buffer; the percentage concentration of each fraction was detected directly at 415 nm. Results: From the hemoglobin electrophoresis results of 183 healthy adults, the laboratory reference range for HbA2 was established as 2.1%-3.2%. Precision testing with the Hb AFSC quality control gave CVs for each fraction of 0.21%-1.07% (intra-assay) and 0.60%-3.10% (inter-assay). Among 6,045 patient samples submitted for the diagnosis and differential diagnosis of haemolytic jaundice, elevated HbA2 was detected in 379 cases, elevated HbF in 203 cases, an HbH band in 27, an HbE band in 8, an HbS band in 4, an Hb J-K band in 3, Hb Bart's in 2, and HbD in 1. Conclusion: The Sebia Capillarys 2 system clearly resolves elevated HbA2 and HbF and other abnormal bands, accurately separates HbC and HbE from HbA2, cleanly separates and focuses HbF lying between HbA and HbS, and readily distinguishes HbS from HbD despite their similar mobility. It is highly automated, simple to operate, rapid and accurate, and is suitable for routine hemoglobin analysis in clinical laboratories.
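
    As a small worked example of the precision and reference-range figures quoted above, the Python sketch below computes a coefficient of variation from replicate control measurements and flags an HbA2 result against the 2.1%-3.2% reference range reported in the study; the replicate values and the 4.1% test result are hypothetical.

        from statistics import mean, stdev

        def cv_percent(replicates):
            """Coefficient of variation (%) = SD / mean x 100, as used in the
            precision testing of the Hb AFSC control material."""
            return stdev(replicates) / mean(replicates) * 100

        def flag_hba2(hba2_percent, ref_low=2.1, ref_high=3.2):
            """Flag HbA2 against the laboratory reference range reported above;
            an elevated HbA2 is the usual screening cue for beta-thalassaemia trait."""
            if hba2_percent > ref_high:
                return "elevated"
            if hba2_percent < ref_low:
                return "decreased"
            return "within reference range"

        print(round(cv_percent([2.6, 2.7, 2.6, 2.7, 2.6]), 2), flag_hba2(4.1))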

  3. Accounting Automation

    OpenAIRE

    Laynebaril1

    2017-01-01

    Accounting Automation. Imagine you are a consultant hired to convert a manual accounting system to an automated system. Suggest the key advantages and disadvantages of automating a manual accounting system. Identify the most important step in the conversion process. Provide a rationale for your response.

  4. Home Automation

    OpenAIRE

    Ahmed, Zeeshan

    2010-01-01

    In this paper I briefly discuss the importance of home automation systems. Going into the details, I present a real-time, software- and hardware-oriented house automation research project that was designed and implemented, capable of automating a house's electricity and providing a security system that detects the presence of unexpected behavior.

  5. Theory and key problems for automated demand response of user side considering generalized demand side resources

    Institute of Scientific and Technical Information of China (English)

    汤庆峰; 刘念; 张建华

    2014-01-01

    Automated demand response (Auto-DR) is one of the key technologies of the smart grid. When generalized demand-side resources (DSR) are connected to the grid, implementing Auto-DR on the user side places higher requirements on the system. This paper first introduces the basic form of the user-side smart electricity-consumption unit and analyses the applicability of DSR such as controllable loads, distributed energy resources, energy storage and electric vehicles. Two Auto-DR operation modes (“independent user & node-type smart unit” and “collective users & aggregated smart unit”) are proposed, together with their corresponding electrical and information architectures. The paper then reviews the research status and development of user-side Auto-DR, covering system architecture design, load characteristics and forecasting for different types of users, load controllability and control models, and optimization models and methods. Finally, it concludes that key problems such as short-term user load forecasting, load controllability and scheduling models, optimized operation of Auto-DR, and comprehensive benefit evaluation still require further research.

  6. Demand Uncertainty

    DEFF Research Database (Denmark)

    Nguyen, Daniel Xuyen

    This paper presents a model of trade that explains why firms wait to export and why many exporters fail. Firms face uncertain demands that are only realized after the firm enters the destination. The model retools the timing of uncertainty resolution found in productivity heterogeneity models...... the high rate of exit seen in the first years of exporting. Finally, when faced with multiple countries in which to export, some firms will choose to sequentially export in order to slowly learn more about its chances for success in untested markets....

  7. The automated Palomar 60 inch telescope

    OpenAIRE

    Cenko, S Bradley; Fox, Derek B.; Moon, Dae-Sik; Harrison, Fiona A.; Kulkarni, S.R.; Henning, John R.; Guzman, C. Dani; Bonati, Marco; Smith, Roger M.; Thicksten, Robert P.; Doyle, Michael W.; Petrie, Hal L.; Gal-Yam, Avishay; Soderberg, Alicia M.; Anagnostou, Nathaniel L.

    2006-01-01

    We have converted the Palomar 60-inch telescope (P60) from a classical night assistant-operated telescope to a fully robotic facility. The automated system, which has been operational since September 2004, is designed for moderately fast (t

  8. Fully automated procedure for ship detection using optical satellite imagery

    Science.gov (United States)

    Corbane, C.; Pecoul, E.; Demagistri, L.; Petit, M.

    2009-01-01

    Ship detection from remote sensing imagery is a crucial application for maritime security which includes among others traffic surveillance, protection against illegal fisheries, oil discharge control and sea pollution monitoring. In the framework of a European integrated project GMES-Security/LIMES, we developed an operational ship detection algorithm using high spatial resolution optical imagery to complement existing regulations, in particular the fishing control system. The automatic detection model is based on statistical methods, mathematical morphology and other signal processing techniques such as the wavelet analysis and Radon transform. This paper presents current progress made on the detection model and describes the prototype designed to classify small targets. The prototype was tested on panchromatic SPOT 5 imagery taking into account the environmental and fishing context in French Guiana. In terms of automatic detection of small ship targets, the proposed algorithm performs well. Its advantages are manifold: it is simple and robust, but most of all, it is efficient and fast, which is a crucial point in performance evaluation of advanced ship detection strategies.
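
    As a rough illustration of the kind of processing chain summarized above, the Python sketch below applies a simple brightness threshold and morphological filtering to flag small bright targets at sea; it is an assumption-laden simplification (scikit-image based, synthetic data, no land or cloud masking, and none of the wavelet or Radon-transform refinement stages of the operational prototype).

        import numpy as np
        from skimage import measure, morphology

        def detect_bright_targets(panchromatic, k_sigma=5.0, min_area=4, max_area=400):
            """Flag pixels far above the sea-clutter background, clean the mask with
            a morphological opening, then keep connected components of ship-like size.
            This is only a schematic stand-in for the detection model in the paper."""
            threshold = panchromatic.mean() + k_sigma * panchromatic.std()
            cleaned = morphology.opening(panchromatic > threshold, morphology.disk(1))
            labels = measure.label(cleaned)
            return [r.centroid for r in measure.regionprops(labels)
                    if min_area <= r.area <= max_area]

        sea = np.random.normal(50.0, 5.0, (200, 200))   # synthetic sea background
        sea[60:63, 80:83] += 120.0                      # two synthetic bright targets
        sea[150:154, 30:33] += 120.0
        print(detect_bright_targets(sea))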

  9. Rapid and fully automated Measurement of Water Vapor Sorption Isotherms

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Møldrup, Per

    2014-01-01

    Eminent environmental challenges such as remediation of contaminated sites, the establishment and maintenance of nuclear waste repositories, or the design of surface landfill covers all require accurate quantification of the soil water characteristic at low water contents. Furthermore, several...... essential but difficult-to-measure soil properties such as clay content and specific surface area are intimately related to water vapor sorption. Until recently, it was a major challenge to accurately measure detailed water vapor sorption isotherms within an acceptable time frame. This priority...... and pesticide volatilization, toxic organic vapor sorption kinetics, and soil water repellency are illustrated. Several methods to quantify hysteresis effects and to derive soil clay content and specific surface area from VSA-measured isotherms are presented. Besides above mentioned applications, potential...

  10. Simple Fully Automated Group Classification on Brain fMRI

    Energy Technology Data Exchange (ETDEWEB)

    Honorio, J.; Goldstein, R.; Honorio, J.; Samaras, D.; Tomasi, D.; Goldstein, R.Z.

    2010-04-14

    We propose a simple, well-grounded classification technique which is suited for group classification on brain fMRI data sets that have high dimensionality, a small number of subjects, high noise level, high subject variability and imperfect registration, and that capture subtle cognitive effects. We propose threshold-split region as a new feature selection method and majority vote as the classification technique. Our method does not require a predefined set of regions of interest. We use averaging across sessions, only one feature per experimental condition, a feature independence assumption, and simple classifiers. The seemingly counter-intuitive approach of using a simple design is supported by signal processing and statistical theory. Experimental results in two block-design data sets that capture brain function under distinct monetary rewards for cocaine-addicted and control subjects show that our method exhibits increased generalization accuracy compared to commonly used feature selection and classification techniques.
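
    A minimal Python sketch of the general idea (one simple per-feature classifier followed by a majority vote across features) is given below; the midpoint-between-class-means threshold is an illustrative assumption and is not the authors' exact threshold-split-region procedure.

        import numpy as np

        def train_thresholds(X, y):
            """For each feature, place a threshold midway between the two class
            means and record which side corresponds to class 1 (a rough stand-in
            for per-feature threshold classifiers)."""
            mu0, mu1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
            return (mu0 + mu1) / 2.0, mu1 > mu0

        def majority_vote(X, thresholds, direction):
            """Each feature casts one vote; the predicted label is the majority."""
            votes = np.where(direction, X > thresholds, X < thresholds)
            return (votes.sum(axis=1) > X.shape[1] / 2).astype(int)

        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(0.0, 1.0, (20, 15)),   # e.g. control subjects
                       rng.normal(0.7, 1.0, (20, 15))])  # e.g. patient group
        y = np.repeat([0, 1], 20)
        thr, direction = train_thresholds(X, y)
        print((majority_vote(X, thr, direction) == y).mean())  # training accuracy of the sketch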

  11. Library Automation

    OpenAIRE

    Dhakne, B. N.; Giri, V. V.; Waghmode, S. S.

    2010-01-01

    New technologies library provides several new materials, media and mode of storing and communicating the information. Library Automation reduces the drudgery of repeated manual efforts in library routine. By use of library automation collection, Storage, Administration, Processing, Preservation and communication etc.

  12. Android Fully Loaded

    CERN Document Server

    Huddleston, Rob

    2012-01-01

    Fully loaded with the latest tricks and tips on your new Android! Android smartphones are so hot, they're soaring past iPhones on the sales charts. And the second edition of this muscular little book is equally impressive--it's packed with tips and tricks for getting the very most out of your latest-generation Android device. Start Facebooking and tweeting with your Android mobile, scan barcodes to get pricing and product reviews, download your favorite TV shows--the book is positively bursting with practical and fun how-tos. Topics run the gamut from using speech recognition, location-based m

  13. On Fully Homomorphic Encryption

    OpenAIRE

    Fauzi, Prastudy

    2012-01-01

    Fully homomorphic encryption is an encryption scheme where a party can receive encrypted data and perform arbitrary operations on this data efficiently. The data remains encrypted throughout, but the operations can be done regardless, without having to know the decryption key. Such a scheme would be very advantageous, for example in ensuring the privacy of data that is sent to a third-party service. This is in contrast with schemes like Paillier where you cannot perform a multiplication of encr...

  14. Coordinated Demand Response and Distributed Generation Management in Residential Smart Microgrids

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Mokhtari, Ghassem; Guerrero, Josep M.

    2016-01-01

    Nowadays, with the emergence of small-scale integrated energy systems (IESs) in the form of residential smart microgrids (SMGs), a large portion of energy can be saved through coordinated scheduling of smart household devices and management of distributed energy resources (DERs). There are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy, and of typical implementations of building-level DERs, by integrating them into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices and advanced building control/automation systems... Finally, the effectiveness and applicability of the proposed model is tested and validated in different operating modes and compared to the existing models. The findings of this chapter show that, by the use of an expert EMS that coordinates supply and demand sides simultaneously, it is very possible not only...

  15. 21 CFR 864.5620 - Automated hemoglobin system.

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Automated hemoglobin system. 864.5620 Section 864.5620 Automated hemoglobin system. (a) Identification. An automated hemoglobin system is a fully... hemoglobin content of human blood. (b) Classification. Class II (performance standards)....

  16. Fully Awake Breast Reduction.

    Science.gov (United States)

    Filson, Simon A; Yarhi, Danielle; Ramon, Yitzhak

    2016-11-01

    The authors present 25 cases and an in-depth 4-minute video of fully awake aesthetic breast reduction, which was made possible by thoracic epidural anesthesia. There are obvious and important advantages to this technique. Not only does this allow for intraoperative patient cooperation (i.e., patient self-positioning and opinion for comparison of breasts), meaning a shorter and more efficient intraoperative time, there also is a reduction in postoperative pain, complications, recovery, and discharge times. The authors have also enjoyed great success and no complications with this technique in over 150 awake abdominoplasty/total body lift patients. The authors feel that the elimination of the need for general anesthesia by thoracic epidural sensorial-only anesthesia is a highly effective and efficient technique, with very few disadvantages/complications, providing advantages to both patients and surgeons. Therapeutic, IV.

  17. In Orbit Performance of a Fully Autonomous Star Tracker

    DEFF Research Database (Denmark)

    Jørgensen, John Leif

    1999-01-01

    The Department of Automation at DTU has developed the Advanced Stellar Compass (ASC), a fully autonomous star tracker, for use as high precision attitude reference onboard spacecrafts. The ASC is composed of a CCD-based camera and a powerful microprocessor containing star catalogue, image...

  18. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
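
    To make the endmember idea concrete, the Python sketch below screens for hot- and cold-pixel candidates with simple NDVI and land-surface-temperature percentiles. This is only an illustrative pre-screening of the kind such workflows start from (the paper's contribution adds machine-learning tools and search algorithms on top), and all arrays, variable names and percentile choices are assumptions.

        import numpy as np

        def candidate_endmembers(lst, ndvi, hot_pct=99, cold_pct=1):
            """Hot candidates: hottest pixels among sparsely vegetated (low-NDVI) areas.
            Cold candidates: coldest pixels among densely vegetated (high-NDVI) areas."""
            dry = ndvi < np.nanpercentile(ndvi, 10)
            wet = ndvi > np.nanpercentile(ndvi, 90)
            hot = dry & (lst > np.nanpercentile(lst[dry], hot_pct))
            cold = wet & (lst < np.nanpercentile(lst[wet], cold_pct))
            return np.argwhere(hot), np.argwhere(cold)

        lst = np.random.uniform(290.0, 330.0, (100, 100))    # synthetic LST, kelvin
        ndvi = np.random.uniform(0.05, 0.90, (100, 100))     # synthetic NDVI
        hot_px, cold_px = candidate_endmembers(lst, ndvi)
        print(len(hot_px), len(cold_px))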

  19. Automating the Purple Crow Lidar

    Directory of Open Access Journals (Sweden)

    Hicks Shannon

    2016-01-01

    Full Text Available The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror’s movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  20. Automating the Purple Crow Lidar

    Science.gov (United States)

    Hicks, Shannon; Sica, R. J.; Argall, P. S.

    2016-06-01

    The Purple Crow LiDAR (PCL) was built to measure short and long term coupling between the lower, middle, and upper atmosphere. The initial component of my MSc. project is to automate two key elements of the PCL: the rotating liquid mercury mirror and the Zaber alignment mirror. In addition to the automation of the Zaber alignment mirror, it is also necessary to describe the mirror's movement and positioning errors. Its properties will then be added into the alignment software. Once the alignment software has been completed, we will compare the new alignment method with the previous manual procedure. This is the first among several projects that will culminate in a fully-automated lidar. Eventually, we will be able to work remotely, thereby increasing the amount of data we collect. This paper will describe the motivation for automation, the methods we propose, preliminary results for the Zaber alignment error analysis, and future work.

  1. Automotive automation: Investigating the impact on drivers' mental workload

    OpenAIRE

    Young, M.S.; Stanton, N A

    1997-01-01

    Recent advances in technology have meant that an increasing number of vehicle driving tasks are becoming automated. Such automation poses new problems for the ergonomist. Of particular concern in this paper are the twofold effects of automation on mental workload: novel technologies could increase attentional demand and workload; alternatively, one could argue that fewer driving tasks will lead to the problem of reduced attentional demand and driver underload. A brief review of...

  2. Varying Levels of Automation on UAS Operator Responses to Traffic Resolution Advisories in Civil Airspace

    Science.gov (United States)

    Kenny, Caitlin; Fern, Lisa

    2012-01-01

    Continuing demand for the use of Unmanned Aircraft Systems (UAS) has put increasing pressure on operations in civil airspace. The need to fly UAS in the National Airspace System (NAS) in order to perform missions vital to national security and defense, emergency management, and science is increasing at a rapid pace. In order to ensure safe operations in the NAS, operators of unmanned aircraft, like those of manned aircraft, may be required to maintain separation assurance and avoid loss of separation with other aircraft while performing their mission tasks. This experiment investigated the effects of varying levels of automation on UAS operator performance and workload while responding to conflict resolution instructions provided by the Tactical Collision Avoidance System II (TCAS II) during a UAS mission in high-density airspace. The purpose of this study was not to investigate the safety of using TCAS II on UAS, but rather to examine the effect of automation on the ability of operators to respond to traffic collision alerts. Six licensed pilots were recruited to act as UAS operators for this study. Operators were instructed to follow a specified mission flight path, while maintaining radio contact with Air Traffic Control and responding to TCAS II resolution advisories. Operators flew four, 45 minute, experimental missions with four different levels of automation: Manual, Knobs, Management by Exception, and Fully Automated. All missions included TCAS II Resolution Advisories (RAs) that required operator attention and rerouting. Operator compliance and reaction time to RAs was measured, and post-run NASA-TLX ratings were collected to measure workload. Results showed significantly higher compliance rates, faster responses to TCAS II alerts, as well as less preemptive operator actions when higher levels of automation are implemented. Physical and Temporal ratings of workload were significantly higher in the Manual condition than in the Management by Exception and

  3. Automated High Throughput Drug Target Crystallography

    Energy Technology Data Exchange (ETDEWEB)

    Rupp, B

    2005-02-18

    The molecular structures of drug target proteins and receptors form the basis for 'rational' or structure-guided drug design. The majority of target structures are experimentally determined by protein X-ray crystallography, which has evolved into a highly automated, high throughput drug discovery and screening tool. Process automation has accelerated tasks from parallel protein expression, fully automated crystallization, and rapid data collection to highly efficient structure determination methods. A thoroughly designed automation technology platform supported by a powerful informatics infrastructure forms the basis for optimal workflow implementation and the data mining and analysis tools to generate new leads from experimental protein drug target structures.

  4. Fully electric waste collection

    CERN Multimedia

    Anaïs Schaeffer

    2015-01-01

    Since 15 June, Transvoirie, which provides waste collection services throughout French-speaking Switzerland, has been using a fully electric lorry for its collections on the CERN site – a first for the region!   Featuring a motor powered by electric batteries that charge up when the brakes are used, the new lorry that roams the CERN site is as green as can be. And it’s not only the motor that’s electric: its waste compactor and lifting mechanism are also electrically powered*, making it the first 100% electric waste collection vehicle in French-speaking Switzerland. Considering that a total of 15.5 tonnes of household waste and paper/cardboard are collected each week from the Meyrin and Prévessin sites, the benefits for the environment are clear. This improvement comes as part of CERN’s contract with Transvoirie, which stipulates that the firm must propose ways of becoming more environmentally friendly (at no extra cost to CERN). *The was...

  5. The Employment-Impact of Automation in Canada

    OpenAIRE

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  6. Wireless Demand Response Controls for HVAC Systems

    Energy Technology Data Exchange (ETDEWEB)

    Federspiel, Clifford

    2009-06-30

    The objectives of this scoping study were to develop and test control software and wireless hardware that could enable closed-loop, zone-temperature-based demand response in buildings that have either pneumatic controls or legacy digital controls that cannot be used as part of a demand response automation system. We designed a SOAP client that is compatible with the Demand Response Automation Server (DRAS) being used by the IOUs in California for their CPP program, designed the DR control software, investigated the use of cellular routers for connecting to the DRAS, and tested the wireless DR system with an emulator running a calibrated model of a working building. The results show that the wireless DR system can shed approximately 1.5 Watts per design CFM on the design day in a hot, inland climate in California while keeping temperatures within the limits of ASHRAE Standard 55: Thermal Environmental Conditions for Human Occupancy.
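
    The reported shed of roughly 1.5 W per design CFM scales directly with a building's design airflow; the short Python sketch below does that arithmetic for a hypothetical 50,000 CFM system (the airflow value is an assumption used only for illustration).

        def estimated_shed_kw(design_cfm, shed_w_per_cfm=1.5):
            """Whole-building design-day shed estimate from the ~1.5 W/CFM figure
            reported for a hot, inland California climate."""
            return design_cfm * shed_w_per_cfm / 1000.0

        print(estimated_shed_kw(50000))  # -> 75.0 kW for a hypothetical 50,000 CFM system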

  7. Testing fully depleted CCD

    Science.gov (United States)

    Casas, Ricard; Cardiel-Sas, Laia; Castander, Francisco J.; Jiménez, Jorge; de Vicente, Juan

    2014-08-01

    The focal plane of the PAU camera is composed of eighteen 2K x 4K CCDs. These devices, plus four spares, were provided by the Japanese company Hamamatsu Photonics K.K. with type no. S10892-04(X). These detectors are 200 μm thick fully depleted and back illuminated with an n-type silicon base. They have been built with a specific coating to be sensitive in the range from 300 to 1,100 nm. Their square pixel size is 15 μm. The read-out system consists of a Monsoon controller (NOAO) and the panVIEW software package. The default CCD read-out speed is 133 kpixel/s. This is the value used in the calibration process. Before installing these devices in the camera focal plane, they were characterized using the facilities of the ICE (CSIC-IEEC) and IFAE in the UAB Campus in Bellaterra (Barcelona, Catalonia, Spain). The basic tests performed for all CCDs were to obtain the photon transfer curve (PTC), the charge transfer efficiency (CTE) using X-rays and the EPER method, linearity, read-out noise, dark current, persistence, cosmetics and quantum efficiency. The X-ray images were also used for the analysis of the charge diffusion for different substrate voltages (VSUB). Regarding the cosmetics, and in addition to white and dark pixels, some patterns were also found. The first one, which appears in all devices, is the presence of half circles at the external edges. The origin of this pattern can be related to the assembly process. A second one appears in the dark images and shows bright arcs connecting corners along the vertical axis of the CCD. This feature appears in all CCDs in exactly the same position, so our guess is that the pattern is due to electrical fields. Finally, in just two devices, there is a spot with wavelength dependence whose origin could be the result of a defective coating process.
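
    As background to the photon transfer curve (PTC) test mentioned above, the Python sketch below recovers a conversion gain and read noise from mean/variance pairs using the textbook PTC relation (variance is approximately mean/gain plus the read-noise variance); the synthetic numbers are illustrative and this is not the PAU team's actual reduction code.

        import numpy as np

        def ptc_gain(mean_adu, var_adu2):
            """Fit variance vs. mean from flat-field pairs; in the shot-noise-limited
            regime the slope is 1/gain and the intercept the read-noise variance."""
            slope, intercept = np.polyfit(mean_adu, var_adu2, 1)
            return 1.0 / slope, float(np.sqrt(max(intercept, 0.0)))  # gain [e-/ADU], RN [ADU]

        # Synthetic detector with gain ~2 e-/ADU and read noise ~5 ADU
        mean_adu = np.linspace(500.0, 30000.0, 20)
        var_adu2 = mean_adu / 2.0 + 5.0**2
        gain, read_noise = ptc_gain(mean_adu, var_adu2)
        print(round(gain, 2), round(read_noise, 2))  # -> 2.0 5.0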

  8. Dynamic adaptive policymaking for the sustainable city: The case of automated taxis

    Directory of Open Access Journals (Sweden)

    Warren E. Walker

    2017-06-01

    Full Text Available By 2050, about two-thirds of the world’s people are expected to live in urban areas. But, the economic viability and sustainability of city centers is threatened by problems related to transport, such as pollution, congestion, and parking. Much has been written about automated vehicles and demand responsive transport. The combination of these potentially disruptive developments could reduce these problems. However, implementation is held back by uncertainties, including public acceptance, liability, and privacy. So, their potential to reduce urban transport problems may not be fully realized. We propose an adaptive approach to implementation that takes some actions right away and creates a framework for future actions that allows for adaptations over time as knowledge about performance and acceptance of the new system (called ‘automated taxis’) accumulates and critical events for implementation take place. The adaptive approach is illustrated in the context of a hypothetical large city.

  9. An automated HIV-1 Env-pseudotyped virus production for global HIV vaccine trials.

    Directory of Open Access Journals (Sweden)

    Anke Schultz

    Full Text Available BACKGROUND: Infections with HIV still represent a major human health problem worldwide and a vaccine is the only long-term option to fight efficiently against this virus. Standardized assessments of HIV-specific immune responses in vaccine trials are essential for prioritizing vaccine candidates in preclinical and clinical stages of development. With respect to neutralizing antibodies, assays with HIV-1 Env-pseudotyped viruses are a high priority. To cover the increasing demands of HIV pseudoviruses, a complete cell culture and transfection automation system has been developed. METHODOLOGY/PRINCIPAL FINDINGS: The automation system for HIV pseudovirus production comprises a modified Tecan-based Cellerity system. It covers an area of 5×3 meters and includes a robot platform, a cell counting machine, a CO(2) incubator for cell cultivation and a media refrigerator. The processes for cell handling, transfection and pseudovirus production have been implemented according to manual standard operating procedures and are controlled and scheduled autonomously by the system. The system is housed in a biosafety level II cabinet that guarantees protection of personnel, environment and the product. HIV pseudovirus stocks at a scale from 140 ml to 1000 ml have been produced on the automated system. Parallel manual production of HIV pseudoviruses and comparisons (bridging assays) confirmed that the automatically produced pseudoviruses were of equivalent quality to those produced manually. In addition, the automated method was fully validated according to Good Clinical Laboratory Practice (GCLP) guidelines, including the validation parameters accuracy, precision, robustness and specificity. CONCLUSIONS: An automated HIV pseudovirus production system has been successfully established. It allows the high quality production of HIV pseudoviruses under GCLP conditions. In its present form, the installed module enables the production of 1000 ml of virus-containing cell

  10. Automated growth of metal-organic framework coatings on flow-through functional supports.

    Science.gov (United States)

    Maya, F; Palomino Cabello, C; Clavijo, S; Estela, J M; Cerdà, V; Turnes Palomino, G

    2015-05-11

    A fully automated method for the controlled growth of metal-organic framework coatings on flow-through functional supports is reported. The obtained hybrid flow-through supports show high performance for the automated extraction of water pollutants.

  11. Automating Finance

    Science.gov (United States)

    Moore, John

    2007-01-01

    In past years, higher education's financial management side has been riddled with manual processes and aging mainframe applications. This article discusses schools which had taken advantage of an array of technologies that automate billing, payment processing, and refund processing in the case of overpayment. The investments are well worth it:…

  12. Demand Forecasting Errors

    OpenAIRE

    Mackie, Peter; Nellthorp, John; Laird, James

    2005-01-01

    Demand forecasts form a key input to the economic appraisal. As such any errors present within the demand forecasts will undermine the reliability of the economic appraisal. The minimization of demand forecasting errors is therefore important in the delivery of a robust appraisal. This issue is addressed in this note by introducing the key issues, and error types present within demand fore...

  13. Heating automation

    OpenAIRE

    Tomažič, Tomaž

    2013-01-01

    This degree paper presents the use and operation of peripheral devices with a microcontroller for heating automation. The main goal is to build a quality control system for heating three floors of a house and thereby increase the efficiency of the heating devices and lower heating expenses. The system needs to control a heat pump, a furnace, a boiler pump, two floor-heating pumps and two radiator pumps. For the work, we chose an stm32f4-discovery development kit with five temperature sensors, an LCD disp...

  14. Automation Security

    OpenAIRE

    Mirzoev, Dr. Timur

    2014-01-01

    Web-based Automated Process Control systems are a new type of application that uses the Internet to control industrial processes with access to real-time data. Supervisory control and data acquisition (SCADA) networks contain computers and applications that perform key functions in providing essential services and commodities (e.g., electricity, natural gas, gasoline, water, waste treatment, transportation) to all Americans. As such, they are part of the nation's critical infrastructu...

  15. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible....

  16. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

    "DIAGNOSTICS BASED ON THE PRINCIPLES OF ARTIFICIAL INTELLIGENCE", 1984 International Test Conference, 01Oct84... GLOSSARY OF ACRONYMS: AFSATCOM, Air Force Satellite Communication; AI, Artificial Intelligence; ASIC, Application Specific... Built-In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be...

  17. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system... is a minor contributor to the carbon footprint at cell or line level; from the perspective of a robot producer, reducing the electricity consumption during the robot's use stage can be a considerable improvement in the carbon footprint of a robot, and thus in the sustainability profile of the robot.

  18. Automated Budget System

    Data.gov (United States)

    Department of Transportation — The Automated Budget System (ABS) automates management and planning of the Mike Monroney Aeronautical Center (MMAC) budget by providing enhanced capability to plan,...

  19. Automated Critical PeakPricing Field Tests: 2006 Pilot ProgramDescription and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila

    2007-06-19

    During 2006 Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology evaluation for the Pacific Gas and Electric Company (PG&E) Emerging Technologies Programs. This report summarizes the design, deployment, and results from the 2006 Automated Critical Peak Pricing Program (Auto-CPP). The program was designed to evaluate the feasibility of deploying automation systems that allow customers to participate in critical peak pricing (CPP) with a fully-automated response. The 2006 program was in operation during the entire six-month CPP period from May through October. The methodology for this field study included site recruitment, control strategy development, automation system deployment, and evaluation of sites' participation in actual CPP events through the summer of 2006. LBNL recruited sites in PG&E's territory in northern California through contacts from PG&E account managers, conferences, and industry meetings. Each site contact signed a memorandum of understanding with LBNL that outlined the activities needed to participate in the Auto-CPP program. Each facility worked with LBNL to select and implement control strategies for demand response and developed automation system designs based on existing Internet connectivity and building control systems. Once the automation systems were installed, LBNL conducted communications tests to ensure that the Demand Response Automation Server (DRAS) correctly provided and logged the continuous communications of the CPP signals with the energy management and control system (EMCS) for each site. LBNL also observed and evaluated Demand Response (DR) shed strategies to ensure proper commissioning of controls. The communication system allowed sites to receive day-ahead as well as day-of signals for pre-cooling, a DR strategy used at a few sites. Measurement of demand response was conducted using two different baseline models for estimating peak load savings. One
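
    The peak-load savings mentioned above come down to comparing an event day against a baseline built from similar non-event days; the Python sketch below shows one common baseline of that family (same-hour average over recent non-event weekdays) purely as an illustration. The two specific baseline models used in the report are not reproduced here, and all numbers are hypothetical.

        def average_weekday_baseline(prior_weekday_loads_kw):
            """Hourly baseline: average the same hour over recent non-event weekdays."""
            n_days = len(prior_weekday_loads_kw)
            hours = len(prior_weekday_loads_kw[0])
            return [sum(day[h] for day in prior_weekday_loads_kw) / n_days for h in range(hours)]

        def demand_savings(baseline_kw, event_day_kw):
            """Hourly demand saving = baseline load minus observed load during the event."""
            return [b - a for b, a in zip(baseline_kw, event_day_kw)]

        prior = [[480, 500, 520], [470, 510, 530], [490, 505, 525]]  # kW over three event hours
        event = [430, 450, 470]                                      # kW during the CPP event
        print(demand_savings(average_weekday_baseline(prior), event))  # -> [50.0, 55.0, 55.0]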

  20. Demand Response Resource Quantification with Detailed Building Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Elaine; Horsey, Henry; Merket, Noel; Stoll, Brady; Nag, Ambarish

    2017-04-03

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  1. Automation 2017

    CERN Document Server

    Zieliński, Cezary; Kaliczyńska, Małgorzata

    2017-01-01

    This book consists of papers presented at Automation 2017, an international conference held in Warsaw from March 15 to 17, 2017. It discusses research findings associated with the concepts behind INDUSTRY 4.0, with a focus on offering a better understanding of and promoting participation in the Fourth Industrial Revolution. Each chapter presents a detailed analysis of a specific technical problem, in most cases followed by a numerical analysis, simulation and description of the results of implementing the solution in a real-world context. The theoretical results, practical solutions and guidelines presented are valuable for both researchers working in the area of engineering sciences and practitioners looking for solutions to industrial problems. .

  2. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein

    2013-10-01

    Full Text Available A two-component fully automated flood monitoring system is described and evaluated. This is a result of combining two individual flood services that are currently under development at DLR’s (German Aerospace Center) Center for Satellite-based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations in different spatial resolutions and in the time-critical and on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS Flood Service (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web-client. The system is operationally demonstrated and evaluated via the monitoring of two recent flood events in Russia 2013 and Albania/Montenegro 2013.
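
    The "threshold set" linking the two components can be pictured as a simple rule on the MODIS-derived flood extent; the Python sketch below is a schematic of such a trigger, with the excess and ratio criteria and their values chosen purely for illustration rather than taken from the operational DLR/ZKI system.

        def should_task_sar(flooded_km2, reference_water_km2,
                            min_excess_km2=50.0, min_ratio=2.0):
            """Request a TerraSAR-X acquisition when the MODIS-detected water extent
            exceeds the normal (reference) water extent by an absolute margin and by
            a relative factor; both thresholds here are illustrative assumptions."""
            excess = flooded_km2 - reference_water_km2
            return excess > min_excess_km2 and flooded_km2 > min_ratio * reference_water_km2

        print(should_task_sar(flooded_km2=180.0, reference_water_km2=40.0))  # -> True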

  3. Fully 3D GPU PET reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L., E-mail: joaquin@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Cal-Gonzalez, J. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Vaquero, J.J. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Desco, M. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics Processing Units (GPUs) have been proposed for many years as potential accelerators of complex scientific problems, but it was not until the recent advances in the programmability of GPUs that the best available reconstruction codes started to be implemented to run on GPUs. This work presents a GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining in both cases the same images. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.
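
    To show the structure of the forward/back-projection loop that such codes parallelize, here is a plain-NumPy sketch of the standard MLEM update on a toy system matrix; it is only an illustration of the general algorithm family, not the paper's GPU implementation or its Monte-Carlo-derived scanner model.

        import numpy as np

        def mlem(system_matrix, sinogram, n_iter=20, eps=1e-12):
            """Maximum-likelihood EM: x <- x / (A^T 1) * A^T( y / (A x) )."""
            A = system_matrix                       # shape: (n_bins, n_voxels)
            x = np.ones(A.shape[1])                 # uniform initial image
            sensitivity = A.T @ np.ones(A.shape[0])
            for _ in range(n_iter):
                forward = A @ x                                    # forward projection
                ratio = sinogram / np.maximum(forward, eps)
                x *= (A.T @ ratio) / np.maximum(sensitivity, eps)  # back projection + update
            return x

        # Tiny synthetic problem: 2 voxels seen by 3 detector bins
        A = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
        true_image = np.array([3.0, 1.0])
        print(np.round(mlem(A, A @ true_image), 2))  # converges toward [3. 1.]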

  4. Chordal Graphs are Fully Orientable

    CERN Document Server

    Lai, Hsin-Hao

    2012-01-01

    Suppose that D is an acyclic orientation of a graph G. An arc of D is called dependent if its reversal creates a directed cycle. Let m and M denote the minimum and the maximum of the number of dependent arcs over all acyclic orientations of G. We call G fully orientable if G has an acyclic orientation with exactly d dependent arcs for every d satisfying m <= d <= M. A graph G is called chordal if every cycle in G of length at least four has a chord. We show that all chordal graphs are fully orientable.

  5. Properties of wideband resonant reflectors under fully conical light incidence

    Science.gov (United States)

    Ko, Yeong Hwan; Niraula, Manoj; Lee, Kyu Jin; Magnusson, Robert

    2016-03-01

    Applying numerical modeling coupled with experiments, we investigate the properties of wideband resonant reflectors under fully conical light incidence. We show that the wave vectors pertinent to resonant first-order diffraction under fully conical mounting vary less with incident angle than those associated with reflectors in classical mounting. Therefore, as the evanescent diffracted waves drive the leaky modes responsible for the resonance effects, fully-conical mounting imbues reflectors with larger angular tolerance than their classical counterparts. We quantify the angular-spectral performance of representative resonant wideband reflectors in conic and classic mounts by numerical calculations with improved spectra found for fully conic incidence. Moreover, these predictions are verified experimentally for wideband reflectors fashioned in crystalline and amorphous silicon in distinct spectral regions spanning the 1200-1600-nm and 1600-2400-nm spectral bands. These results will be useful in various applications demanding wideband reflectors that are efficient and materially sparse.
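
    The weaker angular dependence claimed above can be made explicit with the grating picture of the first diffracted orders (added here as context, assuming a one-dimensional grating of period Λ, grating vector magnitude K = 2π/Λ, free-space wave number k0, and incidence from air at angle θ):

        \[
          \text{classical mounting:}\qquad
          \lvert \mathbf{k}_{\parallel,\pm 1} \rvert \;=\; \lvert k_0 \sin\theta \pm K \rvert ,
        \]
        \[
          \text{fully conical mounting:}\qquad
          \lvert \mathbf{k}_{\parallel,\pm 1} \rvert \;=\; \sqrt{K^{2} + k_0^{2}\sin^{2}\theta}
          \;\approx\; K + \frac{k_0^{2}\sin^{2}\theta}{2K}.
        \]

    The classical expression varies linearly with θ while the conical one varies only quadratically near normal incidence, which is consistent with the larger angular tolerance reported above.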

  6. Automated tetraploid genotype calling by hierarchical clustering

    Science.gov (United States)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...
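
    A toy Python version of a clustering-based dosage call is sketched below: per-sample allele-intensity ratios are grouped hierarchically into at most five clusters, and the clusters are mapped to dosages 0-4 by their mean ratio. This is only a schematic of the general approach; the actual pipeline adds model checks, training data and quality filtering not shown here, and all variable names and numbers are illustrative.

        import numpy as np
        from scipy.cluster.hierarchy import fcluster, linkage

        def call_dosage(theta, max_clusters=5):
            """Cluster allele-signal ratios theta = Y/(X+Y) into up to five groups
            and order the clusters by mean ratio so they map to dosages 0-4."""
            Z = linkage(theta.reshape(-1, 1), method="ward")
            labels = fcluster(Z, t=max_clusters, criterion="maxclust")
            cluster_ids = np.unique(labels)
            order = np.argsort([theta[labels == c].mean() for c in cluster_ids])
            dosage_of = {c: d for d, c in enumerate(cluster_ids[order])}
            return np.array([dosage_of[c] for c in labels])

        # Synthetic marker with five well-separated dosage clouds of 30 samples each
        theta = np.concatenate([np.random.normal(m, 0.02, 30) for m in (0.05, 0.3, 0.5, 0.7, 0.95)])
        print(np.bincount(call_dosage(theta)))  # roughly [30 30 30 30 30]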

  7. Library Automation: A Survey of Leading Academic and Public Libraries in the United States.

    Science.gov (United States)

    Mann, Thomas W., Jr.; And Others

    Results of this survey of 26 public and academic libraries of national stature show that the country's major libraries are fully committed to automating their library operations. Major findings of the survey show that: (1) all libraries surveyed are involved in automation; (2) all libraries surveyed have automated their catalogs and bibliographic…

  8. The Systems Development Life Cycle as a Planning Methodology for Library Automation.

    Science.gov (United States)

    Cheatham, David

    1985-01-01

    Discussion of the systems development life cycle (SDLC) that supports operational and managerial planning of automation projects covers challenges of library automation, evolution and scope of SDLC, lack of dissemination of SDLC literature within library and information science community, and corrective measures to meet library automation demands.…

  9. Innovation and Demand

    DEFF Research Database (Denmark)

    Andersen, Esben Sloth

    2007-01-01

    the demand-side of markets in the simplest possible way. This strategy has allowed a gradual increase in the sophistication of supply-side aspects of economic evolution, but the one-sided focus on supply is facing diminishing returns. Therefore, demand-side aspects of economic evolution have in recent years...... received increased attention. The present paper argues that the new emphasis on demand-side factors is quite crucial for a deepened understanding of economic evolution. The major reasons are the following: First, demand represents the core force of selection that gives direction to the evolutionary process....... Second, firms' innovative activities relate, directly or indirectly, to the structure of expected and actual demand. Third, the demand side represents the most obvious way of turning to the much-needed analysis of macro-evolutionary change of the economic system....

  10. Automated Demand Response for Energy Sustainability Cost and Performance Report

    Science.gov (United States)

    2015-07-23

    limitations are foreseen in the use of OpenADR, and no potential cost disadvantages (such as increased first cost, installation cost, and/or...commercial firms. The temperature data was measured at the KBYS Fort Irwin / Barstow station, which is located on the Bicycle Lake Army Airfield about

  11. Demand Response and Energy Storage Integration Study

    Energy Technology Data Exchange (ETDEWEB)

    Ookie Ma, Kerry Cheung

    2016-03-01

    Demand response and energy storage resources present potentially important sources of bulk power system services that can aid in integrating variable renewable generation. While renewable integration studies have evaluated many of the challenges associated with deploying large amounts of variable wind and solar generation technologies, integration analyses have not yet fully incorporated demand response and energy storage resources. This report represents an initial effort in analyzing the potential integration value of demand response and energy storage, focusing on the western United States. It evaluates two major aspects of increased deployment of demand response and energy storage: (1) Their operational value in providing bulk power system services and (2) Market and regulatory issues, including potential barriers to deployment.

  12. PERFECT DEMAND ILLUSION

    Directory of Open Access Journals (Sweden)

    Alexander Yu. Sulimov

    2015-01-01

    Full Text Available The article is devoted to technique «Perfect demand illusion», which allows to strengthen the competitive advantageof retailers. Also in the paper spells out the golden rules of visual merchandising.The definition of the method «Demand illusion», formulated the conditions of its functioning, and is determined by the mainhypothesis of the existence of this method.Furthermore, given the definition of the «Perfect demand illusion», and describes its additional conditions. Also spells out the advantages of the «Perfect demandillusion», before the «Demand illusion».

  13. Divers of Passenger Demand

    OpenAIRE

    Wittmer, Andreas

    2011-01-01

    -Overview drivers of passenger demand -Driver 1: Economic growth in developing countries -Driver 2: International business travel in developed countries -Driver 3: International leisure travel in developed countries

  14. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    Measurements of corrosion rates and other parameters connected with corrosion processes are important, first as indicators of the corrosion resistance of metallic materials and second because such measurements are based on general and fundamental physical, chemical, and electrochemical relations....... Hence improvements and innovations in methods applied in corrosion research are likeliy to benefit basic disciplines as well. A method for corrosion measurements can only provide reliable data if the beckground of the method is fully understood. Failure of a method to give correct data indicates a need...... to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation...

  15. Automated methods of corrosion measurement

    DEFF Research Database (Denmark)

    Andersen, Jens Enevold Thaulov; Bech-Nielsen, Gregers; Reeve, John Ch

    1997-01-01

    Measurements of corrosion rates and other parameters connected with corrosion processes are important, first as indicators of the corrosion resistance of metallic materials and second because such measurements are based on general and fundamental physical, chemical, and electrochemical relations....... Hence improvements and innovations in methods applied in corrosion research are likeliy to benefit basic disciplines as well. A method for corrosion measurements can only provide reliable data if the beckground of the method is fully understood. Failure of a method to give correct data indicates a need...... to revise assumptions regarding the basis of the method, which sometimes leads to the discovery of as-yet unnoticed phenomena. The present selection of automated methods for corrosion measurements is not motivated simply by the fact that a certain measurement can be performed automatically. Automation...

  16. 3D-Printed Microfluidic Automation

    Science.gov (United States)

    Au, Anthony K.; Bhattacharjee, Nirveek; Horowitz, Lisa F.; Chang, Tim C.; Folch, Albert

    2015-01-01

    Microfluidic automation – the automated routing, dispensing, mixing, and/or separation of fluids through microchannels – generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology’s use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer. PMID:25738695

  17. 3D-printed microfluidic automation.

    Science.gov (United States)

    Au, Anthony K; Bhattacharjee, Nirveek; Horowitz, Lisa F; Chang, Tim C; Folch, Albert

    2015-04-21

    Microfluidic automation - the automated routing, dispensing, mixing, and/or separation of fluids through microchannels - generally remains a slowly-spreading technology because device fabrication requires sophisticated facilities and the technology's use demands expert operators. Integrating microfluidic automation in devices has involved specialized multi-layering and bonding approaches. Stereolithography is an assembly-free, 3D-printing technique that is emerging as an efficient alternative for rapid prototyping of biomedical devices. Here we describe fluidic valves and pumps that can be stereolithographically printed in optically-clear, biocompatible plastic and integrated within microfluidic devices at low cost. User-friendly fluid automation devices can be printed and used by non-engineers as replacement for costly robotic pipettors or tedious manual pipetting. Engineers can manipulate the designs as digital modules into new devices of expanded functionality. Printing these devices only requires the digital file and electronic access to a printer.

  18. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial in improvement of a robot workcell. Design automation of multi-function fingers is highly demanded by robot industries to overcome the current iterative, time consuming and complex manual design process. However, the existing approaches for the multi-function finger design automation are unable to entirely meet the robot industries’ need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers are successfully designed for two case studies. Further, the results are discussed and benchmarked with existing approaches.

  19. Digitisation-on-Demand in Academic Research Libraries

    OpenAIRE

    Chamberlain, Edmund M.

    2011-01-01

    The investigation finds that digitisation-on-demand and print on-demand services have the potential to provide greater value access to libraries’ collections and could help a library to realise its true potential as a ‘long tail’. There are at present a number of practical and financial limitations that prevent this from being fully realised. Whilst the concept remains a viable one and demand is noted, copyright legislation restricts the material available for full digitisation to a nich...

  20. Paramedic Physical Demands Analysis

    Science.gov (United States)

    2014-07-01

    medical bags, cardiac monitor, stretcher, stair chair, etc.) were not standardized across services. As a result the total amount of equipment weight ...report describes the pushing/pulling, walking, and stair climbing demands as observed during the observation periods. Walking demands varied between the...standard deviation about the mean. .................................................................. 25 Figure 7 - The maximum weight (heaviest patient

  1. Wood supply and demand

    Science.gov (United States)

    Peter J. Ince; David B. McKeever

    2011-01-01

    At times in history, there have been concerns that demand for wood (timber) would be greater than the ability to supply it, but that concern has recently dissipated. The wood supply and demand situation has changed because of market transitions, economic downturns, and continued forest growth. This article provides a concise overview of this change as it relates to the...

  2. Quantum dots assisted photocatalysis for the chemiluminometric determination of chemical oxygen demand using a single interface flow system

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Cristina I.C.; Frigerio, Christian [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal); Lima, Jose L.F.C. [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal)

    2011-08-12

    Highlights: {yields} A novel flow method for the determination of chemical oxygen demand is proposed. {yields} CdTe nanocrystals are irradiated with UV light to generate strong oxidizing species. {yields} Reactive species promote a fast catalytic degradation of organic matter. {yields} Luminol is used as a chemiluminescence probe for indirect COD assessment. {yields} A single interface flow system was implemented to automate the assays. - Abstract: A novel flow method for the determination of chemical oxygen demand (COD) is proposed in this work. It relies on the combination of a fully automated single interface flow system, an on-line UV photocatalytic unit and quantum dot (QD) nanotechnology. The developed approach takes advantage of CdTe nanocrystals capacity to generate strong oxidizing species upon irradiation with UV light, which fostered a fast catalytic degradation of the organic compounds. Luminol was used as a chemiluminescence (CL) probe for indirect COD assessment, since it is easily oxidized by the QD generated species yielding a strong CL emission that is quenched in the presence of the organic matter. The proposed methodology allowed the determination of COD concentrations between 1 and 35 mg L{sup -1}, with good precision (R.S.D. < 1.1%, n = 3) and a sampling frequency of about 33 h{sup -1}. The procedure was applied to the determination of COD in wastewater certified reference materials and the obtained results showed an excellent agreement with the certified values.

  3. Causality in demand

    DEFF Research Database (Denmark)

    Nielsen, Max; Jensen, Frank; Setälä, Jari;

    2011-01-01

    This article focuses on causality in demand. A methodology where causality is imposed and tested within an empirical co-integrated demand model, not prespecified, is suggested. The methodology allows different causality of different products within the same demand system. The methodology is applied...... to fish demand. On the German market for farmed trout and substitutes, it is found that supply sources, i.e. aquaculture and fishery, are not the only determinant of causality. Storing, tightness of management and aggregation level of integrated markets might also be important. The methodological...... implication is that more explicit focus on causality in demand analyses provides improved information. The results suggest that frozen trout forms part of a large European whitefish market, where prices of fresh trout are formed on a relatively separate market. Redfish is a substitute on both markets...

  4. Smart Home Automation with Linux

    CERN Document Server

    Goodwin, Steven

    2010-01-01

    Linux users can now control their homes remotely! Are you a Linux user who has ever wanted to turn on the lights in your house, or open and close the curtains, while away on holiday? Want to be able to play the same music in every room, controlled from your laptop or mobile phone? Do you want to do these things without an expensive off-the-shelf kit? In Beginning Linux Home Automation, Steven Goodwin will show you how a house can be fully controlled by its occupants, all using open source software. From appliances to kettles to curtains, control your home remotely! What you'll learn* Control a

  5. On-Demand Mobility (ODM) Technical Pathway: Enabling Ease of Use and Safety

    Science.gov (United States)

    Goodrich, Ken; Moore, Mark

    2015-01-01

    On-demand mobility (ODM) through aviation refers to the ability to quickly and easily move people or equivalent cargo without delays introduced by lack of, or infrequently, scheduled service. A necessary attribute of ODM is that it be easy to use, requiring a minimum of special training, skills, or workload. Fully-autonomous vehicles would provide the ultimate in ease-of-use (EU) but are currently unproven for safety-critical applications outside of a few, situationally constrained applications (e.g. automated trains operating in segregated systems). Applied to aviation, the current and near-future state of the art of full-autonomy, may entail undesirable trade-offs such as very conservative operational margins resulting in reduced trip reliability and transportation utility. Furthermore, acceptance by potential users and regulatory authorities will be challenging without confidence in autonomous systems in developed in less critical, but still challenging applications. A question for the aviation community is how we can best develop practical ease-of-use for aircraft that are sized to carry a small number of passengers (e.g. 1-9) or equivalent cargo. Such development is unlikely to be a single event, but rather a managed, evolutionary process where responsibility and authority transitions from human to automation agents as operational experience is gained with increasingly intelligent systems. This talk presents a technology road map being developed at NASA Langley, as part of an overall strategy to foster ODM, for the development of ease-of-use for ODM aviation.

  6. Fully differential NLO predictions for the rare muon decay

    Science.gov (United States)

    Pruna, G. M.; Signer, A.; Ulrich, Y.

    2017-02-01

    Using the automation program GoSam, fully differential NLO corrections were obtained for the rare decay of the muon μ → eν ν bar ee. This process is an important Standard Model background to searches of the Mu3e Collaboration for lepton-flavor violation, as it becomes indistinguishable from the signal μ → 3 e if the neutrinos carry little energy. With our NLO program we are able to compute the branching ratio as well as custom-tailored observables for the experiment. With minor modifications, related decays of the tau can also be computed.

  7. Fully differential NLO predictions for the rare muon decay

    Directory of Open Access Journals (Sweden)

    G.M. Pruna

    2017-02-01

    Full Text Available Using the automation program GoSam, fully differential NLO corrections were obtained for the rare decay of the muon μ→eνν¯ee. This process is an important Standard Model background to searches of the Mu3e Collaboration for lepton-flavor violation, as it becomes indistinguishable from the signal μ→3e if the neutrinos carry little energy. With our NLO program we are able to compute the branching ratio as well as custom-tailored observables for the experiment. With minor modifications, related decays of the tau can also be computed.

  8. A fully robust PARAFAC method for analyzing fluorescence data

    DEFF Research Database (Denmark)

    Engelen, Sanne; Frosch, Stina; Jørgensen, Bo

    2009-01-01

    Parallel factor analysis (PARAFAC) is a widespread method for modeling fluorescence data by means of an alternating least squares procedure. Consequently, the PARAFAC estimates are highly influenced by outlying excitation–emission landscapes (EEM) and element-wise outliers, like for example Raman......, there still exists no robust method for handling fluorescence data encountering both outlying EEM landscapes and scatter. In this paper, we present an iterative algorithm where the robust PARAFAC method and the scatter identification tool are alternately performed. A fully automated robust PARAFAC method...

  9. 45 CFR 30.11 - Demand for payment.

    Science.gov (United States)

    2010-10-01

    ... this part, including immediate referral to Justice for litigation. (b) Demand letters. The specific..., including income tax refunds, salary, certain benefit payments such as Social Security, retirement, and...) Reporting the debt to a credit bureau or other automated database; (E) Referring the debt to Justice...

  10. Domestic Demand Will Work

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    China can invigorate its economy by expanding domestic demand and boosting consumption chinese bankers are preparing to set up finance companies that provide consumer loans in major cities like Beijing and Shanghai.

  11. Intelligent energy demand forecasting

    CERN Document Server

    Hong, Wei-Chiang

    2013-01-01

    This book offers approaches and methods to calculate optimal electric energy allocation, using evolutionary algorithms and intelligent analytical tools to improve the accuracy of demand forecasting. Focuses on improving the drawbacks of existing algorithms.

  12. Household fuel demand analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cohn, S.; Hirst, E.; Jackson, J.

    1976-01-01

    This study develops econometric models of residential demands for electricity, natural gas, and petroleum products. Fuel demands per household are estimated as functions of fuel prices, per capita income, heating degree days, and mean July temperature. Cross-sectional models are developed using a large data base containing observations for each state and year from 1951 through 1974. Long-run own-price elasticities for all three fuels are greater than unity with natural gas showing the greatest sensitivity to own-price changes. Cross-price elasticities are all less than unity except for the elasticity of demand for oil with respect to the price of gas (which is even larger than the own-price elasticity of demand for oil). The models show considerable stabiity with respect to own-price elasticities but much instability with respect to the cross-price and income elasticities.

  13. Impact of Energy Demands

    Science.gov (United States)

    Cambel, Ali B.

    1970-01-01

    The types of pollutants associated with the process of power production are identified. A nine-point proposal is presented on the ways the increase in power demands might be achieved with the minimum threat to the environment. (PR)

  14. DeMand: A tool for evaluating and comparing device-level demand and supply forecast models

    DEFF Research Database (Denmark)

    Neupane, Bijay; Siksnys, Laurynas; Pedersen, Torben Bach

    2016-01-01

    datasets, forecast models, features, and errors measures, thus semi-automating most of the steps of the forecast model selection and validation process. This paper presents the architecture and data model of the DeMand system; and provides a use-case example on how one particular forecast model...

  15. Physics of Fully Depleted CCDs

    CERN Document Server

    Holland, S E; Kolbe, W F; Lee, J S

    2014-01-01

    In this work we present simple, physics-based models for two effects that have been noted in the fully depleted CCDs that are presently used in the Dark Energy Survey Camera. The first effect is the observation that the point-spread function increases slightly with the signal level. This is explained by considering the effect on charge-carrier diffusion due to the reduction in the magnitude of the channel potential as collected signal charge acts to partially neutralize the fixed charge in the depleted channel. The resulting reduced voltage drop across the carrier drift region decreases the vertical electric field and increases the carrier transit time. The second effect is the observation of low-level, concentric ring patterns seen in uniformly illuminated images. This effect is shown to be most likely due to lateral deflection of charge during the transit of the photogenerated carriers to the potential wells as a result of lateral electric fields. The lateral fields are a result of space charge in the fully...

  16. Automation of the proximate analysis of coals

    Energy Technology Data Exchange (ETDEWEB)

    1985-01-01

    A study is reported of the feasibility of using a multi-jointed general-purpose robot for the automated analysis of moisture, volatile matter, ash and total post-combustion sulfur in coal and coke. The results obtained with an automated system are compared with those of conventional manual methods. The design of the robot hand and the safety measures provided are now both fully satisfactory, and the analytic values obtained exhibit little scatter. It is concluded that the use of this robot system results in a better working environment and in considerable labour saving. Applications to other tasks are under development.

  17. Longwall automation - an ACARP Landmark Project

    Energy Technology Data Exchange (ETDEWEB)

    Kelly, M.; Hainsworth, D.; Lever, P.; Gurgenci, H. [CSIRO Exploration and Mining, Kenmore, Qld. (Australia)

    2002-07-01

    A Landmark Longwall Automation project was commenced in July 2001. The major outcome of automation using on-face observation has been divided into ten outcome areas that have been fully scoped for a three-year initial project life. A major facilitating technology has been the implementation of inertial navigation system (INS) technology that can map the shearer position in 3D. A focus of the project is to deliver a system that is at least as productive as the current most productive manually controlled longwall face. 4 refs., 6 figs.

  18. Autonomy and Automation

    Science.gov (United States)

    Shively, Jay

    2017-01-01

    A significant level of debate and confusion has surrounded the meaning of the terms autonomy and automation. Automation is a multi-dimensional concept, and we propose that Remotely Piloted Aircraft Systems (RPAS) automation should be described with reference to the specific system and task that has been automated, the context in which the automation functions, and other relevant dimensions. In this paper, we present definitions of automation, pilot in the loop, pilot on the loop and pilot out of the loop. We further propose that in future, the International Civil Aviation Organization (ICAO) RPAS Panel avoids the use of the terms autonomy and autonomous when referring to automated systems on board RPA. Work Group 7 proposes to develop, in consultation with other workgroups, a taxonomy of Levels of Automation for RPAS.

  19. An automated swimming respirometer

    DEFF Research Database (Denmark)

    STEFFENSEN, JF; JOHANSEN, K; BUSHNELL, PG

    1984-01-01

    An automated respirometer is described that can be used for computerized respirometry of trout and sharks.......An automated respirometer is described that can be used for computerized respirometry of trout and sharks....

  20. Configuration Management Automation (CMA) -

    Data.gov (United States)

    Department of Transportation — Configuration Management Automation (CMA) will provide an automated, integrated enterprise solution to support CM of FAA NAS and Non-NAS assets and investments. CMA...

  1. Automated identification of insect vectors of Chagas disease in Brazil and Mexico: the Virtual Vector Lab

    Directory of Open Access Journals (Sweden)

    Rodrigo Gurgel-Gonçalves

    2017-04-01

    Full Text Available Identification of arthropods important in disease transmission is a crucial, yet difficult, task that can demand considerable training and experience. An important case in point is that of the 150+ species of Triatominae, vectors of Trypanosoma cruzi, causative agent of Chagas disease across the Americas. We present a fully automated system that is able to identify triatomine bugs from Mexico and Brazil with an accuracy consistently above 80%, and with considerable potential for further improvement. The system processes digital photographs from a photo apparatus into landmarks, and uses ratios of measurements among those landmarks, as well as (in a preliminary exploration two measurements that approximate aspects of coloration, as the basis for classification. This project has thus produced a working prototype that achieves reasonably robust correct identification rates, although many more developments can and will be added, and—more broadly—the project illustrates the value of multidisciplinary collaborations in resolving difficult and complex challenges.

  2. The Search for Fractional Charge Particles in an Advanced, Automated Variation of the Millikan Experiment

    Science.gov (United States)

    Lee, I. T.; Halyo, V.; Lee, E. R.; Loomba, D.; Perl, M. L.

    2001-04-01

    We will present a variation on the Millikan apparatus designed to look for fractionally charged particles in bulk materials, and results from the current run. Oil drops are produced from a drop-on-demand ejector, and imaged by a digital CCD camera and framegrabber combination. A networked Linux cluster is used to simultaneously collect and analyze data, and to monitor and control the apparatus. The experiment is fully automated, and utilizes laminar air flow to make possible the accurate measurements of charge on large (20 micron) fluid drops. The experiment has the capability to process a total of 10^7 to 10^8 drops (20-200 mg), and the ability to use large drops enables the search to be carried out in mineral suspensions.

  3. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    Usage of automatic systems in airliners has increased fuel efficiency, added extra capabilities, enhanced safety and reliability, as well as provide improved passenger comfort since its introduction in the late 80's. However, original automation benefits, including reduced flight crew workload, human errors or training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed, and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprised of high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand and training requirements along with their interactions. Besides flight crew deficiencies, automation system

  4. Fully inkjet-printed microwave passive electronics

    KAUST Repository

    Mckerricher, Garret

    2017-01-30

    Fully inkjet-printed three-dimensional (3D) objects with integrated metal provide exciting possibilities for on-demand fabrication of radio frequency electronics such as inductors, capacitors, and filters. To date, there have been several reports of printed radio frequency components metallized via the use of plating solutions, sputtering, and low-conductivity pastes. These metallization techniques require rather complex fabrication, and do not provide an easily integrated or versatile process. This work utilizes a novel silver ink cured with a low-cost infrared lamp at only 80 °C, and achieves a high conductivity of 1×107 S m−1. By inkjet printing the infrared-cured silver together with a commercial 3D inkjet ultraviolet-cured acrylic dielectric, a multilayer process is demonstrated. By using a smoothing technique, both the conductive ink and dielectric provide surface roughness values of <500 nm. A radio frequency inductor and capacitor exhibit state-of-the-art quality factors of 8 and 20, respectively, and match well with electromagnetic simulations. These components are implemented in a lumped element radio frequency filter with an impressive insertion loss of 0.8 dB at 1 GHz, proving the utility of the process for sensitive radio frequency applications.

  5. Workflow automation architecture standard

    Energy Technology Data Exchange (ETDEWEB)

    Moshofsky, R.P.; Rohen, W.T. [Boeing Computer Services Co., Richland, WA (United States)

    1994-11-14

    This document presents an architectural standard for application of workflow automation technology. The standard includes a functional architecture, process for developing an automated workflow system for a work group, functional and collateral specifications for workflow automation, and results of a proof of concept prototype.

  6. DISRUPTION MANAGEMENT FOR SUPPLY CHAIN COORDINATION WITH EXPONENTIAL DEMAND FUNCTION

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The coordination problem of a supply chain comprising one supplier and one retailer under market demand disruption is studied in this article. A novel exponential demand function is adopted, and the penalty cost is introduced explicitly to capture the deviation production cost caused by the market demand disruption. The optimal strategies are obtained for different disruption scale under the centralized mode. For the decentralized mode, it is proved that the supply chain can be fully coordinated by adjusting the price discount policy appropriately when disruption occurs. Furthermore, the authors point out that similar results can be established for more general demand functions that represent different market circumstances if certain assumptions are satisfied.

  7. Singularities in fully developed turbulence

    Energy Technology Data Exchange (ETDEWEB)

    Shivamoggi, Bhimsen K., E-mail: bhimsen.shivamoggi@ucf.edu

    2015-09-18

    Phenomenological arguments are used to explore finite-time singularity (FTS) development in different physical fully-developed turbulence (FDT) situations. Effects of spatial intermittency and fluid compressibility in three-dimensional (3D) FDT and the role of the divorticity amplification mechanism in two-dimensional (2D) FDT and quasi-geostrophic FDT and the advection–diffusion mechanism in magnetohydrodynamic turbulence are considered to provide physical insights into the FTS development in variant cascade physics situations. The quasi-geostrophic FDT results connect with the 2D FDT results in the barotropic limit while they connect with 3D FDT results in the baroclinic limit and hence apparently provide a bridge between 2D and 3D. - Highlights: • Finite-time singularity development in turbulence situations is phenomenologically explored. • Spatial intermittency and compressibility effects are investigated. • Quasi-geostrophic turbulence is shown to provide a bridge between two-dimensional and three-dimensional cases.

  8. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  9. Travel Demand Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Southworth, Frank [ORNL; Garrow, Dr. Laurie [Georgia Institute of Technology

    2011-01-01

    This chapter describes the principal types of both passenger and freight demand models in use today, providing a brief history of model development supported by references to a number of popular texts on the subject, and directing the reader to papers covering some of the more recent technical developments in the area. Over the past half century a variety of methods have been used to estimate and forecast travel demands, drawing concepts from economic/utility maximization theory, transportation system optimization and spatial interaction theory, using and often combining solution techniques as varied as Box-Jenkins methods, non-linear multivariate regression, non-linear mathematical programming, and agent-based microsimulation.

  10. Education on Demand

    DEFF Research Database (Denmark)

    Boysen, Lis; Hende, Merete

    2015-01-01

    Dette notat beskriver nogle af resultaterne fra programmet "Education on Demand' i projektet Det erhvervsrettede Uddannelseslaboratorium. Programmet har haft fokus på udfordringer og forandringsbehov i uddannelsesinstitutioner og -systemet. Herunder har det beskæftiget sig særligt med de to temat......Dette notat beskriver nogle af resultaterne fra programmet "Education on Demand' i projektet Det erhvervsrettede Uddannelseslaboratorium. Programmet har haft fokus på udfordringer og forandringsbehov i uddannelsesinstitutioner og -systemet. Herunder har det beskæftiget sig særligt med de...

  11. Demand Modelling in Telecommunications

    Directory of Open Access Journals (Sweden)

    M. Chvalina

    2009-01-01

    Full Text Available This article analyses the existing possibilities for using Standard Statistical Methods and Artificial Intelligence Methods for a short-term forecast and simulation of demand in the field of telecommunications. The most widespread methods are based on Time Series Analysis. Nowadays, approaches based on Artificial Intelligence Methods, including Neural Networks, are booming. Separate approaches will be used in the study of Demand Modelling in Telecommunications, and the results of these models will be compared with actual guaranteed values. Then we will examine the quality of Neural Network models. 

  12. Automated solar cell assembly team process research

    Science.gov (United States)

    Nowlan, M. J.; Hogan, S. J.; Darkazalli, G.; Breen, W. F.; Murach, J. M.; Sutherland, S. F.; Patterson, J. S.

    1994-06-01

    This report describes work done under the Photovoltaic Manufacturing Technology (PVMaT) project, Phase 3A, which addresses problems that are generic to the photovoltaic (PV) industry. Spire's objective during Phase 3A was to use its light soldering technology and experience to design and fabricate solar cell tabbing and interconnecting equipment to develop new, high-yield, high-throughput, fully automated processes for tabbing and interconnecting thin cells. Areas that were addressed include processing rates, process control, yield, throughput, material utilization efficiency, and increased use of automation. Spire teamed with Solec International, a PV module manufacturer, and the University of Massachusetts at Lowell's Center for Productivity Enhancement (CPE), automation specialists, who are lower-tier subcontractors. A number of other PV manufacturers, including Siemens Solar, Mobil Solar, Solar Web, and Texas instruments, agreed to evaluate the processes developed under this program.

  13. The newest trends in marshalling yards automation

    Directory of Open Access Journals (Sweden)

    Jiří ŽILKA

    2008-01-01

    Full Text Available Marshalling yards are one of the most important parts of every railway infrastructure. Means of mechanization and automation are being built to achieve as efficient forming of freight trains as possible. Modern, fully automatic systems based on extensive utilization of computers are being implemented. Their main function is to make freight trains into unit trains and divided according to their destinations. One part of these systems is responsible for automatic routing of coupled or isolated cars through the ladder. The other part automatically regulates by retarders the speed of the cars on their way into the destination tracks in the classification bowl. The state-of-the-art marshalling systems provide not only these basic automation functions. They offer also setting interlocked routes with a level of safety integrity SIL3. This article is focused on both above mentioned parts of marshalling systems – automation and safety one.

  14. Energy Assessment of Automated Mobility Districts

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displace private automobiles for day-to-day travel in dense activity districts. This project examines such a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMDs). The project reviews several such districts including airport, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet the mobility needs apart from private automobiles, some with automated technology and others with more traditional transit based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impact anticipated with AMDs.

  15. Restaurant No. 1 fully renovated

    CERN Multimedia

    2007-01-01

    The Restaurant No. 1 team. After several months of patience and goodwill on the part of our clients, we are delighted to announce that the major renovation work which began in September 2006 has now been completed. From 21 May 2007 we look forward to welcoming you to a completely renovated restaurant area designed with you in mind. The restaurant team wishes to thank all its clients for their patience and loyalty. Particular attention has been paid in the new design to creating a spacious serving area and providing a wider choice of dishes. The new restaurant area has been designed as an open-plan space to enable you to view all the dishes before making your selection and to move around freely from one food access point to another. It comprises user-friendly areas that fully comply with hygiene standards. From now on you will be able to pick and choose to your heart's content. We invite you to try out wok cooking or some other speciality. Or select a pizza or a plate of pasta with a choice of two sauces fr...

  16. Fully Traversable Wormholes Hiding Charge

    CERN Document Server

    Guendelman, Eduardo

    2012-01-01

    The charge-hiding effect by a wormhole, which was studied for the case where gravity/gauge-field system is self-consistently interacting with a charged lightlike brane (LLB) as a matter source, is now studied for the case of a time like brane. From the demand that no surfaces of infinite coordinate time redshift appear in the problem we are lead now to a completly traversable wormhole space, according to not only the traveller that goes through the wormhole (as was the case for the LLB), but also to a static external observer, this requires negative surface energy density for the shell sitting at the throat of the wormhole. We study a gauge field subsystem which is of a special non-linear form containing a square-root of the Maxwell term and which previously has been shown to produce a QCD-like confining gauge field dynamics in flat space-time. The condition of finite energy of the system or asymptotic flatness on one side of the wormhole implies that the charged object sitting at the wormhole throat expels a...

  17. Oil supply and demand

    Energy Technology Data Exchange (ETDEWEB)

    Rech, O

    2004-07-01

    World oil demand, driven by economic development in China, posted the highest growth rate in 20 years. In a context of geopolitical uncertainty, prices are soaring, encouraged by low inventory and the low availability of residual production capacity. Will 2004 bring a change in the oil market paradigm? (author)

  18. The demand for euros

    NARCIS (Netherlands)

    Arnold, I.J.M.; Roelands, S.

    2010-01-01

    This paper investigates the demand for euros using panel data for 10 euro area countries covering the period from 1999 to 2008. Monetary aggregates are constructed to ensure that money is a national concept by excluding deposits owned by non-residents and including external deposits owned by

  19. DEMAND AND PRICES

    Directory of Open Access Journals (Sweden)

    VĂDUVA MARIA

    2014-08-01

    Full Text Available Studying the consumer’s behavior by the ordinal approach of utility with the help of indifference curves allows us to deduce the two “movement laws of demand” in this chapter: the demand for a “normal” good is decreasing function of its price and an increasing function of income. We will use the elasticity concept to measure the intensity of the relation that is established between the demand, on the one hand, and prices or income, on the other hand: elasticity – price, direct and crossed, and elasticity – income. We can classify the goods in many categories, depending on the values that this elasticity takes. The demand elasticity can be determined depending on price and income. It reflects the proportion in which the demand for different products changes with the modification of the consumers’ income, the other factors remaining constant. The elasticity compared to the income is a demonstration of legality from the consumer’s sphere, which determines a certain hierarchy of the needs of each population category in a certain level of income. The movement of prices orients both the options and decisions of producers, namely the most useful productions and the most efficient investments, as well as the consumers’ options and decisions on the most advantageous buying of goods and services that they need. The prices appear as a “signal system” coordinating and making coherence the economic agents’ decisions – producers, consumers and population.

  20. The Automated Palomar 60 Inch Telescope

    Science.gov (United States)

    Cenko, S. Bradley; Fox, Derek B.; Moon, Dae-Sik; Harrison, Fiona A.; Kulkarni, S. R.; Henning, John R.; Guzman, C. Dani; Bonati, Marco; Smith, Roger M.; Thicksten, Robert P.; Doyle, Michael W.; Petrie, Hal L.; Gal-Yam, Avishay; Soderberg, Alicia M.; Anagnostou, Nathaniel L.; Laity, Anastasia C.

    2006-10-01

    We have converted the Palomar 60 inch (1.52 m) telescope from a classic night-assistant-operated telescope to a fully robotic facility. The automated system, which has been operational since 2004 September, is designed for moderately fast (tdesign requirements, hardware and software upgrades, and lessons learned from roboticization. We present an overview of the current system performance as well as plans for future upgrades.

  1. Implementation and development of an automated, ultra-high-capacity, acoustic, flexible dispensing platform for assay-ready plate delivery.

    Science.gov (United States)

    Griffith, Dylan; Northwood, Roger; Owen, Paul; Simkiss, Ellen; Brierley, Andrew; Cross, Kevin; Slaney, Andrew; Davis, Miranda; Bath, Colin

    2012-10-01

    Compound management faces the daily challenge of providing high-quality samples to drug discovery. The advent of new screening technologies has seen demand for liquid samples move toward nanoliter ranges, dispensed by contactless acoustic droplet ejection. Within AstraZeneca, a totally integrated assay-ready plate production platform has been created to fully exploit the advantages of this technology. This enables compound management to efficiently deliver large throughputs demanded by high-throughput screening while maintaining regular delivery of smaller numbers of compounds in varying plate formats for cellular or biochemical concentration-response curves in support of hit and lead optimization (structure-activity relationship screening). The automation solution, CODA, has the capability to deliver compounds on demand for single- and multiple-concentration ranges, in batch sizes ranging from 1 sample to 2 million samples, integrating seamlessly into local compound and test management systems. The software handles compound orders intelligently, grouping test requests together dependent on output plate type and serial dilution ranges so that source compound vessels are shared among numerous tests, ensuring conservation of sample, reduced labware and costs, and efficiency of work cell logistics. We describe the development of CODA to address the customer demand, challenges experienced, learning made, and subsequent enhancements.

  2. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation.

    Directory of Open Access Journals (Sweden)

    Oscar Beijbom

    Full Text Available Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey-images captured at four Pacific coral reefs. Inter- and intra- annotator variability among six human experts was quantified and compared to semi- and fully- automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys.

  3. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation.

    Science.gov (United States)

    Beijbom, Oscar; Edmunds, Peter J; Roelfsema, Chris; Smith, Jennifer; Kline, David I; Neal, Benjamin P; Dunlap, Matthew J; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B Greg; Kriegman, David

    2015-01-01

    Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey-images captured at four Pacific coral reefs. Inter- and intra- annotator variability among six human experts was quantified and compared to semi- and fully- automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys.

  4. Towards Automated Annotation of Benthic Survey Images: Variability of Human Experts and Operational Modes of Automation

    Science.gov (United States)

    Beijbom, Oscar; Edmunds, Peter J.; Roelfsema, Chris; Smith, Jennifer; Kline, David I.; Neal, Benjamin P.; Dunlap, Matthew J.; Moriarty, Vincent; Fan, Tung-Yung; Tan, Chih-Jui; Chan, Stephen; Treibitz, Tali; Gamst, Anthony; Mitchell, B. Greg; Kriegman, David

    2015-01-01

    Global climate change and other anthropogenic stressors have heightened the need to rapidly characterize ecological changes in marine benthic communities across large scales. Digital photography enables rapid collection of survey images to meet this need, but the subsequent image annotation is typically a time consuming, manual task. We investigated the feasibility of using automated point-annotation to expedite cover estimation of the 17 dominant benthic categories from survey-images captured at four Pacific coral reefs. Inter- and intra- annotator variability among six human experts was quantified and compared to semi- and fully- automated annotation methods, which are made available at coralnet.ucsd.edu. Our results indicate high expert agreement for identification of coral genera, but lower agreement for algal functional groups, in particular between turf algae and crustose coralline algae. This indicates the need for unequivocal definitions of algal groups, careful training of multiple annotators, and enhanced imaging technology. Semi-automated annotation, where 50% of the annotation decisions were performed automatically, yielded cover estimate errors comparable to those of the human experts. Furthermore, fully-automated annotation yielded rapid, unbiased cover estimates but with increased variance. These results show that automated annotation can increase spatial coverage and decrease time and financial outlay for image-based reef surveys. PMID:26154157

  5. Automated DNA Sequencing System

    Energy Technology Data Exchange (ETDEWEB)

    Armstrong, G.A.; Ekkebus, C.P.; Hauser, L.J.; Kress, R.L.; Mural, R.J.

    1999-04-25

    Oak Ridge National Laboratory (ORNL) is developing a core DNA sequencing facility to support biological research endeavors at ORNL and to conduct basic sequencing automation research. This facility is novel because its development is based on existing standard biology laboratory equipment; thus, the development process is of interest to the many small laboratories trying to use automation to control costs and increase throughput. Before automation, biology Laboratory personnel purified DNA, completed cycle sequencing, and prepared 96-well sample plates with commercially available hardware designed specifically for each step in the process. Following purification and thermal cycling, an automated sequencing machine was used for the sequencing. A technician handled all movement of the 96-well sample plates between machines. To automate the process, ORNL is adding a CRS Robotics A- 465 arm, ABI 377 sequencing machine, automated centrifuge, automated refrigerator, and possibly an automated SpeedVac. The entire system will be integrated with one central controller that will direct each machine and the robot. The goal of this system is to completely automate the sequencing procedure from bacterial cell samples through ready-to-be-sequenced DNA and ultimately to completed sequence. The system will be flexible and will accommodate different chemistries than existing automated sequencing lines. The system will be expanded in the future to include colony picking and/or actual sequencing. This discrete event, DNA sequencing system will demonstrate that smaller sequencing labs can achieve cost-effective the laboratory grow.

  6. The fully Mobile City Government Project (MCity)

    DEFF Research Database (Denmark)

    Scholl, Hans; Fidel, Raya; Mai, Jens Erik

    2006-01-01

    The Fully Mobile City Government Project, also known as MCity, is an interdisciplinary research project on the premises, requirements, and effects of fully mobile, wirelessly connected applications (FWMC). The project will develop an analytical framework for interpreting the interaction...

  7. Prospects for European labour demand.

    Science.gov (United States)

    Lindley, R M

    1988-07-01

    The impact of economic and technological trends upon the level and structure of labor demand is examined, exploring the methods used to model the labor market and making special reference to demography and technology. Evidence on recent and prospective changes in labor demand is reviewed for France, Germany, Italy, the Netherlands, and the UK. The models used to explore future employment scenarios usually fail to incorporate the linkages required to fully analyze the various demographic-economic interactions. Further, this is not generally viewed as a limitation, given the time frame of most employment projections and their preoccupation with changes in the structure of labor demand. Medium-term multisectoral models tend to pay more attention to both demographic and technical change, but the treatment of both aspects is limited. The projections provide a framework for considering how both socioeconomic behavior and policy might change to achieve different outcomes. The greater a model's behavioral content, as expressed in its relationships between different variables, the greater the insight obtainable from simulation exercises. The 1st half of the 1970s was characterized by a reduction in German employment, representing the severest of European reactions to the oil crisis. The 2nd half of the decade recorded rapid growth in Italy and the Netherlands. The 1980s started with marked declines in Germany and the UK. Overall, the net gains of the 1970s were lost in the recession following the 2nd oil crisis. In none of the 5 countries studied does any realistic prospect emerge of achieving full employment before 2000. The most optimistic outcome is that unemployment will decline only slowly, it at all. The growth of both new forms and areas of employment will not compensate sufficiently for the loss of jobs elsewhere and the growth of labor supply. The industrial sector will continue to experience change in favor of the service sector but at a slower rate than during

  8. Automated Solid-Phase Radiofluorination Using Polymer-Supported Phosphazenes

    Directory of Open Access Journals (Sweden)

    Bente Mathiessen

    2013-08-01

    Full Text Available The polymer supported phosphazene bases PS-P2tBu and the novel PS-P2PEG allowed for efficient extraction of [18F]F− from proton irradiated [18O]H2O and subsequent radiofluorination of a broad range of substrates directly on the resin. The highest radiochemical yields were obtained with aliphatic sulfonates (69% and bromides (42%; the total radiosynthesis time was 35–45 min. The multivariate analysis showed that the radiochemical yields and purities were controlled by the resin load, reaction temperature, and column packing effects. The resins could be reused several times with the same or different substrates. The fully automated on-column radiofluorination methodology was applied to the radiosynthesis of the important PET radiotracers [18F]FLT and [18F]FDG. The latter was produced with 40% yield on a 120 GBq scale and passed GMP-regulated quality control required for commercial production of [18F]FDG. The combination of compact form factor, simplicity of [18F]F− recovery and processing, and column reusability can make solid phase radiofluorination an attractive radiochemistry platform for the emerging dose-on-demand instruments for bedside production of PET radiotracers.

  9. A Semi-Automated Functional Test Data Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Peng; Haves, Philip; Kim, Moosung

    2005-05-01

    The growing interest in commissioning is creating a demand that will increasingly be met by mechanical contractors and less experienced commissioning agents. They will need tools to help them perform commissioning effectively and efficiently. The widespread availability of standardized procedures, accessible in the field, will allow commissioning to be specified with greater certainty as to what will be delivered, enhancing the acceptance and credibility of commissioning. In response, a functional test data analysis tool is being developed to analyze the data collected during functional tests for air-handling units. The functional test data analysis tool is designed to analyze test data, assess performance of the unit under test and identify the likely causes of the failure. The tool has a convenient user interface to facilitate manual entry of measurements made during a test. A graphical display shows the measured performance versus the expected performance, highlighting significant differences that indicate the unit is not able to pass the test. The tool is described as semiautomated because the measured data need to be entered manually, instead of being passed from the building control system automatically. However, the data analysis and visualization are fully automated. The tool is designed to be used by commissioning providers conducting functional tests as part of either new building commissioning or retro-commissioning, as well as building owners and operators interested in conducting routine tests periodically to check the performance of their HVAC systems.
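
    A minimal sketch of the kind of comparison such a tool performs is shown below; the function names, measurement points and the fixed 5% tolerance are illustrative assumptions, not the tool's actual interface or test logic.

```python
# Sketch: flag functional-test measurements that deviate from expected values.
# All names and the 5% relative tolerance are illustrative assumptions,
# not the actual tool's API or pass/fail criteria.

def evaluate_test(measured: dict, expected: dict, rel_tol: float = 0.05) -> list:
    """Return a list of (point, measured, expected) tuples that fail the tolerance check."""
    failures = []
    for point, exp_value in expected.items():
        meas_value = measured.get(point)
        if meas_value is None:
            failures.append((point, None, exp_value))  # measurement missing from manual entry
            continue
        if abs(meas_value - exp_value) > rel_tol * abs(exp_value):
            failures.append((point, meas_value, exp_value))
    return failures

# Example: readings for an air-handling unit entered manually by the tester.
measured = {"supply_air_temp_C": 16.8, "fan_power_kW": 7.9}
expected = {"supply_air_temp_C": 13.0, "fan_power_kW": 7.5}
for point, meas, exp in evaluate_test(measured, expected):
    print(f"{point}: measured={meas}, expected={exp} -> outside tolerance")
```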

  10. Economic Dispatch of Demand Response Balancing through Asymmetric Block Offers

    DEFF Research Database (Denmark)

    O'Connell, Niamh; Pinson, Pierre; Madsen, Henrik

    2015-01-01

    The proposed asymmetric block offer structure captures the ability of a load to provide a response to the power system and the subsequent need to recover. The conventional system dispatch algorithm is altered to facilitate the dispatch of demand response units alongside generating units using the proposed offer structure. The value of demand response is assessed through case studies that dispatch flexible supermarket refrigeration loads for the provision of regulating power. The demand resource is described by a set of asymmetric blocks, and a set of four block offers is shown to deliver cost savings for the procurement of regulating power in excess of 20%. For comparative purposes, the cost savings achievable with a fully observable and controllable demand response resource are evaluated, using a time series model of the refrigeration loads. The fully modeled resource offers greater savings; however, the difference is small and potentially insufficient to justify...
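
    As an illustration of the dispatch idea, the sketch below selects the cheapest mix of conventional offers and demand-response blocks to cover a regulating-power requirement; the block data, prices and greedy merit-order selection are assumptions made for illustration, not the paper's actual dispatch formulation.

```python
# Sketch: merit-order selection of regulating power from generator offers and
# demand-response blocks. An asymmetric DR block couples a response (MW shed)
# with a later recovery (MW rebound); here only the response capacity is
# dispatched and the rebound is simply reported. All numbers are illustrative.

from dataclasses import dataclass

@dataclass
class Offer:
    name: str
    capacity_mw: float       # regulating power offered
    price_eur_per_mw: float  # offer price
    rebound_mw: float = 0.0  # deferred recovery (asymmetric block); 0 for generators

def dispatch(offers, requirement_mw):
    """Greedy merit-order dispatch: cheapest offers first until the requirement is met."""
    dispatched, remaining = [], requirement_mw
    for offer in sorted(offers, key=lambda o: o.price_eur_per_mw):
        if remaining <= 0:
            break
        take = min(offer.capacity_mw, remaining)
        dispatched.append((offer, take))
        remaining -= take
    return dispatched, remaining

offers = [
    Offer("gen_A", capacity_mw=10, price_eur_per_mw=60),
    Offer("dr_block_1", capacity_mw=3, price_eur_per_mw=40, rebound_mw=2.0),
    Offer("dr_block_2", capacity_mw=2, price_eur_per_mw=55, rebound_mw=1.5),
]
schedule, shortfall = dispatch(offers, requirement_mw=6)
for offer, mw in schedule:
    print(f"{offer.name}: {mw} MW at {offer.price_eur_per_mw} EUR/MW, rebound {offer.rebound_mw} MW")
print("unserved requirement:", shortfall, "MW")
```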

  11. Automation of solar plants

    Energy Technology Data Exchange (ETDEWEB)

    Yebra, L.J.; Romero, M.; Martinez, D.; Valverde, A. [CIEMAT - Plataforma Solar de Almeria, Tabernas (Spain); Berenguel, M. [Almeria Univ. (Spain). Departamento de Lenguajes y Computacion

    2004-07-01

    This work overviews some of the main activities and research lines that are being carried out within the scope of the specific collaboration agreement between the Plataforma Solar de Almeria-CIEMAT (PSA-CIEMAT) and the Automatic Control, Electronics and Robotics research group of the Universidad de Almeria (TEP197), titled ''Development of control systems and tools for thermosolar plants'', and the projects financed by the MCYT DPI2001-2380-C02-02 and DPI2002-04375-C03. The research is driven by the need to improve the efficiency of the processes through which the energy provided by the sun is totally or partially used as an energy source, and to reduce the costs associated with the operation and maintenance of the installations that use this energy source. The final objective is to develop different automatic control systems and techniques aimed at improving the competitiveness of solar plants. The paper summarizes different objectives and automatic control approaches that are being implemented in different facilities at the PSA-CIEMAT: central receiver systems and solar furnace. For each of these facilities, a systematic procedure is being followed, composed of several steps: (i) development of dynamic models using the newest modeling technologies (both for simulation and control purposes), (ii) development of fully automated data acquisition and control systems including software tools facilitating the analysis of data and the application of knowledge to the controlled plants and (iii) synthesis of advanced controllers using techniques successfully used in the process industry and development of new and optimized control algorithms for solar plants. These aspects are summarized in this work. (orig.)

  12. AUTOMATED ANALYSIS OF BREAKERS

    Directory of Open Access Journals (Sweden)

    E. M. Farhadzade

    2014-01-01

    Full Text Available Breakers are items of Electric Power Systems' equipment whose reliability strongly influences the reliability of power plants. In particular, breakers determine the structural reliability of the switchgear circuits of power stations and network substations. Failure to clear a short circuit, followed by failure of the backup unit or of the long-distance protection system, quite often leads to a system emergency. The problem of improving breaker reliability and reducing maintenance expenses is becoming ever more urgent as the maintenance and repair costs of oil circuit and air-break circuit breakers rise systematically. The main direction for solving this problem is the improvement of diagnostic control methods and the organization of on-condition maintenance. This, however, demands a great amount of statistical information about the nameplate data of breakers and their operating conditions, their failures, testing and repairs, as well as advanced software based on computer technologies and a specific automated information system (AIS). A new AIS with the AISV logo was developed at the department "Reliability of power equipment" of AzRDSI of Energy. The main features of AISV are: to provide the security and accuracy of the database; to carry out systematic control of breakers' conformity with operating conditions; to estimate the value of individual reliability and the characteristics of its changes for a given combination of characteristics; and to provide personnel responsible for the technical maintenance of breakers not only with information but also with methodological support, including recommendations for solving the given problem and advanced methods for their realization.

  13. Demand surge following earthquakes

    Science.gov (United States)

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.
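
    A simple worked example of how a demand surge factor inflates a repair estimate, splitting the cost into labor and material components, is shown below; the cost split and surge factors are illustrative assumptions, not values from the study.

```python
# Sketch: apply separate surge factors to the labor and material shares of a repair cost.
# The 60/40 split and the surge factors are purely illustrative assumptions.

base_repair_cost = 100_000.0   # pre-disaster repair estimate, USD
labor_share = 0.60             # assumed share of the cost that is labor
labor_surge = 1.35             # labor costs inflate more after a large-scale disaster
material_surge = 1.05          # material costs inflate less, per the hurricane model's finding

surged_cost = base_repair_cost * (labor_share * labor_surge
                                  + (1 - labor_share) * material_surge)
print(f"surged repair cost: {surged_cost:,.0f} USD "
      f"({(surged_cost / base_repair_cost - 1) * 100:.0f}% demand surge)")
```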

  14. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  15. Demand scenarios, worldwide

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, A. [Massachusetts Inst. of Technology, Center for Technology, Policy and Industrial Development and the MIT Joint Program on the Science and Policy of Global Change, Cambridge, MA (United States)

    1996-11-01

    Existing methods are inadequate for developing aggregate (regional and global) and long-term (several decades) passenger transport demand scenarios, since they are mainly based on simple extensions of current patterns rather than causal relationships that account for the competition among transport modes (aircraft, automobiles, buses and trains) to provide transport services. The demand scenario presented in this paper is based on two empirically proven invariances of human behavior. First, transport accounts for 10 to 15 percent of household total expenditures for those owning an automobile, and around 5 percent for non-motorized households on average (travel money budget). Second, the mean time spent traveling is approximately one hour per capita per day (travel time budget). These two budget constraints determine the dynamics of the scenario: rising income increases per capita expenditure on travel which, in turn, increases demand for mobility. Limited travel time constrains travelers to shift to faster transport systems. The scenario is initiated with the first integrated historical data set on traffic volume in 11 world regions and the globe from 1960 to 1990 for all major modes of motorized transport. World average per capita traffic volume, which was 1,800 kilometers in 1960 and 4,2090 in 1990, is estimated to rise to 7,900 kilometers in 2020 - given a modest average increase in Gross World Product of 1.9% per year. Higher economic growth rates in Asian regions result in an increase in regional per capita traffic volume up to a factor of 5.3 from 1990 levels. Modal splits continue shifting to more flexible and faster modes of transport. At one point, passenger cars can no longer satisfy the increasing demand for speed (i.e. rising mobility within a fixed time budget). In North America it is estimated that the absolute traffic volume of automobiles will gradually decline starting in the 2010s. (author) 13 figs., 6 tabs., 35 refs.
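
    A rough sketch of the scenario's driving mechanism under the two behavioral invariances is shown below, using made-up numbers: a fixed share of income is spent on travel, and a fixed daily travel time forces a shift to faster modes as the affordable distance grows. The incomes, cost per kilometer and resulting speeds are illustrative assumptions, not figures from the paper.

```python
# Sketch: per-capita travel demand driven by a fixed travel money budget and a
# fixed travel time budget. Income and cost figures are illustrative only.

TRAVEL_MONEY_SHARE = 0.12      # ~10-15% of expenditure for motorized households
TRAVEL_TIME_BUDGET_H = 1.0     # ~1 hour of travel per person per day

def annual_travel_km(income_per_year, cost_per_km):
    """Distance affordable under the travel money budget."""
    return TRAVEL_MONEY_SHARE * income_per_year / cost_per_km

def required_speed_kmh(annual_km):
    """Average door-to-door speed needed to fit that distance into the time budget."""
    return annual_km / (365 * TRAVEL_TIME_BUDGET_H)

for year, income in [(1990, 12_000), (2020, 21_000)]:   # hypothetical incomes, USD/yr
    km = annual_travel_km(income, cost_per_km=0.30)      # hypothetical cost, USD/km
    print(f"{year}: {km:,.0f} km/yr, requires mean speed ~{required_speed_kmh(km):.0f} km/h")
```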

  16. Market Expects Demand Increase

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    In the recently released Textile Industry Invigorating Plan, "giving attention to both domestic and overseas markets" is put in a key position. Under a series of policies, such as increasing the tax rebate rate for textile and garment exports and granting loans to SMEs, further development of this industry is expected. Even so, we should know that it takes time for demand to pick up, and this requires patience. The only question is how much time we have to wait.

  17. Migration and Tourism Demand

    Directory of Open Access Journals (Sweden)

    Nuno Carlos LEITÃO

    2012-02-01

    Full Text Available This study considers the relationship between immigration and Portuguese tourism demand for the period 1995-2008, using a dynamic panel data approach. The findings indicate that Portuguese tourism increased significantly during the period in accordance with the values expected for a developed country. The regression results show that income, shock of immigration, population, and geographical distance between Portugal and countries of origin are the main determinants of Portuguese tourism.

  18. Resilience Evaluation of Demand Response as Spinning Reserve under Cyber-Physical Threats

    Directory of Open Access Journals (Sweden)

    Anas AlMajali

    2016-12-01

    Full Text Available In the future, automated demand response mechanisms will be used as spinning reserve. Demand response in the smart grid must be resilient to cyber-physical threats. In this paper, we evaluate the resilience of demand response when used as spinning reserve in the presence of cyber-physical threats. We quantify this evaluation by correlating the stability of the system in the presence of attacks measured by system frequency (Hz and attack level measured by the amount of load (MW that responds to the demand response event. The results demonstrate the importance of anticipating the dependability of demand response before it can be relied upon as spinning reserve.
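
    The evaluation described above correlates system frequency with the amount of demand-response load manipulated during an attack; a minimal sketch of that correlation computation is shown below. The sample values are fabricated for illustration and are not results from the paper.

```python
# Sketch: correlate the minimum system frequency observed during an attack with
# the amount of demand-response load manipulated. Sample values are made up for
# illustration only.

def pearson(xs, ys):
    """Pearson correlation coefficient computed directly from its definition."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

attacked_load_mw = [0, 50, 100, 150, 200]                 # load responding to a spoofed DR event
min_frequency_hz = [60.00, 59.97, 59.93, 59.88, 59.82]    # illustrative frequency dips

print("correlation:", round(pearson(attacked_load_mw, min_frequency_hz), 3))
```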

  19. Laboratory Automation and Middleware.

    Science.gov (United States)

    Riben, Michael

    2015-06-01

    The practice of surgical pathology is under constant pressure to deliver the highest quality of service, reduce errors, increase throughput, and decrease turnaround time while at the same time dealing with an aging workforce, increasing financial constraints, and economic uncertainty. Although total laboratory automation has not been attainable, great progress continues to be made in workstation automation in all areas of the pathology laboratory. This report highlights the benefits and challenges of pathology automation, reviews middleware and its use to facilitate automation, and reviews the progress so far in the anatomic pathology laboratory.

  20. Progress toward Producing Demand-Response-Ready Appliances

    Energy Technology Data Exchange (ETDEWEB)

    Hammerstrom, Donald J.; Sastry, Chellury

    2009-12-01

    This report summarizes several historical and ongoing efforts to make small electrical demand-side devices like home appliances more responsive to the dynamic needs of electric power grids. Whereas the utility community often reserves the term demand response for infrequent 2- to 6-hour curtailments that reduce total electrical system peak load, other beneficial responses and ancillary services that may be provided by responsive electrical demand are also of interest. Historically, demand responses have been obtained by applying external, retrofitted, controlled switches to existing electrical loads. This report is directed instead toward those manufactured products, including appliances, that are able to provide demand responses as soon as they are purchased and that require few, or no, after-market modifications to make them responsive to the needs of power grids. Efforts summarized include Open Automated Demand Response, the Association of Home Appliance Manufacturers standard CHA 1, a simple interface being developed by the U-SNAP Alliance, various emerging autonomous responses, and the recent PinBus interface that was developed at Pacific Northwest National Laboratory.

  1. Architecture of a fully integrated communication infrastructure for the smart home; Architektur einer vollintegrierten Kommunikationsinfrastruktur fuer das Smart Home

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Falk-Moritz; Kays, Ruediger [TU Dortmund (Germany). Lehrstuhl fuer Kommunikationstechnik

    2012-07-01

    For some time, applications in the areas of home automation, ambient assisted living and e-health have been discussed. These require reliable and energy-efficient communication solutions in the home environment. In addition, new concepts that go hand in hand with the concept of the smart grid need access to devices within the home environment. In the realization of smart homes, the diversity of market participants involved, the coexisting business models, the application requirements and the available communication systems place special demands on the underlying network infrastructure. Different solutions should be able to communicate with each other and be compatible. In addition, the user expects simple operation and configuration as well as long-term support of the products. In the best case, the user is confronted with a single, integrated network infrastructure. Instead of separate systems for reading out smart meters, monitoring the solar system, health monitoring, and configuring multimedia devices, the telephone system or the computer network, a fully integrated smart home communications infrastructure should come into operation. This smart home infrastructure should be free of unnecessary duplication of structures; all devices with a communication interface should be taken into account. The authors of the contribution under consideration report on a possible architecture of such a network infrastructure. Different grades are identified. A protocol stack for different technologies and the linking of different network hierarchies are described.

  2. Supply Chain Coordination with Demand Disruptions under Convex Production Cost Function

    Institute of Scientific and Technical Information of China (English)

    XU Ming-hui; GAO Cheng-xiu

    2005-01-01

    This paper investigates the problem of how to handle demand disruptions in a one-supplier-one-retailer supply chain, where production cost is a convex function of production quantity and the price-demand relationship is linear. Our results show that, if demand is disrupted, under the new price-demand relationship, all-unit wholesale quantity discount policies combined with capacitated linear pricing policies can also fully coordinate the supply chain.
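
    For concreteness, the sketch below sets up the same ingredients numerically: a linear price-demand relationship, a convex (quadratic) production cost, and a search for the price that maximizes total chain profit before and after a demand disruption. The functional forms and parameter values are assumptions used only to illustrate the structure of the problem, not the paper's coordination mechanism.

```python
# Sketch: centralized (chain-wide) profit with linear demand q = a - b*p and a
# convex quadratic production cost c*q^2. A demand disruption shifts 'a'.
# All parameter values are illustrative assumptions.

def chain_profit(p, a, b, c):
    q = max(a - b * p, 0.0)          # linear price-demand relationship
    return p * q - c * q * q         # revenue minus convex production cost

def best_price(a, b, c, prices):
    return max(prices, key=lambda p: chain_profit(p, a, b, c))

prices = [p / 10 for p in range(1, 401)]             # grid search over candidate prices
for label, a in [("baseline demand", 100.0), ("disrupted demand", 80.0)]:
    p_star = best_price(a, b=2.0, c=0.05, prices=prices)
    print(f"{label}: best price {p_star:.1f}, profit {chain_profit(p_star, a, 2.0, 0.05):.1f}")
```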

  3. Meeting increased demand.

    Science.gov (United States)

    Blair, Andrew

    2004-07-01

    New Zealand is a little country with a little economy but with a population that's rapidly aging. New Zealand's population is only 4.3 million people. Its GDP is only $US58.6 billion (2002). New Zealand's expenditure on health as a percentage of GDP is not out of line with that of other countries. As a nation we have been increasing expenditure on health over recent years. In 1990 we spent 7% of GDP on health. In 1995 that increased to 7.65% and is now 8.3%. However, in per capita terms our expenditure on health does not compare so well with like countries. The size of New Zealand's economy is restricting what our country spends on health. Health is already the second highest demand on the New Zealand tax dollar. The tolerance of New Zealanders would be challenged if a Government attempted to increase taxes further to meet the growing demands for expenditure on health, but at the same time the population's expectations are increasing. This is the challenging situation we face today. What lies ahead? Like all industrialized countries, New Zealand is facing an aging population. The population below age 40 is decreasing, but it is increasing significantly over that age. 16% of the population is currently aged over 60. By 2051 this proportion will almost double to just over 31%. Coupled with the aging population are increased awareness and expectations, as access to options for treatment and technology becomes readily accessible to the population through such media as the internet. The extent of the impact of the aging population can be clearly represented by focusing on one specialty such as orthopaedics. The New Zealand Orthopaedic Association undertook a study in July 2003 which concluded (among other things) that as a result of the projected aging of the population, over the next 50 years: Musculo-skeletal operations will increase by over 30%. The number of hip replacements will nearly double. The incidence of osteoporosis will increase by a massive 201%. The number

  4. AUTOMATION OF IMAGE DATA PROCESSING

    Directory of Open Access Journals (Sweden)

    Preuss Ryszard

    2014-12-01

    Full Text Available This article discusses the current capabilities of automated processing of image data, using the example of the PhotoScan software by Agisoft. At present, image data obtained by various registration systems (metric and non-metric cameras placed on airplanes, satellites, or, more often, on UAVs) are used to create photogrammetric products. Multiple registrations of an object or land area (capturing large groups of photos) are usually performed in order to eliminate obscured areas as well as to raise the final accuracy of the photogrammetric product. Because of this, the geometry of the resulting image blocks is far from the typical configuration of images. For fast image georeferencing, automatic image matching algorithms are currently applied. They can create a model of a block in a local coordinate system or, using initial exterior orientation and measured control points, can provide image georeference in an external reference frame. In the case of non-metric images, it is also possible to carry out a self-calibration process at this stage. The image matching algorithm is also used in the generation of dense point clouds reconstructing the spatial shape of the object (area). In subsequent processing steps it is possible to obtain typical photogrammetric products such as an orthomosaic, DSM or DTM and a photorealistic solid model of an object. All aforementioned processing steps are implemented in a single program, in contrast to standard commercial software which divides the steps into dedicated modules. Image processing leading to final georeferenced products can be fully automated, including sequential implementation of the processing steps with predetermined control parameters. The paper presents the practical results of fully automatic generation of orthomosaics both for images obtained by a metric Vexell camera and for a block of images acquired by a non-metric UAV system.

  5. Evaluation of Representative Smart Grid Investment Project Technologies: Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, Jason C.; Prakash Kumar, Nirupama; Bonebrake, Christopher A.

    2012-02-14

    This document is one of a series of reports estimating the benefits of deploying technologies similar to those implemented on the Smart Grid Investment Grant (SGIG) projects. Four technical reports cover the various types of technologies deployed in the SGIG projects: distribution automation, demand response, energy storage, and renewables integration. A fifth report in the series examines the benefits of deploying these technologies on a national level. This technical report examines the impacts of a limited number of demand response technologies and implementations deployed in the SGIG projects.

  6. Automating checks of plan check automation.

    Science.gov (United States)

    Halabi, Tarek; Lu, Hsiao-Ming

    2014-07-08

    While a few physicists have designed new plan check automation solutions for their clinics, fewer, if any, managed to adapt existing solutions. As complex and varied as the systems they check, these programs must gain the full confidence of those who would run them on countless patient plans. The present automation effort, planCheck, therefore focuses on versatility and ease of implementation and verification. To demonstrate this, we apply planCheck to proton gantry, stereotactic proton gantry, stereotactic proton fixed beam (STAR), and IMRT treatments.

  7. Automate functional testing

    Directory of Open Access Journals (Sweden)

    Ramesh Kalindri

    2014-06-01

    Full Text Available Currently, software engineers are increasingly turning to the option of automating functional tests, but they are not always successful in this endeavor. Reasons range from poor planning to cost overruns in the process. Some principles that can guide teams in automating these tests are described in this article.

  8. Automation in Warehouse Development

    NARCIS (Netherlands)

    Hamberg, R.; Verriet, J.

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes.

  9. More Benefits of Automation.

    Science.gov (United States)

    Getz, Malcolm

    1988-01-01

    Describes a study that measured the benefits of an automated catalog and automated circulation system from the library user's point of view in terms of the value of time saved. Topics discussed include patterns of use, access time, availability of information, search behaviors, and the effectiveness of the measures used. (seven references)…

  10. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded.

  11. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  12. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  13. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  14. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    Science.gov (United States)

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper finds that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  15. Automated model building

    CERN Document Server

    Caferra, Ricardo; Peltier, Nicholas

    2004-01-01

    This is the first book on automated model building, a discipline of automated deduction that is of growing importance. Although models and their construction are important per se, automated model building has appeared as a natural enrichment of automated deduction, especially in the attempt to capture the human way of reasoning. The book provides an historical overview of the field of automated deduction, and presents the foundations of different existing approaches to model construction, in particular those developed by the authors. Finite and infinite model building techniques are presented. The main emphasis is on calculi-based methods, and relevant practical results are provided. The book is of interest to researchers and graduate students in computer science, computational logic and artificial intelligence. It can also be used as a textbook in advanced undergraduate courses.

  16. Automation in Warehouse Development

    CERN Document Server

    Verriet, Jacques

    2012-01-01

    The warehouses of the future will come in a variety of forms, but with a few common ingredients. Firstly, human operational handling of items in warehouses is increasingly being replaced by automated item handling. Extended warehouse automation counteracts the scarcity of human operators and supports the quality of picking processes. Secondly, the development of models to simulate and analyse warehouse designs and their components facilitates the challenging task of developing warehouses that take into account each customer’s individual requirements and logistic processes. Automation in Warehouse Development addresses both types of automation from the innovative perspective of applied science. In particular, it describes the outcomes of the Falcon project, a joint endeavour by a consortium of industrial and academic partners. The results include a model-based approach to automate warehouse control design, analysis models for warehouse design, concepts for robotic item handling and computer vision, and auton...

  17. Advances in inspection automation

    Science.gov (United States)

    Weber, Walter H.; Mair, H. Douglas; Jansen, Dion; Lombardi, Luciano

    2013-01-01

    This new session at QNDE reflects the growing interest in inspection automation. Our paper describes a newly developed platform that makes complex NDE automation possible without the need for software programmers. Inspection tasks that are tedious, error-prone or impossible for humans to perform can now be automated using a form of drag-and-drop visual scripting. Our work attempts to rectify the problem that NDE is not keeping pace with the rest of factory automation. Outside of NDE, robots routinely and autonomously machine parts, assemble components, weld structures and report progress to corporate databases. By contrast, components arriving in the NDT department typically require manual part handling, calibrations and analysis. The automation examples in this paper cover the development of robotic thickness gauging and the use of adaptive contour following on the NRU reactor inspection at Chalk River.

  18. A Validity-Based Approach to Quality Control and Assurance of Automated Scoring

    Science.gov (United States)

    Bejar, Isaac I.

    2011-01-01

    Automated scoring of constructed responses is already operational in several testing programmes. However, as the methodology matures and the demand for the utilisation of constructed responses increases, the volume of automated scoring is likely to increase at a fast pace. Quality assurance and control of the scoring process will likely be more…

  19. Individual Differences in Response to Automation: The Five Factor Model of Personality

    Science.gov (United States)

    Szalma, James L.; Taylor, Grant S.

    2011-01-01

    This study examined the relationship of operator personality (Five Factor Model) and characteristics of the task and of adaptive automation (reliability and adaptiveness--whether the automation was well-matched to changes in task demand) to operator performance, workload, stress, and coping. This represents the first investigation of how the Five…

  20. Building energy demand aggregation and simulation tools

    DEFF Research Database (Denmark)

    Gianniou, Panagiota; Heller, Alfred; Rode, Carsten

    2015-01-01

    Nowadays, the minimization of energy consumption and the optimization of efficiency of the overall energy grid have been on the agenda of most national and international energy policies. At the same time, urbanization has put cities under the microscope towards achieving cost-effective energy savings due to their compact and highly dense form. Thus, accurate estimation of the energy demand of cities is of high importance to policy-makers and energy planners. This calls for automated methods that are easily expandable to higher levels of aggregation, ranging from clusters of buildings to neighbourhoods and cities. Buildings occupy a key place in the development of smart cities as they represent an important potential to integrate smart energy solutions. Building energy consumption affects significantly the performance of the entire energy network. Therefore, a realistic estimation

  1. Reducing and Calibrating SCUBA Data on Demand

    Science.gov (United States)

    Jenness, Tim; Bohlender, David A.; Gaudet, Séverin J.; Economou, Frossie; Tilanus, Remo P. J.; Durand, Daniel; Hill, Norman R.

    One of the most important aspects of the new GRID and virtual observatory initiatives is the production of reliably calibrated data from telescope archives. The archive for the Submillimeter Common-User Bolometer Array (SCUBA) on the James Clerk Maxwell Telescope is small enough (about 100 GB) to allow it to be on-line and for it to be processed in a reasonable amount of time. This makes it possible to address the important issues of calibration and source identification in a dataset that is tractable with reasonable computer hardware. This paper will focus on the JAC's and CADC's experience of automating the processing of SCUBA data, the difficulties and the successes, culminating in the ability of the SCUBA archive at the CADC to serve reduced images on demand.

  2. CONSUMER DEMAND FOR FOOD DIVERSITY

    OpenAIRE

    Lee, Jonq-Ying; Mark G. Brown

    1989-01-01

    In this paper, consumer demand for food diversity is measured by the entropy and Simpson indices for budget shares. Results show that consumer demand for food diversity is related to total food expenditures and household size and composition.
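
    For concreteness, a small sketch of the two diversity measures named in the record, computed over a household's food budget shares, is shown below; the shares are made up, and the Simpson index is taken here in its 1 - sum(s^2) form (conventions vary).

```python
# Sketch: entropy and Simpson diversity indices over food budget shares.
# The budget shares below are invented; only the index formulas matter here.
import math

def entropy_index(shares):
    """Shannon entropy of the budget shares (higher = more diverse basket)."""
    return -sum(s * math.log(s) for s in shares if s > 0)

def simpson_index(shares):
    """1 - sum(s^2): higher means a less concentrated, more diverse food basket."""
    return 1.0 - sum(s * s for s in shares)

shares = [0.30, 0.25, 0.20, 0.15, 0.10]   # shares of the food budget by category
assert abs(sum(shares) - 1.0) < 1e-9
print("entropy:", round(entropy_index(shares), 3))
print("Simpson:", round(simpson_index(shares), 3))
```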

  3. Space station automation study. Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    Science.gov (United States)

    1984-01-01

    The two manufacturing concepts developed represent innovative, technologically advanced manufacturing schemes. The concepts were selected to facilitate an in depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, and artificial intelligence. While the cost effectiveness of these facilities has not been analyzed as part of this study, both appear entirely feasible for the year 2000 timeframe. The growing demand for high quality gallium arsenide microelectronics may warrant the ventures.

  4. Bioeconomic Analysis of Pesticide Demand

    OpenAIRE

    Moffitt, L. Joe; Farnsworth, Richard L.

    1981-01-01

    The ability of insects to develop resistance to specific pesticides affects pesticide demand. However, the effect of resistance on demand cannot be observed or measured. This analysis substitutes an expression for the unobserved resistance variable in a pesticide demand model and then illustrates the model's potential by estimating demand for DDT. To arrive at the expression characterizing the unobserved resistance variable, a biological resistance model is constructed and then incorporated into t...

  5. Rewarding yet demanding

    DEFF Research Database (Denmark)

    Bjørkedal, S T B; Torsting, A M B; Møller, T

    2016-01-01

    AIM: The purpose of this study, by exploring client perspectives, was to achieve a better understanding of how people with schizophrenia experience an occupational therapy intervention designed to enable them to carry out meaningful occupations in the early phases of recovery. METHOD: A qualitative ..., as demanding. Participants valued engaging in real-life occupations while anchoring new strategies, but also the occupational therapist's role in dealing with failure. Participants felt the intervention assisted in their recovery process and enabled them to engage in meaningful occupations. CONCLUSION: The study provided unique insight into how participants experienced a client-centred partnership with an occupational therapist in the early phases of recovery. The intervention was feasible and supported the participants' recovery process.

  6. Physical demands in worklife.

    Science.gov (United States)

    Astrand, I

    1988-01-01

    Industrial occupations which are physically strenuous in the traditional sense of the word have decreased in number. They have partly been replaced by "light," repetitive, monotonous work tasks performed in a sitting position. The number of heavy work tasks within the service sector has increased. Specialization has been intensified. The individual's capacity for strenuous work is still of importance to successful work performance. Many studies show that an optional choice of work pace in physically demanding occupational work results in an adaptation of pace or intensity until the worker is utilizing 40-50% of her or his capacity. When the work rate is constrained, the relative strain of the individual varies inversely with the physical work capacity. The frequency of musculoskeletal disorders has concurrently increased with the implementation of industrial mechanization. New, wise, ergonomic moves are needed to stop this development.

  7. Policy challenges of increasing automation in driving

    Directory of Open Access Journals (Sweden)

    Ata M. Khan

    2012-03-01

    Full Text Available The convergence of information and communication technologies (ICT) with automotive technologies has already resulted in automation features in road vehicles, and this trend is expected to continue in the future owing to consumer demand, dropping costs of components, and improved reliability. While the automation features that have taken place so far are mainly in the form of information and driver warning technologies (classified as level I, pre-2010), future developments in the medium term (level II, 2010–2025) are expected to exhibit connected cognitive vehicle features and encompass an increasing degree of automation in the form of advanced driver assistance systems. Although autonomous vehicles have been developed for research purposes and are being tested in controlled driving missions, the autonomous driving case is only a long-term (level III, 2025+) scenario. This paper contributes knowledge on technological forecasts regarding automation, policy challenges for each level of technology development and application context, and the essential instrument of cost-effectiveness for policy analysis, which enables policy decisions on the automation systems to be assessed in a consistent and balanced manner. The cost of a system per vehicle is viewed against its effectiveness in meeting policy objectives of improving safety, efficiency, mobility, convenience and reducing environmental effects. Example applications are provided that illustrate the contribution of the methodology in providing information for supporting policy decisions. Given the uncertainties in system costs as well as effectiveness, the tool for assessing policies for future generation features probabilistic and utility-theoretic analysis capability. The policy issues defined and the assessment framework enable the resolution of policy challenges while allowing worthy innovative automation in driving to enhance future road transportation.
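
    A schematic sketch of the kind of cost-effectiveness comparison such a framework supports is shown below, with Monte Carlo sampling to reflect uncertainty in both cost and effectiveness; the automation features, cost ranges and effectiveness scores are invented for illustration and are not the paper's data or method.

```python
# Sketch: probabilistic cost-effectiveness comparison of driving-automation features.
# Per-vehicle costs and effectiveness scores (an aggregated policy-objective utility
# on a 0-1 scale) are illustrative assumptions, sampled to reflect uncertainty.
import random

random.seed(0)

features = {
    # name: ((cost_low, cost_high) in USD per vehicle, (eff_low, eff_high) utility)
    "collision_warning":   ((200, 400), (0.30, 0.45)),
    "adaptive_assistance": ((600, 1200), (0.50, 0.70)),
}

def expected_ce_ratio(cost_range, eff_range, n=10_000):
    """Mean effectiveness obtained per 1000 USD of per-vehicle cost."""
    total = 0.0
    for _ in range(n):
        cost = random.uniform(*cost_range)
        eff = random.uniform(*eff_range)
        total += eff / (cost / 1000.0)
    return total / n

for name, (cost_range, eff_range) in features.items():
    print(f"{name}: {expected_ce_ratio(cost_range, eff_range):.2f} utility per 1000 USD")
```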

  8. Participatory Demand-supply Systems

    NARCIS (Netherlands)

    Rezaee, S.A.; Oey, M.A.; Nevejan, C.I.M.; Brazier, F.M.

    2015-01-01

    Introducing the notion of Participatory Demand-Supply (PDS) systems as socio-technical systems, this paper focuses on a new approach to coordinating demand and supply in dynamic environments. A participatory approach to demand and supply provides a new frame of reference for system design, for which

  9. Participatory Demand-supply Systems

    NARCIS (Netherlands)

    Rezaee, S.A.; Oey, M.A.; Nevejan, C.I.M.; Brazier, F.M.

    2015-01-01

    Introducing the notion of Participatory Demand-Supply (PDS) systems as socio-technical systems, this paper focuses on a new approach to coordinating demand and supply in dynamic environments. A participatory approach to demand and supply provides a new frame of reference for system design, for which

  10. Rebooting the human mitochondrial phylogeny: an automated and scalable methodology with expert knowledge

    Directory of Open Access Journals (Sweden)

    Mayordomo Elvira

    2011-05-01

    Full Text Available Abstract Background Mitochondrial DNA is an ideal source of information to conduct evolutionary and phylogenetic studies due to its extraordinary properties and abundance. Many insights can be gained from these, including but not limited to screening genetic variation to identify potentially deleterious mutations. However, such advances require efficient solutions to very difficult computational problems, a need that is hampered by the sheer abundance of data that confers strength to the analysis. Results We develop a systematic, automated methodology to overcome these difficulties, building from readily available, public sequence databases to high-quality alignments and phylogenetic trees. Within each stage in an autonomous workflow, outputs are carefully evaluated and outlier detection rules defined to integrate expert knowledge and automated curation, hence avoiding the manual bottleneck found in past approaches to the problem. Using these techniques, we have performed exhaustive updates to the human mitochondrial phylogeny, illustrating the power and computational scalability of our approach, and we have conducted some initial analyses on the resulting phylogenies. Conclusions The problem at hand demands careful definition of inputs and adequate algorithmic treatment for its solutions to be realistic and useful. It is possible to define formal rules to address the former requirement by refining inputs directly and through their combination as outputs, and the latter are also of help to ascertain the performance of chosen algorithms. Rules can exploit known or inferred properties of datasets to simplify inputs through partitioning, therefore cutting computational costs and affording work on rapidly growing, otherwise intractable datasets. Although expert guidance may be necessary to assist the learning process, low-risk results can be fully automated and have proved themselves convenient and valuable.
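
    As an illustration of the "outlier detection rules" mentioned above, the sketch below applies simple formal rules to partition sequence records between pipeline stages; the rule thresholds and record fields are invented for illustration and are not those of the study's actual workflow.

```python
# Sketch: rule-based filtering of sequence records between pipeline stages.
# The thresholds and record structure are illustrative assumptions only.

RULES = [
    # Human mtDNA is roughly 16.6 kb, so a plausible length window is one simple rule.
    ("length_in_range", lambda r: 16_000 <= len(r["seq"]) <= 17_000),
    ("few_ambiguous_bases", lambda r: r["seq"].upper().count("N") / len(r["seq"]) < 0.02),
]

def partition(records):
    """Split records into (accepted_ids, flagged) where flagged pairs an id with its failed rules."""
    accepted, flagged = [], []
    for rec in records:
        failed = [name for name, rule in RULES if not rule(rec)]
        if failed:
            flagged.append((rec["id"], failed))   # held back for expert review
        else:
            accepted.append(rec["id"])            # passed all rules, proceeds automatically
    return accepted, flagged

records = [
    {"id": "sample_1", "seq": "ACGT" * 4150},     # 16,600 bp, no ambiguous bases
    {"id": "sample_2", "seq": "ACGN" * 4150},     # 25% N content -> flagged
]
accepted, flagged = partition(records)
print("accepted:", accepted)
print("flagged :", flagged)
```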

  11. A Framework for the Automation of Air Defence Systems

    NARCIS (Netherlands)

    Choenni, R.S.; Leijnse, C.

    1999-01-01

    The need for more efficiency in military organizations is growing. It is expected that a significant increase in efficiency can be obtained by an integration of communication and information technology. This integration may result in (sub)systems that are fully automated, i.e., systems that are unmanned.

  12. Automated Segmentation of the Choroid from Clinical SD-OCT

    OpenAIRE

    Zhang, Li; Lee, Kyungmoo; Niemeijer, Meindert; Mullins, Robert F.; Sonka, Milan; Michael D Abràmoff

    2012-01-01

    Aging and eye disease changes the choroid, but imaging it is hard. We describe a fully automated, highly reproducible, 3D method for segmentation of choroidal vessels, and quantification of choroidal and choriocapillaris-equivalent thickness, in standard clinical SD-OCT.

  13. An automated scanning system for particle physics and medical applications

    Energy Technology Data Exchange (ETDEWEB)

    De Lellis, Giovanni [Dipartimento di Fisica, Universita ' Federico II' di Napoli, Complesso Universitario Monte Sant' Angelo, via Cintia, 80126 Naples (Italy)], E-mail: giovanni.de.lellis@cern.ch

    2007-10-01

    In this paper we present the performance of a fully automated microscope aimed at very precise spatial and angular measurement with the nuclear emulsion technology. We show in particular its application to the study of the fragmentation of carbon ions used in the oncological hadrontherapy.

  14. Progress report on a fully automatic Gas Tungsten Arc Welding (GTAW) system development

    Energy Technology Data Exchange (ETDEWEB)

    Daumeyer, G.J. III

    1994-12-01

    A plan to develop a fully automatic gas tungsten arc welding (GTAW) system that will utilize a vision-sensing computer (which will provide in-process feedback control) is presently under way. Evaluations of different technological aspects and system design requirements continue. This report summarizes major activities in the plan's successful progress. The technological feasibility of producing the fully automated GTAW system has been proven. The goal of this process development project is to provide a production-ready system within the shortest reasonable time frame.

  15. Chef infrastructure automation cookbook

    CERN Document Server

    Marschall, Matthias

    2013-01-01

    Chef Infrastructure Automation Cookbook contains practical recipes on everything you will need to automate your infrastructure using Chef. The book is packed with illustrated code examples to automate your server and cloud infrastructure. The book first shows you the simplest way to achieve a certain task. Then it explains every step in detail, so that you can build your knowledge about how things work. Eventually, the book shows you additional things to consider for each approach. That way, you can learn step-by-step and build profound knowledge on how to go about your configuration management

  16. Automated PCB Inspection System

    Directory of Open Access Journals (Sweden)

    Syed Usama BUKHARI

    2017-05-01

    Full Text Available Development of an automated PCB inspection system that meets the needs of industry is a challenging task. In this paper a case study is presented to exhibit a proposed system for the migration from a manual PCB inspection system to an automated PCB inspection system, with minimal intervention in the existing production flow, for a leading automotive manufacturing company. A detailed design of the system, based on computer vision, followed by testing and analysis was proposed in order to aid the manufacturer in the process of automation.

  17. Opportunities, Barriers and Actions for Industrial Demand Response in California

    Energy Technology Data Exchange (ETDEWEB)

    McKane, Aimee T.; Piette, Mary Ann; Faulkner, David; Ghatikar, Girish; Radspieler Jr., Anthony; Adesola, Bunmi; Murtishaw, Scott; Kiliccote, Sila

    2008-01-31

    In 2006 the Demand Response Research Center (DRRC) formed an Industrial Demand Response Team to investigate opportunities and barriers to implementation of Automated Demand Response (Auto-DR) systems in California industries. Auto-DR is an open, interoperable communications and technology platform designed to: Provide customers with automated, electronic price and reliability signals; Provide customers with capability to automate customized DR strategies; Automate DR, providing utilities with dispatchable operational capability similar to conventional generation resources. This research began with a review of previous Auto-DR research on the commercial sector. Implementing Auto-DR in industry presents a number of challenges, both practical and perceived. Some of these include: the variation in loads and processes across and within sectors, resource-dependent loading patterns that are driven by outside factors such as customer orders or time-critical processing (e.g. tomato canning), the perceived lack of control inherent in the term 'Auto-DR', and aversion to risk, especially unscheduled downtime. While industry has demonstrated a willingness to temporarily provide large sheds and shifts to maintain grid reliability and be a good corporate citizen, the drivers for widespread Auto-DR will likely differ. Ultimately, most industrial facilities will balance the real and perceived risks associated with Auto-DR against the potential for economic gain through favorable pricing or incentives. Auto-DR, as with any ongoing industrial activity, will need to function effectively within market structures. The goal of the industrial research is to facilitate deployment of industrial Auto-DR that is economically attractive and technologically feasible. Automation will make DR: More visible by providing greater transparency through two-way end-to-end communication of DR signals from end-use customers; More repeatable, reliable, and persistent because the automated
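
    As a schematic illustration of the Auto-DR idea described above (automated electronic price or reliability signals triggering pre-programmed shed strategies, with an operator opt-out), the sketch below maps an incoming price signal to a facility's customized strategy; the signal format, price thresholds and shed actions are all assumptions for illustration, not the actual Auto-DR communications specification.

```python
# Sketch: map an automated price signal to a pre-programmed industrial shed strategy.
# The signal fields, thresholds and actions are illustrative assumptions only; they do
# not represent the real Auto-DR/OpenADR message formats.

SHED_STRATEGIES = {
    "normal":   [],
    "moderate": ["raise chiller setpoint 2C", "dim non-process lighting 30%"],
    "high":     ["raise chiller setpoint 4C", "pause non-critical batch processes"],
}

def classify_price(price_usd_per_kwh):
    if price_usd_per_kwh >= 0.30:
        return "high"
    if price_usd_per_kwh >= 0.15:
        return "moderate"
    return "normal"

def handle_signal(signal, opt_out=False):
    """Return the list of automated actions for one DR event signal."""
    if opt_out:                      # facility manager overrides this particular event
        return []
    return SHED_STRATEGIES[classify_price(signal["price_usd_per_kwh"])]

event = {"start": "2008-07-15T14:00", "price_usd_per_kwh": 0.32}
print(handle_signal(event))                 # automated response
print(handle_signal(event, opt_out=True))   # operator opted out for this event
```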

  18. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    Directory of Open Access Journals (Sweden)

    Colin J Torney

    Full Text Available Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future.
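
    A small sketch of the evaluation described above, comparing per-image errors and total-count bias between an automated detector and a human counter, is shown below; the counts are fabricated solely to show the bookkeeping, not data from the study.

```python
# Sketch: compare automated per-image animal counts against a reference count.
# The counts are invented to illustrate the error metrics only.

reference = [12, 0, 7, 33, 5]    # reference animals per image (e.g., consensus count)
automated = [11, 1, 7, 36, 4]    # detector output per image
human     = [12, 0, 6, 30, 5]    # one human counter

def per_image_mae(counts, ref):
    """Mean absolute per-image error."""
    return sum(abs(c - r) for c, r in zip(counts, ref)) / len(ref)

def total_bias(counts, ref):
    """Total-count bias: positive means a systematic over-count."""
    return sum(counts) - sum(ref)

for name, counts in [("automated", automated), ("human", human)]:
    print(f"{name}: per-image MAE={per_image_mae(counts, reference):.2f}, "
          f"total bias={total_bias(counts, reference):+d}")
```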

  19. Automated Generalisation Within NMAs in 2016

    Science.gov (United States)

    Stoter, Jantien; van Altena, Vincent; Post, Marc; Burghardt, Dirk; Duchêne, Cecile

    2016-06-01

    Producing maps and geo-data at different scales is traditionally one of the main tasks of National (and regional) Mapping Agencies (NMAs). The derivation of low-scale maps (i.e. with less detail) from large-scale maps (with more detail), i.e. generalisation, used to be a manual task of cartographers. With the need for more up-to-date data as well as the development of automated generalisation solutions in both research and industry, NMAs are implementing automated generalisation production lines. To exchange experiences and identify remaining issues, a workshop was organised at the end of 2015 by the Commission on Generalisation and Multirepresentation of the International Cartographic Association and the Commission on Modelling and Processing of the European Spatial Data Research. This paper reports on the workshop outcomes. It shows that most NMAs have implemented some form of automation in their workflows, varying from generalisation of certain features within an otherwise manual workflow, through semi-automated editing and generalisation, to a fully automated procedure.

  20. Building Extraction from Remote Sensing Data Using Fully Convolutional Networks

    Science.gov (United States)

    Bittner, K.; Cui, S.; Reinartz, P.

    2017-05-01

    Building detection and footprint extraction are highly demanded for many remote sensing applications. Though most previous works have shown promising results, the automatic extraction of building footprints still remains a nontrivial topic, especially in complex urban areas. Recently developed extensions of the CNN framework made it possible to perform dense pixel-wise classification of input images. Based on these abilities we propose a methodology which automatically generates a full resolution binary building mask out of a Digital Surface Model (DSM) using a Fully Convolutional Network (FCN) architecture. The advantage of using the depth information is that it provides geometrical silhouettes and allows a better separation of buildings from background, as well as being invariant to illumination and color variations. The proposed framework has mainly two steps. Firstly, the FCN is trained on a large set of patches consisting of normalized DSMs (nDSM) as inputs and available ground truth building masks as target outputs. Secondly, the generated predictions from the FCN are used as unary terms for a fully connected Conditional Random Field (FCRF), which enables us to create a final binary building mask. A series of experiments demonstrate that our methodology is able to extract accurate building footprints which are close to the buildings' original shapes to a high degree. The quantitative and qualitative analysis shows the significant improvement of the results in contrast to the multi-layer fully connected network from our previous work.
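
    To make the architecture concrete, here is a minimal fully convolutional segmentation sketch in PyTorch that maps a single-channel nDSM patch to per-pixel building probabilities; the layer widths are arbitrary, this is not the network used in the paper, and the fully connected CRF refinement stage is omitted.

```python
# Sketch: a minimal fully convolutional network predicting a binary building mask
# from a 1-channel nDSM patch. Layer widths are arbitrary illustrative choices;
# the CRF post-processing described in the record is not included here.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                             # 1/2 resolution
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),                             # 1/4 resolution
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
        )
        self.decoder = nn.Sequential(                    # upsample back to full resolution
            nn.ConvTranspose2d(64, 32, kernel_size=2, stride=2), nn.ReLU(inplace=True),
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2), nn.ReLU(inplace=True),
            nn.Conv2d(16, 1, kernel_size=1),             # per-pixel building logit
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))

model = TinyFCN()
ndsm_patch = torch.randn(1, 1, 128, 128)                 # batch of one 128x128 nDSM patch
building_prob = torch.sigmoid(model(ndsm_patch))          # unary probabilities (e.g., for a CRF)
print(building_prob.shape)                                # torch.Size([1, 1, 128, 128])
```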