WorldWideScience

Sample records for fully automated demand

  1. Development and evaluation of fully automated demand response in large facilities

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Sezgen, Osman; Watson, David S.; Motegi, Naoya; Shockman, Christine; ten Hope, Laurie

    2004-03-30

    This report describes the results of a research project to develop and evaluate the performance of new Automated Demand Response (Auto-DR) hardware and software technology in large facilities. Demand Response (DR) is a set of activities to reduce or shift electricity use to improve electric grid reliability, manage electricity costs, and ensure that customers receive signals that encourage load reduction during times when the electric grid is near its capacity. The two main drivers for widespread demand responsiveness are the prevention of future electricity crises and the reduction of electricity prices. Additional goals for price responsiveness include equity through cost of service pricing, and customer control of electricity usage and bills. The technology developed and evaluated in this report could be used to support numerous forms of DR programs and tariffs. For the purpose of this report, we have defined three levels of Demand Response automation. Manual Demand Response involves manually turning off lights or equipment; this can be a labor-intensive approach. Semi-Automated Response involves the use of building energy management control systems for load shedding, where a preprogrammed load shedding strategy is initiated by facilities staff. Fully-Automated Demand Response is initiated at a building or facility through receipt of an external communications signal: facility staff set up a pre-programmed load shedding strategy which is automatically initiated by the system without the need for human intervention. We have defined this approach to be Auto-DR. An important concept in Auto-DR is that a facility manager is able to "opt out" or "override" an individual DR event if it occurs at a time when the reduction in end-use services is not desirable. This project sought to improve the feasibility and nature of Auto-DR strategies in large facilities. The research focused on technology development, testing
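
    The opt-out behaviour described above can be illustrated with a short, hypothetical sketch (Python); the signal fields, shed-strategy values, and function names are invented for illustration and are not taken from the report:

      # Hypothetical sketch of the Auto-DR concept: a facility controller receives an
      # external DR signal and runs a pre-programmed shed strategy unless the facility
      # manager has opted out of ("overridden") this particular event.
      from dataclasses import dataclass
      from typing import Optional

      @dataclass
      class DRSignal:
          event_id: str
          level: str            # e.g. "moderate" or "high" (assumed levels)

      # Pre-programmed shed strategies configured once by facility staff
      # (setpoint offsets and lighting reduction fractions are assumed values).
      SHED_STRATEGIES = {
          "moderate": {"zone_setpoint_offset_F": 2, "lighting_reduction": 0.15},
          "high":     {"zone_setpoint_offset_F": 4, "lighting_reduction": 0.30},
      }

      def handle_dr_signal(signal: DRSignal, opted_out_events: set) -> Optional[dict]:
          """Return shed actions for the EMCS, or None if the manager opted out."""
          if signal.event_id in opted_out_events:
              return None                      # facility manager overrode this event
          return SHED_STRATEGIES.get(signal.level)

      opted_out = {"evt-2004-07-12"}           # events the manager chose to override
      print(handle_dr_signal(DRSignal("evt-2004-08-03", "high"), opted_out))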

  2. Participation through Automation: Fully Automated Critical PeakPricing in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David S.; Motegi, Naoya; Kiliccote,Sila; Linkugel, Eric

    2006-06-20

    California electric utilities have been exploring the use of dynamic critical peak prices (CPP) and other demand response programs to help reduce peaks in customer electric loads. CPP is a tariff designed to promote demand response. Levels of automation in DR can be defined as follows: Manual Demand Response involves a potentially labor-intensive approach such as manually turning off or changing comfort set points at each equipment switch or controller. Semi-Automated Demand Response involves a pre-programmed demand response strategy initiated by a person via a centralized control system. Fully Automated Demand Response does not involve human intervention, but is initiated at a home, building, or facility through receipt of an external communications signal. The receipt of the external signal initiates pre-programmed demand response strategies. This approach is referred to as Auto-DR. This paper describes the development, testing, and results from automated CPP (Auto-CPP) as part of a utility project in California. The paper presents the project description and test methodology. This is followed by a discussion of Auto-DR strategies used in the field test buildings. A sample Auto-CPP load shape case study is presented, along with a selection of the Auto-CPP response data from September 29, 2005. If all twelve sites reached their maximum savings simultaneously, a total of approximately 2 MW of DR would be available from these twelve sites, which represent about two million ft². The average DR was about half that value, at about 1 MW. These savings translate to about 0.5 to 1.0 W/ft² of demand reduction. Field demonstrations and economic evaluations are continuing in order to pursue increasing penetration of automated DR, which has demonstrated the ability to provide a valuable DR resource for California.
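
    The intensity figures quoted at the end of the abstract follow from simple arithmetic over the reported totals; a minimal sketch in Python using the rounded values from the abstract:

      # Back-of-envelope check of the demand-reduction intensities quoted above.
      floor_area_ft2 = 2_000_000      # ~2 million ft2 across the twelve sites
      max_shed_w = 2_000_000          # ~2 MW if all sites shed simultaneously
      avg_shed_w = 1_000_000          # ~1 MW average observed DR

      print(max_shed_w / floor_area_ft2)   # 1.0 W/ft2 (upper end of the quoted range)
      print(avg_shed_w / floor_area_ft2)   # 0.5 W/ft2 (lower end)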

  3. Open Automated Demand Response Communications Specification (Version 1.0)

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Koch, Ed; Hennage, Dan; Palensky, Peter; McParland, Charles

    2009-02-28

    The development of the Open Automated Demand Response Communications Specification, also known as OpenADR or Open Auto-DR, began in 2002 following the California electricity crisis. The work has been carried out by the Demand Response Research Center (DRRC), which is managed by Lawrence Berkeley National Laboratory. This specification describes an open standards-based communications data model designed to facilitate sending and receiving demand response price and reliability signals from a utility or Independent System Operator to electric customers. OpenADR is one element of the Smart Grid information and communications technologies that are being developed to improve optimization between electric supply and demand. The intention of the open automated demand response communications data model is to provide interoperable signals to building and industrial control systems that are preprogrammed to take action based on a demand response signal, enabling a demand response event to be fully automated, with no manual intervention. The OpenADR specification is a flexible infrastructure to facilitate common information exchange between the utility or Independent System Operator and end-use participants. The concept of an open specification is intended to allow anyone to implement the signaling systems, the automation server or the automation clients.
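
    As a rough illustration of the kind of price or reliability signal such a data model carries, the Python sketch below shows a simplified event message and a client mapping it to a pre-programmed strategy; the field names, levels, and mapping are invented for illustration and do not reproduce the actual OpenADR 1.0 schema:

      # Simplified, invented example of a DR event message a server might publish and a
      # building client might poll; NOT the actual OpenADR 1.0 schema.
      dr_event = {
          "event_id": "evt-001",
          "signal_type": "price_level",            # price or reliability signal
          "signal_value": "moderate",              # pre-agreed level the client maps to a strategy
          "start_time": "2009-07-15T14:00:00-07:00",
          "duration_minutes": 180,
      }

      def client_action(event: dict) -> str:
          # The client is pre-programmed: it maps signal levels to local shed strategies,
          # so the event runs with no manual intervention.
          mapping = {"normal": "no action", "moderate": "shed stage 1", "high": "shed stage 2"}
          return mapping.get(event["signal_value"], "no action")

      print(client_action(dr_event))               # -> shed stage 1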

  4. Demands on digital automation; Anforderungen an die Digitale Automation

    Energy Technology Data Exchange (ETDEWEB)

    Bieler, P.

    1995-12-31

    In chapter 12 of the anthology on building control, the demands on digital automation are presented. The following aspects are discussed: the variety of company philosophies, the demands of clients/investors, the demands arising from building/room use, the perspective of the operators, and the point of view of those who construct the technical building services installations. (BWI)

  5. Developments towards a fully automated AMS system

    International Nuclear Information System (INIS)

    Steier, P.; Puchegger, S.; Golser, R.; Kutschera, W.; Priller, A.; Rom, W.; Wallner, A.; Wild, E.

    2000-01-01

    The possibilities of computer-assisted and automated accelerator mass spectrometry (AMS) measurements were explored. The goal of these efforts is to develop fully automated procedures for 'routine' measurements at the Vienna Environmental Research Accelerator (VERA), a dedicated 3-MV Pelletron tandem AMS facility. As a new tool for automatic tuning of the ion optics we developed a multi-dimensional optimization algorithm robust to noise, which was applied for 14C and 10Be. The actual isotope ratio measurements are performed in a fully automated fashion and do not require the presence of an operator. Incoming data are evaluated online and the results can be accessed via Internet. The system was used for 14C, 10Be, 26Al and 129I measurements
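
    The abstract mentions a multi-dimensional optimization algorithm robust to noise for automatic tuning of the ion optics. The Python sketch below shows one generic way such a tuner can be structured (averaging repeated noisy readings inside a coordinate search with a shrinking step); it is a schematic illustration with a toy objective, not the VERA algorithm:

      import random

      def measure_transmission(settings, noise=0.02):
          """Stand-in for a noisy beam-transmission reading at given lens/steerer settings."""
          ideal = 1.0 - sum((s - 0.5) ** 2 for s in settings)   # toy objective, peak at 0.5
          return ideal + random.gauss(0.0, noise)

      def averaged(settings, repeats=5):
          # Averaging repeated readings is one simple way to gain robustness to noise.
          return sum(measure_transmission(settings) for _ in range(repeats)) / repeats

      def coordinate_search(settings, step=0.1, shrink=0.5, rounds=4):
          """Tune one parameter at a time, shrinking the step size each round."""
          best = averaged(settings)
          for _ in range(rounds):
              for i in range(len(settings)):
                  for delta in (+step, -step):
                      trial = list(settings)
                      trial[i] += delta
                      score = averaged(trial)
                      if score > best:
                          settings, best = trial, score
              step *= shrink
          return settings, best

      print(coordinate_search([0.2, 0.8, 0.4]))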

  6. An international crowdsourcing study into people's statements on fully automated driving

    NARCIS (Netherlands)

    Bazilinskyy, P.; Kyriakidis, M.; de Winter, J.C.F.; Ahram, Tareq; Karwowski, Waldemar; Schmorrow, Dylan

    2015-01-01

    Fully automated driving can potentially provide enormous benefits to society. However, it has been unclear whether people will appreciate such far-reaching technology. This study investigated anonymous textual comments regarding fully automated driving, based on data extracted from three online

  7. Fully automated MRI-guided robotics for prostate brachytherapy

    International Nuclear Information System (INIS)

    Stoianovici, D.; Vigaru, B.; Petrisor, D.; Muntener, M.; Patriciu, A.; Song, D.

    2008-01-01

    The uncertainties encountered in the deployment of brachytherapy seeds are related to the commonly used ultrasound imager and the basic instrumentation used for the implant. An alternative solution is under development in which a fully automated robot is used to place the seeds according to the dosimetry plan under direct MRI guidance. Incorporation of MRI guidance creates potential for physiological and molecular image-guided therapies. Moreover, MRI-guided brachytherapy also enables re-estimation of dosimetry during the procedure, because the seeds already implanted can be localised with MRI. An MRI-compatible robot (MrBot) was developed. The robot is designed for transperineal percutaneous prostate interventions, and customised for fully automated MRI-guided brachytherapy. With different end-effectors, the robot applies to other image-guided interventions of the prostate. The robot is constructed of non-magnetic and dielectric materials and is electricity-free, using pneumatic actuation and optic sensing. A new motor (PneuStep) was purposely developed to set this robot in motion. The robot fits alongside the patient in closed-bore MRI scanners. It is able to stay fully operational during MR imaging without deteriorating the quality of the scan. In vitro, cadaver, and animal tests showed millimetre needle-targeting accuracy and very precise seed placement. The robot was tested without any interference up to 7 T. The robot is the first fully automated robot to function in MRI scanners. Its first application is MRI-guided seed brachytherapy. It is capable of automated, highly accurate needle placement. Extensive testing is in progress prior to clinical trials. Preliminary results show that the robot may become a useful image-guided intervention instrument. (author)

  8. Findings from Seven Years of Field Performance Data for Automated Demand Response in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Mathieu, Johanna; Parrish, Kristen

    2010-05-14

    California is a leader in automating demand response (DR) to promote low-cost, consistent, and predictable electric grid management tools. Over 250 commercial and industrial facilities in California participate in fully-automated programs providing over 60 MW of peak DR savings. This paper presents a summary of Open Automated DR (OpenADR) implementation by each of the investor-owned utilities in California. It provides a summary of participation, DR strategies and incentives. Commercial buildings can reduce peak demand from 5 to 15 percent with an average of 13 percent. Industrial facilities shed much higher loads. For buildings with multi-year savings we evaluate their load variability and shed variability. We provide a summary of control strategies deployed, along with costs to install automation. We report on how the electric DR control strategies perform over many years of events. We benchmark the peak demand of this sample of buildings against their past baselines to understand the differences in building performance over the years. This is done with peak demand intensities and load factors. The paper also describes the importance of these data in helping to understand possible techniques to reach net zero energy using peak day dynamic control capabilities in commercial buildings. We present an example in which the electric load shape changed as a result of a lighting retrofit.
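
    The benchmarking metrics mentioned above (peak demand intensity and load factor) can be computed directly from interval meter data; a minimal Python sketch with made-up numbers, using the usual definitions (peak W per ft2 and average-to-peak ratio):

      # Sketch of the benchmarking metrics from 15-minute interval demand data
      # (numbers are illustrative, not from the paper).
      demand_kw = [120, 135, 150, 310, 420, 455, 430, 390, 260, 180]   # subset of one day
      floor_area_ft2 = 50_000

      peak_kw = max(demand_kw)
      avg_kw = sum(demand_kw) / len(demand_kw)

      peak_intensity_w_per_ft2 = peak_kw * 1000 / floor_area_ft2   # peak demand intensity
      load_factor = avg_kw / peak_kw                               # average-to-peak ratio

      print(f"peak intensity: {peak_intensity_w_per_ft2:.1f} W/ft2, load factor: {load_factor:.2f}")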

  9. Home Network Technologies and Automating Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    McParland, Charles

    2009-12-01

    Over the past several years, interest in large-scale control of peak energy demand and total consumption has increased. While motivated by a number of factors, this interest has primarily been spurred, on the demand side, by the increasing cost of energy and, on the supply side, by the limited ability of utilities to build sufficient electricity generation capacity to meet unrestrained future demand. To address peak electricity use, Demand Response (DR) systems are being proposed to motivate reductions in electricity use through the use of price incentives. DR systems are also being designed to shift or curtail energy demand at critical times when the generation, transmission, and distribution systems (i.e. the 'grid') are threatened with instabilities. To be effectively deployed on a large scale, these proposed DR systems need to be automated. Automation will require robust and efficient data communications infrastructures across geographically dispersed markets. The present availability of widespread Internet connectivity and inexpensive, reliable computing hardware, combined with the growing confidence in the capabilities of distributed, application-level communications protocols, suggests that now is the time for designing and deploying practical systems. Centralized computer systems that are capable of providing continuous signals to automate customers' reduction of power demand are known as Demand Response Automation Servers (DRAS). The deployment of prototype DRAS systems has already begun - with most initial deployments targeting large commercial and industrial (C & I) customers. An examination of the current overall energy consumption by economic sector shows that the C & I market is responsible for roughly half of all energy consumption in the US. On a per customer basis, large C & I customers clearly have the most to offer - and to gain - by participating in DR programs to reduce peak demand. And, by concentrating on a small number of relatively
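
    A DRAS, as described above, provides continuous signals that clients poll and act on automatically. The Python sketch below shows the general shape of such a polling client; the endpoint, mode names, and timing are invented, and the network call is stubbed out so the example stays self-contained:

      import time

      # Toy sketch of a client polling a Demand Response Automation Server (DRAS) for
      # the current operating mode; URL and mode names are hypothetical.
      DRAS_URL = "https://dras.example.org/api/mode"   # hypothetical endpoint

      def fetch_mode() -> str:
          # In a real deployment this would be an HTTPS request to the DRAS;
          # here it is stubbed out so the sketch runs on its own.
          return "moderate"

      def main(poll_seconds: int = 60, cycles: int = 3) -> None:
          for _ in range(cycles):
              mode = fetch_mode()
              if mode != "normal":
                  print(f"DR mode '{mode}' received -> trigger pre-programmed shed strategy")
              time.sleep(0)   # use poll_seconds here in a real loop; 0 keeps the demo instant

      if __name__ == "__main__":
          main()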

  10. Cardiac imaging: working towards fully-automated machine analysis & interpretation.

    Science.gov (United States)

    Slomka, Piotr J; Dey, Damini; Sitek, Arkadiusz; Motwani, Manish; Berman, Daniel S; Germano, Guido

    2017-03-01

    Non-invasive imaging plays a critical role in managing patients with cardiovascular disease. Although subjective visual interpretation remains the clinical mainstay, quantitative analysis facilitates objective, evidence-based management, and advances in clinical research. This has driven developments in computing and software tools aimed at achieving fully automated image processing and quantitative analysis. In parallel, machine learning techniques have been used to rapidly integrate large amounts of clinical and quantitative imaging data to provide highly personalized individual patient-based conclusions. Areas covered: This review summarizes recent advances in automated quantitative imaging in cardiology and describes the latest techniques which incorporate machine learning principles. The review focuses on the cardiac imaging techniques which are in wide clinical use. It also discusses key issues and obstacles for these tools to become utilized in mainstream clinical practice. Expert commentary: Fully-automated processing and high-level computer interpretation of cardiac imaging are becoming a reality. Application of machine learning to the vast amounts of quantitative data generated per scan and integration with clinical data also facilitates a move to more patient-specific interpretation. These developments are unlikely to replace interpreting physicians but will provide them with highly accurate tools to detect disease, risk-stratify, and optimize patient-specific treatment. However, with each technological advance, we move further from human dependence and closer to fully-automated machine interpretation.

  11. Intention to use a fully automated car: attitudes and a priori acceptability

    OpenAIRE

    PAYRE, William; CESTAC, Julien; DELHOMME, Patricia

    2014-01-01

    While previous research has studied the acceptability of partially or highly automated driving, few studies have focused on fully automated driving (FAD), including the ability to master longitudinal control, lateral control and maneuvers. The present study analyzes a priori acceptability, attitudes, personality traits and intention to use a fully automated vehicle. 421 French drivers (153 males, M = 40.2 years, age range 19-73) answered an online questionnaire. 68.1% of the sample a priori accepted FAD. P...

  12. Fully automated segmentation of callus by micro-CT compared to biomechanics.

    Science.gov (United States)

    Bissinger, Oliver; Götz, Carolin; Wolff, Klaus-Dietrich; Hapfelmeier, Alexander; Prodinger, Peter Michael; Tischer, Thomas

    2017-07-11

    A high percentage of closed femur fractures have slight comminution. Using micro-CT (μCT), multiple fragment segmentation is much more difficult than segmentation of unfractured or osteotomied bone. Manual or semi-automated segmentation has been performed to date. However, such segmentation is extremely laborious, time-consuming and error-prone. Our aim was therefore to apply a fully automated segmentation algorithm to determine μCT parameters and examine their association with biomechanics. The femora of 64 rats, randomised to medication that was inhibitory or neutral with respect to fracture healing, or to controls, were subjected to closed fracture after a Kirschner wire was inserted. After 21 days, μCT and biomechanical parameters were determined by a fully automated method and correlated (Pearson's correlation). The fully automated segmentation algorithm automatically detected bone and simultaneously separated cortical bone from callus without requiring ROI selection for each single bony structure. We found an association of structural callus parameters obtained by μCT with the biomechanical properties. However, the results were only explicable by additionally considering the callus location. A large number of slightly comminuted fractures, in combination with therapies that influence the callus qualitatively and/or quantitatively, considerably affects the association between μCT and biomechanics. In the future, contrast-enhanced μCT imaging of the callus cartilage might provide more information to improve the non-destructive and non-invasive prediction of callus mechanical properties. As studies evaluating such important drugs increase, fully automated segmentation appears to be clinically important.
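
    The statistical step named above (Pearson's correlation between μCT-derived callus parameters and biomechanical properties) looks like the following in Python; the values are illustrative placeholders, not data from the study:

      # Associating a μCT-derived callus parameter with a biomechanical outcome
      # via Pearson's correlation; numbers are invented for illustration.
      from scipy.stats import pearsonr

      callus_volume_mm3 = [42.1, 55.3, 38.7, 61.0, 47.9, 52.4]   # hypothetical μCT parameter
      max_torque_nmm    = [310,  405,  290,  450,  350,  380]    # hypothetical biomechanics

      r, p = pearsonr(callus_volume_mm3, max_torque_nmm)
      print(f"Pearson r = {r:.2f}, p = {p:.3f}")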

  13. A Fully Automated High-Throughput Zebrafish Behavioral Ototoxicity Assay.

    Science.gov (United States)

    Todd, Douglas W; Philip, Rohit C; Niihori, Maki; Ringle, Ryan A; Coyle, Kelsey R; Zehri, Sobia F; Zabala, Leanne; Mudery, Jordan A; Francis, Ross H; Rodriguez, Jeffrey J; Jacob, Abraham

    2017-08-01

    Zebrafish animal models lend themselves to behavioral assays that can facilitate rapid screening of ototoxic, otoprotective, and otoregenerative drugs. Structurally similar to human inner ear hair cells, the mechanosensory hair cells on their lateral line allow the zebrafish to sense water flow and orient head-to-current in a behavior called rheotaxis. This rheotaxis behavior deteriorates in a dose-dependent manner with increased exposure to the ototoxin cisplatin, thereby establishing itself as an excellent biomarker for anatomic damage to lateral line hair cells. Building on work by our group and others, we have built a new, fully automated high-throughput behavioral assay system that uses automated image analysis techniques to quantify rheotaxis behavior. This novel system consists of a custom-designed swimming apparatus and imaging system consisting of network-controlled Raspberry Pi microcomputers capturing infrared video. Automated analysis techniques detect individual zebrafish, compute their orientation, and quantify the rheotaxis behavior of a zebrafish test population, producing a powerful, high-throughput behavioral assay. Using our fully automated biological assay to test a standardized ototoxic dose of cisplatin against varying doses of compounds that protect or regenerate hair cells may facilitate rapid translation of candidate drugs into preclinical mammalian models of hearing loss.

  14. Opportunities for Automated Demand Response in California Agricultural Irrigation

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-01

    Pumping water for agricultural irrigation represents a significant share of California’s annual electricity use and peak demand. It also represents a large source of potential flexibility, as farms possess a form of storage in their wetted soil. By carefully modifying their irrigation schedules, growers can participate in demand response without adverse effects on their crops. This report describes the potential for participation in demand response and automated demand response by agricultural irrigators in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use in California. Typical on-farm controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Case studies of demand response programs in California and across the country are reviewed, and their results along with overall California demand estimates are used to estimate statewide demand response potential. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  15. Northwest Open Automated Demand Response Technology Demonstration Project

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann

    2009-08-01

    Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology demonstration and evaluation for Bonneville Power Administration (BPA) in Seattle City Light's (SCL) service territory. This report summarizes the process and results of deploying open automated demand response (OpenADR) in the Seattle area with winter-morning-peaking commercial buildings. The field tests were designed to evaluate the feasibility of deploying fully automated demand response (DR) in four to six sites in the winter and the savings from various building systems. The project started in November of 2008 and lasted 6 months. The methodology for the study included site recruitment, control strategy development, automation system deployment and enhancements, and evaluation of the sites' participation in DR test events. LBNL subcontracted McKinstry and Akuacom for this project. McKinstry assisted with recruitment, site survey collection, strategy development and overall participant and control vendor management. Akuacom established a new server and enhanced its operations to allow for scheduling winter morning day-of and day-ahead events. Each site signed a Memorandum of Agreement with SCL. SCL offered each site $3,000 for agreeing to participate in the study and an additional $1,000 for each event in which they participated. Each facility and their control vendor worked with LBNL and McKinstry to select and implement control strategies for DR and developed their automation based on the existing Internet connectivity and building control system. Once the DR strategies were programmed, McKinstry commissioned them before actual test events. McKinstry worked with LBNL to identify control points that could be archived at each facility. For each site LBNL collected meter data and trend logs from the energy management and control system. The communication system allowed the sites to receive day-ahead as well as day-of DR test event signals. Measurement of DR was

  16. A fully automated algorithm of baseline correction based on wavelet feature points and segment interpolation

    Science.gov (United States)

    Qian, Fang; Wu, Yihui; Hao, Peng

    2017-11-01

    Baseline correction is a very important part of pre-processing. A baseline in the spectral signal can induce uneven amplitude shifts across different wavenumbers and lead to poor results. Therefore, these amplitude shifts should be compensated before further analysis. Many algorithms are used to remove the baseline; however, fully automated baseline correction is more convenient in practical applications. A fully automated algorithm based on wavelet feature points and segment interpolation (AWFPSI) is proposed. This algorithm finds feature points through continuous wavelet transformation and estimates the baseline through segment interpolation. AWFPSI is compared with three commonly used fully automated and semi-automated algorithms, using a simulated spectrum signal, a visible spectrum signal and a Raman spectrum signal. The results show that AWFPSI gives better accuracy and has the advantage of easy use.
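
    A schematic Python sketch of the AWFPSI idea described above: feature points are found with a continuous wavelet transform, a window around each is excluded, and the baseline is estimated by interpolating over the remaining segments. The synthetic signal and parameter values are illustrative, not those used in the paper:

      import numpy as np
      from scipy.signal import find_peaks_cwt

      # Synthetic spectrum: slowly varying baseline + two peaks + noise.
      x = np.linspace(0, 100, 1000)
      baseline_true = 0.02 * x + 0.5 * np.sin(x / 30)
      peaks = 3.0 * np.exp(-(x - 30) ** 2 / 4) + 2.0 * np.exp(-(x - 70) ** 2 / 2)
      y = baseline_true + peaks + 0.02 * np.random.randn(x.size)

      peak_idx = find_peaks_cwt(y, widths=np.arange(5, 40))        # wavelet feature points
      mask = np.ones(x.size, dtype=bool)
      for i in peak_idx:                                           # exclude a window around each peak
          mask[max(0, i - 40): i + 40] = False

      baseline_est = np.interp(x, x[mask], y[mask])                # segment interpolation
      corrected = y - baseline_est
      print(f"residual baseline RMS: {np.sqrt(np.mean((baseline_est - baseline_true) ** 2)):.3f}")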

  17. Costs to Automate Demand Response - Taxonomy and Results from Field Studies and Programs

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Cheung, Iris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Li, Becky Z [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-07-31

    During the past decade, the technology to automate demand response (DR) in buildings and industrial facilities has advanced significantly. Automation allows rapid, repeatable, reliable operation. This study focuses on costs for DR automation in commercial buildings, with some discussion on residential buildings and industrial facilities. DR automation technology relies on numerous components, including communication systems, hardware and software gateways, standards-based messaging protocols, controls and integration platforms, and measurement and telemetry systems. This report compares cost data from several DR automation programs and pilot projects, evaluates trends in the cost per unit of DR and kilowatts (kW) available from automated systems, and applies a standard naming convention and classification or taxonomy for system elements. Median costs for the 56 installed automated DR systems studied here are about $200/kW. The deviation around this median is large, with costs in some cases being an order of magnitude greater or less than the median. This wide range is a result of variations in system age, size of load reduction, sophistication, and type of equipment included in the cost analysis. The costs to automate fast DR systems for ancillary services are not fully analyzed in this report because additional research is needed to determine the total cost to install, operate, and maintain these systems. However, recent research suggests that they could be developed at costs similar to those of existing hot-summer DR automation systems. This report considers installation and configuration costs and does not include the costs of owning and operating DR automation systems. Future analysis of the latter costs should include the costs to the building or facility manager as well as to the utility or third-party program manager.
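
    The $/kW figures above come from normalizing each site's automation cost by its enabled load reduction and taking the median; a minimal Python sketch with invented site data (not the 56-site dataset):

      # Cost-normalization step behind the figures quoted above: convert each site's
      # automation cost and enabled DR into $/kW and take the median across sites.
      import statistics

      sites = [                      # (installation cost in $, enabled DR in kW) - placeholders
          (12_000, 80), (45_000, 150), (6_500, 60), (150_000, 400), (30_000, 90),
      ]

      cost_per_kw = [cost / kw for cost, kw in sites]
      print(f"median: ${statistics.median(cost_per_kw):.0f}/kW, "
            f"range: ${min(cost_per_kw):.0f}-{max(cost_per_kw):.0f}/kW")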

  18. Opportunities for Automated Demand Response in California’s Dairy Processing Industry

    Energy Technology Data Exchange (ETDEWEB)

    Homan, Gregory K. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    During periods of peak electrical demand on the energy grid or when there is a shortage of supply, the stability of the grid may be compromised or the cost of supplying electricity may rise dramatically, respectively. Demand response programs are designed to mitigate the severity of these problems and improve reliability by reducing the demand on the grid during such critical times. In 2010, the Demand Response Research Center convened a group of industry experts to suggest potential industries that would be good demand response program candidates for further review. The dairy industry was suggested due to the perception that the industry had suitable flexibility and automatic controls in place. The purpose of this report is to provide an initial description of the industry with regard to demand response potential, specifically automated demand response. This report qualitatively describes the potential for participation in demand response and automated demand response by dairy processing facilities in California, as well as barriers to widespread participation. The report first describes the magnitude, timing, location, purpose, and manner of energy use. Typical process equipment and controls are discussed, as well as common impediments to participation in demand response and automated demand response programs. Two case studies of demand response at dairy facilities in California and across the country are reviewed. Finally, recommendations are made for future research that can enhance the understanding of demand response potential in this industry.

  19. Open Automated Demand Response for Small Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Dudley, June Han; Piette, Mary Ann; Koch, Ed; Hennage, Dan

    2009-05-01

    This report characterizes small commercial buildings by market segments, systems and end-uses; develops a framework for identifying demand response (DR) enabling technologies and communication means; and reports on the design and development of a low-cost OpenADR enabling technology that delivers demand reductions as a percentage of the total predicted building peak electric demand. The results show that small offices, restaurants and retail buildings are the major contributors, making up over one third of the small commercial peak demand. The majority of the small commercial buildings in California are located in southern inland areas and the central valley. Single-zone packaged units with manual and programmable thermostat controls make up the majority of heating, ventilation and air conditioning (HVAC) systems for small commercial buildings with less than 200 kW peak electric demand. Fluorescent tubes with magnetic ballasts and manual controls dominate this customer group's lighting systems. There are various ways to communicate with these systems, each with pros and cons for a particular application, and three methods to enable automated DR in small commercial buildings using the Open Automated Demand Response (OpenADR) communications infrastructure. Development of DR strategies must consider building characteristics, such as weather sensitivity and load variability, as well as system design (e.g. under-sizing, under-lighting, over-sizing). Finally, field tests show that requesting demand reductions as a percentage of the total predicted building peak electric demand is feasible using the OpenADR infrastructure.
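
    The "demand reduction as a percentage of predicted building peak" approach can be sketched in a few lines of Python; the peak prediction used here (maximum of recent daily peaks) is a simple placeholder, not the baseline method used in the report:

      # Translate a requested percentage reduction into an absolute kW shed target.
      def predicted_peak_kw(recent_daily_peaks_kw):
          # Placeholder peak prediction: the maximum of recent daily peaks.
          return max(recent_daily_peaks_kw)

      def shed_target_kw(recent_daily_peaks_kw, reduction_fraction=0.10):
          return reduction_fraction * predicted_peak_kw(recent_daily_peaks_kw)

      recent_peaks = [148.0, 155.5, 151.2, 160.3, 149.8]   # illustrative small-commercial peaks
      print(f"shed target: {shed_target_kw(recent_peaks, 0.10):.1f} kW")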

  20. Fully Automated Volumetric Modulated Arc Therapy Plan Generation for Prostate Cancer Patients

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Al-Mamgani, Abrahim; Incrocci, Luca; Heijmen, Ben J.M.

    2014-01-01

    Purpose: To develop and evaluate fully automated volumetric modulated arc therapy (VMAT) treatment planning for prostate cancer patients, avoiding manual trial-and-error tweaking of plan parameters by dosimetrists. Methods and Materials: A system was developed for fully automated generation of VMAT plans with our commercial clinical treatment planning system (TPS), linked to the in-house developed Erasmus-iCycle multicriterial optimizer for preoptimization. For 30 randomly selected patients, automatically generated VMAT plans (VMAT_auto) were compared with VMAT plans generated manually by 1 expert dosimetrist in the absence of time pressure (VMAT_man). For all treatment plans, planning target volume (PTV) coverage and sparing of organs-at-risk were quantified. Results: All generated plans were clinically acceptable and had similar PTV coverage (V95% > 99%). For VMAT_auto and VMAT_man plans, the organ-at-risk sparing was similar as well, although only the former plans were generated without any planning workload. Conclusions: Fully automated generation of high-quality VMAT plans for prostate cancer patients is feasible and has recently been implemented in our clinic
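
    The coverage criterion quoted in the results (V95% > 99%) is the fraction of planning-target-volume voxels receiving at least 95% of the prescribed dose; a minimal Python sketch with random placeholder dose values:

      # Compute V95%: the percentage of PTV voxels receiving >= 95% of the prescription.
      import numpy as np

      rng = np.random.default_rng(0)
      prescribed_gy = 78.0
      ptv_dose_gy = rng.normal(loc=79.0, scale=1.0, size=10_000)   # hypothetical PTV voxel doses

      v95 = 100.0 * np.mean(ptv_dose_gy >= 0.95 * prescribed_gy)
      print(f"V95% = {v95:.1f}% (plans in the study required > 99%)")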

  1. Opportunities for Automated Demand Response in California Wastewater Treatment Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Aghajanzadeh, Arian [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wray, Craig [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-08-30

    Previous research over a period of six years has identified wastewater treatment facilities as good candidates for demand response (DR), automated demand response (Auto-DR), and Energy Efficiency (EE) measures. This report summarizes that work, including the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy used and demand, as well as details of the wastewater treatment process. It also discusses control systems and automated demand response opportunities. Furthermore, this report summarizes the DR potential of three wastewater treatment facilities. In particular, Lawrence Berkeley National Laboratory (LBNL) has collected data at these facilities from control systems, submetered process equipment, utility electricity demand records, and governmental weather stations. The collected data were then used to generate a summary of wastewater power demand, factors affecting that demand, and demand response capabilities. These case studies show that facilities that have implemented energy efficiency measures and that have centralized control systems are well suited to shed or shift electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. In summary, municipal wastewater treatment energy demand in California is large, and energy-intensive equipment offers significant potential for automated demand response. In particular, large load reductions were achieved by targeting effluent pumps and centrifuges. One of the limiting factors to implementing demand response is the reaction of effluent turbidity to reduced aeration at an earlier stage of the process. Another limiting factor is that cogeneration capabilities of municipal facilities, including existing power purchase agreements and utility receptiveness to purchasing electricity from cogeneration facilities, limit a facility’s potential to participate in other DR activities.

  2. Fully automated processing of fMRI data in SPM: from MRI scanner to PACS.

    Science.gov (United States)

    Maldjian, Joseph A; Baer, Aaron H; Kraft, Robert A; Laurienti, Paul J; Burdette, Jonathan H

    2009-01-01

    Here we describe the Wake Forest University Pipeline, a fully automated method for the processing of fMRI data using SPM. The method includes fully automated data transfer and archiving from the point of acquisition, real-time batch script generation, distributed grid processing, interface to SPM in MATLAB, error recovery and data provenance, DICOM conversion and PACS insertion. It has been used for automated processing of fMRI experiments, as well as for the clinical implementation of fMRI and spin-tag perfusion imaging. The pipeline requires no manual intervention, and can be extended to any studies requiring offline processing.

  3. Automation of energy demand forecasting

    Science.gov (United States)

    Siddique, Sanzad

    Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.
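
    The model-search idea described above can be sketched generically: score every model in a small candidate space on held-out data and keep the best. In the Python sketch below the candidate space is a set of polynomial trend orders, standing in for the econometric and machine-learning models discussed in the thesis; the data are synthetic:

      import numpy as np

      rng = np.random.default_rng(0)
      t = np.arange(120)                                   # 120 months of synthetic demand data
      demand = 50 + 0.3 * t + 5 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 1.5, t.size)

      train, test = slice(0, 96), slice(96, 120)

      def fit_predict(order):
          # Fit a polynomial trend on the training window and predict the hold-out window.
          coeffs = np.polyfit(t[train], demand[train], order)
          return np.polyval(coeffs, t[test])

      candidates = range(1, 6)                             # candidate model space
      errors = {k: np.sqrt(np.mean((fit_predict(k) - demand[test]) ** 2)) for k in candidates}
      best = min(errors, key=errors.get)
      print(f"selected model order {best} with RMSE {errors[best]:.2f}")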

  4. A fully automated system for ultrasonic power measurement and simulation according to IEC 61161:2006

    International Nuclear Information System (INIS)

    Costa-Felix, Rodrigo P B; Alvarenga, Andre V; Hekkenberg, Rob

    2011-01-01

    The worldwide accepted standard for ultrasonic power measurement is IEC 61161, presently in its 2nd edition (2006), but under review. To fulfil its requirements, considering that a radiation force balance is to be used as the ultrasonic power detector, a large amount of raw data (mass measurements) must be collected as a function of time to perform all necessary calculations and corrections. Uncertainty determination demands calculation effort on both raw and processed data. Although this can be undertaken in an old-fashioned way, using spreadsheets and manual data collection, automation software is often used in metrology to provide a virtually error-free environment for data acquisition and for repetitive calculations and corrections. Considering that, a fully automated ultrasonic power measurement system was developed and comprehensively tested. A precision balance with 0.1 mg resolution, model CP224S (Sartorius, Germany), was used as the measuring device, and a calibrated continuous wave ultrasound check source (Precision Acoustics, UK) was the device under test. A 150 ml container filled with degassed water and containing an absorbing target at the bottom was placed on the balance pan. In addition to the automation features, a power measurement simulation routine was implemented, intended as a teaching tool showing how ultrasonic power emission behaves in a radiation force balance equipped with an absorbing target. The automation software was considered an effective tool for speeding up ultrasonic power measurement, while allowing accurate calculation and clear graphical presentation of partial and final results.
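
    For a perfectly absorbing target at normal incidence, the core conversion behind such a system is P = F·c with F = Δm·g; the Python sketch below shows only this conversion and omits the corrections (buoyancy, target imperfection, temperature dependence of the speed of sound) that IEC 61161 and the automation software handle:

      # Minimal radiation-force-balance conversion for an absorbing target at normal
      # incidence: P = F * c, with F = delta_m * g. Corrections required by IEC 61161
      # are intentionally omitted in this sketch.
      G = 9.81            # m/s^2
      C_WATER = 1482.0    # approximate speed of sound in water near 20 degC, m/s

      def ultrasonic_power_w(mass_change_mg: float) -> float:
          force_n = (mass_change_mg * 1e-6) * G     # mg -> kg, then F = m*g
          return force_n * C_WATER                  # P = F*c for an absorbing target

      print(f"{ultrasonic_power_w(69.0):.3f} W")    # ~69 mg apparent mass change ~ 1 W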

  5. A novel method to determine simultaneously methane production during in vitro gas production using fully automated equipment

    NARCIS (Netherlands)

    Pellikaan, W.F.; Hendriks, W.H.; Uwimanaa, G.; Bongers, L.J.G.M.; Becker, P.M.; Cone, J.W.

    2011-01-01

    An adaptation of fully automated gas production equipment was tested for its ability to simultaneously measure methane and total gas. The simultaneous measurement of gas production and gas composition was not possible using fully automated equipment, as the bottles should be kept closed during the

  6. Evaluation of automated residential demand response with flat and dynamic pricing

    International Nuclear Information System (INIS)

    Swisher, Joel; Wang, Kitty; Stewart, Stewart

    2005-01-01

    This paper reviews the performance of two recent automated load management programs for residential customers of electric utilities in two American states. Both pilot programs have been run with about 200 participant houses each, and both programs have control populations of similar customers without the technology or program treatment. In both cases, the technology used in the pilot is GoodWatts, an advanced, two-way, real-time, comprehensive home energy management system. The purpose of each pilot is to determine the household kW reduction in coincident peak electric load from the energy management technology. Nevada Power has conducted a pilot program for Air-Conditioning Load Management (ACLM), in which customers are sent an electronic curtailment signal for three-hour intervals during times of maximum peak demand. The participating customers receive an annual incentive payment, but otherwise they are on a conventional utility tariff. In California, three major utilities are jointly conducting a pilot demonstration of an Automated Demand Response System (ADRS). Customers are on a time-of-use (ToU) tariff, which includes a critical peak pricing (CPP) element. During times of maximum peak demand, customers are sent an electronic price signal that is three times higher than the normal on-peak price. Houses with the automated GoodWatts technology reduced their demand in both the ACLM and the ADRS programs by about 50% consistently across the summer curtailment or super peak events, relative to homes without the technology or any load management program or tariff in place. The absolute savings were greater in the ACLM program, due to the higher baseline air conditioning loads in the hotter Las Vegas climate. The results suggest that either automated technology or dynamic pricing can deliver significant demand response in low-consumption houses. However, for high-consumption houses, automated technology can reduce load by a greater absolute kWh difference. Targeting

  7. Opportunities for Energy Efficiency and Automated Demand Response in Industrial Refrigerated Warehouses in California

    Energy Technology Data Exchange (ETDEWEB)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Rockoff, Alexandra; Piette, Mary Ann

    2009-05-11

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and open automated demand response opportunities for industrial refrigerated warehouses in California. The report describes refrigerated warehouses characteristics, energy use and demand, and control systems. It also discusses energy efficiency and open automated demand response opportunities and provides analysis results from three demand response studies. In addition, several energy efficiency, load management, and demand response case studies are provided for refrigerated warehouses. This study shows that refrigerated warehouses can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for open automated demand response (OpenADR) at little additional cost. These improved controls may prepare facilities to be more receptive to OpenADR due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  8. A new fully automated TLD badge reader

    International Nuclear Information System (INIS)

    Kannan, S.; Ratna, P.; Kulkarni, M.S.

    2003-01-01

    At present, personnel monitoring in India is carried out using a number of manual and semiautomatic TLD badge readers and the BARC TL dosimeter badge designed during 1970. Of late, the manual TLD badge readers have been almost completely replaced by semiautomatic readers with a number of performance improvements, such as the use of hot-gas heating to reduce the readout time considerably, a PC-based design with storage of the glow curve for every dosimeter, on-line dose computation, and printout of dose reports. However, the semiautomatic system suffers from the lack of a machine-readable ID code on the badge, and the physical design of the dosimeter card is not readily compatible with automation. This paper describes a fully automated TLD badge reader developed in the RSS Division, using a new TLD badge with a machine-readable ID code. The new PC-based reader has a built-in reader for reading the ID code, in the form of an array of holes, on the dosimeter card. The reader has a number of self-diagnostic features to ensure a high degree of reliability. (author)

  9. Detection of virus-specific intrathecally synthesised immunoglobulin G with a fully automated enzyme immunoassay system

    Directory of Open Access Journals (Sweden)

    Weissbrich Benedikt

    2007-05-01

    Background: The determination of virus-specific immunoglobulin G (IgG) antibodies in cerebrospinal fluid (CSF) is useful for the diagnosis of virus-associated diseases of the central nervous system (CNS) and for the detection of a polyspecific intrathecal immune response in patients with multiple sclerosis. Quantification of virus-specific IgG in the CSF is frequently performed by calculation of a virus-specific antibody index (AI). Determination of the AI is a demanding and labour-intensive technique and therefore automation is desirable. We evaluated the precision and the diagnostic value of a fully automated enzyme immunoassay for the detection of virus-specific IgG in serum and CSF using the analyser BEP2000 (Dade Behring). Methods: The AI for measles, rubella, varicella-zoster, and herpes simplex virus IgG was determined from pairs of serum and CSF samples of patients with viral CNS infections, of patients with multiple sclerosis, and of control patients. CSF and serum samples were tested simultaneously with reference to a standard curve. Starting dilutions were 1:6 and 1:36 for CSF and 1:1386 and 1:8316 for serum samples. Results: The interassay coefficient of variation was below 10% for all parameters tested. There was good agreement between AIs obtained with the BEP2000 and AIs derived from the semi-automated reference method. Conclusion: Determination of virus-specific IgG in serum-CSF pairs for calculation of the AI has been successfully automated on the BEP2000. Current limitations of the assay layout imposed by the analyser software should be solved in future versions to offer more convenience in comparison to manual or semi-automated methods.
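
    The antibody index that the assay automates is conventionally computed from CSF/serum quotients (AI = Q_spec / Q_IgG, with Q_IgG capped at the upper reference limit Q_lim when the blood-CSF barrier is disturbed); a Python sketch with invented input values:

      # Conventional virus-specific antibody index (AI) calculation from CSF/serum quotients.
      def antibody_index(spec_csf, spec_serum, total_igg_csf, total_igg_serum, q_lim=None):
          q_spec = spec_csf / spec_serum            # quotient of virus-specific IgG
          q_igg = total_igg_csf / total_igg_serum   # quotient of total IgG
          if q_lim is not None and q_igg > q_lim:
              q_igg = q_lim                         # cap at Q_lim when Q_IgG exceeds the limit
          return q_spec / q_igg

      ai = antibody_index(spec_csf=25.0, spec_serum=2500.0,
                          total_igg_csf=0.03, total_igg_serum=10.0)
      print(f"AI = {ai:.1f}")    # markedly elevated values suggest intrathecal synthesis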

  10. [18F]FMeNER-D2: Reliable fully-automated synthesis for visualization of the norepinephrine transporter

    International Nuclear Information System (INIS)

    Rami-Mark, Christina; Zhang, Ming-Rong; Mitterhauser, Markus; Lanzenberger, Rupert; Hacker, Marcus; Wadsak, Wolfgang

    2013-01-01

    Purpose: In neurodegenerative diseases and neuropsychiatric disorders, dysregulation of the norepinephrine transporter (NET) has been reported. PET imaging can be used for visualization of NET availability and occupancy in the human brain. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [18F]FMeNER-D2 is showing the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET tracers. The aim of this work was the automation of [18F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Procedures: Synthesis of [18F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20–30 GBq [18F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[18F]fluoromethane-D2 ([18F]BFM) and reaction of the pure [18F]BFM with the unprotected precursor NER were optimized and completely automated. HPLC purification and the SPE procedure were completed, formulation and sterile filtration were achieved on-line, and full quality control was performed. Results: Purified product was obtained in a fully automated synthesis at clinical scale, allowing maximum radiation safety and routine production in a GMP-like manner. So far, more than 25 fully automated syntheses have been successfully performed, yielding 1.0–2.5 GBq of formulated [18F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. Conclusions: A first fully automated [18F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization

  11. [18F]FMeNER-D2: reliable fully-automated synthesis for visualization of the norepinephrine transporter.

    Science.gov (United States)

    Rami-Mark, Christina; Zhang, Ming-Rong; Mitterhauser, Markus; Lanzenberger, Rupert; Hacker, Marcus; Wadsak, Wolfgang

    2013-11-01

    In neurodegenerative diseases and neuropsychiatric disorders, dysregulation of the norepinephrine transporter (NET) has been reported. PET imaging can be used for visualization of NET availability and occupancy in the human brain. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [(18)F]FMeNER-D2 is showing the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET tracers. The aim of this work was the automation of [(18)F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Synthesis of [(18)F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20-30 GBq [(18)F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[(18)F]fluoromethane-D2 ([(18)F]BFM) and reaction of the pure [(18)F]BFM with the unprotected precursor NER were optimized and completely automated. HPLC purification and the SPE procedure were completed, formulation and sterile filtration were achieved on-line, and full quality control was performed. Purified product was obtained in a fully automated synthesis at clinical scale, allowing maximum radiation safety and routine production in a GMP-like manner. So far, more than 25 fully automated syntheses have been successfully performed, yielding 1.0-2.5 GBq of formulated [(18)F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. A first fully automated [(18)F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization. © 2013.

  12. A fully automated microfluidic femtosecond laser axotomy platform for nerve regeneration studies in C. elegans.

    Science.gov (United States)

    Gokce, Sertan Kutal; Guo, Samuel X; Ghorashian, Navid; Everett, W Neil; Jarrell, Travis; Kottek, Aubri; Bovik, Alan C; Ben-Yakar, Adela

    2014-01-01

    Femtosecond laser nanosurgery has been widely accepted as an axonal injury model, enabling nerve regeneration studies in the small model organism, Caenorhabditis elegans. To overcome the time limitations of manual worm handling techniques, automation and new immobilization technologies must be adopted to improve throughput in these studies. While new microfluidic immobilization techniques have been developed that promise to reduce the time required for axotomies, there is a need for automated procedures to minimize the required amount of human intervention and accelerate the axotomy processes crucial for high-throughput. Here, we report a fully automated microfluidic platform for performing laser axotomies of fluorescently tagged neurons in living Caenorhabditis elegans. The presented automation process reduces the time required to perform axotomies within individual worms to ∼17 s/worm, at least one order of magnitude faster than manual approaches. The full automation is achieved with a unique chip design and an operation sequence that is fully computer controlled and synchronized with efficient and accurate image processing algorithms. The microfluidic device includes a T-shaped architecture and three-dimensional microfluidic interconnects to serially transport, position, and immobilize worms. The image processing algorithms can identify and precisely position axons targeted for ablation. There were no statistically significant differences observed in reconnection probabilities between axotomies carried out with the automated system and those performed manually with anesthetics. The overall success rate of automated axotomies was 67.4±3.2% of the cases (236/350) at an average processing rate of 17.0±2.4 s. This fully automated platform establishes a promising methodology for prospective genome-wide screening of nerve regeneration in C. elegans in a truly high-throughput manner.

  13. Breast Density Estimation with Fully Automated Volumetric Method: Comparison to Radiologists' Assessment by BI-RADS Categories.

    Science.gov (United States)

    Singh, Tulika; Sharma, Madhurima; Singla, Veenu; Khandelwal, Niranjan

    2016-01-01

    The objective of our study was to calculate mammographic breast density with a fully automated volumetric breast density measurement method and to compare it to breast imaging reporting and data system (BI-RADS) breast density categories assigned by two radiologists. A total of 476 full-field digital mammography examinations with standard mediolateral oblique and craniocaudal views were evaluated by two blinded radiologists and BI-RADS density categories were assigned. Using fully automated software, mean fibroglandular tissue volume, mean breast volume, and mean volumetric breast density were calculated. Based on percentage volumetric breast density, a volumetric density grade was assigned from 1 to 4. The weighted overall kappa was 0.895 (almost perfect agreement) for the two radiologists' BI-RADS density estimates. A statistically significant difference was seen in mean volumetric breast density among the BI-RADS density categories, and mean volumetric breast density increased with increasing BI-RADS density category. A significant correlation was found between BI-RADS categories and volumetric density grading by the fully automated software (ρ = 0.728), and agreement between the volumetric density grade and the BI-RADS density category assigned by the two observers was fair (κ = 0.398 and 0.388, respectively). In our study, a good correlation was seen between density grading using the fully automated volumetric method and density grading using BI-RADS density categories assigned by the two radiologists. Thus, the fully automated volumetric method may be used to quantify breast density on routine mammography. Copyright © 2016 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
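
    The volumetric measure discussed above is the ratio of fibroglandular tissue volume to total breast volume, expressed as a percentage and mapped to a volumetric density grade of 1-4; in the Python sketch below the grade cut-points are assumed for illustration, since the abstract does not give the software's actual thresholds:

      # Percentage volumetric breast density and an illustrative 1-4 grade mapping.
      def volumetric_density_percent(fibroglandular_cm3: float, breast_cm3: float) -> float:
          return 100.0 * fibroglandular_cm3 / breast_cm3

      def density_grade(vbd_percent: float, cutpoints=(4.5, 7.5, 15.5)) -> int:
          # Assumed cut-points separating grades 1|2|3|4.
          return 1 + sum(vbd_percent > c for c in cutpoints)

      vbd = volumetric_density_percent(fibroglandular_cm3=80.0, breast_cm3=640.0)
      print(f"VBD = {vbd:.1f}% -> grade {density_grade(vbd)}")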

  14. Automated evaluation of ultrasonic indications

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.)

  15. Fully Automated Lipid Pool Detection Using Near Infrared Spectroscopy

    Directory of Open Access Journals (Sweden)

    Elżbieta Pociask

    2016-01-01

    Background: Detecting and identifying vulnerable plaque, which is prone to rupture, is still a challenge for cardiologists. Such lipid core-containing plaque is still not identifiable by everyday angiography, which has triggered the need to develop a new tool, NIRS-IVUS, that can visualize plaque characteristics in terms of their chemical and morphologic nature. The new tool can lead to the development of new methods of interpreting the newly obtained data. In this study, an algorithm for fully automated lipid pool detection on NIRS images is proposed. Method: The designed algorithm is divided into four stages: preprocessing (image enhancement), segmentation of artifacts, detection of lipid areas, and calculation of the Lipid Core Burden Index (LCBI). Results: A total of 31 NIRS chemograms were analyzed by two methods. The metrics total LCBI, maximal LCBI in 4 mm blocks, and maximal LCBI in 2 mm blocks were calculated to compare the presented algorithm with a commercially available system. Both intraclass correlation (ICC) and Bland-Altman plots showed good agreement and correlation between the methods. Conclusions: The proposed algorithm provides fully automated lipid pool detection on near infrared spectroscopy images. It is a tool developed for offline data analysis, which could easily be augmented with new functions and used in new projects.
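
    The LCBI metrics used for the comparison are commonly defined as the fraction of valid chemogram pixels classified as lipid, scaled to 0-1000, with maxLCBI(4 mm) taken over a sliding 4 mm window; the Python sketch below assumes this definition and stubs out the pixel classification, which is the algorithm's actual task:

      import numpy as np

      rng = np.random.default_rng(1)
      # Hypothetical chemogram lipid mask: rows = pullback positions (0.1 mm apart), True = lipid.
      lipid_mask = rng.random((600, 360)) < 0.12        # 60 mm pullback x 360 degrees

      def lcbi(mask: np.ndarray) -> float:
          # Fraction of pixels classified as lipid, scaled to 0-1000.
          return 1000.0 * mask.mean()

      def max_lcbi_block(mask: np.ndarray, block_mm: float = 4.0, step_mm: float = 0.1) -> float:
          rows_per_block = int(block_mm / step_mm)
          scores = [lcbi(mask[i:i + rows_per_block])
                    for i in range(0, mask.shape[0] - rows_per_block + 1)]
          return max(scores)

      print(f"total LCBI = {lcbi(lipid_mask):.0f}, maxLCBI(4mm) = {max_lcbi_block(lipid_mask):.0f}")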

  16. Didactical And Ethics Demands For Automated Pedagogical Diagnostics.

    Directory of Open Access Journals (Sweden)

    O. Kolgatin

    2009-06-01

    Full Text Available Didactic requirements for pedagogical diagnostics, and the specific character of their realisation when ICT is actively used in the instruction process of universities, are analysed. The ethical questions of pedagogical diagnostics are considered, and the ethical aspects connected with the use of automated pedagogical diagnostic systems are highlighted.

  17. Designing a fully automated multi-bioreactor plant for fast DoE optimization of pharmaceutical protein production.

    Science.gov (United States)

    Fricke, Jens; Pohlmann, Kristof; Jonescheit, Nils A; Ellert, Andree; Joksch, Burkhard; Luttmann, Reiner

    2013-06-01

    The identification of optimal expression conditions for state-of-the-art production of pharmaceutical proteins is a very time-consuming and expensive process. In this report a method for rapid and reproducible optimization of protein expression in an in-house designed small-scale BIOSTAT® multi-bioreactor plant is described. A newly developed BioPAT® MFCS/win Design of Experiments (DoE) module (Sartorius Stedim Systems, Germany) connects the process control system MFCS/win and the DoE software MODDE® (Umetrics AB, Sweden) and therefore enables the implementation of fully automated optimization procedures. As a proof of concept, the commercial Pichia pastoris strain KM71H was transformed for the expression of potential malaria vaccines. The DoE optimization procedure allowed a doubling of intact protein secretion productivity compared to the initial cultivation results. In a next step, robustness to process parameter variability around the determined optimum was demonstrated. In this way, a significantly improved pharmaceutical production process was established within seven 24-hour cultivation cycles. Specifically, regarding the regulatory demands pointed out in the process analytical technology (PAT) initiative of the United States Food and Drug Administration (FDA), the combination of a highly instrumented, fully automated multi-bioreactor platform with proper cultivation strategies and extended DoE software solutions opens up promising benefits and opportunities for pharmaceutical protein production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
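
    The core idea of a DoE-driven optimization (run a designed set of cultivations, compare the responses, pick the best settings) can be sketched as below. The factor names, levels, and the toy response function are placeholders, not the study's actual MODDE design or process parameters.

        from itertools import product

        # Two-level full-factorial design over three illustrative process factors.
        factors = {
            "temperature_C": (28, 32),
            "pH": (5.0, 6.0),
            "methanol_feed_g_per_h": (2.0, 4.0),
        }
        design = [dict(zip(factors, levels)) for levels in product(*factors.values())]

        def run_cultivation(settings):
            """Stand-in for one automated bioreactor run returning a response
            (e.g., intact protein titre); here just a made-up toy formula."""
            return (0.1 * settings["temperature_C"]
                    + 0.5 * settings["pH"]
                    + settings["methanol_feed_g_per_h"])

        results = [(run_cultivation(s), s) for s in design]
        best_response, best_settings = max(results, key=lambda r: r[0])
        print("best settings:", best_settings, "response:", round(best_response, 2))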

  18. Open Automated Demand Response Communications in Demand Response for Wholesale Ancillary Services

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccote, Sila; Piette, Mary Ann; Ghatikar, Girish; Koch, Ed; Hennage, Dan; Hernandez, John; Chiu, Albert; Sezgen, Osman; Goodin, John

    2009-11-06

    The Pacific Gas and Electric Company (PG&E) is conducting a pilot program to investigate the technical feasibility of bidding certain demand response (DR) resources into the California Independent System Operator's (CAISO) day-ahead market for ancillary services (non-spinning reserve). Three facilities (a retail store, a local government office building, and a bakery) were recruited into the pilot program. For each facility, hourly demand and load curtailment potential are forecasted two days ahead and submitted to the CAISO the day before operation as an available resource. These DR resources are optimized against all other generation resources in the CAISO ancillary service market. Each facility is equipped with four-second real-time telemetry equipment to ensure resource accountability and visibility to CAISO operators. When CAISO requests DR resources, PG&E's OpenADR (Open Automated DR) communications infrastructure is utilized to deliver DR signals to the facilities' energy management and control systems (EMCS). The pre-programmed DR strategies are triggered without a human in the loop. This paper describes the automated system architecture and the flow of information to trigger and monitor the performance of the DR events. We outline the DR strategies at each of the participating facilities. At one site, a real-time electric measurement feedback loop is implemented to assure the delivery of CAISO-dispatched demand reductions. Finally, we present results from each of the facilities and discuss findings.
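
    The "no human in the loop" pattern described here amounts to a facility client that watches for DR events and triggers its pre-programmed shed strategy automatically. The sketch below shows that control flow only; the polling URL, JSON field names, and shed_load body are hypothetical placeholders, not the OpenADR schema or PG&E's implementation.

        import json
        import time
        import urllib.request

        DRAS_URL = "https://dras.example.com/api/pending-events"  # hypothetical endpoint
        POLL_SECONDS = 60

        def shed_load(level):
            """Placeholder for the pre-programmed EMCS strategy (e.g., raising
            zone temperature setpoints or dimming lights) for a given shed level."""
            print(f"executing pre-programmed shed strategy, level={level}")

        def poll_once():
            with urllib.request.urlopen(DRAS_URL, timeout=10) as resp:
                events = json.load(resp)              # assumed: JSON list of events
            for event in events:
                if event.get("status") == "active":   # assumed field names
                    shed_load(event.get("shed_level", 1))

        if __name__ == "__main__":
            while True:                               # no human in the loop
                try:
                    poll_once()
                except OSError as err:
                    print("DR server poll failed:", err)
                time.sleep(POLL_SECONDS)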

  19. Development of a fully automated online mixing system for SAXS protein structure analysis

    DEFF Research Database (Denmark)

    Nielsen, Søren Skou; Arleth, Lise

    2010-01-01

    This thesis presents the development of an automated high-throughput mixing and exposure system for Small-Angle Scattering analysis on a synchrotron using polymer microfluidics. Software and hardware for both automated mixing, exposure control on a beamline and automated data reduction...... and preliminary analysis is presented. Three mixing systems that have been the corner stones of the development process are presented including a fully functioning high-throughput microfluidic system that is able to produce and expose 36 mixed samples per hour using 30 μL of sample volume. The system is tested...

  20. Comparison of semi-automated center-dot and fully automated endothelial cell analyses from specular microscopy images.

    Science.gov (United States)

    Maruoka, Sachiko; Nakakura, Shunsuke; Matsuo, Naoko; Yoshitomi, Kayo; Katakami, Chikako; Tabuchi, Hitoshi; Chikama, Taiichiro; Kiuchi, Yoshiaki

    2017-10-30

    To evaluate two specular microscopy analysis methods across different endothelial cell densities (ECDs). Endothelial images of one eye from each of 45 patients were taken by using three different specular microscopes (three replicates each). To determine the consistency of the center-dot method, we compared SP-6000 and SP-2000P images. CME-530 and SP-6000 images were compared to assess the consistency of the fully automated method. The SP-6000 images from the two methods were compared. Intraclass correlation coefficients (ICCs) for the three measurements were calculated, and parametric multiple comparisons tests and Bland-Altman analysis were performed. The ECD mean value was 2425 ± 883 (range 516-3707) cells/mm². ICC values were > 0.9 for all three microscopes for ECD, but the coefficients of variation (CVs) were 0.3-0.6. For ECD measurements, Bland-Altman analysis revealed that the mean difference was 42 cells/mm² between the SP-2000P and SP-6000 for the center-dot method; 57 cells/mm² between the SP-6000 measurements from both methods; and -5 cells/mm² between the SP-6000 and CME-530 for the fully automated method (95% limits of agreement: -201 to 284 cells/mm², -410 to 522 cells/mm², and -327 to 318 cells/mm², respectively). For CV measurements, the mean differences were -3, -12, and 13% (95% limits of agreement -18 to 11, -26 to 2, and -5 to 32%, respectively). Despite using three replicate measurements, the precision of the center-dot method with the SP-2000P and SP-6000 software was only ±10% for ECD data and was even worse for the fully automated method. Japan Clinical Trials Register (http://www.umin.ac.jp/ctr/index/htm9) number UMIN 000015236.
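
    For readers unfamiliar with the statistics used above, the sketch below computes a Bland-Altman bias and 95% limits of agreement for two paired measurement series; the ECD values in the example are made up and serve only to show the calculation.

        import numpy as np

        def bland_altman(a, b):
            """Return the mean difference (bias) and the 95% limits of agreement
            between two paired measurement series a and b."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a - b
            bias = diff.mean()
            sd = diff.std(ddof=1)
            return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

        # Example with invented ECD values (cells/mm^2):
        sp2000p = [2400, 2550, 1800, 3100]
        sp6000 = [2380, 2600, 1750, 3150]
        bias, loa = bland_altman(sp2000p, sp6000)
        print(f"bias = {bias:.1f} cells/mm^2, 95% LoA = {loa[0]:.1f} to {loa[1]:.1f}")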

  1. A Fully Automated Penumbra Segmentation Tool

    DEFF Research Database (Denmark)

    Nagenthiraja, Kartheeban; Ribe, Lars Riisgaard; Hougaard, Kristina Dupont

    2012-01-01

    Introduction: Perfusion- and diffusion weighted MRI (PWI/DWI) is widely used to select patients who are likely to benefit from recanalization therapy. The visual identification of PWI-DWI-mismatch tissue depends strongly on the observer, prompting a need for software, which estimates potentially...... salvageable tissue, quickly and accurately. We present a fully Automated Penumbra Segmentation (APS) algorithm using PWI and DWI images. We compare the automatically generated PWI-DWI mismatch mask to masks outlined manually by experts, in 168 patients. Method: The algorithm initially identifies PWI lesions......) at 600∙10⁻⁶ mm²/sec. Due to the nature of thresholding, the ADC mask overestimates the DWI lesion volume, and consequently we initialized a level-set algorithm on the DWI image with the ADC mask as prior knowledge. Combining the PWI and inverted DWI mask then yields the PWI-DWI mismatch mask. Four expert raters...
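
    The mask logic described in the method (an ADC threshold that over-estimates the DWI lesion, a level-set refinement initialised with that mask, and a mismatch defined as the PWI lesion minus the DWI lesion) can be sketched as follows. The use of scikit-image's morphological_chan_vese as the level-set step and the iteration count are assumptions standing in for the authors' own implementation.

        import numpy as np
        from skimage.segmentation import morphological_chan_vese

        ADC_THRESHOLD = 600e-6  # mm^2/s, the threshold quoted in the abstract

        def mismatch_mask(adc, dwi, pwi_lesion, iterations=50):
            """PWI-DWI mismatch: PWI lesion minus the refined DWI lesion."""
            adc_mask = adc < ADC_THRESHOLD            # over-estimates the DWI lesion
            # Level-set refinement of the DWI lesion, initialised with the ADC mask.
            dwi_lesion = morphological_chan_vese(
                dwi, iterations, init_level_set=adc_mask.astype(np.int8)
            ).astype(bool)
            return pwi_lesion.astype(bool) & ~dwi_lesion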

  2. [18F]FMeNER-D2: Reliable fully-automated synthesis for visualization of the norepinephrine transporter

    Energy Technology Data Exchange (ETDEWEB)

    Rami-Mark, Christina [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Department of Inorganic Chemistry, University of Vienna (Austria); Zhang, Ming-Rong [Molecular Imaging Center, National Institute of Radiological Sciences, Chiba (Japan); Mitterhauser, Markus [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Hospital Pharmacy of the General Hospital of Vienna (Austria); Lanzenberger, Rupert [Department of Psychiatry and Psychotherapy, Medical University of Vienna (Austria); Hacker, Marcus [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Wadsak, Wolfgang [Radiochemistry and Biomarker Development Unit, Division of Nuclear Medicine, Department of Biomedical Imaging and Image-guided Therapy, Medical University of Vienna (Austria); Department of Inorganic Chemistry, University of Vienna (Austria)

    2013-11-15

    Purpose: In neurodegenerative diseases and neuropsychiatric disorders dysregulation of the norepinephrine transporter (NET) has been reported. For visualization of NET availability and occupancy in the human brain PET imaging can be used. Therefore, selective NET-PET tracers with high affinity are required. Amongst these, [18F]FMeNER-D2 is showing the best results so far. Furthermore, a reliable fully automated radiosynthesis is a prerequisite for successful application of PET-tracers. The aim of this work was the automation of [18F]FMeNER-D2 radiolabelling for subsequent clinical use. The presented study comprises 25 automated large-scale syntheses, which were directly applied to healthy volunteers and adult patients suffering from attention deficit hyperactivity disorder (ADHD). Procedures: Synthesis of [18F]FMeNER-D2 was automated within a Nuclear Interface Module. Starting from 20–30 GBq [18F]fluoride, azeotropic drying, reaction with Br2CD2, distillation of 1-bromo-2-[18F]fluoromethane-D2 ([18F]BFM) and reaction of the pure [18F]BFM with unprotected precursor NER were optimized and completely automated. HPLC purification and SPE procedure were completed, formulation and sterile filtration were achieved on-line and full quality control was performed. Results: Purified product was obtained in a fully automated synthesis in clinical scale allowing maximum radiation safety and routine production under GMP-like manner. So far, more than 25 fully automated syntheses were successfully performed, yielding 1.0–2.5 GBq of formulated [18F]FMeNER-D2 with specific activities between 430 and 1707 GBq/μmol within 95 min total preparation time. Conclusions: A first fully automated [18F]FMeNER-D2 synthesis was established, allowing routine production of this NET-PET tracer under maximum radiation safety and standardization.

  3. How a Fully Automated eHealth Program Simulates Three Therapeutic Processes: A Case Study.

    Science.gov (United States)

    Holter, Marianne T S; Johansen, Ayna; Brendryen, Håvar

    2016-06-28

    eHealth programs may be better understood by breaking down the components of one particular program and discussing its potential for interactivity and tailoring in regard to concepts from face-to-face counseling. In the search for the efficacious elements within eHealth programs, it is important to understand how a program using lapse management may simultaneously support working alliance, internalization of motivation, and behavior maintenance. These processes have been applied to fully automated eHealth programs individually. However, given their significance in face-to-face counseling, it may be important to simulate the processes simultaneously in interactive, tailored programs. We propose a theoretical model for how fully automated behavior change eHealth programs may be more effective by simulating a therapist's support of a working alliance, internalization of motivation, and managing lapses. We show how the model is derived from theory and its application to Endre, a fully automated smoking cessation program that engages the user in several "counseling sessions" about quitting. A descriptive case study based on tools from the intervention mapping protocol shows how each therapeutic process is simulated. The program supports the user's working alliance through alliance factors, the nonembodied relational agent Endre and computerized motivational interviewing. Computerized motivational interviewing also supports internalized motivation to quit, whereas a lapse management component responds to lapses. The description operationalizes working alliance, internalization of motivation, and managing lapses, in terms of eHealth support of smoking cessation. A program may simulate working alliance, internalization of motivation, and lapse management through interactivity and individual tailoring, potentially making fully automated eHealth behavior change programs more effective.

  4. Fully Automated Deep Learning System for Bone Age Assessment.

    Science.gov (United States)

    Lee, Hyunkwang; Tajmir, Shahein; Lee, Jenny; Zissen, Maurice; Yeshiwas, Bethel Ayele; Alkasab, Tarik K; Choy, Garry; Do, Synho

    2017-08-01

    Skeletal maturity progresses through discrete phases, a fact that is used routinely in pediatrics where bone age assessments (BAAs) are compared to chronological age in the evaluation of endocrine and metabolic disorders. While central to many disease evaluations, little has changed to improve the tedious process since its introduction in 1950. In this study, we propose a fully automated deep learning pipeline to segment a region of interest, standardize and preprocess input radiographs, and perform BAA. Our models use an ImageNet pretrained, fine-tuned convolutional neural network (CNN) to achieve 57.32 and 61.40% accuracies for the female and male cohorts on our held-out test images. Female test radiographs were assigned a BAA within 1 year 90.39% and within 2 years 98.11% of the time. Male test radiographs were assigned 94.18% within 1 year and 99.00% within 2 years. Using the input occlusion method, attention maps were created which reveal what features the trained model uses to perform BAA. These correspond to what human experts look at when manually performing BAA. Finally, the fully automated BAA system was deployed in the clinical environment as a decision supporting system for more accurate and efficient BAAs at much faster interpretation time (<2 s) than the conventional method.
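
    The general recipe of fine-tuning an ImageNet-pretrained CNN for discretised bone-age classes can be sketched with torchvision as below; the ResNet-50 backbone, the number of classes, and the frozen-backbone choice are illustrative assumptions, not the architecture or training setup used in the paper.

        import torch
        import torch.nn as nn
        from torchvision import models

        NUM_BONE_AGE_CLASSES = 20   # assumption: bone age discretised into bins

        def build_baa_model():
            """ImageNet-pretrained backbone with a new classification head."""
            model = models.resnet50(weights="DEFAULT")   # torchvision >= 0.13 assumed
            for p in model.parameters():                 # freeze the backbone
                p.requires_grad = False
            model.fc = nn.Linear(model.fc.in_features, NUM_BONE_AGE_CLASSES)
            return model

        model = build_baa_model()
        optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
        criterion = nn.CrossEntropyLoss()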

  5. Design and Operation of an Open, Interoperable Automated Demand Response Infrastructure for Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Ghatikar, Girish; Kiliccote, Sila; Watson, David; Koch, Ed; Hennage, Dan

    2009-05-01

    This paper describes the concept for and lessons from the development and field-testing of an open, interoperable communications infrastructure to support automated demand response (auto-DR). Automating DR allows greater levels of participation, improved reliability, and repeatability of the DR in participating facilities. This paper also presents the technical and architectural issues associated with auto-DR and description of the demand response automation server (DRAS), the client/server architecture-based middle-ware used to automate the interactions between the utilities or any DR serving entity and their customers for DR programs. Use case diagrams are presented to show the role of the DRAS between utility/ISO and the clients at the facilities.

  6. FULLY AUTOMATED IMAGE ORIENTATION IN THE ABSENCE OF TARGETS

    Directory of Open Access Journals (Sweden)

    C. Stamatopoulos

    2012-07-01

    Full Text Available Automated close-range photogrammetric network orientation has traditionally been associated with the use of coded targets in the object space to allow for an initial relative orientation (RO) and subsequent spatial resection of the images. Over the past decade, automated orientation via feature-based matching (FBM) techniques has attracted renewed research attention in both the photogrammetry and computer vision (CV) communities. This is largely due to advances made towards the goal of automated relative orientation of multi-image networks covering untargeted (markerless) objects. There are now a number of CV-based algorithms, with accompanying open-source software, that can achieve multi-image orientation within narrow-baseline networks. From a photogrammetric standpoint, the results are typically disappointing as the metric integrity of the resulting models is generally poor, or even unknown, while the number of outliers within the image matching and triangulation is large, and generally too large to allow relative orientation (RO) via the commonly used coplanarity equations. On the other hand, there are few examples within the photogrammetric research field of automated markerless camera calibration to metric tolerances, and these too are restricted to narrow-baseline, low-convergence imaging geometry. The objective addressed in this paper is markerless automatic multi-image orientation, maintaining metric integrity, within networks that incorporate wide-baseline imagery. By wide-baseline we imply convergent multi-image configurations with convergence angles of up to around 90°. An associated aim is provision of a fast, fully automated process, which can be performed without user intervention. For this purpose, various algorithms require optimisation to allow parallel processing utilising multiple PC cores and graphics processing units (GPUs).

  7. Validation of Fully Automated VMAT Plan Generation for Library-Based Plan-of-the-Day Cervical Cancer Radiotherapy

    OpenAIRE

    Sharfo, Abdul Wahab M.; Breedveld, Sebastiaan; Voet, Peter W. J.; Heijkoop, Sabrina T.; Mens, Jan-Willem M.; Hoogeman, Mischa S.; Heijmen, Ben J. M.

    2016-01-01

    textabstractPurpose: To develop and validate fully automated generation of VMAT plan-libraries for plan-of-the-day adaptive radiotherapy in locally-advanced cervical cancer. Material and Methods: Our framework for fully automated treatment plan generation (Erasmus-iCycle) was adapted to create dual-arc VMAT treatment plan libraries for cervical cancer patients. For each of 34 patients, automatically generated VMAT plans (autoVMAT) were compared to manually generated, clinically delivered 9-be...

  8. Fully automated MR liver volumetry using watershed segmentation coupled with active contouring.

    Science.gov (United States)

    Huynh, Hieu Trung; Le-Trong, Ngoc; Bao, Pham The; Oto, Aytek; Suzuki, Kenji

    2017-02-01

    Our purpose is to develop a fully automated scheme for liver volume measurement in abdominal MR images, without requiring any user input or interaction. The proposed scheme is fully automatic for liver volumetry from 3D abdominal MR images, and it consists of three main stages: preprocessing, rough liver shape generation, and liver extraction. The preprocessing stage reduced noise and enhanced the liver boundaries in 3D abdominal MR images. The rough liver shape was revealed fully automatically by using the watershed segmentation, thresholding transform, morphological operations, and statistical properties of the liver. An active contour model was applied to refine the rough liver shape to precisely obtain the liver boundaries. The liver volumes calculated by the proposed scheme were compared to the "gold standard" references which were estimated by an expert abdominal radiologist. The liver volumes computed by using our developed scheme excellently agreed (Intra-class correlation coefficient was 0.94) with the "gold standard" manual volumes by the radiologist in the evaluation with 27 cases from multiple medical centers. The running time was 8.4 min per case on average. We developed a fully automated liver volumetry scheme in MR, which does not require any interaction by users. It was evaluated with cases from multiple medical centers. The liver volumetry performance of our developed system was comparable to that of the gold standard manual volumetry, and it saved radiologists' time for manual liver volumetry of 24.7 min per case.
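
    The three-stage idea (denoise, derive a rough liver shape from thresholding, watershed and morphology, then refine it with an active contour) can be sketched with scikit-image as below. All parameter choices and the specific operations are illustrative assumptions, not the scheme's actual implementation.

        import numpy as np
        from scipy import ndimage as ndi
        from skimage import filters, morphology, segmentation

        def rough_liver_mask(volume):
            """Rough liver shape: smoothing, Otsu threshold, watershed from
            distance-transform markers, then keep the largest component."""
            smoothed = ndi.median_filter(volume, size=3)
            binary = smoothed > filters.threshold_otsu(smoothed)
            distance = ndi.distance_transform_edt(binary)
            markers, _ = ndi.label(distance > 0.5 * distance.max())
            labels = segmentation.watershed(-distance, markers, mask=binary)
            largest = np.argmax(np.bincount(labels.ravel())[1:]) + 1
            return morphology.binary_closing(labels == largest)

        def refine_liver_mask(volume, rough, iterations=30):
            """Active-contour refinement of the rough shape (illustrative)."""
            return segmentation.morphological_chan_vese(
                volume, iterations, init_level_set=rough.astype(np.int8)).astype(bool)

        def liver_volume_ml(mask, voxel_mm3):
            return mask.sum() * voxel_mm3 / 1000.0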

  9. TreeRipper web application: towards a fully automated optical tree recognition software

    Directory of Open Access Journals (Sweden)

    Hughes Joseph

    2011-05-01

    Full Text Available Abstract Background Relationships between species, genes and genomes have been printed as trees for over a century. Whilst this may have been the best format for exchanging and sharing phylogenetic hypotheses during the 20th century, the worldwide web now provides faster and automated ways of transferring and sharing phylogenetic knowledge. However, novel software is needed to defrost these published phylogenies for the 21st century. Results TreeRipper is a simple website for the fully-automated recognition of multifurcating phylogenetic trees (http://linnaeus.zoology.gla.ac.uk/~jhughes/treeripper/). The program accepts a range of input image formats (PNG, JPG/JPEG or GIF). The underlying command-line C++ program follows a number of cleaning steps to detect lines, remove node labels, patch up broken lines and corners and detect line edges. The edge contour is then determined to detect the branch length, tip label positions and the topology of the tree. Optical Character Recognition (OCR) is used to convert the tip labels into text with the freely available tesseract-ocr software. 32% of images meeting the prerequisites for TreeRipper were successfully recognised, the largest tree had 115 leaves. Conclusions Despite the diversity of ways phylogenies have been illustrated, which makes the design of fully automated tree recognition software difficult, TreeRipper is a step towards automating the digitization of past phylogenies. We also provide a dataset of 100 tree images and associated tree files for training and/or benchmarking future software. TreeRipper is an open source project licensed under the GNU General Public Licence v3.
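
    Of the pipeline steps listed above, the OCR of tip labels is easy to illustrate with the tesseract engine via pytesseract; the label_boxes argument stands in for the output of TreeRipper's earlier line- and contour-detection steps, which are not reproduced here.

        from PIL import Image
        import pytesseract

        def read_tip_labels(image_path, label_boxes):
            """OCR each tip-label region of a phylogeny image.
            label_boxes: list of (left, upper, right, lower) pixel boxes assumed
            to come from an earlier line/contour-detection step."""
            page = Image.open(image_path).convert("L")   # greyscale helps OCR
            labels = []
            for box in label_boxes:
                crop = page.crop(box)
                text = pytesseract.image_to_string(crop, config="--psm 7").strip()
                labels.append(text)
            return labels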

  10. Current advances and strategies towards fully automated sample preparation for regulated LC-MS/MS bioanalysis.

    Science.gov (United States)

    Zheng, Naiyu; Jiang, Hao; Zeng, Jianing

    2014-09-01

    Robotic liquid handlers (RLHs) have been widely used in automated sample preparation for liquid chromatography-tandem mass spectrometry (LC-MS/MS) bioanalysis. Automated sample preparation for regulated bioanalysis offers significantly higher assay efficiency, better data quality and potential bioanalytical cost-savings. For RLHs that are used for regulated bioanalysis, there are additional requirements, including 21 CFR Part 11 compliance, software validation, system qualification, calibration verification and proper maintenance. This article reviews recent advances in automated sample preparation for regulated bioanalysis in the last 5 years. Specifically, it covers the following aspects: regulated bioanalysis requirements, recent advances in automation hardware and software development, sample extraction workflow simplification, strategies towards fully automated sample extraction, and best practices in automated sample preparation for regulated bioanalysis.

  11. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn [Seoul National University Hospital, Seoul (Korea, Republic of)

    2013-02-15

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparing with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and measured ex-vivo liver volume, which was converted from weight, using analysis of variance test and Pearson's or Spearman correlation test. Processing time for both automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between automated volume and manual volume of the total liver (p = 0.011). There were good correlations between automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver). Both correlation coefficients were higher than those with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  12. Feasibility of Commercially Available, Fully Automated Hepatic CT Volumetry for Assessing Both Total and Territorial Liver Volumes in Liver Transplantation

    International Nuclear Information System (INIS)

    Shin, Cheong Il; Kim, Se Hyung; Rhim, Jung Hyo; Yi, Nam Joon; Suh, Kyung Suk; Lee, Jeong Min; Han, Joon Koo; Choi, Byung Ihn

    2013-01-01

    To assess the feasibility of commercially-available, fully automated hepatic CT volumetry for measuring both total and territorial liver volumes by comparing with interactive manual volumetry and measured ex-vivo liver volume. For the assessment of total and territorial liver volume, portal phase CT images of 77 recipients and 107 donors who donated right hemiliver were used. Liver volume was measured using both the fully automated and interactive manual methods with Advanced Liver Analysis software. The quality of the automated segmentation was graded on a 4-point scale. Grading was performed by two radiologists in consensus. For the cases with excellent-to-good quality, the accuracy of automated volumetry was compared with interactive manual volumetry and measured ex-vivo liver volume, which was converted from weight, using analysis of variance test and Pearson's or Spearman correlation test. Processing time for both automated and interactive manual methods was also compared. Excellent-to-good quality of automated segmentation for total liver and right hemiliver was achieved in 57.1% (44/77) and 17.8% (19/107), respectively. For both total and right hemiliver volumes, there were no significant differences among automated, manual, and ex-vivo volumes except between automated volume and manual volume of the total liver (p = 0.011). There were good correlations between automated volume and ex-vivo liver volume (γ = 0.637 for total liver and γ = 0.767 for right hemiliver). Both correlation coefficients were higher than those with the manual method. Fully automated volumetry required significantly less time than the interactive manual method (total liver: 48.6 sec vs. 53.2 sec, right hemiliver: 182 sec vs. 244.5 sec). Fully automated hepatic CT volumetry is feasible and time-efficient for total liver volume measurement. However, its usefulness for territorial liver volumetry needs to be improved.

  13. Development of a Fully-Automated Monte Carlo Burnup Code Monteburns

    International Nuclear Information System (INIS)

    Poston, D.I.; Trellue, H.R.

    1999-01-01

    Several computer codes have been developed to perform nuclear burnup calculations over the past few decades. In addition, because of advances in computer technology, it recently has become more desirable to use Monte Carlo techniques for such problems. Monte Carlo techniques generally offer two distinct advantages over discrete ordinate methods: (1) the use of continuous energy cross sections and (2) the ability to model detailed, complex, three-dimensional (3-D) geometries. These advantages allow more accurate burnup results to be obtained, provided that the user possesses the required computing power (which is required for discrete ordinate methods as well). Several linkage codes have been written that combine a Monte Carlo N-particle transport code (such as MCNP™) with a radioactive decay and burnup code. This paper describes one such code that was written at Los Alamos National Laboratory: monteburns. Monteburns links MCNP with the isotope generation and depletion code ORIGEN2. The basis for the development of monteburns was the need for a fully automated code that could perform accurate burnup (and other) calculations for any 3-D system (accelerator-driven or a full reactor core). Before the initial development of monteburns, a list of desired attributes was made: the code should be fully automated (that is, after the input is set up, no further user interaction is required); it should allow for the irradiation of several materials concurrently (each material is evaluated collectively in MCNP and burned separately in ORIGEN2); it should allow the transfer of materials (shuffling) between regions in MCNP; it should allow any materials to be added or removed before, during, or after each step in an automated fashion; it should not require the user to provide input for ORIGEN2 and should have minimal MCNP input file requirements (other than a working MCNP deck); and it should be relatively easy to use

  14. Development of Fully Automated Low-Cost Immunoassay System for Research Applications.

    Science.gov (United States)

    Wang, Guochun; Das, Champak; Ledden, Bradley; Sun, Qian; Nguyen, Chien

    2017-10-01

    Enzyme-linked immunosorbent assay (ELISA) automation for routine operation in a small research environment would be very attractive. A portable fully automated low-cost immunoassay system was designed, developed, and evaluated with several protein analytes. It features disposable capillary columns as the reaction sites and uses real-time calibration for improved accuracy. It reduces the overall assay time to less than 75 min with the ability of easy adaptation of new testing targets. The running cost is extremely low due to the nature of automation, as well as reduced material requirements. Details about system configuration, components selection, disposable fabrication, system assembly, and operation are reported. The performance of the system was initially established with a rabbit immunoglobulin G (IgG) assay, and an example of assay adaptation with an interleukin 6 (IL6) assay is shown. This system is ideal for research use, but could work for broader testing applications with further optimization.

  15. Opportunities for Energy Efficiency and Open Automated Demand Response in Wastewater Treatment Facilities in California -- Phase I Report

    Energy Technology Data Exchange (ETDEWEB)

    Lekov, Alex; Thompson, Lisa; McKane, Aimee; Song, Katherine; Piette, Mary Ann

    2009-04-01

    This report summarizes the Lawrence Berkeley National Laboratory's research to date in characterizing energy efficiency and automated demand response opportunities for wastewater treatment facilities in California. The report describes the characteristics of wastewater treatment facilities, the nature of the wastewater stream, energy use and demand, as well as details of the wastewater treatment process. It also discusses control systems and energy efficiency and automated demand response opportunities. In addition, several energy efficiency and load management case studies are provided for wastewater treatment facilities. This study shows that wastewater treatment facilities can be excellent candidates for open automated demand response and that facilities which have implemented energy efficiency measures and have centralized control systems are well-suited to shift or shed electrical loads in response to financial incentives, utility bill savings, and/or opportunities to enhance reliability of service. Control technologies installed for energy efficiency and load management purposes can often be adapted for automated demand response at little additional cost. These improved controls may prepare facilities to be more receptive to open automated demand response due to both increased confidence in the opportunities for controlling energy cost/use and access to the real-time data.

  16. The future of fully automated vehicles : opportunities for vehicle- and ride-sharing, with cost and emissions savings.

    Science.gov (United States)

    2014-08-01

    Fully automated or autonomous vehicles (AVs) hold great promise for the future of transportation. By 2020, Google, auto manufacturers and other technology providers intend to introduce self-driving cars to the public with either limited or fully a...

  17. Fully automated data collection and processing system on macromolecular crystallography beamlines at the PF

    International Nuclear Information System (INIS)

    Yamada, Yusuke; Hiraki, Masahiko; Matsugaki, Naohiro; Chavas, Leonard M.G.; Igarashi, Noriyuki; Wakatsuki, Soichi

    2012-01-01

    A fully automated data collection and processing system has been developed on the macromolecular crystallography beamlines at the Photon Factory. In this system, sample exchange, centering and data collection are performed sequentially for all samples stored in the sample exchange system at a beamline, without any manual operations. Data processing of the collected data sets is also performed automatically. The results are stored in the database system, and users can monitor the progress and results of the automated experiments via a Web browser. (author)

  18. Enabling Automated Dynamic Demand Response: From Theory to Practice

    Energy Technology Data Exchange (ETDEWEB)

    Frincu, Marc; Chelmis, Charalampos; Aman, Saima; Saeed, Rizwan; Zois, Vasileios; Prasanna, Viktor

    2015-07-14

    Demand response (DR) is a technique used in smart grids to shape customer load during peak hours. Automated DR offers utilities a fine grained control and a high degree of confidence in the outcome. However the impact on the customer's comfort means this technique is more suited for industrial and commercial settings than for residential homes. In this paper we propose a system for achieving automated controlled DR in a heterogeneous environment. We present some of the main issues arising in building such a system, including privacy, customer satisfiability, reliability, and fast decision turnaround, with emphasis on the solutions we proposed. Based on the lessons we learned from empirical results we describe an integrated automated system for controlled DR on the USC microgrid. Results show that while on a per building per event basis the accuracy of our prediction and customer selection techniques varies, it performs well on average when considering several events and buildings.
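
    One simple way to frame the customer-selection step mentioned above is a greedy choice of buildings whose predicted sheds cover a requested curtailment; the sketch below shows that idea only and is not the selection algorithm used on the USC microgrid.

        def select_buildings(predicted_shed_kw, target_kw):
            """Greedy selection: pick buildings with the largest predicted shed
            until the requested curtailment is covered.
            predicted_shed_kw: dict mapping building id to expected kW reduction."""
            selected, covered = [], 0.0
            for building, shed in sorted(predicted_shed_kw.items(),
                                         key=lambda kv: kv[1], reverse=True):
                if covered >= target_kw:
                    break
                selected.append(building)
                covered += shed
            return selected, covered

        # Example with made-up forecasts:
        print(select_buildings({"A": 40.0, "B": 25.0, "C": 60.0}, target_kw=75.0))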

  19. A Fully Automated High-Throughput Flow Cytometry Screening System Enabling Phenotypic Drug Discovery.

    Science.gov (United States)

    Joslin, John; Gilligan, James; Anderson, Paul; Garcia, Catherine; Sharif, Orzala; Hampton, Janice; Cohen, Steven; King, Miranda; Zhou, Bin; Jiang, Shumei; Trussell, Christopher; Dunn, Robert; Fathman, John W; Snead, Jennifer L; Boitano, Anthony E; Nguyen, Tommy; Conner, Michael; Cooke, Mike; Harris, Jennifer; Ainscow, Ed; Zhou, Yingyao; Shaw, Chris; Sipes, Dan; Mainquist, James; Lesley, Scott

    2018-05-01

    The goal of high-throughput screening is to enable screening of compound libraries in an automated manner to identify quality starting points for optimization. This often involves screening a large diversity of compounds in an assay that preserves a connection to the disease pathology. Phenotypic screening is a powerful tool for drug identification, in that assays can be run without prior understanding of the target and with primary cells that closely mimic the therapeutic setting. Advanced automation and high-content imaging have enabled many complex assays, but these are still relatively slow and low throughput. To address this limitation, we have developed an automated workflow that is dedicated to processing complex phenotypic assays for flow cytometry. The system can achieve a throughput of 50,000 wells per day, resulting in a fully automated platform that enables robust phenotypic drug discovery. Over the past 5 years, this screening system has been used for a variety of drug discovery programs, across many disease areas, with many molecules advancing quickly into preclinical development and into the clinic. This report will highlight a diversity of approaches that automated flow cytometry has enabled for phenotypic drug discovery.

  20. Field demonstration of automated demand response for both winter and summer events in large buildings in the Pacific Northwest

    Energy Technology Data Exchange (ETDEWEB)

    Piette, M.A.; Kiliccote, S.; Dudley, J.H. [Lawrence Berkeley National Laboratory, Berkeley, CA (United States)

    2013-11-15

    There are growing strains on the electric grid as cooling peaks grow and equipment ages. Increased penetration of renewables on the grid is also straining electricity supply systems, and the need for flexible demand is growing. This paper summarizes results of a series of field tests of automated demand response systems in large buildings in the Pacific Northwest. The objective of the research was twofold. One objective was to evaluate the use of demand response automation technologies. A second objective was to evaluate control strategies that could change the electric load shape in both winter and summer conditions. Winter conditions focused on cold winter mornings, a time when the electric grid is often stressed. The summer tests evaluated DR strategies in the afternoon. We found that we could automate both winter and summer control strategies with the open automated demand response communication standard. The buildings were able to provide significant demand response in both winter and summer events.

  1. Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA

    NARCIS (Netherlands)

    Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald

    2017-01-01

    To study fully automated digital joint space width (JSW) and bone mineral density (BMD) in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the

  2. Automated Critical Peak Pricing Field Tests: Program Descriptionand Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila; Xu, Peng

    2006-04-06

    California utilities have been exploring the use of critical peak prices (CPP) to help reduce needle peaks in customer end-use loads. CPP is a form of price-responsive demand response (DR). Recent experience has shown that customers have limited knowledge of how to operate their facilities in order to reduce their electricity costs under CPP (Quantum 2004). While the lack of knowledge about how to develop and implement DR control strategies is a barrier to participation in DR programs like CPP, another barrier is the lack of automation of DR systems. During 2003 and 2004, the PIER Demand Response Research Center (DRRC) conducted a series of tests of fully automated electric demand response (Auto-DR) at 18 facilities. Overall, the average of the site-specific average coincident demand reductions was 8% from a variety of building types and facilities. Many electricity customers have suggested that automation will help them institutionalize their electric demand savings and improve their overall response and DR repeatability. This report focuses on and discusses the specific results of the Automated Critical Peak Pricing (Auto-CPP, a specific type of Auto-DR) tests that took place during 2005, which build on the automated demand response (Auto-DR) research conducted through PIER and the DRRC in 2003 and 2004. The long-term goal of this project is to understand the technical opportunities of automating demand response and to remove technical and market impediments to large-scale implementation of automated demand response (Auto-DR) in buildings and industry. A second goal of this research is to understand and identify best practices for DR strategies and opportunities. The specific objectives of the Automated Critical Peak Pricing test were as follows: (1) Demonstrate how an automated notification system for critical peak pricing can be used in large commercial facilities for demand response (DR). (2) Evaluate effectiveness of such a system. (3) Determine how customers

  3. A Distributed Intelligent Automated Demand Response Building Management System

    Energy Technology Data Exchange (ETDEWEB)

    Auslander, David [Univ. of California, Berkeley, CA (United States); Culler, David [Univ. of California, Berkeley, CA (United States); Wright, Paul [Univ. of California, Berkeley, CA (United States); Lu, Yan [Siemens Corporate Research Inc., Princeton, NJ (United States); Piette, Mary [Univ. of California, Berkeley, CA (United States)

    2013-03-31

    The goal of the 2.5-year Distributed Intelligent Automated Demand Response (DIADR) project was to reduce the peak electricity load of Sutardja Dai Hall at UC Berkeley by 30% while maintaining a healthy, comfortable, and productive environment for the occupants. We sought to bring together both central and distributed control to provide “deep” demand response at the appliance level of the building as well as typical lighting and HVAC applications. This project brought together Siemens Corporate Research and Siemens Building Technology (the building has a Siemens Apogee Building Automation System (BAS)), Lawrence Berkeley National Laboratory (leveraging their Open Automated Demand Response (openADR), Auto-Demand Response, and building modeling expertise), and UC Berkeley (related demand response research including distributed wireless control, and grid-to-building gateway development). Sutardja Dai Hall houses the Center for Information Technology Research in the Interest of Society (CITRIS), which fosters collaboration among industry and faculty and students of four UC campuses (Berkeley, Davis, Merced, and Santa Cruz). The 141,000 square foot building, occupied in 2009, includes typical office spaces and a nanofabrication laboratory. Heating is provided by a district heating system (steam from campus as a byproduct of the campus cogeneration plant); cooling is provided by one of two chillers: a more typical electric centrifugal compressor chiller designed for the cool months (Nov-March) and a steam absorption chiller for use in the warm months (April-October). Lighting in the open office areas is provided by direct-indirect luminaires with Building Management System-based scheduling for open areas, and occupancy sensors for private office areas. For the purposes of this project, we focused on the office portion of the building. Annual energy consumption is approximately 8053 MWh; the office portion is estimated as 1924 MWh. The maximum peak load

  4. Fully Automated Driving: Impact of Trust and Practice on Manual Control Recovery.

    Science.gov (United States)

    Payre, William; Cestac, Julien; Delhomme, Patricia

    2016-03-01

    An experiment was performed in a driving simulator to investigate the impacts of practice, trust, and interaction on manual control recovery (MCR) when employing fully automated driving (FAD). To increase the efficiency of partially or highly automated driving use and to improve safety, some studies have addressed trust in driving automation and training, but few studies have focused on FAD. FAD is an autonomous system that has full control of a vehicle without any need for intervention by the driver. A total of 69 drivers with a valid license practiced with FAD. They were distributed evenly across two conditions: simple practice and elaborate practice. When examining emergency MCR, a correlation was found between trust and reaction time in the simple practice group (i.e., higher trust meant a longer reaction time), but not in the elaborate practice group. This result indicated that to mitigate the negative impact of overtrust on reaction time, more appropriate practice may be needed. Drivers should be trained in how the automated device works so as to improve MCR performance in case of an emergency. The practice format used in this study could be used for the first interaction with an FAD car when acquiring such a vehicle. © 2015, Human Factors and Ergonomics Society.

  5. FULLY AUTOMATED GENERATION OF ACCURATE DIGITAL SURFACE MODELS WITH SUB-METER RESOLUTION FROM SATELLITE IMAGERY

    Directory of Open Access Journals (Sweden)

    J. Wohlfeil

    2012-07-01

    Full Text Available Modern pixel-wise image matching algorithms like Semi-Global Matching (SGM) are able to compute high resolution digital surface models from airborne and spaceborne stereo imagery. Although image matching itself can be performed automatically, there are prerequisites, like high geometric accuracy, which are essential for ensuring the high quality of resulting surface models. Especially for line cameras, these prerequisites currently require laborious manual interaction using standard tools, which is a growing problem due to continually increasing demand for such surface models. The tedious work includes partly or fully manual selection of tie- and/or ground control points for ensuring the required accuracy of the relative orientation of images for stereo matching. It also includes masking of large water areas that seriously reduce the quality of the results. Furthermore, a good estimate of the depth range is required, since accurate estimates can seriously reduce the processing time for stereo matching. In this paper an approach is presented that allows performing all these steps fully automatically. It includes very robust and precise tie point selection, enabling the accurate calculation of the images' relative orientation via bundle adjustment. It is also shown how water masking and elevation range estimation can be performed automatically on the basis of freely available SRTM data. Extensive tests with a large number of different satellite images from QuickBird and WorldView are presented as proof of the robustness and reliability of the proposed method.
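
    The automated tie-point step described above (robust feature matching between overlapping images, prior to bundle adjustment) can be sketched with OpenCV as below. This covers only the matching and epipolar filtering; the bundle adjustment, water masking and depth-range estimation from SRTM are outside the sketch, and the parameter values are assumptions rather than the paper's settings.

        import cv2
        import numpy as np

        def tie_points(img1_path, img2_path, ratio=0.75):
            """Detect SIFT features, match them, and keep matches consistent
            with the epipolar geometry (RANSAC on the fundamental matrix)."""
            img1 = cv2.imread(img1_path, cv2.IMREAD_GRAYSCALE)
            img2 = cv2.imread(img2_path, cv2.IMREAD_GRAYSCALE)
            sift = cv2.SIFT_create()
            kp1, des1 = sift.detectAndCompute(img1, None)
            kp2, des2 = sift.detectAndCompute(img2, None)
            matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
            # Lowe ratio test to discard ambiguous matches.
            good = [p[0] for p in matches
                    if len(p) == 2 and p[0].distance < ratio * p[1].distance]
            pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
            pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
            F, inliers = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
            keep = inliers.ravel().astype(bool)
            return pts1[keep], pts2[keep]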

  6. Fully automated synthesis system of 3'-deoxy-3'-[18F]fluorothymidine

    International Nuclear Information System (INIS)

    Oh, Seung Jun; Mosdzianowski, Christoph; Chi, Dae Yoon; Kim, Jung Young; Kang, Se Hun; Ryu, Jin Sook; Yeo, Jeong Seok; Moon, Dae Hyuk

    2004-01-01

    We developed a new fully automated method for the synthesis of 3'-deoxy-3'-[18F]fluorothymidine ([18F]FLT), by modifying a commercial FDG synthesizer and its disposable fluid pathway. The optimal labeling condition was that 40 mg of precursor in acetonitrile (2 mL) was heated at 150 °C for 100 sec, followed by heating at 85 °C for 450 sec and hydrolysis with 1 N HCl at 105 °C for 300 sec. Using 3.7 GBq of [18F]F- as starting activity, [18F]FLT was obtained with a yield of 50.5±5.2% (n=28, decay corrected) within 60.0±5.4 min including HPLC purification. With 37.0 GBq, we obtained 48.7±5.6% (n=10). The [18F]FLT showed good stability for 6 h. This new automated synthesis procedure combines high and reproducible yields with the benefits of a disposable cassette system

  7. Fully automated joint space width measurement and digital X-ray radiogrammetry in early RA.

    Science.gov (United States)

    Platten, Michael; Kisten, Yogan; Kälvesten, Johan; Arnaud, Laurent; Forslind, Kristina; van Vollenhoven, Ronald

    2017-01-01

    To study fully automated digital joint space width (JSW) and bone mineral density (BMD) in relation to a conventional radiographic scoring method in early rheumatoid arthritis (eRA). Radiographs scored by the modified Sharp van der Heijde score (SHS) in patients with eRA were acquired from the SWEdish FarmacOTherapy study. Fully automated JSW measurements of bilateral metacarpals 2, 3 and 4 were compared with the joint space narrowing (JSN) score in SHS. Multilevel mixed model statistics were applied to calculate the significance of the association between ΔJSW and ΔBMD over 1 year, and the JSW differences between damaged and undamaged joints as evaluated by the JSN. Based on 576 joints of 96 patients with eRA, a significant reduction in the JSW was observed from baseline to 1 year, from 1.69 (±0.19) mm to 1.66 (±0.19) mm. The JSW also differed between undamaged and damaged (JSN score >0) joints: 1.68 mm (95% CI 1.70 to 1.67) vs 1.54 mm (95% CI 1.63 to 1.46). Similarly, the unadjusted multilevel model showed significant differences in JSW between undamaged (1.68 mm (95% CI 1.72 to 1.64)) and damaged joints (1.63 mm (95% CI 1.68 to 1.58)) (p=0.0048). This difference remained significant in the adjusted model: 1.66 mm (95% CI 1.70 to 1.61) vs 1.62 mm (95% CI 1.68 to 1.56) (p=0.042). Measuring the JSW with this fully automated digital tool may be useful as a quick and observer-independent application for evaluating cartilage damage in eRA. NCT00764725.

  8. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS.

    Science.gov (United States)

    Zarate, Erica; Boyle, Veronica; Rupprecht, Udo; Green, Saras; Villas-Boas, Silas G; Baker, Philip; Pinu, Farhana R

    2016-12-29

    Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so reduction in recoveries occurs over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers and we also evaluate a commercial software, Maestro available from Gerstel GmbH. Because of automation, there was no waiting time of derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughputs. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the very best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  9. A fully automated conversational agent for promoting mental well-being: A pilot RCT using mixed methods

    Directory of Open Access Journals (Sweden)

    Kien Hoa Ly

    2017-12-01

    Full Text Available Fully automated self-help interventions can serve as highly cost-effective mental health promotion tools for massive amounts of people. However, these interventions are often characterised by poor adherence. One way to address this problem is to mimic therapy support by a conversational agent. The objectives of this study were to assess the effectiveness and adherence of a smartphone app delivering strategies used in positive psychology and CBT interventions via an automated chatbot (Shim) for a non-clinical population, as well as to explore participants' views and experiences of interacting with this chatbot. A total of 28 participants were randomized to either receive the chatbot intervention (n = 14) or to a wait-list control group (n = 14). Findings revealed that participants who adhered to the intervention (n = 13) showed significant interaction effects of group and time on psychological well-being (FS) and perceived stress (PSS-10) compared to the wait-list control group, with small to large between-group effect sizes (Cohen's d range 0.14–1.06). Also, the participants showed high engagement during the 2-week long intervention, with an average open app ratio of 17.71 times for the whole period. This is higher compared to other studies on fully automated interventions claiming to be highly engaging, such as Woebot and the Panoply app. The qualitative data revealed sub-themes which, to our knowledge, have not been found previously, such as the moderating format of the chatbot. The results of this study, in particular the good adherence rate, validated the usefulness of replicating this study in the future with a larger sample size and an active control group. This is important, as the search for fully automated, yet highly engaging and effective digital self-help interventions for promoting mental health is crucial for public health.

  10. Performance of an Artificial Multi-observer Deep Neural Network for Fully Automated Segmentation of Polycystic Kidneys.

    Science.gov (United States)

    Kline, Timothy L; Korfiatis, Panagiotis; Edwards, Marie E; Blais, Jaime D; Czerwiec, Frank S; Harris, Peter C; King, Bernard F; Torres, Vicente E; Erickson, Bradley J

    2017-08-01

    Deep learning techniques are being rapidly applied to medical imaging tasks-from organ and lesion segmentation to tissue and tumor classification. These techniques are becoming the leading algorithmic approaches to solve inherently difficult image processing tasks. Currently, the most critical requirement for successful implementation lies in the need for relatively large datasets that can be used for training the deep learning networks. Based on our initial studies of MR imaging examinations of the kidneys of patients affected by polycystic kidney disease (PKD), we have generated a unique database of imaging data and corresponding reference standard segmentations of polycystic kidneys. In the study of PKD, segmentation of the kidneys is needed in order to measure total kidney volume (TKV). Automated methods to segment the kidneys and measure TKV are needed to increase measurement throughput and alleviate the inherent variability of human-derived measurements. We hypothesize that deep learning techniques can be leveraged to perform fast, accurate, reproducible, and fully automated segmentation of polycystic kidneys. Here, we describe a fully automated approach for segmenting PKD kidneys within MR images that simulates a multi-observer approach in order to create an accurate and robust method for the task of segmentation and computation of TKV for PKD patients. A total of 2000 cases were used for training and validation, and 400 cases were used for testing. The multi-observer ensemble method had mean ± SD percent volume difference of 0.68 ± 2.2% compared with the reference standard segmentations. The complete framework performs fully automated segmentation at a level comparable with interobserver variability and could be considered as a replacement for the task of segmentation of PKD kidneys by a human.
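
    The ensemble idea (combine the outputs of several trained networks into one segmentation and then compute total kidney volume from voxel spacing) can be sketched as below; the averaging rule, the 0.5 threshold, and the function names are assumptions for illustration, not the paper's exact combination scheme.

        import numpy as np

        def ensemble_segmentation(prob_maps, threshold=0.5):
            """Average per-network probability maps (a simple multi-observer
            ensemble) and threshold to a binary kidney mask."""
            mean_prob = np.mean(np.stack(prob_maps, axis=0), axis=0)
            return mean_prob > threshold

        def total_kidney_volume_ml(mask, voxel_size_mm):
            """TKV in millilitres from a binary mask and voxel spacing in mm."""
            voxel_mm3 = float(np.prod(voxel_size_mm))
            return mask.sum() * voxel_mm3 / 1000.0

        def percent_volume_difference(auto_ml, reference_ml):
            return 100.0 * (auto_ml - reference_ml) / reference_ml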

  11. Effectiveness of a Web-Based Screening and Fully Automated Brief Motivational Intervention for Adolescent Substance Use

    DEFF Research Database (Denmark)

    Arnaud, Nicolas; Baldus, Christiane; Elgán, Tobias H.

    2016-01-01

    ... of substance use among college students. However, the evidence is sparse among adolescents with at-risk use of alcohol and other drugs. Objective: This study evaluated the effectiveness of a targeted and fully automated Web-based brief motivational intervention with no face-to-face components on substance use ... methods and screened online for at-risk substance use using the CRAFFT (Car, Relax, Alone, Forget, Friends, Trouble) screening instrument. Participants were randomized to a single-session brief motivational intervention group or an assessment-only control group but not blinded. Primary outcome ..., and polydrug use. All outcome analyses were conducted with and without Expectation Maximization (EM) imputation of missing follow-up data. Results: In total, 2673 adolescents were screened and 1449 (54.2%) participants were randomized to the intervention or control group. After 3 months, 211 adolescents (14...). Conclusions: Although the study is limited by a large drop-out, significant between-group effects for alcohol use indicate that targeted brief motivational intervention in a fully automated Web-based format can be effective to reduce drinking and lessen existing substance use service barriers for at-...

  13. Fully Automated Trimethylsilyl (TMS) Derivatisation Protocol for Metabolite Profiling by GC-MS

    Directory of Open Access Journals (Sweden)

    Erica Zarate

    2016-12-01

    Full Text Available Gas Chromatography-Mass Spectrometry (GC-MS) has long been used for metabolite profiling of a wide range of biological samples. Many derivatisation protocols are already available and among these, trimethylsilyl (TMS) derivatisation is one of the most widely used in metabolomics. However, most TMS methods rely on off-line derivatisation prior to GC-MS analysis. In the case of manual off-line TMS derivatisation, the derivative created is unstable, so recoveries decrease over time. Thus, derivatisation is carried out in small batches. Here, we present a fully automated TMS derivatisation protocol using robotic autosamplers, and we also evaluate commercial software, Maestro, available from Gerstel GmbH. Because of automation, there was no waiting time for derivatised samples on the autosamplers, thus reducing degradation of unstable metabolites. Moreover, this method allowed us to overlap samples and improved throughput. We compared data obtained from both manual and automated TMS methods performed on three different matrices, including a standard mix, wine, and plasma samples. The automated TMS method showed better reproducibility and higher peak intensity for most of the identified metabolites than the manual derivatisation method. We also validated the automated method using 114 quality control plasma samples. Additionally, we showed that this online method was highly reproducible for most of the metabolites detected and identified (RSD < 20%) and specifically achieved excellent results for sugars, sugar alcohols, and some organic acids. To the best of our knowledge, this is the first time that the automated TMS method has been applied to analyse a large number of complex plasma samples. Furthermore, we found that this method was highly applicable for routine metabolite profiling (both targeted and untargeted) in any metabolomics laboratory.

  14. Fully automated bone mineral density assessment from low-dose chest CT

    Science.gov (United States)

    Liu, Shuang; Gonzalez, Jessica; Zulueta, Javier; de-Torres, Juan P.; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    A fully automated system is presented for bone mineral density (BMD) assessment from low-dose chest CT (LDCT). BMD assessment is central in the diagnosis and follow-up therapy monitoring of osteoporosis, which is characterized by low bone density and is estimated to affect 12.3 million people in the US aged 50 years or older, creating tremendous social and economic burdens. BMD assessment from DXA scans (BMDDXA) is currently the most widely used and gold-standard technique for the diagnosis of osteoporosis and bone fracture risk estimation. With the recent large-scale implementation of annual lung cancer screening using LDCT, great potential emerges for concurrent opportunistic osteoporosis screening. In the presented BMDCT assessment system, each vertebral body is first segmented and labeled with its anatomical name. Various 3D regions of interest (ROIs) inside the vertebral body are then explored for BMDCT measurements at different vertebral levels. The system was validated using 76 pairs of DXA and LDCT scans of the same subject. Average BMDDXA of L1-L4 was used as the reference standard. A statistically significant correlation is obtained between BMDDXA and BMDCT at all vertebral levels (T1-L2). A Pearson correlation of 0.857 was achieved between BMDDXA and the average BMDCT of T9-T11 by using a 3D ROI taking into account both trabecular and cortical bone tissue. These encouraging results demonstrate the feasibility of fully automated quantitative BMD assessment and the potential of opportunistic osteoporosis screening concurrent with lung cancer screening using LDCT.
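
    As a worked illustration of the validation step, the Pearson correlation between the DXA reference values and the CT-derived ROI measurements can be computed directly; the four-subject arrays below are placeholders, not study data.

```python
# Minimal Pearson correlation between BMD_DXA (reference) and BMD_CT (ROI mean).
import numpy as np

def pearson_r(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))

bmd_dxa = [0.91, 1.02, 0.84, 1.10]      # average L1-L4 BMD from DXA (g/cm^2), made up
bmd_ct = [155.0, 172.0, 140.0, 190.0]   # mean value of a 3D vertebral ROI, made up
print(f"Pearson r = {pearson_r(bmd_dxa, bmd_ct):.3f}")
```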

  15. Fully automated gynecomastia quantification from low-dose chest CT

    Science.gov (United States)

    Liu, Shuang; Sonnenblick, Emily B.; Azour, Lea; Yankelevitz, David F.; Henschke, Claudia I.; Reeves, Anthony P.

    2018-02-01

    Gynecomastia is characterized by the enlargement of male breasts, a common and sometimes distressing condition found in over half of adult men over the age of 44. Although the majority of gynecomastia is physiologic or idiopathic, it may also be associated with an extensive variety of underlying systemic diseases or drug toxicity. With the recent large-scale implementation of annual lung cancer screening using low-dose chest CT (LDCT), gynecomastia is believed to be a frequent incidental finding on LDCT. A fully automated system for gynecomastia quantification from LDCT is presented in this paper. The whole breast region is first segmented using an anatomy-orientated approach based on the propagation of pectoral muscle fronts in the vertical direction. The subareolar region is then localized, and the fibroglandular tissue within it is measured for the assessment of gynecomastia. The presented system was validated using 454 breast regions from non-contrast LDCT scans of 227 adult men. The ground truth was established by an experienced radiologist by classifying each breast into one of five categorical scores. The automated measurements achieved promising performance for gynecomastia diagnosis, with an AUC of 0.86 for the ROC curve and a statistically significant Spearman correlation of r = 0.70 with the radiologist's scores. These results suggest that automated quantification from LDCT could facilitate early detection as well as the treatment of both gynecomastia and the underlying medical problems, if any, that cause gynecomastia.

  16. Implementation of a fully automated process purge-and-trap gas chromatograph at an environmental remediation site

    International Nuclear Information System (INIS)

    Blair, D.S.; Morrison, D.J.

    1997-01-01

    The AQUASCAN, a commercially available, fully automated purge-and-trap gas chromatograph from Sentex Systems Inc., was implemented and evaluated as an in-field, automated monitoring system for contaminated groundwater at an active DOE remediation site in Pinellas, FL. Though the AQUASCAN is designed as a stand-alone process analytical unit, implementation at this site required additional hardware. The hardware included a sample dilution system and a method for delivering standard solution to the gas chromatograph for automated calibration. As a result of the evaluation, the system was determined to be a reliable and accurate instrument. The concentration values reported by the AQUASCAN for methylene chloride, trichloroethylene, and toluene in the Pinellas groundwater were within 20% of reference laboratory values

  17. Automated evaluation of ultrasonic indications. State of the art -development trends. Pt. 1

    International Nuclear Information System (INIS)

    Hansch, M.K.T.; Stegemann, D.

    1994-01-01

    Future requirements of reliability and reproducibility in quality assurance demand computer evaluation of defect indications. The ultrasonic method with its large field of applications and a high potential for automation provides all preconditions for fully automated inspection. The survey proposes several desirable hardware improvements, data acquisition requirements and software configurations. (orig.) [de

  18. UBO Detector - A cluster-based, fully automated pipeline for extracting white matter hyperintensities.

    Science.gov (United States)

    Jiang, Jiyang; Liu, Tao; Zhu, Wanlin; Koncz, Rebecca; Liu, Hao; Lee, Teresa; Sachdev, Perminder S; Wen, Wei

    2018-07-01

    We present 'UBO Detector', a cluster-based, fully automated pipeline for extracting and calculating variables for regions of white matter hyperintensities (WMH) (available for download at https://cheba.unsw.edu.au/group/neuroimaging-pipeline). It takes T1-weighted and fluid attenuated inversion recovery (FLAIR) scans as input, and SPM12 and FSL functions are utilised for pre-processing. The candidate clusters are then generated by FMRIB's Automated Segmentation Tool (FAST). A supervised machine learning algorithm, k-nearest neighbor (k-NN), is applied to determine whether the candidate clusters are WMH or non-WMH. UBO Detector generates both image and text (volumes and the number of WMH clusters) outputs for whole brain, periventricular, deep, and lobar WMH, as well as WMH in arterial territories. The computation time for each brain is approximately 15 min. We validated the performance of UBO Detector by showing a) high segmentation (similarity index (SI) = 0.848) and volumetric (intraclass correlation coefficient (ICC) = 0.985) agreement between the UBO Detector-derived and manually traced WMH; b) highly correlated (r² > 0.9) and steadily increasing WMH volumes over time; and c) significant associations of periventricular (t = 22.591, p < 0.001) and deep (t = 14.523, p < 0.001) WMH volumes generated by UBO Detector with Fazekas rating scores. With parallel computing enabled in UBO Detector, the processing can take advantage of the multi-core CPUs that are commonly available on workstations. In conclusion, UBO Detector is a reliable, efficient and fully automated WMH segmentation pipeline. Copyright © 2018 Elsevier Inc. All rights reserved.
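
    A minimal sketch of the two quoted components, assuming per-cluster features have already been extracted: a k-NN classifier labels candidate clusters as WMH or non-WMH, and the similarity index (Dice) compares the automated mask against a manual tracing. The feature choices and k = 5 are assumptions, not UBO Detector's actual settings.

```python
# Hedged sketch: k-NN cluster classification and similarity index (Dice).
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def classify_clusters(train_features, train_labels, candidate_features, k=5):
    """train_features: (n, d) per-cluster features such as mean FLAIR intensity,
    cluster size and distance to the ventricles; labels are WMH / non-WMH."""
    knn = KNeighborsClassifier(n_neighbors=k)
    knn.fit(train_features, train_labels)
    return knn.predict(candidate_features)

def similarity_index(auto_mask, manual_mask):
    """Dice / SI between a binary automated mask and a manual tracing."""
    auto, manual = np.asarray(auto_mask, bool), np.asarray(manual_mask, bool)
    return 2.0 * np.logical_and(auto, manual).sum() / (auto.sum() + manual.sum())
```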

  19. A Fully Automated Method to Detect and Segment a Manufactured Object in an Underwater Color Image

    Science.gov (United States)

    Barat, Christian; Phlypo, Ronald

    2010-12-01

    We propose a fully automated active contours-based method for the detection and the segmentation of a moored manufactured object in an underwater image. Detection of objects in underwater images is difficult due to the variable lighting conditions and shadows on the object. The proposed technique is based on the information contained in the color maps and uses the visual attention method, combined with a statistical approach for the detection and an active contour for the segmentation of the object to overcome the above problems. In the classical active contour method the region descriptor is fixed and the convergence of the method depends on the initialization. With our approach, this dependence is overcome with an initialization using the visual attention results and a criterion to select the best region descriptor. This approach improves the convergence and the processing time while providing the advantages of a fully automated method.

  20. A fully automated Drosophila olfactory classical conditioning and testing system for behavioral learning and memory assessment.

    Science.gov (United States)

    Jiang, Hui; Hanna, Eriny; Gatto, Cheryl L; Page, Terry L; Bhuva, Bharat; Broadie, Kendal

    2016-03-01

    Aversive olfactory classical conditioning has been the standard method to assess Drosophila learning and memory behavior for decades, yet training and testing are conducted manually under exceedingly labor-intensive conditions. To overcome this severe limitation, a fully automated, inexpensive system has been developed, which allows accurate and efficient Pavlovian associative learning/memory analyses for high-throughput pharmacological and genetic studies. The automated system employs a linear actuator coupled to an odorant T-maze with airflow-mediated transfer of animals between training and testing stages. Odorant, airflow and electrical shock delivery are automatically administered and monitored during training trials. Control software allows operator-input variables to define parameters of Drosophila learning, short-term memory and long-term memory assays. The approach allows accurate learning/memory determinations with operational fail-safes. Automated learning indices (immediately post-training) and memory indices (after 24 h) are comparable to those of traditional manual experiments, while minimizing experimenter involvement. The automated system provides vast improvements over labor-intensive manual approaches, with no experimenter involvement required during either training or testing phases. It provides quality-control tracking of airflow rates, odorant delivery and electrical shock treatments, and an expanded platform for high-throughput studies of combinational drug tests and genetic screens. The design uses inexpensive hardware and software for a total cost of approximately US$500, making it affordable to a wide range of investigators. This study demonstrates the design, construction and testing of a fully automated Drosophila olfactory classical association apparatus to provide low-labor, high-fidelity, quality-monitored, high-throughput and inexpensive learning and memory behavioral assays. Copyright © 2015 Elsevier B.V. All rights reserved.
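
    For context, learning and memory indices in this kind of T-maze assay are typically computed from the fraction of flies that avoid the shock-paired odour; the formula below is a generic formulation, not necessarily the exact index calculated by the authors' control software.

```python
# Generic T-maze learning index: +1 means all flies avoid the punished odour,
# 0 means no learning (random choice), -1 means all flies approach it.
def learning_index(n_avoiding_cs_plus, n_choosing_cs_plus):
    total = n_avoiding_cs_plus + n_choosing_cs_plus
    if total == 0:
        raise ValueError("no flies were scored")
    return (n_avoiding_cs_plus - n_choosing_cs_plus) / total

# Example with made-up counts: 78 flies avoid the punished odour, 22 approach it.
print(learning_index(78, 22))  # 0.56
```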

  1. Fully automated atlas-based hippocampal volumetry for detection of Alzheimer's disease in a memory clinic setting.

    Science.gov (United States)

    Suppa, Per; Anker, Ulrich; Spies, Lothar; Bopp, Irene; Rüegger-Frey, Brigitte; Klaghofer, Richard; Gocke, Carola; Hampel, Harald; Beck, Sacha; Buchert, Ralph

    2015-01-01

    Hippocampal volume is a promising biomarker to enhance the accuracy of the diagnosis of dementia due to Alzheimer's disease (AD). However, whereas hippocampal volume is well studied in patient samples from clinical trials, its value in clinical routine patient care is still rather unclear. The aim of the present study, therefore, was to evaluate fully automated atlas-based hippocampal volumetry for detection of AD in the setting of a secondary care expert memory clinic for outpatients. One hundred consecutive patients with memory complaints were clinically evaluated and categorized into three diagnostic groups: AD, intermediate AD, and non-AD. A software tool based on open source software (Statistical Parametric Mapping, SPM8) was employed for fully automated tissue segmentation and stereotactical normalization of high-resolution three-dimensional T1-weighted magnetic resonance images. Predefined standard masks were used for computation of the grey matter volume of the left and right hippocampus, which was then scaled to the patient's total grey matter volume. The right hippocampal volume provided an area under the receiver operating characteristic curve of 84% for detection of AD patients in the whole sample. This indicates that fully automated MR-based hippocampal volumetry fulfills the requirements of a relevant and feasible core biomarker for detection of AD in everyday patient care in a secondary care memory clinic for outpatients. The software used in the present study has been made freely available as an SPM8 toolbox. It is robust and fast, so that it is easily integrated into routine workflow.

  2. Development of a fully automated software system for rapid analysis/processing of the falling weight deflectometer data.

    Science.gov (United States)

    2009-02-01

    The Office of Special Investigations at the Iowa Department of Transportation (DOT) collects FWD data on a regular basis to evaluate pavement structural conditions. The primary objective of this study was to develop a fully-automated software system for ra...

  3. Fully automated gamma spectrometry gauge observing possible radioactive contamination of melting-shop samples

    International Nuclear Information System (INIS)

    Kroos, J.; Westkaemper, G.; Stein, J.

    1999-01-01

    At Salzgitter AG, several monitoring systems have been installed to check scrap transported by rail and by car. At the moment, scrap transported by ship is reloaded onto wagons and monitored afterwards. In the future, a detection system will be mounted onto a crane for a direct check on scrap upon the departure of the ship. Furthermore, at the Salzgitter AG Central Chemical Laboratory, a fully automated gamma spectrometry gauge is installed in order to detect possible radioactive contamination of the products. The gamma spectrometer is integrated into the automated OE spectrometry line for testing melting-shop samples after the OE spectrometry has been performed. With this technique, the specific activity of selected nuclides and the dose rate are determined. The activity observation is part of the release procedure. The corresponding measurement data are stored in a database for quality management reasons. (author)

  4. A new fully automated FTIR system for total column measurements of greenhouse gases

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-10-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control. First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  5. Fully automated system for Pu measurement by gamma spectrometry of alpha contaminated solid wastes

    International Nuclear Information System (INIS)

    Cresti, P.

    1986-01-01

    A description is given of a fully automated system developed at the Comb/Mepis Laboratories which is based on the detection of specific gamma signatures of Pu isotopes for monitoring the Pu content in 15-25 L containers of low-density (0.1 g/cm³) wastes. The methodological approach is discussed; based on experimental data, an evaluation of the achievable performance (detection limit, precision, accuracy, etc.) is also given.

  6. Fully automated parallel oligonucleotide synthesizer

    Czech Academy of Sciences Publication Activity Database

    Lebl, M.; Burger, Ch.; Ellman, B.; Heiner, D.; Ibrahim, G.; Jones, A.; Nibbe, M.; Thompson, J.; Mudra, Petr; Pokorný, Vít; Poncar, Pavel; Ženíšek, Karel

    2001-01-01

    Vol. 66, No. 8 (2001), pp. 1299-1314 ISSN 0010-0765 Institutional research plan: CEZ:AV0Z4055905 Keywords: automated oligonucleotide synthesizer Subject RIV: CC - Organic Chemistry Impact factor: 0.778, year: 2001

  7. An Automation System for Optimizing a Supply Chain Network Design under the Influence of Demand Uncertainty

    OpenAIRE

    Polany, Rany

    2012-01-01

    This research develops and applies an integrated hierarchical framework for modeling a multi-echelon supply chain network design, under the influence of demand uncertainty. The framework is a layered integration of two levels: macro, high-level scenario planning combined with micro, low-level Monte Carlo simulation of uncertainties in demand. To facilitate rapid simulation of the effects of demand uncertainty, the integrated framework was implemented as a dashboard automation system using Mic...

  8. Microscope image based fully automated stomata detection and pore measurement method for grapevines

    Directory of Open Access Journals (Sweden)

    Hiranya Jayakody

    2017-11-01

    Full Text Available Background: Stomatal behavior in grapevines has been identified as a good indicator of the water stress level and overall health of the plant. Microscope images are often used to analyze stomatal behavior in plants. However, most of the current approaches involve manual measurement of stomatal features. The main aim of this research is to develop a fully automated stomata detection and pore measurement method for grapevines, taking microscope images as the input. The proposed approach, which employs machine learning and image processing techniques, can outperform available manual and semi-automatic methods used to identify and estimate stomatal morphological features. Results: First, a cascade object detection learning algorithm is developed to correctly identify multiple stomata in a large microscopic image. Once the regions of interest which contain stomata are identified and extracted, a combination of image processing techniques is applied to estimate the pore dimensions of the stomata. The stomata detection approach was compared with an existing fully automated template matching technique and a semi-automatic maximum stable extremal regions approach, with the proposed method clearly surpassing the performance of the existing techniques with a precision of 91.68% and an F1-score of 0.85. Next, the morphological features of the detected stomata were measured. Contrary to existing approaches, the proposed image segmentation and skeletonization method allows us to estimate the pore dimensions even in cases where the stomatal pore boundary is only partially visible in the microscope image. A test conducted using 1267 images of stomata showed that the segmentation and skeletonization approach was able to correctly identify the stoma opening 86.27% of the time. Further comparisons made with manually traced stoma openings indicated that the proposed method is able to estimate stomata morphological features with accuracies of 89.03% for area
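
    The detection metrics quoted above (precision of 91.68% and an F1-score of 0.85) follow from the standard definitions; the sketch below shows the calculation with illustrative counts rather than the study's actual detection results.

```python
# Precision, recall and F1 from matched detections (illustrative counts only).
def precision_recall_f1(true_positives, false_positives, false_negatives):
    precision = true_positives / (true_positives + false_positives)
    recall = true_positives / (true_positives + false_negatives)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

print(precision_recall_f1(true_positives=880, false_positives=80, false_negatives=240))
```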

  9. Fully automated synthesis of [18F]fluoro-dihydrotestosterone ([18F]FDHT) using the FlexLab module.

    Science.gov (United States)

    Ackermann, Uwe; Lewis, Jason S; Young, Kenneth; Morris, Michael J; Weickhardt, Andrew; Davis, Ian D; Scott, Andrew M

    2016-08-01

    Imaging of androgen receptor expression in prostate cancer using [18F]FDHT is becoming increasingly popular. With the radiolabelling precursor now commercially available, developing a fully automated synthesis of [18F]FDHT is important. We have fully automated the synthesis of [18F]FDHT using the iPhase FlexLab module using only commercially available components. Total synthesis time was 90 min, and radiochemical yields were 25-33% (n = 11). Radiochemical purity of the final formulation was > 99% and specific activity was > 18.5 GBq/µmol for all batches. This method can be up-scaled as desired, thus making it possible to study multiple patients in a day. Furthermore, our procedure uses only 4 mg of precursor and is therefore cost-effective. The synthesis has now been validated at Austin Health and is currently used for [18F]FDHT studies in patients. We believe that this method can easily be adapted to other modules to further widen the availability of [18F]FDHT. Copyright © 2016 John Wiley & Sons, Ltd.

  10. Improved protein hydrogen/deuterium exchange mass spectrometry platform with fully automated data processing.

    Science.gov (United States)

    Zhang, Zhongqi; Zhang, Aming; Xiao, Gang

    2012-06-05

    Protein hydrogen/deuterium exchange (HDX) followed by protease digestion and mass spectrometric (MS) analysis is accepted as a standard method for studying protein conformation and conformational dynamics. In this article, an improved HDX MS platform with fully automated data processing is described. The platform significantly reduces systematic and random errors in the measurement by introducing two types of corrections in HDX data analysis. First, a mixture of short peptides with fast HDX rates is introduced as internal standards to adjust the variations in the extent of back exchange from run to run. Second, a designed unique peptide (PPPI) with slow intrinsic HDX rate is employed as another internal standard to reflect the possible differences in protein intrinsic HDX rates when protein conformations at different solution conditions are compared. HDX data processing is achieved with a comprehensive HDX model to simulate the deuterium labeling and back exchange process. The HDX model is implemented into the in-house developed software MassAnalyzer and enables fully unattended analysis of the entire protein HDX MS data set starting from ion detection and peptide identification to final processed HDX output, typically within 1 day. The final output of the automated data processing is a set (or the average) of the most probable protection factors for each backbone amide hydrogen. The utility of the HDX MS platform is demonstrated by exploring the conformational transition of a monoclonal antibody by increasing concentrations of guanidine.
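
    The internal-standard correction described above can be illustrated with a generic back-exchange adjustment: a standard peptide with a known maximum deuterium content reports how much label is lost in a given run, and sample uptake values are rescaled by that recovery. This is a simplified formulation for illustration, not the comprehensive HDX model implemented in MassAnalyzer.

```python
# Hedged sketch of a run-to-run back-exchange correction using an internal standard.
def corrected_uptake(measured_uptake, standard_measured, standard_theoretical):
    """Scale measured deuterium uptake by the recovery observed on the standard."""
    recovery = standard_measured / standard_theoretical
    return measured_uptake / recovery

# Example: the standard retains 3.2 of 4.0 exchangeable deuterons (80% recovery),
# so a peptide measured at 2.4 Da uptake is corrected to 3.0 Da.
print(corrected_uptake(2.4, 3.2, 4.0))
```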

  11. A two-dimensionally coincident second difference cosmic ray spike removal method for the fully automated processing of Raman spectra.

    Science.gov (United States)

    Schulze, H Georg; Turner, Robin F B

    2014-01-01

    Charge-coupled device detectors are vulnerable to cosmic rays that can contaminate Raman spectra with positive-going spikes. Because spikes can adversely affect spectral processing and data analyses, they must be removed. Although both hardware-based and software-based spike removal methods exist, they typically require parameter and threshold specification dependent on well-considered user input. Here, we present a fully automated spike removal algorithm that proceeds without requiring user input. It is minimally dependent on sample attributes, and those that are required (e.g., standard deviation of spectral noise) can be determined with other fully automated procedures. At the core of the method is the identification and location of spikes with coincident second derivatives along both the spectral and spatiotemporal dimensions of two-dimensional datasets. The method can be applied to spectra that are relatively inhomogeneous because it provides fairly effective and selective targeting of spikes, resulting in minimal distortion of spectra. Relatively effective spike removal obtained with full automation could provide substantial benefits to users where large numbers of spectra must be processed.
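
    A compact sketch of the core idea, under assumed thresholds: a cosmic-ray spike produces a large-magnitude second difference at the same position along both the spectral axis and the spectrum-to-spectrum axis, whereas genuine Raman bands, which persist from spectrum to spectrum, do not. The threshold factor and the neighbour-interpolation repair are assumptions for illustration.

```python
# Two-dimensionally coincident second-difference spike flagging (illustrative thresholds).
import numpy as np

def find_spikes(spectra, k=8.0):
    """spectra: 2D array, rows = consecutive spectra, columns = wavenumber bins."""
    spectra = np.asarray(spectra, dtype=float)
    d2_spectral = np.zeros_like(spectra)
    d2_temporal = np.zeros_like(spectra)
    d2_spectral[:, 1:-1] = spectra[:, :-2] - 2 * spectra[:, 1:-1] + spectra[:, 2:]
    d2_temporal[1:-1, :] = spectra[:-2, :] - 2 * spectra[1:-1, :] + spectra[2:, :]
    # A sharp positive spike gives a strongly negative second difference at its centre.
    return (d2_spectral < -k * d2_spectral.std()) & (d2_temporal < -k * d2_temporal.std())

def remove_spikes(spectra, mask):
    """Replace flagged points with the mean of their unflagged spectral neighbours."""
    spectra = np.asarray(spectra, dtype=float)
    cleaned = spectra.copy()
    for r, c in zip(*np.nonzero(mask)):
        lo, hi = max(c - 2, 0), min(c + 3, spectra.shape[1])
        good = [spectra[r, i] for i in range(lo, hi) if not mask[r, i]]
        if good:
            cleaned[r, c] = np.mean(good)
    return cleaned
```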

  12. Association between fully automated MRI-based volumetry of different brain regions and neuropsychological test performance in patients with amnestic mild cognitive impairment and Alzheimer's disease.

    Science.gov (United States)

    Arlt, Sönke; Buchert, Ralph; Spies, Lothar; Eichenlaub, Martin; Lehmbeck, Jan T; Jahn, Holger

    2013-06-01

    Fully automated magnetic resonance imaging (MRI)-based volumetry may serve as a biomarker for the diagnosis in patients with mild cognitive impairment (MCI) or dementia. We aimed at investigating the relation between fully automated MRI-based volumetric measures and neuropsychological test performance in amnestic MCI and patients with mild dementia due to Alzheimer's disease (AD) in a cross-sectional and longitudinal study. In order to assess a possible prognostic value of fully automated MRI-based volumetry for future cognitive performance, the rate of change of neuropsychological test performance over time was also tested for its correlation with fully automated MRI-based volumetry at baseline. In 50 subjects, 18 with amnestic MCI, 21 with mild AD, and 11 controls, neuropsychological testing and T1-weighted MRI were performed at baseline and at a mean follow-up interval of 2.1 ± 0.5 years (n = 19). Fully automated MRI volumetry of the grey matter volume (GMV) was performed using a combined stereotactical normalisation and segmentation approach as provided by SPM8 and a set of pre-defined binary lobe masks. Left and right hippocampus masks were derived from probabilistic cytoarchitectonic maps. Volumes of the inner and outer cerebrospinal fluid spaces were also determined automatically from the MRI. Pearson's test was used for the correlation analyses. Left hippocampal GMV was significantly correlated with performance in memory tasks, and left temporal GMV was related to performance in language tasks. Bilateral frontal, parietal and occipital GMVs were correlated with performance in neuropsychological tests comprising multiple domains. The rate of GMV change in the left hippocampus was correlated with decline of performance in the Boston Naming Test (BNT), Mini-Mental Status Examination, and Trail Making Test B (TMT-B). The decrease of BNT and TMT-A performance over time correlated with the loss of grey matter in multiple brain regions. We conclude that fully automated MRI

  13. A new fully automated FTIR system for total column measurements of greenhouse gases

    Directory of Open Access Journals (Sweden)

    M. C. Geibel

    2010-10-01

    Full Text Available This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network (TCCON). It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics.

    Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. The automation software employs a new approach relying on multiple processes, database logging and web-based remote control.

    First results of total column measurements at Jena, Germany show that the instrument works well and can provide parts of the diurnal as well as seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months.

    After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  14. Performance of a fully automated program for measurement of left ventricular ejection fraction

    International Nuclear Information System (INIS)

    Douglass, K.H.; Tibbits, P.; Kasecamp, W.; Han, S.T.; Koller, D.; Links, J.M.; Wagner, H.H. Jr.

    1982-01-01

    A fully automated program developed by us for measurement of left ventricular ejection fraction from equilibrium gated blood pool studies was evaluated in 130 additional patients. Both 6-min (130 studies) and 2-min (142 studies in 31 patients) gated blood pool studies were acquired and processed. The program successfully generated ejection fractions in 86% of the studies. These automatically generated ejection fractions were compared with ejection fractions derived from manually drawn regions of interest. When studies were acquired for 6 min with the patient at rest, the correlation between automated and manual ejection fractions was 0.92. When studies were acquired for 2 min, both at rest and during bicycle exercise, the correlation was 0.81. In 25 studies from patients who also underwent contrast ventriculography, the program successfully generated regions of interest in 22 (88%). The correlation between the ejection fraction determined by contrast ventriculography and the automatically generated radionuclide ejection fraction was 0.79. (orig.)

  15. AutoDrug: fully automated macromolecular crystallography workflows for fragment-based drug discovery

    International Nuclear Information System (INIS)

    Tsai, Yingssu; McPhillips, Scott E.; González, Ana; McPhillips, Timothy M.; Zinn, Daniel; Cohen, Aina E.; Feese, Michael D.; Bushnell, David; Tiefenbrunn, Theresa; Stout, C. David; Ludaescher, Bertram; Hedman, Britt; Hodgson, Keith O.; Soltis, S. Michael

    2013-01-01

    New software has been developed for automating the experimental and data-processing stages of fragment-based drug discovery at a macromolecular crystallography beamline. A new workflow-automation framework orchestrates beamline-control and data-analysis software while organizing results from multiple samples. AutoDrug is software based upon the scientific workflow paradigm that integrates the Stanford Synchrotron Radiation Lightsource macromolecular crystallography beamlines and third-party processing software to automate the crystallography steps of the fragment-based drug-discovery process. AutoDrug screens a cassette of fragment-soaked crystals, selects crystals for data collection based on screening results and user-specified criteria and determines optimal data-collection strategies. It then collects and processes diffraction data, performs molecular replacement using provided models and detects electron density that is likely to arise from bound fragments. All processes are fully automated, i.e. are performed without user interaction or supervision. Samples can be screened in groups corresponding to particular proteins, crystal forms and/or soaking conditions. A single AutoDrug run is only limited by the capacity of the sample-storage dewar at the beamline: currently 288 samples. AutoDrug was developed in conjunction with RestFlow, a new scientific workflow-automation framework. RestFlow simplifies the design of AutoDrug by managing the flow of data and the organization of results and by orchestrating the execution of computational pipeline steps. It also simplifies the execution and interaction of third-party programs and the beamline-control system. Modeling AutoDrug as a scientific workflow enables multiple variants that meet the requirements of different user groups to be developed and supported. A workflow tailored to mimic the crystallography stages comprising the drug-discovery pipeline of CoCrystal Discovery Inc. has been deployed and successfully

  16. Fully Automated Data Collection Using PAM and the Development of PAM/SPACE Reversible Cassettes

    Science.gov (United States)

    Hiraki, Masahiko; Watanabe, Shokei; Chavas, Leonard M. G.; Yamada, Yusuke; Matsugaki, Naohiro; Igarashi, Noriyuki; Wakatsuki, Soichi; Fujihashi, Masahiro; Miki, Kunio; Baba, Seiki; Ueno, Go; Yamamoto, Masaki; Suzuki, Mamoru; Nakagawa, Atsushi; Watanabe, Nobuhisa; Tanaka, Isao

    2010-06-01

    To remotely control and automatically collect data in high-throughput X-ray data collection experiments, the Structural Biology Research Center at the Photon Factory (PF) developed and installed sample exchange robots PAM (PF Automated Mounting system) at the PF macromolecular crystallography beamlines BL-5A, BL-17A, AR-NW12A and AR-NE3A. We developed and installed software that manages the flow of the automated X-ray experiments: sample exchange, loop centering and X-ray diffraction data collection. The fully automated data collection function has been available since February 2009. To identify sample cassettes, PAM employs a two-dimensional bar code reader. New beamlines, BL-1A at the Photon Factory and BL32XU at SPring-8, are currently under construction as part of the Targeted Proteins Research Program (TPRP) by the Ministry of Education, Culture, Sports, Science and Technology of Japan. However, different robots, PAM and SPACE (SPring-8 Precise Automatic Cryo-sample Exchanger), will be installed at BL-1A and BL32XU, respectively. For the convenience of the users of both facilities, pins and cassettes for PAM and SPACE are being developed as part of the TPRP.

  17. Reliability of fully automated versus visually controlled pre- and post-processing of resting-state EEG.

    Science.gov (United States)

    Hatz, F; Hardmeier, M; Bousleiman, H; Rüegg, S; Schindler, C; Fuhr, P

    2015-02-01

    To compare the reliability of a newly developed Matlab® toolbox for the fully automated pre- and post-processing of resting-state EEG (automated analysis, AA) with the reliability of analysis involving visually controlled pre- and post-processing (VA). 34 healthy volunteers (age: median 38.2, range 20-49; 82% female) had three consecutive 256-channel resting-state EEGs at one-year intervals. Results of frequency analysis of AA and VA were compared with Pearson correlation coefficients, and reliability over time was assessed with intraclass correlation coefficients (ICC). The mean correlation coefficient between AA and VA was 0.94±0.07; the mean ICC was 0.83±0.05 for AA and 0.84±0.07 for VA. AA and VA yield very similar results for spectral EEG analysis and are equally reliable. AA is less time-consuming, completely standardized, and independent of raters and their training. Automated processing of EEG facilitates workflow in quantitative EEG analysis. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Self-consistent hybrid functionals for solids: a fully-automated implementation

    Science.gov (United States)

    Erba, A.

    2017-08-01

    A fully-automated algorithm for the determination of the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density-functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated proportionally to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89, 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
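
    The iterative prescription described above reduces to a simple fixed-point loop: compute the static electronic dielectric constant with the current exchange fraction, set the new fraction to its inverse, and repeat until self-consistency. The toy dielectric model below stands in for the SCF + CPHF/KS steps performed by Crystal and is not a real electronic-structure calculation.

```python
# Toy fixed-point iteration for the self-consistent exchange fraction alpha = 1/eps_inf.
def self_consistent_alpha(model_dielectric, alpha0=0.25, tol=1e-4, max_iter=50):
    alpha = alpha0
    for _ in range(max_iter):
        eps_inf = model_dielectric(alpha)   # would require an SCF + CPHF/KS run
        alpha_new = 1.0 / eps_inf           # Skone et al. prescription
        if abs(alpha_new - alpha) < tol:
            return alpha_new
        alpha = alpha_new
    return alpha

# Example with a fictitious, smoothly varying dielectric model eps_inf(alpha) = 5 - 2*alpha:
print(self_consistent_alpha(lambda a: 5.0 - 2.0 * a))  # converges near 0.219
```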

  19. Energy Production System Management - Renewable energy power supply integration with Building Automation System

    International Nuclear Information System (INIS)

    Figueiredo, Joao; Martins, Joao

    2010-01-01

    Intelligent buildings, historically and technologically, refer to the integration of four distinctive systems: Building Automation Systems (BAS), Telecommunication Systems, Office Automation Systems and Computer Building Management Systems. The increasingly sophisticated BAS has become the 'heart and soul' of modern intelligent buildings. Integrating energy supply and demand elements - often known as Demand-Side Management (DSM) - has become an important energy efficiency policy concept. Nowadays, European countries have diversified their power supplies, reducing the dependence on OPEC and developing a broader mix of energy sources that maximizes the use of domestic renewable energy sources. In this way it makes sense to include a fifth system in the intelligent building group: Energy Production System Management (EPSM). This paper presents a Building Automation System where Demand-Side Management is fully integrated with the building's Energy Production System, which incorporates a complete set of renewable energy production and storage systems.

  20. 3D model assisted fully automated scanning laser Doppler vibrometer measurements

    Science.gov (United States)

    Sels, Seppe; Ribbens, Bart; Bogaerts, Boris; Peeters, Jeroen; Vanlanduit, Steve

    2017-12-01

    In this paper, a new fully automated scanning laser Doppler vibrometer (LDV) measurement technique is presented. In contrast to existing scanning LDV techniques, which use a 2D camera for the manual selection of sample points, we use a 3D time-of-flight camera in combination with a CAD file of the test object to automatically obtain measurements at pre-defined locations. The proposed procedure allows users to test prototypes in a shorter time because physical measurement locations are determined without user interaction. Another benefit of this methodology is that it incorporates automatic mapping between a CAD model and the vibration measurements. This mapping can be used to visualize measurements directly on a 3D CAD model. The proposed method is illustrated with vibration measurements of an unmanned aerial vehicle.

  1. Cholgate - a randomized controlled trial comparing the effect of automated and on-demand decision support on the management of cardiovascular disease factors in primary care

    NARCIS (Netherlands)

    J.T. van Wyk (Jacobus); M.A.M. van Wijk (Marc); P.W. Moorman (Peter); M. Mosseveld (Mees); J. van der Lei (Johan)

    2003-01-01

    Automated and on-demand decision support systems integrated into an electronic medical record have proven to be an effective implementation strategy for guidelines. Cholgate is a randomized controlled trial comparing the effect of automated and on-demand decision support on the management of cardiovascular disease risk factors in primary care.

  2. Fully automated chest wall line segmentation in breast MRI by using context information

    Science.gov (United States)

    Wu, Shandong; Weinstein, Susan P.; Conant, Emily F.; Localio, A. Russell; Schnall, Mitchell D.; Kontos, Despina

    2012-03-01

    Breast MRI has emerged as an effective modality for the clinical management of breast cancer. Evidence suggests that computer-aided applications can further improve the diagnostic accuracy of breast MRI. A critical and challenging first step for automated breast MRI analysis is to separate the breast as an organ from the chest wall. Manual segmentation or user-assisted interactive tools are inefficient, tedious, and error-prone, which makes them impractical for processing large amounts of data from clinical trials. To address this challenge, we developed a fully automated and robust computerized segmentation method that intensively utilizes context information of breast MR imaging and the breast tissue's morphological characteristics to accurately delineate the breast and chest wall boundary. A critical component is the joint application of anisotropic diffusion and bilateral image filtering to enhance the edge that corresponds to the chest wall line (CWL) and to reduce the effect of adjacent non-CWL tissues. A CWL voting algorithm is proposed based on CWL candidates yielded from multiple sequential MRI slices, in which a CWL representative is generated and used through a dynamic time warping (DTW) algorithm to filter out inferior candidates, leaving the optimal one. Our method is validated on a representative dataset of 20 3D unilateral breast MRI scans that span the full range of the American College of Radiology (ACR) Breast Imaging Reporting and Data System (BI-RADS) fibroglandular density categorization. A promising performance (average overlap percentage of 89.33%) is observed when the automated segmentation is compared to manually segmented ground truth obtained by an experienced breast imaging radiologist. The automated method runs time-efficiently at ~3 minutes for each breast MR image set (28 slices).

  3. Validation of a fully automated robotic setup for preparation of whole blood samples for LC-MS toxicology analysis

    DEFF Research Database (Denmark)

    Andersen, David Wederkinck; Rasmussen, Brian; Linnet, Kristian

    2012-01-01

    A fully automated setup was developed for preparing whole blood samples using a Tecan Evo workstation. By integrating several add-ons to the robotic platform, the flexible setup was able to prepare samples from sample tubes to a 96-well sample plate ready for injection on liquid chromatography...

  4. Fuzzy inventory model for deteriorating items, with time depended demand, shortages, and fully backlogging

    OpenAIRE

    Wasim Akram Mandal; Sahidul Islam

    2016-01-01

    This paper analyzes a fuzzy inventory system for deteriorating items with time-dependent demand. Shortages are allowed and are fully backlogged. Fixed cost, deterioration cost, shortage cost, and holding cost are the costs considered in this model. Fuzziness is introduced by allowing the cost components (holding cost, deterioration cost, shortage cost, etc.) to be imprecise. In the fuzzy environment, all required parameters are considered to be triangular fuzzy numbers. One numerical solution of the model is obtained...
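
    One common way to handle a triangular fuzzy cost parameter is to reduce it to a crisp value before optimisation; the graded-mean and centroid formulas below are standard defuzzification choices and are given only for illustration, not necessarily the method used in the paper.

```python
# Defuzzification of a triangular fuzzy number (a, b, c), with a <= b <= c.
def graded_mean(a, b, c):
    """Graded mean integration representation."""
    return (a + 4 * b + c) / 6.0

def centroid(a, b, c):
    """Centre of gravity of the triangular membership function."""
    return (a + b + c) / 3.0

# Example: a holding cost believed to be "about 4 per unit, somewhere between 3 and 6".
print(graded_mean(3.0, 4.0, 6.0), centroid(3.0, 4.0, 6.0))
```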

  5. Automated Demand Response Technology Demonstration Project for Small and Medium Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Page, Janie; Kiliccote, Sila; Dudley, Junqiao Han; Piette, Mary Ann; Chiu, Albert K.; Kellow, Bashar; Koch, Ed; Lipkin, Paul

    2011-07-01

    Small and medium commercial customers in California make up about 20-25% of the state's electric peak load. With the roll-out of smart meters to this customer group, which enable granular measurement of electricity consumption, the investor-owned utilities will offer dynamic prices as default tariffs by the end of 2011. Pacific Gas and Electric Company, which successfully deployed Automated Demand Response (AutoDR) programs to its large commercial and industrial customers, started investigating the application of the same infrastructure to small and medium commercial customers. This project aims to identify available technologies suitable for automating demand response in small and medium commercial buildings; to validate the extent to which that technology does what it claims to be able to do; and to determine the extent to which customers find the technology useful for DR purposes. Ten sites, enabled by eight vendors, participated in at least four test AutoDR events per site in the summer of 2010. The results showed that while existing technology can reliably receive OpenADR signals and translate them into pre-programmed response strategies, better load-shed levels than those reported here could likely be obtained if the building systems were better understood and the DR response strategies carefully designed and optimized for each site.

  6. Automated Dynamic Demand Response Implementation on a Micro-grid

    Energy Technology Data Exchange (ETDEWEB)

    Kuppannagari, Sanmukh R.; Kannan, Rajgopal; Chelmis, Charalampos; Prasanna, Viktor K.

    2016-11-16

    In this paper, we describe a system for real-time automated Dynamic and Sustainable Demand Response with sparse data consumption prediction implemented on the University of Southern California campus microgrid. Supply side approaches to resolving energy supply-load imbalance do not work at high levels of renewable energy penetration. Dynamic Demand Response (D2R) is a widely used demand-side technique to dynamically adjust electricity consumption during peak load periods. Our D2R system consists of accurate machine learning based energy consumption forecasting models that work with sparse data coupled with fast and sustainable load curtailment optimization algorithms that provide the ability to dynamically adapt to changing supply-load imbalances in near real-time. Our Sustainable DR (SDR) algorithms attempt to distribute customer curtailment evenly across sub-intervals during a DR event and avoid expensive demand peaks during a few sub-intervals. It also ensures that each customer is penalized fairly in order to achieve the targeted curtailment. We develop near linear-time constant-factor approximation algorithms along with Polynomial Time Approximation Schemes (PTAS) for SDR curtailment that minimizes the curtailment error defined as the difference between the target and achieved curtailment values. Our SDR curtailment problem is formulated as an Integer Linear Program that optimally matches customers to curtailment strategies during a DR event while also explicitly accounting for customer strategy switching overhead as a constraint. We demonstrate the results of our D2R system using real data from experiments performed on the USC smartgrid and show that 1) our prediction algorithms can very accurately predict energy consumption even with noisy or missing data and 2) our curtailment algorithms deliver DR with extremely low curtailment errors in the 0.01-0.05 kWh range.
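
    The objective described above, minimising the gap between the target and achieved curtailment while each customer runs one strategy, can be illustrated with a small greedy pass; the published work formulates this as an ILP with approximation schemes and fairness constraints, so the sketch below only illustrates the curtailment-error objective, not the authors' algorithms.

```python
# Greedy toy assignment of one curtailment strategy per customer for a DR sub-interval.
def greedy_curtailment(strategies_per_customer, target_kwh):
    """strategies_per_customer: list of lists of kWh reductions (0 = no curtailment)."""
    achieved = 0.0
    chosen = []
    for options in strategies_per_customer:
        remaining = target_kwh - achieved
        best = min(options, key=lambda r: abs(remaining - r))  # option that best closes the gap
        chosen.append(best)
        achieved += best
    return chosen, abs(target_kwh - achieved)  # curtailment error

choices, error = greedy_curtailment([[0, 1.5, 3.0], [0, 2.0], [0, 0.5, 4.0]], target_kwh=6.0)
print(choices, error)  # e.g. [3.0, 2.0, 0.5] with an error of 0.5 kWh
```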

  7. Fully automated left ventricular contour detection for gated radionuclide angiography, (1)

    International Nuclear Information System (INIS)

    Hosoba, Minoru; Wani, Hidenobu; Hiroe, Michiaki; Kusakabe, Kiyoko.

    1984-01-01

    A fully automated practical method has been developed to detect the left ventricular (LV) contour from gated pool images. The ejection fraction and volume curve can be computed accurately without operator variance. The characteristics of the method are summarized as follows: 1. Optimal design of a filter that works in the Fourier domain can be achieved to improve the signal-to-noise ratio. 2. A new algorithm that uses the cosine and sine transform images has been developed for separating the ventricle from the atrium and defining the center of the LV. 3. Contrast enhancement by an optimized square filter. 4. Radial profiles are generated from the center of the LV and smoothed by a fourth-order Fourier series approximation. The crossing point with a local threshold value, searched outward from the center of the LV, is defined as the edge. 5. The LV contour is obtained by connecting all the edge points defined on the radial profiles and fitting them to a Fourier function. (author)
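
    Steps 4 and 5 above lend themselves to a short sketch: an intensity profile is cast outward from the LV centre along each angle, smoothed by keeping only the lowest Fourier harmonics, and the first crossing of a local threshold marks the edge. The threshold fraction, profile length and nearest-pixel sampling are assumptions for illustration, not the published parameters.

```python
# Radial-profile edge detection with low-order Fourier smoothing (assumed parameters).
import numpy as np

def fourier_smooth(profile, order=4):
    """Keep only the DC term and the lowest `order` harmonics of a 1D profile."""
    coeffs = np.fft.rfft(profile)
    coeffs[order + 1:] = 0.0
    return np.fft.irfft(coeffs, n=len(profile))

def radial_edge(image, center, angle, n_samples=64, threshold_fraction=0.5):
    cy, cx = center
    radii = np.arange(n_samples)
    ys = np.clip((cy + radii * np.sin(angle)).astype(int), 0, image.shape[0] - 1)
    xs = np.clip((cx + radii * np.cos(angle)).astype(int), 0, image.shape[1] - 1)
    profile = fourier_smooth(image[ys, xs].astype(float))
    threshold = threshold_fraction * profile[0]        # relative to counts at the centre
    below = np.nonzero(profile < threshold)[0]
    return int(below[0]) if below.size else n_samples - 1

def lv_contour(image, center, n_angles=36):
    """Edge radius for each angle; connecting these points outlines the LV."""
    angles = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    return [(a, radial_edge(image, center, a)) for a in angles]
```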

  8. Computer-aided liver volumetry: performance of a fully-automated, prototype post-processing solution for whole-organ and lobar segmentation based on MDCT imaging.

    Science.gov (United States)

    Fananapazir, Ghaneh; Bashir, Mustafa R; Marin, Daniele; Boll, Daniel T

    2015-06-01

    To evaluate the performance of a prototype, fully-automated post-processing solution for whole-liver and lobar segmentation based on MDCT datasets. A polymer liver phantom was used to assess the accuracy of the post-processing application by comparing phantom volumes determined via Archimedes' principle with MDCT-segmented datasets. For the IRB-approved, HIPAA-compliant study, 25 patients were enrolled. Volumetry performance compared the manual approach with the automated prototype, assessing intraobserver variability and interclass correlation for whole-organ and lobar segmentation using ANOVA comparison. Fidelity of segmentation was evaluated qualitatively. Phantom volume was 1581.0 ± 44.7 mL; manually segmented datasets estimated 1628.0 ± 47.8 mL, representing a mean overestimation of 3.0%, while automatically segmented datasets estimated 1601.9 ± 0 mL, representing a mean overestimation of 1.3%. Whole-liver and segmental volumetry demonstrated no significant intraobserver variability for either manual or automated measurements. For whole-liver volumetry, automated measurement repetitions resulted in identical values; reproducible whole-organ volumetry was also achieved with manual segmentation, p(ANOVA) 0.98. For lobar volumetry, automated segmentation improved reproducibility over the manual approach, without significant measurement differences for either methodology, p(ANOVA) 0.95-0.99. Whole-organ and lobar segmentation results from manual and automated segmentation showed no significant differences, p(ANOVA) 0.96-1.00. Assessment of segmentation fidelity found that segments I-IV/VI showed greater segmentation inaccuracies compared to the remaining right hepatic lobe segments. Fully-automated whole-liver segmentation was non-inferior to manual approaches, with improved reproducibility and post-processing duration; automated dual-seed lobar segmentation showed slight tendencies to underestimate the right hepatic lobe

  9. Automated Price and Demand Response Demonstration for Large Customers in New York City using OpenADR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joyce Jihyun; Yin, Rongxin; Kiliccote, Sila

    2013-10-01

    Open Automated Demand Response (OpenADR), an XML-based information exchange model, is used to facilitate continuous price-responsive operation and demand response participation for large commercial buildings in New York that are subject to the default day-ahead hourly pricing. We summarize the existing demand response programs in New York and discuss OpenADR communication, prioritization of demand response signals, and control methods. Building energy simulation models are developed and field tests are conducted to evaluate the continuous energy management and demand response capabilities of two commercial buildings in New York City. Preliminary results reveal that providing machine-readable prices to commercial buildings can facilitate both demand response participation and continuous energy cost savings. Hence, efforts should be made to develop more sophisticated algorithms for building control systems to minimize customers' utility bills based on price and reliability information from the electricity grid.
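
    A hypothetical sketch of the control side: once day-ahead hourly prices arrive as machine-readable OpenADR signals, each price is mapped to a pre-programmed operating mode. The price thresholds and mode names below are illustrative assumptions, not values from the demonstration.

```python
# Map machine-readable hourly prices to pre-programmed response strategies (assumed values).
def select_strategy(price_per_kwh):
    if price_per_kwh >= 0.30:
        return "shed"      # e.g. raise zone setpoints several degrees, dim lighting
    if price_per_kwh >= 0.15:
        return "limit"     # e.g. modest setpoint adjustment, duty-cycle fans
    return "normal"        # no curtailment

# A fabricated day-ahead hourly price vector standing in for the OpenADR payload.
hourly_prices = [0.08] * 12 + [0.18] * 5 + [0.35] * 3 + [0.12] * 4
print([select_strategy(p) for p in hourly_prices])
```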

  10. A fully automated FTIR system for remote sensing of greenhouse gases in the tropics

    Science.gov (United States)

    Geibel, M. C.; Gerbig, C.; Feist, D. G.

    2010-07-01

    This article introduces a new fully automated FTIR system that is part of the Total Carbon Column Observing Network. It will provide continuous ground-based measurements of column-averaged volume mixing ratio for CO2, CH4 and several other greenhouse gases in the tropics. Housed in a 20-foot shipping container it was developed as a transportable system that could be deployed almost anywhere in the world. We describe the automation concept which relies on three autonomous subsystems and their interaction. Crucial components like a sturdy and reliable solar tracker dome are described in detail. First results of total column measurements at Jena, Germany show that the instrument works well and can provide the diurnal as well as the seasonal cycle for CO2. Instrument line shape measurements with an HCl cell suggest that the instrument stays well-aligned over several months. After a short test campaign for side-by-side intercomparison with an existing TCCON instrument in Australia, the system will be transported to its final destination, Ascension Island.

  11. A fully automated entanglement-based quantum cryptography system for telecom fiber networks

    International Nuclear Information System (INIS)

    Treiber, Alexander; Ferrini, Daniele; Huebel, Hannes; Zeilinger, Anton; Poppe, Andreas; Loruenser, Thomas; Querasser, Edwin; Matyus, Thomas; Hentschel, Michael

    2009-01-01

    We present in this paper a quantum key distribution (QKD) system based on polarization entanglement for use in telecom fibers. A QKD exchange up to 50 km was demonstrated in the laboratory with a secure key rate of 550 bits s⁻¹. The system is compact and portable with a fully automated start-up, and stabilization modules for polarization, synchronization and photon coupling allow hands-off operation. Stable and reliable key exchange in a deployed optical fiber of 16 km length was demonstrated. In this fiber network, we achieved over 2 weeks an automatic key generation with an average key rate of 2000 bits s⁻¹ without manual intervention. During this period, the system had an average entanglement visibility of 93%, highlighting the technical level and stability achieved for entanglement-based quantum cryptography.

  12. Fully automated synthesis of 11C-acetate as tumor PET tracer by simple modified solid-phase extraction purification

    International Nuclear Information System (INIS)

    Tang, Xiaolan; Tang, Ganghua; Nie, Dahong

    2013-01-01

    Introduction: Automated synthesis of ¹¹C-acetate (¹¹C-AC), the most commonly used radioactive fatty acid tracer, is performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Methods: Automated synthesis of ¹¹C-AC was implemented by a carboxylation reaction of MeMgBr with ¹¹C-CO₂ on a polyethylene Teflon loop ring, followed by acidic hydrolysis with acid and an SCX cartridge, and purification on SCX, AG11A8 and C18 SPE cartridges using a commercially available ¹¹C-tracer synthesizer. Quality control testing and animal positron emission tomography (PET) imaging were also carried out. Results: A high and reproducible decay-uncorrected radiochemical yield of (41.0±4.6)% (n=10) was obtained from ¹¹C-CO₂ within a total synthesis time of about 8 min. The radiochemical purity of ¹¹C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control testing and PET imaging showed that the ¹¹C-AC injection produced by the simple SPE procedure was safe and efficient, and complied with the current Chinese radiopharmaceutical quality control guidelines. Conclusion: The novel, simple, and rapid method is readily adapted to the fully automated synthesis of ¹¹C-AC on several existing commercial synthesis modules. The method can be used routinely to produce ¹¹C-AC for preclinical and clinical studies with PET imaging. - Highlights: • A fully automated synthesis of ¹¹C-acetate by simple modified solid-phase extraction purification has been developed. • Typical non-decay-corrected yields were (41.0±4.6)% (n=10). • Radiochemical purity was determined by radio-HPLC analysis on a C18 column using a gradient program, instead of an expensive organic acid column or anion column. • QC testing (RCP>99%)

  13. Performance of a fully automated scatterometer for BRDF and BTDF measurements at visible and infrared wavelengths

    International Nuclear Information System (INIS)

    Anderson, S.; Shepard, D.F.; Pompea, S.M.; Castonguay, R.

    1989-01-01

    The general performance of a fully automated scatterometer shows that the instrument can make rapid, accurate BRDF (bidirectional reflectance distribution function) and BTDF (bidirectional transmittance distribution function) measurements of optical surfaces over a range of approximately ten orders of magnitude in BRDF. These measurements can be made for most surfaces even with the detector at the specular angle, because of beam-attenuation techniques. He-Ne and CO₂ lasers are used as sources in conjunction with a reference detector and chopper.

  14. Searching for prostate cancer by fully automated magnetic resonance imaging classification: deep learning versus non-deep learning.

    Science.gov (United States)

    Wang, Xinggang; Yang, Wei; Weinreb, Jeffrey; Han, Juan; Li, Qiubai; Kong, Xiangchuang; Yan, Yongluan; Ke, Zan; Luo, Bo; Liu, Tao; Wang, Liang

    2017-11-13

    Prostate cancer (PCa) is a major cause of death that has been documented since ancient times, including in imaging of an Egyptian Ptolemaic mummy. PCa detection is critical to personalized medicine and varies considerably on MRI scans. 172 patients with 2,602 morphologic images (axial 2D T2-weighted imaging) of the prostate were obtained. A deep learning method using a deep convolutional neural network (DCNN) and a non-deep learning method using SIFT image features with a bag-of-words (BoW) model, a representative approach for image recognition and analysis, were used to distinguish pathologically confirmed PCa patients from patients with prostate benign conditions (BCs) such as prostatitis or benign prostatic hyperplasia (BPH). In fully automated detection of PCa patients, deep learning had a statistically higher area under the receiver operating characteristic curve (AUC) than non-deep learning (P = 0.0007); the AUC of the non-deep learning method was 0.70 (95% CI 0.63-0.77). Our results suggest that deep learning with a DCNN is superior to non-deep learning with SIFT image features and a BoW model for fully automated differentiation of PCa patients from prostate BC patients. Our deep learning method is extensible to image modalities such as MR imaging, CT and PET of other organs.

  15. Photochemical-chemiluminometric determination of aldicarb in a fully automated multicommutation based flow-assembly

    International Nuclear Information System (INIS)

    Palomeque, M.; Garcia Bautista, J.A.; Catala Icardo, M.; Garcia Mateo, J.V.; Martinez Calatayud, J.

    2004-01-01

    A sensitive and fully automated method for the determination of aldicarb in technical formulations (Temik) and mineral waters is proposed. The automation of the flow-assembly is based on the multicommutation approach, which uses a set of solenoid valves acting as independent switches. The operating cycle for obtaining a typical analytical transient signal can be easily programmed by means of in-house software running in the Windows environment. The manifold is provided with a photoreactor consisting of a 150 cm long x 0.8 mm i.d. piece of PTFE tubing coiled around a 20 W low-pressure mercury lamp. The determination of aldicarb is performed on the basis of the iron(III)-catalyzed mineralization of the pesticide by UV irradiation (150 s) and the chemiluminescent (CL) behavior of the photodegraded pesticide in the presence of potassium permanganate and quinine sulphate as sensitizer. UV irradiation turns the very weakly chemiluminescent pesticide into a strongly chemiluminescent photoproduct. The method is linear over the range 2.2-100.0 μg l⁻¹ of aldicarb; the limit of detection is 0.069 μg l⁻¹; the reproducibility (as the R.S.D. of 20 peaks of a 24 μg l⁻¹ solution) is 3.7% and the sample throughput is 17 h⁻¹.

  16. Fully-automated identification of fish species based on otolith contour: using short-time Fourier transform and discriminant analysis (STFT-DA).

    Science.gov (United States)

    Salimi, Nima; Loh, Kar Hoe; Kaur Dhillon, Sarinder; Chong, Ving Ching

    2016-01-01

    Background. Fish species may be identified based on their unique otolith shape or contour. Several pattern recognition methods have been proposed to classify fish species through morphological features of the otolith contours. However, there has been no fully-automated species identification model with an accuracy higher than 80%. The purpose of the current study was to develop a fully-automated model, based on otolith contours, to identify fish species with high classification accuracy. Methods. Images of the right sagittal otoliths of 14 fish species from three families, namely Sciaenidae, Ariidae, and Engraulidae, were used to develop the proposed identification model. The short-time Fourier transform (STFT) was used, for the first time in the area of otolith shape analysis, to extract important features of the otolith contours. Discriminant analysis (DA) was used as the classification technique to train and test the model on the extracted features. Results. Performance of the model was demonstrated using species from the three families separately, as well as all species combined. Overall classification accuracy of the model was greater than 90% in all cases. In addition, the effects of STFT variables on the performance of the identification model were explored. Conclusions. The short-time Fourier transform could determine important features of the otolith outlines. The fully-automated model proposed in this study (STFT-DA) could predict the species of an unknown specimen with acceptable identification accuracy. The model code can be accessed at http://mybiodiversityontologies.um.edu.my/Otolith/ and https://peerj.com/preprints/1517/. The current model has the flexibility to be extended to more species and families in future studies.
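    The following sketch illustrates the general STFT-plus-discriminant-analysis idea described above; it is not the authors' published code (which is linked in the record), and the contour signature, STFT window length, and classifier settings are assumptions.

        # Illustrative STFT feature extraction from an otolith contour followed by
        # linear discriminant analysis; parameters are assumptions, not the paper's.
        import numpy as np
        from scipy.signal import stft
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def contour_signature(xy: np.ndarray, n: int = 256) -> np.ndarray:
            """Radial distance from the centroid, resampled to n points."""
            center = xy.mean(axis=0)
            r = np.linalg.norm(xy - center, axis=1)
            idx = np.linspace(0, len(r) - 1, n).astype(int)
            return r[idx] / r.max()

        def stft_features(signature: np.ndarray, nperseg: int = 64) -> np.ndarray:
            """Magnitude spectrogram of the contour signature, flattened to a feature vector."""
            _, _, Z = stft(signature, nperseg=nperseg)
            return np.abs(Z).ravel()

        def train_classifier(contours, labels):
            """contours: list of (N, 2) arrays of otolith outline points; labels: species names."""
            X = np.array([stft_features(contour_signature(c)) for c in contours])
            return LinearDiscriminantAnalysis().fit(X, labels)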

  17. Bioprocessing automation in cell therapy manufacturing: Outcomes of special interest group automation workshop.

    Science.gov (United States)

    Ball, Oliver; Robinson, Sarah; Bure, Kim; Brindley, David A; Mccall, David

    2018-04-01

    Phacilitate held a Special Interest Group workshop event in Edinburgh, UK, in May 2017. The event brought together leading stakeholders in the cell therapy bioprocessing field to identify present and future challenges and propose potential solutions to automation in cell therapy bioprocessing. Here, we review and summarize discussions from the event. Deep biological understanding of a product, its mechanism of action and indication pathogenesis underpin many factors relating to bioprocessing and automation. To fully exploit the opportunities of bioprocess automation, therapeutics developers must closely consider whether an automation strategy is applicable, how to design an 'automatable' bioprocess and how to implement process modifications with minimal disruption. Major decisions around bioprocess automation strategy should involve all relevant stakeholders; communication between technical and business strategy decision-makers is of particular importance. Developers should leverage automation to implement in-process testing, in turn applicable to process optimization, quality assurance (QA)/quality control (QC), batch failure control, adaptive manufacturing and regulatory demands, but a lack of precedent and technical opportunities can complicate such efforts. Sparse standardization across product characterization, hardware components and software platforms is perceived to complicate efforts to implement automation. The use of advanced algorithmic approaches such as machine learning may have application to bioprocess and supply chain optimization. Automation can substantially de-risk the wider supply chain, including tracking and traceability, cryopreservation and thawing and logistics. The regulatory implications of automation are currently unclear because few hardware options exist and novel solutions require case-by-case validation, but automation can present attractive regulatory incentives. Copyright © 2018 International Society for Cellular Therapy.

  18. Fully automated segmentation of left ventricle using dual dynamic programming in cardiac cine MR images

    Science.gov (United States)

    Jiang, Luan; Ling, Shan; Li, Qiang

    2016-03-01

    Cardiovascular diseases are becoming a leading cause of death all over the world. Cardiac function can be evaluated by global and regional parameters of the left ventricle (LV) of the heart. The purpose of this study was to develop and evaluate a fully automated scheme for segmentation of the LV in short-axis cardiac cine MR images. Our fully automated method consists of three major steps, i.e., LV localization, LV segmentation at the end-diastolic phase, and propagation of the LV segmentation to the other phases. First, the maximum intensity projection image along the time phases of the midventricular slice, located at the center of the image, was calculated to locate the region of interest of the LV. Based on the mean intensity of the roughly segmented blood pool in the midventricular slice at each phase, the end-diastolic (ED) and end-systolic (ES) phases were determined. Second, the endocardial and epicardial boundaries of the LV in each slice at the ED phase were synchronously delineated by use of a dual dynamic programming technique. The external costs of the endocardial and epicardial boundaries were defined with the gradient values obtained from the original and enhanced images, respectively. Finally, taking advantage of the continuity of the LV boundaries across adjacent phases, we propagated the LV segmentation from the ED phase to the other phases by use of the dual dynamic programming technique. Preliminary results on 9 clinical cardiac cine MR cases show that the proposed method can obtain accurate segmentation of the LV based on subjective evaluation.
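    As a rough illustration of boundary extraction by dynamic programming, the sketch below finds a single minimum-cost boundary through a polar-unwrapped cost image (rows are angular samples, columns are candidate radii). The paper's dual dynamic programming couples the endocardial and epicardial boundaries; this simplified single-boundary version, with assumed array conventions, only shows the core recursion.

        # Simplified single-boundary dynamic programming on a polar cost image (illustrative).
        import numpy as np

        def dp_boundary(cost: np.ndarray, max_jump: int = 2) -> np.ndarray:
            """Return the radius index of the minimum-cost boundary at each angle."""
            n_ang, n_rad = cost.shape
            acc = np.full((n_ang, n_rad), np.inf)   # accumulated cost
            back = np.zeros((n_ang, n_rad), dtype=int)
            acc[0] = cost[0]
            for i in range(1, n_ang):
                for r in range(n_rad):
                    lo, hi = max(0, r - max_jump), min(n_rad, r + max_jump + 1)
                    prev = acc[i - 1, lo:hi]        # smoothness: limit radial jumps
                    j = int(np.argmin(prev))
                    acc[i, r] = cost[i, r] + prev[j]
                    back[i, r] = lo + j
            path = np.zeros(n_ang, dtype=int)       # trace back the optimal path
            path[-1] = int(np.argmin(acc[-1]))
            for i in range(n_ang - 1, 0, -1):
                path[i - 1] = back[i, path[i]]
            return path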

  19. Effects of Granular Control on Customers’ Perspective and Behavior with Automated Demand Response Systems

    Energy Technology Data Exchange (ETDEWEB)

    Schetrit, Oren; Kim, Joyce; Yin, Rongxin; Kiliccote, Sila

    2014-08-01

    Automated demand response (Auto-DR) is expected to close the loop between buildings and the grid by providing machine-to-machine communications to curtail loads without the need for human intervention. Hence, it can offer more reliable and repeatable demand response results to the grid than the manual approach and make demand response participation a hassle-free experience for customers. However, many building operators misunderstand Auto-DR and are afraid of losing control over their building operation. To ease the transition from manual to Auto-DR, we designed and implemented granular control of Auto-DR systems so that building operators could modify or opt out of individual load-shed strategies whenever they wanted. This paper reports the research findings from this effort demonstrated through a field study in large commercial buildings located in New York City. We focused on (1) understanding how providing granular control affects building operators’ perspective on Auto-DR, and (2) evaluating the usefulness of granular control by examining their interaction with the Auto-DR user interface during test events. Through trend log analysis, interviews, and surveys, we found that: (1) the opt-out capability during Auto-DR events can remove the feeling of being forced into load curtailments and increase their willingness to adopt Auto-DR; (2) being able to modify individual load-shed strategies allows flexible Auto-DR participation that meets the building’s changing operational requirements; (3) a clear display of automation strategies helps building operators easily identify how Auto-DR is functioning and can build trust in Auto-DR systems.

  20. DeepPicker: A deep learning approach for fully automated particle picking in cryo-EM.

    Science.gov (United States)

    Wang, Feng; Gong, Huichao; Liu, Gaochao; Li, Meijing; Yan, Chuangye; Xia, Tian; Li, Xueming; Zeng, Jianyang

    2016-09-01

    Particle picking is a time-consuming step in single-particle analysis and often requires significant interventions from users, which has become a bottleneck for future automated electron cryo-microscopy (cryo-EM). Here we report a deep learning framework, called DeepPicker, to address this problem and fill the current gaps toward a fully automated cryo-EM pipeline. DeepPicker employs a novel cross-molecule training strategy to capture common features of particles from previously-analyzed micrographs, and thus does not require any human intervention during particle picking. Tests on the recently-published cryo-EM data of three complexes have demonstrated that our deep learning based scheme can successfully accomplish the human-level particle picking process and identify a sufficient number of particles that are comparable to those picked manually by human experts. These results indicate that DeepPicker can provide a practically useful tool to significantly reduce the time and manual effort spent in single-particle analysis and thus greatly facilitate high-resolution cryo-EM structure determination. DeepPicker is released as an open-source program, which can be downloaded from https://github.com/nejyeah/DeepPicker-python. Copyright © 2016 Elsevier Inc. All rights reserved.
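    For orientation, the sketch below shows the basic structure that CNN-based particle pickers build on: a small classifier that scores square micrograph patches as particle versus background, applied in a sliding window. It is only a hedged illustration, not the DeepPicker architecture or its cross-molecule training strategy; layer sizes, stride, and threshold are assumptions.

        # Toy CNN patch classifier and sliding-window scorer (PyTorch); illustrative only.
        import torch
        import torch.nn as nn

        class PatchClassifier(nn.Module):
            def __init__(self, patch_size: int = 64):
                super().__init__()
                self.features = nn.Sequential(
                    nn.Conv2d(1, 16, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(16, 32, 5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
                    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                )
                self.head = nn.Linear(64 * (patch_size // 8) ** 2, 2)  # particle / background

            def forward(self, x):                    # x: (batch, 1, patch, patch)
                return self.head(self.features(x).flatten(1))

        def pick(model, micrograph, patch=64, stride=16, threshold=0.9):
            """micrograph: normalized 2D float tensor; returns (row, col, probability) picks."""
            model.eval()
            picks = []
            with torch.no_grad():
                for y in range(0, micrograph.shape[0] - patch + 1, stride):
                    for x in range(0, micrograph.shape[1] - patch + 1, stride):
                        window = micrograph[y:y + patch, x:x + patch][None, None]
                        prob = torch.softmax(model(window), dim=1)[0, 1].item()
                        if prob > threshold:
                            picks.append((y + patch // 2, x + patch // 2, prob))
            return picks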

  1. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    International Nuclear Information System (INIS)

    Della Gala, Giuseppe; Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M.; Lanconelli, Nico; Petit, Steven F.

    2017-01-01

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated, clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to VMAT plans generated manually in the absence of time pressure. For 16 patients with a reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organ-at-risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significantly improved PTV coverage (V95% increased by 1.1% ± 1.1%), higher dose conformity (R50 reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, and 3.6 Gy ± 2.8 Gy, respectively).

  2. Fully automated laboratory and field-portable goniometer used for performing accurate and precise multiangular reflectance measurements

    Science.gov (United States)

    Harms, Justin D.; Bachmann, Charles M.; Ambeau, Brittany L.; Faulring, Jason W.; Ruiz Torres, Andres J.; Badura, Gregory; Myers, Emily

    2017-10-01

    Field-portable goniometers are built for a wide variety of applications. Many of these applications require specific types of instruments and measurement schemes and must operate in challenging environments. Therefore, designs are based on requirements that are specific to the application. We present a field-portable goniometer that was designed for measuring the hemispherical-conical reflectance factor (HCRF) of various soils and low-growing vegetation in austere coastal and desert environments, as well as biconical reflectance factors in laboratory settings. Unlike some goniometers, this system was designed with a requirement for "target-plane tracking" to ensure that measurements can be collected on sloped surfaces without compromising angular accuracy. The system also features a second, upward-looking spectrometer to measure the spatially dependent incoming illumination, an integrated software package to provide full automation, an automated leveling system to ensure a standard frame of reference, a design that minimizes the obscuration due to self-shading so the opposition effect can be measured, and the ability to record a digital elevation model of the target region. This fully automated and highly mobile system obtains accurate and precise measurements of HCRF in a wide variety of terrain and in less time than most other systems, without sacrificing consistency or repeatability in laboratory environments.

  3. Fully automated deformable registration of breast DCE-MRI and PET/CT

    Science.gov (United States)

    Dmitriev, I. D.; Loo, C. E.; Vogel, W. V.; Pengel, K. E.; Gilhuijs, K. G. A.

    2013-02-01

    Accurate characterization of breast tumors is important for the appropriate selection of therapy and monitoring of the response. For this purpose, breast imaging and tissue biopsy are important aspects. In this study, a fully automated method for deformable registration of DCE-MRI and PET/CT of the breast is presented. The registration is performed using the CT component of the PET/CT and the pre-contrast T1-weighted non-fat-suppressed MRI. Comparable patient setup protocols were used during the MRI and PET examinations in order to avoid having to make assumptions about the biomechanical properties of the breast during and after the application of chemotherapy. The registration uses a multi-resolution approach to speed up the process and to minimize the probability of converging to local minima. The validation was performed on 140 breasts (70 patients). Of all registration cases, 94.2% of the breasts were aligned within 4.0 mm accuracy (1 PET voxel). Fused information may be beneficial for obtaining representative biopsy samples, which in turn will benefit the treatment of the patient.

  4. Toxicity assessment of ionic liquids with Vibrio fischeri: an alternative fully automated methodology.

    Science.gov (United States)

    Costa, Susana P F; Pinto, Paula C A G; Lapa, Rui A S; Saraiva, M Lúcia M F S

    2015-03-02

    A fully automated Vibrio fischeri methodology based on sequential injection analysis (SIA) has been developed. The methodology was based on the aspiration of 75 μL of bacteria and 50 μL of inhibitor followed by measurement of the luminescence of the bacteria. The assays were conducted for contact times of 5, 15, and 30 min, by means of three mixing chambers that ensured adequate mixing conditions. The optimized methodology provided precise control of the reaction conditions, which is an asset for the analysis of a large number of samples. The developed methodology was applied to the evaluation of the impact of a set of ionic liquids (ILs) on V. fischeri and the results were compared with those provided by a conventional assay kit (Biotox®). The collected data evidenced the influence of different cation head groups and anion moieties on the toxicity of ILs. Generally, aromatic cations and fluorine-containing anions displayed a higher impact on V. fischeri, evidenced by lower EC50 values. The proposed methodology was validated through statistical analysis, which demonstrated a strong positive correlation (P>0.98) between assays. It is expected that the automated methodology can be tested for more classes of compounds and used as an alternative to microplate-based V. fischeri assay kits. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Toward fully automated processing of dynamic susceptibility contrast perfusion MRI for acute ischemic cerebral stroke.

    Science.gov (United States)

    Kim, Jinsuh; Leira, Enrique C; Callison, Richard C; Ludwig, Bryan; Moritani, Toshio; Magnotta, Vincent A; Madsen, Mark T

    2010-05-01

    We developed fully automated software for dynamic susceptibility contrast (DSC) MR perfusion-weighted imaging (PWI) to efficiently and reliably derive critical hemodynamic information for acute stroke treatment decisions. Brain MR PWI was performed in 80 consecutive patients with acute nonlacunar ischemic stroke within 24 h after symptom onset from January 2008 to August 2009. These studies were automatically processed to generate hemodynamic parameters that included cerebral blood flow, cerebral blood volume, and the mean transit time (MTT). To develop reliable software for PWI analysis, we used computationally robust algorithms, including a piecewise continuous regression method to determine the bolus arrival time (BAT), log-linear curve fitting, an arrival-time-independent deconvolution method, and sophisticated motion correction methods. An optimal arterial input function (AIF) search algorithm using a new artery-likelihood metric was also developed. Anatomical locations of the automatically determined AIF were reviewed and validated. The automatically computed BAT values were statistically compared with BAT estimated by a single observer. In addition, gamma-variate curve-fitting errors of the AIF and inter-subject variability of AIFs were analyzed. Lastly, two observers independently assessed the quality and area of hypoperfusion mismatched with the restricted diffusion area on motion-corrected MTT maps and compared that with time-to-peak (TTP) maps generated using the standard approach. The AIF was identified within an arterial branch and enhanced areas of perfusion deficit were visualized in all evaluated cases. Total processing time was 10.9 ± 2.5 s (mean ± SD) without motion correction and 267 ± 80 s (mean ± SD) with motion correction on a standard personal computer. The MTT map produced with our software adequately estimated brain areas with perfusion deficit and was significantly less affected by random noise in the PWI than the TTP map.
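    One concrete step mentioned above, gamma-variate curve fitting of the arterial input function, can be sketched as follows. This is a generic illustration with assumed parameter names and initial guesses, not the validated pipeline described in the record.

        # Gamma-variate fit of an AIF concentration-time curve (illustrative only).
        import numpy as np
        from scipy.optimize import curve_fit

        def gamma_variate(t, t0, k, alpha, beta):
            """C(t) = k * (t - t0)^alpha * exp(-(t - t0)/beta) for t > t0, else 0."""
            dt = np.clip(t - t0, 0.0, None)
            return k * dt**alpha * np.exp(-dt / beta)

        def fit_aif(t, c):
            alpha0, beta0 = 3.0, 1.5                               # rough shape guesses
            k0 = c.max() / ((alpha0 * beta0) ** alpha0 * np.exp(-alpha0))
            p0 = [max(t[np.argmax(c)] - alpha0 * beta0, 0.0), k0, alpha0, beta0]
            bounds = ([0.0, 0.0, 0.1, 0.1], [t.max(), np.inf, 10.0, 10.0])
            popt, _ = curve_fit(gamma_variate, t, c, p0=p0, bounds=bounds)
            return popt                                            # fitted t0, k, alpha, beta

        if __name__ == "__main__":
            t = np.arange(0.0, 60.0, 1.0)                          # seconds
            clean = gamma_variate(t, 10.0, 5.0, 3.0, 1.5)
            noisy = clean + np.random.normal(0.0, 0.3, t.size)
            print(fit_aif(t, noisy))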

  6. Development of a fully automated network system for long-term health-care monitoring at home.

    Science.gov (United States)

    Motoi, K; Kubota, S; Ikarashi, A; Nogawa, M; Tanaka, S; Nemoto, T; Yamakoshi, K

    2007-01-01

    Daily monitoring of health condition at home is very important, not only as an effective scheme for early diagnosis and treatment of cardiovascular and other diseases, but also for their prevention and control. From this point of view, we have developed a prototype room for fully automated monitoring of various vital signs. From the results of preliminary experiments using this room, it was confirmed that (1) ECG and respiration during bathing, (2) excretion weight and blood pressure, and (3) respiration and cardiac beat during sleep could be monitored with reasonable accuracy by the sensor systems installed in the bathtub, toilet and bed, respectively.

  7. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.
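    The sketch below is a generic illustration of the usual spike-sorting skeleton (threshold detection, waveform extraction, feature reduction, clustering); it is not the authors' algorithm or software package, and all parameters are assumptions.

        # Generic spike detection and clustering sketch; not the published method.
        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        def detect_spikes(trace, fs, thresh_sd=4.0, win_ms=1.5):
            """Detect negative threshold crossings and cut out waveform snippets."""
            sigma = np.median(np.abs(trace)) / 0.6745       # robust noise estimate
            half = int(win_ms * 1e-3 * fs / 2)
            crossings = np.where(trace < -thresh_sd * sigma)[0]
            events, last = [], -np.inf
            for i in crossings:                             # keep one event per refractory window
                if i - last > 2 * half:
                    events.append(i)
                    last = i
            return [trace[i - half:i + half] for i in events
                    if half <= i < len(trace) - half]

        def sort_spikes(waveforms, n_units=3):
            """Cluster spike waveforms in a low-dimensional PCA feature space."""
            X = PCA(n_components=3).fit_transform(np.array(waveforms))
            return KMeans(n_clusters=n_units, n_init=10).fit_predict(X)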

  8. Fully automated treatment planning for head and neck radiotherapy using a voxel-based dose prediction and dose mimicking method

    Science.gov (United States)

    McIntosh, Chris; Welch, Mattea; McNiven, Andrea; Jaffray, David A.; Purdie, Thomas G.

    2017-08-01

    Recent works in automated radiotherapy treatment planning have used machine learning based on historical treatment plans to infer the spatial dose distribution for a novel patient directly from the planning image. We present a probabilistic, atlas-based approach which predicts the dose for novel patients using a set of automatically selected most similar patients (atlases). The output is a spatial dose objective, which specifies the desired dose-per-voxel, and therefore replaces the need to specify and tune dose-volume objectives. Voxel-based dose mimicking optimization then converts the predicted dose distribution to a complete treatment plan with dose calculation using a collapsed cone convolution dose engine. In this study, we investigated automated planning for right-sided oropharynx head and neck patients treated with IMRT and VMAT. We compare four versions of our dose prediction pipeline using a database of 54 training and 12 independent testing patients by evaluating 14 clinical dose evaluation criteria. Our preliminary results are promising and demonstrate that automated methods can generate comparable dose distributions to clinical. Overall, automated plans achieved an average of 0.6% higher dose for target coverage evaluation criteria, and 2.4% lower dose at the organs at risk criteria levels evaluated compared with clinical. There was no statistically significant difference detected in high-dose conformity between automated and clinical plans as measured by the conformation number. Automated plans achieved nine more unique criteria than clinical across the 12 patients tested and automated plans scored a significantly higher dose at the evaluation limit for two high-risk target coverage criteria and a significantly lower dose in one critical organ maximum dose. The novel dose prediction method with dose mimicking can generate complete treatment plans in 12-13 min without user interaction. It is a promising approach for fully automated treatment planning.
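    As a hedged illustration of voxel-wise dose prediction from geometry, in the spirit of knowledge-based planning but not the authors' atlas-selection and dose-mimicking pipeline, one could regress dose against simple per-voxel features such as distances to the target and to an organ at risk. The feature choice, model, and array shapes below are assumptions.

        # Toy knowledge-based dose prediction from geometric features (illustrative only).
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        def voxel_features(dist_to_ptv, dist_to_oar):
            """Stack per-voxel geometric features into an (n_voxels, 2) design matrix."""
            return np.column_stack([np.ravel(dist_to_ptv), np.ravel(dist_to_oar)])

        def train_dose_model(atlas_cases):
            """atlas_cases: list of (dist_to_ptv, dist_to_oar, dose) arrays from prior plans."""
            X = np.vstack([voxel_features(d_ptv, d_oar) for d_ptv, d_oar, _ in atlas_cases])
            y = np.concatenate([np.ravel(dose) for _, _, dose in atlas_cases])
            return RandomForestRegressor(n_estimators=50, n_jobs=-1).fit(X, y)

        def predict_dose(model, dist_to_ptv, dist_to_oar):
            """Predicted per-voxel dose, usable as a spatial objective for dose mimicking."""
            y = model.predict(voxel_features(dist_to_ptv, dist_to_oar))
            return y.reshape(np.shape(dist_to_ptv))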

  9. Performance evaluation of vertical feed fully automated TLD badge reader using 0.8 and 0.4 mm teflon embedded CaSO4:Dy dosimeters

    International Nuclear Information System (INIS)

    Ratna, P.; More, Vinay; Kulkarni, M.S.

    2012-01-01

    The personnel monitoring of more than 80,000 radiation workers in India is at present carried out by semi-automated TLD badge reader systems (TLDBR-7B) developed by the Radiation Safety Systems Division, Bhabha Atomic Research Centre. More than 60 such reader systems are in use in all the personnel monitoring centers in the country. The Radiation Safety Systems Division also developed a fully automated TLD badge reader based on a new TLD badge having a built-in machine-readable ID code (in the form of a 16x3 hole pattern). This automated reader was designed with a minimum of changes to the electronics and mechanical hardware of the semi-automatic version (TLDBR-7B), so that such semi-automatic readers can be easily upgraded to the fully automated version by using the new TLD badge with ID code. The reader is capable of reading 50 TLD cards in 90 minutes. Based on feedback from users, a new model of fully automated TLD badge reader (model VEFFA-10) has been designed as an improved version of the previously reported fully automated TLD badge reader. This PC-based VEFFA-10 reader incorporates vertical loading of TLD cards having a machine-readable ID code. In this new reader, a vertical rack, which can hold 100 such cards, is mounted on the right side of the reader system. The TLD card falls into the channel by gravity, from where it is taken to the reading position by a rack-and-pinion mechanism. After the readout, the TLD card is dropped into an eject tray. The reader employs hot N₂ gas heating, and the gas flow is controlled by a specially designed digital gas flow meter on the front panel of the reader system. The system design is very compact and simple, and the problem of cards getting stuck is totally eliminated. The reader has a number of self-diagnostic features to ensure a high degree of reliability. This paper reports the performance evaluation of the reader using 0.4 mm thick Teflon-embedded CaSO₄:Dy TLD cards instead of 0.8 mm cards.

  10. Fully automated muscle quality assessment by Gabor filtering of second harmonic generation images

    Science.gov (United States)

    Paesen, Rik; Smolders, Sophie; Vega, José Manolo de Hoyos; Eijnde, Bert O.; Hansen, Dominique; Ameloot, Marcel

    2016-02-01

    Although structural changes on the sarcomere level of skeletal muscle are known to occur due to various pathologies, rigorous studies of the reduced sarcomere quality remain scarce. This can possibly be explained by the lack of an objective tool for analyzing and comparing sarcomere images across biological conditions. Recent developments in second harmonic generation (SHG) microscopy and increasing insight into the interpretation of sarcomere SHG intensity profiles have made SHG microscopy a valuable tool to study microstructural properties of sarcomeres. Typically, sarcomere integrity is analyzed by fitting a set of manually selected, one-dimensional SHG intensity profiles with a supramolecular SHG model. To circumvent this tedious manual selection step, we developed a fully automated image analysis procedure to map the sarcomere disorder for the entire image at once. The algorithm relies on a single-frequency wavelet-based Gabor approach and includes a newly developed normalization procedure allowing for unambiguous data interpretation. The method was validated by showing the correlation between the sarcomere disorder, quantified by the M-band size obtained from manually selected profiles, and the normalized Gabor value ranging from 0 to 1 for decreasing disorder. Finally, to elucidate the applicability of our newly developed protocol, Gabor analysis was used to study the effect of experimental autoimmune encephalomyelitis on the sarcomere regularity. We believe that the technique developed in this work holds great promise for high-throughput, unbiased, and automated image analysis to study sarcomere integrity by SHG microscopy.
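    A minimal sketch of single-frequency Gabor filtering of an SHG image is shown below; it is not the authors' validated protocol, and the spatial frequency, orientation sampling, and normalization are assumptions.

        # Gabor-based map of local sarcomere periodicity (illustrative only).
        import numpy as np
        from skimage.filters import gabor

        def gabor_order_map(image, sarcomere_period_px, n_theta=8):
            """Return a [0, 1] map; higher values indicate a stronger periodic response."""
            freq = 1.0 / sarcomere_period_px                  # single spatial frequency
            responses = []
            for theta in np.linspace(0.0, np.pi, n_theta, endpoint=False):
                real, imag = gabor(image.astype(float), frequency=freq, theta=theta)
                responses.append(np.hypot(real, imag))        # magnitude of the response
            magnitude = np.max(responses, axis=0)             # best orientation per pixel
            rng = np.ptp(magnitude)
            return (magnitude - magnitude.min()) / (rng + 1e-12)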

  11. The worldwide NORM production and a fully automated gamma-ray spectrometer for their characterization

    International Nuclear Information System (INIS)

    Xhixha, G.; Callegari, I.; Guastaldi, E.; De Bianchi, S.; Fiorentini, G.; Universita di Ferrara, Ferrara; Istituto Nazionale di Fisica Nucleare; Kaceli Xhixha, M.

    2013-01-01

    Materials containing radionuclides of natural origin that are subject to regulation because of their radioactivity are known as Naturally Occurring Radioactive Material (NORM). Following the International Atomic Energy Agency, we include in NORM those materials whose activity concentration is modified by human-made processes. We present a brief review of the main categories of non-nuclear industries together with the levels of activity concentration in feed raw materials, products and waste, including mechanisms of radioisotope enrichment. The global management of NORM shows a high level of complexity, mainly due to the different degrees of radioactivity enhancement and the huge amount of worldwide waste production. The future tendency of guidelines concerning environmental protection will require both systematic monitoring based on ever-increasing sampling and high-performance gamma-ray spectroscopy. On the grounds of these requirements, a new low-background, fully automated, high-resolution gamma-ray spectrometer, MCA_Rad, has been developed. The design of the lead and copper shielding allowed a background reduction of two orders of magnitude with respect to laboratory radioactivity. A severe lowering of manpower cost is obtained through a fully automated system, which enables up to 24 samples to be measured without any human attendance. Two coupled HPGe detectors increase the detection efficiency, performing accurate measurements on a small sample volume (180 cm³) with a reduction in sample transport costs. Details of the instrument calibration method are presented. The MCA_Rad system can measure, in less than one hour, a typical NORM sample enriched in U and Th to some hundreds of Bq kg⁻¹ with an overall uncertainty of less than 5%. Quality control of this method has been tested. Measurements of three certified reference materials, RGK-1, RGU-2 and RGTh-1, containing concentrations of potassium, uranium and thorium comparable to NORM, have been performed.

  12. Radiation Planning Assistant - A Streamlined, Fully Automated Radiotherapy Treatment Planning System

    Science.gov (United States)

    Court, Laurence E.; Kisling, Kelly; McCarroll, Rachel; Zhang, Lifei; Yang, Jinzhong; Simonds, Hannah; du Toit, Monique; Trauernicht, Chris; Burger, Hester; Parkes, Jeannette; Mejia, Mike; Bojador, Maureen; Balter, Peter; Branco, Daniela; Steinmann, Angela; Baltz, Garrett; Gay, Skylar; Anderson, Brian; Cardenas, Carlos; Jhingran, Anuja; Shaitelman, Simona; Bogler, Oliver; Schmeller, Kathleen; Followill, David; Howell, Rebecca; Nelson, Christopher; Peterson, Christine; Beadle, Beth

    2018-01-01

    The Radiation Planning Assistant (RPA) is a system developed for the fully automated creation of radiotherapy treatment plans, including volume-modulated arc therapy (VMAT) plans for patients with head/neck cancer and 4-field box plans for patients with cervical cancer. It is a combination of specially developed in-house software that uses an application programming interface to communicate with a commercial radiotherapy treatment planning system. It also interfaces with a commercial secondary dose verification software. The necessary inputs to the system are a Treatment Plan Order, approved by the radiation oncologist, and a simulation computed tomography (CT) image, approved by the radiographer. The RPA then generates a complete radiotherapy treatment plan. For the cervical cancer treatment plans, no additional user intervention is necessary until the plan is complete. For head/neck treatment plans, after the normal tissue and some of the target structures are automatically delineated on the CT image, the radiation oncologist must review the contours, making edits if necessary. They also delineate the gross tumor volume. The RPA then completes the treatment planning process, creating a VMAT plan. Finally, the completed plan must be reviewed by qualified clinical staff. PMID:29708544

  13. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    Science.gov (United States)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values and the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
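    The three quantities defined above can be computed directly once the enhancement map and masks are available, as in the sketch below. Array conventions, the voxel-volume handling, and the clipping of the pre-contrast image are assumptions rather than the authors' implementation.

        # BPEabs, BPErf and BPErb from pre/post-contrast volumes and masks (illustrative).
        import numpy as np

        def bpe_metrics(pre, post, fgt_mask, breast_mask, voxel_ml, threshold=0.10):
            """threshold is the relative-enhancement cut-off, e.g. 0.10 for 10%."""
            rel_enh = (post - pre) / np.clip(pre, 1e-6, None)        # relative enhancement
            enhancing = (rel_enh >= threshold) & fgt_mask             # enhancing FGT voxels
            bpe_abs = enhancing.sum() * voxel_ml                      # enhancing volume (mL)
            fgt_vol = fgt_mask.sum() * voxel_ml
            breast_vol = breast_mask.sum() * voxel_ml
            return {
                "BPEabs_ml": bpe_abs,
                "BPErf": bpe_abs / fgt_vol if fgt_vol else np.nan,        # relative to FGT
                "BPErb": bpe_abs / breast_vol if breast_vol else np.nan,  # relative to breast
            }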

  14. Rapid access to compound libraries through flow technology: fully automated synthesis of a 3-aminoindolizine library via orthogonal diversification.

    Science.gov (United States)

    Lange, Paul P; James, Keith

    2012-10-08

    A novel methodology for the synthesis of druglike heterocycle libraries has been developed through the use of flow reactor technology. The strategy employs orthogonal modification of a heterocyclic core, which is generated in situ, and was used to construct both a 25-membered library of druglike 3-aminoindolizines, and selected examples of a 100-member virtual library. This general protocol allows a broad range of acylation, alkylation and sulfonamidation reactions to be performed in conjunction with a tandem Sonogashira coupling/cycloisomerization sequence. All three synthetic steps were conducted under full automation in the flow reactor, with no handling or isolation of intermediates, to afford the desired products in good yields. This fully automated, multistep flow approach opens the way to highly efficient generation of druglike heterocyclic systems as part of a lead discovery strategy or within a lead optimization program.

  15. Visual Versus Fully Automated Analyses of 18F-FDG and Amyloid PET for Prediction of Dementia Due to Alzheimer Disease in Mild Cognitive Impairment.

    Science.gov (United States)

    Grimmer, Timo; Wutz, Carolin; Alexopoulos, Panagiotis; Drzezga, Alexander; Förster, Stefan; Förstl, Hans; Goldhardt, Oliver; Ortner, Marion; Sorg, Christian; Kurz, Alexander

    2016-02-01

    Biomarkers of Alzheimer disease (AD) can be imaged in vivo and can be used for diagnostic and prognostic purposes in people with cognitive decline and dementia. Indicators of amyloid deposition such as (11)C-Pittsburgh compound B ((11)C-PiB) PET are primarily used to identify or rule out brain diseases that are associated with amyloid pathology but have also been deployed to forecast the clinical course. Indicators of neuronal metabolism including (18)F-FDG PET demonstrate the localization and severity of neuronal dysfunction and are valuable for differential diagnosis and for predicting the progression from mild cognitive impairment (MCI) to dementia. It is a matter of debate whether to analyze these images visually or using automated techniques. Therefore, we compared the usefulness of both imaging methods and both analysis strategies to predict dementia due to AD. In MCI participants, a baseline examination, including clinical and imaging assessments, and a clinical follow-up examination after a planned interval of 24 mo were performed. Of 28 MCI patients, 9 developed dementia due to AD, 2 developed frontotemporal dementia, and 1 developed moderate dementia of unknown etiology. The positive predictive value, negative predictive value, and accuracy of visual and fully automated analyses of (11)C-PiB for the prediction of progression to dementia due to AD were 0.50, 1.00, and 0.68, respectively, for the visual and 0.53, 1.00, and 0.71, respectively, for the automated analyses. The positive predictive value, negative predictive value, and accuracy of fully automated analyses of (18)F-FDG PET were 0.37, 0.78, and 0.50, respectively. Results of visual analyses were highly variable between raters but were superior to automated analyses. Both (18)F-FDG and (11)C-PiB imaging appear to be of limited use for predicting the progression from MCI to dementia due to AD in short-term follow-up, irrespective of the strategy of analysis. On the other hand, amyloid PET is extremely useful to identify or rule out underlying amyloid pathology.
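    For readers less familiar with the reported metrics, the small example below computes positive predictive value, negative predictive value, and accuracy from confusion-matrix counts. The counts used are one confusion matrix consistent with the visual (11)C-PiB figures quoted above (28 MCI patients, 9 AD converters, PPV 0.50, NPV 1.00, accuracy 0.68); they are shown only to make the arithmetic explicit.

        # PPV, NPV and accuracy from confusion-matrix counts (worked example).
        def predictive_values(tp, fp, tn, fn):
            ppv = tp / (tp + fp) if (tp + fp) else float("nan")
            npv = tn / (tn + fn) if (tn + fn) else float("nan")
            acc = (tp + tn) / (tp + fp + tn + fn)
            return ppv, npv, acc

        if __name__ == "__main__":
            # 9 converters all scan-positive (fn = 0), 9 of 19 non-converters scan-positive
            print(predictive_values(tp=9, fp=9, tn=10, fn=0))   # -> (0.50, 1.00, ~0.68)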

  16. Parameter evaluation and fully-automated radiosynthesis of [(11)C]harmine for imaging of MAO-A for clinical trials.

    Science.gov (United States)

    Philippe, C; Zeilinger, M; Mitterhauser, M; Dumanic, M; Lanzenberger, R; Hacker, M; Wadsak, W

    2015-03-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [(11)C]harmine for clinical trials. The following parameters have been investigated: amount of base, precursor concentration, solvent, reaction temperature and time. The optimum reaction conditions were determined to be 2-3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80°C reaction temperature and 2 min reaction time. Under these conditions 6.1 ± 1 GBq (51.0 ± 11% based on [(11)C]CH3I, corrected for decay) of [(11)C]harmine (n=72) were obtained. The specific activity was 101.32 ± 28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully-automated synthesis method can be used as a routine set-up. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  17. Centrifugal LabTube platform for fully automated DNA purification and LAMP amplification based on an integrated, low-cost heating system.

    Science.gov (United States)

    Hoehl, Melanie M; Weißert, Michael; Dannenberg, Arne; Nesch, Thomas; Paust, Nils; von Stetten, Felix; Zengerle, Roland; Slocum, Alexander H; Steigert, Juergen

    2014-06-01

    This paper introduces a disposable battery-driven heating system for loop-mediated isothermal DNA amplification (LAMP) inside a centrifugally-driven DNA purification platform (LabTube). We demonstrate LabTube-based fully automated DNA purification of as low as 100 cell-equivalents of verotoxin-producing Escherichia coli (VTEC) in water, milk and apple juice in a laboratory centrifuge, followed by integrated and automated LAMP amplification with a reduction of hands-on time from 45 to 1 min. The heating system consists of two parallel SMD thick-film resistors as heating elements and an NTC thermistor as the temperature-sensing element. They are driven by a 3 V battery and controlled by a microcontroller. The LAMP reagents are stored in the elution chamber and the amplification starts immediately after the eluate is purged into the chamber. The LabTube, including a microcontroller-based heating system, demonstrates contamination-free and automated sample-to-answer nucleic acid testing within a laboratory centrifuge. The heating system can be easily parallelized within one LabTube and it is deployable for a variety of heating and electrical applications.

  18. Flexible demand in the GB domestic electricity sector in 2030

    International Nuclear Information System (INIS)

    Drysdale, Brian; Wu, Jianzhong; Jenkins, Nick

    2015-01-01

    Highlights: • Annual domestic demand by category and daily flexible load profiles are shown to 2030. • Valuable flexible demand requires loads to be identifiable, accessible, and useful. • The extent of flexible demand varies significantly on a diurnal and seasonal basis. • Barriers to accessing domestic demand include multiple low value loads and apathy. • Existing market structure a barrier to fully rewarding individual load flexibility. - Abstract: In order to meet greenhouse gas emissions targets the Great Britain (GB) future electricity supply will include a higher fraction of non-dispatchable generation, increasing opportunities for demand side management to maintain a supply/demand balance. This paper examines the extent of flexible domestic demand (FDD) in GB, its usefulness in system balancing and appropriate incentives to encourage consumers to participate. FDD, classified as electric space and water heating (ESWH), and cold and wet appliances, amounts to 59 TW h in 2012 (113 TW h total domestic demand) and is calculated to increase to 67 TW h in 2030. Summer and winter daily load profiles for flexible loads show significant seasonal and diurnal variations in the total flexible load and between load categories. Low levels of reflective consumer engagement with electricity consumption and a resistance to automation present barriers to effective access to FDD. A value of £1.97/household/year has been calculated for cold appliance loads used for frequency response in 2030, using 2013 market rates. The introduction of smart meters in GB by 2020 will allow access to FDD for system balancing. The low commercial value of individual domestic loads increases the attractiveness of non-financial incentives to fully exploit FDD. It was shown that appliance loads have different characteristics which can contribute to an efficient power system in different ways

  19. Development and Demonstration of the Open Automated Demand Response Standard for the Residential Sector

    Energy Technology Data Exchange (ETDEWEB)

    Herter, Karen; Rasin, Josh; Perry, Tim

    2009-11-30

    The goal of this study was to demonstrate a demand response system that can signal nearly every customer in all sectors through the integration of two widely available and non-proprietary communications technologies--Open Automated Demand Response (OpenADR) over Internet protocol and Utility Messaging Channel (UMC) over FM radio. The outcomes of this project were as follows: (1) a software bridge to allow translation of pricing signals from OpenADR to UMC; and (2) a portable demonstration unit with an Internet-connected notebook computer, a portfolio of DR-enabling technologies, and a model home. The demonstration unit provides visitors the opportunity to send electricity-pricing information over the Internet (through OpenADR and UMC) and then watch as the model appliances and lighting respond to the signals. The integration of OpenADR and UMC completed and demonstrated in this study enables utilities to send hourly or sub-hourly electricity pricing information simultaneously to the residential, commercial and industrial sectors.

  20. Fully automated laser ray tracing system to measure changes in the crystalline lens GRIN profile.

    Science.gov (United States)

    Qiu, Chen; Maceo Heilman, Bianca; Kaipio, Jari; Donaldson, Paul; Vaghefi, Ehsan

    2017-11-01

    Measuring the lens gradient refractive index (GRIN) accurately and reliably has proven an extremely challenging technical problem. A fully automated laser ray tracing (LRT) system was built to address this issue. The LRT system captures images of multiple laser projections before and after traversing through an ex vivo lens. These LRT images, combined with accurate measurements of the lens geometry, are used to calculate the lens GRIN profile. Mathematically, this is an ill-conditioned problem; hence, it is essential to apply biologically relevant constraints to produce a feasible solution. The lens GRIN measurements were compared with previously published data. Our GRIN retrieval algorithm produces fast and accurate measurements of the lens GRIN profile. Experiments to study the optics of physiologically perturbed lenses are the future direction of this research.

  1. A fully-automated neural network analysis of AFM force-distance curves for cancer tissue diagnosis

    Science.gov (United States)

    Minelli, Eleonora; Ciasca, Gabriele; Sassun, Tanya Enny; Antonelli, Manila; Palmieri, Valentina; Papi, Massimiliano; Maulucci, Giuseppe; Santoro, Antonio; Giangaspero, Felice; Delfini, Roberto; Campi, Gaetano; De Spirito, Marco

    2017-10-01

    Atomic Force Microscopy (AFM) has the unique capability of probing the nanoscale mechanical properties of biological systems that affect and are affected by the occurrence of many pathologies, including cancer. This capability has triggered growing interest in the translational process of AFM from physics laboratories to clinical practice. A factor still hindering the current use of AFM in diagnostics is related to the complexity of AFM data analysis, which is time-consuming and needs highly specialized personnel with a strong physical and mathematical background. In this work, we demonstrate an operator-independent neural-network approach for the analysis of surgically removed brain cancer tissues. This approach allowed us to distinguish—in a fully automated fashion—cancer from healthy tissues with high accuracy, also highlighting the presence and the location of infiltrating tumor cells.

  2. Evaluation of a Fully Automated Analyzer for Rapid Measurement of Water Vapor Sorption Isotherms for Applications in Soil Science

    DEFF Research Database (Denmark)

    Arthur, Emmanuel; Tuller, Markus; Moldrup, Per

    2014-01-01

    The characterization and description of important soil processes such as water vapor transport, volatilization of pesticides, and hysteresis require accurate means for measuring the soil water characteristic (SWC) at low water potentials. Until recently, measurement of the SWC at low water potentials was constrained by hydraulic decoupling and long equilibration times when pressure plates or single-point, chilled-mirror instruments were used. A new, fully automated Vapor Sorption Analyzer (VSA) helps to overcome these challenges and allows faster measurement of highly detailed water vapor...

  3. Buying Program of the Standard Automated Materiel Management System. Automated Small Purchase System: Defense Supply Center Philadelphia

    National Research Council Canada - National Science Library

    2001-01-01

    The Standard Automated Materiel Management System Automated Small Purchase System is a fully automated micro-purchases system used by the General and Industrial Directorate at the Defense Supply Center Philadelphia...

  4. A fully automated cell segmentation and morphometric parameter system for quantifying corneal endothelial cell morphology.

    Science.gov (United States)

    Al-Fahdawi, Shumoos; Qahwaji, Rami; Al-Waisy, Alaa S; Ipson, Stanley; Ferdousi, Maryam; Malik, Rayaz A; Brahma, Arun

    2018-07-01

    Corneal endothelial cell abnormalities may be associated with a number of corneal and systemic diseases. Damage to the endothelial cells can significantly affect corneal transparency by altering hydration of the corneal stroma, which can lead to irreversible endothelial cell pathology requiring corneal transplantation. To date, quantitative analysis of endothelial cell abnormalities has been performed manually by ophthalmologists using time-consuming and highly subjective semi-automatic tools that require operator interaction. We developed and applied a fully automated, real-time system, termed the Corneal Endothelium Analysis System (CEAS), for the segmentation and quantification of endothelial cells in images of the human cornea obtained by in vivo corneal confocal microscopy. First, a Fast Fourier Transform (FFT) band-pass filter is applied to reduce noise and enhance image quality, making the cells more visible. Secondly, endothelial cell boundaries are detected using watershed transformations and Voronoi tessellations to accurately quantify the morphological parameters of the human corneal endothelial cells. The performance of the automated segmentation system was tested against manually traced ground-truth images based on a database consisting of 40 corneal confocal endothelial cell images, in terms of segmentation accuracy and the clinical features obtained. In addition, the robustness and efficiency of the proposed CEAS system were compared with manually obtained cell densities using a separate database of 40 images from controls (n = 11), obese subjects (n = 16) and patients with diabetes (n = 13). The Pearson correlation coefficient between automated and manual endothelial cell densities is 0.9, demonstrating the reliability of the proposed system and the possibility of utilizing it in a real-world clinical setting to enable rapid diagnosis and patient follow-up, with an execution time of only 6 seconds per image. Copyright © 2018 Elsevier B.V. All rights reserved.
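    The sketch below strings together the two steps named in the record, an FFT band-pass filter followed by marker-based watershed segmentation. It is a rough illustration and not the validated CEAS pipeline; the band limits, smoothing, and marker selection are assumptions.

        # FFT band-pass filtering and watershed segmentation of endothelial cells (illustrative).
        import numpy as np
        from skimage.filters import gaussian
        from skimage.feature import peak_local_max
        from skimage.segmentation import watershed

        def fft_bandpass(image, low_px=4, high_px=40):
            """Keep spatial periods roughly between low_px and high_px pixels."""
            f = np.fft.fftshift(np.fft.fft2(image.astype(float)))
            cy, cx = np.array(image.shape) // 2
            yy, xx = np.ogrid[:image.shape[0], :image.shape[1]]
            radius = np.hypot(yy - cy, xx - cx)
            band = (radius >= image.shape[0] / high_px) & (radius <= image.shape[0] / low_px)
            return np.real(np.fft.ifft2(np.fft.ifftshift(f * band)))

        def segment_cells(image):
            """Label individual cells: smooth, find cell centers, then watershed."""
            filtered = gaussian(fft_bandpass(image), sigma=2)
            centers = peak_local_max(filtered, min_distance=8)
            markers = np.zeros(filtered.shape, dtype=int)
            markers[tuple(centers.T)] = np.arange(1, len(centers) + 1)
            return watershed(-filtered, markers)      # one label per detected cell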

  5. How to assess sustainability in automated manufacturing

    DEFF Research Database (Denmark)

    Dijkman, Teunis Johannes; Rödger, Jan-Markus; Bey, Niki

    2015-01-01

    The aim of this paper is to describe how sustainability in automation can be assessed. The assessment method is illustrated using a case study of a robot. Three aspects of sustainability assessment in automation are identified. Firstly, we consider automation as part of a larger system that fulfills the market demand for a given functionality. Secondly, three aspects of sustainability have to be assessed: environment, economy, and society. Thirdly, automation is part of a system with many levels, with different actors on each level, resulting in meeting the market demand. In this system, (sustainability) specifications move top-down, which helps avoiding sub-optimization and problem shifting. From these three aspects, sustainable automation is defined as automation that contributes to products that fulfill a market demand in a more sustainable way. The case study presents the carbon footprints...

  6. Fully Automated Atlas-Based Hippocampus Volumetry for Clinical Routine: Validation in Subjects with Mild Cognitive Impairment from the ADNI Cohort.

    Science.gov (United States)

    Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph

    2015-01-01

    Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully automated and computationally efficient processing pipeline for atlas-based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into stable MCI and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operating characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
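    A hedged sketch of the covariate correction and prognostic evaluation described above is given below: hippocampal grey matter volume is adjusted for total intracranial volume and age by regressing them out, and the corrected volume is scored with an ROC analysis for conversion. This is not the authors' SPM toolbox; the residualization scheme and variable names are assumptions.

        # Covariate-corrected hippocampal volume and ROC AUC for conversion (illustrative).
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.metrics import roc_auc_score

        def corrected_hgmv(hgmv, tiv, age):
            """Regress TIV and age out of hippocampal grey matter volume."""
            X = np.column_stack([tiv, age])
            model = LinearRegression().fit(X, hgmv)
            return hgmv - model.predict(X) + hgmv.mean()   # residual, re-centered

        def conversion_auc(hgmv_corr, converted):
            """converted: 1 for MCI-to-AD converters, 0 for stable MCI."""
            # smaller hippocampi indicate higher risk, so score with the negative volume
            return roc_auc_score(converted, -hgmv_corr)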

  7. Fast and Efficient Fragment-Based Lead Generation by Fully Automated Processing and Analysis of Ligand-Observed NMR Binding Data.

    Science.gov (United States)

    Peng, Chen; Frommlet, Alexandra; Perez, Manuel; Cobas, Carlos; Blechschmidt, Anke; Dominguez, Santiago; Lingel, Andreas

    2016-04-14

    NMR binding assays are routinely applied in hit finding and validation during early stages of drug discovery, particularly for fragment-based lead generation. To this end, compound libraries are screened by ligand-observed NMR experiments such as STD, T1ρ, and CPMG to identify molecules interacting with a target. The analysis of a high number of complex spectra is performed largely manually and therefore represents a limiting step in hit generation campaigns. Here we report a novel integrated computational procedure that processes and analyzes ligand-observed proton and fluorine NMR binding data in a fully automated fashion. A performance evaluation comparing automated and manual analysis results on (19)F- and (1)H-detected data sets shows that the program delivers robust, high-confidence hit lists in a fraction of the time needed for manual analysis and greatly facilitates visual inspection of the associated NMR spectra. These features enable considerably higher throughput, the assessment of larger libraries, and shorter turn-around times.

  8. Fully automated drug screening of dried blood spots using online LC-MS/MS analysis

    Directory of Open Access Journals (Sweden)

    Stefan Gaugler

    2018-01-01

    A new and fully automated workflow for the cost-effective drug screening of large populations based on the dried blood spot (DBS) technology was introduced in this study. DBS were prepared by spotting 15 μL of whole blood, previously spiked with alprazolam, amphetamine, cocaine, codeine, diazepam, fentanyl, lysergic acid diethylamide (LSD), 3,4-methylenedioxymethamphetamine (MDMA), methadone, methamphetamine, morphine and oxycodone, onto filter paper cards. The dried spots were scanned, spiked with deuterated standards and directly extracted. The extract was transferred online to an analytical LC column and then to the electrospray ionization tandem mass spectrometry system. All drugs were quantified at their cut-off level, and good precision and correlation within the calibration range were obtained. The method was finally applied to DBS samples from two patients with back pain; codeine and oxycodone were identified and accurately quantified at 89.6 ng/mL and 39.6 ng/mL, respectively, below the levels indicative of misuse.
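
    Quantification against a calibration curve with a deuterated internal standard, as described above, reduces to a simple linear fit; the sketch below is illustrative only, and its function and variable names are not taken from the paper.

      import numpy as np

      def quantify(analyte_area, is_area, cal_concs, cal_ratios, cutoff):
          # linear calibration of the area ratio (analyte / deuterated internal standard)
          slope, intercept = np.polyfit(cal_concs, cal_ratios, 1)
          conc = (analyte_area / is_area - intercept) / slope
          return conc, conc >= cutoff   # concentration and screening-positive flag at the cut-off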

  9. A Fully Automated Diabetes Prevention Program, Alive-PD: Program Design and Randomized Controlled Trial Protocol.

    Science.gov (United States)

    Block, Gladys; Azar, Kristen Mj; Block, Torin J; Romanelli, Robert J; Carpenter, Heather; Hopkins, Donald; Palaniappan, Latha; Block, Clifford H

    2015-01-21

    In the United States, 86 million adults have pre-diabetes. Evidence-based interventions that are both cost effective and widely scalable are needed to prevent diabetes. Our goal was to develop a fully automated diabetes prevention program and determine its effectiveness in a randomized controlled trial. Subjects with verified pre-diabetes were recruited to participate in a trial of the effectiveness of Alive-PD, a newly developed, 1-year, fully automated behavior change program delivered by email and Web. The program involves weekly tailored goal-setting, team-based and individual challenges, gamification, and other opportunities for interaction. An accompanying mobile phone app supports goal-setting and activity planning. For the trial, participants were randomized by computer algorithm to start the program immediately or after a 6-month delay. The primary outcome measures are change in HbA1c and fasting glucose from baseline to 6 months. The secondary outcome measures are change in HbA1c, glucose, lipids, body mass index (BMI), weight, waist circumference, and blood pressure at 3, 6, 9, and 12 months. Randomization and delivery of the intervention are independent of clinic staff, who are blinded to treatment assignment. Outcomes will be evaluated for the intention-to-treat and per-protocol populations. A total of 340 subjects with pre-diabetes were randomized to the intervention (n=164) or delayed-entry control group (n=176). Baseline characteristics were as follows: mean age 55 (SD 8.9); mean BMI 31.1 (SD 4.3); male 68.5%; mean fasting glucose 109.9 (SD 8.4) mg/dL; and mean HbA1c 5.6 (SD 0.3)%. Data collection and analysis are in progress. We hypothesize that participants in the intervention group will achieve statistically significant reductions in fasting glucose and HbA1c as compared to the control group at 6 months post baseline. The randomized trial will provide rigorous evidence regarding the efficacy of this Web- and Internet-based program in reducing or

  10. Development of a phantom to test fully automated breast density software – A work in progress

    International Nuclear Information System (INIS)

    Waade, G.G.; Hofvind, S.; Thompson, J.D.; Highnam, R.; Hogg, P.

    2017-01-01

    Objectives: Mammographic density (MD) is an independent risk factor for breast cancer and may have a future role in stratified screening. Automated software can estimate MD, but the relationship between breast thickness reduction and MD is not fully understood. Our aim is to develop a deformable breast phantom to assess automated density software and the impact of breast thickness reduction on MD. Methods: Several different configurations of polyvinyl alcohol (PVAL) phantoms were created. Three methods were used to estimate their density. Raw mammographic image data were processed using Volpara to estimate volumetric breast density (VBD%); Hounsfield units (HU) were measured on CT images; and physical density (g/cm³) was calculated using a formula involving mass and volume. Phantom volume versus contact area and phantom volume versus phantom thickness were compared to values for real breasts. Results: Volpara recognized all deformable phantoms as female breasts. However, reducing the phantom thickness caused a change in phantom density, and the phantoms were not able to tolerate the same level of compression and thickness reduction experienced by female breasts during mammography. Conclusion: Our results are promising, as all phantoms resulted in valid data for automated breast density measurement. Further work should be conducted on PVAL and other materials to produce deformable phantoms that mimic female breast structure and density with the ability to be compressed to the same level as female breasts. Advances in knowledge: We are the first group to have produced deformable phantoms that are recognized as breasts by the Volpara software. - Highlights: • Several phantoms of different configurations were created. • Three methods to assess phantom density were implemented. • All phantoms were identified as breasts by the Volpara software. • Reducing phantom thickness caused a change in phantom density.

  11. Clinical validation of fully automated computation of ejection fraction from gated equilibrium blood-pool scintigrams

    International Nuclear Information System (INIS)

    Reiber, J.H.C.; Lie, S.P.; Simoons, M.L.; Hoek, C.; Gerbrands, J.J.; Wijns, W.; Bakker, W.H.; Kooij, P.P.M.

    1983-01-01

    A fully automated procedure for the computation of left-ventricular ejection fraction (EF) from cardiac-gated Tc-99m blood-pool (GBP) scintigrams with fixed, dual, and variable ROI methods is described. By comparison with EF data from contrast ventriculography in 68 patients, the dual-ROI method (separate end-diastolic and end-systolic contours) was found to be the method of choice; processing time was 2 min. The success score of the dual-ROI procedure was 92%, as assessed from 100 GBP studies. Overall reproducibility of data acquisition and analysis was determined in 12 patients. The mean value and standard deviation of differences between repeat studies (average time interval 27 min) were 0.8% and 4.3% EF units, respectively (r = 0.98). The authors conclude that left-ventricular EF can be computed automatically from GBP scintigrams with minimal operator interaction and good reproducibility; EFs are similar to those from contrast ventriculography.

  12. A user-friendly robotic sample preparation program for fully automated biological sample pipetting and dilution to benefit the regulated bioanalysis.

    Science.gov (United States)

    Jiang, Hao; Ouyang, Zheng; Zeng, Jianing; Yuan, Long; Zheng, Naiyu; Jemal, Mohammed; Arnold, Mark E

    2012-06-01

    Biological sample dilution is a rate-limiting step in bioanalytical sample preparation when the concentrations of samples are beyond standard curve ranges, especially when multiple dilution factors are needed in an analytical run. We have developed and validated a Microsoft Excel-based robotic sample preparation program (RSPP) that automatically transforms Watson worklist sample information (identification, sequence and dilution factor) to comma-separated value (CSV) files. The Freedom EVO liquid handler software imports and transforms the CSV files to executable worklists (.gwl files), allowing the robot to perform sample dilutions at variable dilution factors. The dynamic dilution range is 1- to 1000-fold and divided into three dilution steps: 1- to 10-, 11- to 100-, and 101- to 1000-fold. The whole process, including pipetting samples, diluting samples, and adding internal standard(s), is accomplished within 1 h for two racks of samples (96 samples/rack). This platform also supports online sample extraction (liquid-liquid extraction, solid-phase extraction, protein precipitation, etc.) using 96 multichannel arms. This fully automated and validated sample dilution and preparation process has been applied to several drug development programs. The results demonstrate that application of the RSPP for fully automated sample processing is efficient and rugged. The RSPP not only saved more than 50% of the time in sample pipetting and dilution but also reduced human errors. The generated bioanalytical data are accurate and precise; therefore, this application can be used in regulated bioanalysis.
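
    The worklist-to-CSV transformation and the three dilution ranges described above can be illustrated with a short script; this is a simplified sketch with hypothetical column names and dilution-step rules, not the actual RSPP implementation.

      import csv

      def dilution_steps(factor):
          # split a 1- to 1000-fold dilution into at most three serial steps
          if factor <= 10:
              return [factor]
          if factor <= 100:
              return [10, factor / 10]
          return [10, 10, factor / 100]

      def write_worklist(samples, path="worklist.csv"):
          # samples: (sample_id, sequence, dilution_factor) tuples from the Watson worklist
          with open(path, "w", newline="") as fh:
              writer = csv.writer(fh)
              writer.writerow(["SampleID", "Sequence", "Step", "DilutionFactor"])
              for sid, seq, factor in samples:
                  for step, f in enumerate(dilution_steps(factor), start=1):
                      writer.writerow([sid, seq, step, f])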

  13. Fully automated data acquisition, processing, and display in equilibrium radioventriculography

    International Nuclear Information System (INIS)

    Bourguignon, M.H.; Douglass, K.H.; Links, J.M.; Wagner, H.N. Jr.; Johns Hopkins Medical Institutions, Baltimore, MD

    1981-01-01

    A fully automated data acquisition, processing, and display procedure was developed for equilibrium radioventriculography. After a standardized acquisition, the study is automatically analyzed to yield both right and left ventricular time-activity curves. The program first creates a series of edge-enhanced images (difference between squared images and scaled original images). A marker point within each ventricle is then identified as that pixel with maximum counts to the patient's right and left of the count center of gravity of a stroke volume image. Regions of interest are selected on each frame as the first contour of local maxima of the two-dimensional second derivative (pseudo-Laplacian) which encloses the appropriate marker point, using a method developed by Goris. After shifting the left ventricular end-systolic region of interest four pixels to the patient's left, a background region of interest is generated as the crescent-shaped area of the shifted region of interest not intersected by the end systolic region. The average counts/pixel in this background region in the end systolic frame of the original study are subtracted from each pixel in all frames of the gated study. Right and left ventricular time-activity curves are then obtained by applying each region of interest to its corresponding background-subtracted frame, and the ejection fraction, end diastolic, end systolic, and stroke counts determined for both ventricles. In fourteen consecutive patients, in addition to the automatic ejection fractions, manually drawn regions of interest were used to obtain ejection fractions for both ventricles. The manual regions of interest were drawn twice, and the average obtained. (orig./TR)
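
    The final ejection-fraction computation from a background-subtracted ventricular time-activity curve reduces to a ratio of counts; a minimal sketch is shown below, assuming the background estimate and region size are already available.

      import numpy as np

      def ejection_fraction(roi_counts, bg_counts_per_pixel, roi_pixels):
          # roi_counts: ventricular counts per gated frame, before background subtraction
          corrected = np.asarray(roi_counts, dtype=float) - bg_counts_per_pixel * roi_pixels
          ed, es = corrected.max(), corrected.min()   # end-diastolic / end-systolic counts
          return (ed - es) / ed                       # stroke counts over end-diastolic counts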

  14. Development of a fully automated adaptive unsharp masking technique in digital chest radiograph

    International Nuclear Information System (INIS)

    Abe, Katsumi; Katsuragawa, Shigehiko; Sasaki, Yasuo

    1991-01-01

    We are developing a fully automated adaptive unsharp masking technique with various parameters depending on regional image features of a digital chest radiograph. A chest radiograph includes various regions such as lung fields, retrocardiac area and spine in which their texture patterns and optical densities are extremely different. Therefore, it is necessary to enhance image contrast of each region by each optimum parameter. First, we investigated optimum weighting factors and mask sizes of unsharp masking technique in a digital chest radiograph. Then, a chest radiograph is automatically divided into three segments, one for the lung field, one for the retrocardiac area, and one for the spine, by using histogram analysis of pixel values. Finally, high frequency components of the lung field and retrocardiac area are selectively enhanced with a small mask size and mild weighting factors which are previously determined as optimum parameters. In addition, low frequency components of the spine are enhanced with a large mask size and adequate weighting factors. This processed image shows excellent depiction of the lung field, retrocardiac area and spine simultaneously with optimum contrast. Our image processing technique may be useful for diagnosis of chest radiographs. (author)
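
    The region-wise enhancement described above follows the classic unsharp-masking form, output = original + weight x (original - blurred); the sketch below assumes SciPy and uses placeholder mask sizes and weights rather than the optimum parameters reported by the authors.

      import numpy as np
      from scipy.ndimage import uniform_filter

      def adaptive_unsharp(img, region_labels, params):
          # params maps a region label (e.g. lung, retrocardiac, spine) to an
          # illustrative (mask_size, weight) pair
          imgf = img.astype(float)
          out = imgf.copy()
          for label, (size, weight) in params.items():
              blurred = uniform_filter(imgf, size=size)
              enhanced = imgf + weight * (imgf - blurred)   # unsharp masking
              out[region_labels == label] = enhanced[region_labels == label]
          return out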

  15. An inventory control project in a major Danish company using compound renewal demand models

    DEFF Research Database (Denmark)

    Larsen, Christian; Seiding, Claus Hoe; Teller, Christian

    We describe the development of a framework to compute the optimal inventory policy for a large spare-parts distribution centre operation in the RA division of the Danfoss Group in Denmark. The RA division distributes spare parts worldwide for cooling and A/C systems. The warehouse logistics operation is highly automated. However, the procedures for estimating demands and the policies for the inventory control system that were in use at the beginning of the project did not fully match the sophisticated technological standard of the physical system. During the initial phase of the project...

  16. Automated Critical Peak Pricing Field Tests: 2006 Pilot Program Description and Results

    Energy Technology Data Exchange (ETDEWEB)

    Piette, Mary Ann; Watson, David; Motegi, Naoya; Kiliccote, Sila

    2007-06-19

    During 2006 Lawrence Berkeley National Laboratory (LBNL) and the Demand Response Research Center (DRRC) performed a technology evaluation for the Pacific Gas and Electric Company (PG&E) Emerging Technologies Programs. This report summarizes the design, deployment, and results from the 2006 Automated Critical Peak Pricing Program (Auto-CPP). The program was designed to evaluate the feasibility of deploying automation systems that allow customers to participate in critical peak pricing (CPP) with a fully-automated response. The 2006 program was in operation during the entire six-month CPP period from May through October. The methodology for this field study included site recruitment, control strategy development, automation system deployment, and evaluation of sites' participation in actual CPP events through the summer of 2006. LBNL recruited sites in PG&E's territory in northern California through contacts from PG&E account managers, conferences, and industry meetings. Each site contact signed a memorandum of understanding with LBNL that outlined the activities needed to participate in the Auto-CPP program. Each facility worked with LBNL to select and implement control strategies for demand response and developed automation system designs based on existing Internet connectivity and building control systems. Once the automation systems were installed, LBNL conducted communications tests to ensure that the Demand Response Automation Server (DRAS) correctly provided and logged the continuous communications of the CPP signals with the energy management and control system (EMCS) for each site. LBNL also observed and evaluated Demand Response (DR) shed strategies to ensure proper commissioning of controls. The communication system allowed sites to receive day-ahead as well as day-of signals for pre-cooling, a DR strategy used at a few sites. Measurement of demand response was conducted using two different baseline models for estimating peak load savings. One

  17. Fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    Energy Technology Data Exchange (ETDEWEB)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-03-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summated to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, also when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed.

  18. Automation in Clinical Microbiology

    Science.gov (United States)

    Ledeboer, Nathan A.

    2013-01-01

    Historically, the trend toward automation in clinical pathology laboratories has largely bypassed the clinical microbiology laboratory. In this article, we review the historical impediments to automation in the microbiology laboratory and offer insight into the reasons why we believe that we are on the cusp of a dramatic change that will sweep a wave of automation into clinical microbiology laboratories. We review the currently available specimen-processing instruments as well as the total laboratory automation solutions. Lastly, we outline the types of studies that will need to be performed to fully assess the benefits of automation in microbiology laboratories. PMID:23515547

  19. Anesthesiology, automation, and artificial intelligence.

    Science.gov (United States)

    Alexander, John C; Joshi, Girish P

    2018-01-01

    There have been many attempts to incorporate automation into the practice of anesthesiology, though none have been successful. Fundamentally, these failures are due to the underlying complexity of anesthesia practice and the inability of rule-based feedback loops to fully master it. Recent innovations in artificial intelligence, especially machine learning, may usher in a new era of automation across many industries, including anesthesiology. It would be wise to consider the implications of such potential changes before they have been fully realized.

  20. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography

    International Nuclear Information System (INIS)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A.

    2014-01-01

    Purpose: Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. Methods: The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. Results: The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also

  1. A quality assurance framework for the fully automated and objective evaluation of image quality in cone-beam computed tomography.

    Science.gov (United States)

    Steiding, Christian; Kolditz, Daniel; Kalender, Willi A

    2014-03-01

    Thousands of cone-beam computed tomography (CBCT) scanners for vascular, maxillofacial, neurological, and body imaging are in clinical use today, but there is no consensus on uniform acceptance and constancy testing for image quality (IQ) and dose yet. The authors developed a quality assurance (QA) framework for fully automated and time-efficient performance evaluation of these systems. In addition, the dependence of objective Fourier-based IQ metrics on direction and position in 3D volumes was investigated for CBCT. The authors designed a dedicated QA phantom 10 cm in length consisting of five compartments, each with a diameter of 10 cm, and an optional extension ring 16 cm in diameter. A homogeneous section of water-equivalent material allows measuring CT value accuracy, image noise and uniformity, and multidimensional global and local noise power spectra (NPS). For the quantitative determination of 3D high-contrast spatial resolution, the modulation transfer function (MTF) of centrally and peripherally positioned aluminum spheres was computed from edge profiles. Additional in-plane and axial resolution patterns were used to assess resolution qualitatively. The characterization of low-contrast detectability as well as CT value linearity and artifact behavior was tested by utilizing sections with soft-tissue-equivalent and metallic inserts. For an automated QA procedure, a phantom detection algorithm was implemented. All tests used in the dedicated QA program were initially verified in simulation studies and experimentally confirmed on a clinical dental CBCT system. The automated IQ evaluation of volume data sets of the dental CBCT system was achieved with the proposed phantom requiring only one scan for the determination of all desired parameters. Typically, less than 5 min were needed for phantom set-up, scanning, and data analysis. Quantitative evaluation of system performance over time by comparison to previous examinations was also verified. The maximum
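
    As an illustration of one of the objective metrics mentioned above, the MTF can be estimated from an edge profile of an aluminum sphere; a minimal sketch follows (the oversampling, windowing and normalization details of the authors' framework are omitted).

      import numpy as np

      def mtf_from_edge_profile(edge_profile, pixel_spacing):
          # edge spread function -> line spread function -> MTF magnitude
          lsf = np.gradient(np.asarray(edge_profile, dtype=float), pixel_spacing)
          lsf /= lsf.sum()                                     # so that MTF(0) = 1
          mtf = np.abs(np.fft.rfft(lsf))
          freqs = np.fft.rfftfreq(lsf.size, d=pixel_spacing)   # cycles per unit length
          return freqs, mtf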

  2. Opportunities for Automated Demand Response in Wastewater Treatment Facilities in California - Southeast Water Pollution Control Plant Case Study

    Energy Technology Data Exchange (ETDEWEB)

    Olsen, Daniel [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Goli, Sasank [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Faulkner, David [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); McKane, Aimee [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-12-20

    This report details a study into the demand response potential of a large wastewater treatment facility in San Francisco. Previous research had identified wastewater treatment facilities as good candidates for demand response and automated demand response, and this study was conducted to investigate facility attributes that are conducive to demand response or that hinder its implementation. One year's worth of operational data was collected from the facility's control system, submetered process equipment, utility electricity demand records, and governmental weather stations. These data were analyzed to determine factors that affected facility power demand and demand response capabilities. The average baseline demand at the Southeast facility was approximately 4 MW. During the rainy season (October-March) the facility treated 40% more wastewater than in the dry season, but demand only increased by 4%. Submetering of the facility's lift pumps and centrifuges predicted load shift capabilities of 154 kW and 86 kW, respectively, with large lift pump shifts in the rainy season. Analysis of demand data during maintenance events confirmed the magnitude of these possible load shifts and indicated other areas of the facility with demand response potential. Load sheds were shown to be possible by shutting down a portion of the facility's aeration trains (average shed of 132 kW). Load shifts were shown to be possible by shifting operation of the centrifuges, the gravity belt thickener, the lift pumps, and external pump stations. These load shifts were made possible by the storage capabilities of the facility and of the city's sewer system. Large load reductions (an average of 2,065 kW) were seen from operating the cogeneration unit, but normal practice is continuous operation, precluding its use for demand response. The study also identified potential demand response opportunities that warrant further study: modulating variable-demand aeration loads, shifting

  3. Fully automated synthesis of ¹¹C-acetate as tumor PET tracer by simple modified solid-phase extraction purification.

    Science.gov (United States)

    Tang, Xiaolan; Tang, Ganghua; Nie, Dahong

    2013-12-01

    Automated synthesis of (11)C-acetate ((11)C-AC), the most commonly used radioactive fatty acid tracer, is performed by a simple, rapid, and modified solid-phase extraction (SPE) purification. Automated synthesis of (11)C-AC was implemented by a carboxylation reaction of MeMgBr with (11)C-CO2 on a polyethylene Teflon loop ring, followed by acidic hydrolysis with acid and an SCX cartridge, and purification on SCX, AG11A8 and C18 SPE cartridges using a commercially available (11)C-tracer synthesizer. Quality control testing and animal positron emission tomography (PET) imaging were also carried out. A high and reproducible decay-uncorrected radiochemical yield of (41.0 ± 4.6)% (n=10) was obtained from (11)C-CO2 within a total synthesis time of about 8 min. The radiochemical purity of (11)C-AC was over 95% by high-performance liquid chromatography (HPLC) analysis. Quality control testing and PET imaging showed that the (11)C-AC injection produced by the simple SPE procedure was safe and efficient, and was in agreement with the current Chinese radiopharmaceutical quality control guidelines. The novel, simple, and rapid method is readily adapted to the fully automated synthesis of (11)C-AC on several existing commercial synthesis modules. The method can be used routinely to produce (11)C-AC for preclinical and clinical studies with PET imaging. Copyright © 2013 Elsevier Ltd. All rights reserved.

  4. A fully-automated multiscale kernel graph cuts based particle localization scheme for temporal focusing two-photon microscopy

    Science.gov (United States)

    Huang, Xia; Li, Chunqiang; Xiao, Chuan; Sun, Wenqing; Qian, Wei

    2017-03-01

    The temporal focusing two-photon microscope (TFM) is developed to perform depth-resolved wide-field fluorescence imaging by capturing frames sequentially. However, due to strong, non-negligible noise and diffraction rings surrounding particles, further research is extremely difficult without a precise particle localization technique. In this paper, we developed a fully-automated scheme to locate particle positions with high noise tolerance. Our scheme includes the following procedures: noise reduction using a hybrid Kalman filter method, particle segmentation based on a multiscale kernel graph cuts global and local segmentation algorithm, and a kinematic-estimation-based particle tracking method. Both isolated and partially overlapped particles can be accurately identified with removal of unrelated pixels. Based on our quantitative analysis, 96.22% of isolated particles and 84.19% of partially overlapped particles were successfully detected.
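
    The kinematic-estimation tracking step can be approximated by constant-velocity prediction followed by nearest-neighbour assignment; the sketch below is a simplified stand-in (the Kalman denoising and graph-cuts segmentation stages are not reproduced).

      import numpy as np

      def link_frame(prev_pos, prev_vel, detections, max_dist=5.0):
          # prev_pos, prev_vel: (N, 2) arrays; detections: (M, 2) array of new centroids
          predicted = prev_pos + prev_vel
          new_pos = []
          for pred in predicted:
              d = np.linalg.norm(detections - pred, axis=1)
              j = int(np.argmin(d))
              # accept the nearest detection, or coast on the prediction if none is close
              new_pos.append(detections[j] if d[j] <= max_dist else pred)
          new_pos = np.array(new_pos)
          return new_pos, new_pos - prev_pos   # updated positions and velocities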

  5. A novel fully automated molecular diagnostic system (AMDS) for colorectal cancer mutation detection.

    Directory of Open Access Journals (Sweden)

    Shiro Kitano

    BACKGROUND: KRAS, BRAF and PIK3CA mutations are frequently observed in colorectal cancer (CRC). In particular, KRAS mutations are strong predictors of clinical outcomes of EGFR-targeted treatments such as cetuximab and panitumumab in metastatic colorectal cancer (mCRC). For mutation analysis, the current methods are time-consuming and not readily available to all oncologists and pathologists. We have developed a novel, simple, sensitive and fully automated molecular diagnostic system (AMDS) for point-of-care testing (POCT). Here we report the results of a comparison study between AMDS and direct sequencing (DS) in the detection of KRAS, BRAF and PIK3CA somatic mutations. METHODOLOGY/PRINCIPAL FINDINGS: DNA was extracted from a slice of either frozen (n = 89) or formalin-fixed and paraffin-embedded (FFPE) CRC tissue (n = 70), and then used for mutation analysis by AMDS and DS. All mutations detected by DS (n = 41 among frozen and 27 among FFPE samples) were also successfully (100%) detected by the AMDS. However, 8 frozen and 6 FFPE samples detected as wild-type in the DS analysis were shown as mutants in the AMDS analysis. By cloning-sequencing assays, these discordant samples were confirmed as true mutants. One sample had simultaneous "hot spot" mutations of KRAS and PIK3CA, and the cloning assay confirmed that E542K and E545K were not on the same allele. Genotyping call rates for DS were 100.0% (89/89) and 74.3% (52/70) in frozen and FFPE samples, respectively, for the first attempt, whereas that of AMDS was 100.0% for both sample sets. With automated DNA extraction and mutation detection by AMDS, all mutations in frozen tissues (n = 41) were successfully detected within 70 minutes. CONCLUSIONS/SIGNIFICANCE: AMDS has superior sensitivity and accuracy over DS, and is much easier to execute than conventional labor-intensive manual mutation analysis. AMDS has great potential as POCT equipment for mutation analysis.

  6. Fully automated intrinsic respiratory and cardiac gating for small animal CT

    Energy Technology Data Exchange (ETDEWEB)

    Kuntz, J; Baeuerle, T; Semmler, W; Bartling, S H [Department of Medical Physics in Radiology, German Cancer Research Center, Heidelberg (Germany); Dinkel, J [Department of Radiology, German Cancer Research Center, Heidelberg (Germany); Zwick, S [Department of Diagnostic Radiology, Medical Physics, Freiburg University (Germany); Grasruck, M [Siemens Healthcare, Forchheim (Germany); Kiessling, F [Chair of Experimental Molecular Imaging, RWTH-Aachen University, Medical Faculty, Aachen (Germany); Gupta, R [Department of Radiology, Massachusetts General Hospital, Boston, MA (United States)], E-mail: j.kuntz@dkfz.de

    2010-04-07

    A fully automated, intrinsic gating algorithm for small animal cone-beam CT is described and evaluated. A parameter representing the organ motion, derived from the raw projection images, is used for both cardiac and respiratory gating. The proposed algorithm makes it possible to reconstruct motion-corrected still images as well as to generate four-dimensional (4D) datasets representing the cardiac and pulmonary anatomy of free-breathing animals without the use of electrocardiogram (ECG) or respiratory sensors. Variation analysis of projections from several rotations is used to place a region of interest (ROI) on the diaphragm. The ROI is cranially extended to include the heart. The centre of mass (COM) variation within this ROI, the filtered frequency response and the local maxima are used to derive a binary motion-gating parameter for phase-sensitive gated reconstruction. This algorithm was implemented on a flat-panel-based cone-beam CT scanner and evaluated using a moving phantom and animal scans (seven rats and eight mice). Volumes were determined using a semiautomatic segmentation. In all cases robust gating signals could be obtained. The maximum volume error in phantom studies was less than 6%. By utilizing extrinsic gating via externally placed cardiac and respiratory sensors, the functional parameters (e.g. cardiac ejection fraction) and image quality were equivalent to this current gold standard. This algorithm obviates the necessity of both gating hardware and user interaction. The simplicity of the proposed algorithm enables adoption in a wide range of small animal cone-beam CT scanners.
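
    The core of the intrinsic gating signal, a band-pass-filtered centre-of-mass trace thresholded into a binary gate, can be sketched as follows; the pass band and filter order are illustrative assumptions, not the authors' settings.

      import numpy as np
      from scipy.signal import butter, filtfilt

      def gating_signal(com_trace, frame_rate, band=(0.5, 1.5)):
          # band-pass the centre-of-mass trace around the expected respiratory
          # (or cardiac) frequency range, given in Hz
          nyquist = frame_rate / 2.0
          b, a = butter(2, [band[0] / nyquist, band[1] / nyquist], btype="band")
          filtered = filtfilt(b, a, np.asarray(com_trace, dtype=float) - np.mean(com_trace))
          return filtered > 0   # binary gate: keep projections from one motion phase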

  7. Future Trends in Process Automation

    OpenAIRE

    Jämsä-Jounela, Sirkka-Liisa

    2007-01-01

    The importance of automation in the process industries has increased dramatically in recent years. In the highly industrialized countries, process automation serves to enhance product quality, master the whole range of products, improve process safety and plant availability, efficiently utilize resources and lower emissions. In the rapidly developing countries, mass production is the main motivation for applying process automation. The greatest demand for process automation is in the chemical...

  8. An Automated, High-Throughput System for GISAXS and GIWAXS Measurements of Thin Films

    Science.gov (United States)

    Schaible, Eric; Jimenez, Jessica; Church, Matthew; Lim, Eunhee; Stewart, Polite; Hexemer, Alexander

    Grazing incidence small-angle X-ray scattering (GISAXS) and grazing incidence wide-angle X-ray scattering (GIWAXS) are important techniques for characterizing thin films. In order to meet rapidly increasing demand, the SAXS/WAXS beamline at the Advanced Light Source (beamline 7.3.3) has implemented a fully automated, high-throughput system to conduct SAXS, GISAXS and GIWAXS measurements. An automated robot arm transfers samples from a holding tray to a measurement stage. Intelligent software aligns each sample in turn and measures each according to user-defined specifications. Users mail in trays of samples on individually barcoded pucks, and can download and view their data remotely. Data will be pipelined to the NERSC supercomputing facility, and will be available to users via a web portal that facilitates highly parallelized analysis.

  9. Automation systems for radioimmunoassay

    International Nuclear Information System (INIS)

    Yamasaki, Paul

    1974-01-01

    The application of automation systems for radioimmunoassay (RIA) was discussed. Automated systems could be useful in the second step of the four basic processes in the course of RIA, i.e., preparation of the sample for reaction. There were two types of instrumentation: a semi-automatic pipette, and a fully automated pipette station, both providing fast and accurate dispensing of the reagent or dilution of the sample with reagent. Illustrations of the instruments were shown. (Mukohata, S.)

  10. Fully automated one-pot radiosynthesis of O-(2-[18F]fluoroethyl)-L-tyrosine on the TracerLab FXFN module

    Energy Technology Data Exchange (ETDEWEB)

    Bourdier, Thomas, E-mail: bts@ansto.gov.au [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia); Greguric, Ivan [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia); Roselt, Peter [Centre for Molecular Imaging, Peter MacCallum Cancer Centre, 12 St Andrew's Place, East Melbourne, VIC, 3002 (Australia); Jackson, Tim; Faragalla, Jane; Katsifis, Andrew [LifeSciences, Australian Nuclear Science and Technology Organisation, Locked Bag 2001, Kirrawee DC NSW 2232, Sydney (Australia)

    2011-07-15

    Introduction: An efficient fully automated method for the radiosynthesis of enantiomerically pure O-(2-[18F]fluoroethyl)-L-tyrosine ([18F]FET) using the GE TracerLab FXFN synthesis module via the O-(2-tosyloxyethyl)-N-trityl-L-tyrosine tert-butylester precursor has been developed. Methods: The radiolabelling of [18F]FET involved a classical [18F]fluoride nucleophilic substitution performed in acetonitrile using potassium carbonate and Kryptofix 222, followed by acid hydrolysis using 2 N hydrochloric acid. Results: [18F]FET was produced in 35±5% (n=22) yield non-decay-corrected (55±5% decay-corrected) and with radiochemical and enantiomeric purity of >99%, with a specific activity of >90 GBq/µmol, after 63 min of radiosynthesis including HPLC purification and formulation. Conclusion: The automated radiosynthesis provides high and reproducible yields suitable for routine clinical use.

  11. A fully-automated computer-assisted method of CT brain scan analysis for the measurement of cerebrospinal fluid spaces and brain absorption density

    International Nuclear Information System (INIS)

    Baldy, R.E.; Brindley, G.S.; Jacobson, R.R.; Reveley, M.A.; Lishman, W.A.; Ewusi-Mensah, I.; Turner, S.W.

    1986-01-01

    Computer-assisted methods of CT brain scan analysis offer considerable advantages over visual inspection, particularly in research; and several semi-automated methods are currently available. A new computer-assisted program is presented which provides fully automated processing of CT brain scans, depending on "anatomical knowledge" of where cerebrospinal fluid (CSF)-containing spaces are likely to lie. After identifying these regions of interest quantitative estimates are then provided of CSF content in each slice in cisterns, ventricles, Sylvian fissure and interhemispheric fissure. Separate measures are also provided of mean brain density in each slice. These estimates can be summated to provide total ventricular and total brain volumes. The program shows a high correlation with measures derived from mechanical planimetry and visual grading procedures, also when tested against a phantom brain of known ventricular volume. The advantages and limitations of the present program are discussed. (orig.)

  12. Parameter evaluation and fully-automated radiosynthesis of [11C]harmine for imaging of MAO-A for clinical trials

    International Nuclear Information System (INIS)

    Philippe, C.; Zeilinger, M.; Mitterhauser, M.; Dumanic, M.; Lanzenberger, R.; Hacker, M.; Wadsak, W.

    2015-01-01

    The aim of the present study was the evaluation and automation of the radiosynthesis of [11C]harmine for clinical trials. The following parameters have been investigated: amount of base, precursor concentration, solvent, reaction temperature and time. The optimum reaction conditions were determined to be 2–3 mg/mL precursor activated with 1 eq. 5 M NaOH in DMSO, 80 °C reaction temperature and 2 min reaction time. Under these conditions 6.1±1 GBq (51.0±11% based on [11C]CH3I, corrected for decay) of [11C]harmine (n=72) were obtained. The specific activity was 101.32±28.2 GBq/µmol (at EOS). All quality control parameters were in accordance with the standards for parenteral human application. Due to its reliability and high yields, this fully-automated synthesis method can be used as routine set-up. - Highlights: • Preparation of [11C]harmine on a commercially available synthesizer for the routine application. • High reliability: only 4 out of 72 failed syntheses; 5% due to technical problems. • High yields: 6.1±1 GBq overall yield (EOS). • High specific activities: 101.32±28.2 GBq/µmol

  13. Comparison of subjective and fully automated methods for measuring mammographic density.

    Science.gov (United States)

    Moshina, Nataliia; Roman, Marta; Sebuødegård, Sofie; Waade, Gunvor G; Ursin, Giske; Hofvind, Solveig

    2018-02-01

    Background: Breast radiologists of the Norwegian Breast Cancer Screening Program subjectively classified mammographic density using a three-point scale between 1996 and 2012 and changed to the fourth edition of the BI-RADS classification in 2013. In 2015, an automated volumetric breast density assessment software was installed at two screening units. Purpose: To compare volumetric breast density measurements from the automated method with two subjective methods: the three-point scale and the BI-RADS density classification. Material and Methods: Information on subjective and automated density assessment was obtained from screening examinations of 3635 women recalled for further assessment due to positive screening mammography between 2007 and 2015. The score of the three-point scale (I = fatty; II = medium dense; III = dense) was available for 2310 women. The BI-RADS density score was provided for 1325 women. Mean volumetric breast density was estimated for each category of the subjective classifications. The automated software assigned volumetric breast density to four categories. The agreement between BI-RADS and volumetric breast density categories was assessed using weighted kappa (kw). Results: Mean volumetric breast density was 4.5%, 7.5%, and 13.4% for categories I, II, and III of the three-point scale, respectively, and 4.4%, 7.5%, 9.9%, and 13.9% for the BI-RADS density categories, respectively. The agreement between BI-RADS and volumetric breast density categories was kw = 0.5 (95% CI = 0.47-0.53). Conclusion: Mean volumetric breast density increased with increasing density category of the subjective classifications. The agreement between BI-RADS and volumetric breast density categories was moderate.
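
    The weighted kappa statistic used above is available off the shelf; a minimal sketch assuming scikit-learn, with made-up category vectors in place of the study data.

      from sklearn.metrics import cohen_kappa_score

      # per-woman density categories coded 1-4 (hypothetical example values)
      bi_rads    = [1, 2, 2, 3, 4, 3, 2, 1]
      volumetric = [1, 2, 3, 3, 4, 2, 2, 1]
      kw = cohen_kappa_score(bi_rads, volumetric, weights="linear")   # weighted kappa
      print(round(kw, 2))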

  14. Fully automated radiosynthesis of [11C]PBR28, a radiopharmaceutical for the translocator protein (TSPO) 18 kDa, using a GE TRACERlab FXC-Pro

    International Nuclear Information System (INIS)

    Hoareau, Raphaël; Shao, Xia; Henderson, Bradford D.; Scott, Peter J.H.

    2012-01-01

    In order to image the translocator protein (TSPO) 18 kDa in the clinic using positron emission tomography (PET) imaging, we had cause to prepare [11C]PBR28. In this communication we highlight our novel, recently developed, one-pot synthesis of the desmethyl-PBR28 precursor, and present an optimized, fully automated preparation of [11C]PBR28 using a GE TRACERlab FXC-Pro. Following radiolabelling, purification is achieved by HPLC and, to the best of our knowledge, the first reported example of reconstituting [11C]PBR28 into ethanolic saline using solid-phase extraction (SPE). This procedure is operationally simple and provides high-quality doses of [11C]PBR28 suitable for use in clinical PET imaging studies. Typical radiochemical yield using the optimized method is 3.6% (EOS, n=3), radiochemical and chemical purity are consistently >99%, and specific activities are 14,523 Ci/mmol. Highlights: ► This paper reports a fully automated synthesis of [11C]PBR28 using a TRACERlab FXC-Pro. ► We report a solid-phase extraction technique for the reconstitution of [11C]PBR28. ► ICP-MS data for the PBR28 precursor are reported, confirming suitability for clinical use.

  15. Toward fully automated genotyping: Genotyping microsatellite markers by deconvolution

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Lancia, G.; See-Kiong, Ng [Carnegie Mellon Univ., Pittsburgh, PA (United States)

    1995-11-01

    Dense genetic linkage maps have been constructed for the human and mouse genomes, with average densities of 2.9 cM and 0.35 cM, respectively. These genetic maps are crucial for mapping both Mendelian and complex traits and are useful in clinical genetic diagnosis. Current maps are largely comprised of abundant, easily assayed, and highly polymorphic PCR-based microsatellite markers, primarily dinucleotide (CA)n repeats. One key limitation of these length polymorphisms is the PCR stutter (or slippage) artifact that introduces additional stutter bands. With two (or more) closely spaced alleles, the stutter bands overlap, and it is difficult to accurately determine the correct alleles; this stutter phenomenon has all but precluded full automation, since a human must visually inspect the allele data. We describe here novel deconvolution methods for accurate genotyping that mathematically remove PCR stutter artifact from microsatellite markers. These methods overcome the manual interpretation bottleneck and thereby enable full automation of genetic map construction and use. New functionalities, including the pooling of DNAs and the pooling of markers, are described that may greatly reduce the associated experimentation requirements. 32 refs., 5 figs., 3 tabs.
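
    Stutter removal can be posed as a non-negative deconvolution: the observed band intensities are modelled as the true allele signal convolved with a decaying stutter pattern. A minimal sketch assuming SciPy is given below, with an illustrative stutter kernel rather than the calibrated patterns used by the authors.

      import numpy as np
      from scipy.optimize import nnls

      def deconvolve_stutter(observed, stutter=(1.0, 0.15, 0.05)):
          # observed: band intensities indexed by repeat length (shortest first)
          n = len(observed)
          S = np.zeros((n, n))
          for j in range(n):
              for k, s in enumerate(stutter):
                  if j - k >= 0:
                      S[j - k, j] = s   # allele j also produces bands k repeats shorter
          alleles, _ = nnls(S, np.asarray(observed, dtype=float))
          return alleles                # peaks indicate the true allele(s)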

  16. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    International Nuclear Information System (INIS)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M

    2016-01-01

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types, using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the most optimal local mean similarity, with local neighboring slices meeting the threshold requirement, is chosen as the classifier's matched slice (if it exists). The classifier whose matched slice possesses the most optimal local mean similarity is then chosen as the ensemble's best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include parallel
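
    The slice-matching idea, comparing a template slice against every slice of the input scan and accepting the best match only if it clears a pre-trained threshold, can be sketched with normalized cross-correlation as the similarity metric (the abstract does not specify which metric is used).

      import numpy as np

      def best_matching_slice(volume, template, threshold):
          # volume: (n_slices, H, W) array; template: (H, W) template slice
          t = (template - template.mean()) / (template.std() + 1e-9)
          scores = np.array([
              float((((sl - sl.mean()) / (sl.std() + 1e-9)) * t).mean()) for sl in volume
          ])
          idx = int(scores.argmax())
          # return the matched slice index only if it meets the threshold requirement
          return (idx, scores[idx]) if scores[idx] >= threshold else (None, None)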

  17. SU-G-206-01: A Fully Automated CT Tool to Facilitate Phantom Image QA for Quantitative Imaging in Clinical Trials

    Energy Technology Data Exchange (ETDEWEB)

    Wahi-Anwar, M; Lo, P; Kim, H; Brown, M; McNitt-Gray, M [UCLA Radiological Sciences, Los Angeles, CA (United States)

    2016-06-15

    Purpose: The use of Quantitative Imaging (QI) methods in Clinical Trials requires both verification of adherence to a specified protocol and an assessment of scanner performance under that protocol, which are currently accomplished manually. This work introduces automated phantom identification and image QA measure extraction towards a fully-automated CT phantom QA system to perform these functions and facilitate the use of Quantitative Imaging methods in clinical trials. Methods: This study used a retrospective cohort of CT phantom scans from existing clinical trial protocols - totaling 84 phantoms, across 3 phantom types, using various scanners and protocols. The QA system identifies the input phantom scan through an ensemble of threshold-based classifiers. Each classifier - corresponding to a phantom type - contains a template slice, which is compared to the input scan on a slice-by-slice basis, resulting in slice-wise similarity metric values for each slice compared. Pre-trained thresholds (established from a training set of phantom images matching the template type) are used to filter the similarity distribution, and the slice with the most optimal local mean similarity, with local neighboring slices meeting the threshold requirement, is chosen as the classifier's matched slice (if it exists). The classifier whose matched slice possesses the most optimal local mean similarity is then chosen as the ensemble's best matching slice. If a best matching slice exists, the image QA algorithm and ROIs corresponding to the matching classifier are used to extract the image QA measures. Results: Automated phantom identification performed with 84.5% accuracy and 88.8% sensitivity on 84 phantoms. Automated image quality measurements (following standard protocol) on identified water phantoms (n=35) matched user QA decisions with 100% accuracy. Conclusion: We provide a fully-automated CT phantom QA system consistent with manual QA performance. Further work will include parallel

  18. Distribution automation

    International Nuclear Information System (INIS)

    Gruenemeyer, D.

    1991-01-01

    This paper reports on Distribution Automation (DA) systems. A DA system enhances the efficiency and productivity of a utility. It also provides intangible benefits such as improved public image and market advantages. A utility should evaluate the benefits and costs of such a system before committing funds. The expenditure for distribution automation is economical when justified by the deferral of a capacity increase, a decrease in peak power demand, or a reduction in operation and maintenance (O&M) requirements.

  19. Quantification of common carotid artery and descending aorta vessel wall thickness from MR vessel wall imaging using a fully automated processing pipeline.

    Science.gov (United States)

    Gao, Shan; van 't Klooster, Ronald; Brandts, Anne; Roes, Stijntje D; Alizadeh Dehnavi, Reza; de Roos, Albert; Westenberg, Jos J M; van der Geest, Rob J

    2017-01-01

    To develop and evaluate a method that can fully automatically identify the vessel wall boundaries and quantify the wall thickness for both the common carotid artery (CCA) and descending aorta (DAO) from axial magnetic resonance (MR) images. 3T MRI data acquired with a T1-weighted gradient-echo black-blood imaging sequence from carotid (39 subjects) and aorta (39 subjects) were used to develop and test the algorithm. The vessel wall segmentation was achieved by fitting a 3D cylindrical B-spline surface to the boundaries of the lumen and the outer wall, respectively. The tube fitting was based on edge detection performed on the signal intensity (SI) profile along the surface normal. To achieve a fully automated process, a Hough Transform (HT) was developed to estimate the lumen centerline and radii for the target vessel. Using the outputs of the HT, a tube model for lumen segmentation was initialized and deformed to fit the image data. Finally, the lumen segmentation was dilated to initiate the adaptation procedure of the outer wall tube. The algorithm was validated by determining: 1) its performance against manual tracing; 2) its interscan reproducibility in quantifying vessel wall thickness (VWT); 3) its capability of detecting VWT differences in hypertensive patients compared with healthy controls. Statistical analysis, including Bland-Altman analysis, t-tests, and sample size calculation, was performed for the purpose of algorithm evaluation. The mean distance between the manual and automatically detected lumen/outer wall contours was 0.00 ± 0.23/0.09 ± 0.21 mm for the CCA and 0.12 ± 0.24/0.14 ± 0.35 mm for the DAO. No significant difference was observed between the interscan VWT assessments using automated segmentation for either the CCA (P = 0.19) or the DAO (P = 0.94). Both manual and automated segmentation detected significantly higher carotid (P = 0.016 and P = 0.005) and aortic (P < 0.001 and P = 0.021) wall thickness in the hypertensive patients. A reliable and reproducible pipeline for fully
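
    The Hough-transform initialization of the lumen centre and radius can be sketched with scikit-image's circular Hough transform; the radius range and edge-detector settings below are illustrative assumptions, not the values used in the paper.

      import numpy as np
      from skimage.feature import canny
      from skimage.transform import hough_circle, hough_circle_peaks

      def initial_lumen_estimate(slice_img, radii=np.arange(5, 25)):
          edges = canny(slice_img, sigma=2.0)             # edge map of the axial slice
          accum = hough_circle(edges, radii)              # circular Hough accumulator
          _, cx, cy, r = hough_circle_peaks(accum, radii, total_num_peaks=1)
          return (cx[0], cy[0]), r[0]                     # centre and radius seeding the tube model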

  20. Smart Buildings and Demand Response

    Science.gov (United States)

    Kiliccote, Sila; Piette, Mary Ann; Ghatikar, Girish

    2011-11-01

    Advances in communications and control technology, the strengthening of the Internet, and the growing appreciation of the urgency to reduce demand-side energy use are motivating the development of improvements in both energy efficiency and demand response (DR) systems in buildings. This paper provides a framework linking continuous energy management and continuous communications for automated demand response (Auto-DR) at various time scales. We provide a set of concepts for monitoring and controls linked to standards and procedures such as the Open Automated Demand Response (OpenADR) communication standards. Basic building energy science and control issues in this approach begin with key building components, systems, end-uses and whole-building energy performance metrics. The paper presents a framework covering when energy is used, levels of service provided by energy-using systems, granularity of control, and speed of telemetry. DR, when defined as a discrete event, requires a different set of building service levels than daily operations. We provide examples of lessons from DR case studies and links to energy efficiency.

  1. Real-time direct cell concentration and viability determination using a fully automated microfluidic platform for standalone process monitoring

    DEFF Research Database (Denmark)

    Rodrigues de Sousa Nunes, Pedro André; Kjaerulff, S.; Dufva, Martin

    2015-01-01

    In this work, we report on the development of a fully automated microfluidic system capable of extracting samples directly from a bioreactor, diluting the sample, staining the cells... We demonstrate system performance by monitoring in real time the cell concentration and viability of yeast extracted directly from an in-house made bioreactor. This is the first demonstration of using the Dean drag force, generated due to the implementation of a curved microchannel geometry in conjunction with high flow rates, to promote passive mixing of cell samples and thus homogenization of the diluted cell plug. The autonomous operation of the fluidics furthermore allows implementation of intelligent protocols for administering air bubbles from the bioreactor in the microfluidic system... and thereby ensures optimal cell production, by prolonging the fermentation cycle and increasing the bioreactor output.

  2. Fully automated SPE-based synthesis and purification of 2-[18F]fluoroethyl-choline for human use

    Energy Technology Data Exchange (ETDEWEB)

    Schmaljohann, Joern [Department of Nuclear Medicine, University of Bonn, Bonn (Germany); Department of Nuclear Medicine, University of Aachen, Aachen (Germany); Schirrmacher, Esther [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Waengler, Bjoern; Waengler, Carmen [Department of Nuclear Medicine, Ludwig-Maximilians University, Munich (Germany); Schirrmacher, Ralf, E-mail: ralf.schirrmacher@mcgill.c [McConnell Brain Imaging Centre, Montreal Neurological Institute, McGill University, Montreal, Quebec (Canada); Guhlke, Stefan, E-mail: stefan.guhlke@ukb.uni-bonn.d [Department of Nuclear Medicine, University of Bonn, Bonn (Germany)

    2011-02-15

    Introduction: 2-[18F]Fluoroethyl-choline ([18F]FECH) is a promising tracer for the detection of prostate cancer as well as brain tumors with positron emission tomography (PET). [18F]FECH is actively transported into mammalian cells, becomes phosphorylated by choline kinase and gets incorporated into the cell membrane after being metabolized to phosphatidylcholine. So far, its synthesis is a two-step procedure involving at least one HPLC purification step. To allow a wider dissemination of this tracer, finding a purification method avoiding HPLC is highly desirable and would result in easier accessibility and more reliable production of [18F]FECH. Methods: [18F]FECH was synthesized by reaction of 2-bromo-1-[18F]fluoroethane ([18F]BFE) with dimethylaminoethanol (DMAE) in DMSO. We applied a novel and very reliable work-up procedure for the synthesis of [18F]BFE. Based on a combination of three different solid-phase cartridges, the purification of [18F]BFE from its precursor 2-bromoethyl-4-nitrobenzenesulfonate (BENos) could be achieved without using HPLC. Following the subsequent reaction of the purified [18F]BFE with DMAE, the final product [18F]FECH was obtained as a sterile solution by passing the crude reaction mixture through a combination of two CM plus cartridges and a sterile filter. The fully automated synthesis was performed using either a Raytest SynChrom module (Raytest, Germany) or a Scintomics HotboxIII module (Scintomics, Germany). Results: The radiotracer [18F]FECH can be synthesized in reliable radiochemical yields (RCY) of 37±5% (Synchrom module) and 33±5% (Hotbox III unit) in less than 1 h using these two fully automated, commercially available synthesis units without HPLC involvement for purification. Detailed quality control of the final injectable [18F]FECH solution proved the high radiochemical purity and the absence of Kryptofix 2.2.2, DMAE and DMSO used in the

  3. A fully automated mass spectrometer for the analysis of organic solids

    International Nuclear Information System (INIS)

    Hillig, H.; Kueper, H.; Riepe, W.

    1979-01-01

    Automation of a mass spectrometer-computer system makes it possible to process up to 30 samples without attention after sample loading. An automatic sample changer introduces the samples successively into the ion source by means of a direct inlet probe. A process control unit determines the operation sequence. Computer programs are available for the hardware support, system supervision and evaluation of the spectrometer signals. The most essential precondition for automation - automatic evaporation of the sample material by electronic control of the total ion current - is confirmed to be satisfactory. The system operates routinely overnight in an industrial laboratory, so that day work can be devoted to difficult analytical problems. The cost of routine analyses is halved. (Auth.)

  4. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN) 6 ] 3–/4– Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.

    2013-12-17

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered parameters in all forms of data analysis encompass E0 (reversible potential), k0 (heterogeneous charge transfer rate constant at E0), α (charge transfer coefficient), Ru (uncompensated resistance), and Cdl (double layer capacitance). The automated method of analysis employed time domain optimization and Bayesian statistics. This and all other methods assumed the Butler-Volmer model applies for electron transfer kinetics, planar diffusion for mass transport, Ohm's Law for Ru, and a potential-independent Cdl model. Heuristic approaches utilize combinations of Fourier Transform filtering, sensitivity analysis, and simplex-based forms of optimization applied to resolved AC harmonics and rely on experimenter experience to assist in experiment-theory comparisons. Remarkable consistency of parameter evaluation was achieved, although the fully automated time domain method provided consistently higher α values than those based on frequency domain data analysis. The origin of this difference is that the implemented fully automated method requires a perfect model for the double layer capacitance. In contrast, the importance of imperfections in the double layer model is minimized when analysis is performed in the frequency domain. Substantial variation in k0 values was found by analysis of the 10 data sets for this highly surface-sensitive pathologically variable [Fe(CN)6]3-/4- process, but remarkably, all fit the quasi-reversible model satisfactorily. © 2013 American Chemical Society.
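
    All of the analysis approaches above assume Butler-Volmer electron-transfer kinetics. The fragment below evaluates the textbook Butler-Volmer current for given surface concentrations; it is a generic illustration with invented parameter values, not the authors' simulation code.

        import math

        F = 96485.33   # Faraday constant, C/mol
        R = 8.314462   # gas constant, J/(mol K)

        def butler_volmer_current(E, E0, k0, alpha, c_ox, c_red, area_cm2=1.0, T=298.15):
            # Faradaic current (A) for a one-electron process, anodic current positive.
            f = F / (R * T)
            k_red = k0 * math.exp(-alpha * f * (E - E0))          # reduction rate constant, cm/s
            k_ox = k0 * math.exp((1.0 - alpha) * f * (E - E0))    # oxidation rate constant, cm/s
            # c_ox and c_red are surface concentrations in mol/cm^3.
            return F * area_cm2 * (k_ox * c_red - k_red * c_ox)

        # Illustrative values: k0 = 0.01 cm/s, alpha = 0.5, 1 mM of each species, 50 mV overpotential.
        print(butler_volmer_current(E=0.05, E0=0.0, k0=0.01, alpha=0.5, c_ox=1e-6, c_red=1e-6))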

  5. Comparison and clinical utility evaluation of four multiple allergen simultaneous tests including two newly introduced fully automated analyzers

    Directory of Open Access Journals (Sweden)

    John Hoon Rim

    2016-04-01

    Full Text Available Background: We compared the diagnostic performances of two newly introduced fully automated multiple allergen simultaneous test (MAST) analyzers with two conventional MAST assays. Methods: The serum samples from a total of 53 and 104 patients were tested for food panels and inhalant panels, respectively, in four analyzers including AdvanSure AlloScreen (LG Life Science, Korea), AdvanSure Allostation Smart II (LG Life Science), PROTIA Allergy-Q (ProteomeTech, Korea), and RIDA Allergy Screen (R-Biopharm, Germany). We compared not only the total agreement percentages but also positive propensities among the four analyzers. Results: Evaluation of AdvanSure Allostation Smart II as an upgraded version of AdvanSure AlloScreen revealed good concordance, with total agreement percentages of 93.0% and 92.2% in the food and inhalant panels, respectively. Comparisons of AdvanSure Allostation Smart II or PROTIA Allergy-Q with RIDA Allergy Screen also showed good concordance, with positive propensities of the two new analyzers for common allergens (Dermatophagoides farinae and Dermatophagoides pteronyssinus). Changes of the cut-off level resulted in various total agreement percentage fluctuations among allergens by different analyzers, although the current cut-off level of class 2 appeared to be generally suitable. Conclusions: AdvanSure Allostation Smart II and PROTIA Allergy-Q presented favorable agreement performances with RIDA Allergy Screen, although positive propensities were noticed in common allergens. Keywords: Multiple allergen simultaneous test, Automated analyzer

  6. Toward Fully Automated Multicriterial Plan Generation: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Breedveld, Sebastiaan; Fransen, Dennie; Levendag, Peter C.; Heijmen, Ben J.M.

    2013-01-01

    Purpose: To prospectively compare plans generated with iCycle, an in-house-developed algorithm for fully automated multicriterial intensity modulated radiation therapy (IMRT) beam profile and beam orientation optimization, with plans manually generated by dosimetrists using the clinical treatment planning system. Methods and Materials: For 20 randomly selected head-and-neck cancer patients with various tumor locations (of whom 13 received sequential boost treatments), we offered the treating physician the choice between an automatically generated iCycle plan and a manually optimized plan using standard clinical procedures. Although iCycle used a fixed “wish list” with hard constraints and prioritized objectives, the dosimetrists manually selected the beam configuration and fine tuned the constraints and objectives for each IMRT plan. Dosimetrists were not informed in advance whether a competing iCycle plan was made. The 2 plans were simultaneously presented to the physician, who then selected the plan to be used for treatment. For the patient group, differences in planning target volume coverage and sparing of critical tissues were quantified. Results: In 32 of 33 plan comparisons, the physician selected the iCycle plan for treatment. This highly consistent preference for the automatically generated plans was mainly caused by the improved sparing for the large majority of critical structures. With iCycle, the normal tissue complication probabilities for the parotid and submandibular glands were reduced by 2.4% ± 4.9% (maximum, 18.5%, P=.001) and 6.5% ± 8.3% (maximum, 27%, P=.005), respectively. The reduction in the mean oral cavity dose was 2.8 ± 2.8 Gy (maximum, 8.1 Gy, P=.005). For the swallowing muscles, the esophagus and larynx, the mean dose reduction was 3.3 ± 1.1 Gy (maximum, 9.2 Gy, P<.001). For 15 of the 20 patients, target coverage was also improved. Conclusions: In 97% of cases, automatically generated plans were selected for treatment because of

  7. Varying Levels of Automation on UAS Operator Responses to Traffic Resolution Advisories in Civil Airspace

    Science.gov (United States)

    Kenny, Caitlin; Fern, Lisa

    2012-01-01

    Continuing demand for the use of Unmanned Aircraft Systems (UAS) has put increasing pressure on operations in civil airspace. The need to fly UAS in the National Airspace System (NAS) in order to perform missions vital to national security and defense, emergency management, and science is increasing at a rapid pace. In order to ensure safe operations in the NAS, operators of unmanned aircraft, like those of manned aircraft, may be required to maintain separation assurance and avoid loss of separation with other aircraft while performing their mission tasks. This experiment investigated the effects of varying levels of automation on UAS operator performance and workload while responding to conflict resolution instructions provided by the Tactical Collision Avoidance System II (TCAS II) during a UAS mission in high-density airspace. The purpose of this study was not to investigate the safety of using TCAS II on UAS, but rather to examine the effect of automation on the ability of operators to respond to traffic collision alerts. Six licensed pilots were recruited to act as UAS operators for this study. Operators were instructed to follow a specified mission flight path, while maintaining radio contact with Air Traffic Control and responding to TCAS II resolution advisories. Operators flew four, 45 minute, experimental missions with four different levels of automation: Manual, Knobs, Management by Exception, and Fully Automated. All missions included TCAS II Resolution Advisories (RAs) that required operator attention and rerouting. Operator compliance and reaction time to RAs was measured, and post-run NASA-TLX ratings were collected to measure workload. Results showed significantly higher compliance rates, faster responses to TCAS II alerts, as well as less preemptive operator actions when higher levels of automation are implemented. Physical and Temporal ratings of workload were significantly higher in the Manual condition than in the Management by Exception and

  8. Dynamic adaptive policymaking for the sustainable city: The case of automated taxis

    Directory of Open Access Journals (Sweden)

    Warren E. Walker

    2017-06-01

    Full Text Available By 2050, about two-thirds of the world’s people are expected to live in urban areas. But, the economic viability and sustainability of city centers is threatened by problems related to transport, such as pollution, congestion, and parking. Much has been written about automated vehicles and demand responsive transport. The combination of these potentially disruptive developments could reduce these problems. However, implementation is held back by uncertainties, including public acceptance, liability, and privacy. So, their potential to reduce urban transport problems may not be fully realized. We propose an adaptive approach to implementation that takes some actions right away and creates a framework for future actions that allows for adaptations over time as knowledge about performance and acceptance of the new system (called ‘automated taxis’ accumulates and critical events for implementation take place. The adaptive approach is illustrated in the context of a hypothetical large city.

  9. Manual or automated measuring of antipsychotics' chemical oxygen demand.

    Science.gov (United States)

    Pereira, Sarah A P; Costa, Susana P F; Cunha, Edite; Passos, Marieta L C; Araújo, André R S T; Saraiva, M Lúcia M F S

    2018-05-15

    Antipsychotic (AP) drugs accumulate in terrestrial and aqueous resources as a result of their current consumption, so the search for methods to assess the contamination load of these drugs is mandatory. COD is a key parameter used for monitoring water quality by assessing the effect of polluting agents on the oxygen level. The present work therefore aims to assess the chemical oxygen demand (COD) levels of several typical and atypical antipsychotic drugs in order to obtain structure-activity relationships. A titrimetric method was implemented, with potassium dichromate as oxidant and a 2 h digestion step, followed by titration of the remaining unreduced dichromate. An automated sequential injection analysis (SIA) method was then used to overcome some drawbacks of the titrimetric method. The results showed a relationship between the chemical structures of antipsychotic drugs and their COD values: the presence of aromatic rings and oxidizable groups gives higher COD values. Good agreement was obtained between the results of the reference batch procedure and the SIA system, and the APs clustered into two groups, with a ratio between the two methodologies of 2 or 4 in the case of lower or higher COD values, respectively. The SIA methodology can operate as a screening method at any stage of a synthetic process and is also more environmentally friendly and cost-effective. In addition, the studies presented open promising perspectives for improving the effectiveness of pharmaceutical removal from waste effluents by assessing COD values. Copyright © 2018 Elsevier Inc. All rights reserved.
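
    As a reference point, the dichromate back-titration used in the titrimetric COD determination is conventionally evaluated with the standard relationship COD = (A - B) * M * 8000 / V. The snippet below is a generic illustration of that calculation; the titration volumes and sample size are hypothetical and not taken from the paper.

        def cod_mg_o2_per_l(blank_titrant_ml, sample_titrant_ml, fas_molarity, sample_volume_ml):
            # COD (mg O2/L) from digestion with excess dichromate followed by back-titration of the
            # remaining dichromate with ferrous ammonium sulfate (FAS):
            # A = mL FAS for the blank, B = mL FAS for the sample, M = FAS molarity, V = sample mL.
            return (blank_titrant_ml - sample_titrant_ml) * fas_molarity * 8000.0 / sample_volume_ml

        # Hypothetical titration: blank 9.8 mL, sample 6.2 mL of 0.10 M FAS on a 20 mL sample.
        print(cod_mg_o2_per_l(9.8, 6.2, 0.10, 20.0))  # -> 144.0 mg O2/L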

  10. Fully-automated in-syringe dispersive liquid-liquid microextraction for the determination of caffeine in coffee beverages.

    Science.gov (United States)

    Frizzarin, Rejane M; Maya, Fernando; Estela, José M; Cerdà, Víctor

    2016-12-01

    A novel fully-automated magnetic stirring-assisted lab-in-syringe analytical procedure has been developed for the fast and efficient dispersive liquid-liquid microextraction (DLLME) of caffeine in coffee beverages. The procedure is based on the microextraction of caffeine with a minute amount of dichloromethane, isolating caffeine from the sample matrix with no further sample pretreatment. Selection of the relevant extraction parameters such as the dispersive solvent, proportion of aqueous/organic phase, pH and flow rates has been carefully evaluated. Caffeine quantification was linear from 2 to 75 mg L(-1), with detection and quantification limits of 0.46 mg L(-1) and 1.54 mg L(-1), respectively. A coefficient of variation (n=8; 5 mg L(-1)) of 2.1% and a sampling rate of 16 h(-1) were obtained. The procedure was satisfactorily applied to the determination of caffeine in brewed, instant and decaf coffee samples, and the results were validated using high-performance liquid chromatography. Copyright © 2016 Elsevier Ltd. All rights reserved.
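
    Detection and quantification limits of this kind are commonly estimated from the calibration curve as 3.3 and 10 times the ratio of the residual (or blank) standard deviation to the slope. The snippet below illustrates that convention only; the standard deviation and slope are invented numbers, not the paper's calibration data.

        def limits_from_calibration(sigma, slope):
            # ICH-style estimates: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope.
            return 3.3 * sigma / slope, 10.0 * sigma / slope

        # Hypothetical calibration: residual standard deviation 0.007 AU, slope 0.05 AU per mg/L.
        lod, loq = limits_from_calibration(0.007, 0.05)
        print(f"LOD = {lod:.2f} mg/L, LOQ = {loq:.2f} mg/L")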

  11. A fully automated fast analysis system for capillary gas chromatography. Part 1. Automation of system control

    NARCIS (Netherlands)

    Snijders, H.M.J.; Rijks, J.P.E.M.; Bombeeck, A.J.; Rijks, J.A.; Sandra, P.; Lee, M.L.

    1992-01-01

    This paper deals with the design, automation and evaluation of a high-speed capillary gas chromatographic system. A combination of software and hardware was developed for a new cold trap/reinjection device that allows selective solvent elimination and on-column sample enrichment and an

  12. Automation in Immunohematology

    Directory of Open Access Journals (Sweden)

    Meenu Bajpai

    2012-01-01

    Full Text Available There have been rapid technological advances in blood banking in the South Asian region over the past decade, with an increasing emphasis on quality and safety of blood products. The conventional test tube technique has given way to newer techniques such as the column agglutination technique, solid phase red cell adherence assay, and erythrocyte-magnetized technique. These new technologies are adaptable to automation, and major manufacturers in this field have come up with semi- and fully automated equipment for immunohematology tests in the blood bank. Automation improves the objectivity and reproducibility of tests. It reduces human errors in patient identification and transcription errors. Documentation and traceability of tests, reagents and processes and archiving of results is another major advantage of automation. Shifting from manual methods to automation is a major undertaking for any transfusion service to provide quality patient care with a shorter turnaround time for their ever-increasing workload. This article discusses the various issues involved in the process.

  13. Technical Note: A fully automated purge and trap GC-MS system for quantification of volatile organic compound (VOC fluxes between the ocean and atmosphere

    Directory of Open Access Journals (Sweden)

    S. J. Andrews

    2015-04-01

    Full Text Available The oceans are a key source of a number of atmospherically important volatile gases. The accurate and robust determination of trace gases in seawater is a significant analytical challenge, requiring reproducible and ideally automated sample handling, a high efficiency of seawater–air transfer, removal of water vapour from the sample stream, and high sensitivity and selectivity of the analysis. Here we describe a system that was developed for the fully automated analysis of dissolved very short-lived halogenated species (VSLS) sampled from an under-way seawater supply. The system can also be used for semi-automated batch sampling from Niskin bottles filled during CTD (conductivity, temperature, depth) profiles. The essential components comprise a bespoke, automated purge and trap (AutoP & T) unit coupled to a commercial thermal desorption and gas chromatograph mass spectrometer (TD-GC-MS). The AutoP & T system has completed five research cruises, from the tropics to the poles, and collected over 2500 oceanic samples to date. It is able to quantify >25 species over a boiling point range of 34–180 °C with Henry's law coefficients of 0.018 and greater (CH2I2, kHcc dimensionless gas/aqueous) and has been used to measure organic sulfurs, hydrocarbons, halocarbons and terpenes. In the eastern tropical Pacific, the high sensitivity and sampling frequency provided new information regarding the distribution of VSLS, including novel measurements of a photolytically driven diurnal cycle of CH2I2 within the surface ocean water.
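
    The dimensionless gas/aqueous Henry's law coefficient quoted above (kHcc) is simply the ratio of gas-phase to aqueous-phase concentration at equilibrium. The short sketch below illustrates the conversion from the commonly tabulated aqueous-solubility form; the example solubility constant is an illustrative assumption, not a value from the paper.

        R_ATM = 0.082057  # gas constant in L atm / (mol K)

        def khcc_from_solubility(kh_mol_per_l_atm, temperature_k=298.15):
            # Dimensionless gas/aqueous Henry coefficient kHcc = c_gas / c_aq,
            # converted from the solubility form KH = c_aq / p (mol L^-1 atm^-1).
            return 1.0 / (kh_mol_per_l_atm * R_ATM * temperature_k)

        # Illustrative solubility constant of 2.3 mol L^-1 atm^-1 at 25 degC.
        print(round(khcc_from_solubility(2.3), 3))  # ~0.018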

  14. How do Air Traffic Controllers Use Automation and Tools Differently During High Demand Situations?

    Science.gov (United States)

    Kraut, Joshua M.; Mercer, Joey; Morey, Susan; Homola, Jeffrey; Gomez, Ashley; Prevot, Thomas

    2013-01-01

    In a human-in-the-loop simulation, two air traffic controllers managed identical airspace while burdened with higher than average workload, and while using advanced tools and automation designed to assist with scheduling aircraft on multiple arrival flows to a single meter fix. This paper compares the strategies employed by each controller, and investigates how the controllers' strategies change while managing their airspace under more normal workload conditions and a higher workload condition. Each controller engaged in different methods of maneuvering aircraft to arrive on schedule, and adapted their strategies to cope with the increased workload in different ways. Based on the conclusions three suggestions are made: that quickly providing air traffic controllers with recommendations and information to assist with maneuvering and scheduling aircraft when burdened with increased workload will improve the air traffic controller's effectiveness, that the tools should adapt to the strategy currently employed by a controller, and that training should emphasize which traffic management strategies are most effective given specific airspace demands.

  15. A fully automated multi-modal computer aided diagnosis approach to coronary calcium scoring of MSCT images

    Science.gov (United States)

    Wu, Jing; Ferns, Gordon; Giles, John; Lewis, Emma

    2012-03-01

    Inter- and intra-observer variability is a problem often faced when an expert or observer is tasked with assessing the severity of a disease. This issue is keenly felt in coronary calcium scoring of patients suffering from atherosclerosis, where in clinical practice the observer must identify firstly the presence, and then the location, of candidate calcified plaques found within the coronary arteries that may prevent oxygenated blood flow to the heart muscle. However, it can be difficult for a human observer to differentiate calcified plaques that are located in the coronary arteries from those found in surrounding anatomy such as the mitral valve or pericardium. In addition to the benefits to scoring accuracy, the use of fast, low-dose multi-slice CT imaging to perform the cardiac scan allows the entire heart to be acquired within a single breath hold. This exposes the patient to a lower radiation dose, which, for a progressive disease such as atherosclerosis where multiple scans may be required, is beneficial to their health. Presented here is a fully automated method for calcium scoring using both the traditional Agatston method and the volume scoring method. Elimination of the unwanted regions of the cardiac image slices such as lungs, ribs, and vertebrae is carried out using adaptive heart isolation. Such regions cannot contain calcified plaques but can be of a similar intensity, and their removal will aid detection. Removal of both the ascending and descending aortas, as they contain clinically insignificant plaques, is necessary before the final calcium scores are calculated and examined against ground truth scores averaged from three expert observers. The results presented here are intended to show the feasibility of, and requirement for, an automated scoring method to reduce the subjectivity and reproducibility error inherent in manual clinical calcium scoring.
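
    For reference, the Agatston method mentioned above scores each calcified lesion as its area multiplied by a weight determined by the lesion's peak attenuation (1 for 130-199 HU, 2 for 200-299 HU, 3 for 300-399 HU, 4 for 400 HU and above), summed over all slices. The snippet below is a minimal illustration of that rule with made-up plaque measurements; it is not the authors' implementation.

        def agatston_weight(peak_hu):
            # Density weighting factor used by the Agatston calcium score.
            if peak_hu >= 400: return 4
            if peak_hu >= 300: return 3
            if peak_hu >= 200: return 2
            if peak_hu >= 130: return 1
            return 0

        def agatston_score(lesions):
            # lesions: iterable of (area_mm2, peak_hu) tuples, one per lesion per slice.
            return sum(area * agatston_weight(peak) for area, peak in lesions)

        # Hypothetical plaques: 4 mm^2 peaking at 250 HU and 6 mm^2 peaking at 420 HU.
        print(agatston_score([(4.0, 250), (6.0, 420)]))  # -> 32.0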

  16. Automated mammographic breast density estimation using a fully convolutional network.

    Science.gov (United States)

    Lee, Juhun; Nishikawa, Robert M

    2018-03-01

    The purpose of this study was to develop a fully automated algorithm for mammographic breast density estimation using deep learning. Our algorithm used a fully convolutional network, which is a deep learning framework for image segmentation, to segment both the breast and the dense fibroglandular areas on mammographic images. Using the segmented breast and dense areas, our algorithm computed the breast percent density (PD), which is the fraction of dense area in a breast. Our dataset included full-field digital screening mammograms of 604 women, which included 1208 mediolateral oblique (MLO) and 1208 craniocaudal (CC) views. We allocated 455, 58, and 91 of 604 women and their exams into training, testing, and validation datasets, respectively. We established ground truth for the breast and the dense fibroglandular areas via manual segmentation and segmentation using a simple thresholding based on BI-RADS density assessments by radiologists, respectively. Using the mammograms and ground truth, we fine-tuned a pretrained deep learning network to train the network to segment both the breast and the fibroglandular areas. Using the validation dataset, we evaluated the performance of the proposed algorithm against radiologists' BI-RADS density assessments. Specifically, we conducted a correlation analysis between a BI-RADS density assessment of a given breast and its corresponding PD estimate by the proposed algorithm. In addition, we evaluated our algorithm in terms of its ability to classify the BI-RADS density using PD estimates, and its ability to provide consistent PD estimates for the left and the right breast and the MLO and CC views of the same women. To show the effectiveness of our algorithm, we compared the performance of our algorithm against a state-of-the-art algorithm, laboratory for individualized breast radiodensity assessment (LIBRA). The PD estimated by our algorithm correlated well with BI-RADS density ratings by radiologists. Pearson's rho values of
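
    Percent density as defined above is just the dense-tissue area divided by the total breast area obtained from the two segmentation masks. The sketch below illustrates that computation on toy binary masks; it is a generic example, not the study's network code.

        def percent_density(breast_mask, dense_mask):
            # Breast percent density (%) from two equally sized binary masks,
            # where 1 marks breast tissue and dense fibroglandular tissue, respectively.
            breast_area = sum(sum(row) for row in breast_mask)
            dense_area = sum(sum(row) for row in dense_mask)
            return 100.0 * dense_area / breast_area if breast_area else 0.0

        # Toy 3x4 masks: 8 breast pixels, 3 of them dense -> PD = 37.5%.
        breast = [[1, 1, 1, 0], [1, 1, 1, 0], [1, 1, 0, 0]]
        dense = [[0, 1, 1, 0], [0, 1, 0, 0], [0, 0, 0, 0]]
        print(percent_density(breast, dense))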

  17. The Employment-Impact of Automation in Canada

    OpenAIRE

    McLean, Colin Alexander

    2015-01-01

    Standard neoclassical models of labour demand predict that automation does not produce long-term increases in unemployment. Supporting evidence in Canada between 1970 and 2008 is explained by the reallocation of labour from industries with high levels of automation such as Manufacturing to industries with low levels of automation such as Retail and Wholesale Trade, and Business Services. Recent evidence indicates however that on-going technological advances are now driving labour automation i...

  18. A fully robust PARAFAC method for analyzing fluorescence data

    DEFF Research Database (Denmark)

    Engelen, Sanne; Frosch, Stina; Jørgensen, Bo

    2009-01-01

    … and Rayleigh scatter. Recently, a robust PARAFAC method that circumvents the harmful effects of outlying samples has been developed. Different techniques exist for removing the scatter effects on the final PARAFAC model, and an automated scatter identification tool has recently been constructed. However, there still exists no robust method for handling fluorescence data encountering both outlying EEM landscapes and scatter. In this paper, we present an iterative algorithm in which the robust PARAFAC method and the scatter identification tool are alternately performed. A fully automated robust PARAFAC method …

  19. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    Energy Technology Data Exchange (ETDEWEB)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo [Nishi-Kobe Medical Center (Japan); Yonekura, Yoshiharu [Fukui Medical Univ., Matsuoka (Japan); Konishi, Junji [Kyoto Univ. (Japan). Graduate School of Medicine

    2003-03-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer ({sup 99m}Tc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T{sub 2}-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using {sup 99m}Tc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  20. Fully automated quantification of regional cerebral blood flow with three-dimensional stereotaxic region of interest template. Validation using magnetic resonance imaging. Technical note

    International Nuclear Information System (INIS)

    Takeuchi, Ryo; Katayama, Shigenori; Takeda, Naoya; Fujita, Katsuzo; Yonekura, Yoshiharu; Konishi, Junji

    2003-01-01

    The previously reported three-dimensional stereotaxic region of interest (ROI) template (3DSRT-t) for the analysis of anatomically standardized technetium-99m-L,L-ethyl cysteinate dimer (99mTc-ECD) single photon emission computed tomography (SPECT) images was modified for use in a fully automated regional cerebral blood flow (rCBF) quantification software, 3DSRT, incorporating an anatomical standardization engine transplanted from statistical parametric mapping 99 and ROIs for quantification based on 3DSRT-t. Three-dimensional T2-weighted magnetic resonance images of 10 patients with localized infarcted areas were compared with the ROI contour of 3DSRT, and the positions of the central sulcus in the primary sensorimotor area were also estimated. All positions of the 20 lesions were in strict accordance with the ROI delineation of 3DSRT. The central sulcus was identified on at least one side of 210 paired ROIs and in the middle of 192 (91.4%) of these 210 paired ROIs among the 273 paired ROIs of the primary sensorimotor area. The central sulcus was recognized in the middle of more than 71.4% of the ROIs in which the central sulcus was identifiable in the respective 28 slices of the primary sensorimotor area. Fully automated accurate ROI delineation on anatomically standardized images is possible with 3DSRT, which enables objective quantification of rCBF and vascular reserve in only a few minutes using 99mTc-ECD SPECT images obtained by the resting and vascular reserve (RVR) method. (author)

  1. Automated MAD and MIR structure solution

    International Nuclear Information System (INIS)

    Terwilliger, Thomas C.; Berendzen, Joel

    1999-01-01

    A fully automated procedure for solving MIR and MAD structures has been developed using a scoring scheme to convert the structure-solution process into an optimization problem. Obtaining an electron-density map from X-ray diffraction data can be difficult and time-consuming even after the data have been collected, largely because MIR and MAD structure determinations currently require many subjective evaluations of the qualities of trial heavy-atom partial structures before a correct heavy-atom solution is obtained. A set of criteria for evaluating the quality of heavy-atom partial solutions in macromolecular crystallography have been developed. These have allowed the conversion of the crystal structure-solution process into an optimization problem and have allowed its automation. The SOLVE software has been used to solve MAD data sets with as many as 52 selenium sites in the asymmetric unit. The automated structure-solution process developed is a major step towards the fully automated structure-determination, model-building and refinement procedure which is needed for genomic scale structure determinations

  2. Analysis of xanthines in beverages using a fully automated SPE-SPC-DAD hyphenated system

    Energy Technology Data Exchange (ETDEWEB)

    Medvedovici, A. [Bucarest Univ., Bucarest (Romania). Faculty of Chemistry, Dept. of Analytical Chemistry; David, F.; David, V.; Sandra, P. [Research Institute of Chromatography, Kortrijk (Belgium)

    2000-08-01

    Analysis of some xanthines (caffeine, theophylline and theobromine) in beverages has been achieved by a fully automated on-line Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD) system. Three adsorbents have been tested for the SPE procedure: octadecyl modified silica gel (ODS) and two types of styrene-divinylbenzene copolymer based materials, of which Porapack proved to be the most suitable adsorbent. Optimisation and correlation of both SPE and SFC operational parameters are also discussed. By this technique, caffeine was determined in ice tea and Coca-Cola in a concentration of 0.15 ppm, theobromine - 1.5 ppb, and theophylline - 0.15 ppb. [Translated from Italian] The analysis of some xanthines (caffeine, theophylline and theobromine) was carried out with a fully automated on-line system based on Solid Phase Extraction - Supercritical Fluid Chromatography - Diode Array Detection (SPE-SFC-DAD). Three substrates were evaluated for the SPE procedure: octadecyl silica (ODS) and two types of styrene-divinylbenzene polymeric materials, of which the one named PRP-1 proved to be the most efficient. Both the optimisation and the correlation of the operational parameters for SPE and SFC are discussed. With this technique, caffeine, theobromine and theophylline were determined in iced tea and Coca-Cola at concentrations of 0.15, 1.5 and 0.15 ppm.

  3. Safe interaction between cyclists, pedestrians and automated vehicles : what do we know and what do we need to know?

    NARCIS (Netherlands)

    Vissers, L.; Kint, S. van der; Schagen, I.N.L.G. van; Hagenzieker, M.P.

    2017-01-01

    Automated vehicles are gradually entering our roadway system. Before our roads will be solely used by fully automated vehicles, a long transition period is to be expected in which fully automated vehicles, partly automated vehicles and manually-driven vehicles have to share our roads. The current

  4. LV challenge LKEB contribution : fully automated myocardial contour detection

    NARCIS (Netherlands)

    Wijnhout, J.S.; Hendriksen, D.; Assen, van H.C.; Geest, van der R.J.

    2009-01-01

    In this paper a contour detection method is described and evaluated on the evaluation data sets of the Cardiac MR Left Ventricle Segmentation Challenge as part of MICCAI 2009's 3D Segmentation Challenge for Clinical Applications. The proposed method, using 2D AAM and 3D ASM, performs a fully

  5. Automation of Endmember Pixel Selection in SEBAL/METRIC Model

    Science.gov (United States)

    Bhattarai, N.; Quackenbush, L. J.; Im, J.; Shaw, S. B.

    2015-12-01

    The commonly applied surface energy balance for land (SEBAL) and its variant, mapping evapotranspiration (ET) at high resolution with internalized calibration (METRIC) models require manual selection of endmember (i.e. hot and cold) pixels to calibrate sensible heat flux. Current approaches for automating this process are based on statistical methods and do not appear to be robust under varying climate conditions and seasons. In this paper, we introduce a new approach based on simple machine learning tools and search algorithms that provides an automatic and time efficient way of identifying endmember pixels for use in these models. The fully automated models were applied on over 100 cloud-free Landsat images with each image covering several eddy covariance flux sites in Florida and Oklahoma. Observed land surface temperatures at automatically identified hot and cold pixels were within 0.5% of those from pixels manually identified by an experienced operator (coefficient of determination, R2, ≥ 0.92, Nash-Sutcliffe efficiency, NSE, ≥ 0.92, and root mean squared error, RMSE, ≤ 1.67 K). Daily ET estimates derived from the automated SEBAL and METRIC models were in good agreement with their manual counterparts (e.g., NSE ≥ 0.91 and RMSE ≤ 0.35 mm day-1). Automated and manual pixel selection resulted in similar estimates of observed ET across all sites. The proposed approach should reduce time demands for applying SEBAL/METRIC models and allow for their more widespread and frequent use. This automation can also reduce potential bias that could be introduced by an inexperienced operator and extend the domain of the models to new users.
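
    The agreement statistics quoted above (NSE and RMSE) have standard definitions. The snippet below illustrates how they are computed for paired manual/automated estimates; the sample values are invented for illustration, not data from the study.

        import math

        def nse(observed, simulated):
            # Nash-Sutcliffe efficiency: 1 - sum((o - s)^2) / sum((o - mean(o))^2).
            mean_obs = sum(observed) / len(observed)
            num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
            den = sum((o - mean_obs) ** 2 for o in observed)
            return 1.0 - num / den

        def rmse(observed, simulated):
            # Root mean squared error of the paired estimates.
            return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / len(observed))

        # Hypothetical daily ET (mm/day) from manual vs automated endmember selection.
        manual = [3.1, 4.2, 2.8, 5.0, 3.9]
        automated = [3.0, 4.4, 2.9, 4.8, 4.0]
        print(round(nse(manual, automated), 3), round(rmse(manual, automated), 3))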

  6. Coordinated Demand Response and Distributed Generation Management in Residential Smart Microgrids

    DEFF Research Database (Denmark)

    Anvari-Moghaddam, Amjad; Mokhtari, Ghassem; Guerrero, Josep M.

    2016-01-01

    Nowadays, with the emergence of small-scale integrated energy systems (IESs) in the form of residential smart microgrids (SMGs), a large portion of energy can be saved through coordinated scheduling of smart household devices and management of distributed energy resources (DERs). There are significant potentials to increase the functionality of a typical demand-side management (DSM) strategy, and typical implementation of building-level DERs, by integrating them into a cohesive, networked package that fully utilizes smart energy-efficient end-use devices, advanced building control/automation systems, and an integrated communications architecture to efficiently manage energy and comfort at the end-use location. By the aid of such technologies, residential consumers also have the capability to mitigate their energy costs and satisfy their own requirements, paying less attention to the configuration of the energy …

  7. A Comparison of Fully Automated Methods of Data Analysis and Computer Assisted Heuristic Methods in an Electrode Kinetic Study of the Pathologically Variable [Fe(CN) 6 ] 3–/4– Process by AC Voltammetry

    KAUST Repository

    Morris, Graham P.; Simonov, Alexandr N.; Mashkina, Elena A.; Bordas, Rafel; Gillow, Kathryn; Baker, Ruth E.; Gavaghan, David J.; Bond, Alan M.

    2013-01-01

    Fully automated and computer assisted heuristic data analysis approaches have been applied to a series of AC voltammetric experiments undertaken on the [Fe(CN)6]3-/4- process at a glassy carbon electrode in 3 M KCl aqueous electrolyte. The recovered

  8. A fully automated temperature-dependent resistance measurement setup using van der Pauw method

    Science.gov (United States)

    Pandey, Shivendra Kumar; Manivannan, Anbarasu

    2018-03-01

    The van der Pauw (VDP) method is widely used to identify the resistance of planar homogeneous samples with four contacts placed on their periphery. We have developed a fully automated thin film resistance measurement setup using the VDP method with the capability of precisely measuring a wide range of thin film resistances from a few mΩ up to 10 GΩ under controlled temperatures from room temperature up to 600 °C. The setup utilizes a robust, custom-designed switching network board (SNB) for measuring current-voltage characteristics automatically at four different source-measure configurations based on the VDP method. Moreover, the SNB is connected with low-noise shielded coaxial cables that reduce the effect of leakage current as well as the capacitance in the circuit, thereby enhancing the accuracy of measurement. In order to enable precise and accurate resistance measurement of the sample, a wide range of sourcing currents/voltages is pre-determined, with the capability of auto-tuning over ˜12 orders of variation in the resistances. Furthermore, the setup has been calibrated with standard samples and also employed to investigate temperature-dependent resistance (few Ω-10 GΩ) measurements for various chalcogenide based phase change thin films (Ge2Sb2Te5, Ag5In5Sb60Te30, and In3SbTe2). This setup would be highly helpful for measuring the temperature-dependent resistance of a wide range of materials, i.e., metals, semiconductors, and insulators, revealing information about structural changes with temperature as reflected by changes in resistance, which is useful for numerous applications.
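
    The van der Pauw method extracts the sheet resistance Rs from two four-terminal resistance configurations via the implicit relation exp(-pi*Ra/Rs) + exp(-pi*Rb/Rs) = 1. The sketch below solves that relation numerically; it is a generic illustration of the textbook equation, not the authors' measurement software.

        import math

        def sheet_resistance(r_a_ohm, r_b_ohm):
            # Solve exp(-pi*Ra/Rs) + exp(-pi*Rb/Rs) = 1 for Rs by bisection over many decades.
            f = lambda rs: math.exp(-math.pi * r_a_ohm / rs) + math.exp(-math.pi * r_b_ohm / rs) - 1.0
            lo, hi = 1e-9, 1e12
            for _ in range(200):
                mid = math.sqrt(lo * hi)   # geometric midpoint, since Rs can span many decades
                if f(mid) > 0.0:
                    hi = mid
                else:
                    lo = mid
            return math.sqrt(lo * hi)

        # For a symmetric sample Ra = Rb = R, the solution reduces to Rs = pi*R/ln(2).
        print(round(sheet_resistance(100.0, 100.0), 2))  # ~453.24 ohm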

  9. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Directory of Open Access Journals (Sweden)

    Yu Huang

    Full Text Available Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.

  10. Fully automated whole-head segmentation with improved smoothness and continuity, with theory reviewed.

    Science.gov (United States)

    Huang, Yu; Parra, Lucas C

    2015-01-01

    Individualized current-flow models are needed for precise targeting of brain structures using transcranial electrical or magnetic stimulation (TES/TMS). The same is true for current-source reconstruction in electroencephalography and magnetoencephalography (EEG/MEG). The first step in generating such models is to obtain an accurate segmentation of individual head anatomy, including not only brain but also cerebrospinal fluid (CSF), skull and soft tissues, with a field of view (FOV) that covers the whole head. Currently available automated segmentation tools only provide results for brain tissues, have a limited FOV, and do not guarantee continuity and smoothness of tissues, which is crucially important for accurate current-flow estimates. Here we present a tool that addresses these needs. It is based on a rigorous Bayesian inference framework that combines image intensity model, anatomical prior (atlas) and morphological constraints using Markov random fields (MRF). The method is evaluated on 20 simulated and 8 real head volumes acquired with magnetic resonance imaging (MRI) at 1 mm3 resolution. We find improved surface smoothness and continuity as compared to the segmentation algorithms currently implemented in Statistical Parametric Mapping (SPM). With this tool, accurate and morphologically correct modeling of the whole-head anatomy for individual subjects may now be feasible on a routine basis. Code and data are fully integrated into SPM software tool and are made publicly available. In addition, a review on the MRI segmentation using atlas and the MRF over the last 20 years is also provided, with the general mathematical framework clearly derived.
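
    The Bayesian MRF formulation described above amounts to maximizing a posterior that combines a per-voxel intensity/atlas likelihood with a smoothness prior over neighboring labels. The toy snippet below illustrates the corresponding energy (negative log-posterior) for a simple Potts-style prior on a 1-D label array; it is entirely illustrative and is not the released SPM code.

        def mrf_energy(labels, log_likelihood, beta=1.0):
            # Negative log-posterior of a labeling: per-site data terms plus a penalty
            # of beta for every pair of neighboring sites with different labels.
            data_term = -sum(log_likelihood[i][lab] for i, lab in enumerate(labels))
            smoothness = beta * sum(1 for a, b in zip(labels, labels[1:]) if a != b)
            return data_term + smoothness

        # Toy example: 4 sites, 2 tissue classes, log-likelihoods favoring the labeling [0, 0, 1, 1].
        loglik = [[-0.1, -2.0], [-0.2, -1.5], [-1.8, -0.2], [-2.2, -0.1]]
        print(mrf_energy([0, 0, 1, 1], loglik))   # lower energy -> more probable labeling
        print(mrf_energy([0, 1, 0, 1], loglik))   # higher energy -> noisy labeling penalized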

  11. A fully automated and scalable timing probe-based method for time alignment of the LabPET II scanners

    Science.gov (United States)

    Samson, Arnaud; Thibaudeau, Christian; Bouchard, Jonathan; Gaudin, Émilie; Paulin, Caroline; Lecomte, Roger; Fontaine, Réjean

    2018-05-01

    A fully automated time alignment method based on a positron timing probe was developed to correct the channel-to-channel coincidence time dispersion of the LabPET II avalanche photodiode-based positron emission tomography (PET) scanners. The timing probe was designed to directly detect positrons and generate an absolute time reference. The probe-to-channel coincidences are recorded and processed using firmware embedded in the scanner hardware to compute the time differences between detector channels. The time corrections are then applied in real-time to each event in every channel during PET data acquisition to align all coincidence time spectra, thus enhancing the scanner time resolution. When applied to the mouse version of the LabPET II scanner, the calibration of 6 144 channels was performed in less than 15 min and showed a 47% improvement on the overall time resolution of the scanner, decreasing from 7 ns to 3.7 ns full width at half maximum (FWHM).
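
    Channel-wise time alignment of this kind boils down to estimating each channel's mean time offset relative to the reference probe and subtracting it from subsequent event timestamps. The sketch below illustrates that correction on hypothetical data; it is not the embedded firmware described above.

        from collections import defaultdict
        from statistics import mean

        def compute_offsets(coincidences):
            # coincidences: iterable of (channel_id, t_channel_ns, t_probe_ns) pairs in coincidence.
            # Returns the mean channel-minus-probe time offset for each channel.
            per_channel = defaultdict(list)
            for channel, t_chan, t_probe in coincidences:
                per_channel[channel].append(t_chan - t_probe)
            return {ch: mean(diffs) for ch, diffs in per_channel.items()}

        def align(channel, timestamp_ns, offsets):
            # Apply the per-channel correction to an event timestamp.
            return timestamp_ns - offsets.get(channel, 0.0)

        # Hypothetical calibration events for two channels.
        events = [(7, 105.2, 100.0), (7, 205.6, 200.0), (12, 98.1, 100.0), (12, 197.9, 200.0)]
        offsets = compute_offsets(events)
        print(offsets, align(7, 500.0, offsets))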

  12. A new TLD badge with machine readable ID for fully automated readout

    International Nuclear Information System (INIS)

    Kannan, S. Ratna P.; Kulkarni, M.S.

    2003-01-01

    The TLD badge currently being used for personnel monitoring of more than 40,000 radiation workers has a few drawbacks, such as the lack of an on-badge machine-readable ID code, the delicate two-point clamping of dosimeters on an aluminium card with the chance of dosimeters falling off during handling or readout, and projections on one side that make automation of readout difficult. A new badge has been designed with an 8-digit identification code in the form of an array of holes and smooth exteriors to enable full automation of readout. The new badge also permits changing of dosimeters when necessary. The new design does not affect the readout time or the dosimetric characteristics. The salient features and the dosimetric characteristics are discussed. (author)

  13. Integrated Platform for Automated Sustainable Demand Response in Smart Grids

    Energy Technology Data Exchange (ETDEWEB)

    Zois, Vassilis [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Computer Science; Frincu, Marc [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering; Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States). Dept. of Electrical Engineering

    2014-10-08

    Demand Response (DR) is a common practice used by utility providers to regulate energy demand. It is used at periods of high demand to minimize the peak-to-average consumption ratio. Several methods have been proposed … using information about the baseline consumption and the consumption during DR. Our goal is to provide a sustainable reduction to ensure the elimination of peaks in demand. The proposed system includes an adaptation mechanism for when the provided solution does not meet the DR requirements. We conducted a series of experiments using consumption data from a real-life micro grid to evaluate the efficiency as well as the robustness of our solution.
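
    Evaluating a DR event against a baseline, as described above, is conceptually a comparison of metered consumption during the event with an estimate of what consumption would have been without it. The snippet below shows one minimal way to do this with a simple average-of-similar-hours baseline; it is a generic illustration under assumed data, not the algorithm of the cited platform.

        def baseline(history_kw, hour):
            # Baseline for a given hour: average consumption at that hour over past days.
            values = [day[hour] for day in history_kw]
            return sum(values) / len(values)

        def dr_reduction(history_kw, event_day_kw, event_hours):
            # Total load reduction (kWh, assuming hourly readings) achieved during a DR event.
            return sum(baseline(history_kw, h) - event_day_kw[h] for h in event_hours)

        # Hypothetical 3-day history and an event day curtailed over the first three hours of the window.
        history = [[5, 6, 8, 9], [5, 7, 9, 9], [6, 6, 8, 10]]   # columns = consecutive hours
        event_day = [4, 4, 6, 9]
        print(round(dr_reduction(history, event_day, event_hours=[0, 1, 2]), 2))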

  14. Integrating Standard Operating Procedures with Spacecraft Automation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Spacecraft automation can be used to greatly reduce the demands on crew members' and flight controllers' time and attention. Automation can monitor critical resources,...

  15. A predictive control scheme for automated demand response mechanisms

    NARCIS (Netherlands)

    Lampropoulos, I.; Bosch, van den P.P.J.; Kling, W.L.

    2012-01-01

    The development of demand response mechanisms can provide a considerable option for the integration of renewable energy sources and the establishment of efficient generation and delivery of electrical power. The full potential of demand response can be significant, but its exploration still remains

  16. Fully automated VMAT treatment planning for advanced-stage NSCLC patients

    Energy Technology Data Exchange (ETDEWEB)

    Della Gala, Giuseppe [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Dirkx, Maarten L.P.; Hoekstra, Nienke; Fransen, Dennie; Pol, Marjan van de; Heijmen, Ben J.M. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Lanconelli, Nico [Universita di Bologna, Scuola di Scienze, Alma Mater Studiorum, Bologna (Italy); Petit, Steven F. [Erasmus MC Cancer Institute, Department of Radiation Oncology, Rotterdam (Netherlands); Massachusetts General Hospital - Harvard Medical School, Department of Radiation Oncology, Boston, MA (United States)

    2017-05-15

    To develop a fully automated procedure for multicriterial volumetric modulated arc therapy (VMAT) treatment planning (autoVMAT) for stage III/IV non-small cell lung cancer (NSCLC) patients treated with curative intent. After configuring the developed autoVMAT system for NSCLC, autoVMAT plans were compared with manually generated clinically delivered intensity-modulated radiotherapy (IMRT) plans for 41 patients. AutoVMAT plans were also compared to manually generated VMAT plans in the absence of time pressure. For 16 patients with reduced planning target volume (PTV) dose prescription in the clinical IMRT plan (to avoid violation of organs at risk tolerances), the potential for dose escalation with autoVMAT was explored. Two physicians evaluated 35/41 autoVMAT plans (85%) as clinically acceptable. Compared to the manually generated IMRT plans, autoVMAT plans showed statistically significant improved PTV coverage (V{sub 95%} increased by 1.1% ± 1.1%), higher dose conformity (R{sub 50} reduced by 12.2% ± 12.7%), and reduced mean lung, heart, and esophagus doses (reductions of 0.9 Gy ± 1.0 Gy, 1.5 Gy ± 1.8 Gy, 3.6 Gy ± 2.8 Gy, respectively, all p < 0.001). To render the six remaining autoVMAT plans clinically acceptable, a dosimetrist needed less than 10 min hands-on time for fine-tuning. AutoVMAT plans were also considered equivalent or better than manually optimized VMAT plans. For 6/16 patients, autoVMAT allowed tumor dose escalation of 5-10 Gy. Clinically deliverable, high-quality autoVMAT plans can be generated fully automatically for the vast majority of advanced-stage NSCLC patients. For a subset of patients, autoVMAT allowed for tumor dose escalation. (orig.) [Translated from German] Development of a fully automated, multicriteria-based volumetric modulated arc therapy (VMAT) treatment planning procedure (autoVMAT) for curatively treated patients with stage III/IV non-small cell lung cancer (NSCLC). After configuration of our auto

  17. Screening for anabolic steroids in urine of forensic cases using fully automated solid phase extraction and LC-MS-MS.

    Science.gov (United States)

    Andersen, David W; Linnet, Kristian

    2014-01-01

    A screening method for 18 frequently measured exogenous anabolic steroids and the testosterone/epitestosterone (T/E) ratio in forensic cases has been developed and validated. The method involves a fully automated sample preparation including enzyme treatment, addition of internal standards and solid phase extraction followed by analysis by liquid chromatography-tandem mass spectrometry (LC-MS-MS) using electrospray ionization with adduct formation for two compounds. Urine samples from 580 forensic cases were analyzed to determine the T/E ratio and occurrence of exogenous anabolic steroids. Extraction recoveries ranged from 77 to 95%, matrix effects from 48 to 78%, overall process efficiencies from 40 to 54% and the lower limit of identification ranged from 2 to 40 ng/mL. In the 580 urine samples analyzed from routine forensic cases, 17 (2.9%) were found positive for one or more anabolic steroids. Only seven different steroids including testosterone were found in the material, suggesting that only a small number of common steroids are likely to occur in a forensic context. The steroids were often in high concentrations (>100 ng/mL), and a combination of steroids and/or other drugs of abuse were seen in the majority of cases. The method presented serves as a fast and automated screening procedure, proving the suitability of LC-MS-MS for analyzing anabolic steroids. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
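
    Extraction recovery, matrix effect and overall process efficiency as reported above are conventionally derived from mean peak areas of a neat standard, a blank matrix spiked after extraction, and a blank matrix spiked before extraction. The snippet below illustrates those standard definitions; the example peak areas are invented and are not the study's data.

        def lcms_performance(area_neat, area_spiked_after, area_spiked_before):
            # Matrix effect, extraction recovery and process efficiency (all in %) from mean
            # peak areas of: neat standard, post-extraction spiked matrix, pre-extraction spiked matrix.
            matrix_effect = 100.0 * area_spiked_after / area_neat
            recovery = 100.0 * area_spiked_before / area_spiked_after
            process_efficiency = 100.0 * area_spiked_before / area_neat
            return matrix_effect, recovery, process_efficiency

        # Hypothetical peak areas for one steroid.
        me, rec, pe = lcms_performance(area_neat=1.00e6, area_spiked_after=0.60e6,
                                       area_spiked_before=0.51e6)
        print(f"matrix effect {me:.0f}%, recovery {rec:.0f}%, process efficiency {pe:.0f}%")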

  18. Fully Automated Robust System to Detect Retinal Edema, Central Serous Chorioretinopathy, and Age Related Macular Degeneration from Optical Coherence Tomography Images

    Directory of Open Access Journals (Sweden)

    Samina Khalid

    2017-01-01

    Full Text Available Maculopathy is excessive damage to the macula that leads to blindness. It mostly occurs due to retinal edema (RE), central serous chorioretinopathy (CSCR), or age related macular degeneration (ARMD). Optical coherence tomography (OCT) imaging is the latest eye testing technique that can detect these syndromes in early stages. Many researchers have used OCT images to detect retinal abnormalities. However, to the best of our knowledge, no research that presents a fully automated system to detect all of these macular syndromes is reported. This paper presents the world’s first ever decision support system to automatically detect RE, CSCR, and ARMD retinal pathologies and healthy retina from OCT images. The automated disease diagnosis in our proposed system is based on a multilayered support vector machine (SVM) classifier trained on 40 labeled OCT scans (10 healthy, 10 RE, 10 CSCR, and 10 ARMD). After training, the SVM forms an accurate decision about the type of retinal pathology using 9 extracted features. We have tested our proposed system on 2819 OCT scans (1437 healthy, 640 RE, and 742 CSCR) of 502 patients from two different datasets, and our proposed system correctly diagnosed 2817/2819 subjects with accuracy, sensitivity, and specificity ratings of 99.92%, 100%, and 99.86%, respectively.
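
    The accuracy, sensitivity and specificity figures above follow the usual confusion-matrix definitions. The snippet below illustrates them for a binary healthy-versus-pathology split; the counts are chosen to be consistent with the reported totals, but the exact error split is an assumption, not data quoted by the authors.

        def diagnostic_metrics(tp, tn, fp, fn):
            # Accuracy, sensitivity (recall of diseased cases) and specificity from a 2x2 confusion matrix.
            accuracy = (tp + tn) / (tp + tn + fp + fn)
            sensitivity = tp / (tp + fn)
            specificity = tn / (tn + fp)
            return accuracy, sensitivity, specificity

        # Assumed split: all 1382 pathological scans detected, 2 of the 1437 healthy scans misclassified.
        acc, sens, spec = diagnostic_metrics(tp=1382, fn=0, tn=1435, fp=2)
        print(f"accuracy {acc:.4f}, sensitivity {sens:.4f}, specificity {spec:.4f}")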

  19. Rapid detection of enterovirus in cerebrospinal fluid by a fully-automated PCR assay is associated with improved management of aseptic meningitis in adult patients.

    Science.gov (United States)

    Giulieri, Stefano G; Chapuis-Taillard, Caroline; Manuel, Oriol; Hugli, Olivier; Pinget, Christophe; Wasserfallen, Jean-Blaise; Sahli, Roland; Jaton, Katia; Marchetti, Oscar; Meylan, Pascal

    2015-01-01

    Enterovirus (EV) is the most frequent cause of aseptic meningitis (AM). Lack of microbiological documentation results in unnecessary antimicrobial therapy and hospitalization. To assess the impact of rapid EV detection in cerebrospinal fluid (CSF) by a fully-automated PCR (GeneXpert EV assay, GXEA) on the management of AM. Observational study in adult patients with AM. Three groups were analyzed according to EV documentation in CSF: group A = no PCR or negative PCR (n=17), group B = positive real-time PCR (n = 20), and group C = positive GXEA (n = 22). Clinical, laboratory and health-care costs data were compared. Clinical characteristics were similar in the 3 groups. Median turn-around time of EV PCR decreased from 60 h (IQR (interquartile range) 44-87) in group B to 5h (IQR 4-11) in group C (p<0.0001). Median duration of antibiotics was 1 (IQR 0-6), 1 (0-1.9), and 0.5 days (single dose) in groups A, B, and C, respectively (p < 0.001). Median length of hospitalization was 4 days (2.5-7.5), 2 (1-3.7), and 0.5 (0.3-0.7), respectively (p < 0.001). Median hospitalization costs were $5458 (2676-6274) in group A, $2796 (2062-5726) in group B, and $921 (765-1230) in group C (p < 0.0001). Rapid EV detection in CSF by a fully-automated PCR improves management of AM by significantly reducing antibiotic use, hospitalization length and costs. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Criteria for demand response systems

    NARCIS (Netherlands)

    Lampropoulos, I.; Kling, W.L.; Bosch, van den P.P.J.; Ribeiro, P.F.; Berg, van den J.

    2013-01-01

    The topic of demand side management is currently becoming more important than ever, in parallel with the further deregulation of the electricity sector, and the increasing integration of renewable energy sources. A historical review of automation integration in power system control assists in

  1. Using microwave Doppler radar in automated manufacturing applications

    Science.gov (United States)

    Smith, Gregory C.

    Since the beginning of the Industrial Revolution, manufacturers worldwide have used automation to improve productivity, gain market share, and meet growing or changing consumer demand for manufactured products. To stimulate further industrial productivity, manufacturers need more advanced automation technologies: "smart" part handling systems, automated assembly machines, CNC machine tools, and industrial robots that use new sensor technologies, advanced control systems, and intelligent decision-making algorithms to "see," "hear," "feel," and "think" at the levels needed to handle complex manufacturing tasks without human intervention. The investigator's dissertation offers three methods that could help make "smart" CNC machine tools and industrial robots possible: (1) A method for detecting acoustic emission using a microwave Doppler radar detector, (2) A method for detecting tool wear on a CNC lathe using a Doppler radar detector, and (3) An online non-contact method for detecting industrial robot position errors using a microwave Doppler radar motion detector. The dissertation studies indicate that microwave Doppler radar could be quite useful in automated manufacturing applications. In particular, the methods developed may help solve two difficult problems that hinder further progress in automating manufacturing processes: (1) Automating metal-cutting operations on CNC machine tools by providing a reliable non-contact method for detecting tool wear, and (2) Fully automating robotic manufacturing tasks by providing a reliable low-cost non-contact method for detecting on-line position errors. In addition, the studies offer a general non-contact method for detecting acoustic emission that may be useful in many other manufacturing and non-manufacturing areas, as well (e.g., monitoring and nondestructively testing structures, materials, manufacturing processes, and devices). By advancing the state of the art in manufacturing automation, the studies may help
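
    The methods above all rest on the microwave Doppler effect, in which motion toward or away from the detector shifts the returned frequency in proportion to the radial velocity. The snippet below evaluates the standard two-way Doppler-shift relation; the velocity and carrier frequency are illustrative values only and do not reproduce the dissertation's sensor parameters.

        SPEED_OF_LIGHT_M_S = 299_792_458.0

        def doppler_shift_hz(radial_velocity_m_s, carrier_frequency_hz):
            # Two-way Doppler shift f_d = 2 * v * f0 / c for a target moving at radial velocity v.
            return 2.0 * radial_velocity_m_s * carrier_frequency_hz / SPEED_OF_LIGHT_M_S

        # Example: a surface vibrating with a radial velocity of 0.1 m/s seen by a 10.525 GHz detector.
        print(round(doppler_shift_hz(0.1, 10.525e9), 2))  # ~7.02 Hz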

  2. An expert system for automated robotic grasping

    International Nuclear Information System (INIS)

    Stansfield, S.A.

    1990-01-01

    Many US Department of Energy sites and facilities will be environmentally remediated during the next several decades. A number of the restoration activities (e.g., decontamination and decommissioning of inactive nuclear facilities) can only be carried out by remote means and will be manipulation-intensive tasks. Experience has shown that manipulation tasks are especially slow and fatiguing for the human operator of a remote manipulator. In this paper, the authors present a rule-based expert system for automated, dextrous robotic grasping. This system interprets the features of an object to generate hand shaping and wrist orientation for a robot hand and arm. The system can be used in several different ways to lessen the demands on the human operator of a remote manipulation system - either as a fully autonomous grasping system or one that generates grasping options for a human operator and then automatically carries out the selected option

  3. Development of a fully automated open-column chemical-separation system—COLUMNSPIDER—and its application to Sr-Nd-Pb isotope analyses of igneous rock samples

    Science.gov (United States)

    Miyazaki, Takashi; Vaglarov, Bogdan Stefanov; Takei, Masakazu; Suzuki, Masahiro; Suzuki, Hiroaki; Ohsawa, Kouzou; Chang, Qing; Takahashi, Toshiro; Hirahara, Yuka; Hanyu, Takeshi; Kimura, Jun-Ichi; Tatsumi, Yoshiyuki

    A fully automated open-column resin-bed chemical-separation system, named COLUMNSPIDER, has been developed. The system consists of a programmable micropipetting robot that dispenses chemical reagents and sample solutions into an open-column resin bed for elemental separation. After the initial set up of resin columns, chemical reagents, and beakers for the separated chemical components, all separation procedures are automated. As many as ten samples can be eluted in parallel in a single automated run. Many separation procedures, such as radiogenic isotope ratio analyses for Sr and Nd, involve the use of multiple column separations with different resin columns, chemical reagents, and beakers of various volumes. COLUMNSPIDER completes these separations using multiple runs. Programmable functions, including the positioning of the micropipetter, reagent volume, and elution time, enable flexible operation. Optimized movements for solution take-up and high-efficiency column flushing allow the system to perform as precisely as when carried out manually by a skilled operator. Procedural blanks, examined for COLUMNSPIDER separations of Sr, Nd, and Pb, are low and negligible. The measured Sr, Nd, and Pb isotope ratios for JB-2 and Nd isotope ratios for JB-3 and BCR-2 rock standards all fall within the ranges reported previously in high-accuracy analyses. COLUMNSPIDER is a versatile tool for the efficient elemental separation of igneous rock samples, a process that is both labor intensive and time consuming.
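
    The record above describes programmable dispense positions, reagent volumes, and elution times executed over multiple resin columns in parallel. The sketch below is a hypothetical illustration of how such an open-column protocol could be encoded as data and replayed step by step; the step names, volumes, times, and the run_protocol helper are invented for illustration and are not the actual COLUMNSPIDER software.

```python
# Hypothetical sketch (not the actual COLUMNSPIDER software): encoding an
# open-column separation protocol as data and replaying it step by step.
from dataclasses import dataclass

@dataclass
class Step:
    reagent: str        # chemical reagent or sample solution to dispense
    volume_ml: float    # programmed dispense volume
    elution_min: float  # wait time for the eluate to pass through the resin bed

# Illustrative Sr-separation protocol; volumes and times are placeholders,
# not the published procedure.
SR_PROTOCOL = [
    Step("3 M HNO3 (conditioning)", 1.0, 10.0),
    Step("sample solution",          0.5, 15.0),
    Step("3 M HNO3 (matrix rinse)",  2.0, 20.0),
    Step("0.05 M HNO3 (Sr elution)", 1.5, 20.0),
]

def run_protocol(protocol, columns):
    """Dispense each step into every column in turn, then wait once per step,
    mimicking the parallel elution of up to ten samples in a single run."""
    for step in protocol:
        for col in columns:
            print(f"column {col}: dispense {step.volume_ml} mL of {step.reagent}")
        print(f"wait {step.elution_min} min for elution")

run_protocol(SR_PROTOCOL, columns=range(1, 11))
```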

  4. Emerging technologies for demand side management. Demand side management jitsugen no tame no saishin gijutsu

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, H; Iyoda, I [Mitsubishi Electric Corp., Tokyo (Japan)

    1993-11-05

    This paper surveys the latest hardware technologies for realizing demand side management, grouped into the following areas: communications technology, measurement technology, customer information system technology, load control technology, home automation technology, and energy storage and conservation technologies. Regarding communications technology, information exchange between the supply side and the demand side is central to demand side management, and technologies for distribution automation and automatic meter reading are under active development. Transmission media range from power lines and telephone lines to optical cables and wireless links. Power line communications, which use the power lines themselves as the communication channel, are simple and economical but vulnerable to noise and unsuited to long-distance communication. Wireless communications have been attracting attention along with the development of mobile communication devices. In the initial stage of deployment these technologies will mainly benefit electric power companies, for example in load surveys and general distribution automation; the benefits are expected to shift toward customers around 2010, covering everything from security information such as outage notices to public information and customer education. 8 refs., 8 figs., 1 tab.

  5. Information management - Assessing the demand for information

    Science.gov (United States)

    Rogers, William H.

    1991-01-01

    Information demand is defined in terms of both information content (what information) and form (when, how, and where it is needed). Providing the information richness required for flight crews to be informed without overwhelming their information processing capabilities will require a great deal of automated intelligence. It is seen that the essence of this intelligence is comprehending and capturing the demand for information.

  6. Fully automated laboratory for the assay of plutonium in wastes and recoverable scraps

    International Nuclear Information System (INIS)

    Guiberteau, P.; Michaut, F.; Bergey, C.; Debruyne, T.

    1990-01-01

    To determine the plutonium content of wastes and recoverable scraps in intermediate-size containers (ten liters), an automated laboratory has been built. Two passive measurement methods are used. Gamma-ray spectrometry provides plutonium isotopic analysis, americium determination, and plutonium assay in wastes and poor scraps. Calorimetry is used for accurate (± 3%) plutonium determination in rich scraps. Full automation was achieved with barcode management and a supply robot that feeds the eight assay set-ups. The laboratory operates 24 hours per day, 365 days per year, and has a capacity of 8,000 assays per year.

  7. Human-automation collaboration in manufacturing: identifying key implementation factors

    OpenAIRE

    Charalambous, George; Fletcher, Sarah; Webb, Philip

    2013-01-01

    Human-automation collaboration refers to the concept of human operators and intelligent automation working together interactively within the same workspace without conventional physical separation. This concept has commanded significant attention in manufacturing because of the potential applications, such as the installation of large sub-assemblies. However, the key human factors relevant to human-automation collaboration have not yet been fully investigated. To maximise effective implement...

  8. Advances in Automated QA/QC for TRISO Fuel Particle Production

    International Nuclear Information System (INIS)

    Hockey, Ronald L.; Bond, Leonard J.; Batishko, Charles R.; Gray, Joseph N.; Saurwein, John J.; Lowden, Richard A.

    2004-01-01

    Fuel in most Generation IV reactor designs typically encompasses billions of TRISO particles. Present-day QA/QC methods, performed manually and in many cases destructively, cannot economically test a statistically significant fraction of the individual fuel particles required. Fully automated inspection technologies are essential to economical TRISO fuel particle production. A combination of in-line nondestructive evaluation (NDE) measurements employing electromagnetic induction and digital optical imaging analysis is currently under investigation, and preliminary data indicate the potential for meeting the demands of this application. To calibrate high-speed NDE methods, surrogate fuel particle samples are being coated with layers containing a wide array of defect types found to degrade fuel performance, and these are being characterized via high-resolution CT and digital radiographic images.

  9. Analysis of the Effects of Connected–Automated Vehicle Technologies on Travel Demand

    Energy Technology Data Exchange (ETDEWEB)

    Auld, Joshua [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, IL 60439; Sokolov, Vadim [Department of Systems Engineering and Operations Research, Volgenau School of Engineering, George Mason University, MS 4A6, 4400 University Drive, Fairfax, VA 22030; Stephens, Thomas S. [Argonne National Laboratory, 9700 South Cass Avenue, Lemont, IL 60439

    2017-01-01

    Connected–automated vehicle (CAV) technologies are likely to have significant effects not only on how vehicles operate in the transportation system, but also on how individuals behave and use their vehicles. While many CAV technologies—such as connected adaptive cruise control and ecosignals—have the potential to increase network throughput and efficiency, many of these same technologies have a secondary effect of reducing driver burden, which can drive changes in travel behavior. Such changes in travel behavior—in effect, lowering the cost of driving—have the potential to increase greatly the utilization of the transportation system with concurrent negative externalities, such as congestion, energy use, and emissions, working against the positive effects on the transportation system resulting from increased capacity. To date, few studies have analyzed the potential effects on CAV technologies from a systems perspective; studies often focus on gains and losses to an individual vehicle, at a single intersection, or along a corridor. However, travel demand and traffic flow constitute a complex, adaptive, nonlinear system. Therefore, in this study, an advanced transportation systems simulation model—POLARIS—was used. POLARIS includes cosimulation of travel behavior and traffic flow to study the potential effects of several CAV technologies at the regional level. Various technology penetration levels and changes in travel time sensitivity have been analyzed to determine a potential range of effects on vehicle miles traveled from various CAV technologies.
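
    The abstract frames reduced driver burden as an effective drop in the cost of driving, studied by sweeping technology penetration and travel-time sensitivity. The toy sketch below illustrates that kind of parameter sweep with a simple constant-elasticity model; the elasticity value and cost-reduction figures are assumptions for illustration, not POLARIS inputs or outputs.

```python
# Toy parameter sweep, not POLARIS: illustrate how VMT might respond when CAV
# penetration reduces the perceived cost (value of travel time) of driving.
BASE_VMT = 100.0          # index, arbitrary units
ELASTICITY = -0.5         # assumed elasticity of VMT w.r.t. perceived travel cost

def vmt_change(penetration, time_cost_reduction):
    """Fractional VMT change for a given CAV penetration (0-1) and the
    fractional reduction in perceived travel-time cost for CAV users."""
    avg_cost_change = -penetration * time_cost_reduction   # fleet-average cost change
    return ELASTICITY * avg_cost_change                    # e.g. -0.5 * -0.2 = +0.10

for p in (0.1, 0.5, 0.9):
    for r in (0.25, 0.5, 0.75):
        dv = vmt_change(p, r)
        print(f"penetration={p:.0%}, time-cost reduction={r:.0%}: "
              f"VMT {BASE_VMT * (1 + dv):.1f} ({dv:+.0%})")
```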

  10. The development of a fully automated radioimmunoassay instrument - micromedic systems concept 4

    International Nuclear Information System (INIS)

    Painter, K.

    1977-01-01

    The fully automatic RIA system Concept 4 by Micromedic is described in detail. The system uses antibody-coated test tubes to take up the samples. It has a maximum capacity of 200 tubes, including standards and control tubes. Its particular advantages are high sample throughput, reproducibility, and fully automatic testing, i.e. low personnel requirements. Its main disadvantage is difficulty with protein assays. (ORU) [de

  11. Experience of automation failures in training: effects on trust, automation bias, complacency and performance.

    Science.gov (United States)

    Sauer, Juergen; Chavaillaz, Alain; Wastell, David

    2016-06-01

    This work examined the effects of operators' exposure to various types of automation failures in training. Forty-five participants were trained for 3.5 h on a simulated process control environment. During training, participants either experienced a fully reliable, automatic fault repair facility (i.e. faults detected and correctly diagnosed), a misdiagnosis-prone one (i.e. faults detected but not correctly diagnosed) or a miss-prone one (i.e. faults not detected). One week after training, participants were tested for 3 h, experiencing two types of automation failures (misdiagnosis, miss). The results showed that automation bias was very high when operators trained on miss-prone automation encountered a failure of the diagnostic system. Operator errors resulting from automation bias were much higher when automation misdiagnosed a fault than when it missed one. Differences in trust levels that were instilled by the different training experiences disappeared during the testing session. Practitioner Summary: The experience of automation failures during training has some consequences. A greater potential for operator errors may be expected when an automatic system failed to diagnose a fault than when it failed to detect one.

  12. Developing a development process for automation - based on the concept of Lean Automation. Framtagning av en utvecklingsprocess för automation - baserat på konceptet Lean Automation

    OpenAIRE

    Carnbo, Linda

    2012-01-01

    Due to today's globalization, competition in the market has increased, which requires flexibility and production according to customer demand. In order to reduce wage costs, industrial companies are now considering moving manufacturing to low-cost countries. To keep up with the competition in the market without moving the manufacturing abroad, Lean Automation was developed. The concept of Lean Automation is to reduce the perceived complexity of automation and make automati...

  13. The MMP inhibitor (R)-2-(N-benzyl-4-(2-[18F]fluoroethoxy)phenylsulphonamido) -N-hydroxy-3-methylbutanamide: Improved precursor synthesis and fully automated radiosynthesis

    International Nuclear Information System (INIS)

    Wagner, Stefan; Faust, Andreas; Breyholz, Hans-Joerg; Schober, Otmar; Schaefers, Michael; Kopka, Klaus

    2011-01-01

    Summary: The CGS 25966 derivative (R)-2-(N-benzyl-4-(2-[18F]fluoroethoxy)phenylsulphonamido)-N-hydroxy-3-methylbutanamide, [18F]9, represents a very potent radiolabelled matrix metalloproteinase inhibitor. For first human PET studies it is mandatory to have a fully automated radiosynthesis and a straightforward precursor synthesis available. The realisation of both requirements is reported herein. In particular, the corresponding precursor 8 was obtained in a reliable 7-step synthesis with an overall chemical yield of 2.3%. Furthermore, the target compound [18F]9 was prepared with a radiochemical yield of 14.8±3.9% (not corrected for decay).

  14. Automated surveillance of healthcare-associated infections : state of the art

    NARCIS (Netherlands)

    Sips, Meander E; Bonten, Marc J M; van Mourik, Maaike S M

    PURPOSE OF REVIEW: This review describes recent advances in the field of automated surveillance of healthcare-associated infections (HAIs), with a focus on data sources and the development of semiautomated or fully automated algorithms. RECENT FINDINGS: The availability of high-quality data in

  15. AuTom: a novel automatic platform for electron tomography reconstruction

    KAUST Repository

    Han, Renmin

    2017-07-26

    We have developed a software package for automatic electron tomography (ET): Automatic Tomography (AuTom). The package has the following characteristics: accurate alignment modules for marker-free datasets containing substantial biological structures; fully automatic alignment modules for datasets with fiducial markers; wide coverage of reconstruction methods, including a new iterative method based on compressed-sensing theory that suppresses the “missing wedge” effect; and multi-platform acceleration solutions that support faster iterative algebraic reconstruction. AuTom aims to achieve fully automatic alignment and reconstruction for electron tomography and has already been successful on a variety of datasets. AuTom also offers a user-friendly interface and auxiliary designs for file and workflow management, in which fiducial marker-based datasets and marker-free datasets are handled by entirely different subprocesses. With all of these features, AuTom can serve as a convenient and effective tool for electron tomography processing.
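
    Among AuTom's reconstruction methods are iterative algebraic techniques. The following minimal sketch shows a generic SIRT-style iteration on a toy projection system, purely to illustrate this class of algorithm; it is not AuTom's implementation, and the tiny system matrix is invented.

```python
# Minimal SIRT-style iterative algebraic reconstruction on a toy system;
# this illustrates the general technique, not AuTom's algorithm.
import numpy as np

def sirt(A, b, n_iter=200):
    """Solve A x ≈ b with simultaneous iterative reconstruction:
    x <- x + C A^T R (b - A x), where R and C hold inverse row/column sums."""
    row_sums = A.sum(axis=1)
    col_sums = A.sum(axis=0)
    R = np.where(row_sums > 0, 1.0 / row_sums, 0.0)
    C = np.where(col_sums > 0, 1.0 / col_sums, 0.0)
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x += C * (A.T @ (R * (b - A @ x)))
    return x

# Tiny 2x2 "image" observed through four projection rays (rows + columns).
x_true = np.array([1.0, 2.0, 3.0, 4.0])
A = np.array([[1, 1, 0, 0],    # row 1
              [0, 0, 1, 1],    # row 2
              [1, 0, 1, 0],    # column 1
              [0, 1, 0, 1.]])  # column 2
print(sirt(A, A @ x_true))     # approaches a solution consistent with x_true
```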

  16. Integrating Test-Form Formatting into Automated Test Assembly

    Science.gov (United States)

    Diao, Qi; van der Linden, Wim J.

    2013-01-01

    Automated test assembly uses the methodology of mixed integer programming to select an optimal set of items from an item bank. Automated test-form generation uses the same methodology to optimally order the items and format the test form. From an optimization point of view, production of fully formatted test forms directly from the item pool using…
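
    The abstract describes item selection as a mixed integer programming problem. The toy sketch below encodes the same kind of 0/1 selection model (maximize information subject to a test length and a content constraint) and solves it by brute-force enumeration for a made-up six-item bank; a real assembly engine would hand the model to an MIP solver.

```python
# Toy version of the automated-test-assembly selection model: choose items to
# maximize total information subject to a test length and a content constraint.
# Solved here by brute force for a tiny bank; real systems use an MIP solver.
from itertools import combinations

# (item id, information at the target ability level, content area) - made-up bank
bank = [(1, 0.8, "algebra"), (2, 0.6, "algebra"), (3, 0.9, "geometry"),
        (4, 0.5, "geometry"), (5, 0.7, "algebra"), (6, 0.4, "geometry")]

TEST_LENGTH = 3
MIN_GEOMETRY = 1   # at least one geometry item

best = None
for subset in combinations(bank, TEST_LENGTH):
    if sum(1 for _, _, area in subset if area == "geometry") < MIN_GEOMETRY:
        continue
    info = sum(i for _, i, _ in subset)
    if best is None or info > best[0]:
        best = (info, subset)

print("selected items:", [item_id for item_id, _, _ in best[1]],
      "total information:", round(best[0], 2))
```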

  17. Performance of an Additional Task During Level 2 Automated Driving: An On-Road Study Comparing Drivers With and Without Experience With Partial Automation.

    Science.gov (United States)

    Solís-Marcos, Ignacio; Ahlström, Christer; Kircher, Katja

    2018-05-01

    To investigate the influence of prior experience with Level 2 automation on additional task performance during manual and Level 2 partially automated driving. Level 2 automation is now on the market, but its effects on driver behavior remain unclear. Based on previous studies, we could expect an increase in drivers' engagement in secondary tasks during Level 2 automated driving, but it is yet unknown how drivers will integrate all the ongoing demands in such situations. Twenty-one drivers (12 without, 9 with Level 2 automation experience) drove on a highway manually and with Level 2 automation (exemplified by Volvo Pilot Assist generation 2; PA2) while performing an additional task. In half of the conditions, the task could be interrupted (self-paced), and in the other half, it could not (system-paced). Drivers' visual attention, additional task performance, and other compensatory strategies were analyzed. Driving with PA2 led to decreased scores in the additional task and more visual attention to the dashboard. In the self-paced condition, all drivers looked more to the task and perceived a lower mental demand. The drivers experienced with PA2 used the system and the task more than the novice group and performed more overtakings. The additional task interfered more with Level 2 automation than with manual driving. The drivers, particularly the automation novice drivers, used some compensatory strategies. Automation designers need to consider these potential effects in the development of future automated systems.

  18. Medical ADP Systems: Automated Medical Records Hold Promise to Improve Patient Care

    Science.gov (United States)

    1991-01-01

    automated medical records. The report discusses the potential benefits that automation could make to the quality of patient care and the factors that impede...information systems, but no organization has fully automated one of the most critical types of information, patient medical records. The patient medical record...its review of automated medical records. GAO’s objectives in this study were to identify the (1) benefits of automating patient records and (2) factors

  19. Twelve automated thresholding methods for segmentation of PET images: a phantom study

    International Nuclear Information System (INIS)

    Prieto, Elena; Peñuelas, Iván; Martí-Climent, Josep M; Lecumberri, Pablo; Gómez, Marisol; Pagola, Miguel; Bilbao, Izaskun; Ecay, Margarita

    2012-01-01

    Tumor volume delineation over positron emission tomography (PET) images is of great interest for proper diagnosis and therapy planning. However, standard segmentation techniques (manual or semi-automated) are operator dependent and time consuming, while fully automated procedures are cumbersome or require complex mathematical development. The aim of this study was to segment PET images in a fully automated way by implementing a set of 12 automated thresholding algorithms that are classical in fields such as optical character recognition, tissue engineering, and non-destructive testing of high-tech structures. Automated thresholding algorithms select a specific threshold for each image without any a priori spatial information about the segmented object or any special calibration of the tomograph, as opposed to the usual thresholding methods for PET. Spherical 18F-filled objects of different volumes were acquired on a clinical PET/CT and on a small-animal PET scanner, with three different signal-to-background ratios. Images were segmented with the 12 automatic thresholding algorithms, and results were compared with the standard segmentation reference, a threshold at 42% of the maximum uptake. The Ridler and Ramesh thresholding algorithms, based on clustering and histogram-shape information, respectively, provided better results than the classical 42%-based threshold (p < 0.05). We have herein demonstrated that fully automated thresholding algorithms can provide better results than classical PET segmentation tools. (paper)
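
    One of the evaluated algorithms, attributed to Ridler, selects a threshold by iterative clustering. The sketch below implements the classic Ridler-Calvard (ISODATA-style) iteration on synthetic intensities as an example of this family of automated thresholding methods; it is not the authors' exact implementation, and the synthetic data are invented.

```python
# Ridler-Calvard style iterative thresholding (ISODATA): one example of the
# kind of automated threshold-selection method evaluated in the paper.
import numpy as np

def ridler_threshold(image, tol=1e-3):
    """Iterate t <- mean(mean below t, mean above t) until convergence."""
    t = image.mean()
    while True:
        below, above = image[image <= t], image[image > t]
        new_t = 0.5 * (below.mean() + above.mean())
        if abs(new_t - t) < tol:
            return new_t
        t = new_t

# Synthetic "hot sphere in background" intensities.
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(1.0, 0.2, 5000),   # background voxels
                      rng.normal(6.0, 0.8, 500)])   # 18F-filled object voxels
t = ridler_threshold(img)
print(f"selected threshold: {t:.2f}")
print(f"segmented voxels: {(img > t).sum()}")
```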

  20. Robotic and nuclear safety for an automated/teleoperated glove box system

    International Nuclear Information System (INIS)

    Domning, E.E.; McMahon, T.T.; Sievers, R.H.

    1991-09-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system to handle the processing of special nuclear materials (SNM). This work is performed in response to the new goals at the Department of Energy (DOE) for hazardous waste minimization and radiation dose reduction. This fully automated system, called the automated test bed (ATB), consists of an IBM gantry robot and automated processing equipment sealed within a glove box. While the ATB is a cold system, we are designing it as a prototype of the future hot system. We recognized that identification and application of safety requirements early in the design phase will lead to timely installation and approval of the hot system. This paper identifies these safety issues as well as the general safety requirements necessary for the safe operation of the ATB. 4 refs., 2 figs

  1. Automated quantification of proliferation with automated hot-spot selection in phosphohistone H3/MART1 dual-stained stage I/II melanoma.

    Science.gov (United States)

    Nielsen, Patricia Switten; Riber-Hansen, Rikke; Schmidt, Henrik; Steiniche, Torben

    2016-04-09

    Staging of melanoma includes quantification of a proliferation index, i.e., presumed melanocytic mitoses on H&E stains are counted manually in hot spots. Yet its reproducibility and prognostic impact increase with immunohistochemical dual staining for phosphohistone H3 (PHH3) and MART1, which may also enable fully automated quantification by image analysis. To ensure manageable workloads and repeatable measurements in modern pathology, the study aimed to present an automated quantification of proliferation with automated hot-spot selection in PHH3/MART1-stained melanomas. Formalin-fixed, paraffin-embedded tissue from 153 consecutive stage I/II melanoma patients was immunohistochemically dual-stained for PHH3 and MART1. Whole-slide images were captured, and the number of PHH3/MART1-positive cells was manually and automatically counted in the global tumor area and in a manually and automatically selected hot spot, i.e., a fixed 1-mm² square. Bland-Altman plots and hypothesis tests compared manual and automated procedures, and the Cox proportional hazards model established their prognostic impact. The mean difference between manual and automated global counts was 2.9 cells/mm² (P = 0.0071) and 0.23 cells per hot spot (P = 0.96) for automated counts in manually and automatically selected hot spots. In 77% of cases, manual and automated hot spots overlapped. Fully manual hot-spot counts yielded the highest prognostic performance, with an adjusted hazard ratio of 5.5 (95% CI, 1.3-24, P = 0.024) as opposed to 1.3 (95% CI, 0.61-2.9, P = 0.47) for automated counts with automated hot spots. The automated index and automated hot-spot selection were highly correlated with their manual counterparts, but altogether their prognostic impact was noticeably reduced. Because correct recognition of only one PHH3/MART1-positive cell seems important, extremely high sensitivity and specificity of the algorithm are required for prognostic purposes. Thus, automated
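
    Automated hot-spot selection here means finding the fixed 1-mm² square containing the most PHH3/MART1-positive cells. The sketch below illustrates that idea as a sliding-window maximum over a toy grid of positive-cell counts; the tile size, window size, and counts are invented and do not reproduce the published image-analysis pipeline.

```python
# Minimal sketch of automated hot-spot selection: slide a fixed-size square
# window over a grid of positive-cell counts and keep the densest position.
# Grid, window size, and counts are illustrative, not the published pipeline.
import numpy as np

def select_hot_spot(counts, window):
    """Return (row, col) of the top-left corner of the window (in grid tiles)
    with the largest total count, plus that total."""
    best = (-1, (0, 0))
    rows, cols = counts.shape
    for r in range(rows - window + 1):
        for c in range(cols - window + 1):
            total = counts[r:r + window, c:c + window].sum()
            if total > best[0]:
                best = (total, (r, c))
    return best[1], best[0]

rng = np.random.default_rng(1)
cell_counts = rng.poisson(0.5, size=(40, 40))   # positive cells per 0.25 x 0.25 mm tile
cell_counts[10:14, 22:26] += 3                  # an artificial proliferative focus
pos, total = select_hot_spot(cell_counts, window=4)   # 4x4 tiles = 1 mm x 1 mm
print(f"hot spot at tile {pos} with {total} positive cells")
```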

  2. Fully Automated Simultaneous Integrated Boosted-Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    Energy Technology Data Exchange (ETDEWEB)

    Wu Binbin, E-mail: binbin.wu@gunet.georgetown.edu [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); McNutt, Todd [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States); Zahurak, Marianna [Department of Oncology Biostatistics, Johns Hopkins University, Baltimore, Maryland (United States); Simari, Patricio [Autodesk Research, Toronto, ON (Canada); Pang, Dalong [Department of Radiation Medicine, Georgetown University Hospital, Washington, DC (United States); Taylor, Russell [Department of Computer Science, Johns Hopkins University, Baltimore, Maryland (United States); Sanguineti, Giuseppe [Department of Radiation Oncology and Molecular Radiation Science, Johns Hopkins University, Baltimore, Maryland (United States)

    2012-12-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)-driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  3. Fully Automated Simultaneous Integrated Boosted–Intensity Modulated Radiation Therapy Treatment Planning Is Feasible for Head-and-Neck Cancer: A Prospective Clinical Study

    International Nuclear Information System (INIS)

    Wu Binbin; McNutt, Todd; Zahurak, Marianna; Simari, Patricio; Pang, Dalong; Taylor, Russell; Sanguineti, Giuseppe

    2012-01-01

    Purpose: To prospectively determine whether overlap volume histogram (OVH)–driven, automated simultaneous integrated boosted (SIB)-intensity-modulated radiation therapy (IMRT) treatment planning for head-and-neck cancer can be implemented in clinics. Methods and Materials: A prospective study was designed to compare fully automated plans (APs) created by an OVH-driven, automated planning application with clinical plans (CPs) created by dosimetrists in a 3-dose-level (70 Gy, 63 Gy, and 58.1 Gy), head-and-neck SIB-IMRT planning. Because primary organ sparing (cord, brain, brainstem, mandible, and optic nerve/chiasm) always received the highest priority in clinical planning, the study aimed to show the noninferiority of APs with respect to PTV coverage and secondary organ sparing (parotid, brachial plexus, esophagus, larynx, inner ear, and oral mucosa). The sample size was determined a priori by a superiority hypothesis test that had 85% power to detect a 4% dose decrease in secondary organ sparing with a 2-sided alpha level of 0.05. A generalized estimating equation (GEE) regression model was used for statistical comparison. Results: Forty consecutive patients were accrued from July to December 2010. GEE analysis indicated that in APs, overall average dose to the secondary organs was reduced by 1.16 (95% CI = 0.09-2.33) with P=.04, overall average PTV coverage was increased by 0.26% (95% CI = 0.06-0.47) with P=.02 and overall average dose to the primary organs was reduced by 1.14 Gy (95% CI = 0.45-1.8) with P=.004. A physician determined that all APs could be delivered to patients, and APs were clinically superior in 27 of 40 cases. Conclusions: The application can be implemented in clinics as a fast, reliable, and consistent way of generating plans that need only minor adjustments to meet specific clinical needs.

  4. Comparison of manual & automated analysis methods for corneal endothelial cell density measurements by specular microscopy.

    Science.gov (United States)

    Huang, Jianyan; Maram, Jyotsna; Tepelus, Tudor C; Modak, Cristina; Marion, Ken; Sadda, SriniVas R; Chopra, Vikas; Lee, Olivia L

    2017-08-07

    To determine the reliability of corneal endothelial cell density (ECD) obtained by automated specular microscopy versus that of validated manual methods, and the factors that predict such reliability. Sharp central images from 94 control and 106 glaucomatous eyes were captured with a Konan NSP-9900 specular microscope. All images were analyzed by trained graders using Konan CellChek software, employing the fully and semi-automated methods as well as the Center Method. Images with low cell counts (input cell number <100) and/or guttata were analyzed with the Center and Flex-Center Methods. ECDs were compared, and absolute error was used to assess variation. The effect on ECD of age, cell count, cell size, and cell size variation was evaluated. No significant difference was observed between the Center and Flex-Center Methods in corneas with guttata (p=0.48) or low ECD (p=0.11). No difference (p=0.32) was observed in the ECD of normal controls <40 years old between the fully automated method and the manual Center Method. However, in older controls and glaucomatous eyes, ECD was overestimated by the fully automated method (p=0.034) and the semi-automated method (p=0.025) compared with the manual method. Our findings show that automated analysis significantly overestimates ECD in eyes with high polymegathism and/or large cell size compared with the manual method. Therefore, we discourage reliance upon the fully automated method alone to perform specular microscopy analysis, particularly if an accurate ECD value is imperative.

  5. Opportunities, Barriers and Actions for Industrial Demand Response in California

    Energy Technology Data Exchange (ETDEWEB)

    McKane, Aimee T.; Piette, Mary Ann; Faulkner, David; Ghatikar, Girish; Radspieler Jr., Anthony; Adesola, Bunmi; Murtishaw, Scott; Kiliccote, Sila

    2008-01-31

    In 2006 the Demand Response Research Center (DRRC) formed an Industrial Demand Response Team to investigate opportunities and barriers to implementation of Automated Demand Response (Auto-DR) systems in California industries. Auto-DR is an open, interoperable communications and technology platform designed to: Provide customers with automated, electronic price and reliability signals; Provide customers with capability to automate customized DR strategies; Automate DR, providing utilities with dispatchable operational capability similar to conventional generation resources. This research began with a review of previous Auto-DR research on the commercial sector. Implementing Auto-DR in industry presents a number of challenges, both practical and perceived. Some of these include: the variation in loads and processes across and within sectors, resource-dependent loading patterns that are driven by outside factors such as customer orders or time-critical processing (e.g. tomato canning), the perceived lack of control inherent in the term 'Auto-DR', and aversion to risk, especially unscheduled downtime. While industry has demonstrated a willingness to temporarily provide large sheds and shifts to maintain grid reliability and be a good corporate citizen, the drivers for widespread Auto-DR will likely differ. Ultimately, most industrial facilities will balance the real and perceived risks associated with Auto-DR against the potential for economic gain through favorable pricing or incentives. Auto-DR, as with any ongoing industrial activity, will need to function effectively within market structures. The goal of the industrial research is to facilitate deployment of industrial Auto-DR that is economically attractive and technologically feasible. Automation will make DR: More visible by providing greater transparency through two-way end-to-end communication of DR signals from end-use customers; More repeatable, reliable, and persistent because the automated

  6. Pyrochemical processing automation at Lawrence Livermore National Laboratory

    International Nuclear Information System (INIS)

    Dennison, D.K.; Domning, E.E.; Seivers, R.

    1991-01-01

    Lawrence Livermore National Laboratory (LLNL) is developing a fully automated system for pyrochemical processing of special nuclear materials (SNM). The system utilizes a glove box, an automated tilt-pour furnace (TPF), an IBM-developed gantry robot, and specialized automation tooling. All material handling within the glove box (i.e., furnace loading, furnace unloading, product and slag separation, and product packaging) is performed automatically. The objectives of the effort are to increase process productivity, decrease operator radiation exposure, reduce process wastes, and demonstrate system reliability and availability. This paper provides an overview of the automated system hardware, outlines the overall operations sequence, and discusses the current status

  7. 1st workshop on situational awareness in semi-Automated vehicles

    NARCIS (Netherlands)

    McCall, R.; Baumann, M.; Politis, I.; Borojeni, S.S.; Alvarez, I.; Mirnig, A.; Meschtscherjakov, A.; Tscheligi, M.; Chuang, L.; Terken, J.M.B.

    2016-01-01

    This workshop will focus on the problem of occupant and vehicle situational awareness with respect to automated vehicles when the driver must take over control. It will explore the future of fully automated and mixed traffic situations where vehicles are assumed to be operating at level 3 or above.

  8. Automation synthesis modules review

    International Nuclear Information System (INIS)

    Boschi, S.; Lodi, F.; Malizia, C.; Cicoria, G.; Marengo, M.

    2013-01-01

    The introduction of 68Ga-labelled tracers has changed the diagnostic approach to neuroendocrine tumours, and the availability of a reliable, long-lived 68Ge/68Ga generator has been at the basis of the development of 68Ga radiopharmacy. The large increase in clinical demand, the impact of regulatory issues, and careful radioprotection of the operators have pushed for extensive automation of the production process. The development of automated systems for 68Ga radiochemistry, different engineering and software strategies, and post-processing of the eluate are discussed, along with the impact of regulations on automation. - Highlights: ► Generator availability and robust chemistry have driven the wide diffusion of 68Ga radiopharmaceuticals. ► Different technological approaches for 68Ga radiopharmaceuticals are discussed. ► Post-processing of the generator eluate and the evolution to cassette-based systems have been the major issues in automation. ► The impact of regulations on the technological development is also considered

  9. Generic Automated Multi-function Finger Design

    Science.gov (United States)

    Honarpardaz, M.; Tarkian, M.; Sirkett, D.; Ölvander, J.; Feng, X.; Elf, J.; Sjögren, R.

    2016-11-01

    Multi-function fingers that are able to handle multiple workpieces are crucial to improving a robot workcell. Design automation of multi-function fingers is in high demand from robot industries to overcome the current iterative, time-consuming, and complex manual design process. However, the existing approaches to multi-function finger design automation are unable to entirely meet this need. This paper proposes a generic approach for design automation of multi-function fingers. The proposed approach completely automates the design process and requires no expert skill. In addition, this approach executes the design process much faster than the current manual process. To validate the approach, multi-function fingers were successfully designed for two case studies. Further, the results are discussed and benchmarked against existing approaches.

  10. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content.

    Directory of Open Access Journals (Sweden)

    Hongzhi Wang

    One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods based on the color of the endosperm and embryo seeds are slow, manual, and prone to error. On the other hand, there exists a significant difference between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify the haploid. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system comprises a sampler unit that selects a single kernel and feeds it for NMR and weight measurement, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed.

  11. Fully-Automated High-Throughput NMR System for Screening of Haploid Kernels of Maize (Corn) by Measurement of Oil Content

    Science.gov (United States)

    Xu, Xiaoping; Huang, Qingming; Chen, Shanshan; Yang, Peiqiang; Chen, Shaojiang; Song, Yiqiao

    2016-01-01

    One of the modern crop breeding techniques uses doubled haploid plants that contain an identical pair of chromosomes in order to accelerate the breeding process. A rapid haploid identification method is critical for large-scale selection of doubled haploids. The conventional methods based on the color of the endosperm and embryo seeds are slow, manual, and prone to error. On the other hand, there exists a significant difference between diploid and haploid seeds generated by a high-oil inducer, which makes it possible to use oil content to identify the haploid. This paper describes a fully-automated high-throughput NMR screening system for maize haploid kernel identification. The system comprises a sampler unit that selects a single kernel and feeds it for NMR and weight measurement, and a kernel sorter that distributes the kernel according to the measurement result. Tests of the system show a consistent accuracy of 94% with an average screening time of 4 seconds per kernel. Field test results are described and directions for future improvement are discussed. PMID:27454427
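
    The screening decision in the two records above ultimately reduces to classifying each kernel from its NMR-measured oil content, with weight available from the same station. The sketch below shows a minimal version of that sort/decision step; the oil threshold, weight cut-off, and field names are assumptions for illustration, not the published calibration.

```python
# Illustrative decision step for haploid screening by oil content. The
# threshold and weight check are placeholders, not the published calibration.
from dataclasses import dataclass

@dataclass
class KernelMeasurement:
    kernel_id: int
    oil_percent: float   # oil content from the NMR measurement
    weight_mg: float     # kernel weight from the balance

OIL_THRESHOLD = 5.0      # assumed cut-off: inducer-derived diploids carry more oil
MIN_WEIGHT_MG = 100.0    # assumed cut-off for rejecting debris / broken kernels

def sort_bin(m: KernelMeasurement) -> str:
    """Return the bin the kernel sorter should send this kernel to."""
    if m.weight_mg < MIN_WEIGHT_MG:
        return "reject"
    return "haploid" if m.oil_percent < OIL_THRESHOLD else "diploid"

for m in [KernelMeasurement(1, 3.2, 250.0),
          KernelMeasurement(2, 7.8, 265.0),
          KernelMeasurement(3, 4.1, 60.0)]:
    print(m.kernel_id, sort_bin(m))
```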

  12. Fully automated dual-frequency three-pulse-echo 2DIR spectrometer accessing spectral range from 800 to 4000 wavenumbers

    Energy Technology Data Exchange (ETDEWEB)

    Leger, Joel D.; Nyby, Clara M.; Varner, Clyde; Tang, Jianan; Rubtsova, Natalia I.; Yue, Yuankai; Kireev, Victor V.; Burtsev, Viacheslav D.; Qasim, Layla N.; Rubtsov, Igor V., E-mail: irubtsov@tulane.edu [Department of Chemistry, Tulane University, New Orleans, Louisiana 70118 (United States); Rubtsov, Grigory I. [Institute for Nuclear Research of the Russian Academy of Sciences, Moscow 117312 (Russian Federation)

    2014-08-15

    A novel dual-frequency two-dimensional infrared instrument is designed and built that permits three-pulse heterodyned echo measurements of any cross-peak within a spectral range from 800 to 4000 cm⁻¹ to be performed in a fully automated fashion. The superior sensitivity of the instrument is achieved by a combination of spectral interferometry, phase cycling, and closed-loop phase stabilization accurate to ∼70 as. An anharmonicity smaller than 10⁻⁴ cm⁻¹ was recorded for strong carbonyl stretching modes using 800 laser-shot accumulations. The novel design of the phase-stabilization scheme permits tuning the polarizations of the mid-infrared (m-IR) pulses, thus supporting measurements of the angles between vibrational transition dipoles. Automatic frequency tuning is achieved by implementing beam-direction stabilization schemes for each m-IR beam, providing better than 50 μrad beam stability, and a novel scheme for setting the phase-matching geometry of the m-IR beams at the sample. The errors in the cross-peak amplitudes associated with imperfect phase-matching conditions and alignment are found to be at the level of 20%. The instrument can be used by non-specialists in ultrafast spectroscopy.

  13. Automated tracking for advanced satellite laser ranging systems

    Science.gov (United States)

    McGarry, Jan F.; Degnan, John J.; Titterton, Paul J., Sr.; Sweeney, Harold E.; Conklin, Brion P.; Dunn, Peter J.

    1996-06-01

    NASA's Satellite Laser Ranging Network was originally developed during the 1970's to track satellites carrying corner cube reflectors. Today eight NASA systems, achieving millimeter ranging precision, are part of a global network of more than 40 stations that track 17 international satellites. To meet the tracking demands of a steadily growing satellite constellation within existing resources, NASA is embarking on a major automation program. While manpower on the current systems will be reduced to a single operator, the fully automated SLR2000 system is being designed to operate for months without human intervention. Because SLR2000 must be eyesafe and operate in daylight, tracking is often performed in a low probability of detection and high noise environment. The goal is to automatically select the satellite, setup the tracking and ranging hardware, verify acquisition, and close the tracking loop to optimize data yield. To accomplish the autotracking tasks, we are investigating (1) improved satellite force models, (2) more frequent updates of orbital ephemerides, (3) lunar laser ranging data processing techniques to distinguish satellite returns from noise, and (4) angular detection and search techniques to acquire the satellite. A Monte Carlo simulator has been developed to allow optimization of the autotracking algorithms by modeling the relevant system errors and then checking performance against system truth. A combination of simulator and preliminary field results will be presented.
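
    The abstract mentions a Monte Carlo simulator used to tune autotracking algorithms against modeled system errors, including separating sparse satellite returns from noise. The toy sketch below estimates, by Monte Carlo, how often a simple range-gate hit-count test would declare acquisition; all rates and thresholds are invented for illustration and bear no relation to SLR2000 parameters.

```python
# Toy Monte Carlo in the spirit of the simulator described above: estimate how
# often a simple range-gate test flags a satellite pass as "acquired" when the
# return is buried in noise. All rates and thresholds are made-up numbers.
import random

def simulate_pass(signal_prob=0.05, noise_rate=0.02, shots=2000,
                  gate_hits_required=70):
    """Count shots whose return lands in the predicted range gate; declare
    acquisition if the count clearly exceeds what noise alone would give."""
    hits = 0
    for _ in range(shots):
        if random.random() < signal_prob:    # true return detected in the gate
            hits += 1
        if random.random() < noise_rate:     # noise count falling in the gate
            hits += 1
    return hits >= gate_hits_required

random.seed(0)
trials = 1000
acquired = sum(simulate_pass() for _ in range(trials))
print(f"estimated acquisition probability: {acquired / trials:.2%}")
```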

  14. A wearable device for a fully automated in-hospital staff and patient identification.

    Science.gov (United States)

    Cavalleri, M; Morstabilini, R; Reni, G

    2004-01-01

    In the health care context, devices for automated staff/patient identification provide multiple benefits, including error reduction in drug administration, easier and faster use of the Electronic Health Record, and enhanced security and control features when accessing confidential data. Current identification systems (e.g. smartcards, bar codes) are not completely seamless to users and require mechanical operations that are sometimes difficult for impaired subjects to perform. Emerging wireless RFID technologies are encouraging, but still cannot be introduced in health care environments because of their electromagnetic emissions and the need for large antennas to operate at reasonable distances. The present work describes a prototype wearable device for automated staff and patient identification that is small in size and complies with in-hospital electromagnetic requirements. The prototype also implements an anti-counterfeit option. Its experimental application allowed the introduction of some security functions for confidential data management.

  15. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    International Nuclear Information System (INIS)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M; Noo, F

    2016-01-01

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT_wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range

  16. TH-AB-207A-05: A Fully-Automated Pipeline for Generating CT Images Across a Range of Doses and Reconstruction Methods

    Energy Technology Data Exchange (ETDEWEB)

    Young, S; Lo, P; Hoffman, J; Wahi-Anwar, M; Brown, M; McNitt-Gray, M [UCLA School of Medicine, Los Angeles, CA (United States); Noo, F [University of Utah, Salt Lake City, UT (United States)

    2016-06-15

    Purpose: To evaluate the robustness of CAD or Quantitative Imaging methods, they should be tested on a variety of cases and under a variety of image acquisition and reconstruction conditions that represent the heterogeneity encountered in clinical practice. The purpose of this work was to develop a fully-automated pipeline for generating CT images that represent a wide range of dose and reconstruction conditions. Methods: The pipeline consists of three main modules: reduced-dose simulation, image reconstruction, and quantitative analysis. The first two modules of the pipeline can be operated in a completely automated fashion, using configuration files and running the modules in a batch queue. The input to the pipeline is raw projection CT data; this data is used to simulate different levels of dose reduction using a previously-published algorithm. Filtered-backprojection reconstructions are then performed using FreeCT-wFBP, a freely-available reconstruction software for helical CT. We also added support for an in-house, model-based iterative reconstruction algorithm using iterative coordinate-descent optimization, which may be run in tandem with the more conventional recon methods. The reduced-dose simulations and image reconstructions are controlled automatically by a single script, and they can be run in parallel on our research cluster. The pipeline was tested on phantom and lung screening datasets from a clinical scanner (Definition AS, Siemens Healthcare). Results: The images generated from our test datasets appeared to represent a realistic range of acquisition and reconstruction conditions that we would expect to find clinically. The time to generate images was approximately 30 minutes per dose/reconstruction combination on a hybrid CPU/GPU architecture. Conclusion: The automated research pipeline promises to be a useful tool for either training or evaluating performance of quantitative imaging software such as classifiers and CAD algorithms across the range
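
    Both records above describe a configuration-driven sweep over dose levels and reconstruction methods executed as batch jobs. The sketch below shows one hypothetical way to enumerate such combinations into job commands; the tool name run_pipeline, the paths, and the parameter values are placeholders, not the actual pipeline's interface.

```python
# Hypothetical orchestration sketch for a dose/reconstruction sweep of the kind
# described above. Tool name, paths, and parameters are placeholders; the real
# pipeline uses its own configuration files and cluster queue.
from itertools import product

dose_levels = [1.0, 0.5, 0.25, 0.1]            # fraction of the original dose
recon_methods = ["wfbp_smooth", "wfbp_sharp", "iterative_cd"]

def make_job(case_id, dose, recon):
    """Return a shell command string for one dose/reconstruction combination."""
    return (f"run_pipeline --case {case_id} "
            f"--dose-fraction {dose} --recon {recon} "
            f"--output results/{case_id}/d{int(dose * 100):03d}_{recon}")

jobs = [make_job("lung_screening_001", d, r)
        for d, r in product(dose_levels, recon_methods)]
for cmd in jobs:
    print(cmd)   # in practice these would be submitted to a batch queue
print(f"{len(jobs)} dose/reconstruction combinations queued")
```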

  17. Rig automation: where it's been and where it's going

    Energy Technology Data Exchange (ETDEWEB)

    Rinaldi, R.

    1982-06-01

    For over 30 years dreamers, tinkerers and engineers have attempted to automate various drilling functions. Now this effort is paying off, and a partially automated rig is no longer a curiosity. Fully automated and computerized rigs are on the way. For the contractor this means higher productivity, but more maintenance and training responsibilities.

  18. DeMand: A tool for evaluating and comparing device-level demand and supply forecast models

    DEFF Research Database (Denmark)

    Neupane, Bijay; Siksnys, Laurynas; Pedersen, Torben Bach

    2016-01-01

    Fine-grained device-level predictions of both shiftable and non-shiftable energy demand and supply are vital in order to take advantage of Demand Response (DR) for efficient utilization of Renewable Energy Sources. The selection of an effective device-level load forecast model is a challenging task, mainly due to the diversity of the models and the lack of proper tools and datasets that can be used to validate them. In this paper, we introduce the DeMand system for fine-tuning, analyzing, and validating device-level forecast models. The system offers several built-in device-level measurement datasets, forecast models, features, and error measures, thus semi-automating most of the steps of the forecast model selection and validation process. This paper presents the architecture and data model of the DeMand system, and provides a use-case example of how one particular forecast model...
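
    DeMand's purpose is to score candidate device-level forecast models against built-in datasets and error measures. The sketch below shows the core of such a comparison loop with two naive forecasters and two common error measures on a toy load series; the models and data are placeholders, not DeMand's built-ins.

```python
# Minimal sketch of a forecast-model comparison loop in the spirit of DeMand:
# score candidate device-level forecasters with common error measures.
# The models and the toy load series are placeholders, not DeMand's built-ins.
import numpy as np

def naive_last_value(history, horizon):
    return np.full(horizon, history[-1])

def naive_daily_profile(history, horizon, period=24):
    return np.array([history[-period + (h % period)] for h in range(horizon)])

def mae(actual, forecast):
    return np.mean(np.abs(actual - forecast))

def rmse(actual, forecast):
    return np.sqrt(np.mean((actual - forecast) ** 2))

# Toy hourly device load: a repeating daily profile plus noise.
rng = np.random.default_rng(0)
t = np.arange(24 * 8)
load = 0.5 + 0.4 * np.sin(2 * np.pi * t / 24) ** 2 + rng.normal(0, 0.05, t.size)
history, actual = load[:-24], load[-24:]

for name, model in [("last value", naive_last_value),
                    ("daily profile", naive_daily_profile)]:
    f = model(history, 24)
    print(f"{name:14s}  MAE={mae(actual, f):.3f}  RMSE={rmse(actual, f):.3f}")
```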

  19. Fully-Automated μMRI Morphometric Phenotyping of the Tc1 Mouse Model of Down Syndrome.

    Directory of Open Access Journals (Sweden)

    Nick M Powell

    We describe a fully automated pipeline for the morphometric phenotyping of mouse brains from μMRI data, and show its application to the Tc1 mouse model of Down syndrome, to identify new morphological phenotypes in the brain of this first transchromosomic animal carrying human chromosome 21. We incorporate an accessible approach for simultaneously scanning multiple ex vivo brains, requiring only a 3D-printed brain holder, and novel image processing steps for their separation and orientation. We employ clinically established multi-atlas techniques, which are superior to single-atlas methods, together with publicly available atlas databases for automatic skull-stripping and tissue segmentation, providing high-quality, subject-specific tissue maps. We follow these steps with group-wise registration, structural parcellation, and both Voxel- and Tensor-Based Morphometry, which are advantageous for their ability to highlight morphological differences without the laborious delineation of regions of interest. We show the application of freely available open-source software developed for clinical MRI analysis to mouse brain data: NiftySeg for segmentation and NiftyReg for registration, and discuss atlases and parameters suitable for the preclinical paradigm. We used this pipeline to compare 29 Tc1 brains with 26 wild-type littermate controls, imaged ex vivo at 9.4 T. We show an unexpected increase in Tc1 total intracranial volume and, controlling for this, local volume and grey matter density reductions in the Tc1 brain compared to the wild-types, most prominently in the cerebellum, in agreement with human DS and previous histological findings.

  20. Systematic review automation technologies

    Science.gov (United States)

    2014-01-01

    Systematic reviews, a cornerstone of evidence-based medicine, are not produced quickly enough to support clinical practice. The cost of production, availability of the requisite expertise and timeliness are often quoted as major contributors to the delay. This detailed survey of the state of the art of information systems designed to support or automate individual tasks in the systematic review, and in particular systematic reviews of randomized controlled clinical trials, reveals trends that see the convergence of several parallel research projects. We surveyed the literature describing informatics systems that support or automate the systematic review process or its individual tasks. Several projects focus on automating, simplifying and/or streamlining specific tasks of the systematic review. Some tasks are already fully automated while others are still largely manual. In this review, we describe each task and the effect that its automation would have on the entire systematic review process, summarize the existing information system support for each task, and highlight where further research is needed to realize automation of the task. Integration of the systems that automate systematic review tasks may lead to a revised systematic review workflow. We envisage that the optimized workflow will lead to a system in which each systematic review is described as a computer program that automatically retrieves relevant trials, appraises them, extracts and synthesizes data, evaluates the risk of bias, performs meta-analysis calculations, and produces a report in real time. PMID:25005128

  1. A Framework for Fully Automated Performance Testing for Smart Buildings

    DEFF Research Database (Denmark)

    Markoska, Elena; Johansen, Aslak; Lazarova-Molnar, Sanja

    2018-01-01

    A significant proportion of energy consumption by buildings worldwide, estimated at ca. 40%, has made the study of buildings’ performance highly important. Performance testing is a means by which buildings can be continuously commissioned to ensure that they operate as designed. Historically, setup of performance tests has been manual and labor-intensive and has required intimate knowledge of buildings’ complexity and systems. The emergence of the concept of smart buildings has provided an opportunity to overcome this restriction. In this paper, we propose a framework for automated performance testing of smart buildings that utilizes metadata models. The approach features automatic detection of applicable performance tests using metadata queries and their corresponding instantiation, as well as continuous commissioning based on metadata. The presented approach has been implemented
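
    The key automation step is detecting which performance tests apply to which equipment from a metadata model, then instantiating them. The sketch below illustrates that matching step over a toy metadata dictionary; the point-type names, test definitions, and building model are assumptions, not the paper's metadata schema or query language.

```python
# Hedged sketch of "detect applicable tests from metadata, then instantiate":
# each test declares the point types it needs, and it is instantiated for every
# piece of equipment whose metadata satisfies that requirement. The point-type
# names and the building model below are assumptions, not the paper's schema.
BUILDING_METADATA = {
    "AHU-1": {"supply_air_temp_sensor", "supply_air_temp_setpoint", "fan_status"},
    "AHU-2": {"supply_air_temp_sensor", "fan_status"},
    "Room-101": {"zone_temp_sensor", "zone_temp_setpoint"},
}

PERFORMANCE_TESTS = {
    "supply air temperature tracks setpoint":
        {"supply_air_temp_sensor", "supply_air_temp_setpoint"},
    "zone temperature tracks setpoint":
        {"zone_temp_sensor", "zone_temp_setpoint"},
    "fan runs only when scheduled":
        {"fan_status"},
}

def applicable_tests(metadata, tests):
    """Yield (equipment, test name) pairs whose metadata requirement is met."""
    for equipment, points in metadata.items():
        for test_name, required in tests.items():
            if required <= points:
                yield equipment, test_name

for equipment, test_name in applicable_tests(BUILDING_METADATA, PERFORMANCE_TESTS):
    print(f"instantiate '{test_name}' for {equipment}")
```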

  2. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2011-01-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both 15N-labeled and 13C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only 15N-labeled NMR data and avoid the added expense of using 13C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using 15N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.

  3. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard

    2011-03-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both 15N-labeled and 13C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only 15N-labeled NMR data and avoid the added expense of using 13C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using 15N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.
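
    The core of the method described in the two records above is a pair of integer programs that assign spin systems to residues. The sketch below is a greatly simplified stand-in that treats assignment as a linear assignment problem solved with SciPy's Hungarian-method routine; the random cost matrix is a placeholder for the structure-based typing scores, and the sequential-connectivity constraints of the full integer programs are ignored.

```python
# Greatly simplified stand-in for the resonance-assignment step: match spin
# systems to residues by minimizing a mismatch cost with the Hungarian method.
# The random cost matrix is a placeholder for structure-based typing scores.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n_residues = 8
true_perm = rng.permutation(n_residues)          # ground-truth assignment

# Low cost where spin system i actually belongs to residue true_perm[i].
cost = rng.uniform(1.0, 2.0, size=(n_residues, n_residues))
cost[np.arange(n_residues), true_perm] = rng.uniform(0.0, 0.3, n_residues)

rows, cols = linear_sum_assignment(cost)         # minimize total mismatch cost
accuracy = np.mean(cols == true_perm)
print("assigned residues:", cols)
print(f"assignment accuracy vs. ground truth: {accuracy:.0%}")
```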

  4. Demand Response Resource Quantification with Detailed Building Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Hale, Elaine; Horsey, Henry; Merket, Noel; Stoll, Brady; Nag, Ambarish

    2017-04-03

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  5. Fully Automated Laser Ablation Liquid Capture Sample Analysis using NanoElectrospray Ionization Mass Spectrometry

    Energy Technology Data Exchange (ETDEWEB)

    Lorenz, Matthias [ORNL; Ovchinnikova, Olga S [ORNL; Van Berkel, Gary J [ORNL

    2014-01-01

    RATIONALE: Laser ablation provides for the possibility of sampling a large variety of surfaces with high spatial resolution. This type of sampling, when employed in conjunction with liquid capture followed by nanoelectrospray ionization, provides the opportunity for sensitive and prolonged interrogation of samples by mass spectrometry as well as the ability to analyze surfaces not amenable to direct liquid extraction. METHODS: A fully automated, reflection geometry, laser ablation liquid capture spot sampling system was achieved by incorporating appropriate laser fiber optics and a focusing lens into a commercially available, liquid extraction surface analysis (LESA)-ready Advion TriVersa NanoMate system. RESULTS: Under optimized conditions about 10% of laser ablated material could be captured in a droplet positioned vertically over the ablation region using the NanoMate robot controlled pipette. The sampling spot size area with this laser ablation liquid capture surface analysis (LA/LCSA) mode of operation (typically about 120 μm x 160 μm) was approximately 50 times smaller than that achievable by direct liquid extraction using LESA (ca. 1 mm diameter liquid extraction spot). The set-up was successfully applied for the analysis of ink on glass and paper as well as the endogenous components in Alstroemeria Yellow King flower petals. In a second mode of operation with a comparable sampling spot size, termed laser ablation/LESA, the laser system was used to drill through, penetrate, or otherwise expose material beneath a solvent resistant surface. Once drilled, LESA was effective in sampling soluble material exposed at that location on the surface. CONCLUSIONS: Incorporating the capability for different laser ablation liquid capture spot sampling modes of operation into a LESA-ready Advion TriVersa NanoMate enhanced the spot sampling spatial resolution of this device and broadened the surface types amenable to analysis to include absorbent and solvent resistant

  6. ROBOCAL: An automated NDA [nondestructive analysis] calorimetry and gamma isotopic system

    International Nuclear Information System (INIS)

    Hurd, J.R.; Powell, W.D.; Ostenak, C.A.

    1989-01-01

    ROBOCAL, which is presently being developed and tested at Los Alamos National Laboratory, is a full-scale, prototype robotic system for remote calorimetric and gamma-ray analysis of special nuclear materials. It integrates a fully automated, multidrawer, vertical stacker-retriever system for staging unmeasured nuclear materials, and a fully automated gantry robot for computer-based selection and transfer of nuclear materials to calorimetric and gamma-ray measurement stations. Since ROBOCAL is designed for minimal operator intervention, a completely programmed user interface is provided to interact with the automated mechanical and assay systems. The assay system is designed to completely integrate calorimetric and gamma-ray data acquisition and to perform state-of-the-art analyses on both homogeneous and heterogeneous distributions of nuclear materials in a wide variety of matrices

  7. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images.

    Science.gov (United States)

    Serag, Ahmed; Macnaught, Gillian; Denison, Fiona C; Reynolds, Rebecca M; Semple, Scott I; Boardman, James P

    2017-01-01

    Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved 96% of complete brain localization using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest based regression methods and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.
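
    As a schematic of the sliding-window selection step only (the 3D HOG features and trained scoring model of the paper are replaced by a stand-in scoring function, and all array sizes are invented), the idea of scoring every candidate window and keeping the best one can be sketched as:

    import numpy as np

    def best_window(volume, win, score_fn, stride=8):
        """Slide a 3D box over `volume`, score each position, return the best box origin."""
        best_score, best_origin = -np.inf, None
        for z in range(0, volume.shape[0] - win[0] + 1, stride):
            for y in range(0, volume.shape[1] - win[1] + 1, stride):
                for x in range(0, volume.shape[2] - win[2] + 1, stride):
                    s = score_fn(volume[z:z + win[0], y:y + win[1], x:x + win[2]])
                    if s > best_score:
                        best_score, best_origin = s, (z, y, x)
        return best_origin, best_score

    # Stand-in scorer: mean intensity; the paper scores windows with a model over 3D HOG features.
    volume = np.random.rand(64, 64, 64)
    origin, score = best_window(volume, win=(32, 32, 32), score_fn=np.mean)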

  8. Histograms of Oriented 3D Gradients for Fully Automated Fetal Brain Localization and Robust Motion Correction in 3 T Magnetic Resonance Images

    Directory of Open Access Journals (Sweden)

    Ahmed Serag

    2017-01-01

    Full Text Available Fetal brain magnetic resonance imaging (MRI) is a rapidly emerging diagnostic imaging tool. However, automated fetal brain localization is one of the biggest obstacles in expediting and fully automating large-scale fetal MRI processing. We propose a method for automatic localization of fetal brain in 3 T MRI when the images are acquired as a stack of 2D slices that are misaligned due to fetal motion. First, the Histogram of Oriented Gradients (HOG) feature descriptor is extended from 2D to 3D images. Then, a sliding window is used to assign a score to all possible windows in an image, depending on the likelihood of it containing a brain, and the window with the highest score is selected. In our evaluation experiments using a leave-one-out cross-validation strategy, we achieved 96% of complete brain localization using a database of 104 MRI scans at gestational ages between 34 and 38 weeks. We carried out comparisons against template matching and random forest based regression methods and the proposed method showed superior performance. We also showed the application of the proposed method in the optimization of fetal motion correction and how it is essential for the reconstruction process. The method is robust and does not rely on any prior knowledge of fetal brain development.

  9. Marketing automation

    Directory of Open Access Journals (Sweden)

    TODOR Raluca Dania

    2017-01-01

    Full Text Available The automation of the marketing process seems, nowadays, to be the only solution to face the major changes brought by the fast evolution of technology and the continuous increase in supply and demand. In order to achieve the desired marketing results, businesses have to employ digital marketing and communication services. These services are efficient and measurable thanks to the marketing technology used to track, score and implement each campaign. Due to technical progress and marketing fragmentation, the demand for customized products and services on one side, and the need for constructive dialogue with customers, immediate and flexible response, and the necessity to measure investments and results on the other, the classical marketing approach has changed and continues to evolve substantially.

  10. Implementation of a demand elasticity model in the building energy management system

    NARCIS (Netherlands)

    Ożadowicz, A.; Grela, J.; Babar, M.

    2016-01-01

    Nowadays, a crucial part of modern Building Automation and Control Systems (BACS) is electric energy management. Active demand side management is a very important feature of a Building Energy Management System (BEMS) integrated within the BACS. Since demand value changes in time and depends on

  11. Model-based automated testing of critical PLC programs.

    CERN Document Server

    Fernández Adiego, B; Tournier, J-C; González Suárez, V M; Bliudze, S

    2014-01-01

    Testing of critical PLC (Programmable Logic Controller) programs remains a challenging task for control system engineers as it can rarely be automated. This paper proposes a model-based approach which uses the BIP (Behavior, Interactions and Priorities) framework to perform automated testing of PLC programs developed with the UNICOS (UNified Industrial COntrol System) framework. This paper defines the translation procedure and rules from UNICOS to BIP, which can be fully automated in order to hide the complexity of the underlying model from the control engineers. The approach is illustrated and validated through the study of a water treatment process.

  12. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1986-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automating is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 8 x 10⁹ bits per 14 x 17 (inch) film, the equivalent of 2200 computer floppy discs. Parts handling systems and robotics, applied for manufacturing and some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to fully automate the exposure step. Finally, computer aided interpretation appears on the horizon. A unit which laser scans a 14 x 17 (inch) film in 6 - 8 seconds can digitize film information for further manipulation and possible automatic interrogation (computer aided interpretation). The system, called FDRS (for Film Digital Radiography System), is moving toward 50 micron (approximately 16 lines/mm) resolution. This is believed to meet the majority of image content needs. We expect the automated system to appear first in parts (modules) as certain operations are automated. The future will see it all come together in an automated film radiographic NDT system (author)

  13. Scenarios about development and implications of automated vehicles in the Netherlands

    NARCIS (Netherlands)

    Milakis, D.; Snelder, M.; Van Arem, B.; Van Wee, G.P.; Homem De Almeida Rodriguez Correia, G.

    2016-01-01

    Automated driving technology is emerging. Yet, little is known about when automated vehicles will hit the market, how penetration rates will evolve, and to what extent this new transportation technology will affect transportation demand and planning. This study identified through scenario analysis

  14. Comparison of two theory-based, fully automated telephone interventions designed to maintain dietary change in healthy adults: study protocol of a three-arm randomized controlled trial.

    Science.gov (United States)

    Wright, Julie A; Quintiliani, Lisa M; Turner-McGrievy, Gabrielle M; Migneault, Jeffrey P; Heeren, Timothy; Friedman, Robert H

    2014-11-10

    Health behavior change interventions have focused on obtaining short-term intervention effects; few studies have evaluated mid-term and long-term outcomes, and even fewer have evaluated interventions that are designed to maintain and enhance initial intervention effects. Moreover, behavior theory has not been developed for maintenance or applied to maintenance intervention design to the degree that it has for behavior change initiation. The objective of this paper is to describe a study that compared two theory-based interventions (social cognitive theory [SCT] vs goal systems theory [GST]) designed to maintain previously achieved improvements in fruit and vegetable (F&V) consumption. The interventions used tailored, interactive conversations delivered by a fully automated telephony system (Telephone-Linked Care [TLC]) over a 6-month period. TLC maintenance intervention based on SCT used a skills-based approach to build self-efficacy. It assessed confidence in and barriers to eating F&V, provided feedback on how to overcome barriers, plan ahead, and set goals. The TLC maintenance intervention based on GST used a cognitive-based approach. Conversations trained participants in goal management to help them integrate their newly acquired dietary behavior into their hierarchical system of goals. Content included goal facilitation, conflict, shielding, and redundancy, and reflection on personal goals and priorities. To evaluate and compare the two approaches, a sample of adults whose F&V consumption was below public health goal levels were recruited from a large urban area to participate in a fully automated telephony intervention (TLC-EAT) for 3-6 months. Participants who increase their daily intake of F&V by ≥1 serving/day will be eligible for the three-arm randomized controlled trial. A sample of 405 participants will be randomized to one of three arms: (1) an assessment-only control, (2) TLC-SCT, and (3) TLC-GST. The maintenance interventions are 6 months. All 405

  15. A facile and rapid automated synthesis of 3'-deoxy-3'-[18F]fluorothymidine

    International Nuclear Information System (INIS)

    Tang Ganghua; Tang Xiaolan; Wen Fuhua; Wang Mingfang; Li Baoyuan

    2010-01-01

    Aim: To develop a simplified and fully automated synthesis procedure of 3'-deoxy-3'-[18F]fluorothymidine ([18F]FLT) using the PET-MF-2V-IT-I synthesis module. Methods: Synthesis of [18F]FLT was performed using the PET-MF-2V-IT-I synthesis module by a one-pot two-step reaction procedure, including nucleophilic fluorination of 3-N-t-butoxycarbonyl-1-[5'-O-(4,4'-dimethoxytriphenylmethyl)-2'-deoxy-3'-O-(4-nitrobenzenesulfonyl)-β-D-threopentofuranosyl]thymine (15 mg) as the precursor molecule with [18F]fluoride, and subsequent hydrolysis of the protecting group with 1.0 M HCl in the same reaction vessel and purification with SEP PAK cartridges instead of the HPLC system. Results: The automated synthesis of [18F]FLT with SEP PAK purification gave a corrected radiochemical yield of 23.2±2.6% (n=6, uncorrected yield: 16-22%) and radiochemical purity of >97% within a total synthesis time of 35 min. Conclusion: The fully automated one-pot synthesis procedure with SEP PAK purification can be applied to the fully automated synthesis of [18F]FLT using a commercial [18F]FDG synthesis module.

  16. Sensors and Automated Analyzers for Radionuclides

    International Nuclear Information System (INIS)

    Grate, Jay W.; Egorov, Oleg B.

    2003-01-01

    The production of nuclear weapons materials has generated large quantities of nuclear waste and significant environmental contamination. We have developed new, rapid, automated methods for determination of radionuclides using sequential injection methodologies to automate extraction chromatographic separations, with on-line flow-through scintillation counting for real time detection. This work has progressed in two main areas: radionuclide sensors for water monitoring and automated radiochemical analyzers for monitoring nuclear waste processing operations. Radionuclide sensors have been developed that collect and concentrate radionuclides in preconcentrating minicolumns with dual functionality: chemical selectivity for radionuclide capture and scintillation for signal output. These sensors can detect pertechnetate to below regulatory levels and have been engineered into a prototype for field testing. A fully automated process monitor has been developed for total technetium in nuclear waste streams. This instrument performs sample acidification, speciation adjustment, separation and detection in fifteen minutes or less

  17. Delineated Analysis of Robotic Process Automation Tools

    OpenAIRE

    Ruchi Isaac; Riya Muni; Kenali Desai

    2017-01-01

    In an age when speed is expected of every sector, the speed of execution of various processes, and hence efficiency, becomes a prominent factor. To meet the speed demands of these diverse platforms, Robotic Process Automation (RPA) is used. Robotic Process Automation can expedite back-office tasks in commercial industries, remote management tasks in IT industries and conservation of resources in multiple sectors. To implement RPA, many software ...

  18. Automated MRI segmentation for individualized modeling of current flow in the human head.

    Science.gov (United States)

    Huang, Yu; Dmochowski, Jacek P; Su, Yuzhuo; Datta, Abhishek; Rorden, Christopher; Parra, Lucas C

    2013-12-01

    High-definition transcranial direct current stimulation (HD-tDCS) and high-density electroencephalography require accurate models of current flow for precise targeting and current source reconstruction. At a minimum, such modeling must capture the idiosyncratic anatomy of the brain, cerebrospinal fluid (CSF) and skull for each individual subject. Currently, the process to build such high-resolution individualized models from structural magnetic resonance images requires labor-intensive manual segmentation, even when utilizing available automated segmentation tools. Also, accurate placement of many high-density electrodes on an individual scalp is a tedious procedure. The goal was to develop fully automated techniques to reduce the manual effort in such a modeling process. A fully automated segmentation technique based on Statistical Parametric Mapping 8, including an improved tissue probability map and an automated correction routine for segmentation errors, was developed, along with an automated electrode placement tool for high-density arrays. The performance of these automated routines was evaluated against results from manual segmentation on four healthy subjects and seven stroke patients. The criteria include segmentation accuracy, the difference of current flow distributions in resulting HD-tDCS models and the optimized current flow intensities on cortical targets. The segmentation tool segments not just the brain but also provides accurate results for CSF, skull and other soft tissues, with a field of view extending to the neck. Compared to manual results, automated segmentation deviates by only 7% and 18% for normal and stroke subjects, respectively. The predicted electric fields in the brain deviate by 12% and 29% respectively, which is well within the variability observed for various modeling choices. Finally, optimized current flow intensities on cortical targets do not differ significantly. Fully automated individualized modeling may now be feasible

  19. Validation of the fully automated A&D TM-2656 blood pressure monitor according to the British Hypertension Society Protocol.

    Science.gov (United States)

    Zeng, Wei-Fang; Liu, Ming; Kang, Yuan-Yuan; Li, Yan; Wang, Ji-Guang

    2013-08-01

    The present study aimed to evaluate the accuracy of the fully automated oscillometric upper-arm blood pressure monitor TM-2656 according to the British Hypertension Society (BHS) Protocol 1993. We recruited individuals until there were 85 eligible participants and their blood pressure could meet the blood pressure distribution requirements specified by the BHS Protocol. For each individual, we sequentially measured the systolic and diastolic blood pressures using a mercury sphygmomanometer (two observers) and the TM-2656 device (one supervisor). Data analysis was carried out according to the BHS Protocol. The device achieved grade A. The percentage of blood pressure differences within 5, 10, and 15 mmHg was 62, 85, and 96%, respectively, for systolic blood pressure, and 71, 93, and 99%, respectively, for diastolic blood pressure. The average (±SD) of the device-observer differences was -2.1±7.8 mmHg (P<0.0001) and -1.1±5.8 mmHg (P<0.0001) for systolic and diastolic blood pressures, respectively. The A&D upper-arm blood pressure monitor TM-2656 has passed the requirements of the BHS Protocol, and can thus be recommended for blood pressure measurement.

  20. Maximizing Your Investment in Building Automation System Technology.

    Science.gov (United States)

    Darnell, Charles

    2001-01-01

    Discusses how organizational issues and system standardization can be important factors that determine an institution's ability to fully exploit contemporary building automation systems (BAS). Further presented is management strategy for maximizing BAS investments. (GR)

  1. Automated tetraploid genotype calling by hierarchical clustering

    Science.gov (United States)

    SNP arrays are transforming breeding and genetics research for autotetraploids. To fully utilize these arrays, however, the relationship between signal intensity and allele dosage must be inferred independently for each marker. We developed an improved computational method to automate this process, ...

  2. Bayesian Safety Risk Modeling of Human-Flightdeck Automation Interaction

    Science.gov (United States)

    Ancel, Ersin; Shih, Ann T.

    2015-01-01

    The use of automatic systems in airliners has increased fuel efficiency, added extra capabilities, and enhanced safety and reliability, as well as improved passenger comfort, since their introduction in the late 1980s. However, the original automation benefits, including reduced flight crew workload, human errors, and training requirements, were not achieved as originally expected. Instead, automation introduced new failure modes, redistributed and sometimes increased workload, brought in new cognitive and attention demands, and increased training requirements. Modern airliners have numerous flight modes, providing more flexibility (and inherently more complexity) to the flight crew. However, the price to pay for the increased flexibility is the need for increased mode awareness, as well as the need to supervise, understand, and predict automated system behavior. Also, over-reliance on automation is linked to manual flight skill degradation and complacency in commercial pilots. As a result, recent accidents involving human errors are often caused by the interactions between humans and the automated systems (e.g., the breakdown in man-machine coordination), deteriorated manual flying skills, and/or loss of situational awareness due to heavy dependence on automated systems. This paper describes the development of the increased complexity and reliance on automation baseline model, named FLAP for FLightdeck Automation Problems. The model development process starts with a comprehensive literature review followed by the construction of a framework comprising high-level causal factors leading to an automation-related flight anomaly. The framework was then converted into a Bayesian Belief Network (BBN) using the Hugin Software v7.8. The effects of automation on the flight crew are incorporated into the model, including flight skill degradation, increased cognitive demand, and training requirements, along with their interactions. Besides flight crew deficiencies, automation system
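
    Purely as a toy illustration of how a Bayesian Belief Network combines such causal factors (the node names, structure, and probabilities below are invented and are not taken from the FLAP model), a two-parent fragment can be marginalized by hand:

    import numpy as np

    # Invented binary parents and conditional probability table -- illustrative only.
    p_reliance = np.array([0.6, 0.4])       # P(over-reliance = high), P(low)
    p_degrade = np.array([0.3, 0.7])        # P(manual-skill degradation = yes), P(no)
    # P(automation-related anomaly | reliance, degradation)
    p_anom = np.array([[0.020, 0.008],
                       [0.010, 0.004]])

    p_anomaly = float(np.einsum("r,d,rd->", p_reliance, p_degrade, p_anom))
    print(f"P(anomaly) = {p_anomaly:.4f}")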

  3. Quantum dots assisted photocatalysis for the chemiluminometric determination of chemical oxygen demand using a single interface flow system

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre, Cristina I.C.; Frigerio, Christian [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal); Santos, Joao L.M., E-mail: joaolms@ff.up.pt [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal); Lima, Jose L.F.C. [Requimte, Department of Chemistry, Faculty of Pharmacy, Porto University, Rua Anibal Cunha 164, 4099-030, Porto (Portugal)

    2011-08-12

    Highlights: • A novel flow method for the determination of chemical oxygen demand is proposed. • CdTe nanocrystals are irradiated with UV light to generate strong oxidizing species. • Reactive species promote a fast catalytic degradation of organic matter. • Luminol is used as a chemiluminescence probe for indirect COD assessment. • A single interface flow system was implemented to automate the assays. - Abstract: A novel flow method for the determination of chemical oxygen demand (COD) is proposed in this work. It relies on the combination of a fully automated single interface flow system, an on-line UV photocatalytic unit and quantum dot (QD) nanotechnology. The developed approach takes advantage of the capacity of CdTe nanocrystals to generate strong oxidizing species upon irradiation with UV light, which fostered a fast catalytic degradation of the organic compounds. Luminol was used as a chemiluminescence (CL) probe for indirect COD assessment, since it is easily oxidized by the QD-generated species, yielding a strong CL emission that is quenched in the presence of the organic matter. The proposed methodology allowed the determination of COD concentrations between 1 and 35 mg L⁻¹, with good precision (R.S.D. < 1.1%, n = 3) and a sampling frequency of about 33 h⁻¹. The procedure was applied to the determination of COD in wastewater certified reference materials and the obtained results showed an excellent agreement with the certified values.

  4. An automated tensile machine for small specimens heavily neutron irradiated in FFTF/MOTA

    International Nuclear Information System (INIS)

    Kohyama, Akira; Sato, Shinji; Hamada, Kenichi

    1993-01-01

    The objective of this work is to develop a fully automated tensile machine for post-irradiation examination (PIE) of Fast Flux Test Facility (FFTF)/Materials Open Test Assembly (MOTA) irradiated miniature tension specimens. The anticipated merit of the automated tensile machine is to reduce damage to specimens during specimen handling for PIE and to reduce exposure to radioactive specimens. This machine is designed for testing at elevated temperatures, up to 873 K, in a vacuum or in an inert gas environment. Twelve specimen assemblies are placed in the vacuum chamber that can be tested successively in a fully automated manner. A unique automated tensile machine for the PIE of FFTF/MOTA irradiated specimens, the Monbusho Automated Tensile Machine (MATRON) consists of a test frame with controlling units and an automated specimen-loading apparatus. The qualification of the test frame has been completed, and the results have satisfied the machine specifications. The capabilities of producing creep and relaxation data have been demonstrated for Cu, Al, 316SS, and ferritic steels. The specimen holders for the three-point bending test and the small bulge test (small punch test; SP test) were also designed and produced

  5. A Multi-Scale Flood Monitoring System Based on Fully Automatic MODIS and TerraSAR-X Processing Chains

    Directory of Open Access Journals (Sweden)

    Enrico Stein

    2013-10-01

    Full Text Available A two-component fully automated flood monitoring system is described and evaluated. This is a result of combining two individual flood services that are currently under development at DLR's (German Aerospace Center) Center for Satellite-based Crisis Information (ZKI) to rapidly support disaster management activities. A first-phase monitoring component of the system systematically detects potential flood events on a continental scale using daily-acquired medium spatial resolution optical data from the Moderate Resolution Imaging Spectroradiometer (MODIS). A threshold set controls the activation of the second-phase crisis component of the system, which derives flood information at higher spatial detail using a Synthetic Aperture Radar (SAR) based satellite mission (TerraSAR-X). The proposed activation procedure finds use in the identification of flood situations in different spatial resolutions and in the time-critical and on-demand programming of SAR satellite acquisitions at an early stage of an evolving flood situation. The automated processing chains of the MODIS Flood Service (MFS) and the TerraSAR-X Flood Service (TFS) include data pre-processing, the computation and adaptation of global auxiliary data, thematic classification, and the subsequent dissemination of flood maps using an interactive web-client. The system is operationally demonstrated and evaluated via the monitoring of two recent flood events in Russia 2013 and Albania/Montenegro 2013.

  6. Automated ultrasonic inspection using PULSDAT

    International Nuclear Information System (INIS)

    Naybour, P.J.

    1992-01-01

    PULSDAT (Portable Ultrasonic Data Acquisition Tool) is a system for recording the data from single probe automated ultrasonic inspections. It is one of a range of instruments and software developed by Nuclear Electric to carry out a wide variety of high quality ultrasonic inspections. These vary from simple semi-automated inspections through to multi-probe, highly automated ones. PULSDAT runs under the control of MIPS software, and collects data which is compatible with the GUIDE data display system. PULSDAT is therefore fully compatible with Nuclear Electric's multi-probe inspection systems and utilises all the reliability and quality assurance of the software. It is a rugged, portable system that can be used in areas of difficult access. The paper discusses the benefits of automated inspection and gives an outline of the main features of PULSDAT. Since April 1990 PULSDAT has been used in several applications within Nuclear Electric and this paper presents two examples: the first is a ferritic set-through nozzle and the second is an austenitic fillet weld. (Author)

  7. Automated x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    O'Connell, A.M.

    1977-01-01

    A fully automated x-ray fluorescence analytical system is described. The hardware is based on a Philips PW1220 sequential x-ray spectrometer. Software for on-line analysis of a wide range of sample types has been developed for the Hewlett-Packard 9810A programmable calculator. Routines to test the system hardware are also described. (Author)

  8. Fully automated processing of buffy-coat-derived pooled platelet concentrates.

    Science.gov (United States)

    Janetzko, Karin; Klüter, Harald; van Waeg, Geert; Eichler, Hermann

    2004-07-01

    The OrbiSac device, which was developed to automate the manufacture of buffy-coat PLT concentrates (BC-PCs), was evaluated. In-vitro characteristics of BC-PC preparations using the OrbiSac device were compared with manually prepared BC-PCs. For standard processing (Std-PC, n = 20), four BC-PCs were pooled using 300 mL of PLT AS (PAS) followed by soft-spin centrifugation and WBC filtration. The OrbiSac preparation (OS-PC, n = 20) was performed by automated pooling of four BC-PCs with 300 mL PAS followed by centrifugation and inline WBC filtration. All PCs were stored at 22 °C. Samples were withdrawn on Days 1, 5, and 7, evaluating PLT count, blood gas analysis, glucose, lactate, LDH, beta-thromboglobulin, hypotonic shock response, and CD62p expression. A PLT content of 3.1 +/- 0.4 x 10¹¹ (OS-PCs) versus 2.7 +/- 0.5 x 10¹¹ (Std-PCs, p < 0.05) was found. A CV of 19 percent (Std-PC) versus 14 percent (OS-PC) suggests more standardization in the OS group. At Day 7, the Std-PCs versus OS-PCs showed a glucose consumption of 1.03 +/- 0.32 μmol per 10⁹ PLT versus 0.75 +/- 0.25 μmol per 10⁹ PLT (p < 0.001), and a lactate production of 1.50 +/- 0.86 μmol per 10⁹ versus 1.11 +/- 0.61 μmol per 10⁹ (p < 0.001). The pH (7.00 +/- 0.19 vs. 7.23 +/- 0.06; p < 0.001), pO2 (45.3 +/- 18 vs. 31.3 +/- 10.4 mmHg; p < 0.01), and HCO3 levels (4.91 +/- 1.49 vs. 7.14 +/- 0.95 mmol/L; p < 0.001) suggest a slightly better aerobic metabolism within the OS group. Only small differences in CD62p expression were observed (37.3 +/- 12.9% Std-PC vs. 44.8 +/- 6.6% OS-PC; p < 0.05). The OrbiSac device allows an improved PLT yield without affecting PLT in-vitro characteristics and may enable an improved consistency in product volume and yield.

  9. Progress report on a fully automatic Gas Tungsten Arc Welding (GTAW) system development

    Energy Technology Data Exchange (ETDEWEB)

    Daumeyer, G.J. III

    1994-12-01

    A plan to develop a fully automatic gas tungsten arc welding (GTAW) system that will utilize a vision-sensing computer (which will provide in-process feedback control) is presently in progress. Evaluations of different technological aspects and system design requirements continue. This report summarizes major activities in the plan's progress. The technological feasibility of producing the fully automated GTAW system has been proven. The goal of this process development project is to provide a production-ready system within the shortest reasonable time frame.

  10. Development of automated analytical systems for large throughput

    International Nuclear Information System (INIS)

    Ernst, P.C.; Hoffman, E.L.

    1982-01-01

    The need to be able to handle a large throughput of samples for neutron activation analysis has led to the development of automated counting and sample handling systems. These are coupled with available computer-assisted INAA techniques to perform a wide range of analytical services on a commercial basis. A fully automated delayed neutron counting system and a computer controlled pneumatic transfer for INAA use are described, as is a multi-detector gamma-spectroscopy system. (author)

  11. Selecting automation for the clinical chemistry laboratory.

    Science.gov (United States)

    Melanson, Stacy E F; Lindeman, Neal I; Jarolim, Petr

    2007-07-01

    Laboratory automation proposes to improve the quality and efficiency of laboratory operations, and may provide a solution to the quality demands and staff shortages faced by today's clinical laboratories. Several vendors offer automation systems in the United States, with both subtle and obvious differences. Arriving at a decision to automate, and the ensuing evaluation of available products, can be time-consuming and challenging. Although considerable discussion concerning the decision to automate has been published, relatively little attention has been paid to the process of evaluating and selecting automation systems. To outline a process for evaluating and selecting automation systems as a reference for laboratories contemplating laboratory automation. Our Clinical Chemistry Laboratory staff recently evaluated all major laboratory automation systems in the United States, with their respective chemistry and immunochemistry analyzers. Our experience is described and organized according to the selection process, the important considerations in clinical chemistry automation, decisions and implementation, and we give conclusions pertaining to this experience. Including the formation of a committee, workflow analysis, submitting a request for proposal, site visits, and making a final decision, the process of selecting chemistry automation took approximately 14 months. We outline important considerations in automation design, preanalytical processing, analyzer selection, postanalytical storage, and data management. Selecting clinical chemistry laboratory automation is a complex, time-consuming process. Laboratories considering laboratory automation may benefit from the concise overview and narrative and tabular suggestions provided.

  12. Access control for on-demand provisioned cloud infrastructure services

    NARCIS (Netherlands)

    Ngo, C.T.

    2016-01-01

    The evolution of Cloud Computing brings advantages to both customers and service providers to utilize and manage computing and network resources more efficiently with virtualization, service-oriented architecture technologies, and automated on-demand resource provisioning. However, these advantages

  13. Automated meteorological data from commercial aircraft via satellite: Present experience and future implications

    Science.gov (United States)

    Steinberg, R.

    1978-01-01

    A low-cost communications system to provide meteorological data from commercial aircraft, in near real-time, on a fully automated basis has been developed. The complete system, including the low profile antenna and all installation hardware, weighs 34 kg. The prototype system was installed on a B-747 aircraft and provided meteorological data (wind angle and velocity, temperature, altitude and position as a function of time) on a fully automated basis. The results were exceptional. This concept is expected to have important implications for operational meteorology and airline route forecasting.

  14. The Set-Up and Implementation of Fully Virtualized Lessons with an Automated Workflow Utilizing VMC/Moodle at the Medical University of Graz

    Directory of Open Access Journals (Sweden)

    Herwig Erich Rehatschek

    2011-12-01

    Full Text Available With the start of the winter semester 2010/11, the Medical University of Graz (MUG) successfully introduced a new primary learning management system (LMS), Moodle. Moodle currently serves more than 4,300 students from three studies and holds more than 7,500 unique learning objects. At the beginning of the summer semester 2010 we decided to start a pilot with Moodle and 430 students. For the pilot we migrated the learning content of one module and two optional subjects to Moodle. The evaluation results were extremely promising: more than 92% of the students immediately wanted Moodle, and Moodle met our high expectations in terms of performance and scalability. Within this paper we describe how we defined and set up a scalable and highly available platform for hosting Moodle and extended it with functionality for fully automated virtual lessons. We report our experiences and give valuable clues for universities and institutions that want to introduce Moodle in the near future.

  15. Automated optical assembly

    Science.gov (United States)

    Bala, John L.

    1995-08-01

    Automation and polymer science represent fundamental new technologies which can be directed toward realizing the goal of establishing a domestic, world-class, commercial optics business. Use of innovative optical designs using precision polymer optics will enable the US to play a vital role in the next generation of commercial optical products. The cost savings inherent in the utilization of optical-grade polymers outweigh almost every advantage of using glass in high-volume situations. Optical designers must gain experience with combined refractive/diffractive designs and broaden their knowledge base regarding polymer technology beyond a cursory intellectual exercise. Implementation of a fully automated assembly system, combined with utilization of polymer optics, constitutes the type of integrated manufacturing process which will enable the US to successfully compete with the low-cost labor employed in the Far East, as well as to produce an equivalent product.

  16. The Impact of Automation Reliability and Operator Fatigue on Performance and Reliance

    Science.gov (United States)

    2016-09-23

    Cummings et al., 2007). Automation designed to assist operators in overload situations may promote operator disengagement during periods of low...Calhoun et al., 2011). This testbed offers several tasks designed to emulate the cognitive demands that an operator managing multiple UAVs is likely...reliable (Cronbach’s α = 0.94) measure of affective and cognitive components of trust in automation. Items gauge confidence in an automation and

  17. Integration of drinking water treatment plant process models and emulated process automation software

    NARCIS (Netherlands)

    Worm, G.I.M.

    2012-01-01

    The objective of this research is to limit the risks of fully automated operation of drinking water treatment plants and to improve their operation by using an integrated system of process models and emulated process automation software. This thesis contains the design of such an integrated system.

  18. Simulation-based Strategies for Smart Demand Response

    Directory of Open Access Journals (Sweden)

    Ines Leobner

    2018-03-01

    Full Text Available Demand Response can be seen as one effective way to harmonize demand and supply in order to achieve high self-coverage of energy consumption by means of renewable energy sources. This paper presents two different simulation-based concepts for integrating demand-response strategies into energy management systems in the customer domain of the Smart Grid. The first approach is a Model Predictive Control of the heating and cooling system of a low-energy office building. The second concept aims at industrial Demand Side Management by integrating energy use optimization into industrial automation systems. Both approaches are targeted at day-ahead planning. Furthermore, insights gained into the implications of the concepts for the design of the model, simulation, and optimization will be discussed. While both approaches share a similar architecture, different modelling and simulation approaches were required by the use cases.
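
    As a deliberately simplified sketch of what day-ahead planning of a flexible heating load against a price signal can look like (the tariff, comfort band, and one-line thermal model below are invented and do not come from the paper; PuLP is used as the solver), consider:

    import pulp

    T = 24
    price = [0.10] * 8 + [0.25] * 12 + [0.10] * 4    # hypothetical hourly tariff, EUR/kWh
    a, b, gain = 0.9, 0.5, 1.0                       # toy room model: decay, heating effect, ambient gains

    prob = pulp.LpProblem("day_ahead_heating", pulp.LpMinimize)
    heat = [pulp.LpVariable(f"q{t}", lowBound=0, upBound=5) for t in range(T)]         # heating power, kW
    temp = [pulp.LpVariable(f"T{t}", lowBound=19, upBound=24) for t in range(T + 1)]   # comfort band, deg C

    prob += pulp.lpSum(price[t] * heat[t] for t in range(T))   # minimize day-ahead energy cost
    prob += temp[0] == 21
    for t in range(T):
        # next temperature = decayed current temperature + heating effect + ambient gains
        prob += temp[t + 1] == a * temp[t] + b * heat[t] + gain

    prob.solve()
    schedule = [q.value() for q in heat]   # hourly heating plan for the next day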

  19. Implementation and development of an automated, ultra-high-capacity, acoustic, flexible dispensing platform for assay-ready plate delivery.

    Science.gov (United States)

    Griffith, Dylan; Northwood, Roger; Owen, Paul; Simkiss, Ellen; Brierley, Andrew; Cross, Kevin; Slaney, Andrew; Davis, Miranda; Bath, Colin

    2012-10-01

    Compound management faces the daily challenge of providing high-quality samples to drug discovery. The advent of new screening technologies has seen demand for liquid samples move toward nanoliter ranges, dispensed by contactless acoustic droplet ejection. Within AstraZeneca, a totally integrated assay-ready plate production platform has been created to fully exploit the advantages of this technology. This enables compound management to efficiently deliver large throughputs demanded by high-throughput screening while maintaining regular delivery of smaller numbers of compounds in varying plate formats for cellular or biochemical concentration-response curves in support of hit and lead optimization (structure-activity relationship screening). The automation solution, CODA, has the capability to deliver compounds on demand for single- and multiple-concentration ranges, in batch sizes ranging from 1 sample to 2 million samples, integrating seamlessly into local compound and test management systems. The software handles compound orders intelligently, grouping test requests together dependent on output plate type and serial dilution ranges so that source compound vessels are shared among numerous tests, ensuring conservation of sample, reduced labware and costs, and efficiency of work cell logistics. We describe the development of CODA to address the customer demand, challenges experienced, learning made, and subsequent enhancements.

  20. Evaluating a method for automated rigid registration

    DEFF Research Database (Denmark)

    Darkner, Sune; Vester-Christensen, Martin; Larsen, Rasmus

    2007-01-01

    We evaluate a novel method for fully automated rigid registration of 2D manifolds in 3D space based on distance maps, the Gibbs sampler and Iterated Conditional Modes (ICM). The method is tested against the ICP, considered the gold standard for automated rigid registration. The two methods are compared in terms of point-to-point distance; t-tests for a common mean are used to determine the performance of the two methods (supported by a Wilcoxon signed rank test). The performance influence of sampling density, sampling quantity, and norms is analyzed using a similar method.
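
    For context, the kind of point-to-point error referred to above can be computed as a nearest-neighbour distance between the registered and target point sets; the generic SciPy version below is an illustration and not the paper's exact evaluation protocol:

    import numpy as np
    from scipy.spatial import cKDTree

    def mean_point_to_point(moving, fixed):
        """Mean nearest-neighbour distance from each registered (moving) point to the fixed shape."""
        distances, _ = cKDTree(fixed).query(moving)
        return distances.mean()

    fixed = np.random.rand(1000, 3)                           # stand-in target surface samples
    moving = fixed + np.random.normal(0, 0.01, fixed.shape)   # stand-in registration result
    print(mean_point_to_point(moving, fixed))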

  1. Demand-Driven Success: Designing Your PDA Experiment

    OpenAIRE

    Hillen, Charles; Johnson-Grau, Glenn

    2012-01-01

    Initiating demand-driven acquisition is daunting. Implications for developing a sustainable budget model, choosing a vendor, controlling metadata, monitoring purchases and developing invoice workflows are significant areas of concern that require careful planning. From mid-February through August 2011, Loyola Marymount University conducted a pilot using demand-driven acquisition; the result of this successful experiment was the library’s decision to fully integrate this purchasing mode...

  2. FULLY AUTOMATED GIS-BASED INDIVIDUAL TREE CROWN DELINEATION BASED ON CURVATURE VALUES FROM A LIDAR DERIVED CANOPY HEIGHT MODEL IN A CONIFEROUS PLANTATION

    Directory of Open Access Journals (Sweden)

    R. J. L. Argamosa

    2016-06-01

    Full Text Available The generation of a high resolution canopy height model (CHM) from LiDAR makes it possible to delineate individual tree crowns by means of a fully-automated method using the CHM’s curvature through its slope. The local maxima are obtained by taking the maximum raster value in a 3 m x 3 m cell. These values are assumed to be tree tops and are therefore considered individual trees. Based on these assumptions, Thiessen polygons were generated to serve as buffers for the canopy extent. The negative profile curvature is then measured from the slope of the CHM. The results show that the aggregated points from a negative profile curvature raster provide the most realistic crown shape. The absence of field data regarding tree crown dimensions required accurate visual assessment after the delineated tree crown polygons were superimposed on the hill-shaded CHM.
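
    As an illustration of the tree-top detection step only (the curvature and Thiessen-polygon stages are not shown; the raster values and the 2 m height threshold below are invented), local maxima of a CHM in a 3 x 3 window can be extracted as follows:

    import numpy as np
    from scipy.ndimage import maximum_filter

    # Stand-in 1 m resolution canopy height model in metres; a real CHM would come from LiDAR.
    chm = np.random.rand(200, 200) * 30

    # A cell is treated as a tree top if it equals the maximum of its 3 x 3 neighbourhood
    # and is tall enough not to be ground or understorey (the threshold is an assumption).
    is_top = (chm == maximum_filter(chm, size=3)) & (chm > 2.0)
    rows, cols = np.nonzero(is_top)
    tree_tops = list(zip(rows.tolist(), cols.tolist()))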

  3. Automated driving and its effects on the safety ecosystem: How do compatibility issues affect the transition period?

    NARCIS (Netherlands)

    van Loon, R.J.; Martens, Marieke Hendrikje

    2015-01-01

    Different components of automated vehicles are being made available commercially as we speak. Much research has been conducted into these components and many of these have been studied with respect to their effects on safety, but the transition period from non-automated driving to fully automated

  4. Automated driving and its effect on the safety ecosystem: how do compatibility issues affect the transition period?

    NARCIS (Netherlands)

    Loon, R.J. van; Martens, M.H.

    2015-01-01

    Different components of automated vehicles are being made available commercially as we speak. Much research has been conducted into these components and many of these have been studied with respect to their effects on safety, but the transition period from non-automated driving to fully automated

  5. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    International Nuclear Information System (INIS)

    Brown, Aaron W.; Simone, Paul S.; York, J.C.; Emmert, Gary L.

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5% with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the concentration ranges were detectable, 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature

  6. A device for fully automated on-site process monitoring and control of trihalomethane concentrations in drinking water

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Aaron W. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Simone, Paul S. [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States); York, J.C. [City of Lebanon, TN Water Treatment Plant, 7 Gilmore Hill Rd., Lebanon, TN 37087 (United States); Emmert, Gary L., E-mail: gemmert@memphis.edu [The University of Memphis, Department of Chemistry, Memphis, TN 38152 (United States); Foundation Instruments, Inc., Collierville, TN 38017 (United States)

    2015-01-01

    Highlights: • Commercial device for on-line monitoring of trihalomethanes in drinking water. • Method detection limits for individual trihalomethanes range from 0.01–0.04 μg L⁻¹. • Rugged and robust device operates automatically for on-site process control. • Used for process mapping and process optimization to reduce treatment costs. • Hourly measurements of trihalomethanes made continuously for ten months. - Abstract: An instrument designed for fully automated on-line monitoring of trihalomethane concentrations in chlorinated drinking water is presented. The patented capillary membrane sampling device automatically samples directly from a water tap followed by injection of the sample into a gas chromatograph equipped with a nickel-63 electron capture detector. Detailed studies using individual trihalomethane species exhibited method detection limits ranging from 0.01–0.04 μg L⁻¹. Mean percent recoveries ranged from 77.1 to 86.5% with percent relative standard deviation values ranging from 1.2 to 4.6%. Out of more than 5200 samples analyzed, 95% of the concentration ranges were detectable, 86.5% were quantifiable. The failure rate was less than 2%. Using the data from the instrument, two different treatment processes were optimized so that total trihalomethane concentrations were maintained at acceptable levels while reducing treatment costs significantly. This ongoing trihalomethane monitoring program has been operating for more than ten months and has produced the longest continuous and most finely time-resolved data on trihalomethane concentrations reported in the literature.

  7. Automated leak localization performance without detailed demand distribution data

    NARCIS (Netherlands)

    Moors, Janneke; Scholten, L.; van der Hoek, J.P.; den Besten, J.

    2018-01-01

    Automatic leak localization has been suggested to reduce the time and personnel efforts needed to localize (small) leaks. Yet, the available methods require a detailed demand distribution model for successful calibration and good leak localization performance. The main aim of this work was

  8. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin; Zhang, Fa; Gao, Xin

    2017-01-01

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner. In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers. The C/C++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the
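
    The authors' implementation is in the linked repository; simply to illustrate the general idea of summarizing noisy marker detections on a micrograph with a Gaussian mixture (the coordinates and component count below are invented, and this is not the paper's algorithm), one could write:

    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Stand-in (x, y) detections of fiducial markers on one micrograph, in pixels.
    rng = np.random.default_rng(0)
    coords = rng.uniform(0, 2048, size=(300, 2))

    gmm = GaussianMixture(n_components=20, covariance_type="full", random_state=0).fit(coords)
    labels = gmm.predict(coords)   # mixture component per detection
    centres = gmm.means_           # candidate marker centres for subsequent correspondence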

  9. A fast fiducial marker tracking model for fully automatic alignment in electron tomography

    KAUST Repository

    Han, Renmin

    2017-10-20

    Automatic alignment, especially fiducial marker-based alignment, has become increasingly important due to the high demand of subtomogram averaging and the rapid development of large-field electron microscopy. Among the alignment steps, fiducial marker tracking is a crucial one that determines the quality of the final alignment. Yet, it is still a challenging problem to track the fiducial markers accurately and effectively in a fully automatic manner. In this paper, we propose a robust and efficient scheme for fiducial marker tracking. Firstly, we theoretically prove the upper bound of the transformation deviation of aligning the positions of fiducial markers on two micrographs by affine transformation. Secondly, we design an automatic algorithm based on the Gaussian mixture model to accelerate the procedure of fiducial marker tracking. Thirdly, we propose a divide-and-conquer strategy against lens distortions to ensure the reliability of our scheme. To our knowledge, this is the first attempt that theoretically relates the projection model with the tracking model. The real-world experimental results further support our theoretical bound and demonstrate the effectiveness of our algorithm. This work facilitates the fully automatic tracking for datasets with a massive number of fiducial markers. The C/C++ source code that implements the fast fiducial marker tracking is available at https://github.com/icthrm/gmm-marker-tracking. Markerauto 1.6 version or later (also integrated in the AuTom platform at http://ear.ict.ac.cn/) offers a complete implementation for fast alignment, in which fast fiducial marker tracking is available by the

  10. Toward fully automated genotyping: Allele assignment, pedigree construction, phase determination, and recombination detection in Duchenne muscular dystrophy

    Energy Technology Data Exchange (ETDEWEB)

    Perlin, M.W.; Burks, M.B. [Carnegie Mellon Univ., Pittsburgh, PA (United States); Hoop, R.C.; Hoffman, E.P. [Univ. of Pittsburgh School of Medicine, PA (United States)

    1994-10-01

    Human genetic maps have made quantum leaps in the past few years, because of the characterization of >2,000 CA dinucleotide repeat loci: these PCR-based markers offer extraordinarily high PIC, and within the next year their density is expected to reach intervals of a few centimorgans per marker. These new genetic maps open new avenues for disease gene research, including large-scale genotyping for both simple and complex disease loci. However, the allele patterns of many dinucleotide repeat loci can be complex and difficult to interpret, with genotyping errors a recognized problem. Furthermore, the possibility of genotyping individuals at hundreds or thousands of polymorphic loci requires improvements in data handling and analysis. The automation of genotyping and analysis of computer-derived haplotypes would remove many of the barriers preventing optimal use of dense and informative dinucleotide genetic maps. Toward this end, we have automated the allele identification, genotyping, phase determinations, and inheritance consistency checks generated by four CA repeats within the 2.5-Mbp, 10-cM X-linked dystrophin gene, using fluorescein-labeled multiplexed PCR products analyzed on automated sequencers. The described algorithms can deconvolute and resolve closely spaced alleles, despite interfering stutter noise; set phase in females; propagate the phase through the family; and identify recombination events. We show the implementation of these algorithms for the completely automated interpretation of allele data and risk assessment for five Duchenne/Becker muscular dystrophy families. The described approach can be scaled up to perform genome-based analyses with hundreds or thousands of CA-repeat loci, using multiple fluorophors on automated sequencers. 16 refs., 5 figs., 1 tab.

  11. A LabVIEW®-based software for the control of the AUTORAD platform. A fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis

    International Nuclear Information System (INIS)

    Barbesi, Donato; Vilas, Victor Vicente; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Heras, Laura Aldave de las

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device that triggers multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste. (author)

  12. A LabVIEW®-based software for the control of the AUTORAD platform: a fully automated multisequential flow injection analysis Lab-on-Valve (MSFIA-LOV) system for radiochemical analysis.

    Science.gov (United States)

    Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura

    2017-01-01

    A LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD performing radiochemical analysis is described. The analytical platform interfaces an Arduino®-based device that triggers multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AUTORAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
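
    Both versions of this record describe LabVIEW® software exchanging commands and confirmation/error responses with an Arduino®-based trigger device over USB/RS232. The sketch below shows the same command/acknowledge pattern in Python with pyserial; the port name, baud rate, and command strings are hypothetical and are not those of the AUTORAD platform:

      import serial  # pyserial

      PORT = "/dev/ttyUSB0"   # hypothetical port of the Arduino-based trigger unit
      BAUD = 9600             # hypothetical baud rate

      def send_command(ser, command):
          """Send one ASCII command and return the device's one-line reply."""
          ser.reset_input_buffer()
          ser.write((command + "\n").encode("ascii"))
          reply = ser.readline().decode("ascii", errors="replace").strip()
          if not reply:
              raise TimeoutError(f"no response to {command!r}")
          if reply.startswith("ERR"):
              raise RuntimeError(f"device reported an error for {command!r}: {reply}")
          return reply

      if __name__ == "__main__":
          with serial.Serial(PORT, BAUD, timeout=2) as ser:
              print(send_command(ser, "TRIG:DET1 ON"))   # hypothetical command set
              print(send_command(ser, "STATUS?"))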

  13. Automated spoof-detection for fingerprints using optical coherence tomography

    CSIR Research Space (South Africa)

    Darlow, LN

    2016-05-01

    Full Text Available that they are highly separable, resulting in 100% accuracy regarding spoof-detection, with no false rejections of real fingers. This is the first attempt at fully automated spoof-detection using OCT....

  14. Reactor pressure vessel stud management automation strategies

    International Nuclear Information System (INIS)

    Biach, W.L.; Hill, R.; Hung, K.

    1992-01-01

    The adoption of hydraulic tensioner technology as the standard for bolting and unbolting the reactor pressure vessel (RPV) head 35 yr ago represented an incredible commitment to new technology, but the existing technology was so primitive as to be clearly unacceptable. Today, a variety of approaches for improvement make the decision more difficult. Automation in existing installations must meet complex physical, logistic, and financial parameters while addressing the demands of reduced exposure, reduced critical path, and extended plant life. There are two generic approaches to providing automated RPV stud engagement and disengagement: the multiple stud tensioner and automated individual tools. A variation of the latter would include the handling system. Each has its benefits and liabilities

  15. Fully printable, strain-engineered electronic wrap for customizable soft electronics.

    Science.gov (United States)

    Byun, Junghwan; Lee, Byeongmoon; Oh, Eunho; Kim, Hyunjong; Kim, Sangwoo; Lee, Seunghwan; Hong, Yongtaek

    2017-03-24

    Rapid growth of stretchable electronics stimulates broad uses in multidisciplinary fields as well as industrial applications. However, existing technologies are unsuitable for implementing versatile applications involving adaptable system design and functions in a cost/time-effective way because of vacuum-conditioned, lithographically-predefined processes. Here, we present a methodology for a fully printable, strain-engineered electronic wrap as a universal strategy which makes it more feasible to implement various stretchable electronic systems with customizable layouts and functions. The key aspects involve inkjet-printed rigid island (PRI)-based stretchable platform technology and corresponding printing-based automated electronic functionalization methodology, the combination of which provides fully printed, customized layouts of stretchable electronic systems with simplified process. Specifically, well-controlled contact line pinning effect of printed polymer solution enables the formation of PRIs with tunable thickness; and surface strain analysis on those PRIs leads to the optimized stability and device-to-island fill factor of strain-engineered electronic wraps. Moreover, core techniques of image-based automated pinpointing, surface-mountable device based electronic functionalizing, and one-step interconnection networking of PRIs enable customized circuit design and adaptable functionalities. To exhibit the universality of our approach, multiple types of practical applications ranging from self-computable digital logics to display and sensor system are demonstrated on skin in a customized form.

  16. Fully printable, strain-engineered electronic wrap for customizable soft electronics

    Science.gov (United States)

    Byun, Junghwan; Lee, Byeongmoon; Oh, Eunho; Kim, Hyunjong; Kim, Sangwoo; Lee, Seunghwan; Hong, Yongtaek

    2017-03-01

    Rapid growth of stretchable electronics stimulates broad uses in multidisciplinary fields as well as industrial applications. However, existing technologies are unsuitable for implementing versatile applications involving adaptable system design and functions in a cost/time-effective way because of vacuum-conditioned, lithographically-predefined processes. Here, we present a methodology for a fully printable, strain-engineered electronic wrap as a universal strategy which makes it more feasible to implement various stretchable electronic systems with customizable layouts and functions. The key aspects involve inkjet-printed rigid island (PRI)-based stretchable platform technology and corresponding printing-based automated electronic functionalization methodology, the combination of which provides fully printed, customized layouts of stretchable electronic systems with simplified process. Specifically, well-controlled contact line pinning effect of printed polymer solution enables the formation of PRIs with tunable thickness; and surface strain analysis on those PRIs leads to the optimized stability and device-to-island fill factor of strain-engineered electronic wraps. Moreover, core techniques of image-based automated pinpointing, surface-mountable device based electronic functionalizing, and one-step interconnection networking of PRIs enable customized circuit design and adaptable functionalities. To exhibit the universality of our approach, multiple types of practical applications ranging from self-computable digital logics to display and sensor system are demonstrated on skin in a customized form.

  17. Synthesis of tracers using automated radiochemistry and robotics

    International Nuclear Information System (INIS)

    Dannals, R.F.

    1992-07-01

    Synthesis of high specific activity radiotracers labeled with short-lived positron-emitting radionuclides for positron emission tomography (PET) often requires handling large initial quantities of radioactivity. High specific activities are required when preparing tracers for use in PET studies of neuroreceptors. A fully automated approach for tracer synthesis is highly desirable. This proposal involves the development of a system for the Synthesis of Tracers using Automated Radiochemistry and Robotics (STARR) for this purpose. While the long range objective of the proposed research is the development of a totally automated radiochemistry system for the production of major high specific activity 11C-radiotracers for use in PET, the specific short range objectives are the automation of 11C-methyl iodide (11CH3I) production via an integrated approach using both radiochemistry modular labstations and robotics, and the extension of this automated capability to the production of several radiotracers for PET (initially, 11C-methionine, 3-N-[11C-methyl]spiperone, and [11C]-carfentanil)

  18. Fully automated synthesis of the M{sub 1} receptor agonist [{sup 11}C]GSK1034702 for clinical use on an Eckert and Ziegler Modular Lab system

    Energy Technology Data Exchange (ETDEWEB)

    Huiban, Mickael, E-mail: Mickael.x.huiban@gsk.com [GlaxoSmithKline, Clinical Imaging Centre, Imperial College London, Hammersmith Hospital, Du Cane Road, London, W12 0NN (United Kingdom); Pampols-Maso, Sabina; Passchier, Jan [GlaxoSmithKline, Clinical Imaging Centre, Imperial College London, Hammersmith Hospital, Du Cane Road, London, W12 0NN (United Kingdom)

    2011-10-15

    A fully automated and GMP-compatible synthesis has been developed to reliably label the M{sub 1} receptor agonist GSK1034702 with carbon-11. Stille reaction of the trimethylstannyl precursor with [{sup 11}C]methyl iodide afforded [{sup 11}C]GSK1034702 in an estimated 10{+-}3% decay-corrected yield. This method utilises commercially available modular laboratory equipment and provides high-purity [{sup 11}C]GSK1034702 in a formulation suitable for human use. - Highlights: > Preparation of [{sup 11}C]GSK1034702 through a Stille cross-coupling reaction. > Demonstration of the applicability of commercially available modules for the synthesis of non-standard PET tracers. > Definition of a specification for heavy-metal content in the final dose product. > Presentation of results from validation of the manufacturing process.

  19. Fully automated reconstruction of three-dimensional vascular tree structures from two orthogonal views using computational algorithms and production rules

    Science.gov (United States)

    Liu, Iching; Sun, Ying

    1992-10-01

    A system for reconstructing 3-D vascular structure from two orthogonally projected images is presented. The formidable problem of matching segments between two views is solved using knowledge of the epipolar constraint and the similarity of segment geometry and connectivity. The knowledge is represented in a rule-based system, which also controls the operation of several computational algorithms for tracking segments in each image, representing 2-D segments with directed graphs, and reconstructing 3-D segments from matching 2-D segment pairs. Uncertain reasoning governs the interaction between segmentation and matching; it also provides a framework for resolving the matching ambiguities in an iterative way. The system was implemented in the C language and the C Language Integrated Production System (CLIPS) expert system shell. Using video images of a tree model, the standard deviation of reconstructed centerlines was estimated to be 0.8 mm (1.7 mm) when the view direction was parallel (perpendicular) to the epipolar plane. Feasibility of clinical use was shown using x-ray angiograms of a human chest phantom. The correspondence of vessel segments between two views was accurate. Computational time for the entire reconstruction process was under 30 s on a workstation. A fully automated system for two-view reconstruction that does not require the a priori knowledge of vascular anatomy is demonstrated.
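
    For orthogonal parallel projections, the epipolar constraint reduces to matched points sharing (approximately) one coordinate. As an illustrative sketch only (assuming, hypothetically, that the frontal view projects onto the x-z plane and the lateral view onto the y-z plane), the snippet below rejects matches that violate the common-z constraint and reconstructs the 3-D point otherwise:

      import numpy as np

      def reconstruct_orthogonal(frontal_pt, lateral_pt, tol=2.0):
          """Reconstruct a 3-D point from two orthogonal parallel projections.

          frontal_pt = (x, z) from the frontal view (projection onto the x-z plane),
          lateral_pt = (y, z) from the lateral view (projection onto the y-z plane).
          The shared z coordinate plays the role of the epipolar constraint:
          candidate matches whose z values differ by more than tol are rejected.
          """
          (x, z1), (y, z2) = frontal_pt, lateral_pt
          if abs(z1 - z2) > tol:
              raise ValueError("points violate the epipolar (common-z) constraint")
          return np.array([x, y, 0.5 * (z1 + z2)])  # average out digitization noise

      if __name__ == "__main__":
          print(reconstruct_orthogonal((10.0, 42.3), (-3.5, 41.9)))  # -> [10.  -3.5  42.1]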

  20. Advanced automation for in-space vehicle processing

    Science.gov (United States)

    Sklar, Michael; Wegerif, D.

    1990-01-01

    The primary objective of this 3-year planned study is to assure that the fully evolved Space Station Freedom (SSF) can support automated processing of exploratory mission vehicles. Current study assessments show that the extravehicular activity (EVA) and, to some extent, intravehicular activity (IVA) manpower requirements for the required processing tasks far exceed the available manpower. Furthermore, many processing tasks are either hazardous operations or exceed EVA capability. Thus, automation is essential for SSF transportation node functionality. Here, advanced automation represents the replacement of human-performed tasks beyond the planned baseline automated tasks. Both physical tasks such as manipulation, assembly and actuation, and cognitive tasks such as visual inspection, monitoring and diagnosis, and task planning are considered. During this first year of activity, both the Phobos/Gateway Mars Expedition and Lunar Evolution missions proposed by the Office of Exploration have been evaluated. A methodology for choosing optimal tasks to be automated has been developed. Processing tasks for both missions have been ranked on the basis of automation potential. The underlying concept in evaluating and describing processing tasks has been the use of a common set of 'primitive' task descriptions. Primitive or standard tasks have been developed both for manual or crew processing and for automated machine processing.

  1. Quantification of 31 illicit and medicinal drugs and metabolites in whole blood by fully automated solid-phase extraction and ultra-performance liquid chromatography-tandem mass spectrometry.

    Science.gov (United States)

    Bjørk, Marie Kjærgaard; Simonsen, Kirsten Wiese; Andersen, David Wederkinck; Dalsgaard, Petur Weihe; Sigurðardóttir, Stella Rögn; Linnet, Kristian; Rasmussen, Brian Schou

    2013-03-01

    An efficient method for analyzing illegal and medicinal drugs in whole blood using fully automated sample preparation and a short ultra-high-performance liquid chromatography-tandem mass spectrometry (MS/MS) run time is presented. A selection of 31 drugs, including amphetamines, cocaine, opioids, and benzodiazepines, was used. In order to increase the efficiency of routine analysis, a robotic system based on automated liquid handling and capable of handling all unit operations for sample preparation was built on a Freedom Evo 200 platform with several add-ons from Tecan and third-party vendors. Solid-phase extraction was performed using Strata X-C plates. Extraction time for 96 samples was less than 3 h. Chromatography was performed using an ACQUITY UPLC system (Waters Corporation, Milford, USA). Analytes were separated on a 100 mm × 2.1 mm, 1.7 μm Acquity UPLC CSH C(18) column using a 6.5 min 0.1 % ammonia (25 %) in water/0.1 % ammonia (25 %) in methanol gradient and quantified by MS/MS (Waters Quattro Premier XE) in multiple-reaction monitoring mode. Full validation, including linearity, precision and trueness, matrix effect, ion suppression/enhancement of co-eluting analytes, recovery, and specificity, was performed. The method was employed successfully in the laboratory and used for routine analysis of forensic material. In combination with tetrahydrocannabinol analysis, the method covered 96 % of cases involving driving under the influence of drugs. The manual labor involved in preparing blood samples, solvents, etc., was reduced to half an hour per batch. The automated sample preparation setup also minimized human exposure to hazardous materials, provided highly improved ergonomics, and eliminated manual pipetting.
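
    Quantification in multiple-reaction monitoring mode ultimately rests on a calibration curve relating response (for example, the peak-area ratio to an internal standard) to concentration. The sketch below is a generic linear-calibration example with invented numbers; it does not reproduce the validated parameters of the published method:

      import numpy as np

      # Hypothetical calibrators: concentration (ng/mL) vs. peak-area ratio to the IS.
      conc = np.array([5, 10, 25, 50, 100, 250], dtype=float)
      ratio = np.array([0.11, 0.21, 0.52, 1.05, 2.08, 5.20])

      slope, intercept = np.polyfit(conc, ratio, 1)
      r2 = np.corrcoef(conc, ratio)[0, 1] ** 2   # crude linearity check

      def quantify(sample_ratio):
          """Back-calculate the concentration of an unknown from its area ratio."""
          return (sample_ratio - intercept) / slope

      if __name__ == "__main__":
          print(f"slope={slope:.4f}, intercept={intercept:.4f}, R^2={r2:.4f}")
          print(f"unknown at ratio 0.83 -> {quantify(0.83):.1f} ng/mL")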

  2. An automated coil winding machine for the SSC dipole magnets

    International Nuclear Information System (INIS)

    Kamiya, S.; Iwase, T.; Inoue, I.; Fukui, I.; Ishida, K.; Kashiwagi, S.; Sato, Y.; Yoshihara, T.; Yamamoto, S.; Johnson, E.; Gibson, C.

    1990-01-01

    The authors have finished the preliminary design of a fully automated coil winding machine that can be used to manufacture the large number of SSC dipole magnets. The machine aims to perform all coil winding operations, including coil part insertion, without human operators and at a high production rate. The machine is composed of five industrial robots. In order to verify the design, they built a small winding machine using an industrial robot and successfully wound a 1 meter long coil using SSC dipole magnet wire. The basic design for the full length coil and the robot winding technique are described in this paper. A fully automated coil winding machine using standard industrial components would be very useful if duplicate production lines are used. 5 figs., 1 tab

  3. Demonstration of automated price response in large customers in New York City using Auto-DR and OpenADR

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joyce Jihyun [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Schetrit, Oren [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yin, Rongxin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kiliccote, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-05-01

    Demand response (DR) – allowing customers to respond to reliability requests and market prices by changing electricity use from their normal consumption pattern – continues to be seen as an attractive means of demand-side management and a fundamental smart-grid improvement that links supply and demand. From October 2011 to December 2013, the Demand Response Research Center at Lawrence Berkeley National Laboratory, the New York State Energy Research and Development Authority, and partners Honeywell and Akuacom conducted a demonstration project enabling Automated Demand Response (Auto-DR) in large commercial buildings located in New York City using Open Automated Demand Response (OpenADR) communication protocols. In particular, this project focuses on demonstrating how the OpenADR platform, enabled by Akuacom, can automate and simplify interactions between buildings and various stakeholders in New York State and enable the automation of customers’ price response to yield bill savings under dynamic pricing. This paper presents the cost control opportunities under day-ahead hourly pricing and Auto-DR control strategies for four demonstration buildings; presents the breakdown of Auto-DR enablement costs; summarizes the field test results and their load impact; and shows potential bill savings from enabling automated price response under Consolidated Edison’s Mandatory Hourly Pricing (MHP) tariff. For one of the sites, the potential bill savings at the site’s current retail rate are shown. Facility managers were given granular equipment-level opt-out capability to ensure full control of the sites during the Auto-DR implementation. The expected bill savings ranged from 1.1% to 8.0% of the total MHP bill. The automation and enablement costs ranged from $70 to $725 per kW shed. The results show that OpenADR can facilitate the automation of price response, deliver savings to the customers, and that the opt-out capability of the implementation retains control of the
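
    The core of automated price response is a pre-programmed rule that maps day-ahead hourly prices to shed actions. As a toy illustration of that logic only (the prices, threshold, and setpoint offset are invented, and no OpenADR messaging is shown), the sketch below scans a day-ahead price vector and schedules a global temperature adjustment for the expensive hours:

      # Hypothetical day-ahead hourly prices ($/kWh) for one day.
      day_ahead_prices = [0.08, 0.07, 0.07, 0.07, 0.08, 0.09, 0.11, 0.13,
                          0.15, 0.18, 0.22, 0.28, 0.35, 0.41, 0.44, 0.38,
                          0.30, 0.24, 0.18, 0.14, 0.12, 0.10, 0.09, 0.08]

      PRICE_THRESHOLD = 0.25   # $/kWh above which a shed strategy is dispatched
      SETPOINT_OFFSET = 2.0    # deg F global temperature adjustment during shed hours

      def build_shed_schedule(prices, threshold=PRICE_THRESHOLD):
          """Return a list of (hour, action) events for the building control system."""
          schedule = []
          for hour, price in enumerate(prices):
              if price >= threshold:
                  schedule.append((hour, f"raise cooling setpoint by {SETPOINT_OFFSET} F"))
              else:
                  schedule.append((hour, "normal operation"))
          return schedule

      if __name__ == "__main__":
          for hour, action in build_shed_schedule(day_ahead_prices):
              if action != "normal operation":
                  print(f"{hour:02d}:00  {action}")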

  4. Energy-conscious building automation. Phase 1 - pilot study. Main report; Energirigtig bygningsautomation. Fase 1 - Forundersoegelse. Hovedrapport

    Energy Technology Data Exchange (ETDEWEB)

    Hummelshoej, R.M.; Kaarup Olsen, P. (COWI A/S, Kgs. Lyngby (Denmark)); Brohus, H. (Aalborg Univ. (AAU), Inst. for Byggeri og Anlaeg, Aalborg (Denmark)); Olesen, Bjarne W. (Danmarks Tekniske Univ., DTU Byg. Institut for Byggeri og Anlaeg, ICIEE, Kgs. Lyngby (Denmark)); Bang Skjoedt, A.; Giliamsen, P. (TAC A/S, Herlev (Denmark))

    2009-09-15

    Building automation is a significant and sometimes overlooked element of low-energy buildings: it ensures large energy savings through, e.g., demand control and optimised operation of the ventilation, cooling and heating systems etc., both in new buildings and in the modernisation of existing buildings. In building design, including installation, automation makes up a large part of the total construction costs; a rough figure is 300-500 DKK/m{sup 2}. It is estimated that energy-efficient building automation can reduce buildings' energy demand by about 15 kWh/m{sup 2} (electricity and heating) on average. Based on this possible reduction of building energy demand, the project deals with optimisation of the control/adjustment of building installations for lighting, heating, cooling and ventilation systems. A key parameter investigated is how much the room temperature can be allowed to 'glide' (change) during occupied hours in a heavy and a light office building, and how this influences the energy demand and the perceived indoor climate. The project shows that it is possible to save about 15 kWh/m{sup 2} primary energy by letting the temperature 'glide' +2.5 deg. C during the daily occupied hours instead of +1 deg. C. With this operation philosophy, the thermo-active capacity of the building constructions can be utilised better, and there will be fewer requirements for automation and operation of the climate system. The project also assesses how large a zone area can be covered by typical sensors while maintaining a satisfactory indoor climate, and the corresponding energy demand. The electricity consumption for building automation may account for up to 6 kWh/m{sup 2} per year. The project has established a preliminary basis for good practice regarding 1) automation of buildings with a view to a good indoor climate and a low energy demand and 2) the metering necessary to document energy savings and energy

  5. CMS on the GRID: Toward a fully distributed computing architecture

    International Nuclear Information System (INIS)

    Innocente, Vincenzo

    2003-01-01

    The computing systems required to collect, analyse and store the physics data at LHC would need to be distributed and global in scope. CMS is actively involved in several grid-related projects to develop and deploy a fully distributed computing architecture. We present here recent developments of tools for automating job submission and for serving data to remote analysis stations. Plans for further test and deployment of a production grid are also described

  6. Automation trust and attention allocation in multitasking workspace.

    Science.gov (United States)

    Karpinsky, Nicole D; Chancey, Eric T; Palmer, Dakota B; Yamani, Yusuke

    2018-07-01

    Previous research suggests that operators with high workload can distrust and then poorly monitor automation, which has been generally inferred from automation dependence behaviors. To test automation monitoring more directly, the current study measured operators' visual attention allocation, workload, and trust toward imperfect automation in a dynamic multitasking environment. Participants concurrently performed a manual tracking task with two levels of difficulty and a system monitoring task assisted by an unreliable signaling system. Eye movement data indicate that operators allocate less visual attention to monitor automation when the tracking task is more difficult. Participants reported reduced levels of trust toward the signaling system when the tracking task demanded more focused visual attention. Analyses revealed that trust mediated the relationship between the load of the tracking task and attention allocation in Experiment 1, an effect that was not replicated in Experiment 2. Results imply a complex process underlying task load, visual attention allocation, and automation trust during multitasking. Automation designers should consider operators' task load in multitasking workspaces to avoid reduced automation monitoring and distrust toward imperfect signaling systems. Copyright © 2018. Published by Elsevier Ltd.
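
    The reported finding that trust mediated the relationship between tracking-task load and attention allocation implies a standard mediation analysis. The sketch below runs Baron-Kenny style regressions on synthetic data; the variable names, effect sizes, and data are invented, and the original study's statistical details may differ:

      import numpy as np

      def ols(y, *predictors):
          """Ordinary least squares; returns coefficients [intercept, predictors...]."""
          X = np.column_stack([np.ones(len(y)), *predictors])
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return beta

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          n = 200
          load = rng.integers(0, 2, n).astype(float)           # 0 = easy, 1 = hard tracking task
          trust = 5.0 - 1.2 * load + rng.normal(0, 1, n)       # higher load lowers trust
          attention = 0.3 + 0.5 * trust + rng.normal(0, 1, n)  # trust raises monitoring

          a = ols(trust, load)[1]                   # path a: load -> trust
          b = ols(attention, load, trust)[2]        # path b: trust -> attention (load controlled)
          c = ols(attention, load)[1]               # total effect: load -> attention
          c_prime = ols(attention, load, trust)[1]  # direct effect with the mediator included
          print(f"a={a:.2f}, b={b:.2f}, total c={c:.2f}, direct c'={c_prime:.2f}, indirect a*b={a*b:.2f}")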

  7. MISENS DEVICE AS A NEW AUTOMATED BIOSENSING PLATFORM BASED ON REAL-TIME ELECTROCHEMICAL PROFILING (REP)

    Directory of Open Access Journals (Sweden)

    Yıldız Uludağ

    2016-09-01

    Full Text Available In various fields like health, environmental control, food security and military defense, there is an increasing demand for on-site detection, fast identification and urgent response, which brings the necessity to employ laboratory detection procedures on standalone automatic devices. In response to that, TUBITAK BILGEM’s Bioelectronic Devices and Systems Group has been developing portable and fully automated biosensor devices using optical and electrochemical biosensor detection techniques. Here we describe a new integrated and fully automated lab-on-a-chip based biosensor device, ‘MiSens’. The key features of the MiSens include a new electrode array, an integrated microfluidic system and real-time amperometric measurements during the flow of enzyme substrate. While simple protocols can be controlled from the LCD display on the device, other main device control procedures can be run wirelessly from a tablet/PC using the MiCont™ software developed by the team. For the device, a new plug-and-play sensor chip docking station has been designed that, in one move, forms a ~7-10 µl flow cell on the electrode array with the necessary microfluidic and electronic connections. The MiSens device has been developed by our multi-disciplinary team by integrating and automating the previously developed sensing platform REP™ (Real-time Electrochemical Profiling). The performance of the MiSens device has been tested using cyclic voltammetry and amperometry, and the results were compared with those of an off-the-shelf potentiostat.

  8. Simulation Model of Automated Peat Briquetting Press Drive

    Directory of Open Access Journals (Sweden)

    A. A. Marozka

    2012-01-01

    Full Text Available The paper presents a fully functional simulation model developed for an automated peat briquetting press drive. The model makes it possible to reduce financial and time costs in the development, design and operation of a double-stamp peat briquetting press drive.

  9. An Intelligent Automation Platform for Rapid Bioprocess Design.

    Science.gov (United States)

    Wu, Tianyi; Zhou, Yuhong

    2014-08-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user's inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. © 2013 Society for Laboratory Automation and Screening.
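
    The platform's key contribution is closing the loop from data analysis to experiment design to an automatically generated script. The toy sketch below mimics that loop for a one-factor precipitation screen; the yield model, the design rule, and the liquid-handler 'script' format are all invented for illustration and are not the authors' implementation:

      import numpy as np

      def analyze(results):
          """Fit a quadratic yield model to (condition, yield) pairs from the last round."""
          x, y = zip(*results)
          return np.polyfit(x, y, 2)

      def design_next(model, candidates):
          """Pick the untested condition with the highest predicted yield."""
          return max(candidates, key=lambda c: np.polyval(model, c))

      def generate_script(condition):
          """Emit a minimal liquid-handler 'script' (plain text, hypothetical commands)."""
          return "\n".join([
              "ASPIRATE reagent_A 100 uL",
              f"ASPIRATE salt_stock {condition:.0f} uL",
              "DISPENSE well_next",
              "INCUBATE 30 min",
              "MEASURE turbidity",
          ])

      if __name__ == "__main__":
          measured = [(20, 0.31), (60, 0.74), (100, 0.88), (140, 0.69)]  # prior round
          model = analyze(measured)
          next_condition = design_next(model, candidates=[40, 80, 120, 160])
          print(generate_script(next_condition))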

  10. Modelling UK energy demand to 2000

    International Nuclear Information System (INIS)

    Thomas, S.D.

    1980-01-01

    A recent long-term demand forecast for the UK was made by Cheshire and Surrey (SPRU Occasional Paper Series No. 5, Science Policy Research Unit, Univ. of Sussex, 1978). Although they adopted a sectoral approach, their study leaves some questions unanswered. Do they succeed in their aim of making all their assumptions fully explicit? How sensitive are their estimates to changes in assumptions and policies? Are important problems and 'turning points' fully identified in the period up to and immediately beyond their time horizon of 2000? The author addresses these questions by using a computer model based on the study by Cheshire and Surrey. This article is a shortened version of the report, S.D. Thomas, 'Modelling UK Energy Demand to 2000', Operational Research, Univ. of Sussex, Brighton, UK, 1979, in which full details of the author's model are given. Copies are available from the author. (author)

  11. Modelling UK energy demand to 2000

    Energy Technology Data Exchange (ETDEWEB)

    Thomas, S D [Sussex Univ., Brighton (UK)

    1980-03-01

    A recent long-term demand forecast for the UK was made by Cheshire and Surrey (SPRU Occasional Paper Series No. 5, Science Policy Research Unit, Univ. of Sussex, 1978). Although they adopted a sectoral approach, their study leaves some questions unanswered. Do they succeed in their aim of making all their assumptions fully explicit? How sensitive are their estimates to changes in assumptions and policies? Are important problems and 'turning points' fully identified in the period up to and immediately beyond their time horizon of 2000? The author addresses these questions by using a computer model based on the study by Cheshire and Surrey. This article is a shortened version of the report, S.D. Thomas, 'Modelling UK Energy Demand to 2000', Operational Research, Univ. of Sussex, Brighton, UK, 1979, in which full details of the author's model are given. Copies are available from the author.

  12. Responsiveness of residential electricity demand to dynamic tariffs : experiences from a large field test in the Netherlands

    NARCIS (Netherlands)

    Klaassen, E.A.M.; Kobus, C.B.A.; Frunt, J.; Slootweg, J.G.

    2016-01-01

    To efficiently facilitate the energy transition it is essential to evaluate the potential of demand response in practice. Based on the results of a Dutch smart grid pilot, this paper assesses the potential of both manual and semi-automated demand response in residential areas. To stimulate demand

  13. Responsiveness of residential electricity demand to dynamic tariffs : Experiences from a large field test in the Netherlands

    NARCIS (Netherlands)

    Klaassen, EAM; Kobus, C.B.A.; Frunt, J; Slootweg, JG

    2016-01-01

    To efficiently facilitate the energy transition it is essential to evaluate the potential of demand response in practice. Based on the results of a Dutch smart grid pilot, this paper assesses the potential of both manual and semi-automated demand response in residential areas. To stimulate demand

  14. Introducing a demand-based electricity distribution tariff in the residential sector: Demand response and customer perception

    International Nuclear Information System (INIS)

    Bartusch, Cajsa; Wallin, Fredrik; Odlare, Monica; Vassileva, Iana; Wester, Lars

    2011-01-01

    Increased demand response is essential to fully exploit the Swedish power system, which in turn is an absolute prerequisite for meeting political goals related to energy efficiency and climate change. Demand response programs are, nonetheless, still exceptional in the residential sector of the Swedish electricity market, one contributory factor being lack of knowledge about the extent of the potential gains. In light of these circumstances, this empirical study set out with the intention of estimating the scope of households' response to, and assessing customers' perception of, a demand-based time-of-use electricity distribution tariff. The results show that households as a whole have a fairly high opinion of the demand-based tariff and act on its intrinsic price signals by decreasing peak demand in peak periods and shifting electricity use from peak to off-peak periods. - Highlights: → Households are sympathetic to demand-based tariffs, seeing as they relate to environmental issues. → Households adjust their electricity use to the price signals of demand-based tariffs. → Demand-based tariffs lead to a shift in electricity use from peak to off-peak hours. → Demand-based tariffs lead to a decrease in maximum demand in peak periods. → Magnitude of these effects increases over time.
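
    A demand-based time-of-use distribution tariff charges for the household's highest demand during peak periods rather than only for energy. The sketch below computes such a demand charge from hourly load data and shows how load shifting reduces it; the peak-hour window, the price, and the load profiles are hypothetical, not the tariff studied:

      # Hypothetical demand-based tariff: the monthly charge is based on the mean of
      # the three highest hourly demands (kW) occurring in the peak-hour window.
      PEAK_HOURS = range(7, 20)   # 07:00-19:59, hypothetical peak window
      DEMAND_PRICE = 40.0         # currency units per kW of chargeable peak demand

      def demand_charge(hourly_load, hours_of_day):
          """hourly_load: kW values; hours_of_day: matching hour of day (0-23) per value."""
          peak_loads = sorted((kw for kw, h in zip(hourly_load, hours_of_day)
                               if h in PEAK_HOURS), reverse=True)
          if not peak_loads:
              return 0.0
          chargeable = sum(peak_loads[:3]) / min(3, len(peak_loads))
          return chargeable * DEMAND_PRICE

      if __name__ == "__main__":
          hours = list(range(24))
          before = [0.4]*6 + [1.2, 2.5, 1.8, 1.0, 0.9, 0.9, 1.0, 1.1, 1.2, 1.5, 2.8, 3.4,
                    2.9, 1.6, 1.2, 0.9, 0.6, 0.5]
          after = before[:]                 # same household after shifting appliance use
          after[17], after[21] = 2.0, 1.9   # laundry moved from 17:00 (peak) to 21:00 (off-peak)
          print("charge before shifting:", round(demand_charge(before, hours), 1))
          print("charge after shifting: ", round(demand_charge(after, hours), 1))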

  15. A SURVEY OF AUTOMATION TECHNIQUES COMING FORTH IN SHEET-FED OFFSET PRINTING ORGANIZATIONS

    OpenAIRE

    Mr. Ramesh Kumar*, Mr. Bijender & Mr. Sandeep Boora

    2017-01-01

    Sheet-fed offset is one of the premier printing processes in India as well as abroad. To cope with customers' large-quantity demands, automation has become mandatory. From prepress to post-press, a wide range of automation techniques exists and is coming forth for sheet-fed offset presses. The objective of this paper is to shed light on various sheet-fed offset automation techniques existing today and their futuristic implications. The data related to automation were collected with the help of a survey conducte...

  16. Demand Response and Energy Storage Integration Study

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Ookie; Cheung, Kerry; Olsen, Daniel J.; Matson, Nance; Sohn, Michael D.; Rose, Cody M.; Dudley, Junqiao Han; Goli, Sasank; Kiliccote, Sila; Cappers, Peter; MacDonald, Jason; Denholm, Paul; Hummon, Marissa; Jorgenson, Jennie; Palchak, David; Starke, Michael; Alkadi, Nasr; Bhatnagar, Dhruv; Currier, Aileen; Hernandez, Jaci; Kirby, Brendan; O' Malley, Mark

    2016-03-01

    Demand response and energy storage resources present potentially important sources of bulk power system services that can aid in integrating variable renewable generation. While renewable integration studies have evaluated many of the challenges associated with deploying large amounts of variable wind and solar generation technologies, integration analyses have not yet fully incorporated demand response and energy storage resources. This report represents an initial effort in analyzing the potential integration value of demand response and energy storage, focusing on the western United States. It evaluates two major aspects of increased deployment of demand response and energy storage: (1) Their operational value in providing bulk power system services and (2) Market and regulatory issues, including potential barriers to deployment.

  17. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M.; Kaerkkaeinen, S. [VTT Energy, Espoo (Finland); Partanen, J. [Lappeenranta Univ. of Technology (Finland)

    1996-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded
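
    A DA feasibility analysis of this kind weighs the annual benefits of automation functions against the annualized investment. The sketch below is a deliberately simplified benefit/cost comparison with invented numbers (outage-cost savings plus deferred peak capacity versus an annuitized investment); it is not the computer model described in the record:

      def annuity_factor(rate, years):
          """Convert a present investment into an equivalent annual cost."""
          return rate / (1.0 - (1.0 + rate) ** -years)

      def da_feasibility(investment, rate=0.06, years=15,
                         outage_kwh_avoided=12_000, outage_cost_per_kwh=4.0,
                         peak_kw_deferred=800, capacity_value_per_kw=25.0):
          """Return (annual benefit, annual cost, benefit/cost ratio); all inputs hypothetical."""
          annual_benefit = (outage_kwh_avoided * outage_cost_per_kwh
                            + peak_kw_deferred * capacity_value_per_kw)
          annual_cost = investment * annuity_factor(rate, years)
          return annual_benefit, annual_cost, annual_benefit / annual_cost

      if __name__ == "__main__":
          benefit, cost, ratio = da_feasibility(investment=450_000)
          print(f"annual benefit {benefit:,.0f}, annual cost {cost:,.0f}, B/C = {ratio:.2f}")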

  18. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1998-08-01

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  19. A scheme for a future distribution automation system in Finnish utilities

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M; Kaerkkaeinen, S [VTT Energy, Espoo (Finland); Partanen, J [Lappeenranta Univ. of Technology (Finland)

    1997-12-31

    This presentation summarizes the results of a project, the aim of which was to define the optimal set of functions for the future distribution automation (DA) systems in Finland. The general factors, which affect the automation needs, are first discussed. The benefits of various functions of DA and demand side management (DSM) are then studied. Next a computer model for a DA feasibility analysis is presented, and some computation results are given. From these, the proposed automation scheme is finally concluded

  20. Automated pipe handling systems for new and retrofit applications in shallow drilling markets

    Energy Technology Data Exchange (ETDEWEB)

    McDougall, P.; Fikowski, L.M. [Blackbird Well Servicing Inc., Calgary, AB (Canada)

    2003-07-01

    This presentation discussed the importance of the human interface as the main element in the development of automated mechanical systems on drilling rigs. Improvements in drilling rig designs are meant to improve manpower efficiencies and performance. The goal for Blackbird Well Servicing is to design automated and integrated processes that can be controlled manually at any point during an operation. Although some drilling operations can be fully automated and fully integrated, certain steps in the process are intentionally left open-ended for human intervention. It was concluded that consistency of performance is the most significant feature of integrated systems and that all drilling contractors should strive for smooth, steady performance rather than brute labour. Speed and efficiency increase with consistent performance. Reliability results in better performance, thereby lowering operating costs and bringing more work for drilling contractors.

  1. A Domain-specific Framework for Automated Construction and Verification of Railway Control Systems

    DEFF Research Database (Denmark)

    Haxthausen, Anne Elisabeth

    2009-01-01

    ... elaborate safety mechanisms in order to keep the risk at the same low level that has been established for European railways until today. The challenge is further increased by the demand for shorter time-to-market periods and higher competition among suppliers of the railway domain; both factors resulting in a demand for a higher degree of automation for the development, verification, validation and test phases of projects, without impairing the thoroughness of safety-related quality measures and certification activities. Motivated by these considerations, this presentation describes an approach for automated ...

  2. Architecture Views Illustrating the Service Automation Aspect of SOA

    Science.gov (United States)

    Gu, Qing; Cuadrado, Félix; Lago, Patricia; Dueñas, Juan C.

    Earlier in this book, Chapter 8 provided a detailed analysis of service engineering, including a review of service engineering techniques and methodologies. This chapter is closely related to Chapter 8, as it shows how such approaches can be used to develop a service, with particular emphasis on the identification of three views (the automation decision view, the degree of service automation view and the service automation related data view) that structure and ease the elicitation and documentation of stakeholders' concerns. This is carried out through two large case studies to learn about the industrial needs in illustrating service deployment and configuration automation. This set of views adds to more traditional notations like UML the visual power of attracting the attention of their users to the addressed concerns, and assists them in their work. This is especially crucial in service-oriented architecting, where service automation is highly demanded.

  3. On-Demand Mobility (ODM) Technical Pathway: Enabling Ease of Use and Safety

    Science.gov (United States)

    Goodrich, Ken; Moore, Mark

    2015-01-01

    On-demand mobility (ODM) through aviation refers to the ability to quickly and easily move people or equivalent cargo without delays introduced by a lack of, or infrequently scheduled, service. A necessary attribute of ODM is that it be easy to use, requiring a minimum of special training, skills, or workload. Fully autonomous vehicles would provide the ultimate in ease-of-use (EU) but are currently unproven for safety-critical applications outside of a few, situationally constrained applications (e.g., automated trains operating in segregated systems). Applied to aviation, the current and near-future state of the art of full autonomy may entail undesirable trade-offs such as very conservative operational margins resulting in reduced trip reliability and transportation utility. Furthermore, acceptance by potential users and regulatory authorities will be challenging without confidence in autonomous systems developed in less critical, but still challenging, applications. A question for the aviation community is how we can best develop practical ease-of-use for aircraft that are sized to carry a small number of passengers (e.g., 1-9) or equivalent cargo. Such development is unlikely to be a single event, but rather a managed, evolutionary process in which responsibility and authority transition from human to automation agents as operational experience is gained with increasingly intelligent systems. This talk presents a technology roadmap being developed at NASA Langley, as part of an overall strategy to foster ODM, for the development of ease-of-use for ODM aviation.

  4. Distribution automation at BC Hydro : a case study

    Energy Technology Data Exchange (ETDEWEB)

    Siew, C. [BC Hydro, Vancouver, BC (Canada). Smart Grid Development Program

    2009-07-01

    This presentation discussed a distribution automation study conducted by BC Hydro to determine methods of improving grid performance by supporting intelligent transmission and distribution systems. The utility's smart grid program includes a number of utility-side and customer-side applications, including enabled demand response, microgrid, and operational efficiency applications. The smart grid program will improve reliability and power quality by 40 per cent, improve conservation and energy efficiency throughout the province, and provide enhanced customer service. Programs and initiatives currently underway at the utility include distribution management, smart metering, distribution automation, and substation automation programs. The utility's automation functionality will include fault interruption and locating, restoration capability, and restoration success. A decision support system has also been established to assist control room and field operating personnel with monitoring and control of the electric distribution system. Protection, control and monitoring (PCM) and volt VAR optimization upgrades are also planned. Reclosers are also being automated, and an automation guide has been developed for switches. tabs., figs.

  5. AUTOMATED PROCESS MONITORING: APPLYING PROVEN AUTOMATION TECHNIQUES TO INTERNATIONAL SAFEGUARDS NEEDS

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Durst, Philip C.; Grate, Jay W.; Devol, Timothy A.; Egorov, Oleg; Clements, John P.

    2008-01-01

    Identification and quantification of specific alpha- and beta-emitting radionuclides in complex liquid matrices is highly challenging, and is typically accomplished through laborious wet chemical sample preparation and separations followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, gas proportional counting, alpha energy analysis, mass spectrometry). Analytical results may take days or weeks to report. Chains of custody and sample security measures may also complicate or slow the analytical process. When an industrial process-scale plant requires the monitoring of specific radionuclides as an indication of the composition of its feed stream or of plant performance, radiochemical measurements must be fast, accurate, and reliable. Scientists at Pacific Northwest National Laboratory have assembled a fully automated prototype Process Monitor instrument capable of a variety of tasks: automated sampling directly from a feed stream, sample digestion/analyte redox adjustment, chemical separations, radiochemical detection and data analysis/reporting. The system is compact, its components are fluidically inter-linked, and analytical results could be immediately transmitted to on- or off-site locations. The development of a rapid radiochemical Process Monitor for 99Tc in Hanford tank waste processing streams, capable of performing several measurements per hour, will be discussed in detail. More recently, the automated platform was modified to perform measurements of 90Sr in Hanford tank waste simulant. The system exemplifies how automation could be integrated into reprocessing facilities to support international nuclear safeguards needs

  6. Automation of cDNA Synthesis and Labelling Improves Reproducibility

    Directory of Open Access Journals (Sweden)

    Daniel Klevebring

    2009-01-01

    Full Text Available Background. Several technologies, such as in-depth sequencing and microarrays, enable large-scale interrogation of genomes and transcriptomes. In this study, we assess reproducibility and throughput by moving all laboratory procedures to a robotic workstation capable of handling superparamagnetic beads. Here, we describe a fully automated procedure for cDNA synthesis and labelling for microarrays, where the purification steps prior to and after labelling are based on precipitation of DNA on carboxylic acid-coated paramagnetic beads. Results. The fully automated procedure allows samples arrayed on a microtiter plate to be processed in parallel without manual intervention, ensuring high reproducibility. We compare our results to a manual sample preparation procedure and, in addition, use a comprehensive reference dataset to show that the protocol described performs better than similar manual procedures. Conclusions. We demonstrate, in an automated gene expression microarray experiment, a reduced variance between replicates, resulting in an increase in the statistical power to detect differentially expressed genes, thus allowing smaller differences between samples to be identified. This protocol can, with minor modifications, be used to create cDNA libraries for other applications such as in-depth analysis using next-generation sequencing technologies.

  7. A fully automated primary screening system for the discovery of therapeutic antibodies directly from B cells.

    Science.gov (United States)

    Tickle, Simon; Howells, Louise; O'Dowd, Victoria; Starkie, Dale; Whale, Kevin; Saunders, Mark; Lee, David; Lightwood, Daniel

    2015-04-01

    For a therapeutic antibody to succeed, it must meet a range of potency, stability, and specificity criteria. Many of these characteristics are conferred by the amino acid sequence of the heavy and light chain variable regions and, for this reason, can be screened for during antibody selection. However, it is important to consider that antibodies satisfying all these criteria may be of low frequency in an immunized animal; for this reason, it is essential to have a mechanism that allows for efficient sampling of the immune repertoire. UCB's core antibody discovery platform combines high-throughput B cell culture screening and the identification and isolation of single, antigen-specific IgG-secreting B cells through a proprietary technique called the "fluorescent foci" method. Using state-of-the-art automation to facilitate primary screening, extremely efficient interrogation of the natural antibody repertoire is made possible; more than 1 billion immune B cells can now be screened to provide a useful starting point from which to identify the rare therapeutic antibody. This article will describe the design, construction, and commissioning of a bespoke automated screening platform and two examples of how it was used to screen for antibodies against two targets. © 2014 Society for Laboratory Automation and Screening.

  8. The bright side of snow cover effects on PV production - How to lower the seasonal mismatch between electricity supply and demand in a fully renewable Switzerland

    Science.gov (United States)

    Kahl, Annelen; Dujardin, Jérôme; Dupuis, Sonia; Lehning, Michael

    2017-04-01

    One of the major problems with solar PV in the context of a fully renewable electricity production at mid-latitudes is the trend of higher production in summer and lower production in winter. This trend is most often exactly opposite to demand patterns, causing a seasonal mismatch that requires extensive balancing power from other production sources or large storage capacities. What possibilities do we have to bring PV production into closer correlation with demand? This question motivated our research, and in response we investigated the effects of placing PV panels at different tilt angles in regions with extensive snow cover to increase winter production from ground-reflected shortwave radiation. The aim of this project is therefore to quantify the effect of varying snow cover duration (SCD) and of panel tilt angle on the annual total production and on production during winter months when electricity is most needed. We chose Switzerland as an ideal test site because it has a wide range of snow cover conditions and a high potential for renewable electricity production, but the methods can be applied to other regions with comparable snow cover and irradiance conditions. Our analysis can be separated into two steps: 1. A systematic, GIS- and satellite-based analysis for all of Switzerland: we use time series of satellite-derived irradiance and snow cover characteristics together with land surface cover types and elevation information to quantify the environmental conditions and to estimate potential production and ideal tilt angles. 2. A scenario-based analysis that contrasts the production patterns of different placement scenarios for PV panels in urban, rural and mountainous areas. We invoke a model of a fully renewable electricity system (including Switzerland's large hydropower system) at the national level to compute the electricity import and storage capacity that will be required to balance the remaining mismatch between production and demand to further illuminate
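
    The winter gain from steeply tilted panels over snow comes mainly from the ground-reflected component of plane-of-array irradiance. A minimal isotropic-sky sketch of that effect is shown below; the albedo and irradiance values are illustrative, the panel is assumed to face the sun's azimuth, and the study's satellite-based model is more detailed:

      import math

      def plane_of_array(ghi, dni, dhi, sun_elevation_deg, tilt_deg, albedo):
          """Isotropic-sky plane-of-array irradiance (W/m^2) for a panel whose azimuth
          matches the sun's (simplified geometry for illustration).

          ghi, dni, dhi: global horizontal, direct normal, diffuse horizontal irradiance.
          albedo: ground reflectance (~0.2 for snow-free ground, ~0.8 for fresh snow).
          """
          tilt = math.radians(tilt_deg)
          aoi = math.radians(90.0 - sun_elevation_deg) - tilt   # angle of incidence
          beam = max(dni * math.cos(aoi), 0.0)
          sky_diffuse = dhi * (1.0 + math.cos(tilt)) / 2.0
          ground_reflected = ghi * albedo * (1.0 - math.cos(tilt)) / 2.0
          return beam + sky_diffuse + ground_reflected

      if __name__ == "__main__":
          # Clear winter hour with a low sun: compare snow-free and snow-covered ground.
          for albedo, label in [(0.2, "snow-free"), (0.8, "fresh snow")]:
              poa = plane_of_array(ghi=300, dni=600, dhi=80,
                                   sun_elevation_deg=20, tilt_deg=65, albedo=albedo)
              print(f"{label:10s} albedo={albedo}: POA = {poa:.0f} W/m^2")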

  9. Automated Analysis of Security in Networking Systems

    DEFF Research Database (Denmark)

    Buchholtz, Mikael

    2004-01-01

    such networking systems are modelled in the process calculus LySa. On top of this programming language based formalism an analysis is developed, which relies on techniques from data and control flow analysis. These are techniques that can be fully automated, which makes them an ideal basis for tools targeted at non...

  10. Evaluation of an automated struvite reactor to recover phosphorus ...

    African Journals Online (AJOL)

    2015-04-03

    A reactor was developed that can run fully automated and recover up to 93% ... This technique will work best when the concentration of ...

  11. Automated meteorological data from commercial aircraft via satellite - Present experience and future implications

    Science.gov (United States)

    Steinberg, R.

    1978-01-01

    The National Aeronautics and Space Administration has developed a low-cost communications system to provide meteorological data from commercial aircraft, in near real-time, on a fully automated basis. The complete system including the low profile antenna and all installation hardware weighs 34 kg. The prototype system has been installed on a Pan American B-747 aircraft and has been providing meteorological data (wind angle and velocity, temperature, altitude and position as a function of time) on a fully automated basis for the past several months. The results have been exceptional. This concept is expected to have important implications for operational meteorology and airline route forecasting.

  12. An Intelligent Automation Platform for Rapid Bioprocess Design

    Science.gov (United States)

    Wu, Tianyi

    2014-01-01

    Bioprocess development is very labor intensive, requiring many experiments to characterize each unit operation in the process sequence to achieve product safety and process efficiency. Recent advances in microscale biochemical engineering have led to automated experimentation. A process design workflow is implemented sequentially in which (1) a liquid-handling system performs high-throughput wet lab experiments, (2) standalone analysis devices detect the data, and (3) specific software is used for data analysis and experiment design given the user’s inputs. We report an intelligent automation platform that integrates these three activities to enhance the efficiency of such a workflow. A multiagent intelligent architecture has been developed incorporating agent communication to perform the tasks automatically. The key contribution of this work is the automation of data analysis and experiment design and also the ability to generate scripts to run the experiments automatically, allowing the elimination of human involvement. A first-generation prototype has been established and demonstrated through lysozyme precipitation process design. All procedures in the case study have been fully automated through an intelligent automation platform. The realization of automated data analysis and experiment design, and automated script programming for experimental procedures has the potential to increase lab productivity. PMID:24088579

  13. Automated drawing generation system

    International Nuclear Information System (INIS)

    Yoshinaga, Toshiaki; Kawahata, Junichi; Yoshida, Naoto; Ono, Satoru

    1991-01-01

    Since automated CAD drawing generation systems still require human intervention, improvements were focussed on the interactive processing section (data input and correction operations), which necessitates a vast amount of work. As a result, human intervention was eliminated, which was the original objective of a computerized system. This is the first step taken towards complete automation. The effects of development and commercialization of the system are as described below. (1) The interactive processing time required for generating drawings was improved; it was determined that introduction of the CAD system has reduced the time required for generating drawings. (2) The difference in skills between workers preparing drawings has been eliminated and the quality of drawings has been made uniform. (3) The extent of knowledge and experience demanded of workers has been reduced. (author)

  14. A Method for Automated Planning of FTTH Access Network Infrastructures

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2005-01-01

    In this paper a method for automated planning of Fiber to the Home (FTTH) access networks is proposed. We introduce a systematic approach for planning access network infrastructure. GIS data and a set of algorithms are employed to make the planning process more automatic. The method explains ... method. The method, however, does not fully automate the planning, but it makes the planning process significantly faster. The results and discussion are presented and a conclusion is given in the end.
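
    Automated FTTH planning typically reduces to shortest-path and shortest-tree computations on a street (GIS) graph between the central office and the subscriber points. The sketch below shows only the shortest-path building block, a hand-rolled Dijkstra on an invented street graph; it is not the algorithm set used in the paper:

      import heapq

      def dijkstra(graph, source):
          """Shortest trench distances from source over a street graph.

          graph: dict node -> list of (neighbor, segment_length_m).
          Returns dict node -> distance in metres.
          """
          dist = {source: 0.0}
          heap = [(0.0, source)]
          while heap:
              d, node = heapq.heappop(heap)
              if d > dist.get(node, float("inf")):
                  continue                     # stale queue entry
              for nbr, length in graph.get(node, []):
                  nd = d + length
                  if nd < dist.get(nbr, float("inf")):
                      dist[nbr] = nd
                      heapq.heappush(heap, (nd, nbr))
          return dist

      if __name__ == "__main__":
          # Invented street segments: central office "CO", junctions J1/J2, houses H1/H2.
          streets = {
              "CO": [("J1", 120), ("J2", 200)],
              "J1": [("CO", 120), ("J2", 90), ("H1", 60)],
              "J2": [("CO", 200), ("J1", 90), ("H2", 45)],
              "H1": [("J1", 60)],
              "H2": [("J2", 45)],
          }
          print(dijkstra(streets, "CO"))   # e.g. H1 reachable with 180 m, H2 with 245 m of trench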

  15. Fully Automated Electro Membrane Extraction Autosampler for LC-MS Systems Allowing Soft Extractions for High-Throughput Applications

    DEFF Research Database (Denmark)

    Fuchs, David; Pedersen-Bjergaard, Stig; Jensen, Henrik

    2016-01-01

    was optimized for soft extraction of analytes and high sample throughput. Further, it was demonstrated that by flushing the EME-syringe with acidic wash buffer and reverting the applied electric potential, carry-over between samples can be reduced to below 1%. Performance of the system was characterized (RSD ...), and a complete analytical workflow of purification, separation, and analysis of a sample could be achieved within only 5.5 min. With the developed system, large sequences of samples could be analyzed in a completely automated manner. This high degree of automation makes the developed EME-autosampler a powerful tool...

  16. Fully Automated Segmentation of Fluid/Cyst Regions in Optical Coherence Tomography Images With Diabetic Macular Edema Using Neutrosophic Sets and Graph Algorithms.

    Science.gov (United States)

    Rashno, Abdolreza; Koozekanani, Dara D; Drayna, Paul M; Nazari, Behzad; Sadri, Saeed; Rabbani, Hossein; Parhi, Keshab K

    2018-05-01

    This paper presents a fully automated algorithm to segment fluid-associated (fluid-filled) and cyst regions in optical coherence tomography (OCT) retina images of subjects with diabetic macular edema. The OCT image is segmented using a novel neutrosophic transformation and a graph-based shortest path method. In the neutrosophic domain, an image is transformed into three sets: T (true), I (indeterminate), which represents noise, and F (false). This paper makes four key contributions. First, a new method is introduced to compute the indeterminacy set I, and a new correction operation is introduced to compute the set T in the neutrosophic domain. Second, a graph shortest-path method is applied in the neutrosophic domain to segment the inner limiting membrane and the retinal pigment epithelium as regions of interest (ROI), and the outer plexiform layer and inner segment myeloid as middle layers, using a novel definition of the edge weights. Third, a new cost function for cluster-based fluid/cyst segmentation in the ROI is presented, which also includes a novel approach for estimating the number of clusters in an automated manner. Fourth, the final fluid regions are obtained by ignoring very small regions and the regions between the middle layers. The proposed method is evaluated using two publicly available datasets, Duke and Optima, and a third, local dataset from the UMN clinic, which is available online. The proposed algorithm outperforms the previously proposed Duke algorithm by 8% with respect to the dice coefficient and by 5% with respect to precision on the Duke dataset, while achieving about the same sensitivity. Also, the proposed algorithm outperforms a prior method on the Optima dataset by 6%, 22%, and 23% with respect to the dice coefficient, sensitivity, and precision, respectively. Finally, the proposed algorithm achieves sensitivities of 67.3%, 88.8%, and 76.7% for the Duke, Optima, and University of Minnesota (UMN) datasets, respectively.
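
    As an illustration of the graph shortest-path ingredient used above for layer segmentation (not the authors' code, and without the neutrosophic transformation), the sketch below traces a minimum-cost boundary across a 2D cost image with a simple dynamic-programming shortest path; the cost image, the one-row smoothness constraint and all names are assumptions.

```python
# A minimal sketch (not the paper's implementation) of shortest-path layer
# segmentation as commonly used for OCT boundaries: find the minimum-cost path
# crossing the image from the left to the right column, where the cost image
# could be, e.g., a vertical gradient map.
import numpy as np

def shortest_path_boundary(cost):
    """Return one row index per column tracing a minimum-cost left-to-right path."""
    rows, cols = cost.shape
    acc = np.full((rows, cols), np.inf)
    back = np.zeros((rows, cols), dtype=int)
    acc[:, 0] = cost[:, 0]
    for c in range(1, cols):
        for r in range(rows):
            lo, hi = max(0, r - 1), min(rows, r + 2)   # allow a row change of at most 1
            prev = int(np.argmin(acc[lo:hi, c - 1])) + lo
            acc[r, c] = cost[r, c] + acc[prev, c - 1]
            back[r, c] = prev
    path = np.zeros(cols, dtype=int)
    path[-1] = int(np.argmin(acc[:, -1]))
    for c in range(cols - 1, 0, -1):                   # backtrack the optimal path
        path[c - 1] = back[path[c], c]
    return path

# Example: a synthetic cost image with a cheap horizontal band around row 12
img = np.ones((32, 64))
img[12, :] = 0.1
print(shortest_path_boundary(img)[:10])
```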

  17. Automated rapid chemistry in heavy element research

    International Nuclear Information System (INIS)

    Schaedel, M.

    1994-01-01

    With the increasingly short half-lives of the heavy element isotopes in the transition region from the heaviest actinides to the transactinide elements, the demand for automated rapid chemistry techniques is also increasing. Separation times of significantly less than one minute, high chemical yields, high repetition rates, and an adequate detection system are prerequisites for many successful experiments in this field. The development of techniques for separations in the gas phase and in the aqueous phase, applied to chemical or nuclear studies of the heaviest elements, is briefly outlined. Typical examples of results obtained with automated techniques are presented for studies up to element 105, especially those obtained with the Automated Rapid Chemistry Apparatus, ARCA. The prospects of investigating the properties of even heavier elements with chemical techniques are discussed

  18. Automated sampling and data processing derived from biomimetic membranes

    DEFF Research Database (Denmark)

    Perry, Mark; Vissing, Thomas; Boesen, P.

    2009-01-01

    data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition...... applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet (TM)) for efficient data management......Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new...

  19. Automating the radiographic NDT process

    International Nuclear Information System (INIS)

    Aman, J.K.

    1988-01-01

    Automation, the removal of the human element in inspection, has not been generally applied to film radiographic NDT. The justification for automation is not only productivity but also reliability of results. Film remains in the automated system of the future because of its extremely high image content, approximately 3x10^9 bits per 14x17 inch film. This is equivalent to 2200 computer floppy disks. Parts handling systems and robotics, applied in manufacturing and in some NDT modalities, should now be applied to film radiographic NDT systems. Automatic film handling can be achieved with the daylight NDT film handling system. Automatic film processing is becoming the standard in industry and can be coupled to the daylight system. Robots offer the opportunity to automate the exposure step fully. Finally, computer-aided interpretation appears on the horizon. A unit which laser-scans a 14x17 inch film in 6-8 seconds can digitize the film information for further manipulation and possible automatic interrogation (computer-aided interpretation). The system, called FDRS (film digital radiography system), is moving toward 50 micron (16 lines/mm) resolution. This is believed to meet the majority of image content needs. (Author). 4 refs.; 21 figs

  20. Automated structure solution, density modification and model building.

    Science.gov (United States)

    Terwilliger, Thomas C

    2002-11-01

    The approaches that form the basis of automated structure solution in SOLVE and RESOLVE are described. The use of a scoring scheme to convert decision making in macromolecular structure solution to an optimization problem has proven very useful and in many cases a single clear heavy-atom solution can be obtained and used for phasing. Statistical density modification is well suited to an automated approach to structure solution because the method is relatively insensitive to choices of numbers of cycles and solvent content. The detection of non-crystallographic symmetry (NCS) in heavy-atom sites and checking of potential NCS operations against the electron-density map has proven to be a reliable method for identification of NCS in most cases. Automated model building beginning with an FFT-based search for helices and sheets has been successful in automated model building for maps with resolutions as low as 3 Å. The entire process can be carried out in a fully automatic fashion in many cases.

  1. Multiagent-Based Flexible Automation of Microproduction Systems Including Mobile Transport Robots

    OpenAIRE

    Voos, Holger; Wangmanaopituk, Suparchoek

    2013-01-01

    In microproduction, i.e. in the production and assembly of micro-scale components and products, fully automated systems hardly exist so far. Besides the requirements of handling small parts with extreme precision, small batch sizes of highly customized products are among the main challenges. Therefore, economic microproduction requires very flexible production systems with a high level of automation. This contribution proposes a new concept of such a system that provides two main innova...

  2. Automated procedure for performing computer security risk analysis

    International Nuclear Information System (INIS)

    Smith, S.T.; Lim, J.J.

    1984-05-01

    Computers, the invisible backbone of nuclear safeguards, monitor and control plant operations and support many materials accounting systems. Our automated procedure to assess computer security effectiveness differs from traditional risk analysis methods. The system is modeled as an interactive questionnaire, fully automated on a portable microcomputer. A set of modular event trees links the questionnaire to the risk assessment. Qualitative scores are obtained for target vulnerability, and qualitative impact measures are evaluated for a spectrum of threat-target pairs. These are then combined by a linguistic algebra to provide an accurate and meaningful risk measure. 12 references, 7 figures
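
    The abstract does not specify the linguistic algebra, so the following is only a hypothetical illustration of how qualitative vulnerability and impact ratings for one threat-target pair might be combined into a qualitative risk level via a lookup table; the levels and the table itself are assumptions.

```python
# A minimal, hypothetical illustration of combining qualitative vulnerability and
# impact ratings into a qualitative risk level with a simple lookup table (the
# "linguistic algebra" in the abstract is not specified, so this table is assumed).
RISK_TABLE = {  # (vulnerability, impact) -> risk
    ("low", "low"): "low",        ("low", "medium"): "low",         ("low", "high"): "medium",
    ("medium", "low"): "low",     ("medium", "medium"): "medium",   ("medium", "high"): "high",
    ("high", "low"): "medium",    ("high", "medium"): "high",       ("high", "high"): "high",
}

def combine(vulnerability: str, impact: str) -> str:
    """Combine two qualitative ratings into a qualitative risk measure."""
    return RISK_TABLE[(vulnerability, impact)]

# One threat-target pair evaluated from questionnaire-derived ratings
print(combine("medium", "high"))   # -> high
```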

  3. Exploring the Use of a Test Automation Framework

    Science.gov (United States)

    Cervantes, Alex

    2009-01-01

    It is known that software testers, more often than not, lack the time needed to fully test the delivered software product within the time period allotted to them. When problems in the implementation phase of a development project occur, it normally causes the software delivery date to slide. As a result, testers either need to work longer hours, or supplementary resources need to be added to the test team in order to meet aggressive test deadlines. One solution to this problem is to provide testers with a test automation framework to facilitate the development of automated test solutions.

  4. Automated Test Requirement Document Generation

    Science.gov (United States)

    1987-11-01

    "Diagnostics Based on the Principles of Artificial Intelligence", 1984 International Test Conference, October 1984. ...In-Test Equipment (BITE) and AI (Artificial Intelligence) expert systems need to be fully applied before a completely automated process can be

  5. Will the future of knowledge work automation transform personalized medicine?

    OpenAIRE

    Gauri Naik; Sanika S. Bhide

    2014-01-01

    Today, we live in a world of "information overload" which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate "routine cognitive tasks" for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments, and that have the ability to learn "on the fly", is a significant step towards automation of knowledge work. The applications of this t...

  6. Effects of Automation Types on Air Traffic Controller Situation Awareness and Performance

    Science.gov (United States)

    Sethumadhavan, A.

    2009-01-01

    The Joint Planning and Development Office has proposed the introduction of automated systems to help air traffic controllers handle the increasing volume of air traffic in the next two decades (JPDO, 2007). Because fully automated systems leave operators out of the decision-making loop (e.g., Billings, 1991), it is important to determine the right level and type of automation that will keep air traffic controllers in the loop. This study examined the differences in the situation awareness (SA) and collision detection performance of individuals when they worked with information acquisition, information analysis, decision and action selection and action implementation automation to control air traffic (Parasuraman, Sheridan, & Wickens, 2000). When the automation was unreliable, the time taken to detect an upcoming collision was significantly longer for all the automation types compared with the information acquisition automation. This poor performance following automation failure was mediated by SA, with lower SA yielding poor performance. Thus, the costs associated with automation failure are greater when automation is applied to higher order stages of information processing. Results have practical implications for automation design and development of SA training programs.

  7. Automated visual fruit detection for harvest estimation and robotic harvesting

    OpenAIRE

    Puttemans, Steven; Vanbrabant, Yasmin; Tits, Laurent; Goedemé, Toon

    2016-01-01

    Fully automated detection and localisation of fruit in orchards is a key component in creating automated robotic harvesting systems, a dream of many farmers around the world to cope with large production and personnel costs. In recent years a lot of research on this topic has been performed, using basic computer vision techniques, like colour based segmentation, as a suggested solution. When not using standard RGB cameras, research tends to resort to other sensors, like hyper spectral or 3D. ...

  8. Preface to the special section on human factors and automation in vehicles: designing highly automated vehicles with the driver in mind.

    Science.gov (United States)

    Merat, Natasha; Lee, John D

    2012-10-01

    This special section brings together diverse research regarding driver interaction with advanced automotive technology to guide design of increasingly automated vehicles. Rapidly evolving vehicle automation will likely change cars and trucks more in the next 5 years than the preceding 50, radically redefining what it means to drive. This special section includes 10 articles from European and North American researchers reporting simulator and naturalistic driving studies. Little research has considered the consequences of fully automated driving, with most focusing on lane-keeping and speed control systems individually. The studies reveal two underlying design philosophies: automate driving versus support driving. Results of several studies, consistent with previous research in other domains, suggest that the automate philosophy can delay driver responses to incidents in which the driver has to intervene and take control from the automation. Understanding how to orchestrate the transfer or sharing of control between the system and the driver, particularly in critical incidents, emerges as a central challenge. Designers should not assume that automation can substitute seamlessly for a human driver, nor can they assume that the driver can safely accommodate the limitations of automation. Designers, policy makers, and researchers must give careful consideration to what role the person should have in highly automated vehicles and how to support the driver if the driver is to be responsible for vehicle control. As in other domains, driving safety increasingly depends on the combined performance of the human and automation, and successful designs will depend on recognizing and supporting the new roles of the driver.

  9. Development and transport implications of automated vehicles in the Netherlands: Scenarios for 2030 and 2050

    NARCIS (Netherlands)

    Milakis, D.; Snelder, M.; Arem, B. van; Wee, B. van; Almeida Correia, G.H. de

    2017-01-01

    Automated driving technology is emerging. Yet, little is known in the literature about when automated vehicles will reach the market, how penetration rates will evolve and to what extent this new transport technology will affect transport demand and planning. This study uses scenario analysis to

  10. Automated Motion Estimation for 2D Cine DENSE MRI

    Science.gov (United States)

    Gilliam, Andrew D.; Epstein, Frederick H.

    2013-01-01

    Cine displacement encoding with stimulated echoes (DENSE) is a magnetic resonance (MR) method that directly encodes tissue displacement into MR phase images. This technique has successfully interrogated many forms of tissue motion, but is most commonly used to evaluate cardiac mechanics. Currently, motion analysis from cine DENSE images requires manually delineated anatomical structures. An automated analysis would improve measurement throughput, simplify data interpretation, and potentially access important physiological information during the MR exam. In this article, we present the first fully automated solution for the estimation of tissue motion and strain from 2D cine DENSE data. Results using both simulated and human cardiac cine DENSE data indicate good agreement between the automated algorithm and the standard semi-manual analysis method. PMID:22575669

  11. Operating procedure automation to enhance safety of nuclear power plants

    International Nuclear Information System (INIS)

    Husseiny, A.A.; Sabri, Z.A.; Adams, S.K.; Rodriguez, R.J.; Packer, D.; Holmes, J.W.

    1989-01-01

    The use of logic statements and computer assistance is explored as a means for automating and improving the design of operating procedures, including those employed in abnormal and emergency situations. Operating procedures for downpower and loss of forced circulation are used for demonstration. Human-factors analysis is performed on generic emergency operating procedures for three strategies of control: manual, semi-automatic, and automatic, using standard emergency operating procedures. This preliminary analysis shows that automation of procedures is feasible provided that fault-tolerant software and hardware become available for the design of the controllers. Recommendations are provided for tests to substantiate the promised enhancement of plant safety. Adequate design of operating procedures through automation may alleviate several major operational problems of nuclear power plants. Also, automation of procedures is necessary for partial or overall automatic control of plants. Fully automatic operation is needed for space applications, while supervised automation of land-based and offshore plants may become the thrust of a new generation of nuclear power plants. (orig.)

  12. Analysis of an Automated Vehicle Routing Problem in Logistics considering Path Interruption

    Directory of Open Access Journals (Sweden)

    Yong Zhang

    2017-01-01

    Full Text Available The application of automated vehicles in logistics can efficiently reduce the cost of logistics and reduce the potential risks in the last mile. Considering the path restriction in the initial stage of the application of automated vehicles in logistics, the conventional model for a vehicle routing problem (VRP) is modified. Thus, the automated vehicle routing problem with time windows (AVRPTW) model considering path interruption is established. Additionally, an improved particle swarm optimisation (PSO) algorithm is designed to solve this problem. Finally, a case study is undertaken to test the validity of the model and the algorithm. Four automated vehicles are designated to execute all delivery tasks required by 25 stores. The capacities of all of the automated vehicles are almost fully utilised. Developing such research into real problems arising in this initial period is of considerable significance for the promotion of automated vehicles in last-mile situations.
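
    The sketch below shows only a generic particle swarm optimisation loop of the kind the paper improves upon; the AVRPTW route encoding, time windows and path-interruption handling are not reproduced, and the toy objective and parameter values are assumptions.

```python
# A minimal sketch of the particle swarm optimisation (PSO) loop only; the toy
# objective below stands in for a route-cost evaluation.
import numpy as np

rng = np.random.default_rng(0)

def objective(x):                       # stand-in for an AVRPTW route-cost evaluation
    return np.sum(x ** 2)

dim, n_particles, iters = 5, 20, 100
w, c1, c2 = 0.7, 1.5, 1.5               # inertia and acceleration coefficients (assumed)

pos = rng.uniform(-10, 10, (n_particles, dim))
vel = np.zeros((n_particles, dim))
pbest = pos.copy()
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[np.argmin(pbest_val)].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, dim))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[np.argmin(pbest_val)].copy()   # best route found so far

print("best cost:", objective(gbest))
```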

  13. Vibration-based Energy Harvesting Systems Characterization Using Automated Electronic Equipment

    Directory of Open Access Journals (Sweden)

    Ioannis KOSMADAKIS

    2015-04-01

    Full Text Available A measurement bench has been developed to fully automate the procedure for the characterization of a vibration-based energy scavenging system. The measurement system is capable of monitoring all important characteristics of a vibration harvesting system (input and output voltage, current, frequency and acceleration values, and other parameters). It is composed of a PC, typical digital measuring instruments (oscilloscope, waveform generator, etc.), certain sensors and actuators, along with a microcontroller-based automation module. The automation of the procedure and the manipulation of the acquired data are performed by LabVIEW software. Typical measurements of a system consisting of a vibrating source, a vibration transducer and an active rectifier are presented.

  14. Automated analysis of objective-prism spectra

    International Nuclear Information System (INIS)

    Hewett, P.C.; Irwin, M.J.; Bunclark, P.; Bridgeland, M.T.; Kibblewhite, E.J.; Smith, M.G.

    1985-01-01

    A fully automated system for the location, measurement and analysis of large numbers of low-resolution objective-prism spectra is described. The system is based on the APM facility at the University of Cambridge, and allows processing of objective-prism, grens or grism data. Particular emphasis is placed on techniques to obtain the maximum signal-to-noise ratio from the data, both in the initial spectral estimation procedure and for subsequent feature identification. Comparison of a high-quality visual catalogue of faint quasar candidates with an equivalent automated sample demonstrates the ability of the APM system to identify all the visually selected quasar candidates. In addition, a large population of new, faint (m_J ≈ 20) candidates is identified. (author)

  15. Automated, High Temperature Furnace for Glovebox Operation

    International Nuclear Information System (INIS)

    Neikirk, K.

    2001-01-01

    The U.S. Department of Energy will immobilize excess plutonium in the proposed Plutonium Immobilization Plant (PIP) at the Savannah River Site (SRS) as part of a two-track approach for the disposition of weapons-usable plutonium. As such, the Department of Energy is funding a development and testing effort for the PIP. This effort is being performed jointly by Lawrence Livermore National Laboratory (LLNL), Westinghouse Savannah River Company (WSRC), Pacific Northwest National Laboratory (PNNL), and Argonne National Laboratory (ANL). The Plutonium Immobilization process involves the disposition of excess plutonium by incorporation into ceramic pucks. As part of the immobilization process, furnaces are needed for sintering the ceramic pucks. The furnace being developed for puck sintering is an automated, bottom-loaded furnace with insulating package and resistance heating elements located within a nuclear glovebox. Other furnaces considered for the application include retort furnaces and pusher furnaces. This paper, in part, will discuss the furnace technologies considered and the furnace technology selected to support reliable puck sintering in a glovebox environment. Due to the radiation levels and contamination associated with the plutonium material, the sintering process will be fully automated and contained within nuclear material gloveboxes. As such, the furnace currently under development incorporates water and air cooling to minimize heat load to the glovebox. This paper will describe the furnace equipment and systems needed to employ a fully automated puck sintering process within nuclear gloveboxes as part of the Plutonium Immobilization Plant

  16. A Framework for the Automation of Air Defence Systems

    NARCIS (Netherlands)

    Choenni, R.S.; Leijnse, C.

    The need for more efficiency in military organizations is growing. It is expected that a significant increase in efficiency can be obtained by an integration of communication and information technology. This integration may result in (sub)systems that are fully automated, i.e., systems that are

  17. Development and transport implications of automated vehicles in the Netherlands : Scenarios for 2030 and 2050

    NARCIS (Netherlands)

    Milakis, D.; Snelder, M.; van Arem, B.; van Wee, G.P.; Homem de Almeida Correia, G.

    2017-01-01

    Automated driving technology is emerging. Yet, little is known in the literature about when automated vehicles will reach the market, how penetration rates will evolve and to what extent this new transport technology will affect transport demand and planning. This study uses scenario analysis to

  18. Advances toward fully automated in vivo assessment of oral epithelial dysplasia by nuclear endomicroscopy-A pilot study.

    Science.gov (United States)

    Liese, Jan; Winter, Karsten; Glass, Änne; Bertolini, Julia; Kämmerer, Peer Wolfgang; Frerich, Bernhard; Schiefke, Ingolf; Remmerbach, Torsten W

    2017-11-01

    Uncertainties in the detection of oral epithelial dysplasia (OED) frequently result from sampling error, especially in inflammatory oral lesions. Endomicroscopy allows non-invasive, "en face" imaging of the upper oral epithelium, but parameters of OED are unknown. Mucosal nuclei were imaged in 34 toluidine blue-stained oral lesions with a commercial endomicroscopy system. Histopathological diagnosis placed four biopsies in the "dys-/neoplastic," 23 in the "inflammatory," and seven in the "others" disease groups. The strength of different assessment strategies of nuclear scoring, nuclear count, and automated nuclear analysis was measured by the area under the ROC curve (AUC) to identify the histopathological "dys-/neoplastic" group. Nuclear objects from automated image analysis were visually corrected. The best-performing parameters of nuclear-to-image ratios were the count of large nuclei (AUC=0.986) and the 6-nearest neighborhood relation (AUC=0.896), and the best parameters of nuclear polymorphism were the count of atypical nuclei (AUC=0.996) and the compactness of nuclei (AUC=0.922). Excluding low-grade OED, nuclear scoring and count reached 100% sensitivity and 98% specificity for detection of dys-/neoplastic lesions. In automated analysis, combining parameters enhanced diagnostic strength. Sensitivity of 100% and specificity of 87% were seen for distances of 6-nearest neighbors and aspect ratios, even in uncorrected objects. Correction improved measures of nuclear polymorphism only. The hue of the background color was stronger than nuclear density (AUC=0.779 vs 0.687) in detecting the dys-/neoplastic group, indicating that the macroscopic aspect is biased. Nuclear-to-image ratios are applicable for automated optical in vivo diagnostics of oral potentially malignant disorders. Nuclear endomicroscopy may promote non-invasive, early detection of dys-/neoplastic lesions by reducing sampling error. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
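
    As a minimal illustration of how the reported AUC values could be computed for a single nuclear parameter against the histopathological grouping, the sketch below uses scikit-learn's roc_auc_score on synthetic data; the measurements and labels are invented for demonstration only.

```python
# A minimal sketch (with synthetic data) of scoring a nuclear parameter such as
# the count of atypical nuclei against the "dys-/neoplastic" label via the area
# under the ROC curve (AUC).
import numpy as np
from sklearn.metrics import roc_auc_score

# 1 = dys-/neoplastic group, 0 = other disease groups (synthetic labels)
labels = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0])
# per-lesion count of atypical nuclei (synthetic measurements)
atypical_nuclei = np.array([34, 28, 41, 19, 3, 5, 2, 8, 1, 4])

print("AUC =", roc_auc_score(labels, atypical_nuclei))
```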

  19. Design of an optimal automation system : Finding a balance between a human's task engagement and exhaustion

    NARCIS (Netherlands)

    Klein, Michel; van Lambalgen, Rianne

    2011-01-01

    In demanding tasks, human performance can seriously degrade as a consequence of increased workload and limited resources. In such tasks it is very important to maintain an optimal performance quality; therefore, automation assistance is required. On the other hand, automation can also impose

  20. Auditory interfaces in automated driving: an international survey

    Directory of Open Access Journals (Sweden)

    Pavlo Bazilinskyy

    2015-08-01

    Full Text Available This study investigated people’s opinions on auditory interfaces in contemporary cars and their willingness to be exposed to auditory feedback in automated driving. We used an Internet-based survey to collect 1,205 responses from 91 countries. The respondents stated their attitudes towards two existing auditory driver assistance systems, a parking assistant (PA) and a forward collision warning system (FCWS), as well as towards a futuristic augmented sound system (FS) proposed for fully automated driving. The respondents were positive towards the PA and FCWS, and rated the willingness to have automated versions of these systems as 3.87 and 3.77, respectively (on a scale from 1 = disagree strongly to 5 = agree strongly). The respondents tolerated the FS (the mean willingness to use it was 3.00 on the same scale). The results showed that among the available response options, the female voice was the most preferred feedback type for takeover requests in highly automated driving, regardless of whether the respondents’ country was English speaking or not. The present results could be useful for designers of automated vehicles and other stakeholders.

  1. Automated Analysis of Corpora Callosa

    DEFF Research Database (Denmark)

    Stegmann, Mikkel Bille; Davies, Rhodri H.

    2003-01-01

    This report describes and evaluates the steps needed to perform modern model-based interpretation of the corpus callosum in MRI. The process is discussed from the initial landmark-free contours to full-fledged statistical models based on the Active Appearance Models framework. Topics treated include landmark placement, background modelling and multi-resolution analysis. Preliminary quantitative and qualitative validation in a cross-sectional study shows that fully automated analysis and segmentation of the corpus callosum are feasible....

  2. EDISON - research programme on electric distribution automation 1993-1997. Final report 1997

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M [ed.; VTT Energy, Espoo (Finland). Energy Systems

    1998-08-01

    This report comprises a summary of the results of the five-year research programme EDISON on distribution automation in Finnish utilities. The research programme (1993-1997) was conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of the funding came from the Technology Development Centre TEKES and from manufacturing companies. The goal of the research programme was to develop a new scheme for a complete distribution automation system, including network automation, computer systems in the control centre and customer-associated automation functions. In addition, techniques for demand side management were developed and integrated into the automation scheme. The final aim was to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of nineteen projects are given in this report

  3. Energy Assessment of Automated Mobility Districts

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yuche [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    Automated vehicles (AVs) are increasingly being discussed as the basis for on-demand mobility services, introducing a new paradigm in which a fleet of AVs displaces private automobiles for day-to-day travel in dense activity districts. This project examines such a concept to displace privately owned automobiles within a region containing dense activity generators (jobs, retail, entertainment, etc.), referred to as an automated mobility district (AMD). The project reviews several such districts, including airports, college campuses, business parks, downtown urban cores, and military bases, with examples of previous attempts to meet mobility needs apart from private automobiles, some with automated technology and others with more traditional transit-based solutions. The issues and benefits of AMDs are framed within the perspective of intra-district, inter-district, and border issues, and the requirements for a modeling framework are identified to adequately reflect the breadth of mobility, energy, and emissions impacts anticipated with AMDs.

  4. Automated particulate sampler field test model operations guide

    Energy Technology Data Exchange (ETDEWEB)

    Bowyer, S.M.; Miley, H.S.

    1996-10-01

    The Automated Particulate Sampler Field Test Model Operations Guide is a collection of documents which provides a complete picture of the Automated Particulate Sampler (APS) and the Field Test in which it was evaluated. The Pacific Northwest National Laboratory (PNNL) Automated Particulate Sampler was developed for the purpose of radionuclide particulate monitoring for use under the Comprehensive Test Ban Treaty (CTBT). Its design was directed by anticipated requirements of small size, low power consumption, low noise level, fully automatic operation, and most predominantly the sensitivity requirements of the Conference on Disarmament Working Paper 224 (CDWP224). This guide is intended to serve as both a reference document for the APS and to provide detailed instructions on how to operate the sampler. This document provides a complete description of the APS Field Test Model and all the activity related to its evaluation and progression.

  5. Accurate, fully-automated NMR spectral profiling for metabolomics.

    Directory of Open Access Journals (Sweden)

    Siamak Ravanbakhsh

    Full Text Available Many diseases cause significant changes to the concentrations of small molecules (a.k.a. metabolites) that appear in a person's biofluids, which means such diseases can often be readily detected from a person's "metabolic profile", i.e., the list of concentrations of those metabolites. This information can be extracted from a biofluid's Nuclear Magnetic Resonance (NMR) spectrum. However, due to its complexity, NMR spectral profiling has remained manual, resulting in slow, expensive and error-prone procedures that have hindered clinical and industrial adoption of metabolomics via NMR. This paper presents a system, BAYESIL, which can quickly, accurately, and autonomously produce a person's metabolic profile. Given a 1D 1H NMR spectrum of a complex biofluid (specifically serum or cerebrospinal fluid), BAYESIL can automatically determine the metabolic profile. This requires first performing several spectral processing steps, then matching the resulting spectrum against a reference compound library, which contains the "signatures" of each relevant metabolite. BAYESIL views spectral matching as an inference problem within a probabilistic graphical model that rapidly approximates the most probable metabolic profile. Our extensive studies on a diverse set of complex mixtures, including real biological samples (serum and CSF), defined mixtures and realistic computer-generated spectra involving > 50 compounds, show that BAYESIL can autonomously find the concentration of NMR-detectable metabolites accurately (~ 90% correct identification and ~ 10% quantification error), in less than 5 minutes on a single CPU. These results demonstrate that BAYESIL is the first fully-automatic, publicly-accessible system that provides quantitative NMR spectral profiling effectively, with an accuracy on these biofluids that meets or exceeds the performance of trained experts. We anticipate this tool will usher in high-throughput metabolomics and enable a wealth of new applications of
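
    The sketch below is a greatly simplified stand-in for spectral profiling, fitting reference "signatures" to a mixture spectrum with non-negative least squares rather than BAYESIL's probabilistic graphical-model inference; the Gaussian signatures and concentrations are synthetic assumptions.

```python
# A greatly simplified stand-in for spectral profiling (not BAYESIL's inference):
# estimate metabolite concentrations by a non-negative least-squares fit of
# reference signatures to a mixture spectrum. All signatures and data are synthetic.
import numpy as np
from scipy.optimize import nnls

ppm = np.linspace(0, 10, 2000)
peak = lambda centre: np.exp(-0.5 * ((ppm - centre) / 0.02) ** 2)

# reference library: one column per metabolite "signature"
library = np.column_stack([peak(1.3) + peak(3.6),      # metabolite A (assumed)
                           peak(2.1),                  # metabolite B (assumed)
                           peak(3.2) + peak(7.8)])     # metabolite C (assumed)

true_conc = np.array([2.0, 0.5, 1.2])
spectrum = library @ true_conc + 0.01 * np.random.default_rng(1).normal(size=ppm.size)

est_conc, _ = nnls(library, spectrum)                  # concentrations must be >= 0
print("estimated concentrations:", np.round(est_conc, 2))
```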

  6. Development of a Real-Time PCR Protocol Requiring Minimal Handling for Detection of Vancomycin-Resistant Enterococci with the Fully Automated BD Max System.

    Science.gov (United States)

    Dalpke, Alexander H; Hofko, Marjeta; Zimmermann, Stefan

    2016-09-01

    Vancomycin-resistant enterococci (VRE) are an important cause of health care-associated infections, resulting in significant mortality and a substantial economic burden in hospitals. Active surveillance for at-risk populations contributes to the prevention of infections with VRE. The availability of a combination of automation and molecular detection procedures for rapid screening would be beneficial. Here, we report on the development of a laboratory-developed PCR for detection of VRE which runs on the fully automated Becton Dickinson (BD) Max platform, which combines DNA extraction, PCR setup, and real-time PCR amplification. We evaluated two protocols: one using a liquid master mix and the other employing commercially ordered dry-down reagents. The BD Max VRE PCR was evaluated in two rounds with 86 and 61 rectal elution swab (eSwab) samples, and the results were compared to the culture results. The sensitivities of the different PCR formats were 84 to 100% for vanA and 83.7 to 100% for vanB; specificities were 96.8 to 100% for vanA and 81.8 to 97% for vanB. The use of dry-down reagents and the ExK DNA-2 kit for extraction showed that the samples were less inhibited (3.3%) than they were by the use of the liquid master mix (14.8%). Adoption of a cutoff threshold cycle of 35 for discrimination of vanB-positive samples allowed an increase of specificity to 87.9%. The performance of the BD Max VRE assay equaled that of the BD GeneOhm VanR assay, which was run in parallel. The use of dry-down reagents simplifies the assay and omits any need to handle liquid PCR reagents. Copyright © 2016, American Society for Microbiology. All Rights Reserved.

  7. Left Ventricle: Fully Automated Segmentation Based on Spatiotemporal Continuity and Myocardium Information in Cine Cardiac Magnetic Resonance Imaging (LV-FAST

    Directory of Open Access Journals (Sweden)

    Lijia Wang

    2015-01-01

    Full Text Available CMR quantification of LV chamber volumes typically requires manual definition of the basal-most LV slice, which adds processing time and user dependence. This study developed an LV segmentation method that is fully automated based on the spatiotemporal continuity of the LV (LV-FAST). An iteratively decreasing threshold region-growing approach was used, first from the midventricle to the apex until the LV area and shape lost continuity, and then from the midventricle to the base until less than 50% of the myocardium circumference was observable. Region growth was constrained by LV spatiotemporal continuity to improve the robustness of apical and basal segmentations. The LV-FAST method was compared with manual tracing on cardiac cine MRI data of 45 consecutive patients. Of the 45 patients, LV-FAST and manual selection identified the same apical slices at both ED and ES and the same basal slices at both ED and ES in 38, 38, 38, and 41 cases, respectively, and their measurements agreed within -1.6±8.7 mL, -1.4±7.8 mL, and 1.0±5.8% for EDV, ESV, and EF, respectively. LV-FAST allowed the LV volume-time course to be quantified within 3 seconds on a standard desktop computer, which is fast and accurate for processing cine volumetric cardiac MRI data, and enables quantification of LV filling over the cardiac cycle.
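
    To illustrate the iteratively decreasing threshold region-growing idea in 2D (not the LV-FAST implementation), the sketch below keeps the connected component containing a seed at each threshold and stops when the region's area jumps abruptly, a crude stand-in for the LV area/shape continuity check; the thresholds, jump factor and synthetic image are assumptions.

```python
# A minimal 2D sketch of iteratively decreasing threshold region growing: at each
# threshold keep the connected component that contains the seed, and stop when the
# region's area jumps abruptly (a simple stand-in for the continuity check).
import numpy as np
from scipy import ndimage

def grow_region(image, seed, thresholds, max_area_jump=1.5):
    previous = None
    for t in thresholds:                       # thresholds given in decreasing order
        mask = image >= t
        labels, _ = ndimage.label(mask)
        if labels[seed] == 0:                  # seed not bright enough at this threshold
            continue
        region = labels == labels[seed]
        if previous is not None and region.sum() > max_area_jump * previous.sum():
            return previous                    # continuity lost: keep the last good region
        previous = region
    return previous

# Synthetic "cine MRI" slice: bright blood pool (disk) on a darker background
yy, xx = np.mgrid[0:64, 0:64]
img = 0.2 + 0.8 * ((yy - 32) ** 2 + (xx - 32) ** 2 < 12 ** 2) \
          + 0.05 * np.random.default_rng(0).random((64, 64))

lv_mask = grow_region(img, seed=(32, 32), thresholds=np.linspace(0.9, 0.3, 13))
print("segmented area (pixels):", int(lv_mask.sum()))
```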

  8. Experimental optimization of a direct injection homogeneous charge compression ignition gasoline engine using split injections with fully automated microgenetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Canakci, M. [Kocaeli Univ., Izmit (Turkey); Reitz, R.D. [Wisconsin Univ., Dept. of Mechanical Engineering, Madison, WI (United States)

    2003-03-01

    Homogeneous charge compression ignition (HCCI) is receiving attention as a new low-emission engine concept. Little is known about the optimal operating conditions for this engine operation mode. Combustion under homogeneous, low equivalence ratio conditions results in modest temperature combustion products, containing very low concentrations of NO{sub x} and particulate matter (PM) as well as providing high thermal efficiency. However, this combustion mode can produce higher HC and CO emissions than those of conventional engines. An electronically controlled Caterpillar single-cylinder oil test engine (SCOTE), originally designed for heavy-duty diesel applications, was converted to an HCCI direct injection (DI) gasoline engine. The engine features an electronically controlled low-pressure direct injection gasoline (DI-G) injector with a 60 deg spray angle that is capable of multiple injections. The use of double injection was explored for emission control and the engine was optimized using fully automated experiments and a microgenetic algorithm optimization code. The variables changed during the optimization include the intake air temperature, start of injection timing and the split injection parameters (per cent mass of fuel in each injection, dwell between the pulses). The engine performance and emissions were determined at 700 r/min with a constant fuel flowrate at 10 MPa fuel injection pressure. The results show that significant emissions reductions are possible with the use of optimal injection strategies. (Author)
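
    The sketch below illustrates a generic micro-genetic algorithm loop (small population, elitism, restart on convergence) of the kind used to drive the automated experiments; the merit function stands in for the measured engine response, and the variable names, bounds and settings are assumptions.

```python
# A minimal sketch of a micro-genetic algorithm (small population, elitism,
# restart on convergence). The merit function is a synthetic stand-in for the
# measured emissions/efficiency response of the engine.
import numpy as np

rng = np.random.default_rng(3)
# [intake air temperature (C), start of injection (deg), fuel fraction in 1st pulse, dwell (deg)]
lower = np.array([40.0, 20.0, 0.1, 5.0])
upper = np.array([120.0, 90.0, 0.9, 40.0])

def merit(x):                                   # stand-in for the engine merit function
    target = np.array([80.0, 60.0, 0.5, 20.0])  # assumed "ideal" operating point
    return -np.sum(((x - target) / (upper - lower)) ** 2)

def micro_ga(pop_size=5, generations=60):
    pop = rng.uniform(lower, upper, (pop_size, len(lower)))
    best = max(pop, key=merit)
    for _ in range(generations):
        fitness = np.array([merit(p) for p in pop])
        best = pop[np.argmax(fitness)].copy()
        # random parent selection + uniform crossover
        # (no mutation: the micro-GA relies on restarts for diversity)
        children = []
        while len(children) < pop_size - 1:
            a, b = pop[rng.integers(pop_size)], pop[rng.integers(pop_size)]
            mask = rng.random(len(lower)) < 0.5
            children.append(np.where(mask, a, b))
        pop = np.vstack([best] + children)      # elitism: keep the best individual
        if np.max(np.std(pop, axis=0) / (upper - lower)) < 0.05:
            pop = rng.uniform(lower, upper, (pop_size, len(lower)))
            pop[0] = best                       # restart the population around the elite
    return best

print("suggested operating point:", np.round(micro_ga(), 2))
```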

  9. A FULLY AUTOMATED PIPELINE FOR CLASSIFICATION TASKS WITH AN APPLICATION TO REMOTE SENSING

    Directory of Open Access Journals (Sweden)

    K. Suzuki

    2016-06-01

    Full Text Available Nowadays deep learning has been intensively in the spotlight owing to its victories at major competitions, which has pushed ‘shallow’ machine learning methods (relatively simple, handy algorithms commonly used by industrial engineers) into the background in spite of their advantages, such as the small amount of time and data required for training. From a practical point of view, we utilized shallow learning algorithms to construct a learning pipeline such that operators can use machine learning without any special knowledge, an expensive computation environment, or a large amount of labelled data. The proposed pipeline automates the whole classification process, namely feature selection, feature weighting and the selection of the most suitable classifier with optimized hyperparameters. The configuration employs particle swarm optimization, a well-known metaheuristic algorithm chosen for its generally fast and fine optimization, which enables us not only to optimize (hyper)parameters but also to determine the features and classifier appropriate to the problem; these choices have conventionally been made a priori from domain knowledge and either left untouched or handled with naive algorithms such as grid search. Through experiments with the MNIST and CIFAR-10 datasets, common computer vision datasets for character recognition and object recognition problems respectively, our automated learning approach provides high performance considering its simple setting (i.e. a setting not specialized to the dataset), the small amount of training data, and the practical learning time. Moreover, compared to deep learning, the performance stays robust with almost no modification even on a remote sensing object recognition problem, which in turn indicates that our approach is likely to contribute to general classification problems.
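
    As a simplified stand-in for the automated pipeline idea, the sketch below searches jointly over feature selection, classifier choice and hyperparameters with scikit-learn, using plain random search in place of the paper's particle swarm optimisation and the digits dataset in place of MNIST; all search ranges are assumptions.

```python
# A simplified stand-in for an automated "shallow learning" pipeline: random
# search over feature selection, classifier choice and hyperparameters, scored
# by cross-validation. Search ranges and budget are illustrative assumptions.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
rng = np.random.default_rng(0)

def sample_candidate():
    k = int(rng.integers(16, 64))                       # number of selected features
    if rng.random() < 0.5:
        clf = SVC(C=10 ** rng.uniform(-2, 2), gamma="scale")
    else:
        clf = KNeighborsClassifier(n_neighbors=int(rng.integers(1, 15)))
    return Pipeline([("select", SelectKBest(f_classif, k=k)), ("clf", clf)])

best_score, best_pipe = -np.inf, None
for _ in range(20):                                     # search budget
    pipe = sample_candidate()
    score = cross_val_score(pipe, X, y, cv=3).mean()
    if score > best_score:
        best_score, best_pipe = score, pipe

print("best CV accuracy: %.3f" % best_score)
print(best_pipe)
```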

  10. Comparisons of fully automated syphilis tests with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Choi, Seung Jun; Park, Yongjung; Lee, Eun Young; Kim, Sinyoung; Kim, Hyon-Suk

    2013-06-01

    Serologic tests are widely used for the diagnosis of syphilis. However, conventional methods require well-trained technicians to produce reliable results. We compared automated nontreponemal and treponemal tests with conventional methods. The HiSens Auto Rapid Plasma Reagin (AutoRPR) and Treponema Pallidum particle agglutination (AutoTPPA) tests, which utilize latex turbidimetric immunoassay, were assessed. A total of 504 sera were assayed by AutoRPR, AutoTPPA, conventional VDRL and FTA-ABS. Among them, 250 samples were also tested by conventional TPPA. The concordance rate between the results of VDRL and AutoRPR was 67.5%, and 164 discrepant cases were all VDRL reactive but AutoRPR negative. In the 164 cases, 133 showed FTA-ABS reactivity. Medical records of 106 among the 133 cases were reviewed, and 82 among 106 specimens were found to be collected from patients already treated for syphilis. The concordance rate between the results of AutoTPPA and FTA-ABS was 97.8%. The results of conventional TPPA and AutoTPPA for 250 samples were concordant in 241 cases (96.4%). AutoRPR showed higher specificity than that of VDRL, while VDRL demonstrated higher sensitivity than that of AutoRPR regardless of whether the patients had been already treated for syphilis or not. Both FTA-ABS and AutoTPPA showed high sensitivities and specificities greater than 98.0%. Automated RPR and TPPA tests could be alternatives to conventional syphilis tests, and AutoRPR would be particularly suitable in treatment monitoring, since results by AutoRPR in cases after treatment became negative more rapidly than by VDRL. Copyright © 2013. Published by Elsevier Inc.
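
    The sketch below illustrates the agreement statistics reported above, overall concordance between two assays and sensitivity/specificity of an automated test against a reference, on a small synthetic set of paired results.

```python
# A minimal sketch of concordance and sensitivity/specificity between paired
# assay results. The small result vectors below are synthetic examples.
import numpy as np

def concordance(a, b):
    a, b = np.asarray(a), np.asarray(b)
    return np.mean(a == b)

def sensitivity_specificity(test, reference):
    test, reference = np.asarray(test, bool), np.asarray(reference, bool)
    tp = np.sum(test & reference)
    tn = np.sum(~test & ~reference)
    return tp / reference.sum(), tn / (~reference).sum()

# 1 = reactive/positive, 0 = non-reactive/negative (synthetic paired results)
auto_tppa = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]
fta_abs   = [1, 1, 0, 0, 1, 0, 1, 1, 0, 1]

print("concordance: %.1f%%" % (100 * concordance(auto_tppa, fta_abs)))
print("sensitivity %.2f, specificity %.2f" % sensitivity_specificity(auto_tppa, fta_abs))
```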

  11. Evaluation of Representative Smart Grid Investment Project Technologies: Demand Response

    Energy Technology Data Exchange (ETDEWEB)

    Fuller, Jason C.; Prakash Kumar, Nirupama; Bonebrake, Christopher A.

    2012-02-14

    This document is one of a series of reports estimating the benefits of deploying technologies similar to those implemented on the Smart Grid Investment Grant (SGIG) projects. Four technical reports cover the various types of technologies deployed in the SGIG projects, distribution automation, demand response, energy storage, and renewables integration. A fifth report in the series examines the benefits of deploying these technologies on a national level. This technical report examines the impacts of a limited number of demand response technologies and implementations deployed in the SGIG projects.

  12. Standard IEC 61850 substation automation

    Energy Technology Data Exchange (ETDEWEB)

    Bricchi, A.; Mezzadri, D. [Selta, Tortoreto (Italy)

    2008-07-01

    The International Electrotechnical Commission (IEC) 61850 standard is the reference communication protocol for protection and control systems in electrical substations. It creates models of all the elements and functionalities of an electrical substation, including physical elements such as switches or circuit breakers, as well as protection, control and monitoring functionalities. Network managers need to renew power substation automation and control systems in order to improve the efficiency and quality of the services offered by electric utilities. Selta has proposed a new integrated solution for the automation of power substations which is fully compliant with the IEC 61850 norms. The solution involves the integration of control, automation, protection, monitoring and maintenance functions and applies leading-edge technology to its systems, particularly for the TERNA network. The system is based on the use of many electronic devices at a power plant, each one with a specialized function, and all interconnected via a station LAN. This solution was tested on the TERNA network in Italy, in VHV and HV stations. It was shown to offer many advantages, such as an architecture based on full interoperability between control, monitoring and protection equipment; centralized and distributed automation; a station LAN that allows full interoperability between different bay units and protection relays in order to integrate equipment from various suppliers; the integration of automation systems in existing bay units and protection relays equipped with standard communication buses or with proprietary interfaces; and time synchronization for the entire system through a station GPS reception system. 10 refs., 1 tab., 7 figs.

  13. Assessing Library Automation and Virtual Library Development in Four Academic Libraries in Oyo, Oyo State, Nigeria

    Science.gov (United States)

    Gbadamosi, Belau Olatunde

    2011-01-01

    The paper examines the level of library automation and virtual library development in four academic libraries. A validated questionnaire was used to capture the responses from academic librarians of the libraries under study. The paper finds that none of the four academic libraries is fully automated. The libraries make use of librarians with…

  14. Improving reticle defect disposition via fully automated lithography simulation

    Science.gov (United States)

    Mann, Raunak; Goodman, Eliot; Lao, Keith; Ha, Steven; Vacca, Anthony; Fiekowsky, Peter; Fiekowsky, Dan

    2016-03-01

    Most advanced wafer fabs have embraced complex pattern decoration, which creates numerous challenges during in-fab reticle qualification. These optical proximity correction (OPC) techniques create assist features that tend to be very close in size and shape to the main patterns, as seen in Figure 1. A small defect on an assist feature will most likely have little or no impact on the fidelity of the wafer image, whereas the same defect on a main feature could significantly decrease device functionality. In order to properly disposition these defects, reticle inspection technicians need an efficient method that automatically separates main from assist features and predicts the resulting defect impact on the wafer image; this is the role of the Automated Defect Analysis System (ADAS) defect simulation system [1]. Up until now, using ADAS simulation was limited to engineers due to the complexity of the settings that need to be manually entered in order to create an accurate result. A single error in entering one of these values can cause erroneous results; therefore, full automation is necessary. In this study, we propose a new method where all needed simulation parameters are automatically loaded into ADAS. This is accomplished in two parts. First, we have created a scanner parameter database that is automatically identified from mask product and level names. Second, we automatically determine the appropriate simulation printability threshold by using a new reference image (provided by the inspection tool) that contains a known measured value of the reticle critical dimension (CD). This new method automatically loads the correct scanner conditions, sets the appropriate simulation threshold, and automatically measures the percentage of CD change caused by the defect. This streamlines qualification and reduces the number of reticles being put on hold waiting for engineer review. We also present data showing the consistency and reliability of the new method, along with the impact on the efficiency of in

  15. The Rise of the Machines: Automation, Horizontal Innovation and Income Inequality

    OpenAIRE

    Morten Olsen; David Hemous

    2014-01-01

    We construct an endogenous growth model of directed technical change with automation (the introduction of machines which replace low-skill labor and complement high-skill labor) and horizontal innovation (the introduction of new products, which increases demand for both types of labor). For general processes of technical change, we demonstrate that although low-skill wages can drop during periods of increasing automation intensity, the asymptotic growth rate is weakly positive --- though lowe...

  16. Evaluation of an automated microplate technique in the Galileo system for ABO and Rh(D) blood grouping.

    Science.gov (United States)

    Xu, Weiyi; Wan, Feng; Lou, Yufeng; Jin, Jiali; Mao, Weilin

    2014-01-01

    A number of automated devices for pretransfusion testing have recently become available. This study evaluated the Immucor Galileo System, a fully automated device based on the microplate hemagglutination technique, for ABO/Rh (D) determinations. Routine ABO/Rh typing tests were performed on 13,045 samples using the Immucor automated instruments. The manual tube method was used to resolve ABO forward and reverse grouping discrepancies. D-negative test results were investigated and confirmed manually by the indirect antiglobulin test (IAT). The system rejected 70 tests for sample inadequacy. 87 samples were read as "No-type-determined" due to forward and reverse grouping discrepancies. 25 tests gave these results because of sample hemolysis. After further testing, we found that 34 tests were caused by weakened RBC antibodies, 5 tests were attributable to weak A and/or B antigens, 4 tests were due to mixed-field reactions, and 8 tests had high-titer cold agglutinin with blood qualifications which react only at temperatures below 34 degrees C. In the remaining 11 cases, irregular RBC antibodies were identified in 9 samples (seven anti-M and two anti-P) and two subgroups were identified in 2 samples (one A1 and one A2) by a reference laboratory. As for D typing, 2 weak D+ samples missed by the automated system gave negative results, but weak-positive reactions were observed in the IAT. The Immucor Galileo System is reliable and suited for ABO and D blood grouping; however, several factors may cause discrepancies in ABO/D typing using a fully automated system. It is suggested that standardization of sample collection may improve the performance of the fully automated system.

  17. Space station automation study. Automation requirements derived from space manufacturing concepts. Volume 1: Executive summary

    Science.gov (United States)

    1984-01-01

    The two manufacturing concepts developed represent innovative, technologically advanced manufacturing schemes. The concepts were selected to facilitate an in depth analysis of manufacturing automation requirements in the form of process mechanization, teleoperation and robotics, and artificial intelligence. While the cost effectiveness of these facilities has not been analyzed as part of this study, both appear entirely feasible for the year 2000 timeframe. The growing demand for high quality gallium arsenide microelectronics may warrant the ventures.

  18. Automated DNA electrophoresis, hybridization and detection

    International Nuclear Information System (INIS)

    Zapolski, E.J.; Gersten, D.M.; Golab, T.J.; Ledley, R.S.

    1986-01-01

    A fully automated, computer-controlled system for nucleic acid hybridization analysis has been devised and constructed. In practice, DNA is digested with restriction endonuclease enzyme(s) and loaded into the system by pipette; 32P-labelled nucleic acid probe(s) is loaded into the nine hybridization chambers. Instructions for all the steps in the automated process are specified by answering questions that appear on the computer screen at the start of the experiment. Subsequent steps are performed automatically. The system performs horizontal electrophoresis in agarose gel, fixes the fragments to a solid-phase matrix, denatures, neutralizes, prehybridizes, hybridizes, washes, dries and detects the radioactivity according to the specifications given by the operator. The results, printed out at the end, give the positions on the matrix to which radioactivity remains hybridized following stringent washing

  19. Automated radiochemical processing for clinical PET

    International Nuclear Information System (INIS)

    Padgett, H.C.; Schmidt, D.G.; Bida, G.T.; Wieland, B.W.; Pekrul, E.; Kingsbury, W.G.

    1991-01-01

    With the recent emergence of positron emission tomography (PET) as a viable clinical tool, there is a need for a convenient, cost-effective source of radiotracers labeled with the positron emitters carbon-11, nitrogen-13, oxygen-15, and fluorine-18. These short-lived radioisotopes are accelerator-produced and thus require a cyclotron and radiochemistry processing instrumentation that can be operated in a clinical environment by competent technicians. The basic goal is to ensure safety and reliability while setting new standards for economy and ease of operation. The Siemens Radioisotope Delivery System (RDS 112) is a fully automated system dedicated to the production and delivery of positron-emitter-labeled precursors and radiochemicals required to support a clinical PET imaging program. Thus, the entire RDS can be thought of as an automated radiochemical processing apparatus

  20. Why Are There Still So Many Jobs? The History and Future of Workplace Automation

    OpenAIRE

    David H. Autor

    2015-01-01

    In this essay, I begin by identifying the reasons that automation has not wiped out a majority of jobs over the decades and centuries. Automation does indeed substitute for labor—as it is typically intended to do. However, automation also complements labor, raises output in ways that leads to higher demand for labor, and interacts with adjustments in labor supply. Journalists and even expert commentators tend to overstate the extent of machine substitution for human labor and ignore the stron...

  1. On transcending the impasse of respiratory motion correction applications in routine clinical imaging - a consideration of a fully automated data driven motion control framework

    International Nuclear Information System (INIS)

    Kesner, Adam L; Schleyer, Paul J; Büther, Florian; Walter, Martin A; Schäfers, Klaus P; Koo, Phillip J

    2014-01-01

    Positron emission tomography (PET) is increasingly used for the detection, characterization, and follow-up of tumors located in the thorax. However, patient respiratory motion presents a unique limitation that hinders the application of high-resolution PET technology for this type of imaging. Efforts to transcend this limitation have been underway for more than a decade, yet for practical considerations PET remains a modality vulnerable to motion-induced image degradation. Respiratory motion control is not employed in routine clinical operations. In this article, we take the opportunity to highlight some of the recent advancements in data-driven motion control strategies and how they may form an underpinning for what we are presenting as a fully automated data-driven motion control framework. This framework represents an alternative direction for future endeavors in motion control and can conceptually connect individual focused studies with a strategy for addressing big-picture challenges and goals. The online version of this article (doi:10.1186/2197-7364-1-8) contains supplementary material, which is available to authorized users.

  2. Concept of a computer network architecture for complete automation of nuclear power plants

    International Nuclear Information System (INIS)

    Edwards, R.M.; Ray, A.

    1990-01-01

    The state of the art in automation of nuclear power plants has been largely limited to computerized data acquisition, monitoring, display, and recording of process signals. Complete automation of nuclear power plants, which would include plant operations, control, and management, fault diagnosis, and system reconfiguration with efficient and reliable man/machine interactions, has been projected as a realistic goal. This paper presents the concept of a computer network architecture that would use a high-speed optical data highway to integrate diverse, interacting, and spatially distributed functions that are essential for a fully automated nuclear power plant

  3. Ten years of R&D and full automation in molecular diagnosis.

    Science.gov (United States)

    Greub, Gilbert; Sahli, Roland; Brouillet, René; Jaton, Katia

    2016-01-01

    A 10-year experience of our automated molecular diagnostic platform that carries out 91 different real-time PCRs is described. Progress and future perspectives in molecular diagnostic microbiology are reviewed: why automation is important; how our platform was implemented; how homemade PCRs were developed; the advantages/disadvantages of homemade PCRs, including the critical aspects of troubleshooting and the need to further reduce the turnaround time for specific samples, at least for defined clinical settings such as emergencies. The future of molecular diagnosis depends on automation, and in a novel perspective, it is time now to fully acknowledge the true contribution of molecular diagnostics and to reconsider the indications for PCR, by also using these tests as first-line assays.

  4. Fully 3D GPU PET reconstruction

    Energy Technology Data Exchange (ETDEWEB)

    Herraiz, J.L., E-mail: joaquin@nuclear.fis.ucm.es [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Espana, S. [Department of Radiation Oncology, Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States); Cal-Gonzalez, J. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain); Vaquero, J.J. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Desco, M. [Departmento de Bioingenieria e Ingenieria Espacial, Universidad Carlos III, Madrid (Spain); Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Udias, J.M. [Grupo de Fisica Nuclear, Departmento Fisica Atomica, Molecular y Nuclear, Universidad Complutense de Madrid (Spain)

    2011-08-21

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics processing units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but it was not until the recent advances in GPU programmability that the best available reconstruction codes started to be implemented to run on GPUs. This work presents GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining the same images in both cases. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.

  5. Fully 3D GPU PET reconstruction

    International Nuclear Information System (INIS)

    Herraiz, J.L.; Espana, S.; Cal-Gonzalez, J.; Vaquero, J.J.; Desco, M.; Udias, J.M.

    2011-01-01

    Fully 3D iterative tomographic image reconstruction is computationally very demanding. Graphics processing units (GPUs) have been proposed for many years as potential accelerators for complex scientific problems, but it was not until the recent advances in GPU programmability that the best available reconstruction codes started to be implemented to run on GPUs. This work presents GPU-based fully 3D PET iterative reconstruction software. This new code may reconstruct sinogram data from several commercially available PET scanners. The most important and time-consuming parts of the code, the forward and backward projection operations, are based on an accurate model of the scanner obtained with the Monte Carlo code PeneloPET, and they have been massively parallelized on the GPU. For the PET scanners considered, the GPU-based code is more than 70 times faster than a similar code running on a single core of a fast CPU, obtaining the same images in both cases. The code has been designed to be easily adapted to reconstruct sinograms from any other PET scanner, including scanner prototypes.
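
    The two records above describe the standard structure of iterative PET reconstruction: each iteration forward-projects the current image estimate into sinogram space and backprojects the measured-to-estimated ratio. The sketch below shows one MLEM iteration of that structure in plain NumPy; the dense system matrix A is only an illustrative stand-in for the Monte Carlo-derived, GPU-parallelized projector mentioned in the abstract.

    ```python
    import numpy as np

    def mlem(sinogram, A, n_iters=100, eps=1e-12):
        """One way to write the MLEM update: A maps image voxels to sinogram bins.

        sinogram : measured counts, shape (n_bins,)
        A        : system matrix, shape (n_bins, n_voxels); a dense stand-in for
                   the Monte Carlo-derived projector parallelized on the GPU.
        """
        image = np.ones(A.shape[1])               # flat initial estimate
        sensitivity = A.sum(axis=0) + eps         # backprojection of ones
        for _ in range(n_iters):
            expected = A @ image + eps            # forward projection
            ratio = sinogram / expected           # measured / estimated sinogram
            image *= (A.T @ ratio) / sensitivity  # backprojection, multiplicative update
        return image

    # Tiny synthetic check: 2 voxels, 3 lines of response
    A = np.array([[1.0, 0.0], [0.5, 0.5], [0.0, 1.0]])
    true_image = np.array([2.0, 4.0])
    print(mlem(A @ true_image, A, n_iters=200))   # approaches [2, 4]
    ```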

  6. Validation of a Fully Automated HER2 Staining Kit in Breast Cancer

    Directory of Open Access Journals (Sweden)

    Cathy B. Moelans

    2010-01-01

    Background: Testing for HER2 amplification and/or overexpression is currently routine practice to guide Herceptin therapy in invasive breast cancer. At present, HER2 status is most commonly assessed by immunohistochemistry (IHC). Standardization of HER2 IHC assays is of utmost clinical and economic importance. At present, HER2 IHC is most commonly performed with the HercepTest, which contains a polyclonal antibody and applies a manual staining procedure. Analytical variability in HER2 IHC testing could be diminished by a fully automatic staining system with a monoclonal antibody.

  7. Flexible automated approach for quantitative liquid handling of complex biological samples.

    Science.gov (United States)

    Palandra, Joe; Weller, David; Hudson, Gary; Li, Jeff; Osgood, Sarah; Hudson, Emily; Zhong, Min; Buchholz, Lisa; Cohen, Lucinda H

    2007-11-01

    A fully automated protein precipitation technique for biological sample preparation has been developed for the quantitation of drugs in various biological matrixes. All liquid handling during sample preparation was automated using a Hamilton MicroLab Star Robotic workstation, which included the preparation of standards and controls from a Watson laboratory information management system generated work list, shaking of 96-well plates, and vacuum application. Processing time is less than 30 s per sample or approximately 45 min per 96-well plate, which is then immediately ready for injection onto an LC-MS/MS system. An overview of the process workflow is discussed, including the software development. Validation data are also provided, including specific liquid class data as well as comparative data of automated vs manual preparation using both quality controls and actual sample data. The efficiencies gained from this automated approach are described.

  8. Robotics and Office Automation: Implications for Vocational Education.

    Science.gov (United States)

    Fraser, Jeannette L.; And Others

    Directed to individuals responsible for program planning in vocational education at the national and state levels, this review and synthesis of technological developments in robotics and office automation identifies the potential demand for skills in these technologies in the next 3 to 5 years. The procedures for the study are described in the…

  9. A time-use model for the automated vehicle-era

    NARCIS (Netherlands)

    Pudāne, Baiba; Molin, Eric J.E.; Arentze, Theo A.; Maknoon, Yousef; Chorus, Caspar G.

    2018-01-01

    Automated Vehicles (AVs) offer their users a possibility to perform new non-driving activities while being on the way. The effects of this opportunity on travel choices and travel demand have mostly been conceptualised and modelled via a reduced penalty associated with (in-vehicle) travel time. This

  10. A Time-use Model for the Automated Vehicle-era

    NARCIS (Netherlands)

    Pudane, B.; Molin, E.J.E.; Arentze, TA; Maknoon, M.Y.; Chorus, C.G.

    2018-01-01

    Automated Vehicles (AVs) offer their users a possibility to perform new non-driving activities while being on the way. The effects of this opportunity on travel choices and travel demand have mostly been conceptualised and modelled via a reduced penalty associated with (in-vehicle) travel time. This

  11. Simple heuristics: A bridge between manual core design and automated optimization methods

    International Nuclear Information System (INIS)

    White, J.R.; Delmolino, P.M.

    1993-01-01

    The primary function of RESCUE is to serve as an aid in the analysis and identification of feasible loading patterns for LWR reload cores. The unique feature of RESCUE is that its physics model is based on some recent advances in generalized perturbation theory (GPT) methods. The high order GPT techniques offer the accuracy, computational efficiency, and flexibility needed for the implementation of a full range of capabilities within a set of compatible interactive (manual and semi-automated) and automated design tools. The basic design philosophy and current features within RESCUE are reviewed, and the new semi-automated capability is highlighted. The online advisor facility appears quite promising and it provides a natural bridge between the traditional trial-and-error manual process and the recent progress towards fully automated optimization sequences. (orig.)

  12. Optimization of the Automated Spray Layer-by-Layer Technique for Thin Film Deposition

    Science.gov (United States)

    2010-06-01

    ...air-pumped spray-paint cans [17,18] to fully automated systems using high-pressure gas [7,19]. This work uses the automated spray system previously... spray solutions were delivered by ultra-high-purity nitrogen gas (AirGas) regulated to 25 psi, except when examining air pressure effects. The PAH solution... polyelectrolyte solution feed tube; the resulting Venturi effect causes the liquid solution to be drawn up into the airbrush nozzle, where it is

  13. Fully Automated Detection of Corticospinal Tract Damage in Chronic Stroke Patients

    Directory of Open Access Journals (Sweden)

    Ming Yang

    2014-01-01

    Structural integrity of the corticospinal tract (CST) after stroke is closely linked to the degree of motor impairment. However, current methods for measurement of fractional anisotropy (FA) of the CST based on regions of interest (ROIs) are time-consuming and open to bias. Here, we used tract-based spatial statistics (TBSS) together with a CST template from healthy volunteers to quantify the structural integrity of the CST automatically. Two groups of patients after ischemic stroke were enrolled: group 1 (10 patients, 7 men, Fugl-Meyer assessment (FMA) scores ⩽ 50) and group 2 (12 patients, 12 men, FMA scores = 100). FAipsi, FAcontra, and FAratio of the CST were compared between the two groups. Relative to group 2, FA was decreased in group 1 in the ipsilesional CST (P<0.01), as was the FAratio (P<0.01). There was no significant difference between the two groups in the contralesional CST (P=0.23). Compared with the contralesional CST, FA of the ipsilesional CST was decreased in group 1 (P<0.01). These results suggest that the automated method used in our study could provide a surrogate biomarker to quantify the CST after stroke, which would facilitate implementation in clinical practice.
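
    The statistical comparison in this record reduces, per subject, to averaging FA inside the ipsi- and contralesional CST masks and comparing those values (and their ratio) between groups. The sketch below illustrates that final step; the array names, placeholder data, and choice of a Mann-Whitney test are assumptions for illustration, not the authors' exact pipeline.

    ```python
    import numpy as np
    from scipy import stats

    def cst_fa_summary(fa_map, mask_ipsi, mask_contra):
        """Mean FA inside the ipsi- and contralesional CST masks plus their ratio."""
        fa_ipsi = fa_map[mask_ipsi].mean()
        fa_contra = fa_map[mask_contra].mean()
        return fa_ipsi, fa_contra, fa_ipsi / fa_contra

    # Group comparison on per-subject FA ratios (placeholder values, not study data)
    rng = np.random.default_rng(0)
    fa_ratio_group1 = rng.normal(0.85, 0.05, size=10)   # FMA <= 50
    fa_ratio_group2 = rng.normal(1.00, 0.03, size=12)   # FMA = 100
    u, p = stats.mannwhitneyu(fa_ratio_group1, fa_ratio_group2, alternative="two-sided")
    print(f"U = {u:.1f}, p = {p:.4f}")
    ```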

  14. Flexible automated manufacturing for SMEs

    DEFF Research Database (Denmark)

    Grube Hansen, David; Bilberg, Arne; Madsen, Erik Skov

    2017-01-01

    SMEs are in general highly flexible and agile in order to accommodate customer demands in the paradigm of high-mix, low-volume manufacturing. This flexibility and agility have mainly been enabled by manual labor, but as we enter the technology- and data-driven fourth industrial revolution ..., where augmented operators and machines work in cooperation in a highly flexible and productive manufacturing system, both an opportunity and a need have arisen for developing highly flexible and efficient automation.

  15. Energy demand in Portuguese manufacturing: a two-stage model

    International Nuclear Information System (INIS)

    Borges, A.M.; Pereira, A.M.

    1992-01-01

    We use a two-stage model of factor demand to estimate the parameters determining energy demand in Portuguese manufacturing. In the first stage, a capital-labor-energy-materials framework is used to analyze the substitutability between energy as a whole and other factors of production. In the second stage, total energy demand is decomposed into oil, coal and electricity demands. The two stages are fully integrated since the energy composite used in the first stage and its price are obtained from the second stage energy sub-model. The estimates obtained indicate that energy demand in manufacturing responds significantly to price changes. In addition, estimation results suggest that there are important substitution possibilities among energy forms and between energy and other factors of production. The role of price changes in energy-demand forecasting, as well as in energy policy in general, is clearly established. (author)
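
    In this two-stage structure, the lower stage aggregates fuel prices into an energy composite price, which then enters the upper-stage factor demand. The toy sketch below uses a CES aggregator to make the mechanics concrete; the functional form and parameter values are illustrative stand-ins for the flexible forms estimated in the paper.

    ```python
    def energy_composite_price(prices, weights, sigma=1.5):
        """CES price index over oil, coal and electricity (illustrative aggregator)."""
        return sum(w * p ** (1 - sigma) for w, p in zip(weights, prices)) ** (1 / (1 - sigma))

    def fuel_shares(prices, weights, sigma=1.5):
        """Expenditure shares implied by the same CES aggregator (they sum to one)."""
        pe = energy_composite_price(prices, weights, sigma)
        return [w * (p / pe) ** (1 - sigma) for w, p in zip(weights, prices)]

    def energy_demand(pe, p_other, output, eta=0.6):
        """Upper-stage demand for the energy composite with price elasticity eta."""
        return output * (pe / p_other) ** (-eta)

    p_fuels = [1.2, 0.8, 1.5]        # oil, coal, electricity prices (illustrative)
    w_fuels = [0.4, 0.2, 0.4]
    pe = energy_composite_price(p_fuels, w_fuels)
    print(pe, fuel_shares(p_fuels, w_fuels), energy_demand(pe, p_other=1.0, output=100.0))
    ```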

  16. Automated sampling and data processing derived from biomimetic membranes

    International Nuclear Information System (INIS)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H; Boesen, T P; Emneus, J

    2009-01-01

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  17. Automated sampling and data processing derived from biomimetic membranes

    Energy Technology Data Exchange (ETDEWEB)

    Perry, M; Vissing, T; Hansen, J S; Nielsen, C H [Aquaporin A/S, Diplomvej 377, DK-2800 Kgs. Lyngby (Denmark); Boesen, T P [Xefion ApS, Kildegaardsvej 8C, DK-2900 Hellerup (Denmark); Emneus, J, E-mail: Claus.Nielsen@fysik.dtu.d [DTU Nanotech, Technical University of Denmark, DK-2800 Kgs. Lyngby (Denmark)

    2009-12-15

    Recent advances in biomimetic membrane systems have resulted in an increase in membrane lifetimes from hours to days and months. Long-lived membrane systems demand the development of both new automated monitoring equipment capable of measuring electrophysiological membrane characteristics and new data processing software to analyze and organize the large amounts of data generated. In this work, we developed an automated instrumental voltage clamp solution based on a custom-designed software controller application (the WaveManager), which enables automated on-line voltage clamp data acquisition applicable to long-time series experiments. We designed another software program for off-line data processing. The automation of the on-line voltage clamp data acquisition and off-line processing was furthermore integrated with a searchable database (DiscoverySheet(TM)) for efficient data management. The combined solution provides a cost efficient and fast way to acquire, process and administrate large amounts of voltage clamp data that may be too laborious and time consuming to handle manually. (communication)

  18. Point-of-Care Test Equipment for Flexible Laboratory Automation.

    Science.gov (United States)

    You, Won Suk; Park, Jae Jun; Jin, Sung Moon; Ryew, Sung Moo; Choi, Hyouk Ryeol

    2014-08-01

    Blood tests are among the core clinical laboratory tests for diagnosing patients. In hospitals, an automated process called total laboratory automation, which relies on a set of sophisticated equipment, is normally adopted for blood tests. Because a total laboratory automation system typically requires a large footprint and a significant amount of power, slim and easy-to-move blood test equipment is needed for settings such as emergency departments or small local clinics. In this article, we present a point-of-care test system that provides flexibility and portability at low cost. First, the system components, including a reagent tray, dispensing module, microfluidic disk rotor, and photometry scanner, and their functions are explained. Then, a scheduler algorithm that provides the point-of-care test platform with an efficient test schedule to reduce test time is introduced. Finally, the results of diagnostic tests are presented to evaluate the system. © 2014 Society for Laboratory Automation and Screening.
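
    The scheduler mentioned in this record orders queued assays to reduce overall test time. As a hedged illustration of that idea (not the authors' algorithm), the sketch below applies the classic shortest-processing-time rule, which minimizes mean completion time on a single shared dispensing module.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Assay:
        name: str
        minutes: float   # estimated processing time on the shared module

    def schedule_spt(assays):
        """Shortest-processing-time-first ordering minimizes mean completion time."""
        ordered = sorted(assays, key=lambda a: a.minutes)
        t, completion = 0.0, {}
        for a in ordered:
            t += a.minutes
            completion[a.name] = t     # time at which each result becomes available
        return ordered, completion

    queue = [Assay("glucose", 4), Assay("lipid panel", 9), Assay("CRP", 6)]
    order, done = schedule_spt(queue)
    print([a.name for a in order], done)
    ```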

  19. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1995

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems

    1996-12-31

    The report comprises a summary of the results of the first three years of the research programme EDISON on distribution automation in Finnish electrical utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of funding is from the Technology Development Centre (Tekes) and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of thirteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1996. (orig.)

  20. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1996

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M. [ed.] [VTT Energy, Espoo (Finland). Energy Systems

    1997-12-31

    The report comprises a summary of the results of the first four years of the research programme EDISON on distribution automation in Finnish utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of the funding is from the Technology Development Centre TEKES and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer associated automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of fifteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1997. (orig.) 43 refs.

  1. EDISON - research programme on electricity distribution automation 1993-1997. Interim report 1995

    Energy Technology Data Exchange (ETDEWEB)

    Lehtonen, M [ed.; VTT Energy, Espoo (Finland). Energy Systems

    1997-12-31

    The report comprises a summary of the results of the first three years of the research programme EDISON on distribution automation in Finnish electrical utilities. The five year research programme (1993-1997) is conducted under the leadership of VTT Energy, in cooperation with universities, distribution companies and the manufacturing industry. The main part of funding is from the Technology Development Centre (Tekes) and from manufacturing companies. The goal of the research programme is to develop a new scheme for a complete distribution automation system, including the network automation, computer systems in the control centre and the customer automation functions. In addition, the techniques for demand side management are developed and integrated into the automation scheme. The final aim is to demonstrate the automation functions and systems of the scheme in real distribution systems. The results of thirteen projects are now given. These results should be considered intermediate, since most projects will be continued in 1996. (orig.)

  2. Evaluation of an automated struvite reactor to recover phosphorus ...

    African Journals Online (AJOL)

    In the present study we attempted to develop a reactor system to recover phosphorus by struvite precipitation that can be installed anywhere in the field without access to a laboratory. A reactor was developed that can run fully automated and recover up to 93% of total phosphorus (total P). Turbidity and conductivity ...

  3. DEWS (DEep White matter hyperintensity Segmentation framework): A fully automated pipeline for detecting small deep white matter hyperintensities in migraineurs.

    Science.gov (United States)

    Park, Bo-Yong; Lee, Mi Ji; Lee, Seung-Hak; Cha, Jihoon; Chung, Chin-Sang; Kim, Sung Tae; Park, Hyunjin

    2018-01-01

    Migraineurs show an increased load of white matter hyperintensities (WMHs) and more rapid deep WMH progression. Previous methods for WMH segmentation have limited efficacy to detect small deep WMHs. We developed a new fully automated detection pipeline, DEWS (DEep White matter hyperintensity Segmentation framework), for small and superficially-located deep WMHs. A total of 148 non-elderly subjects with migraine were included in this study. The pipeline consists of three components: 1) white matter (WM) extraction, 2) WMH detection, and 3) false positive reduction. In WM extraction, we adjusted the WM mask to re-assign misclassified WMHs back to WM using many sequential low-level image processing steps. In WMH detection, the potential WMH clusters were detected using an intensity based threshold and region growing approach. For false positive reduction, the detected WMH clusters were classified into final WMHs and non-WMHs using the random forest (RF) classifier. Size, texture, and multi-scale deep features were used to train the RF classifier. DEWS successfully detected small deep WMHs with a high positive predictive value (PPV) of 0.98 and true positive rate (TPR) of 0.70 in the training and test sets. Similar performance of PPV (0.96) and TPR (0.68) was attained in the validation set. DEWS showed a superior performance in comparison with other methods. Our proposed pipeline is freely available online to help the research community in quantifying deep WMHs in non-elderly adults.
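
    The final DEWS stage classifies candidate clusters into true WMHs and false positives with a random forest trained on size, texture, and deep features. A minimal scikit-learn sketch of that classification step follows; the placeholder features and random labels stand in for the paper's actual feature set and annotated data.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import precision_score, recall_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    # Placeholder cluster features: [size_voxels, mean_intensity, texture_contrast, deep_feature]
    X = rng.normal(size=(500, 4))
    y = rng.integers(0, 2, size=500)           # 1 = true WMH, 0 = false positive

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    pred = clf.predict(X_te)
    # PPV corresponds to precision and TPR to recall in the paper's terminology
    print("PPV:", precision_score(y_te, pred), "TPR:", recall_score(y_te, pred))
    ```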

  4. Strategies for Demand Response in Commercial Buildings

    Energy Technology Data Exchange (ETDEWEB)

    Watson, David S.; Kiliccote, Sila; Motegi, Naoya; Piette, Mary Ann

    2006-06-20

    This paper describes strategies that can be used in commercial buildings to temporarily reduce electric load in response to electric grid emergencies in which supplies are limited or in response to high prices that would be incurred if these strategies were not employed. The demand response strategies discussed herein are based on the results of three years of automated demand response field tests in which 28 commercial facilities with an occupied area totaling over 11 million ft{sup 2} were tested. Although the demand response events in the field tests were initiated remotely and performed automatically, the strategies used could also be initiated by on-site building operators and performed manually, if desired. While energy efficiency measures can be used during normal building operations, demand response measures are transient; they are employed to produce a temporary reduction in demand. Demand response strategies achieve reductions in electric demand by temporarily reducing the level of service in facilities. Heating, ventilating, and air conditioning (HVAC) and lighting are the systems most commonly adjusted for demand response in commercial buildings. The goal of demand response strategies is to meet the electric shed savings targets while minimizing any negative impacts on the occupants of the buildings or the processes that they perform. Occupant complaints were minimal in the field tests. In some cases, "reductions" in service level actually improved occupant comfort or productivity. In other cases, permanent improvements in efficiency were discovered through the planning and implementation of "temporary" demand response strategies. The DR strategies that are available to a given facility are based on factors such as the type of HVAC, lighting and energy management and control systems (EMCS) installed at the site.

  5. Fully automatic multi-language translation with a catalogue of phrases – successful employment for the Swiss avalanche bulletin

    NARCIS (Netherlands)

    Winkler, K.; Kuhn, T.

    2016-01-01

    The Swiss avalanche bulletin is produced twice a day in four languages. Due to the lack of time available for manual translation, a fully automated translation system is employed, based on a catalogue of predefined phrases and predetermined rules of how these phrases can be combined to produce

  6. Bayesian ISOLA: new tool for automated centroid moment tensor inversion

    Science.gov (United States)

    Vackář, Jiří; Burjánek, Jan; Gallovič, František; Zahradník, Jiří; Clinton, John

    2017-04-01

    Focal mechanisms are important for understanding the seismotectonics of a region, and they serve as a basic input for seismic hazard assessment. Usually, the point source approximation and the moment tensor (MT) are used. We have developed a new, fully automated tool for centroid moment tensor (CMT) inversion in a Bayesian framework. It includes automated data retrieval, data selection in which station components with instrumental disturbances or high noise are rejected, and full-waveform inversion in a space-time grid around a provided hypocenter. The method is innovative in the following aspects: (i) The CMT inversion is fully automated; no user interaction is required, although the details of the process can be visually inspected later on many figures which are plotted automatically. (ii) The automated process includes detection of disturbances based on the MouseTrap code, so disturbed recordings do not affect the inversion. (iii) A data covariance matrix calculated from pre-event noise yields an automated weighting of the station recordings according to their noise levels and also serves as an automated frequency filter suppressing noisy frequencies. (iv) A Bayesian approach is used, so not only the best solution is obtained, but also the posterior probability density function. (v) A space-time grid search, effectively combined with the least-squares inversion of moment tensor components, speeds up the inversion and yields more accurate results compared to stochastic methods. The method has been tested on synthetic and observed data. It has been tested by comparison with manually processed moment tensors of all events with M≥3 in the Swiss catalogue over 16 years using data available at the Swiss data center (http://arclink.ethz.ch). The quality of the results of the presented automated process is comparable with careful manual processing of data. The software package programmed in Python has been designed to be as versatile as possible in
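
    Point (v) above combines a grid search over trial centroid positions and times with a linear least-squares solve for the six moment tensor components at each grid node. The sketch below shows that structure under simplifying assumptions (a precomputed Green's-function matrix per node, white noise, variance reduction as the selection criterion); it is not the ISOLA implementation.

    ```python
    import numpy as np

    def grid_search_mt(d, greens_by_node):
        """For each space-time node, solve d ~ G @ m by least squares and keep the
        node with the highest variance reduction. greens_by_node maps a node label
        to its Green's-function matrix G of shape (n_samples, 6)."""
        best = None
        for node, G in greens_by_node.items():
            m, *_ = np.linalg.lstsq(G, d, rcond=None)    # six MT components
            resid = d - G @ m
            vr = 1.0 - (resid @ resid) / (d @ d)         # variance reduction
            if best is None or vr > best[2]:
                best = (node, m, vr)
        return best

    # Synthetic check with one "true" node
    rng = np.random.default_rng(1)
    G_true = rng.normal(size=(300, 6))
    m_true = rng.normal(size=6)
    data = G_true @ m_true + 0.01 * rng.normal(size=300)
    nodes = {"node_A": G_true, "node_B": rng.normal(size=(300, 6))}
    node, m_est, vr = grid_search_mt(data, nodes)
    print(node, round(vr, 3))   # expects node_A with variance reduction close to 1
    ```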

  7. Exploring the use of automated vehicles as last mile connection of train trips through an agent-based simulation model: An application to Delft, Netherlands

    Directory of Open Access Journals (Sweden)

    Arthur Scheltes

    2017-06-01

    The last mile in a public transport trip is known to bring a large disutility for passengers, because the conventional transport modes for this stage of the trip can, in many cases, be rather slow, inflexible and fail to provide a seamless experience. Fully automated vehicles (AVs), that is, those which do not need a driver, could act as a first-mile/last-mile connection to mass public transport modes. In this paper, we study a system that we call Automated Last-Mile Transport (ALMT), which consists of a fleet of small, fully automated, electric vehicles to improve the last-mile performance of a trip made by train. An agent-based simulation model was proposed for the ALMT whereby a dispatching algorithm distributes travel requests amongst the available vehicles using a FIFO sequence and selects a vehicle based on a set of specified control conditions (e.g. travel time to reach a requesting passenger). The model was applied to the case study of the connection between the train station Delft Zuid and the Technological Innovation Campus (Delft, The Netherlands) in order to test the methodology and understand the performance of the system as a function of several operational parameters and demand scenarios. The most important conclusion from the baseline scenario was that the ALMT system was only able to compete with the walking mode and that additional measures were needed to increase the performance of the ALMT system in order to be competitive with cycling. Relocating empty vehicles or allowing pre-booking of vehicles led to a significant reduction in average waiting time, whilst allowing passengers to drive at a higher speed led to a large reduction in average travel time, whilst simultaneously reducing system capacity as energy use is increased.
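
    The dispatching rule in the agent-based model serves travel requests in FIFO order and assigns a vehicle according to control conditions such as the time needed to reach the requesting passenger. A simplified sketch of such a dispatcher follows; the vehicle attributes and the single nearest-available-vehicle criterion are illustrative assumptions, not the published model.

    ```python
    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class Vehicle:
        vid: int
        position_km: float       # position along the corridor (illustrative)
        available: bool = True

    def dispatch_fifo(requests, vehicles, speed_kmh=25.0):
        """Serve requests first-in-first-out; assign the available vehicle with the
        shortest travel time to the requesting passenger."""
        assignments = []
        queue = deque(requests)                  # items are (passenger_id, position_km)
        while queue:
            pid, pos = queue.popleft()
            candidates = [v for v in vehicles if v.available]
            if not candidates:
                queue.appendleft((pid, pos))     # wait until a vehicle frees up
                break
            v = min(candidates, key=lambda v: abs(v.position_km - pos))
            wait_min = abs(v.position_km - pos) / speed_kmh * 60
            v.available = False
            assignments.append((pid, v.vid, round(wait_min, 1)))
        return assignments

    fleet = [Vehicle(1, 0.0), Vehicle(2, 1.2)]
    print(dispatch_fifo([("p1", 0.9), ("p2", 0.1)], fleet))
    ```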

  8. Will the future of knowledge work automation transform personalized medicine?

    Science.gov (United States)

    Naik, Gauri; Bhide, Sanika S

    2014-09-01

    Today, we live in a world of 'information overload', which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate 'routine cognitive tasks' for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments, and that have the ability to learn 'on the fly', is a significant step towards automation of knowledge work. The applications of this technology for high-throughput genomic analysis, database updating, reporting clinically significant variants, and diagnostic imaging purposes are explored using case studies.

  9. Automated Cable Preparation for Robotized Stator Cable Winding

    Directory of Open Access Journals (Sweden)

    Erik Hultman

    2017-04-01

    A method for robotized cable winding of the Uppsala University Wave Energy Converter generator stator has previously been presented and validated. The purpose of this study is to present and validate further developments to the method: automated stand-alone equipment for the preparation of the winding cables. The cable preparation consists of three parts: feeding the cable from a drum, forming the cable end and cutting the cable. Forming and cutting the cable were previously done manually and only small cable drums could be handled. Therefore the robot cell needed to be stopped frequently. The new equipment was tested in an experimental robot stator cable winding setup. Through the experiments, the equipment was validated to be able to perform fully automated and robust cable preparation. Suggestions are also given on how to further develop the equipment with regard to performance, robustness and quality. Hence, this work represents another important step towards demonstrating completely automated robotized stator cable winding.

  10. Automating radiochemistry: Considerations for commercial suppliers of devices

    International Nuclear Information System (INIS)

    Schmidt, D.G.

    1993-01-01

    The fundamental decision to automate a particular radiochemical synthesis for in-house use depends primarily on the demand for the compound and the total number of studies to be carried out with that compound. For a commercial supplier of automated chemistry systems, much more goes into the decision to design, develop and produce a particular automated chemistry system. There is a dramatic difference in design effort between an industrial environment and an academic environment. An in-house system must be built only once and needs only to incrementally simplify the synthesis process. A commercial product must: have reasonable manufacturing costs; be easy to use; be aesthetically pleasing; be easy to install and service; be functionally integral with other equipment sold by the manufacturer; be marketable within the regulatory environment; and address radiation safety issues. This paper discusses issues that guide commercial suppliers in the formation of their product lines.

  11. Understanding reliance on automation: effects of error type, error distribution, age and experience

    Science.gov (United States)

    Sanchez, Julian; Rogers, Wendy A.; Fisk, Arthur D.; Rovira, Ericka

    2015-01-01

    An obstacle detection task supported by “imperfect” automation was used with the goal of understanding the effects of automation error types and age on automation reliance. Sixty younger and sixty older adults interacted with a multi-task simulation of an agricultural vehicle (i.e. a virtual harvesting combine). The simulator included an obstacle detection task and a fully manual tracking task. A micro-level analysis provided insight into the way reliance patterns change over time. The results indicated that there are distinct patterns of reliance that develop as a function of error type. A prevalence of automation false alarms led participants to under-rely on the automation during alarm states while over-relying on it during non-alarm states. Conversely, a prevalence of automation misses led participants to over-rely on automated alarms and under-rely on the automation during non-alarm states. Older adults adjusted their behavior according to the characteristics of the automation similarly to younger adults, although it took them longer to do so. The results of this study suggest that the relationship between automation reliability and reliance depends on the prevalence of specific errors and on the state of the system. Understanding the effects of automation detection criterion settings on human-automation interaction can help designers of automated systems make predictions about human behavior and system performance as a function of the characteristics of the automation. PMID:25642142

  12. Steam generator automated eddy current data analysis: A benchmarking study. Final report

    International Nuclear Information System (INIS)

    Brown, S.D.

    1998-12-01

    The eddy current examination of steam generator tubes is a very demanding process. Challenges include: complex signal analysis, massive amount of data to be reviewed quickly with extreme precision and accuracy, shortages of data analysts during peak periods, and the desire to reduce examination costs. One method to address these challenges is by incorporating automation into the data analysis process. Specific advantages, which automated data analysis has the potential to provide, include the ability to analyze data more quickly, consistently and accurately than can be performed manually. Also, automated data analysis can potentially perform the data analysis function with significantly smaller levels of analyst staffing. Despite the clear advantages that an automated data analysis system has the potential to provide, no automated system has been produced and qualified that can perform all of the functions that utility engineers demand. This report investigates the current status of automated data analysis, both at the commercial and developmental level. A summary of the various commercial and developmental data analysis systems is provided which includes the signal processing methodologies used and, where available, the performance data obtained for each system. Also, included in this report is input from seventeen research organizations regarding the actions required and obstacles to be overcome in order to bring automatic data analysis from the laboratory into the field environment. In order to provide assistance with ongoing and future research efforts in the automated data analysis arena, the most promising approaches to signal processing are described in this report. These approaches include: wavelet applications, pattern recognition, template matching, expert systems, artificial neural networks, fuzzy logic, case based reasoning and genetic algorithms. Utility engineers and NDE researchers can use this information to assist in developing automated data

  13. Conflict Resolution Automation and Pilot Situation Awareness

    Science.gov (United States)

    Dao, Arik-Quang V.; Brandt, Summer L.; Bacon, Paige; Kraut, Josh; Nguyen, Jimmy; Minakata, Katsumi; Raza, Hamzah; Rozovski, David; Johnson, Walter W.

    2010-01-01

    This study compared pilot situation awareness across three traffic management concepts. The concepts varied in terms of the allocation of traffic avoidance responsibility between the pilot on the flight deck, the air traffic controllers, and a conflict resolution automation system. In Concept 1, the flight deck was equipped with conflict resolution tools that enabled the pilots to fully handle the responsibility of weather avoidance and maintaining separation between ownship and surrounding traffic. In Concept 2, pilots were not responsible for traffic separation, but were provided tools for weather and traffic avoidance. In Concept 3, flight deck tools allowed pilots to deviate for weather, but conflict detection tools were disabled. In this concept pilots were dependent on ground-based automation for conflict detection and resolution. Situation awareness of the pilots was measured using online probes. Results showed that individual situation awareness was highest in Concept 1, where the pilots were most engaged, and lowest in Concept 3, where automation was heavily used. These findings suggest that for conflict resolution tasks, situation awareness is improved when pilots remain in the decision-making loop.

  14. Automation of management processes as a factor in the emergence of the jobs of the future

    Directory of Open Access Journals (Sweden)

    Veretehin Vladislav Vadimovich

    2016-03-01

    The article reviews research by domestic and foreign organizations that models the demand for and supply of professions in the labour market. It finds that most management functions are being transferred to automated systems, robots and machines. The article presents a table listing professions that are being replaced by automated systems for the management of objects, and a table listing professions that are being replaced by automated systems for the management of documents. It identifies the professions that will be in demand "tomorrow" (by 2020) and "the day after tomorrow" (after 2020), the retiring professions, and the professions of the future, and describes the professional skills needed for the jobs of the future. Based on research by the Agency for Strategic Initiatives and the Moscow School of Management SKOLKOVO, a TOP 10 of the most in-demand professions of the future is defined, and the need to replace old professions is identified. A table of the jobs of the future is compiled according to the SKOLKOVO research, and the efficiency of transferring management functions to automated systems is discussed.

  15. Automated identification of insect vectors of Chagas disease in Brazil and Mexico: the Virtual Vector Lab

    Directory of Open Access Journals (Sweden)

    Rodrigo Gurgel-Gonçalves

    2017-04-01

    Identification of arthropods important in disease transmission is a crucial, yet difficult, task that can demand considerable training and experience. An important case in point is that of the 150+ species of Triatominae, vectors of Trypanosoma cruzi, causative agent of Chagas disease across the Americas. We present a fully automated system that is able to identify triatomine bugs from Mexico and Brazil with an accuracy consistently above 80%, and with considerable potential for further improvement. The system processes digital photographs from a photo apparatus into landmarks, and uses ratios of measurements among those landmarks, as well as (in a preliminary exploration) two measurements that approximate aspects of coloration, as the basis for classification. This project has thus produced a working prototype that achieves reasonably robust correct identification rates, although many more developments can and will be added, and—more broadly—the project illustrates the value of multidisciplinary collaborations in resolving difficult and complex challenges.
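
    Classification in this system is driven by ratios of distances between photographic landmarks (plus two coarse coloration measurements). The sketch below shows how such scale-free ratio features might be assembled and fed to a generic classifier; the landmark layout, synthetic data, and choice of logistic regression are assumptions for illustration only.

    ```python
    import numpy as np
    from itertools import combinations
    from sklearn.linear_model import LogisticRegression

    def ratio_features(landmarks):
        """landmarks: (n_points, 2) x,y coordinates. Returns all pairwise distances
        divided by the first one, so the features are scale-free ratios."""
        d = np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                      for i, j in combinations(range(len(landmarks)), 2)])
        return d[1:] / d[0]

    # Two synthetic "species" built from differently proportioned landmark templates
    rng = np.random.default_rng(7)
    base_a = rng.normal(size=(8, 2))
    base_b = base_a * np.array([1.4, 0.8])       # different body proportions
    templates = [base_a] * 30 + [base_b] * 30
    X = np.array([ratio_features(t + 0.05 * rng.normal(size=(8, 2))) for t in templates])
    y = np.array([0] * 30 + [1] * 30)
    clf = LogisticRegression(max_iter=1000).fit(X, y)
    print("training accuracy:", clf.score(X, y))
    ```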

  16. ROUTING DEMAND CHANGES TO USERS ON THE WM LATERAL CANAL WITH SACMAN

    Science.gov (United States)

    Most canals have either long travel times or insufficient in-canal storage to operate on-demand. Thus, most flow changes must be routed through the canal. Volume compensation has been proposed as a method for easily applying feedforward control to irrigation canals. SacMan (Software for Automated Ca...
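
    Volume compensation schedules the upstream flow change ahead of the downstream demand change so that the volume routed during the transition matches the change in pool storage. The back-of-the-envelope sketch below illustrates the idea; it is not the SacMan implementation, and the parameter names are assumptions.

    ```python
    def compensation_schedule(delta_q_m3s, travel_time_s, lead_fraction=1.0):
        """Advance the upstream gate change by the delay time so that the extra
        (or withheld) volume matches the downstream demand change.

        delta_q_m3s   : requested change in offtake flow (m^3/s)
        travel_time_s : wave travel time from the head gate to the offtake (s)
        lead_fraction : fraction of the delay compensated by acting early
        """
        lead_time_s = lead_fraction * travel_time_s
        compensation_volume_m3 = delta_q_m3s * lead_time_s
        return lead_time_s, compensation_volume_m3

    lead, vol = compensation_schedule(delta_q_m3s=0.5, travel_time_s=3 * 3600)
    print(f"open the head gate {lead / 3600:.1f} h early; compensating volume = {vol:.0f} m^3")
    ```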

  17. Long-term maintenance of human induced pluripotent stem cells by automated cell culture system.

    Science.gov (United States)

    Konagaya, Shuhei; Ando, Takeshi; Yamauchi, Toshiaki; Suemori, Hirofumi; Iwata, Hiroo

    2015-11-17

    Pluripotent stem cells, such as embryonic stem cells and induced pluripotent stem (iPS) cells, are regarded as new sources for cell replacement therapy. These cells can be expanded without limit under undifferentiated conditions and can be differentiated into multiple cell types. Automated culture systems enable the large-scale production of cells. In addition to reducing the time and effort of researchers, an automated culture system improves the reproducibility of cell cultures. In the present study, we designed a new, fully automated cell culture system for human iPS cell maintenance. Using the automated culture system, hiPS cells maintained their undifferentiated state for 60 days. Automatically prepared hiPS cells retained the potential to differentiate into cells of all three germ layers, including dopaminergic neurons and pancreatic cells.

  18. Development of Hardware and Software for Automated Ultrasonic Testing

    International Nuclear Information System (INIS)

    Choi, Sung Nam; Lee, Hee Jong; Yang, Seung Ok

    2012-01-01

    Nondestructive testing (NDT) for the construction and operation of NPPs plays an important role in confirming the integrity of the NPPs. In particular, automated ultrasonic testing (AUT) is one of the primary nondestructive examination methods for in-service inspection of the welded parts of major components in NPPs. AUT is a reliable nondestructive testing method because its data are saved and can be reviewed by other examiners. Korea Hydro and Nuclear Power-Central Research Institute (KHNP-CRI) has developed an automated ultrasonic testing (AUT) system based on a high-speed pulser-receiver. In combination with the designed software and hardware architecture, this new system permits user configurations for a wide range of user-specific applications through fully automated inspections using compact portable systems with up to eight channels. This paper gives an overview of the hardware (H/W) and software (S/W) of the AUT system for inspecting welds in NPPs.

  19. Remote sensing inputs to water demand modeling

    Science.gov (United States)

    Estes, J. E.; Jensen, J. R.; Tinney, L. R.; Rector, M.

    1975-01-01

    In an attempt to determine the ability of remote sensing techniques to economically generate data required by water demand models, the Geography Remote Sensing Unit, in conjunction with the Kern County Water Agency of California, developed an analysis model. As a result it was determined that agricultural cropland inventories utilizing both high altitude photography and LANDSAT imagery can be conducted cost effectively. In addition, by using average irrigation application rates in conjunction with cropland data, estimates of agricultural water demand can be generated. However, more accurate estimates are possible if crop type, acreage, and crop specific application rates are employed. An analysis of the effect of saline-alkali soils on water demand in the study area is also examined. Finally, reference is made to the detection and delineation of water tables that are perched near the surface by semi-permeable clay layers. Soil salinity prediction, automated crop identification on a by-field basis, and a potential input to the determination of zones of equal benefit taxation are briefly touched upon.

  20. Laboratory automation of high-quality and efficient ligand-binding assays for biotherapeutic drug development.

    Science.gov (United States)

    Wang, Jin; Patel, Vimal; Burns, Daniel; Laycock, John; Pandya, Kinnari; Tsoi, Jennifer; DeSilva, Binodh; Ma, Mark; Lee, Jean

    2013-07-01

    Regulated bioanalytical laboratories that run ligand-binding assays in support of biotherapeutics development face ever-increasing demand to support more projects with increased efficiency. Laboratory automation is a tool that has the potential to improve both quality and efficiency in a bioanalytical laboratory. The success of laboratory automation requires thoughtful evaluation of program needs and fit-for-purpose strategies, followed by pragmatic implementation plans and continuous user support. In this article, we present the development of fit-for-purpose automation of total walk-away and flexible modular modes. We shared the sustaining experience of vendor collaboration and team work to educate, promote and track the use of automation. The implementation of laboratory automation improves assay performance, data quality, process efficiency and method transfer to CRO in a regulated bioanalytical laboratory environment.

  1. Responsiveness of residential electricity demand to dynamic tariffs: Experiences from a large field test in the Netherlands

    OpenAIRE

    Klaassen, EAM; Kobus, C.B.A.; Frunt, J; Slootweg, JG

    2016-01-01

    To efficiently facilitate the energy transition it is essential to evaluate the potential of demand response in practice. Based on the results of a Dutch smart grid pilot, this paper assesses the potential of both manual and semi-automated demand response in residential areas. To stimulate demand response, a dynamic tariff and smart appliances were used. The participating households were informed about the tariff day-ahead through a home energy management system, connected to a display instal...

  2. Closed-Loop Real-Time Imaging Enables Fully Automated Cell-Targeted Patch-Clamp Neural Recording In Vivo.

    Science.gov (United States)

    Suk, Ho-Jun; van Welie, Ingrid; Kodandaramaiah, Suhasa B; Allen, Brian; Forest, Craig R; Boyden, Edward S

    2017-08-30

    Targeted patch-clamp recording is a powerful method for characterizing visually identified cells in intact neural circuits, but it requires skill to perform. We previously developed an algorithm that automates "blind" patching in vivo, but full automation of visually guided, targeted in vivo patching has not been demonstrated, with currently available approaches requiring human intervention to compensate for cell movement as a patch pipette approaches a targeted neuron. Here we present a closed-loop real-time imaging strategy that automatically compensates for cell movement by tracking cell position and adjusting pipette motion while approaching a target. We demonstrate our system's ability to adaptively patch, under continuous two-photon imaging and real-time analysis, fluorophore-expressing neurons of multiple types in the living mouse cortex, without human intervention, with yields comparable to skilled human experimenters. Our "imagepatching" robot is easy to implement and will help enable scalable characterization of identified cell types in intact neural circuits. Copyright © 2017 Elsevier Inc. All rights reserved.
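
    The closed-loop strategy alternates imaging, re-estimating the target cell's position, and re-aiming the pipette before each movement step. A schematic control loop is sketched below; the acquisition, tracking, and motion callbacks are hypothetical placeholders standing in for the two-photon imaging and pipette actuator hardware.

    ```python
    import numpy as np

    def imagepatch_loop(acquire_frame, locate_cell, move_pipette, target0,
                        step_um=2.0, contact_tol_um=1.0, max_steps=500):
        """Closed-loop approach: re-estimate the cell position each step and steer
        the pipette toward the updated target (hardware callbacks are hypothetical)."""
        pipette = np.array([0.0, 0.0, 50.0])          # start 50 um above the cell (illustrative)
        target = np.array(target0, dtype=float)
        for _ in range(max_steps):
            frame = acquire_frame()                   # two-photon frame (placeholder)
            target = locate_cell(frame, previous=target)   # compensate cell movement
            vec = target - pipette
            dist = np.linalg.norm(vec)
            if dist < contact_tol_um:
                return pipette                        # close enough to attempt a seal
            pipette = pipette + step_um * vec / dist  # one small step toward the cell
            move_pipette(pipette)
        raise RuntimeError("target not reached")

    # Toy usage with stub callbacks simulating a slowly drifting cell
    cell = np.array([10.0, 5.0, 0.0])
    def acquire_frame():
        return None
    def locate_cell(frame, previous):
        cell[:] += np.random.normal(scale=0.1, size=3)    # simulated drift
        return cell.copy()
    def move_pipette(p):
        pass
    print(imagepatch_loop(acquire_frame, locate_cell, move_pipette, cell))
    ```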

  3. Evaluation of a fully automated treponemal test and comparison with conventional VDRL and FTA-ABS tests.

    Science.gov (United States)

    Park, Yongjung; Park, Younhee; Joo, Shin Young; Park, Myoung Hee; Kim, Hyon-Suk

    2011-11-01

    We evaluated analytic performances of an automated treponemal test and compared this test with the Venereal Disease Research Laboratory test (VDRL) and fluorescent treponemal antibody absorption test (FTA-ABS). Precision performance of the Architect Syphilis TP assay (TP; Abbott Japan, Tokyo, Japan) was assessed, and 150 serum samples were assayed with the TP before and after heat inactivation to estimate the effect of heat inactivation. A total of 616 specimens were tested with the FTA-ABS and TP, and 400 were examined with the VDRL. The TP showed good precision performance with total imprecision of less than a 10% coefficient of variation. An excellent linear relationship between results before and after heat inactivation was observed (R(2) = 0.9961). The FTA-ABS and TP agreed well with a κ coefficient of 0.981. The concordance rate between the FTA-ABS and TP was the highest (99.0%), followed by the rates between FTA-ABS and VDRL (85.0%) and between TP and VDRL (83.8%). The automated TP assay may be adequate for screening for syphilis in a large volume of samples and can be an alternative to FTA-ABS.
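
    Agreement between the automated TP assay and FTA-ABS is summarized above by the overall concordance rate and the kappa coefficient. The sketch below computes both statistics from a 2x2 contingency table; the cell counts are illustrative, not the study data.

    ```python
    import numpy as np

    def concordance_and_kappa(table):
        """table: 2x2 counts [[both_pos, a_pos_b_neg], [a_neg_b_pos, both_neg]]."""
        table = np.asarray(table, dtype=float)
        n = table.sum()
        p_observed = np.trace(table) / n                           # concordance rate
        p_chance = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2
        kappa = (p_observed - p_chance) / (1 - p_chance)
        return p_observed, kappa

    # Illustrative counts only, not the 616 specimens of the study
    po, kappa = concordance_and_kappa([[120, 2], [4, 490]])
    print(f"concordance = {po:.3f}, kappa = {kappa:.3f}")
    ```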

  4. Extensible automated dispersive liquid–liquid microextraction

    Energy Technology Data Exchange (ETDEWEB)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang, E-mail: hxgao@cau.edu.cn

    2015-05-04

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as model method. • SPE columns packed with nonwoven polypropylene fiber was used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C{sub 8}MIM]NTf{sub 2}) is formed through the reaction between [C{sub 8}MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf{sub 2}) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL{sup −1}. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL{sup −1}. The proposed

  5. Extensible automated dispersive liquid–liquid microextraction

    International Nuclear Information System (INIS)

    Li, Songqing; Hu, Lu; Chen, Ketao; Gao, Haixiang

    2015-01-01

    Highlights: • An extensible automated dispersive liquid–liquid microextraction was developed. • A fully automatic SPE workstation with a modified operation program was used. • Ionic liquid-based in situ DLLME was used as model method. • SPE columns packed with nonwoven polypropylene fiber was used for phase separation. • The approach was applied to the determination of benzoylurea insecticides in water. - Abstract: In this study, a convenient and extensible automated ionic liquid-based in situ dispersive liquid–liquid microextraction (automated IL-based in situ DLLME) was developed. 1-Octyl-3-methylimidazolium bis[(trifluoromethane)sulfonyl]imide ([C8MIM]NTf2) is formed through the reaction between [C8MIM]Cl and lithium bis[(trifluoromethane)sulfonyl]imide (LiNTf2) to extract the analytes. Using a fully automatic SPE workstation, special SPE columns packed with nonwoven polypropylene (NWPP) fiber, and a modified operation program, the procedures of the IL-based in situ DLLME, including the collection of a water sample, injection of an ion exchange solvent, phase separation of the emulsified solution, elution of the retained extraction phase, and collection of the eluent into vials, can be performed automatically. The developed approach, coupled with high-performance liquid chromatography–diode array detection (HPLC–DAD), was successfully applied to the detection and concentration determination of benzoylurea (BU) insecticides in water samples. Parameters affecting the extraction performance were investigated and optimized. Under the optimized conditions, the proposed method achieved extraction recoveries of 80% to 89% for water samples. The limits of detection (LODs) of the method were in the range of 0.16–0.45 ng mL−1. The intra-column and inter-column relative standard deviations (RSDs) were <8.6%. Good linearity (r > 0.9986) was obtained over the calibration range from 2 to 500 ng mL−1. The proposed method opens a new avenue

  6. E-health, phase two: the imperative to integrate process automation with communication automation for large clinical reference laboratories.

    Science.gov (United States)

    White, L; Terner, C

    2001-01-01

    The initial efforts of e-health have fallen far short of expectations. They were buoyed by the hype and excitement of the Internet craze but limited by their lack of understanding of important market and environmental factors. E-health now recognizes that legacy systems and processes are important, that there is a technology adoption process that needs to be followed, and that demonstrable value drives adoption. Initial e-health transaction solutions have targeted mostly low-cost problems. These solutions invariably are difficult to integrate into existing systems, typically requiring manual interfacing to supported processes. This limitation in particular makes them unworkable for large volume providers. To meet the needs of these providers, e-health companies must rethink their approaches, appropriately applying technology to seamlessly integrate all steps into existing business functions. E-automation is a transaction technology that automates steps, integration of steps, and information communication demands, resulting in comprehensive automation of entire business functions. We applied e-automation to create a billing management solution for clinical reference laboratories. Large volume, onerous regulations, small margins, and only indirect access to patients challenge large laboratories' billing departments. Couple these problems with outmoded, largely manual systems and it becomes apparent why most laboratory billing departments are in crisis. Our approach has been to focus on the most significant and costly problems in billing: errors, compliance, and system maintenance and management. The core of the design relies on conditional processing, a "universal" communications interface, and ASP technologies. The result is comprehensive automation of all routine processes, driving out errors and costs. Additionally, compliance management and billing system support and management costs are dramatically reduced. The implications of e-automated processes can extend

  7. Harnessing Vehicle Automation for Public Mobility -- An Overview of Ongoing Efforts

    Energy Technology Data Exchange (ETDEWEB)

    Young, Stanley E.

    2015-11-05

    This presentation takes a look at efforts to harness automated vehicle technology for public transport. The European CityMobil2 is the leading demonstration project, in which automated shuttles were, or are planned to be, demonstrated in several cities and regions. The presentation provides a brief overview of the demonstrations at Oristano, Italy (July 2014), La Rochelle, France (Dec 2014), Lausanne, Switzerland (Apr 2015), Vantaa, Finland (July 2015), and Trikala, Greece (Sept 2015). In addition to technology exposition, the objectives included generating a legal framework for operation in each location and gauging the reaction of the public to unmanned shuttles, both of which were successfully achieved. Several such demonstrations are planned throughout the world, including efforts in North America in conjunction with the GoMentum Station in California. These early demonstrations with low-speed automated shuttles provide a glimpse of what is possible with a fully automated fleet of driverless vehicles providing a public transit service.

  8. Note: Automated electrochemical etching and polishing of silver scanning tunneling microscope tips.

    Science.gov (United States)

    Sasaki, Stephen S; Perdue, Shawn M; Rodriguez Perez, Alejandro; Tallarida, Nicholas; Majors, Julia H; Apkarian, V Ara; Lee, Joonhee

    2013-09-01

    Fabrication of sharp and smooth Ag tips is crucial in optical scanning probe microscope experiments. To ensure reproducible tip profiles, the polishing process is fully automated using a closed-loop laminar flow system to deliver the electrolytic solution to moving electrodes mounted on a motorized translational stage. The repetitive translational motion is controlled precisely on the μm scale with a stepper motor and screw-thread mechanism. The automated setup allows reproducible control over the tip profile and improves smoothness and sharpness of tips (radius 27 ± 18 nm), as measured by ultrafast field emission.

  9. Demand Response Opportunities in Industrial Refrigerated Warehouses in California

    Energy Technology Data Exchange (ETDEWEB)

    Goli, Sasank; McKane, Aimee; Olsen, Daniel

    2011-06-14

    Industrial refrigerated warehouses that have implemented energy efficiency measures and have centralized control systems can be excellent candidates for Automated Demand Response (Auto-DR) due to equipment synergies and the receptivity of facility managers to strategies that control energy costs without disrupting facility operations. Auto-DR uses the OpenADR protocol for continuous and open communication signals over the Internet, allowing facilities to automate their Demand Response (DR). Refrigerated warehouses were selected for research because they have significant power demand, especially during utility peak periods; most processes are not sensitive to short-term (2-4 hour) power reductions, so DR activities are often not disruptive to facility operations; the number of processes is limited and well understood; and past experience with DR strategies successful in commercial buildings may apply to refrigerated warehouses. This paper presents an overview of the potential for load sheds and shifts from baseline electricity use in response to DR events, along with the physical configurations and operating characteristics of refrigerated warehouses. Analysis of data from two case studies and nine facilities in Pacific Gas and Electric territory confirmed the DR abilities inherent to refrigerated warehouses but showed significant variation across facilities. Further, while the load from California's refrigerated warehouses in 2008 was 360 MW, with an estimated DR potential of 45-90 MW, the reduction actually achieved was much less due to low participation. Efforts to overcome barriers to increased participation may include improved marketing and recruitment of potential DR sites, better alignment with and emphasis on the financial benefits of participation, and use of Auto-DR to increase consistency of participation.
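
    The control logic behind Auto-DR in a refrigerated warehouse is essentially: when a DR event window is active, float the storage temperature upward (within product-safety limits) to shed compressor load, then return to normal afterwards. The sketch below illustrates only that loop; the poll_dr_signal and apply_setpoint functions, the offsets, and the polling interval are hypothetical placeholders, not the OpenADR client API or any facility's actual control sequence.

        import time
        from dataclasses import dataclass
        from typing import Optional

        @dataclass
        class DREvent:
            start_ts: float      # epoch seconds
            end_ts: float
            shed_level: str      # e.g. "moderate" or "high"

        def poll_dr_signal() -> Optional[DREvent]:
            """Placeholder for receiving an Auto-DR event (e.g., from an OpenADR client).
            In practice this would be wired to the utility's DR server; None means no event."""
            return None

        def apply_setpoint(offset_c: float) -> None:
            """Placeholder for writing a temperature setpoint offset to the refrigeration
            control system (the EMCS/SCADA integration is site-specific)."""
            print(f"Setpoint offset now {offset_c:+.1f} C")

        NORMAL_OFFSET = 0.0
        SHED_OFFSETS = {"moderate": 1.0, "high": 2.0}   # hypothetical float-up amounts, in C

        while True:
            event = poll_dr_signal()
            now = time.time()
            if event and event.start_ts <= now <= event.end_ts:
                # During the event, float the cold-storage temperature upward to shed compressor load.
                apply_setpoint(SHED_OFFSETS.get(event.shed_level, 1.0))
            else:
                apply_setpoint(NORMAL_OFFSET)
            time.sleep(300)   # re-evaluate every 5 minutes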

  10. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    International Nuclear Information System (INIS)

    Jochimsen, Thies H.; Zeisig, Vilia; Schulz, Jessica; Werner, Peter; Patt, Marianne; Patt, Jörg; Dreyer, Antje Y.; Boltze, Johannes; Barthel, Henryk; Sabri, Osama; Sattler, Bernhard

    2016-01-01

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with O15-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5- and 5-min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than in the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22 % higher than CBF obtained by using the AIF yielded by blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the
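
    The steady-state cross-calibration reported here (0.98 ± 0.10) is simply the ratio of the mean activity concentrations from arterial samples and from the IDIF over the 2.5–5 min window. A minimal sketch of that comparison follows; the time-activity curves are fabricated for illustration, and the direction of the ratio is a convention rather than something taken from the paper.

        import numpy as np

        # Hypothetical time-activity curves (time in seconds, activity in kBq/mL)
        t = np.arange(0, 330, 10, dtype=float)
        idif  = np.interp(t, [0, 30, 60, 300], [0, 120, 40, 25])   # image-derived input function
        blood = np.interp(t, [0, 60, 90, 300], [0, 90, 45, 26])    # manual arterial samples

        # Steady-state window used in the study: 2.5-5 min post injection
        mask = (t >= 150) & (t <= 300)
        cross_cal = blood[mask].mean() / idif[mask].mean()
        print(f"cross-calibration factor ~ {cross_cal:.2f}")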

  11. Gene Expression Measurement Module (GEMM) - A Fully Automated, Miniaturized Instrument for Measuring Gene Expression in Space

    Science.gov (United States)

    Pohorille, Andrew; Peyvan, Kia; Karouia, Fathi; Ricco, Antonio

    2012-01-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of high-value experiments on the influence of the space environment on biological systems. For example, measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, and determine the metabolic bases of microbial pathogenicity and drug resistance. These and other applications hold significant potential for discoveries in space biology, biotechnology, and medicine. Supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of in-situ measurement of expression of several hundreds of microbial genes from multiple samples. The instrument will be capable of (1) lysing cell walls of bacteria sampled from cultures grown in space, (2) extracting and purifying RNA released from cells, (3) hybridizing the RNA on a microarray and (4) providing readout of the microarray signal, all in a single microfluidics cartridge. The device is suitable for deployment on nanosatellite platforms developed by NASA Ames' Small Spacecraft Division. To meet space and other technical constraints imposed by these platforms, a number of technical innovations are being implemented. The integration and end-to-end technological and biological validation of the instrument are carried out using as a model the photosynthetic bacterium Synechococcus elongatus, known for its remarkable metabolic diversity and resilience to adverse conditions. Each step in the measurement process-lysis, nucleic acid extraction, purification, and hybridization to an array-is assessed through comparison of the results obtained using the instrument with

  12. Fully automated calculation of image-derived input function in simultaneous PET/MRI in a sheep model

    Energy Technology Data Exchange (ETDEWEB)

    Jochimsen, Thies H.; Zeisig, Vilia [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Schulz, Jessica [Max Planck Institute for Human Cognitive and Brain Sciences, Stephanstr. 1a, Leipzig, D-04103 (Germany); Werner, Peter; Patt, Marianne; Patt, Jörg [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany); Dreyer, Antje Y. [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Boltze, Johannes [Fraunhofer Institute of Cell Therapy and Immunology, Perlickstr. 1, Leipzig, D-04103 (Germany); Translational Centre for Regenerative Medicine, University Leipzig, Philipp-Rosenthal-Str. 55, Leipzig, D-04103 (Germany); Fraunhofer Research Institution of Marine Biotechnology and Institute for Medical and Marine Biotechnology, University of Lübeck, Lübeck (Germany); Barthel, Henryk; Sabri, Osama; Sattler, Bernhard [Department of Nuclear Medicine, Leipzig University Hospital, Liebigstr. 18, Leipzig (Germany)

    2016-02-13

    Obtaining the arterial input function (AIF) from image data in dynamic positron emission tomography (PET) examinations is a non-invasive alternative to arterial blood sampling. In simultaneous PET/magnetic resonance imaging (PET/MRI), high-resolution MRI angiographies can be used to define major arteries for correction of partial-volume effects (PVE) and point spread function (PSF) response in the PET data. The present study describes a fully automated method to obtain the image-derived input function (IDIF) in PET/MRI. Results are compared to those obtained by arterial blood sampling. To segment the trunk of the major arteries in the neck, a high-resolution time-of-flight MRI angiography was postprocessed by a vessel-enhancement filter based on the inertia tensor. Together with the measured PSF of the PET subsystem, the arterial mask was used for geometrical deconvolution, yielding the time-resolved activity concentration averaged over a major artery. The method was compared to manual arterial blood sampling at the hind leg of 21 sheep (animal stroke model) during measurement of blood flow with O15-water. Absolute quantification of activity concentration was compared after bolus passage during steady state, i.e., between 2.5- and 5-min post injection. Cerebral blood flow (CBF) values from blood sampling and IDIF were also compared. The cross-calibration factor obtained by comparing activity concentrations in blood samples and IDIF during steady state is 0.98 ± 0.10. In all examinations, the IDIF provided a much earlier and sharper bolus peak than in the time course of activity concentration obtained by arterial blood sampling. CBF using the IDIF was 22 % higher than CBF obtained by using the AIF yielded by blood sampling. The small deviation between arterial blood sampling and IDIF during steady state indicates that correction of PVE and PSF is possible with the method presented. The differences in bolus dynamics and, hence, CBF values can be explained by the

  13. Automated comparison of Bayesian reconstructions of experimental profiles with physical models

    International Nuclear Information System (INIS)

    Irishkin, Maxim

    2014-01-01

    In this work we developed an expert system that carries out, in an integrated and fully automated way, (i) a reconstruction of plasma profiles from the measurements using Bayesian analysis, (ii) a prediction of the reconstructed quantities according to some models, and (iii) an intelligent comparison of the first two steps. This system includes systematic checking of the internal consistency of the reconstructed quantities, enables automated model validation and, if a well-validated model is used, can be applied to help detect interesting new physics in an experiment. The work shows three applications of this quite general system. The expert system can successfully detect failures in the automated plasma reconstruction and provide (on successful reconstruction cases) statistics of agreement of the models with the experimental data, i.e. information on the model validity. (author) [fr
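
    The "intelligent comparison" step can be thought of, in its simplest generic form, as checking whether a model prediction agrees with the Bayesian-reconstructed profile within its uncertainties. The sketch below uses a reduced chi-square test for that purpose; it is an assumed stand-in rather than the expert system's actual criteria, and the profile data are synthetic.

        import numpy as np
        from scipy import stats

        def agreement(reconstructed, sigma, predicted, n_params=0):
            """Reduced chi-square and p-value for a model vs. a reconstructed profile.
            A generic consistency check, not the specific criteria of the described system."""
            reconstructed, sigma, predicted = map(np.asarray, (reconstructed, sigma, predicted))
            chi2 = np.sum(((reconstructed - predicted) / sigma) ** 2)
            dof = reconstructed.size - n_params
            return chi2 / dof, stats.chi2.sf(chi2, dof)

        # Hypothetical radial profile comparison
        rho = np.linspace(0, 1, 20)
        recon = 5.0 * (1 - rho**2) + np.random.normal(0, 0.2, rho.size)   # "reconstructed" profile
        model = 5.0 * (1 - rho**2) ** 1.1                                 # model prediction
        red_chi2, p = agreement(recon, np.full(rho.size, 0.2), model)
        print(f"reduced chi^2 = {red_chi2:.2f}, p = {p:.3f}")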

  14. How automated image analysis techniques help scientists in species identification and classification?

    Science.gov (United States)

    Yousef Kalafi, Elham; Town, Christopher; Kaur Dhillon, Sarinder

    2017-09-04

    Identification of taxonomy at a specific level is time consuming and reliant upon expert ecologists. Hence, the demand for automated species identification has increased over the last two decades. Automation of data classification is primarily focussed on images; incorporating and analysing image data has recently become easier due to developments in computational technology. Research efforts in species identification include processing specimen images, extracting distinguishing features, and classifying specimens into the correct categories. In this paper, we discuss recent automated species identification systems, categorizing and evaluating their methods. We reviewed and compared different methods in a step-by-step scheme of automated identification and classification systems for species images. The selection of methods is influenced by many variables such as the level of classification, the number of training data and the complexity of the images. The aim of this paper is to provide researchers and scientists with an extensive background study on work related to automated species identification, focusing on pattern recognition techniques for building such systems for biodiversity studies.
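
    A typical baseline pipeline in this literature is: extract numeric features from specimen images, train a classifier, and report per-class accuracy. The sketch below assumes precomputed feature vectors and uses a scaled SVM as the classifier; the feature matrix and species labels are random placeholders, and the choice of SVM is illustrative rather than a method endorsed by the review.

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC
        from sklearn.metrics import classification_report

        # Hypothetical feature matrix: each row is one specimen image described by
        # precomputed features (e.g., shape, texture, colour statistics); labels are species.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 32))
        y = rng.integers(0, 3, size=300)            # three dummy species classes

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

        # A common baseline for automated species identification: scale features, then SVM.
        clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
        clf.fit(X_tr, y_tr)
        print(classification_report(y_te, clf.predict(X_te)))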

  15. Automated Demand Response Approaches to Household Energy Management in a Smart Grid Environment

    Science.gov (United States)

    Adika, Christopher Otieno

    The advancement of renewable energy technologies and the deregulation of the electricity market have seen the emergence of demand response (DR) programs. Demand response is a cost-effective load management strategy which enables electricity suppliers to maintain the integrity of the power grid during peak periods, when the customers' electrical load is high. DR programs are designed to influence electricity users to alter their normal consumption patterns by offering them financial incentives. A well-designed incentive-based DR scheme that offers a competitive electricity pricing structure can result in numerous benefits to all the players in the electricity market. Lower power consumption during peak periods will significantly enhance the robustness of constrained networks by reducing the amount of generation and transmission infrastructure needed to provide electric service. This will ease the pressure to build new power networks while avoiding costly energy procurements, translating into substantial financial savings for the power suppliers. Peak load reduction will also reduce the inconveniences suffered by end users as a result of brownouts or blackouts. Demand response will also drastically lower the price peaks associated with wholesale markets. This will in turn reduce electricity costs and risks for all the players in the energy market. Additionally, DR is environmentally friendly since it enhances the flexibility of the power grid through accommodation of renewable energy resources. Despite its many benefits, DR has not been embraced by most electricity networks. This can be attributed to the fact that the existing programs do not provide enough incentives to the end users and, therefore, most electricity users are not willing to participate in them. To overcome these challenges, most utilities are coming up with innovative strategies that will be more attractive to their customers. Thus, this dissertation presents various

  16. Automation of the National Water Quality Laboratories, U. S. Geological Survey. I. Description of laboratory functions and definition of the automation project

    Energy Technology Data Exchange (ETDEWEB)

    Morris, W.F.; Ames, H.S.

    1977-07-01

    In January 1976, the Water Resources Division of the U.S. Geological Survey asked Lawrence Livermore Laboratory to conduct a feasibility study for automation of the National Water Quality (NWQ) Laboratory in Denver, Colorado (formerly Denver Central Laboratory). Results of the study were published in the Feasibility Study for Automation of the Central Laboratories, Lawrence Livermore Laboratory, Rept. UCRL-52001 (1976). Because the present system for processing water samples was found inadequate to meet the demands of a steadily increasing workload, new automation was recommended. In this document we present details necessary for future implementation of the new system, as well as descriptions of current laboratory automatic data processing and analytical facilities to better define the scope of the project and illustrate what the new system will accomplish. All pertinent inputs, outputs, and other operations that define the project are shown in functional designs.

  17. Automation of the National Water Quality Laboratories, U.S. Geological Survey. I. Description of laboratory functions and definition of the automation project

    International Nuclear Information System (INIS)

    Morris, W.F.; Ames, H.S.

    1977-01-01

    In January 1976, the Water Resources Division of the U.S. Geological Survey asked Lawrence Livermore Laboratory to conduct a feasibility study for automation of the National Water Quality (NWQ) Laboratory in Denver, Colorado (formerly Denver Central Laboratory). Results of the study were published in the Feasibility Study for Automation of the Central Laboratories, Lawrence Livermore Laboratory, Rept. UCRL-52001 (1976). Because the present system for processing water samples was found inadequate to meet the demands of a steadily increasing workload, new automation was recommended. In this document we present details necessary for future implementation of the new system, as well as descriptions of current laboratory automatic data processing and analytical facilities to better define the scope of the project and illustrate what the new system will accomplish. All pertinent inputs, outputs, and other operations that define the project are shown in functional designs

  18. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    International Nuclear Information System (INIS)

    El-Alaily, T.M.; El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M.; Assar, S.T.

    2015-01-01

    A low-cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of some ferrite samples measured by two scientifically calibrated magnetometers: model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our lab-built VSM design proved successful and reliable. - Highlights: • A low-cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using some measured ferrite samples. • Our lab-built VSM design proved successful and reliable
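
    VSM readings are usually converted to absolute magnetic moment by measuring a reference sample of known moment and scaling all subsequent signals by the resulting factor. The sketch below shows only that generic calibration arithmetic; the reference moment, signal voltages and field values are invented placeholders, not the calibration data of this instrument.

        import numpy as np

        # Calibration against a reference sample of known magnetic moment (emu).
        # Numbers are illustrative placeholders, not values from the paper.
        ref_moment_emu = 6.92          # known saturation moment of the reference sample
        ref_signal_v = 0.173           # detected signal for the reference sample

        cal_factor = ref_moment_emu / ref_signal_v   # emu per volt of signal

        # Convert a measured hysteresis signal (V) into moment (emu)
        field_G = np.linspace(-8300, 8300, 5)        # applied field up to ~8.3 kG
        signal_v = np.array([-0.160, -0.120, 0.0, 0.120, 0.160])
        moment_emu = cal_factor * signal_v
        print(list(zip(field_G, np.round(moment_emu, 2))))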

  19. Construction and calibration of a low cost and fully automated vibrating sample magnetometer

    Energy Technology Data Exchange (ETDEWEB)

    El-Alaily, T.M., E-mail: toson_alaily@yahoo.com [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); El-Nimr, M.K.; Saafan, S.A.; Kamel, M.M.; Meaz, T.M. [Physics Department, Faculty of Science, Tanta University, Tanta (Egypt); Assar, S.T. [Engineering Physics and Mathematics Department, Faculty of Engineering, Tanta University, Tanta (Egypt)

    2015-07-15

    A low-cost vibrating sample magnetometer (VSM) has been constructed using an electromagnet and an audio loudspeaker, both controlled by a data acquisition device. The constructed VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. The apparatus has been calibrated and tested using magnetic hysteresis data of some ferrite samples measured by two scientifically calibrated magnetometers: model Lake Shore 7410 and model LDJ Electronics Inc. (Troy, MI). Our lab-built VSM design proved successful and reliable. - Highlights: • A low-cost automated vibrating sample magnetometer (VSM) has been constructed. • The VSM records the magnetic hysteresis loop up to 8.3 kG at room temperature. • The VSM has been calibrated and tested using some measured ferrite samples. • Our lab-built VSM design proved successful and reliable.

  20. Service-oriented architectural framework for support and automation of collaboration tasks

    Directory of Open Access Journals (Sweden)

    Ana Sasa

    2011-06-01

    Due to increasingly demanding requirements for business flexibility and agility, automation of end-to-end industrial processes has become an important topic. Systems supporting business process execution need to enable automated task execution as well as integrate human-performed tasks (human tasks) into a business process. In this paper, we focus on collaboration tasks, which are an important type of composite human task. We propose a service-oriented architectural framework describing a service responsible for human task execution (Human task service), which not only implements collaboration tasks but also improves their execution through automated and semi-automated decision making and collaboration based on ontologies and agent technology. The approach is very generic and can be used for any type of business process. A case study was performed for a human-task-intensive business process from the electric power transmission domain.

  1. Applying Dataflow Architecture and Visualization Tools to In Vitro Pharmacology Data Automation.

    Science.gov (United States)

    Pechter, David; Xu, Serena; Kurtz, Marc; Williams, Steven; Sonatore, Lisa; Villafania, Artjohn; Agrawal, Sony

    2016-12-01

    The pace and complexity of modern drug discovery place ever-increasing demands on scientists for data analysis and interpretation. Data flow programming and modern visualization tools address these demands directly. Three different requirements - one for allosteric modulator analysis, one for a specialized clotting analysis, and one for enzyme global progress curve analysis - are reviewed, and their execution in a combined data flow/visualization environment is outlined. © 2016 Society for Laboratory Automation and Screening.

  2. A Fully Automated Web-Based Program Improves Lifestyle Habits and HbA1c in Patients With Type 2 Diabetes and Abdominal Obesity: Randomized Trial of Patient E-Coaching Nutritional Support (The ANODE Study).

    Science.gov (United States)

    Hansel, Boris; Giral, Philippe; Gambotti, Laetitia; Lafourcade, Alexandre; Peres, Gilbert; Filipecki, Claude; Kadouch, Diana; Hartemann, Agnes; Oppert, Jean-Michel; Bruckert, Eric; Marre, Michel; Bruneel, Arnaud; Duchene, Emilie; Roussel, Ronan

    2017-11-08

    The prevalence of abdominal obesity and type 2 diabetes mellitus (T2DM) is a public health challenge. New solutions need to be developed to help patients implement lifestyle changes. The objective of the study was to evaluate a fully automated Web-based intervention designed to help users improve their dietary habits and increase their physical activity. The Accompagnement Nutritionnel de l'Obésité et du Diabète par E-coaching (ANODE) study was a 16-week, 1:1 parallel-arm, open-label randomized clinical trial. Patients with T2DM and abdominal obesity (n=120, aged 18-75 years) were recruited. Patients in the intervention arm (n=60) had access to a fully automated program (ANODE) to improve their lifestyle. Patients were asked to log on at least once per week. Human contact was limited to hotline support in cases of technical issues. The dietetic tool provided personalized menus and a shopping list for the day or the week. Stepwise physical activity was prescribed. The control arm (n=60) received general nutritional advice. The primary outcome was the change in the dietary score (International Diet Quality Index; DQI-I) between baseline and the end of the study. Secondary endpoints included changes in body weight, waist circumference, hemoglobin A1c (HbA1c) and measured maximum oxygen consumption (VO2 max). The mean age of the participants was 57 years (standard deviation [SD] 9), mean body mass index was 33 kg/m² (SD 4), mean HbA1c was 7.2% (SD 1.1), and 66.7% (80/120) of participants were women. Using an intention-to-treat analysis, the DQI-I score (54.0, SD 5.7 in the ANODE arm vs 52.8, SD 6.2 in the control arm at baseline; P=.28) increased significantly in the ANODE arm compared to the control arm (+4.55, SD 5.91 vs -1.68, SD 5.18; the between-arm difference was statistically significant). Changes in secondary endpoints, including HbA1c, also favored the intervention arm. Among patients with T2DM and abdominal obesity, the use of a fully automated Web-based program resulted in a significant improvement in dietary habits and favorable clinical and

  3. Will the future of knowledge work automation transform personalized medicine?

    Directory of Open Access Journals (Sweden)

    Gauri Naik

    2014-09-01

    Today, we live in a world of 'information overload' which demands a high level of knowledge-based work. However, advances in computer hardware and software have opened possibilities to automate 'routine cognitive tasks' for knowledge processing. Engineering intelligent software systems that can process large data sets using unstructured commands and subtle judgments, and that have the ability to learn 'on the fly', is a significant step towards automation of knowledge work. The applications of this technology to high-throughput genomic analysis, database updating, reporting of clinically significant variants, and diagnostic imaging are explored using case studies.

  4. Pilot opinions on high level flight deck automation issues: Toward the development of a design philosophy

    Science.gov (United States)

    Tenney, Yvette J.; Rogers, William H.; Pew, Richard W.

    1995-01-01

    There has been much concern in recent years about the rapid increase in automation on commercial flight decks. To explore these issues, a survey of pilots was conducted, composed of three major sections. The first section asked pilots to rate different automation components that exist on the latest commercial aircraft regarding their obtrusiveness and the attention and effort required in using them. The second section addressed general 'automation philosophy' issues. The third section focused on issues related to levels and amount of automation. The results indicate that pilots of advanced aircraft like their automation, use it, and would welcome more automation. However, they also believe that automation has many disadvantages, especially fully autonomous automation. They want their automation to be simple and reliable and to produce predictable results. The biggest needs for higher levels of automation were in pre-flight, communication, systems management, and task management functions, in planning as well as response tasks, and in high-workload situations. There is an irony and a challenge in the implications of these findings. On the one hand, pilots would like new automation to be simple and reliable, but they need it to support the most complex part of the job--managing and planning tasks in high-workload situations.

  5. Automation of diagnostic genetic testing: mutation detection by cyclic minisequencing.

    Science.gov (United States)

    Alagrund, Katariina; Orpana, Arto K

    2014-01-01

    The rising role of nucleic acid testing in clinical decision making is creating a need for efficient and automated diagnostic nucleic acid test platforms. Clinical use of nucleic acid testing sets demands for shorter turnaround times (TATs), lower production costs, and robust, reliable methods that can easily adopt new test panels and run rare tests on a random-access basis. Here we present a novel home-brew laboratory automation platform for diagnostic mutation testing. This platform is based on cyclic minisequencing (cMS) and two-color near-infrared (NIR) detection. Pipetting is automated using Tecan Freedom EVO pipetting robots, and all assays are performed in 384-well microplate format. The automation platform includes a data processing system controlling all procedures, and automated patient result reporting to the hospital information system. We have found automated cMS to be a reliable, inexpensive and robust method for nucleic acid testing across a wide variety of diagnostic tests. The platform is currently in clinical use for over 80 mutations or polymorphisms. In addition to tests performed on blood samples, the system also performs an epigenetic test for methylation of the MGMT gene promoter, and companion diagnostic tests for analysis of KRAS and BRAF gene mutations from formalin-fixed and paraffin-embedded tumor samples. Automation of genetic test reporting was found to be reliable and efficient, decreasing the workload of academic personnel.

  6. Automated image analysis of lateral lumbar X-rays by a form model

    International Nuclear Information System (INIS)

    Mahnken, A.H.; Kohnen, M.; Steinberg, S.; Wein, B.B.; Guenther, R.W.

    2001-01-01

    Development of software for fully automated image analysis of lateral lumbar spine X-rays. Material and method: Using the concept of active shape models, we developed software that produces a form model of the lumbar spine from lateral lumbar spine radiographs and runs an automated image segmentation. This model is able to detect lumbar vertebrae automatically after filtering of the digitized X-ray images. The model was trained with 20 lateral lumbar spine radiographs with no pathological findings before we evaluated the software with 30 further X-ray images, which were sorted by image quality ranging from one (best) to three (worst), with 10 images for each quality grade. Results: Image recognition strongly depended on image quality. In group one, 52 and in group two, 51 out of 60 vertebral bodies including the sacrum were recognized, but in group three only 18 vertebral bodies were properly identified. Conclusion: Fully automated and reliable recognition of vertebral bodies from lateral spine radiographs using the concept of active shape models is possible. The precision of this technique is limited by the superposition of different structures. Further improvements are necessary; therefore, standardized image quality and enlargement of the training data set are required. (orig.) [de

  7. Building a framework to manage trust in automation

    Science.gov (United States)

    Metcalfe, J. S.; Marathe, A. R.; Haynes, B.; Paul, V. J.; Gremillion, G. M.; Drnec, K.; Atwater, C.; Estepp, J. R.; Lukos, J. R.; Carter, E. C.; Nothwang, W. D.

    2017-05-01

    All automations must, at some point in their lifecycle, interface with one or more humans. Whether operators, end-users, or bystanders, human responses can determine the perceived utility and acceptance of an automation. It has been long believed that human trust is a primary determinant of human-automation interactions and further presumed that calibrating trust can lead to appropriate choices regarding automation use. However, attempts to improve joint system performance by calibrating trust have not yet provided a generalizable solution. To address this, we identified several factors limiting the direct integration of trust, or metrics thereof, into an active mitigation strategy. The present paper outlines our approach to addressing this important issue, its conceptual underpinnings, and practical challenges encountered in execution. Among the most critical outcomes has been a shift in focus from trust to basic interaction behaviors and their antecedent decisions. This change in focus inspired the development of a testbed and paradigm that was deployed in two experiments of human interactions with driving automation that were executed in an immersive, full-motion simulation environment. Moreover, by integrating a behavior and physiology-based predictor within a novel consequence-based control system, we demonstrated that it is possible to anticipate particular interaction behaviors and influence humans towards more optimal choices about automation use in real time. Importantly, this research provides a fertile foundation for the development and integration of advanced, wearable technologies for sensing and inferring critical state variables for better integration of human elements into otherwise fully autonomous systems.

  8. Validation of a fully automated solid‐phase extraction and ultra‐high‐performance liquid chromatography–tandem mass spectrometry method for quantification of 30 pharmaceuticals and metabolites in post‐mortem blood and brain samples

    DEFF Research Database (Denmark)

    Nielsen, Marie Katrine Klose; Nedahl, Michael; Johansen, Sys Stybe

    2018-01-01

    In this study, we present the validation of an analytical method capable of quantifying 30 commonly encountered pharmaceuticals and metabolites in whole blood and brain tissue from forensic cases. Solid-phase extraction was performed by a fully automated robotic system, thereby minimising manual labour and human error while increasing sample throughput, robustness, and traceability. The method was validated in blood in terms of selectivity, linear range, matrix effect, extraction recovery, process efficiency, carry-over, stability, precision, and accuracy. Deuterated analogues of each analyte ... The linear range covered both therapeutic and toxic levels. The method showed acceptable accuracy and precision, with accuracies ranging from 80 to 118% and precision below 19% for the majority of the analytes. Linear range, matrix effect, extraction recovery, process efficiency, precision

  9. Automated Localization of Multiple Pelvic Bone Structures on MRI.

    Science.gov (United States)

    Onal, Sinan; Lai-Yuen, Susana; Bao, Paul; Weitzenfeld, Alfredo; Hart, Stuart

    2016-01-01

    In this paper, we present a fully automated localization method for multiple pelvic bone structures on magnetic resonance images (MRI). Pelvic bone structures are at present identified manually on MRI to locate reference points for measurement and evaluation of pelvic organ prolapse (POP). Given that this is a time-consuming and subjective procedure, there is a need to localize pelvic bone structures automatically. However, bone structures are not easily differentiable from soft tissue on MRI as their pixel intensities tend to be very similar. In this paper, we present a model that combines support vector machines and nonlinear regression capturing global and local information to automatically identify the bounding boxes of bone structures on MRI. The model identifies the location of the pelvic bone structures by establishing the association between their relative locations and using local information such as texture features. Results show that the proposed method is able to locate the bone structures of interest accurately (dice similarity index >0.75) in 87-91% of the images. This research aims to enable accurate, consistent, and fully automated localization of bone structures on MRI to facilitate and improve the diagnosis of health conditions such as female POP.
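
    The accuracy criterion quoted (Dice similarity index > 0.75) can be computed directly between a predicted and a reference bounding box. A minimal sketch follows; the box coordinates are hypothetical, and the axis-aligned-box formulation is an assumption (the same coefficient can equally be computed on segmentation masks).

        def dice_boxes(box_a, box_b):
            """Dice similarity coefficient for two axis-aligned boxes (x1, y1, x2, y2)."""
            ax1, ay1, ax2, ay2 = box_a
            bx1, by1, bx2, by2 = box_b
            iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
            ih = max(0.0, min(ay2, by2) - max(ay1, by1))
            inter = iw * ih
            area_a = (ax2 - ax1) * (ay2 - ay1)
            area_b = (bx2 - bx1) * (by2 - by1)
            return 2.0 * inter / (area_a + area_b) if (area_a + area_b) > 0 else 0.0

        # Hypothetical predicted vs. manually annotated bounding box for one pelvic bone structure
        predicted = (40, 60, 120, 140)
        reference = (45, 55, 125, 135)
        print(f"Dice = {dice_boxes(predicted, reference):.2f}")   # > 0.75 would count as a correct localization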

  10. Level of Automation and Failure Frequency Effects on Simulated Lunar Lander Performance

    Science.gov (United States)

    Marquez, Jessica J.; Ramirez, Margarita

    2014-01-01

    A human-in-the-loop experiment was conducted at the NASA Ames Research Center Vertical Motion Simulator, where instrument-rated pilots completed a simulated terminal descent phase of a lunar landing. Ten pilots participated in a 2 x 2 mixed design experiment, with level of automation as the within-subjects factor and failure frequency as the between subjects factor. The two evaluated levels of automation were high (fully automated landing) and low (manual controlled landing). During test trials, participants were exposed to either a high number of failures (75% failure frequency) or low number of failures (25% failure frequency). In order to investigate the pilots' sensitivity to changes in levels of automation and failure frequency, the dependent measure selected for this experiment was accuracy of failure diagnosis, from which D Prime and Decision Criterion were derived. For each of the dependent measures, no significant difference was found for level of automation and no significant interaction was detected between level of automation and failure frequency. A significant effect was identified for failure frequency suggesting failure frequency has a significant effect on pilots' sensitivity to failure detection and diagnosis. Participants were more likely to correctly identify and diagnose failures if they experienced the higher levels of failures, regardless of level of automation
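
    D Prime and Decision Criterion follow from hit and false-alarm rates via standard signal detection theory. The sketch below computes both from response counts, using a simple correction for extreme rates; the counts are hypothetical and the correction scheme is a common convention rather than the one used in this study.

        from scipy.stats import norm

        def dprime_criterion(hits, misses, false_alarms, correct_rejections):
            """Signal-detection d' and criterion c from response counts.
            Uses a 1/(2N) clamp to avoid infinite z-scores at rates of 0 or 1."""
            n_signal = hits + misses
            n_noise = false_alarms + correct_rejections
            hit_rate = min(max(hits / n_signal, 1 / (2 * n_signal)), 1 - 1 / (2 * n_signal))
            fa_rate = min(max(false_alarms / n_noise, 1 / (2 * n_noise)), 1 - 1 / (2 * n_noise))
            z_hit, z_fa = norm.ppf(hit_rate), norm.ppf(fa_rate)
            d_prime = z_hit - z_fa
            criterion = -0.5 * (z_hit + z_fa)
            return d_prime, criterion

        # Hypothetical failure-diagnosis counts for one participant
        print(dprime_criterion(hits=18, misses=2, false_alarms=3, correct_rejections=17))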

  11. Space station automation: the role of robotics and artificial intelligence (Invited Paper)

    Science.gov (United States)

    Park, W. T.; Firschein, O.

    1985-12-01

    Automation of the space station is necessary to make more effective use of the crew, to carry out repairs that are impractical or dangerous, and to monitor and control the many space station subsystems. Intelligent robotics and expert systems play a strong role in automation, and both disciplines are highly dependent on a common artificial intelligence (AI) technology base. The AI technology base provides the reasoning and planning capabilities needed in robotic tasks, such as perception of the environment and planning a path to a goal, and in expert systems tasks, such as control of subsystems and maintenance of equipment. This paper describes automation concepts for the space station, the specific robotic and expert systems required to attain this automation, and the research and development required. It also presents an evolutionary development plan that leads to fully automatic mobile robots for servicing satellites. Finally, we indicate the sequence of demonstrations and the research and development needed to confirm the automation capabilities. We emphasize that advanced robotics requires AI, and that to advance, AI needs the "real-world" problems provided by robotics.

  12. Development of a framework of human-centered automation for the nuclear industry

    International Nuclear Information System (INIS)

    Nelson, W.R.; Haney, L.N.

    1993-01-01

    Introduction of automated systems into control rooms for advanced reactor designs is often justified on the basis of increased efficiency and reliability, without a detailed assessment of how the new technologies will influence the role of the operator. Such a "technology-centered" approach carries with it the risk that entirely new mechanisms for human error will be introduced, resulting in some unpleasant surprises when the plant goes into operation. The aviation industry has experienced some of these surprises since the introduction of automated systems into the cockpits of advanced technology aircraft. Pilot errors have actually been induced by automated systems, especially when the pilot doesn't fully understand what the automated systems are doing during all modes of operation. In order to structure the research program for investigating these problems, the National Aeronautics and Space Administration (NASA) has developed a framework for human-centered automation. This framework is described in the NASA document Human-Centered Aircraft Automation Philosophy by Charles Billings. It is the thesis of this paper that a corresponding framework of human-centered automation should be developed for the nuclear industry. Such a framework would serve to guide the design and regulation of automated systems for advanced reactor designs, and would help prevent some of the problems that have arisen in other applications that have followed a "technology-centered" approach.

  13. Automated system for crack detection using infrared thermograph

    International Nuclear Information System (INIS)

    Starman, Stanislav

    2009-01-01

    The objective of this study was the development of an automated system for crack detection on the square steel bars used in the automotive industry for axle and shaft construction. The automated system for thermographic crack detection uses brief pulsed eddy currents to heat the steel components under inspection. Cracks, if present, will disturb the current flow and so generate changes in the temperature profile in the crack area. These changes in temperature are visualized using an infrared camera. The image acquired by the infrared camera is evaluated by an image processing system. The advantages afforded by the system are its short inspection time, its excellent flaw detection sensitivity and its ability to detect hidden, subsurface cracks. The automated system consists of four IR cameras (one for each side of the steel bar), a coil, a high-frequency generator and a control station with computers. The system is part of the inspection line where surface and subsurface cracks are searched for. If a crack is present, the defective location is automatically marked, and crack-free components are then separated from defective blocks. The system is fully automated and can evaluate four-metre blocks within 20 seconds, which makes it practical for real industrial applications. (author)
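
    The image-processing step amounts to finding local temperature anomalies in the IR frame around the induction-heated zone. The sketch below uses baseline subtraction with a median filter followed by thresholding on synthetic data; it is a generic illustration of the idea, not the algorithm of the described system, and the threshold and blob-size values are arbitrary.

        import numpy as np
        from scipy import ndimage

        # Synthetic IR frame (degrees C): uniform induction heating plus a hot line where
        # eddy currents are diverted by a surface crack. Purely illustrative data.
        frame = np.full((120, 400), 60.0) + np.random.normal(0, 0.3, (120, 400))
        frame[58:62, 150:260] += 4.0                      # crack signature

        # Baseline-subtract with a median filter, then threshold the residual hot spots.
        baseline = ndimage.median_filter(frame, size=25)
        residual = frame - baseline
        mask = residual > 2.0                             # anomaly threshold in K

        labels, n = ndimage.label(mask)
        sizes = ndimage.sum(mask, labels, range(1, n + 1))
        crack_like = [i + 1 for i, s in enumerate(sizes) if s > 50]   # ignore tiny noise blobs
        print(f"{len(crack_like)} crack-like indication(s) found")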

  14. Policy challenges of increasing automation in driving

    Directory of Open Access Journals (Sweden)

    Ata M. Khan

    2012-03-01

    The convergence of information and communication technologies (ICT) with automotive technologies has already resulted in automation features in road vehicles, and this trend is expected to continue in the future owing to consumer demand, dropping costs of components, and improved reliability. While the automation features that have appeared so far are mainly in the form of information and driver warning technologies (classified as level I, pre-2010), future developments in the medium term (level II, 2010–2025) are expected to exhibit connected cognitive vehicle features and encompass an increasing degree of automation in the form of advanced driver assistance systems. Although autonomous vehicles have been developed for research purposes and are being tested in controlled driving missions, the autonomous driving case is only a long-term (level III, 2025+) scenario. This paper contributes knowledge on technological forecasts regarding automation, policy challenges for each level of technology development and application context, and the essential instrument of cost-effectiveness for policy analysis, which enables policy decisions on automation systems to be assessed in a consistent and balanced manner. The cost of a system per vehicle is viewed against its effectiveness in meeting policy objectives of improving safety, efficiency, mobility and convenience, and reducing environmental effects. Example applications are provided that illustrate the contribution of the methodology in providing information to support policy decisions. Given the uncertainties in system costs as well as effectiveness, the tool for assessing policies for future-generation systems features probabilistic and utility-theoretic analysis capability. The policy issues defined and the assessment framework enable the resolution of policy challenges while allowing worthy innovative automation in driving to enhance future road transportation.

  15. Towards a fully automatic and robust DIMM (DIMMA)

    International Nuclear Information System (INIS)

    Varela, A M; Muñoz-Tuñón, C; Del Olmo-García, A M; Rodríguez, L F; Delgado, J M; Castro-Almazán, J A

    2015-01-01

    Quantitative seeing measurements have been provided at the Canarian Observatories since 1990 by differential image motion monitors (DIMMs). Image quality needs to be studied in long-term (routine) measurements. This is important, for instance, in deciding on the siting of large telescopes or in the development of adaptive optics programmes, not to mention the development and design of new instruments. On the other hand, continuous real-time monitoring is essential in the day-to-day operation of telescopes. These routine measurements have to be carried out by standard, easy-to-operate and cross-calibrated instruments that are required to be operational with minimum intervention over many years. The DIMMA (Automatic Differential Image Motion Monitor) is the next step: a fully automated seeing monitor that is capable of providing data without manual operation and in remote locations. Currently, the IAC has two DIMMs working at Roque de los Muchachos Observatory (ORM) and Teide Observatory (OT). They are robotic and require an operator to start and initialize the program, focus the telescope, change the star when needed and turn off at the end of the night, all of which is done remotely. With a view to automation, we have designed code for monitoring image quality (avoiding spurious data) and a program for autofocus, which are presented here. The data quality control protocol is also given. (paper)

  16. Automated data processing of high-resolution mass spectra

    DEFF Research Database (Denmark)

    Hansen, Michael Adsetts Edberg; Smedsgaard, Jørn

    ... of the massive amounts of data. We present an automated data processing method to quantitatively compare large numbers of spectra from the analysis of complex mixtures, exploiting the full quality of high-resolution mass spectra. By projecting all detected ions - within defined intervals on both the time ... infusion of crude extracts into the source, taking advantage of the high sensitivity, high mass resolution and accuracy, and the limited fragmentation. Unfortunately, there has not been a comparable development in the data processing techniques to fully exploit the gain in high resolution and accuracy ... infusion analyses of crude extracts to find the relationship between species from several terverticillate Penicillium species, and also that the ions responsible for the segregation can be identified. Furthermore, the process can automate the detection of unique species and unique metabolites.
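
    Projecting all detected ions onto defined time and m/z intervals amounts to binning each peak list into a fixed grid, after which whole analyses can be compared with a simple similarity measure. The sketch below does exactly that with cosine similarity; the bin widths, peak lists and intensities are invented placeholders, not the intervals or data used by the authors.

        import numpy as np

        def bin_ions(ions, mz_edges, time_edges):
            """Project a peak list of (time, m/z, intensity) tuples onto a fixed
            time x m/z grid, summing intensities per bin."""
            grid = np.zeros((len(time_edges) - 1, len(mz_edges) - 1))
            for t, mz, inten in ions:
                i = np.searchsorted(time_edges, t, side="right") - 1
                j = np.searchsorted(mz_edges, mz, side="right") - 1
                if 0 <= i < grid.shape[0] and 0 <= j < grid.shape[1]:
                    grid[i, j] += inten
            return grid.ravel()

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        # Hypothetical peak lists from two crude-extract infusions (time s, m/z, intensity)
        mz_edges = np.arange(100, 1000, 0.01)       # narrow m/z bins exploit high mass accuracy
        time_edges = np.arange(0, 300, 10.0)
        sample_a = [(12, 455.1812, 1e5), (45, 300.0550, 4e4), (122, 611.2331, 2e5)]
        sample_b = [(13, 455.1815, 9e4), (46, 300.0548, 5e4), (124, 611.2334, 1.8e5)]
        sim = cosine(bin_ions(sample_a, mz_edges, time_edges),
                     bin_ions(sample_b, mz_edges, time_edges))
        print(f"spectral similarity = {sim:.3f}")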

  17. Distress and worry as mediators in the relationship between psychosocial risks and upper body musculoskeletal complaints in highly automated manufacturing.

    Science.gov (United States)

    Wixted, Fiona; Shevlin, Mark; O'Sullivan, Leonard W

    2018-03-15

    As a result of changes in manufacturing including an upward trend in automation and the advent of the fourth industrial revolution, the requirement for supervisory monitoring and consequently, cognitive demand has increased in automated manufacturing. The incidence of musculoskeletal disorders has also increased in the manufacturing sector. A model was developed based on survey data to test if distress and worry mediate the relationship between psychosocial factors (job control, cognitive demand, social isolation and skill discretion), stress states and symptoms of upper body musculoskeletal disorders in highly automated manufacturing companies (n = 235). These constructs facilitated the development of a statistically significant model (RMSEA 0.057, TLI 0.924, CFI 0.935). Cognitive demand was shown to be related to higher distress in employees, and distress to a higher incidence of self-reported shoulder and lower back symptoms. The mediation model incorporating stress states (distress, worry) as mediators is a novel approach in linking psychosocial risks to musculoskeletal disorders. Practitioners' Summary With little requirement for physical work in many modern automated manufacturing workplaces, there is often minimal management focus on Work-Related Musculoskeletal Disorders (WRMSDs) as important occupational health problems. Our model provides evidence that psychosocial factors are important risk factors in symptoms of WRMSD and should be managed.

  18. The standard laboratory module approach to automation of the chemical laboratory

    International Nuclear Information System (INIS)

    Hollen, R.M.; Erkkila, T.H.

    1993-01-01

    The technology and practice of environmental laboratory automation have not advanced as rapidly or as completely as one might expect. Confined to autosamplers and limited robotic systems, our ability to apply production concepts to environmental analytical work is not great. With the impending remediation of our hazardous waste sites in the US, only the application of production chemistry techniques will even begin to provide those responsible with the knowledge necessary to accomplish the cleanup expeditiously and safely. Tightening regulatory requirements have already mandated staggering increases in sampling and characterization needs, and the future only guarantees greater demands. The Contaminant Analysis Automation Program has been initiated by our government to address these current and future characterization needs through the application of a new robotic paradigm for analytical chemistry. By using standardized modular instruments, named Standard Laboratory Modules, flexible automation systems can rapidly be configured to apply production techniques to our nation's environmental problems on-site.

  19. Aviation Frontiers: On-Demand Aircraft

    Science.gov (United States)

    Moore, Mark D.

    2010-01-01

    Throughout the 20th Century, NASA has defined the forefront of aeronautical technology, and the aviation industry owes much of its prosperity to this knowledge and technology. In recent decades, centralized aeronautics has become a mature discipline, which raises questions concerning the future frontiers of aviation innovation. Three transformational aviation capabilities, bound together by the development of a Free Flight airspace management system, have the potential to transform 21st Century society as profoundly as civil aviation transformed the 20th Century. These mobility breakthroughs will re-establish environmentally sustainable centralized aviation, while opening up latent markets for civil distributed sensing and on-demand rural and regional transportation. Of these three transformations, on-demand aviation has the potential to deliver the largest market and productivity improvements to society. The information system revolution over the past 20 years shows that vehicles lead, and the interconnecting infrastructure that makes them more effective follows; that is, unless on-demand aircraft are pioneered, a distributed Air Traffic Control system will likely never be established. There is no single technology long-pole that will enable on-demand vehicle solutions. However, fully digital aircraft that include electric propulsion have the potential to be a multi-disciplinary initiator of solid-state technologies that can provide order-of-magnitude improvements in ease of use, safety/reliability, community and environmental friendliness, and affordability.

  20. Automated real-time detection of tonic-clonic seizures using a wearable EMG device

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Conradsen, Isa; Henning, Oliver

    2018-01-01

    OBJECTIVE: To determine the accuracy of automated detection of generalized tonic-clonic seizures (GTCS) using a wearable surface EMG device. METHODS: We prospectively tested the technical performance and diagnostic accuracy of real-time seizure detection using a wearable surface EMG device. The seizure detection algorithm and the cutoff values were prespecified. A total of 71 patients, referred to long-term video-EEG monitoring on suspicion of GTCS, were recruited in 3 centers. Seizure detection was real-time and fully automated. The reference standard was the evaluation of video-EEG recordings...

  1. ICECAP: an integrated, general-purpose, automation-assisted IC50/EC50 assay platform.

    Science.gov (United States)

    Li, Ming; Chou, Judy; King, Kristopher W; Jing, Jing; Wei, Dong; Yang, Liyu

    2015-02-01

    IC50 and EC50 values are commonly used to evaluate drug potency. Mass spectrometry (MS)-centric bioanalytical and biomarker labs are now conducting IC50/EC50 assays, which, if done manually, are tedious and error-prone. Existing bioanalytical sample preparation automation systems cannot meet IC50/EC50 assay throughput demands. A general-purpose, automation-assisted IC50/EC50 assay platform was developed to automate the calculation of spiking solutions and the matrix-solution preparation scheme, the actual preparation of spiking and matrix solutions, and the flexible sample extraction procedures after incubation. In addition, the platform also automates data extraction, nonlinear regression curve fitting, computation of IC50/EC50 values, graphing, and reporting. The automation-assisted IC50/EC50 assay platform can process a whole class of assays with varying assay conditions. In each run, the system can handle up to 32 compounds and up to 10 concentration levels per compound, and it greatly improves IC50/EC50 assay experimental productivity and data processing efficiency. © 2014 Society for Laboratory Automation and Screening.
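
    The nonlinear regression step in IC50/EC50 work is most often a four-parameter logistic fit from which the IC50 is read off as the midpoint parameter. The sketch below fits such a curve to synthetic dose-response data with scipy; the model parameterization and starting values are common conventions, not the platform's actual implementation.

        import numpy as np
        from scipy.optimize import curve_fit

        def four_pl(conc, bottom, top, ic50, hill):
            """Four-parameter logistic model commonly used for IC50/EC50 estimation."""
            return bottom + (top - bottom) / (1.0 + (conc / ic50) ** hill)

        # Synthetic dose-response data (concentration in uM, response in % activity)
        conc = np.array([0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, 3, 10, 30])
        resp = four_pl(conc, 5, 100, 0.25, 1.2) + np.random.normal(0, 2, conc.size)

        p0 = [resp.min(), resp.max(), np.median(conc), 1.0]   # rough starting guesses
        params, _ = curve_fit(four_pl, conc, resp, p0=p0, maxfev=10000)
        bottom, top, ic50, hill = params
        print(f"IC50 ~ {ic50:.3f} uM (Hill slope {hill:.2f})")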

  2. Review: Behavioral signs of estrus and the potential of fully automated systems for detection of estrus in dairy cattle.

    Science.gov (United States)

    Reith, S; Hoy, S

    2018-02-01

    Efficient detection of estrus is a permanent challenge for successful reproductive performance in dairy cattle. In this context, comprehensive knowledge of estrus-related behaviors is fundamental to achieving optimal estrus detection rates. This review was designed to identify the characteristics of behavioral estrus as a necessary basis for developing strategies and technologies to improve reproductive management on dairy farms. The focus is on secondary symptoms of estrus (mounting, activity, aggressive and agonistic behaviors), which seem more indicative than standing behavior. The consequences of management, housing conditions and cow- and environment-related factors impacting expression and detection of estrus, as well as their relative importance, are described in order to increase the efficiency and accuracy of estrus detection. As traditional estrus detection via visual observation is time-consuming and ineffective, there has been considerable advancement in detection aids during the last 10 years. By now, a number of fully automated technologies including pressure sensing systems, activity meters, video cameras, recordings of vocalization, as well as measurements of body temperature and milk progesterone concentration, are available. These systems differ in many aspects regarding sustainability and efficiency as keys to their adoption for farm use. According to current research, the highest priority is given to detection based on sensor-supported activity monitoring, especially accelerometer systems, as the most practical approach to estrus detection. Because the intensity and duration of estrus vary between individuals, multivariate analysis can support herd managers in determining the onset of estrus. Currently, there is increasing interest in investigating the potential of combining data from activity monitoring with information from several other methods, which may lead to the best results concerning sensitivity and specificity of detection. Future improvements will

  3. Exploring ethical justification for self-demand amputation.

    Science.gov (United States)

    Tomasini, Floris

    2006-01-01

    Self-demand amputees are persons who need to have one or more healthy limbs or digits amputated to fit the way they see themselves. They want to rid themselves of a limb that they believe does not belong to their body-identity. The obsessive desire to have appendages surgically removed to fit an alternative body-image is medically and ethically controversial. My purpose in this paper is to provide a number of normative and professional ethical perspectives on whether or not it is possible to justify surgery for self-demand amputees. In doing so I proceed dialogically, moving between empirical context and normative theory, revealing the taken for granted normative assumptions (what I call the natural attitude--a technical term borrowed from phenomenology) that provide ethical limits to justifying the treatment of self-demand amputees. While I critically examine both Kantian responses against as well as Utilitarian responses for amputation on demand, I conclude that neither normative tradition can fully incorporate an understanding of what it is like to be a self-demand amputee. Since neither theory can justify the apparent non-rational desire of amputation on demand, ethical justification, I argue, falls short of the recognition that there may be a problem. To end, I introduce a meta-ethical idea, "the struggle for recognition," opening up the theoretical possibility of a hermeneutics of recognition before ethical justification that may be more sensitive to the problem of radical embodied difference exemplified by self-demand amputees.

  4. Automated PET Radiotracer Manufacture on the BG75 System and Imaging Validation Studies of [18F]fluoromisonidazole ([18F]FMISO).

    Science.gov (United States)

    Yuan, Hong; Frank, Jonathan E; Merrill, Joseph R; Hillesheim, Daniel A; Khachaturian, Mark H; Anzellotti, Atilio I

    2016-01-01

    The hypoxia PET tracer 1-[18F]fluoro-3-(2-nitro-1H-imidazol-1-yl)-propan-2-ol ([18F]FMISO) is the first radiotracer developed for hypoxia PET imaging and has shown promise for cancer diagnosis and prognosis. However, access to the [18F]FMISO radiotracer is limited by the need for a cyclotron and radiochemistry expertise. This study aimed to develop an automated production method for [18F]FMISO on the novel fully automated BG75 platform and to validate its use in animal tumor models. [18F]FMISO was produced automatically on the BG75 system with the dose synthesis cartridge. The hypoxia imaging functionality of [18F]FMISO was validated in two tumor mouse models (FaDu and U87 tumors). The distribution of [18F]FMISO within the tumor was further validated against the standard hypoxia marker EF5. The average radiochemical purity was (99±1)% and the average pH was 5.5±0.2, with other quality attributes passing standard criteria (n=12). Overall biodistribution of [18F]FMISO in both tumor models was consistent with reported studies, in which the bladder and large intestine showed the highest activity at 90 min post injection. High spatial correlation was found between [18F]FMISO autoradiography and EF5 hypoxia staining, indicating high hypoxia specificity of [18F]FMISO. This study shows that qualified [18F]FMISO can be efficiently produced on the BG75 system in an automated "dose-on-demand" mode using single-dose disposable cards. The possibility of a low-cost, automated system that manufactures different radiotracers ([18F]fluoride production + synthesis + QC) will greatly enhance the potential for PET technology to reach new geographical areas and underserved patient populations. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  5. An Overview of the Automated Dispatch Controller Algorithms in the System Advisor Model (SAM)

    Energy Technology Data Exchange (ETDEWEB)

    DiOrio, Nicholas A [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-11-22

    Three automatic dispatch modes have been added to the battery model within the System Advisor Model (SAM). These controllers have been developed to perform peak shaving in an automated fashion, giving users a way to see the benefit of reduced demand charges without manually programming a complicated dispatch schedule. A flexible input option allows more advanced interaction with the automated controller. This document describes the algorithms in detail and presents brief results on their use and limitations.
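
    As a rough illustration of the peak-shaving idea that such a controller automates, the sketch below (in Python) discharges a battery whenever building load exceeds a demand target and recharges it when load falls below the target. It is a simplified illustration under assumed inputs, not the dispatch algorithm implemented in SAM.

        def peak_shave(load_kw, target_kw, capacity_kwh, power_kw, soc_kwh=0.0, dt_h=1.0):
            """Return (net_load, soc_trace) for an hourly load profile in kW."""
            net, trace = [], []
            for load in load_kw:
                if load > target_kw:                      # discharge to clip the peak
                    discharge = min(load - target_kw, power_kw, soc_kwh / dt_h)
                    soc_kwh -= discharge * dt_h
                    net.append(load - discharge)
                else:                                     # recharge with headroom below target
                    charge = min(target_kw - load, power_kw, (capacity_kwh - soc_kwh) / dt_h)
                    soc_kwh += charge * dt_h
                    net.append(load + charge)
                trace.append(soc_kwh)
            return net, trace

        load = [40, 45, 60, 95, 120, 110, 70, 50]         # hourly building load, kW (illustrative)
        print(peak_shave(load, target_kw=80, capacity_kwh=100, power_kw=50, soc_kwh=60))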

  6. Automated assay for screening the enzymatic release of reducing sugars from micronized biomass

    Directory of Open Access Journals (Sweden)

    Asther Marcel

    2010-07-01

    Background: To reduce the production cost of bioethanol obtained by fermenting the sugars released during degradation of lignocellulosic biomass (i.e., second-generation bioethanol), it is necessary to screen for new enzymes endowed with more efficient biomass-degrading properties. This demands the set-up of high-throughput screening methods. Several methods have been devised, all using microplates in the industrial SBS format. Although this size reduction and standardization has greatly improved the screening process, the published methods comprise one or more manual steps that seriously decrease throughput. Therefore, we worked to devise a screening method devoid of any manual steps. Results: We describe a fully automated assay for measuring the amount of reducing sugars released by biomass-degrading enzymes from wheat straw and spruce. The method comprises two independent and automated steps. The first step is the making of "substrate plates": 96-well microplates are filled with slurry suspensions of micronized substrate and then stored frozen until use. The second step is an enzymatic activity assay. After thawing, the substrate plates are supplemented by the robot with cell-wall-degrading enzymes where necessary, and the whole process from addition of enzymes to quantification of released sugars is performed autonomously by the robot. We describe how critical parameters (amount of substrate, amount of enzyme, incubation duration and temperature) were selected to fit our specific use. The ability of this automated small-scale assay to discriminate among different enzymatic activities was validated using a set of commercial enzymes. Conclusions: Using an automatic microplate sealer solved three main problems generally encountered when setting up methods for measuring the sugar-releasing activity of plant cell-wall-degrading enzymes: throughput, automation, and evaporation losses. In its present set-up, the
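
    The final quantification step of such an assay typically converts plate-reader absorbances to reducing-sugar concentrations through a linear standard curve fitted to glucose standards. The sketch below (in Python) illustrates that conversion; the assay chemistry, standards, and values are assumptions, not the authors' exact protocol.

        def fit_standard_curve(concentrations, absorbances):
            # Ordinary least-squares fit of absorbance = slope * concentration + intercept.
            n = len(concentrations)
            mx, my = sum(concentrations) / n, sum(absorbances) / n
            slope = (sum((x - mx) * (y - my) for x, y in zip(concentrations, absorbances))
                     / sum((x - mx) ** 2 for x in concentrations))
            return slope, my - slope * mx

        def to_concentration(absorbance, slope, intercept):
            return (absorbance - intercept) / slope

        # Hypothetical glucose standards (g/L) and their absorbances.
        slope, intercept = fit_standard_curve([0, 0.5, 1.0, 2.0], [0.02, 0.24, 0.47, 0.93])
        print(to_concentration(0.60, slope, intercept))   # g/L reducing-sugar equivalents (illustrative)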

  7. The Molecular Industrial Revolution: Automated Synthesis of Small Molecules.

    Science.gov (United States)

    Trobe, Melanie; Burke, Martin D

    2018-04-09

    Today we are poised for a transition from the highly customized crafting of specific molecular targets by hand to the increasingly general and automated assembly of different types of molecules with the push of a button. Creating machines that are capable of making many different types of small molecules on demand, akin to that which has been achieved on the macroscale with 3D printers, is challenging. Yet important progress is being made toward this objective with two complementary approaches: 1) Automation of customized synthesis routes to different targets by machines that enable the use of many reactions and starting materials, and 2) automation of generalized platforms that make many different targets using common coupling chemistry and building blocks. Continued progress in these directions has the potential to shift the bottleneck in molecular innovation from synthesis to imagination, and thereby help drive a new industrial revolution on the molecular scale. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Human-Automation Cooperation for Separation Assurance in Future NextGen Environments

    Science.gov (United States)

    Mercer, Joey; Homola, Jeffrey; Cabrall, Christopher; Martin, Lynne; Morey, Susan; Gomez, Ashley; Prevot, Thomas

    2014-01-01

    A 2012 Human-In-The-Loop air traffic control simulation investigated a gradual paradigm shift in the allocation of functions between operators and automation. Air traffic controllers staffed five adjacent high-altitude en route sectors and, during the course of a two-week experiment, worked traffic under different function-allocation approaches aligned with four increasingly mature NextGen operational environments. These NextGen time frames ranged from near current-day operations to nearly fully automated control, in which the ground system's automation was responsible for detecting conflicts, issuing strategic and tactical resolutions, and alerting the controller to exceptional circumstances. Results indicate that overall performance was best in the most automated NextGen environment. Safe operations were achieved in this environment at twice today's peak airspace capacity, while being rated by the controllers as highly acceptable. However, results also show that sector operations were not always safe; separation violations did in fact occur. This paper describes the simulation in detail and discusses important results and their implications.

  9. Improved automated lumen contour detection by novel multifrequency processing algorithm with current intravascular ultrasound system.

    Science.gov (United States)

    Kume, Teruyoshi; Kim, Byeong-Keuk; Waseda, Katsuhisa; Sathyanarayana, Shashidhar; Li, Wenguang; Teo, Tat-Jin; Yock, Paul G; Fitzgerald, Peter J; Honda, Yasuhiro

    2013-02-01

    The aim of this study was to evaluate a new fully automated lumen border tracing system based on a novel multifrequency processing algorithm. We developed the multifrequency processing method to enhance arterial lumen detection by exploiting the differential scattering characteristics of blood and arterial tissue. The implementation of the method can be integrated into current intravascular ultrasound (IVUS) hardware. This study was performed in vivo with conventional 40-MHz IVUS catheters (Atlantis SR Pro™, Boston Scientific Corp, Natick, MA) in 43 clinical patients with coronary artery disease. A total of 522 frames were randomly selected, and lumen areas were measured after automatically tracing lumen borders with the new tracing system and a commercially available tracing system (TraceAssist™) referred to as the "conventional tracing system." The data assessed by the two automated systems were compared with the results of manual tracings by experienced IVUS analysts. New automated lumen measurements showed better agreement with manual lumen area tracings compared with those of the conventional tracing system (correlation coefficient: 0.819 vs. 0.509). When compared against manual tracings, the new algorithm also demonstrated improved systematic error (mean difference: 0.13 vs. -1.02 mm²) and random variability (standard deviation of difference: 2.21 vs. 4.02 mm²) compared with the conventional tracing system. This preliminary study showed that the novel fully automated tracing system based on the multifrequency processing algorithm can provide more accurate lumen border detection than current automated tracing systems and thus offer a more reliable quantitative evaluation of lumen geometry. Copyright © 2011 Wiley Periodicals, Inc.
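
    For readers who want to reproduce this style of comparison on their own data, the sketch below (in Python) computes the three agreement statistics reported above (Pearson correlation, mean difference, and standard deviation of the differences) between automated and manual lumen areas. The sample values are placeholders, not data from the study.

        import math
        from statistics import mean, stdev

        def agreement(auto_mm2, manual_mm2):
            """Return (Pearson r, mean difference, SD of differences) for paired areas."""
            diffs = [a - m for a, m in zip(auto_mm2, manual_mm2)]
            ma, mm = mean(auto_mm2), mean(manual_mm2)
            cov = sum((a - ma) * (m - mm) for a, m in zip(auto_mm2, manual_mm2))
            r = cov / math.sqrt(sum((a - ma) ** 2 for a in auto_mm2) *
                                sum((m - mm) ** 2 for m in manual_mm2))
            return r, mean(diffs), stdev(diffs)

        auto   = [6.1, 7.9, 5.4, 9.2, 8.0]   # automated lumen areas, mm^2 (placeholders)
        manual = [6.0, 8.2, 5.1, 9.5, 7.6]   # manual tracings, mm^2 (placeholders)
        print(agreement(auto, manual))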

  10. High-throughput mouse genotyping using robotics automation.

    Science.gov (United States)

    Linask, Kaari L; Lo, Cecilia W

    2005-02-01

    The use of mouse models is rapidly expanding in biomedical research. This has dictated the need for the rapid genotyping of mutant mouse colonies for more efficient utilization of animal holding space. We have established a high-throughput protocol for mouse genotyping using two robotics workstations: a liquid-handling robot to assemble PCR and a microfluidics electrophoresis robot for PCR product analysis. This dual-robotics setup incurs lower start-up costs than a fully automated system while still minimizing human intervention. Essential to this automation scheme is the construction of a database containing customized scripts for programming the robotics workstations. Using these scripts and the robotics systems, multiple combinations of genotyping reactions can be assembled simultaneously, allowing even complex genotyping data to be generated rapidly with consistency and accuracy. A detailed protocol, database, scripts, and additional background information are available at http://dir.nhlbi.nih.gov/labs/ldb-chd/autogene/.
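
    The database-driven scripting described above could, for example, generate worklists that tell a liquid-handling robot where to dispense template and master mix for each sample. The sketch below (in Python) is a hypothetical illustration of such a script; the column names, volumes, and file format are assumptions, not those of the authors' database.

        import csv

        def make_worklist(samples, assay, out_path="pcr_worklist.csv"):
            """Write a hypothetical 96-well PCR setup worklist for a liquid handler."""
            rows = []
            for i, sample_id in enumerate(samples):
                well = f"{'ABCDEFGH'[i // 12]}{i % 12 + 1}"      # fill the plate row by row
                rows.append({"well": well, "sample": sample_id, "assay": assay,
                             "template_ul": 2.0, "mastermix_ul": 18.0})
            with open(out_path, "w", newline="") as fh:
                writer = csv.DictWriter(fh, fieldnames=rows[0].keys())
                writer.writeheader()
                writer.writerows(rows)
            return rows

        make_worklist([f"M{n:04d}" for n in range(1, 25)], assay="Cre")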

  11. Energy conservation and management system using efficient building automation

    Science.gov (United States)

    Ahmed, S. Faiz; Hazry, D.; Tanveer, M. Hassan; Joyo, M. Kamran; Warsi, Faizan A.; Kamarudin, H.; Wan, Khairunizam; Razlan, Zuradzman M.; Shahriman A., B.; Hussain, A. T.

    2015-05-01

    In countries where the gap between electricity demand and supply is large and people are forced to endure increasing hours of load shedding, unnecessary consumption of electricity makes matters even worse, so the importance of and need for electricity conservation grow accordingly. This paper outlines a step towards the conservation of energy in general, and electricity in particular, by employing an efficient building automation technique. With careful design and implementation of a building automation system, energy consumption can be reduced by up to 30% to 40%, which makes a substantial difference for energy saving. In this study, the above concept is verified by experiments on a prototype room in which the efficient building automation technique is implemented. For this automation, a Programmable Logic Controller (PLC) is employed as the main controller, monitoring various system parameters and controlling appliances as required. The hardware test run and experimental findings further clarify and prove the concept. An added advantage of this approach is that it can be implemented in both small and medium-sized domestic homes, thereby greatly reducing unnecessary load on the utility provider.
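
    A building automation strategy of this kind typically reduces to simple rule-based control of appliances from sensed occupancy and temperature. The sketch below (in Python) illustrates one such scan-cycle rule; it is an assumed example, not the PLC program used in the study, and the setpoint and hysteresis values are arbitrary.

        def control_step(occupied, temp_c, setpoint_c=25.0, hysteresis_c=1.0):
            """Return desired ON/OFF states for lights and air conditioning for one scan cycle."""
            lights_on = occupied
            ac_on = occupied and temp_c > setpoint_c + hysteresis_c
            return {"lights": lights_on, "ac": ac_on}

        print(control_step(occupied=True,  temp_c=28.5))   # {'lights': True, 'ac': True}
        print(control_step(occupied=False, temp_c=28.5))   # {'lights': False, 'ac': False}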

  12. How we can measure the non-driving-task engagement in automated driving: Comparing flow experience and workload.

    Science.gov (United States)

    Ko, Sang Min; Ji, Yong Gu

    2018-02-01

    In automated driving, a driver can completely concentrate on non-driving-related tasks (NDRTs). This study investigated the flow experience of a driver who concentrated on NDRTs and tasks that induce mental workload under conditional automation. Participants performed NDRTs under different demand levels: a balanced demand-skill level (fit condition) to induce flow, low-demand level to induce boredom, and high-demand level to induce anxiety. In addition, they performed the additional N-Back task, which artificially induces mental workload. The results showed participants had the longest reaction time when they indicated the highest flow score, and had the longest gaze-on time, road-fixation time, hands-on time, and take-over time under the fit condition. Significant differences were not observed in the driver reaction times in the fit condition and the additional N-Back task, indicating that performing NDRTs that induce a high flow experience could influence driver reaction time similar to performing tasks with a high mental workload. Copyright © 2017. Published by Elsevier Ltd.

  13. The possibility of a fully automated procedure for radiosynthesis of fluorine-18-labeled fluoromisonidazole using a simplified single, neutral alumina column purification procedure

    International Nuclear Information System (INIS)

    Nandy, Saikat; Rajan, M.G.R.; Korde, A.; Krishnamurthy, N.V.

    2010-01-01

    A novel fully automated radiosynthesis procedure for [18F]fluoromisonidazole ([18F]FMISO) was developed, using a simple alumina cartridge-column for purification instead of the conventionally used semi-preparative HPLC. [18F]FMISO was prepared via a one-pot, two-step synthesis procedure using a modified Nuclear Interface synthesis module: nucleophilic fluorination of the precursor molecule 1-(2'-nitro-1'-imidazolyl)-2-O-tetrahydropyranyl-3-O-toluenesulphonylpropanediol (NITTP) with no-carrier-added [18F]fluoride, followed by hydrolysis of the protecting group with 1 M HCl. Purification was carried out using a single neutral alumina cartridge-column instead of semi-preparative HPLC. The maximum overall radiochemical yield obtained was 37.49±1.68% with 10 mg NITTP (n=3, without any decay correction), and the total synthesis time was 40±1 min. The radiochemical purity was greater than 95% and the product was devoid of other chemical impurities, including residual aluminum and acetonitrile. A biodistribution study in a fibrosarcoma tumor model showed maximum uptake in the tumor at 2 h post injection. Finally, PET/CT imaging studies in a normal healthy rabbit showed clear uptake in the organs involved in the metabolism of MISO. No bone uptake was observed, excluding the presence of free [18F]fluoride. The reported method can be easily adapted to any commercial FDG synthesis module.
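
    Since the reported 37.49% yield is not decay corrected, a decay-corrected figure can be estimated from the fluorine-18 half-life (109.77 min) and the 40 min synthesis time, as in the sketch below (in Python). This is a generic calculation, not part of the reported method, and the correction convention must match how the starting activity was measured.

        import math

        T_HALF_F18_MIN = 109.77   # physical half-life of fluorine-18, minutes

        def decay_corrected_yield(yield_ndc_percent, synthesis_min):
            """Correct a non-decay-corrected radiochemical yield back to the start of synthesis."""
            factor = math.exp(math.log(2) * synthesis_min / T_HALF_F18_MIN)
            return yield_ndc_percent * factor

        print(round(decay_corrected_yield(37.49, 40), 1))   # ~48.3 %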

  14. Driver Vigilance in Automated Vehicles: Hazard Detection Failures Are a Matter of Time.

    Science.gov (United States)

    Greenlee, Eric T; DeLucia, Patricia R; Newton, David C

    2018-03-01

    The primary aim of the current study was to determine whether monitoring the roadway for hazards during automated driving results in a vigilance decrement. Although automated vehicles are relatively novel, the nature of human-automation interaction within them has the classic hallmarks of a vigilance task. Drivers must maintain attention for prolonged periods of time to detect and respond to rare and unpredictable events, for example, roadway hazards that automation may be ill-equipped to detect. Given the similarity with traditional vigilance tasks, we predicted that drivers of a simulated automated vehicle would demonstrate a vigilance decrement in hazard detection performance. Participants "drove" a simulated automated vehicle for 40 minutes. During that time, their task was to monitor the roadway for hazards. As predicted, the hazard detection rate declined precipitously, and reaction times slowed as the drive progressed. Further, subjective ratings of workload and task-related stress indicated that sustained monitoring is demanding and distressing and that it is a challenge to maintain task engagement. Monitoring the roadway for potential hazards during automated driving results in workload, stress, and performance decrements similar to those observed in traditional vigilance tasks. To the degree that vigilance is required of automated vehicle drivers, performance errors and associated safety risks are likely to occur as a function of time on task. Vigilance should be a focal safety concern in the development of vehicle automation.

  15. Gene Expression Measurement Module (GEMM) - a fully automated, miniaturized instrument for measuring gene expression in space

    Science.gov (United States)

    Karouia, Fathi; Ricco, Antonio; Pohorille, Andrew; Peyvan, Kianoosh

    2012-07-01

    The capability to measure gene expression on board spacecraft opens the door to a large number of experiments on the influence of the space environment on biological systems that will profoundly impact our ability to conduct safe and effective space travel, and might also shed light on terrestrial physiology or biological function and on human disease and aging processes. Measurements of gene expression will help us to understand adaptation of terrestrial life to conditions beyond the planet of origin, identify deleterious effects of the space environment on a wide range of organisms from microbes to humans, develop effective countermeasures against these effects, determine the metabolic basis of microbial pathogenicity and drug resistance, test our ability to sustain and grow in space organisms that can be used for life support and in situ resource utilization during long-duration space exploration, and monitor both the spacecraft environment and crew health. These and other applications hold significant potential for discoveries in space biology, biotechnology and medicine. Accordingly, supported by funding from the NASA Astrobiology Science and Technology Instrument Development Program, we are developing a fully automated, miniaturized, integrated fluidic system for small spacecraft capable of measuring, in situ, microbial expression of thousands of genes from multiple samples. The instrument will be capable of (1) lysing bacterial cell walls, (2) extracting and purifying RNA released from cells, (3) hybridizing it on a microarray and (4) providing electrochemical readout, all in a microfluidics cartridge. The prototype under development is suitable for deployment on nanosatellite platforms developed by the NASA Small Spacecraft Office. The first target application is to cultivate and measure gene expression of the photosynthetic bacterium Synechococcus elongatus, i.e., a cyanobacterium known to exhibit remarkable metabolic diversity and resilience to adverse conditions.

  16. Coping with the psychological impact of automated systems

    International Nuclear Information System (INIS)

    Suzuki, K.; Nogami, T.; Inoue, T.; Mitsumori, K.; Taguchi, T.

    1991-01-01

    Japanese surveys and experiments have found that operators sometimes find it difficult to anticipate automatic processes, which in turn limits their ability to keep up with those processes. One of the factors which makes anticipation difficult is the lack of flexible communication between operators and computers - communication which is easier among human operators. At present the only way of dealing with this psychological effect is to ensure that trainees fully master the characteristics of the automated processes. (author)

  17. Fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging: Toward robust and reproducible metabolite measurements in human brain.

    Science.gov (United States)

    Bian, Wei; Li, Yan; Crane, Jason C; Nelson, Sarah J

    2018-02-01

    To implement a fully automated atlas-based method for prescribing 3D PRESS MR spectroscopic imaging (MRSI). The PRESS selected volume and outer-volume suppression bands were predefined on the MNI152 standard template image. The template image was aligned to the subject's T1-weighted image during a scan, and the resulting transformation was then applied to the predefined prescription. To evaluate the method, H-1 MRSI data were obtained in repeat scan sessions from 20 healthy volunteers. In each session, datasets were acquired twice without repositioning. The overlap ratio of the prescribed volume in the two sessions was calculated, and the reproducibility of inter- and intrasession metabolite peak height and area ratios was measured by the coefficient of variation (CoV). The CoVs from intra- and intersession measurements were compared by a paired t-test. The average overlap ratio of the automatically prescribed selection volumes between two sessions was 97.8%. The average voxel-based intersession CoVs were less than 0.124 and 0.163 for peak height and area ratios, respectively. The paired t-test showed no significant difference between the intra- and intersession CoVs. The proposed method provides a time-efficient way to prescribe 3D PRESS MRSI with reproducible imaging positioning and metabolite measurements. Magn Reson Med 79:636-642, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
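
    The two reproducibility metrics used here can be computed as in the sketch below (in Python): an overlap ratio between the prescribed volumes of two sessions (a Dice-style definition is assumed; the paper's exact definition may differ) and the coefficient of variation of a metabolite ratio across repeated measurements. The masks and ratio values are illustrative only.

        import numpy as np

        def overlap_ratio(mask1, mask2):
            """Dice-style overlap of two binary prescription masks (assumed definition)."""
            inter = np.logical_and(mask1, mask2).sum()
            return 2.0 * inter / (mask1.sum() + mask2.sum())

        def coefficient_of_variation(values):
            values = np.asarray(values, dtype=float)
            return values.std(ddof=1) / values.mean()

        # Two slightly shifted prescription boxes in a 64 x 64 x 32 voxel grid (illustrative).
        m1 = np.zeros((64, 64, 32), bool); m1[10:50, 10:50, 5:25] = True
        m2 = np.zeros((64, 64, 32), bool); m2[11:51, 10:50, 5:25] = True
        print(overlap_ratio(m1, m2))                              # close to 1 for well-aligned volumes
        print(coefficient_of_variation([1.52, 1.47, 1.55, 1.50])) # CoV of a peak-height ratio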

  18. Automated Radioanalytical Chemistry: Applications For The Laboratory And Industrial Process Monitoring

    International Nuclear Information System (INIS)

    O'Hara, Matthew J.; Farawila, Anne F.; Grate, Jay W.

    2009-01-01

    The identification and quantification of targeted α- and β-emitting radionuclides via destructive analysis in complex radioactive liquid matrices is highly challenging. Analyses are typically accomplished at on- or off-site laboratories through laborious sample preparation steps and extensive chemical separations, followed by analysis using a variety of detection methodologies (e.g., liquid scintillation, alpha energy spectroscopy, mass spectrometry). Analytical results may take days or weeks to report. When an industrial-scale plant requires periodic or continuous monitoring of radionuclides as an indication of the composition of its feed stream, of diversion of safeguarded nuclides, or of plant operational conditions (for example), radiochemical measurements should be rapid, but not at the expense of precision and accuracy. Scientists at Pacific Northwest National Laboratory have developed and characterized a host of automated radioanalytical systems designed to perform reproducible and rapid radioanalytical processes. Platforms have been assembled for (1) automation and acceleration of sample analysis in the laboratory and (2) automated monitoring of industrial-scale nuclear processes on-line with near-real-time results. These methods have been applied to the analysis of actinides and fission products in samples ranging from environmental levels to high-level nuclear process fluids. Systems have been designed to integrate a number of discrete sample handling steps, including sample pretreatment (e.g., digestion and valence state adjustment) and chemical separations. The systems have either utilized on-line analyte detection or have collected the purified analyte fractions for off-line measurement applications. One PNNL system of particular note is a fully automated prototype on-line radioanalytical system designed for the Waste Treatment Plant at Hanford, WA, USA. This system demonstrated nearly continuous destructive analysis of the soft β-emitting radionuclide 99Tc in nuclear

  19. Automated evaluation of one-loop scattering amplitudes

    International Nuclear Information System (INIS)

    Deurzen, Hans van

    2015-01-01

    In this dissertation, developments toward the fully automated evaluation of one-loop scattering amplitudes, as implemented in the GoSam framework, are presented. The code Xsamurai, part of GoSam, which implements the integrand reduction algorithm including an extension to higher-rank integrands, is described. GoSam was used to compute several Higgs boson production channels at NLO QCD. An interface between GoSam and a Monte Carlo program was constructed, which enables the computation at NLO precision of any process needed in the LHC era.

  20. Effects of steering demand on lane keeping behaviour, self-reports, and physiology. A simulator study.

    Science.gov (United States)

    Dijksterhuis, Chris; Brookhuis, Karel A; De Waard, Dick

    2011-05-01

    In this study a driving simulator was used to determine changes in mental effort in response to manipulations of steering demand. Changes in mental effort were assessed using subjective effort ratings, physiology, and the standard deviation of the lateral position. Steering demand was increased by exposure to narrow lane widths and high-density oncoming traffic, while speed was fixed in all conditions to prevent a compensatory reaction. Results indicated that both steering demand factors influence mental effort expenditure and that using multiple measures contributes to effort assessment. Application of these outcomes to adaptive automation is envisaged. Copyright © 2010 Elsevier Ltd. All rights reserved.
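
    The lane-keeping measure used in this study, the standard deviation of the lateral position (SDLP), can be computed directly from sampled lateral offsets, as in the sketch below (in Python). The offset values are illustrative, not data from the experiment.

        from statistics import stdev

        def sdlp(lateral_positions_m):
            """Standard deviation of lateral position over a driving segment, in metres."""
            return stdev(lateral_positions_m)

        offsets = [0.02, -0.05, 0.11, 0.07, -0.09, 0.15, -0.03, 0.06]   # metres from lane centre
        print(round(sdlp(offsets), 3))   # higher SDLP indicates poorer lane keeping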